Do you have an interest in the technical side of big data, and specifically Hadoop? Then this vacancy might be what you are looking for.
What you will do
You work in a DevOps team together with other engineers to build and manage Hadoop stacks.
As a Hadoop engineer, you are responsible for installing, configuring, and managing the Hadoop ecosystem components, such as HDFS, HBase, Hive, Sqoop, Oozie, and Flume.
- Install, configure, and manage Hadoop stacks: e.g. configuration management, release management, capacity management, performance management, high availability & platform resilience, environment and infrastructure integration, security architecture implementation, technology refresh
- Support data engineers in deploying their use cases on the Hadoop stacks
- Automate controls in the areas of scaling and security
This is Itility
At Itility, top professionals work together on complex IT solutions for international enterprises at the design, implementation and run stages. In multidisciplinary teams, we always combine different areas of knowledge and competencies, so that we can work faster and more efficiently at all levels.
We partner with our customers and learn from each other every day. Based on our vision, we start with building blocks and then use IT in a fully flexible manner.
And we do it well, because our customer portfolio and team continue to grow. We are committed to the use of new technologies and innovative ideas.
In recognition of our work, in 2014 we won the Timmies Award for The Most Innovative Leader.
Our lines of communication are short and we act quickly.
This is what we offer
You will be given the opportunity to develop in the best way possible, under the personal guidance of a Senior Architect or Project Leader.
In addition to a competitive salary, you will receive extras such as:
- training at the 2-year Itility Academy and external customised courses
- additional performance-based variable pay
- 26 days’ vacation time
- expense allowance
- car, laptop and telephone
Who are you?
You are well acquainted with the complete Hadoop stack. In addition, you have practical experience of being part of a DevOps team. Further requirements:
- Bachelor's or Master's degree in Computer Science, System Administration, or another IT infrastructure- or software-related field, with a passion for the automation side of IT infrastructure
- At least 2-3 years of relevant work experience installing, configuring, and testing Hadoop ecosystem components
- Hortonworks Certified Hadoop Developer and/or Cloudera Certified Hadoop Developer and/or Certified Hadoop Administrator
- Knowledge of continuous integration & delivery tooling: e.g. Jira, Git, Jenkins, Bamboo
- Knowledge of configuration management tooling: e.g. Puppet, Chef, Ansible
- Strong verbal and written communication skills
- Good documentation skills
You have a hands-on mindset, a strong customer and problem-solving orientation, deliver results quickly, and have demonstrated good communication skills, especially in an international IT organization. To achieve the project goals, you are able to liaise directly with all stakeholders. You have a clear focus on results and quality.
Screening may be part of the procedure.