Hadoop Data Engineer
Do you have an interest in the functional side of big data, and in Hadoop specifically? Then this vacancy might be what you are looking for.
What you will do
You will work in teams (consisting of Hadoop data engineers, Hadoop data warehouse engineers, and platform engineers) that build and manage Hadoop stacks. The teams install, configure, and manage Hadoop ecosystem components.
As a Hadoop data engineer, you are responsible for the functional part of provisioning data – e.g. building data ingestion pipelines and data connectors. You work closely with the data scientists and business intelligence engineers who use this data to create analytical models.
- Build efficient and highly reliable data ingestion pipelines for the Hadoop stack
- Own data quality and data knowledge around all data that you touch
- Work side-by-side with software engineers and data scientists in designing modelled data sets to be used in many different applications, from proof-of-concept to production
- Understand the entire lifecycle of data that flows through any systems for which you are responsible
- Pay constant attention and effort to the reliability of your pipelines
This is Itility
At Itility, top professionals work together on complex IT solutions for international enterprises at the design, implementation and run stages. In multidisciplinary teams, we always combine different areas of knowledge and competencies, so that we can work faster and more efficiently at all levels.
We partner with our customers and learn from each other every day. Based on our vision, we start with building blocks and then use IT in a fully flexible manner.
And we do it well, because our customer portfolio and team continue to grow. We are committed to the use of new technologies and innovative ideas.
As recognition of our work, in 2014 we won the Timmies Award for The Most Innovative Leader.
Our lines of communication are short and we act quickly.
This is what we offer
You will be given the opportunity to develop in the best way possible, under the personal guidance of a Senior Architect or Project Leader.
In addition to a competitive salary, you will receive extras such as:
- training at the 2-year Itility Academy and external customised courses
- additional performance-based variable pay
- 26 days’ vacation time
- expense allowance
- car, laptop and telephone
Who are you?
You are well acquainted with the complete Hadoop stack. In addition, you have practical experience working as part of a DevOps team. Further requirements:
- Bachelor's or Master's degree in Computer Science, System Administration, or another IT infrastructure or software-related field, with a passion for the automation side of IT infrastructure
- A minimum of 2-3 years of relevant work experience
- Capable of building and operating highly available, distributed systems for extracting, ingesting, and processing large data sets of structured, semi-structured, and unstructured data
- Experience in building data products incrementally and integrating and managing datasets from multiple sources
- Data quality oriented
- Familiar with data architecture, including data ingestion pipeline design and Hadoop information architecture
- Hortonworks Certified Hadoop Developer and/or Cloudera Certified Hadoop Developer and/or Certified Hadoop Administrator
- Knowledge of continuous integration & delivery tooling: e.g. Jira, Git, Jenkins, Bamboo
- Coding proficiency in at least one modern programming language (Python, Ruby, Java)
- Strong verbal and written communication skills
- Good documenting capabilities
You have a hands-on mindset, a strong customer and problem-solving orientation, deliver results quickly, and have demonstrated good communication skills, especially in an international IT organization. To achieve project goals, you are able to liaise directly with all stakeholders. You have a clear focus on results and quality.
Screening may be part of the application procedure.