Do you have experience with writing code to ingest data? Do you like data wrangling, digging into data sources, and processing data into a readable and usable state? Do you love that feeling of accomplishment when data is flowing seamlessly into a data lake, day in, day out, hour after hour, based on code that you have carefully crafted? Then this job opening is just what you are looking for!
We need your expertise
For multiple enterprise customers we create data connectors that make data flow from various sources to a data lake of choice: that can be Splunk, Databricks, Hadoop, or any other technical implementation. What they have in common is that the data is used in a production environment, so it has to flow seamlessly and must be monitored for disruptions. Data validation and data quality are equally important.
- Create data connectors, using Python or other coding languages
- Define data validation tests to run in the data pipeline
- Define monitoring and alerting to ensure visibility when the data flow is interrupted or corrupted
- When incidents occur, you take the lead in finding the root cause as quickly as possible, resolving the incident with minimal impact on end users
- You and the team are responsible for building, deploying, maintaining, and optimizing the data ingestion pipelines and data connectors that keep data flowing to the data lake
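To give a flavor of the kind of work involved, the responsibilities above can be sketched in a few lines of Python. This is a minimal, hypothetical example: the record schema, function names, and validation rules are illustrative assumptions, not a description of any customer's actual pipeline.

```python
from dataclasses import dataclass
from typing import Iterable

# Hypothetical record schema -- illustrative only.
@dataclass
class Record:
    device_id: str
    timestamp: float  # seconds since the Unix epoch
    value: float

def validate(record: Record) -> list[str]:
    """Return a list of validation errors; an empty list means the record is valid."""
    errors = []
    if not record.device_id:
        errors.append("missing device_id")
    if record.timestamp <= 0:
        errors.append("non-positive timestamp")
    if not (-1000.0 <= record.value <= 1000.0):
        errors.append("value out of expected range")
    return errors

def ingest(records: Iterable[Record]):
    """Split a batch into records fit for the data lake and rejected records.

    In a production connector, rejected records would typically be routed to a
    dead-letter queue and surfaced in monitoring dashboards, so that corrupted
    or interrupted data flows become visible quickly.
    """
    accepted, rejected = [], []
    for record in records:
        errors = validate(record)
        if errors:
            rejected.append((record, errors))
        else:
            accepted.append(record)
    return accepted, rejected
```

For example, a batch containing one well-formed record and one with an empty `device_id` and an out-of-range value would yield one accepted record and one rejected record carrying both error messages, ready to be logged and alerted on.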
At Itility we merge technology and data to go one step beyond for our customers. We are a consultancy team with data experts, cloud experts, and IT professionals, working for large enterprises and innovative startups.
Our culture can be described as ‘no-nonsense, with passion’. Working at Itility is about working with people and staying close to our customers.
We twin up with our customers to shape their digital solutions. We are shapers: delivering, running environments seamlessly, automated to the max, with an eye on the next challenge, the next innovation.
Do you like to go above and beyond? Do you want to work with passion for what you do, in a team of people fueled by the same passion?
Then we would like to meet you.
This is what we offer
You will be given the opportunity to develop in the best way possible, under the personal guidance of fellow data engineers and architects of Itility.
If you do not yet have the required expertise for the job but do have a passion for data engineering, we offer a substantial trainee program to get you up to speed in a structured manner.
In addition to a competitive salary, you will receive extras such as:
- Training at the two-year Itility Academy and external customized courses
- Additional performance-based variable payment
- 26 vacation days per year
- Expense allowance
- Laptop, telephone, and a company car or car allowance
You believe in
You believe in a scrum/agile way of working and in software practices that enable a professional data flow. Further requirements:
- You have a bachelor’s or master’s degree
- You have experience creating data ingestion scripts
- You have experience with Linux
- You have a good understanding of SQL and Python
- You are a team player and you have good communication skills
- Ideally, you have worked with data platforms and data lakes within an enterprise environment
Screening is part of the hiring procedure.
Location: Eindhoven area or Randstad area.