Data Engineer
Job description
- Gross monthly salary between €3,869 and €5,526 (scale 08)
- A thirteenth month's salary and an 8% holiday allowance
- 10% Employee Benefit Budget
- €1,400 development budget per year
- Hybrid working: balance between home and office work (possible for most roles)
- A pension scheme, for which you can set the amount of your personal contribution yourself
In the Business Lending domain, we support business clients with financial solutions such as loans, credit facilities, leasing, and factoring. Our ambition? To be ahead of the market by enabling 80% of our client requests through fast and seamless digital journeys. Simplification and standardization are crucial to achieving this goal.
Our team plays a pivotal role in this transformation. We firmly believe that data is a key asset in today's financial world. Our mission is to support data-driven decision making, to ultimately help more business clients.
With our data platform we offer solutions that enable straight-through processing and fully automated credit origination. We maintain client profiles that are continuously updated with data from client and employee journeys, and we implement the risk models used for acceptance decisions.
Our objective is to understand our data better: where it comes from, how it is used, and what its quality is. We collaborate with product, engineering, and risk teams to continuously unlock new data sources and improve our data landscape.
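To give a flavour of the profile-update pattern described above, here is a minimal sketch in Python, assuming a Kafka-style event stream (such as Amazon MSK) feeding a MongoDB-compatible document store (such as DocumentDB). The topic, endpoints, and field names are hypothetical placeholders, not details of the actual platform.

```python
import json

from kafka import KafkaConsumer  # stand-in client for a Kafka/MSK event stream
from pymongo import MongoClient  # DocumentDB speaks the MongoDB wire protocol

# Hypothetical topic, endpoints, and schema, chosen only for illustration.
consumer = KafkaConsumer(
    "client-journey-events",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)
profiles = MongoClient("mongodb://localhost:27017")["lending"]["client_profiles"]

for event in consumer:
    payload = event.value
    # Upsert: merge the latest journey attributes into the client's profile,
    # so the profile stays continuously up to date as events arrive.
    profiles.update_one(
        {"client_id": payload["client_id"]},
        {"$set": payload.get("attributes", {})},
        upsert=True,
    )
```

The upsert keeps the consumer idempotent when the same attributes are delivered more than once, which matters if a stream is ever replayed.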
What you can expect
As a Data Engineer, you collaborate with product, engineering, and modelling teams to ensure we deliver business value. You will work on the design, development, and maintenance of our data platform, implementing ETL pipelines, real-time data ingestion and processing, and APIs that enable seamless data access and integration.
Your key responsibilities are to (1) contribute to the design, scalability, and reliability of our data infrastructure, (2) build and optimize data pipelines for both batch and streaming data, and (3) enable data-driven solutions.
Your activities include:
- designing and implementing ETL and real-time data pipelines (a sketch follows this list)
- developing and deploying containerized applications and APIs to expose data and services
- implementing data quality checks and monitoring
- providing operational support and troubleshooting for data infrastructure
- collaborating with stakeholders to understand requirements and deliver business value
- continuously improving our data engineering practices
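To illustrate the pipeline and data quality activities above, here is a minimal batch ETL job with an embedded quality check, written in PySpark, one of the nice-to-have tools listed under the requirements. The bucket paths and column names are hypothetical placeholders.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical paths and columns, chosen only for illustration.
spark = SparkSession.builder.appName("loan-applications-etl").getOrCreate()

raw = spark.read.option("header", True).csv("s3://example-bucket/raw/applications/")

# Transform: normalise types so downstream consumers get a consistent schema.
cleaned = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("application_date", F.to_date("application_date"))
)

# Data quality check: fail fast if mandatory fields are missing.
missing = cleaned.filter(F.col("client_id").isNull() | F.col("amount").isNull()).count()
if missing > 0:
    raise ValueError(f"{missing} rows failed mandatory-field checks")

cleaned.write.mode("overwrite").parquet("s3://example-bucket/curated/applications/")
```

Failing fast on mandatory-field violations keeps bad records out of the curated layer instead of letting them surface downstream in acceptance decisions.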
Requirements
- A completed Master's degree in Computer Science, Software Engineering, or a related field
- 1-3 years of experience in data engineering, data integration, or software engineering
- Strong programming skills in Python
- Experience with ETL development
- Familiarity with data architectures, SQL & NoSQL databases
- Familiarity with cloud platforms (e.g., AWS, Azure or GCP)
- Experience with one or more of the following is a plus: PySpark, Airflow, API development, AWS services (e.g., Glue, Lambda, MSK, S3, API Gateway, Fargate, DocumentDB), Docker, Terraform, and building CI/CD pipelines
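For a sense of how such pipelines are typically orchestrated, here is a minimal Airflow sketch, assuming Airflow 2.4 or later; the DAG id and task bodies are placeholders rather than the team's actual pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder task bodies; in practice these would call the real
# extract, validation, and load code.
def extract():
    print("pull source data")

def validate():
    print("run data quality checks")

def load():
    print("write curated output")

with DAG(
    dag_id="business_lending_daily_etl",  # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_validate = PythonOperator(task_id="validate", python_callable=validate)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_validate >> t_load
```

The explicit dependency chain makes the run order and failure points easy to reason about, and each task can be retried independently.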
Benefits & conditions
- Learn more at rabobank.jobs/en/faq
- A security check is part of the process
- We respect your privacy
Everyone is different, and it is exactly those differences that help us become an even better bank. That's why we want to know who you really are!