Dataiku
Job description
You will join a project team responsible for reconstructing data processing workflows whose outputs are used by Finance and Regulatory Reporting teams.
The objective is to combine Databricks (storage & compute) with Dataiku (data transformation), leveraging Dataiku Visual Recipes as much as possible.
You will work in a structured, performance-driven, and governance-oriented data environment.
Your Responsibilities
- Develop and maintain complex data transformation workflows in Dataiku following internal guidelines (Dev vs Automation node, low-code recipes, etc.)
- Leverage Databricks compute capabilities under the supervision of the Tech Lead to optimize performance and costs
- Rebuild and modernize SAS program functionalities into efficient ELT processes
- Ensure data quality, consistency, and integrity throughout the transformation lifecycle
- Participate in performance tuning and cost optimization initiatives
- Contribute to documentation and best practices implementation
- Provide knowledge transfer to the Project and Business teams that will take over operational ownership
- Collaborate with technical and non-technical stakeholders
Join Nexeo Belgium and contribute to a large-scale data transformation program in a modern cloud-based data environment.
You will work on cutting-edge technologies and play a key role in rebuilding strategic data processes within a structured and collaborative environment.
Requirements
Languages
- English: Fluent
- Dutch or French: Fluent (the other language is an asset)
Education
- Bachelor's or Master's degree in Computer Science, Engineering, Mathematics, or related field
Experience & Technical Skills
Mandatory
- Minimum 5 years of experience in SQL/SAS and Data Warehousing
- At least 2 years of proven experience with Dataiku (Dataiku Developer certification is a strong asset)
- Strong expertise in SQL and good knowledge of Python
- Solid experience in complex ELT processes and data transformation
- Experience with Databricks and distributed compute environments
- Good understanding of data platform governance and best practices
Nice to Have
- Experience in regulated environments (Finance, Energy, Utilities, etc.)
- Experience with AWS-based data architectures
Soft Skills
- Strong analytical mindset with attention to detail
- Autonomous and proactive
- Team player with strong collaboration skills
- Strong communication skills (technical & business stakeholders)
- Structured and solution-oriented