Data-Warehouse & Reporting Developer
Job description
We currently have a vacancy for an English-speaking Data-Warehouse & Reporting Developer, to offer his/her services as an expert based in Brussels, Belgium. The work will be carried out either on the company's premises or on site at customer premises. In the context of the first assignment, the successful candidate will be integrated into the company's Development team, which will closely cooperate with a major client's IT team on site.
- Develop, deploy, and maintain scalable and incremental data pipelines from REST APIs and databases using Python, PySpark, Azure Synapse, KNIME, SQL, and ETL tools to ingest, transform, and prepare data (see the ingestion sketch after this list);
- Process and transform complex JSON and GIS data into structured datasets optimized for analysis and reporting. This includes parsing, transforming, and validating JSON data to ensure data quality and consistency (see the flattening sketch after this list);
- Load, organize, and manage data in Azure Data Lake Storage and Microsoft Fabric OneLake, ensuring accessibility, performance, and efficient storage using lakehouse and Delta Lake patterns (see the Delta Lake sketch after this list);
- Document ETL processes, metadata definitions, data lineage, and technical specifications to ensure transparency and reusability;
- Implement data quality checks, logging, monitoring, and automated incremental load mechanisms within data pipelines to support maintainability, observability, and troubleshooting (see the quality-check sketch after this list).
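
For illustration, a minimal sketch of the kind of incremental, watermark-based ingestion from a REST API described in the first responsibility. The endpoint, field names, and state file are hypothetical placeholders, not details of the actual assignment.

```python
# Minimal sketch: watermark-based incremental ingest from a REST API.
# API_URL, the "modified_since" parameter, and field names are assumptions.
import json
import pathlib

import requests

STATE_FILE = pathlib.Path("last_watermark.json")  # persisted between runs
API_URL = "https://api.example.com/v1/records"    # hypothetical endpoint


def load_watermark() -> str:
    """Return the last successfully loaded timestamp, or a safe default."""
    if STATE_FILE.exists():
        return json.loads(STATE_FILE.read_text())["watermark"]
    return "1970-01-01T00:00:00Z"


def fetch_increment(watermark: str) -> list[dict]:
    """Page through records changed since the watermark."""
    records, page = [], 1
    while True:
        resp = requests.get(
            API_URL,
            params={"modified_since": watermark, "page": page},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json().get("items", [])
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records


if __name__ == "__main__":
    wm = load_watermark()
    rows = fetch_increment(wm)
    if rows:
        # Advance the watermark only to what was actually received,
        # so a failed run can safely be retried.
        new_wm = max(r["modified_at"] for r in rows)
        STATE_FILE.write_text(json.dumps({"watermark": new_wm}))
    print(f"Fetched {len(rows)} new/changed records since {wm}")
```

Persisting the watermark after a successful fetch keeps each run idempotent: re-running after a failure simply re-reads from the last committed point.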
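For the JSON-processing point, a hedged PySpark sketch of flattening nested JSON into a validated, tabular dataset. The schema (station_id, a nested readings array) and paths are illustrative assumptions.

```python
# Sketch: flatten nested JSON into a structured, validated dataset.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, explode

spark = SparkSession.builder.appName("json-flatten").getOrCreate()

raw = spark.read.json("raw/measurements/*.json")  # hypothetical landing path

flat = (
    raw
    # One output row per element of the nested "readings" array.
    .withColumn("reading", explode(col("readings")))
    .select(
        col("station_id"),
        col("reading.timestamp").alias("measured_at"),
        col("reading.value").cast("double").alias("value"),
    )
    # Basic validation: drop rows missing mandatory keys.
    .where(col("station_id").isNotNull() & col("measured_at").isNotNull())
)

flat.write.mode("overwrite").parquet("curated/measurements")
```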
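For the lakehouse point, a sketch of an idempotent incremental load into a Delta Lake table, a pattern that applies to both Azure Data Lake Storage and Fabric OneLake. The target path and key columns are assumptions, and the Spark session is assumed to be configured with the delta-spark package.

```python
# Sketch: idempotent incremental upsert into a Delta Lake table.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-upsert").getOrCreate()

updates = spark.read.parquet("curated/measurements")  # today's increment
target_path = "abfss://lake@account.dfs.core.windows.net/measurements"

if DeltaTable.isDeltaTable(spark, target_path):
    target = DeltaTable.forPath(spark, target_path)
    (
        target.alias("t")
        .merge(
            updates.alias("s"),
            "t.station_id = s.station_id AND t.measured_at = s.measured_at",
        )
        .whenMatchedUpdateAll()      # late-arriving corrections
        .whenNotMatchedInsertAll()   # genuinely new rows
        .execute()
    )
else:
    # First run: create the table from the initial batch.
    updates.write.format("delta").save(target_path)
```

MERGE rather than blind append is what makes re-runs safe: the same increment can be replayed without producing duplicates.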
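Finally, a minimal sketch of row-level quality checks with logging, as the last responsibility describes; the rule names and columns are illustrative assumptions.

```python
# Sketch: named data quality rules with logging inside a pipeline step.
import logging

from pyspark.sql import DataFrame
from pyspark.sql.functions import col

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl.quality")


def check_quality(df: DataFrame) -> DataFrame:
    """Apply named pass-predicates; log and drop rows failing any rule."""
    checks = {
        "station_id_present": col("station_id").isNotNull(),
        "value_non_negative": col("value") >= 0,
    }
    passing = df
    for name, predicate in checks.items():
        before = passing.count()
        passing = passing.filter(predicate)  # null results fail the check too
        dropped = before - passing.count()
        if dropped:
            log.warning("check %s: %d rows dropped", name, dropped)
    return passing
```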
Requirements
- University degree in IT or a relevant discipline, combined with a minimum of 17 years of relevant working experience in IT;
- Minimum 5 years of excellent knowledge of Azure Data Lake Storage and Oracle databases;
- Minimum 5 years of excellent expertise in developing data pipelines from REST APIs and in integration tooling (such as Azure Synapse, PySpark, Python, SQL, KNIME);
- Minimum 5 years of excellent expertise in processing JSON and GIS data;
- Minimum 2 years of experience with Microsoft Fabric and Microsoft Fabric OneLake;
- Experience designing incremental loads, CDC processes, and automated schema evolution (see the schema-evolution sketch after this list);
- Experience with CI/CD pipelines;
- Experience working in an Agile and Scrum framework;
- Excellent knowledge of working with REST APIs;
- Ability to implement robust data quality checks, logging, and monitoring in ETL processes;
- Ability to document ETL workflows, metadata, and technical specifications clearly and consistently;
- Familiarity with DevOps and version control best practices;
- The following certification is required: Microsoft Azure Data Engineer Associate;
- The following certifications are considered an asset: Microsoft Certified: Azure Solutions Architect Expert, Microsoft Certified: Azure Developer Associate, Microsoft Certified: Azure Database Administrator Associate;
- Excellent command of the English language; knowledge of French will be considered an asset.
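
As one concrete illustration of the incremental-load and schema-evolution requirement above: Delta Lake can evolve a table's schema on append via the mergeSchema option. The paths here are assumptions, and the session is assumed to be configured with the delta-spark package.

```python
# Sketch: automated schema evolution on append with Delta Lake.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("schema-evolution").getOrCreate()

# Hypothetical new batch whose schema adds columns to the existing table.
new_batch = spark.read.parquet("curated/measurements_v2")

(
    new_batch.write
    .format("delta")
    .mode("append")
    .option("mergeSchema", "true")  # add new columns to the table schema
    .save("abfss://lake@account.dfs.core.windows.net/measurements")
)
```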
Benefits & conditions
We offer competitive remuneration (either on a contract basis or salaried with a full benefits package), based on qualifications and experience. All applications will be treated as confidential.