Data Architect (Cloud)
Uni Systems
Brussels, Belgium
2 days ago
Role details
Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Job location
Brussels, Belgium
Tech stack
Artificial Intelligence
Airflow
Cloud Computing
Continuous Integration
Data Architecture
Data Governance
Data Infrastructure
ETL
Data Security
Data Systems
Elasticsearch
PostgreSQL
Metadata
MongoDB
Oracle Applications
SAP Applications
SAS (Software)
Systems Integration
Data Ingestion
Spark
Kubernetes
Luigi
Non-relational Database
Data Management
REST
Data Pipelines
DevSecOps
Job description
- Develop and update a data architecture strategy that adapts to evolving needs and accommodates both Business Intelligence and AI workloads.
- Design and implement vendor-agnostic cloud architectures.
- Design a modern scalable data platform to replace a large legacy data system in a phased approach.
- Align architectural decisions with data governance policies and the department's vision on cloudification.
- Establish and enforce data management policies and processes, including data quality, security and platform health monitoring.
- Ensure regulatory compliance and adherence to audit requirements.
- Provide guidance and mentorship to data analysts and data engineers.
- Facilitate change management by guiding colleagues and users through the migration process.
- Document and maintain data architecture and data assets in detail.
- Assist with deployment, configuration and testing of the system.
- Participate in meetings with other project teams.
Requirements
What do you need to succeed in this position?
- Master's degree in IT and at least 13 years of IT experience (or Bachelor's degree and at least 17 years of IT experience).
- Experience in migrating legacy data systems (SAP DataServices, SAS Data Integration) to a modern cloud-based, open-source data platform solution (preferably Data Lakehouse).
- Excellent knowledge of designing scalable, flexible, modern cloud-based and open-source data architectures.
- Experience with AI-powered assistants like Amazon Q for innovative data solutions design.
- Strong hands-on experience with Kubernetes.
- Previous experience with relational (PostgreSQL, Oracle) and non-relational (Elasticsearch, MongoDB) database systems.
- Experience with ETL/ELT processes and related data ingestion and transformation tools (like Spark, dbt, Trino).
- Proficiency in data pipeline orchestration tools (like Airflow, Dagster, Luigi).
- Knowledge of data governance frameworks and tools (like DataHub, OpenMetadata).
- Familiarity with data quality management, data security, access control and regulatory compliance.
- Proficiency with system-to-system integration via RESTful APIs.
- Experience with DevSecOps practices and tools related to data pipelines, including CI/CD for data infrastructure.
- Good knowledge of data modelling tools.
- Advanced English (C1) communication skills (written and spoken).