Data Solution Architect
JUVO
Anderlecht, Belgium
3 days ago
Role details
Contract type: Temporary contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: Dutch, English, French
Job location: Anderlecht, Belgium
Tech stack
Azure
Cloud Computing Security
Continuous Integration
Data Governance
ETL
Data Systems
Identity and Access Management
Performance Tuning
Cloud Services
Git
Data Lake
PySpark
Kubernetes
Collibra
Bicep
Terraform
Data Pipelines
Docker
Databricks
Job description
As a Data Solution Architect, you will design, implement, and optimize cloud-based data solutions using Azure and Databricks, including the migration of ETL jobs to Databricks. You will ensure that data architectures are scalable, secure, and aligned with enterprise standards.
Responsibilities
Azure Engineering & Architecture
- Hands-on configuration and setup of Azure services
- Environment provisioning, networking, IAM, and security
- Design of secure and efficient Azure architectures with the Enterprise Architect and the team
- Troubleshooting and optimization
Databricks Engineering
- Migration of ETL jobs to Databricks
- Redesign ETL/ELT processes in Databricks
- Configure clusters, workspaces, notebooks, and CI/CD
- Perform performance tuning and workflow optimization
Architecture & Best Practices
- Define end-to-end data architectures and standards with the Enterprise Architect and the team
- Ensure alignment with governance, lineage, and security
- Produce architecture documentation with the Enterprise Architect and the team
- Collaborate with cross-functional teams
Transition & Solution Design (Legacy → New Platform)
- Support the design and decision-making process during the transition from the existing platform to the new Azure/Databricks environment.
- Analyze the constraints of both platforms and propose pragmatic, step-by-step solutions.
- Participate in refinement to help define what to migrate, how, and in which sequence, ensuring alignment with both technical realities and business priorities.
- Design temporary or hybrid solutions when needed to guarantee a smooth and controlled transition.
Support & Collaboration
- Provide guidance on Azure and Databricks usage
- Refine, prioritize, and manage user stories, ensuring clear acceptance criteria and strong alignment with business and technical objectives.
- Actively participate in sprint reviews and planning sessions
- Fulfil product ownership responsibilities for the Data Platform, owning the product vision, roadmap, stakeholder communication, and value delivery.
- Mentor team members on best practices
Secondary Tasks & Responsibilities
- Support incident resolution related to data pipelines or cloud services
- Improve monitoring, observability, and alerting systems
- Contribute to CI/CD improvements for data pipelines
- Support data quality initiatives and validation frameworks
- Evaluate new tools or technologies
- Participate in enforcing coding standards
- Collaborate with security teams to ensure compliance
- Assist in cloud cost optimization and resource usage governance
- Provide backup support for other architects/engineers
Requirements
- Certification and strong hands-on experience with Azure (Key Vault, IAM, networking)
- Certification and strong hands-on experience with Databricks (clusters, Delta Lake, Unity Catalog, PySpark, notebooks)
- Experience in migrating ETL jobs to Databricks
- Solid knowledge of data lakehouse architectures
- Strong understanding of cloud security principles
- Hands-on experience with CI/CD and Git workflows
Nice to have skills
- Terraform or Bicep
- Docker or Kubernetes
- Data governance tools (e.g., Purview, Collibra)
Non-Technical Profile Requirements
- Communication & Collaboration
- Leadership & Ownership
- Problem-Solving & Analytical Thinking
- Team Spirit & Adaptability
- Professional Attitude
- Languages: French and/or Dutch (knowledge of the other language is a plus), plus very good English