Architect - Global Data Platform / Data Mesh
Serviceplan Group
München, Germany
Role details
Contract type: Permanent contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English, German
Experience level: Junior
Job location: München, Germany
Tech stack
Azure
Continuous Integration
Data Architecture
Data Engineering
Data Infrastructure
ETL
Python
Metadata Standards
Role-Based Access Control
Cloud Services
SQL Databases
Data Streaming
Software Version Management
Data Logging
Large Language Models
Prompt Engineering
PySpark
Information Technology
Data Pipelines
Databricks
Job description
- Supporting the design and further development of our Azure-based Global Data Platform (ingestion, storage, processing, serving).
- Contributing to architectural building blocks, reference architectures, and blueprints for domain teams.
- Contributing to the integration of MCP and LLM services, RAG architectures, and agentic AI patterns.
- Implementing and documenting technical components under the guidance of experienced architects.
- Supporting the development of data products based on Data Mesh principles.
- Contributing to ETL/ELT pipelines (ADF, Databricks, PySpark) as well as testing, versioning, and deployment.
- Ensuring adherence to naming conventions, metadata standards, and documentation requirements.
- Applying and further evolving existing platform standards (RBAC, logging/monitoring, CI/CD).
- Using and expanding self-service tools (catalog, glossary, lineage, quality frameworks).
- Participating in the standardization of LLM usage, prompt design, and RAG governance.
- Supporting architecture reviews, security assessments, and technical evaluations.
- Collaborating closely with domain teams, data engineers, product owners, and senior architects.
- Creating technical documentation, guidelines, and how-to guides, and actively contributing to the Community of Practice.
Requirements
- A completed degree in Computer Science, Data Engineering, Business Informatics, or a comparable qualification.
- Initial hands-on experience with cloud data platforms (ideally Azure) and with data engineering.
- Basic understanding of data architectures, modeling concepts, and modern data platforms.
- Foundational knowledge in at least two of the following areas: SQL, Python, Databricks, ADF, Azure Storage, CI/CD, Git, API design.
- Interest in Data Mesh, data products, domain ownership, and modern platform concepts.
- Understanding of topics such as security, access control, privacy-by-design, and data quality.
- A structured working style, strong motivation to learn, and enthusiasm for collaborating in international, interdisciplinary teams.
- Strong communication skills in English; German skills are a plus.