Senior Data Architect

Clearstream Europe AG
Frankfurt am Main, Germany
3 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Frankfurt am Main, Germany

Tech stack

Artificial Intelligence
Azure
Big Data
Data Architecture
Data Structures
Data Systems
Datadog
System Availability
Spark
Data Strategy
TOGAF
Collibra
Machine Learning Operations
Databricks

Job description

Your career at Deutsche Börse Group

Your area of work:

As a Data Architect at Clearstream Securities Services, you will play a key role in driving our digital transformation by shaping the data architecture that underpins strategic initiatives such as Data & AI, the D7 digital securities platform, the modernization of our Digital CSD, and our cloud adoption journey. Your work will ensure that Clearstream's data ecosystem is secure, scalable, and future-ready, enabling advanced analytics, regulatory compliance, and innovative services for global financial markets. By defining enterprise standards and collaborating across domains, you will help position Clearstream as a leader in digital finance and data excellence.

Your responsibilities:

  • Define the target-state cloud data architecture (Lakehouse) across BI, analytics, and ML, including reference architectures, patterns, and guardrails.
  • Lead domain-driven and Data Mesh architecture: curate federated data product standards, contracts, and interoperability guidelines.
  • Establish architecture decision records (ADRs) and technology roadmaps (near/mid/long term), aligning with enterprise data strategy and regulatory requirements.
  • Act as a trusted advisor to data scientists, analysts, and business product teams; unlock self-service patterns and reuse.
  • Lead onboarding, training, and enablement programs to scale platform and catalog adoption; raise data literacy across domains.
  • Mentor engineers and stewards; foster a culture of automation, reliability, and measurable outcomes.
  • Guide the design and configuration of Atlan workspaces, clusters, compute policies, and libraries; ensure high availability, scalability, and cost efficiency.
  • Own the enterprise metadata strategy (ingestion, enrichment, and stewardship), leveraging Atlan (or an equivalent) for catalog, lineage, glossary, and data contracts.
  • Define data quality frameworks (profiling, rules, SLAs, monitoring) using solutions like Soda/Great Expectations; embed DQ controls into pipelines.
  • Establish critical data element (CDE) standards, certification workflows, and KPIs; publish management reporting on catalog adoption and DQ outcomes.

Your profile:

  • 8-12+ years in data architecture/engineering within cloud big data platforms (GCP preferred), including enterprise and regulated environments.
  • Proven leadership in Databricks/Spark, Lakehouse architectures, and operationalizing Unity Catalog at scale.
  • Hands-on track record with metadata/catalog platforms (Atlan/Collibra) and data quality frameworks (Soda/Great Expectations), including enterprise rollout and adoption.
  • Demonstrated success aligning IT, business, compliance, and security to implement governed, high-quality data solutions.
  • Deep understanding of data modeling, data contracts, DQ SLAs, lineage, privacy/security, and observability (e.g., Azure Monitor/Datadog).
  • Working knowledge of ML lifecycle (MLflow, feature stores) and semantic/metrics layers for analytics products.
  • Executive-level communication, stakeholder engagement, and influence; comfortable with board-level reporting and audit dialogue.
  • Strong facilitator and coach; able to drive consensus across diverse technical and business teams.
  • Fluent in English; additional language skills are a plus.
  • Certifications (preferred): Azure Data Engineer/Architect; Databricks Data Engineer Professional/Architect; TOGAF; CDMP (DAMA); security/compliance certifications are advantageous.

Apply for this position