Senior Data Platform Engineer - Dusseldorf

Boston Scientific Corporation
10 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Dusseldorf

Tech stack

Artificial Intelligence
Airflow
Amazon Web Services (AWS)
Data analysis
Azure
Cloud Computing
Continuous Integration
Information Engineering
ETL
Distributed Data Store
GitHub
Identity and Access Management
Job Scheduling
Python
Key Management
Network Segmentation
Operational Data Store
SQL Databases
Data Streaming
Parquet
Data Logging
Data Classification
Snowflake
Spark
MTTR
Git
Kubernetes
Information Technology
Data Management
CloudWatch
REST
Terraform
Docker

Job description

You'll start your day by scanning platform health dashboards (pipeline success rates, latency, cost and capacity signals, SLO attainment, and data quality alerts), then actioning the highest-value improvements. You'll partner with data product managers and owners, data engineers, AI engineers, data scientists, and application teams to turn business needs into robust, reusable platform capabilities.

When EMEA-specific constraints (e.g., GDPR) surface, you'll collaborate with Security, Legal, and Governance to design compliant patterns without slowing delivery. Working closely with the Global team, you'll ship infrastructure-as-code, automate CI/CD for data workloads, and harden security so teams can build safely by default.

Throughout the week, you'll evolve ingestion and processing frameworks (batch, micro-batch, streaming), improve observability (logging, tracing, lineage), and coach teams on using the platform's self-service tooling and catalog. You'll run blameless incident reviews, remove toil, and continuously raise reliability, performance, and cost efficiency.
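
To make the "SLO attainment" and "MTTD/MTTR" signals mentioned in this posting concrete, here is a minimal illustrative sketch in Python (not the team's actual tooling; all function names are hypothetical):

```python
from datetime import timedelta

def slo_attainment(runs: list[bool]) -> float:
    """Success-rate SLI: fraction of pipeline runs that succeeded."""
    return sum(runs) / len(runs) if runs else 1.0

def mean_time_to_restore(outages: list[timedelta]) -> timedelta:
    """MTTR: average duration from failure detection to recovery."""
    if not outages:
        return timedelta(0)
    return sum(outages, timedelta(0)) / len(outages)

# Example: 97 of 100 daily runs succeeded against a 95% SLO target.
runs = [True] * 97 + [False] * 3
attained = slo_attainment(runs)   # 0.97
meets_slo = attained >= 0.95      # True

# Two outages, 30 and 90 minutes long, average to one hour of MTTR.
mttr = mean_time_to_restore([timedelta(minutes=30), timedelta(minutes=90)])
```

In practice these numbers would come from a scheduler's run history and an incident tracker rather than hard-coded lists; the arithmetic is the same.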

Each month, you'll contribute to global architecture forums to converge standards, reuse patterns, and ensure EMEA requirements are reflected in the enterprise roadmap.

Reporting to the Data Engineering & Platform Manager, the Data Platform Engineer builds and operates the cloud data platform that powers analytics, AI, and operational data products across the region. You will design secure, scalable, and observable platform services (compute, storage, processing, orchestration, quality, lineage, access), deliver them as well-documented, reusable capabilities, and support teams in adopting them at scale. Success requires deep engineering craft, platform thinking, and close collaboration across product, architecture, and governance.

Key responsibilities:

  • Platform engineering & operations: Design, build, and operate cloud data platform components (lake/lakehouse, warehouses, streaming, orchestration, catalogs) with strong SLIs/SLOs, automated recovery, and capacity planning.
  • Pipelines & frameworks: Provide reusable templates and libraries for ELT/ETL, CDC, and streaming; standardize patterns for schema evolution, testing, and deployments across domains.
  • Security & compliance by design: Implement least-privilege IAM, key management, encryption in transit/at rest, network segmentation, and data classification; ensure GDPR/data-residency adherence with auditable controls.
  • Observability & quality: Instrument end-to-end telemetry (logs/metrics/traces), lineage, and data quality checks; build dashboards and alerts to prevent regressions and reduce MTTD/MTTR.
  • Automation & IaC: Deliver platform resources with infrastructure-as-code; enable Git-based workflows, CI/CD for data workloads, and policy-as-code guardrails.
  • Cost efficiency: Monitor and optimize spend across compute and storage; set budgets and alerts; recommend right-sizing, workload scheduling, and caching/format strategies.
  • Collaboration & enablement: Document platform capabilities, publish examples and runbooks, and provide office hours/community support to drive safe self-service adoption.
  • Incident & change management: Lead RCAs; prioritize preventative engineering and change controls.
  • Standards & reusability: Contribute to reference architectures and shared components; champion interoperability (contracts, semantics) with architecture and governance peers.
  • Global alignment: Partner with global platform, product, and security teams to align roadmaps, share learnings, and represent EMEA needs.

What we're looking for:

  • An ownership mentality for reliability, performance, and cost, and the curiosity to automate everything that repeats.
  • Pragmatic engineering instincts: you ship secure, simple, and well-tested solutions that scale.
  • A passion for enabling others through great developer experience, documentation, and coaching.
  • Continuous learning in Data & AI platform capabilities (semantic layers, streaming, GenAI-assisted tooling) and how to apply them responsibly.
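
As a concrete (and purely hypothetical) illustration of the reusable data quality checks this role would build, a minimal check might look like the following sketch in Python; the function names and result shape are assumptions, not any specific framework's API:

```python
def check_not_null(rows: list[dict], column: str) -> dict:
    """Data-quality check: flag rows where `column` is missing or None."""
    bad = [i for i, row in enumerate(rows) if row.get(column) is None]
    return {"check": f"not_null:{column}", "passed": not bad, "failing_rows": bad}

def check_unique(rows: list[dict], column: str) -> dict:
    """Data-quality check: flag rows whose `column` value repeats an earlier row."""
    seen, dupes = set(), []
    for i, row in enumerate(rows):
        value = row.get(column)
        if value in seen:
            dupes.append(i)
        seen.add(value)
    return {"check": f"unique:{column}", "passed": not dupes, "failing_rows": dupes}

rows = [
    {"id": 1, "country": "DE"},
    {"id": 2, "country": None},
    {"id": 2, "country": "FR"},
]
results = [check_not_null(rows, "country"), check_unique(rows, "id")]
# Both checks fail: row 1 has a null country, and row 2 repeats id=2.
```

In a real platform such checks would be packaged as a shared library (or expressed as dbt tests), run inside the orchestrator, and feed the alerting described above.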

What do we offer:

  • A chance to shape the EMEA data platform foundations and set enterprise-wide standards.
  • A collaborative, global network and supportive coaching culture focused on your growth.
  • Opportunities to lead high-impact initiatives that accelerate analytics and AI adoption across diverse markets.

Requirements

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 5-8+ years in data engineering/platform engineering with production experience in cloud data stacks, with a strong focus on the Snowflake Data Cloud and AWS as primary platforms. Experience with Informatica, dbt, and Azure is highly valued.
  • Proficiency in Python and SQL; a strong grasp of distributed data processing (e.g., Spark on AWS EMR) and storage formats (Parquet/Delta/Iceberg).
  • Hands-on with Infrastructure as Code (e.g., Terraform), CI/CD (e.g., GitHub Actions/Azure DevOps), containers/orchestrators (e.g., Docker/Kubernetes), and job schedulers (e.g., Airflow/ADF).
  • Solid security engineering across AWS and Snowflake: IAM, secrets, encryption, and policy-as-code.
  • Observability mindset: metrics, tracing, logging, lineage, and data quality frameworks, ideally leveraging platform-native tools (e.g., AWS CloudWatch, Snowflake monitoring, Informatica data quality, dbt tests).
  • Working knowledge of governance and privacy (GDPR), retention, and access models, especially as implemented in Snowflake, AWS, and Azure.
  • Excellent communication and collaboration skills across product, engineering, and non-technical stakeholders. Experience in medtech/pharma/consulting is a plus.
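
By way of illustration of the "policy-as-code guardrails" mentioned in these requirements, here is a minimal sketch under assumed config keys, not any specific tool's (e.g., Terraform's or OPA's) API:

```python
# Minimal policy-as-code sketch: each policy is a predicate over a
# resource configuration dict; the guardrail reports every violation.
POLICIES = {
    "encryption_at_rest": lambda cfg: cfg.get("encrypted", False),
    "no_public_access": lambda cfg: not cfg.get("public", False),
    "data_classified": lambda cfg: cfg.get("classification")
        in {"public", "internal", "confidential"},
}

def evaluate(resource: dict) -> list[str]:
    """Return the names of policies this resource violates."""
    return [name for name, rule in POLICIES.items() if not rule(resource)]

# Hypothetical storage-bucket config: encrypted and classified, but public.
bucket = {"encrypted": True, "public": True, "classification": "internal"}
violations = evaluate(bucket)   # ["no_public_access"]
```

In practice such rules run in CI/CD against infrastructure-as-code plans, blocking a deployment before a non-compliant resource is ever created.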

About the company

As a leader in medical science for more than 40 years, we are committed to solving the challenges that matter most - united by a deep caring for human life. Our mission to advance science for life is about transforming lives through innovative medical solutions that improve patient lives, create value for our customers, and support our employees and the communities in which we operate. Now more than ever, we have a responsibility to apply those values to everything we do - as a global business and as a global corporate citizen. So, choosing a career with Boston Scientific (NYSE: BSX) isn't just business, it's personal. And if you're a natural problem-solver with the imagination, determination, and spirit to make a meaningful difference to people worldwide, we encourage you to apply and look forward to connecting with you!

Apply for this position