Fullstack Data and Platform Engineer

Integrity Next GmbH
München, Germany
11 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English

Job location

Remote
München, Germany

Tech stack

API
Artificial Intelligence
Airflow
Amazon Web Services (AWS)
Data analysis
Software as a Service
Cloud Computing
Continuous Integration
Data as a Service
Information Engineering
Data Infrastructure
ETL
Data Security
Data Sharing
Cursor (AI code editor)
Distributed Systems
Python
PostgreSQL
Performance Tuning
SQL Databases
Systems Integration
Unstructured Data
Data Processing
Data Ingestion
Event-Driven Architecture
Infrastructure Automation Frameworks
Deployment Automation
Terraform

Job description

At IntegrityNext, we are building a shared AI and data platform on AWS on top of our supply chain and product compliance platform. It will power semantic data access, BI, APIs, and agentic product experiences. The platform is initially centered on PostgreSQL-backed structured data and will evolve toward unstructured data and more lakehouse-style capabilities on AWS.

We work hands-on across Python-based data services, pipelines, asynchronous integrations, AWS infrastructure, and SQL/PostgreSQL-based processing. We follow the principle "You build it, you run it": platform capabilities are owned end-to-end, from design and implementation to operational responsibility in production. We also practice spec-driven development and actively use AI-assisted engineering tools such as Claude Code and Cursor.

This is a strongly hands-on engineering role focused on building and operating reliable data infrastructure for internal product and platform use cases.

What can you expect?

  • Build and operate the shared data infrastructure: Build and maintain the data infrastructure that powers the platform. Design and implement ingestion flows, ETL/ELT pipelines, and shared processing patterns that enable reliable and scalable data usage across the company.
  • Work on structured and evolving data foundations: Help evolve the platform's structured data foundation around PostgreSQL while supporting the path toward S3-backed and lakehouse-style capabilities over time.
  • Build event-driven and asynchronous integrations: Design and implement asynchronous and event-driven workflows using AWS-native patterns such as Amazon EventBridge. Ensure integrations are reliable, resilient, and maintainable in production.
  • Ensure trust and reliability in platform data: Implement data quality checks, validation rules, freshness controls, and monitoring so that semantic, API, and AI use cases can rely on trusted data inputs.
  • Support reusable data access: Help build curated datasets and shared access layers that support semantic models, API products, and AI-powered features.
  • Contribute to platform operations and delivery: Implement CI/CD for data workflows, maintain infrastructure as code, and support production readiness through monitoring, troubleshooting, and continuous improvement.
  • Work closely across the team: Collaborate with the Data & Platform Architect, the Semantic/Analytics Engineer, and AI engineers to ensure the platform data foundation is reliable, reusable, and aligned with downstream needs.
  • Explore and establish modern engineering methods: Work actively with spec-driven development and AI-assisted engineering workflows. Help establish practical ways of working that improve quality, speed, and maintainability.

Requirements

  • Strong hands-on data engineering skills: Strong practical experience with Python and SQL, and solid knowledge of PostgreSQL, including query writing, performance tuning, and production-grade data processing.
  • Experience with pipelines and transformation: Experience building data ingestion pipelines and ETL/ELT workflows in production environments. Experience with dbt or similar transformation tooling is expected.
  • Orchestration and workflow reliability: Experience with orchestration and pipeline scheduling, and a solid understanding of how to build reliable, maintainable data workflows.
  • Event-driven and distributed systems thinking: Experience with event-driven architecture, ideally including Amazon EventBridge or similar systems. Good understanding of idempotency, retries, dead-letter handling, and resilient distributed processing.
  • Cloud and platform engineering mindset: Experience with AWS services for data, compute, monitoring, and security, as well as CI/CD, infrastructure as code, and deployment automation.
  • Data quality and operational ownership: Experience with data quality, validation, observability, and operational reliability. You are comfortable owning data workflows in production, not just implementing them.
  • Collaborative and pragmatic communication: You work well with architects, semantic engineers, and upstream solution teams, communicate clearly, and help turn requirements into robust technical solutions.
  • AI-assisted and spec-driven mindset: You are comfortable working with structured specifications, clear contracts, and AI-assisted development workflows as part of day-to-day engineering.
  • Fullstack ownership mindset: You bring strong specialization in data and platform engineering while being willing to contribute across adjacent layers when needed to deliver end-to-end outcomes.
  • Fluent English: You can work effectively in an English-speaking environment.

Nice to have

  • Experience with Airflow, Dagster, or similar orchestration tools
  • Experience with S3-based analytical storage, query engines, and lakehouse-style evolution
  • Experience with Terraform or CDK
  • Experience with AWS services such as SQS, Lambda, and Step Functions
  • Experience with semantic platforms, reusable data products, or analytics engineering
  • Experience with unstructured data ingestion and hybrid architectures
  • Experience in multi-team SaaS environments

Our team

  • A professional, welcoming, and highly motivated team
  • Collaboration at eye level with an open feedback culture
  • An environment where people support each other and grow together

Benefits & conditions

  • A role with real meaning that is both enjoyable and impactful
  • The opportunity to make a sustainable contribution through your work
  • Attractive compensation as part of a growing company

Attractive Benefits

  • 30 days of paid vacation
  • EGYM Wellpass membership to support your work-life balance
  • Flexible working models to better balance work and personal life

About the company

IntegrityNext, a global leader in supply chain sustainability software, stands at the forefront of corporate sustainability and compliance. Since 2016, businesses have trusted IntegrityNext to simplify ESG compliance, reduce risks, and address critical challenges like due diligence, decarbonization, and sustainability reporting. With over 500 customers and 2 million suppliers across 190 countries, IntegrityNext is transforming supply chains into engines of transparency and sustainable growth. For more information, visit www.integritynext.com.

Apply for this position