Senior Data Engineer
Stravito
Amsterdam, Netherlands
2 days ago
Role details
Contract type: Permanent contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Experience level: Senior
Job location: Remote (Amsterdam, Netherlands)
Tech stack
SQL Data Warehouse
API
Artificial Intelligence
Amazon Web Services (AWS)
Automated Storage and Retrieval Systems
Test Automation
Azure
Google BigQuery
Software as a Service
Cloud Computing
Cloud Database
Computer Programming
Continuous Integration
Information Engineering
Data Infrastructure
Python
OAuth
Role-Based Access Control
Power BI
Security Assertion Markup Language (SAML)
Search Technologies
Single Sign-On
Data Streaming
Systems Integration
Tableau
TypeScript
Software Version Control
Document Metadata
Large Language Models
Snowflake
Indexer
Kotlin
Containerization
Kafka
Data Management
Vertica
Terraform
Looker Analytics
Data Pipelines
Docker
Redshift
Job description
The data platform underneath makes all of this possible: ingesting millions of documents, enforcing tenant-level security, and feeding the AI systems our customers rely on daily.
- Build the foundation: Own the data platform that powers AI products used daily by thousands of professionals at Fortune 500 companies
- Work with data that matters: Transform event streams, usage telemetry, and millions of proprietary market research documents into reliable, queryable intelligence
- Solve hard infrastructure problems: Design multi-tenant, secure-by-default pipelines and APIs that meet the compliance bar of the world's most demanding enterprise customers
- Own it end-to-end: This isn't a role where someone else defines the work. You'll scope work independently, dig into data quality issues, field requests from stakeholders, and drive things to completion.
As part of the platform team, you'll own a broad area: from building infrastructure to supporting the teams that depend on it. In any given week you might:
- Build and operate data pipelines that move event streams, document metadata, and usage data into our cloud data warehouse (ClickHouse Cloud, Snowflake, Azure)
- Design and maintain APIs for analytics and event extraction, with multi-tenant security baked in (RBAC, OAuth, SSO/SAML)
- Make data usable: whether that's modeling schemas for BI consumers, investigating a data quality issue, or helping a stakeholder understand what's possible with the data we have
- Keep things reliable and secure through automated tests, monitoring, and handling of sensitive data within SOC 2 and ISO 27001 environments
- Power our AI experiences by working with vector stores, indexing, and retrieval systems
- Drive engineering best practices across the platform: CI/CD, peer reviews, infrastructure as code, API versioning, and clear documentation
Requirements
- A track record of building data platforms in SaaS or cloud-native analytics environments
- Strong programming skills, with depth in at least one of Python or Kotlin and willingness to work across both. Experience with Rust or TypeScript is a plus
- Hands-on experience with MPP/cloud data warehouses (e.g., ClickHouse, Redshift, BigQuery, Snowflake, Azure Synapse) and cloud infrastructure on AWS or Azure
- Practical experience designing, consuming, and maintaining APIs
- Familiarity with multi-tenant security patterns: RBAC, row-level security, and identity standards such as OAuth and SAML/SSO
- Solid engineering fundamentals: CI/CD, automated testing, observability, and infrastructure as code (Terraform a plus)
- Working knowledge of data privacy requirements (PII handling, GDPR) and experience operating within compliance frameworks like SOC 2 or ISO 27001
Nice-to-haves
- Experience integrating with BI tools (Power BI, Tableau, Looker)
- Familiarity with semantic search, embeddings, or vector stores (e.g., Pinecone, pgvector)
- Exposure to event-driven or streaming architectures (Kafka, Kinesis, SQS/SNS)
- Experience with containerization (Docker, ECS/Fargate)
- Interest in leveraging LLMs and AI tooling to accelerate data engineering work
- dbt, SQLMesh, or similar transformation framework experience
About the company
Stravito transforms how Consumer Insights professionals and Brand Managers work by building AI that automates their core workflows. We help world-leading organizations across industries accelerate strategic decision-making by turning millions of market research documents into intelligent systems that generate reports, discover insights proactively, and synthesize knowledge across vast content libraries.
Join a remote-first, globally distributed team of 100+ professionals from 30+ nationalities, united by our core values: simplicity first, an 'own it, do it' mentality, embracing different perspectives, and enjoying the journey together. We bring everyone together for company events throughout the year to strengthen our global connections.
You'll grow alongside experienced colleagues with deep expertise across AI, market research, and enterprise systems. We offer exceptional career development opportunities in our fast-evolving market, competitive compensation, and a collaborative culture where everyone actively supports each other's success.
Most importantly, you'll have the satisfaction of simplifying the professional lives of thousands of Brand Managers and Consumer Insights professionals worldwide, making their strategic decisions more data-driven and impactful.