Data Engineer
Job description
As a Data Engineer in our new Data Products team, you will play a key role in shaping the quality and business value of our core data assets. You will be hands-on in designing, building, and maintaining the data pipelines that serve teams across Lighthouse. You will act as a bridge between our data and the business, collaborating with stakeholders and ensuring our data effectively enables its consumers.
Where you will have impact
- Become the expert for key data products, understanding the full data lifecycle, quality, and business applications.
- Design, implement, and maintain the streaming and batch data pipelines that power our products and internal analytics.
- Collaborate directly with data consumers to understand their needs, gather requirements, and deliver data solutions.
- Deliver improvements in data quality, latency, and reliability.
- Apply a product engineering mindset, focusing on delivering value and solving business problems through data.
- Mentor other engineers, sharing your expertise and contributing to their growth.
About our team
The Data Products Team is the definitive source of truth for Lighthouse's data, sitting at the foundational layer of our entire data ecosystem.
Our core mission is to model and deliver the high-quality, foundational data products that serve as essential ingredients for all downstream product features, machine learning models, and data science initiatives across the company:
- Data Modeling & Ownership: Defining and optimizing core data entities for product and analytical use.
- Pipeline Engineering: Building robust ETL/ELT pipelines to transform raw integrated data into trusted domains.
- Data Quality: Establishing standards and monitoring the health of all foundational data assets.
We are the core of the Lighthouse platform. We take ingested data from our Integrations teams, turn it into usable data products, and provide that clean output to both the Product Engineering and Data Science & Analytics teams. We rely on the Data Platform team for infrastructure support and tooling.
Requirements
- Experience in a data engineering role, with a proven track record of building scalable data pipelines.
- A product engineering mindset, with a focus on understanding business context and stakeholder needs.
- Professional proficiency in Python for data processing and pipeline development.
- Strong knowledge of cloud database solutions such as BigQuery, Snowflake, or Databricks.
- Excellent communication and stakeholder management skills.
We welcome
- Experience with microservice architectures and data streaming systems like Kafka or Google Cloud Pub/Sub.
- Familiarity with data governance or data quality tools such as Atlan or Soda.
- Experience mentoring other engineers.
Technologies you will work with
Primarily, but not limited to: GCP, Python, BigQuery, Kubernetes, Airflow, dbt, Terraform, Atlan (data governance tool), Soda.
Benefits & conditions
What's in it for you?
- Flexible working environment: Work from home or at one of our global offices.
- Flexible time off: Autonomy to manage your work-life balance.
- Alan Healthy benefits: 160€/month for food, transportation, or nursery.
- Wellbeing support: ClassPass subscription subsidized at up to 80%.
- Comprehensive health insurance: 100% Alan coverage for you, your spouse, and dependents.
- Collaborative team: High-bar, friendly, creative, and passionate colleagues.
- Career development: Workshops, frameworks, tools, training, and processes to realize your full potential.
- Impactful work: Shape products relied on by 85,000+ users worldwide.
- Competitive compensation: Proactively maintained so your pay reflects the value of your work.
- Referral bonuses: Earn rewards for bringing in new talent.