Senior Data Solution Architect
Job description
As our Senior Data Solution Architect, you will take ownership of our core data assets, treating them as products. Acting as the bridge between technical teams and product managers, you will translate business needs into a data roadmap. Blending technical expertise with a product mindset, you will design and implement the scalable data solutions that power our products, ensuring our data effectively empowers its consumers across Lighthouse.
Where you will have impact
- Take ownership of key data products, becoming the go-to expert for their entire lifecycle, quality, and business applications.
- Design, implement, and maintain the scalable streaming and batch data pipelines that power our products and internal analytics.
- Collaborate directly with data consumers to understand their needs, gather requirements, and deliver robust data solutions.
- Deliver improvements in data quality, latency, and reliability.
- Be driven by a product-first mindset, focusing on delivering value and solving business problems through data.
About our team
The Data Products Team is the definitive source of truth for Lighthouse's data, sitting at the foundational layer of our entire data ecosystem.
Our core mission is to model and deliver high-quality, foundational data products that are essential ingredients for all downstream product features, machine learning models, and data science initiatives across the company:
- Data Modeling & Ownership: Defining and optimizing core data entities for product and analytical use.
- Pipeline Engineering: Building robust ETL/ELT pipelines to transform raw integrated data into trusted domains.
- Data Quality: Establishing standards and monitoring the health of all foundational data assets.
We are the core of the Lighthouse platform. We take ingested data from our Integrations teams, turn it into usable data products, and provide that clean output to both the Product Engineering and Data Science & Analytics teams. We rely on the Data Platform team for infrastructure support and tooling.
Requirements
- Experience in a data engineering role, with a proven track record building scalable data pipelines.
- A product engineering mindset, with a focus on understanding business context and working closely with stakeholders to solve their data challenges.
- Professional proficiency in Python for data processing and pipeline development.
- Strong knowledge of cloud database solutions such as BigQuery, Snowflake, or Databricks.
- Experience with microservice architectures and data streaming systems like Kafka or Google Cloud Pub/Sub.
- Fluency in English, with excellent communication and stakeholder management skills.
We welcome
- Hands-on experience with our data stack, including dbt, Terraform, or Atlan.
Technologies you will work with
Mostly: GCP, Python, BigQuery, Kubernetes, Airflow, dbt, Terraform, Atlan (data governance tool), Soda
Benefits & conditions
What's in it for you?
- Flexible working environment: Work from home or at one of our global offices.
- Flexible time off: Autonomy to manage your work-life balance.
- Alan Healthy benefits: 160€/month for food, transportation, or nursery.
- Wellbeing support: ClassPass subscription subsidized up to 80%.
- Comprehensive health insurance: 100% Alan coverage for you, your spouse, and dependents.
- Collaborative team: High-bar, friendly, creative, and passionate colleagues.
- Career development: Workshops, frameworks, tools, training, and processes to realize your full potential.
- Impactful work: Shape products relied on by 85,000+ users worldwide.
- Competitive compensation: Proactively maintained to value your work.
- Referral bonuses: Earn rewards for bringing in new talent.