Junior Data Engineer
Job description
As a Junior Data Engineer, you will be a key player in ensuring the stability and reliability of our data integrations. This role places you at the heart of our technical operations, where you will learn to maintain existing integrations and grow into a go-to contributor for resolving the issues that arise. You will dive deep into technical problem-solving, debug data pipelines, and communicate with internal teams such as Customer Care to ensure our systems run smoothly.
Where you will have impact
- Maintain and enhance existing data integrations, ensuring the high reliability and quality of data from our external partners.
- Contribute to the investigation and resolution of technical issues, performing deep-dive analysis into our data and systems to identify the root cause of problems (see the sketch after this list).
- Communicate effectively with internal teams, such as Customer Care, to provide clear and timely updates on integration issues.
- Collaborate with your team to troubleshoot integration issues and learn the intricacies of our data ecosystem.
- Proactively identify and contribute to improvements in our processes, tooling, and documentation to build a more scalable and efficient function.
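The deep-dive analysis mentioned above often starts with a query against the warehouse. Here is a minimal sketch, assuming a hypothetical BigQuery project, table, and columns (none of these are Lighthouse's actual resources), of a first-pass check that flags partners whose daily row counts dropped sharply versus a week earlier:

```python
# A minimal sketch of a first-pass root-cause query: which partners sent far
# fewer rows today than a week ago? All project/table/column names below are
# hypothetical placeholders. Uses the google-cloud-bigquery client library.
from google.cloud import bigquery

client = bigquery.Client()

query = """
SELECT
  partner_id,
  COUNTIF(DATE(ingested_at) = CURRENT_DATE()) AS rows_today,
  COUNTIF(DATE(ingested_at) = DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)) AS rows_week_ago
FROM `example-project.rates.hotel_rates`  -- hypothetical dataset and table
GROUP BY partner_id
HAVING rows_today < 0.5 * rows_week_ago   -- flag partners down more than 50%
ORDER BY rows_week_ago DESC
"""

for row in client.query(query).result():
    print(f"{row.partner_id}: {row.rows_today} rows today vs {row.rows_week_ago} a week ago")
```

A shortlist like this narrows the investigation to a specific partner feed and the pipeline stages that process it.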
About our team
The Data Engineering team is the gatekeeper of Lighthouse's vast datasets. Our mission is to reliably transform, integrate, and store the numerous data sources that power our BI products, from web-scraped data to direct API integrations. We handle a staggering amount of information: our main dataset holds over 3 trillion hotel rates, and we process over 100TB of data daily. You'll join a highly talented and collaborative group of 8 data engineers.
Requirements
- You are eager to learn and enjoy the challenge of debugging complex systems and ensuring their stability and reliability.
- Good knowledge of Python and SQL, with a demonstrated affinity for working with data.
- A meticulous approach to development, with an ability to identify edge cases and potential issues.
- An interest in building and maintaining data processing pipelines, possibly from university projects or internships.
- You are a collaborative team player, eager to learn and grow, with the ability to take ownership of your tasks.
- Excellent communication skills in English, with an ability to explain technical issues to non-technical internal teams.
We welcome
- Hands-on experience with a major cloud platform such as GCP, AWS, or Azure.
Technologies you will work with
Python, SQL, Google Cloud Pub/Sub, Bigtable, BigQuery, Dataflow, Kubernetes
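To give a feel for how these pieces typically fit together: Dataflow runs pipelines written with Apache Beam. Below is a minimal sketch, with hypothetical subscription, table, and schema names, of a streaming job that reads rate events from Pub/Sub and writes them to BigQuery:

```python
# A minimal sketch of a Dataflow-style streaming pipeline using Apache Beam:
# Pub/Sub in, BigQuery out. The subscription, table, and schema below are
# hypothetical placeholders, not Lighthouse's actual resources.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_event(message: bytes) -> dict:
    # Pub/Sub delivers raw bytes; decode one JSON rate event into a row dict.
    return json.loads(message.decode("utf-8"))

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadRates" >> beam.io.ReadFromPubSub(
            subscription="projects/example-project/subscriptions/hotel-rates")
        | "ParseJson" >> beam.Map(parse_event)
        | "WriteRates" >> beam.io.WriteToBigQuery(
            "example-project:rates.hotel_rates",
            schema="hotel_id:STRING, rate:FLOAT64, observed_at:TIMESTAMP")
    )
```

The same code runs locally on the DirectRunner for testing; pointing the pipeline options at the DataflowRunner (with a project, region, and temp location) is what moves it onto Dataflow.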
Benefits & conditions
What's in it for you?
- Flexible time off: Autonomy to manage your work-life balance.
- Career development: Workshops, frameworks, and training sessions to maximize your professional potential.
- Impactful work: Shape products relied on by 85,000+ users worldwide.
- Mobility options: Mobility budget or company car (based on your job category).
- Net allowance: Support for home office related expenses.
- Vouchers: Lunch vouchers & Eco vouchers.
- Comprehensive health insurance: Extensive coverage for you and your dependents.
- Pension funding: Group insurance to secure your future.
- Referral bonuses: Earn rewards for bringing in new talent.