Data Platform Engineer - Remote
Job description
As an experienced Data Platform Engineer (d/m/w) with several years of professional experience, you and your teammates will find ways to develop the data infrastructure further at scale while driving real-time interaction with our data ecosystem. This requires communication with other teams and a structured, strategically far-sighted approach: you recognize patterns in the feature teams' questions, cluster them, provide input for the appropriate system design, and thus drive automation in the product. The common North Star: an actively practiced feedback culture for even better data-driven decision making.
Of course, you can work flexibly from home. Since we also value face-to-face meetings on-site, we will meet in our office in Freiburg im Breisgau when necessary.
- You support initiatives to connect previously isolated data areas and drive the development and expansion of a scalable data infrastructure.
- You will develop concepts for the real-time storage and processing of customer data in the data infrastructure.
- You will set up ETL pipelines for the collection, storage, maintenance, processing, enrichment, and transfer of data.
- You will enable teams to explore new automation solutions, e.g. by running benchmarks with recent LLMs.

We rely on team-based recruiting: our team members conduct the interviews themselves. We look forward to getting to know you.
- Molly likes to keep an eye on systems so they stay stable, efficient, and running smoothly. Working remotely from Berlin, she spends her off hours keeping up with one very curious seven-year-old.
- Friederike supports the team members in their successful collaboration and further development. Outside of work, she enjoys jogging and cycling.
- Jiri has a soft spot for data ops and customer-centric solutions to big data challenges. On weekends, he spends his time mushroom picking and following the latest trends in NLP.
- Andreas is enthusiastic about improving our customers' automation experience through data-driven approaches. Off the clock, he enjoys spending time on the tennis court or exploring new culinary creations.
Requirements
- You have experience with Terraform and hold a Master's degree.
- You are familiar with cloud environments (AWS, Google Cloud, Azure) and common data engineering tools (e.g. for orchestrating data pipelines via ETL).
- Your understanding of SQL, Python, and infrastructure-as-code tools (e.g. Terraform, CloudFormation) will help you implement new features, maintain existing ones, and enable internal customers to overcome obstacles on their way to insights.
- Practical experience with big data frameworks (e.g. Hadoop, Spark, Airflow) is a big plus. Ideally, you are also familiar with event-based data storage and have already applied this concept (e.g. Apache Kafka, AWS Kinesis or similar).
- Ideally, you have experience with collaborating on proof-of-concept solutions and benchmarks.
- You speak English fluently; it is the working language in the team. Knowledge of German is also very helpful for communicating with the surrounding teams.