Data Engineer - Intermediate Level
Job description
The Data Engineering team enables and manages the ingestion of low-latency, high-volume car telemetry data that powers our engineering and data science teams to build smart, insightful products. We are looking for an experienced Data Engineer with specific expertise in Java to join the team, playing a key role in the design, development, implementation and documentation of large-scale, distributed data applications, systems and services. You will help engineer the data pipelines that enable our vehicles to communicate with the cloud. The features you build will power driving experiences across the world.
What you will do:
- Work closely with the Data Engineering Lead, Senior Engineers and the Product team to deliver features to customers, thriving as a creative thinker who can break away from conventional solutions.
- Bring modern principles, techniques and technology to the team, raising software quality, value and delivery.
- Work within engineering best practices.
- Implement, and maintain complex data engineering solutions to acquire and prepare data. Create and maintain data pipelines to connect data within and between data stores, applications and organisations.
- Design, code, verify, test, document, amend and refactor complex programs/scripts and integration software services.
- Apply agreed standards and tools to achieve well-engineered outcomes.
- Work side-by-side with other talented engineers in a team-oriented, agile software engineering environment.
- Love writing code and constantly learn to hone your craft as an engineer.
- Work closely with product owners to shape and deliver features to customers.
Our Tech Stack:
Please note that you do not need to be familiar with all of these; we acknowledge that in technology there is always a learning curve. Our key requirements are Java 11+, Spring and Kafka, but you would get exposure to the following:
- Cloud Providers (primarily AWS, although we still have some legacy services running on Azure)
- Languages (Java 11+, Kotlin (legacy))
- Messaging (Kafka, Pulsar - slowly migrating back to Kafka)
- Deployment Environment (Kubernetes (EKS))
- Frameworks (Spring, Apache Flink, Kafka Streams)
- Apache Storm (mostly legacy)
- Repositories and CI/CD (GitLab, GitLab CI/CD)
- Data stores (MongoDB)
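To give a flavour of the kind of work the role involves, here is a toy sketch of a per-vehicle telemetry aggregation in plain Java. The record name, fields and values are purely illustrative (not our real schema or stack); in practice this shape of grouped aggregation would be expressed as a Kafka Streams or Flink job.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Toy sketch of telemetry processing; names and fields are illustrative only.
public class TelemetryRollup {

    // A single car telemetry reading: vehicle id and speed in km/h.
    record Reading(String vehicleId, double speedKmh) {}

    // Group readings by vehicle and compute the average speed per vehicle -
    // the same shape of keyed aggregation you would write in Kafka Streams or Flink.
    static Map<String, Double> averageSpeedByVehicle(List<Reading> readings) {
        return readings.stream()
                .collect(Collectors.groupingBy(
                        Reading::vehicleId,
                        Collectors.averagingDouble(Reading::speedKmh)));
    }

    public static void main(String[] args) {
        List<Reading> batch = List.of(
                new Reading("car-1", 50.0),
                new Reading("car-1", 70.0),
                new Reading("car-2", 30.0));
        // Prints one average speed per vehicle (map ordering is not guaranteed).
        System.out.println(averageSpeedByVehicle(batch));
    }
}
```

The batch version above uses `java.util.stream` only so it runs on a bare JDK (16+ for records); the streaming equivalent keys the topic by vehicle id and maintains the aggregate in a state store.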
Requirements
- Strong programming experience in Java (11+) and a sense of ownership and pride in your code; make us believe you will excel
- Experience with testing frameworks such as JUnit 5, Mockito or Spring Integration
- Expertise in one of the major real time data processing frameworks, such as Flink or Kafka Streams
- Experience of building event-driven and/or streaming data services; IoT domain experience would be great but is not essential
- Strong database skills and experience are required; we use both NoSQL and relational databases, often with large data volumes.
- Strong grasp of data modelling concepts and principles, with extensive experience of building data architectures that consolidate multiple complex sources
- Experience of modern software and data engineering patterns, including those used in highly scalable, distributed, and resilient systems.
- Knowledge of and experience working with APIs (designing with OpenAPI is desirable) and web services; CI/CD pipelines and automated testing (BDD, performance, security); Kubernetes and cloud-native practices; and containerised workloads with tools such as Docker
- Experience developing and delivering systems on at least one major public cloud provider, preferably AWS
- Passion for agile practices, DevSecOps, incremental delivery and continuous improvement, and the ability to cultivate a strong team culture
- A self-starter - someone who reaches out to other teams as needed to find answers and fosters an agile environment
- Willingness to get involved in problem resolution and in initiatives that smooth the operational maintenance of production services spread across geographical boundaries
- We think the knowledge acquired earning a BS in Computer Science, Engineering, Mathematics or a related field would be of excellent value in this position, but if you are smart and have the experience to back up your abilities, then for us, talent trumps a degree every time