Staff Data Platform Engineer
Job description
Checkout.com is looking for an ambitious Staff Data Engineer to join our Data and AI Platform Team. Our team's mission is to build a platform where you can create reliable, scalable, AI-powered streaming and batch data applications, and share data across Checkout.com to improve business performance.
The Data and AI Platform team is here to ensure internal stakeholders can easily collect, store, process and utilise data to build AI use cases, reports, and products that solve business problems. Our focus is on maximising the time business stakeholders spend solving business problems and minimising the time spent on the technical details of implementing, deploying, and monitoring their solutions.
We're building for scale. As such, much of what we design and implement today is the technology and infrastructure that will serve hundreds of teams and petabyte-level volumes of data.
Responsibilities
- Work with stream processing technologies (Kafka & Flink) to build a continuously available, large-scale event streaming platform (see the sketch after this list)
- Leverage subject-matter and technical expertise to provide leadership, mentoring, and strategic influence across the organisation whilst building strong relationships with engineers and managers
- Build tooling (modules/SDKs/DSLs) and associated documentation to foster the adoption of the streaming platform by enabling upstream teams and systems to easily publish data and deploy streaming applications
- Implement all the necessary infrastructure to enable end users to build, host, monitor and deploy their own streaming applications
- Provide consultancy across the technology organisation to drive the adoption of the platform and unlock event-driven use-cases
- Participate in requirements gathering and architecture/design initiatives, and translate them into concrete action plans
- Provide hands-on support for all event-based systems including incident triage and root cause analysis
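To give a flavour of the kind of streaming application the platform is meant to host, here is a minimal sketch of a Flink job that consumes events from Kafka. It is illustrative only: the topic name, bootstrap servers, consumer group, and class name are assumed placeholders, not details of Checkout.com's actual setup.

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class PaymentEventsJob {
    public static void main(String[] args) throws Exception {
        // Entry point for the Flink runtime; in a platform setting, configuration
        // would normally come from shared tooling rather than be hard-coded.
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Consume raw events from a Kafka topic (placeholder names).
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("kafka:9092")
                .setTopics("payment-events")
                .setGroupId("payment-events-job")
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        DataStream<String> events =
                env.fromSource(source, WatermarkStrategy.noWatermarks(), "payment-events-source");

        // A trivial transformation standing in for real business logic.
        events.map(String::toUpperCase).print();

        env.execute("payment-events-job");
    }
}
```

In practice, jobs like this would be packaged, deployed, and monitored through the platform's own tooling rather than run by hand, which is exactly the infrastructure this role helps build.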
Requirements
- Strong presentation and communication skills with a proven track record of influencing engineering organisations
- Strong engineering background with a track record of implementing and owning event streaming platforms
- Hands-on experience working with stream technologies, primarily Kafka, but also Kinesis (see the producer sketch after this list)
- Experience designing and implementing stream processing applications with Flink
- Experience working with cloud-based technologies such as AWS (MSK, S3, Lambda, ECS, SNS)
- Experience with Kubernetes (either self-hosted or on the cloud)
- Experience with SQL databases
- Experience working with Docker, container deployment and management
- Experience with infrastructure as code (Terraform or similar) as well as designing and implementing CI/CD pipelines
- Excellent programming skills in at least one of Java, Python, or C#
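As an illustration of the hands-on Kafka work referenced above, below is a minimal Java producer sketch. The broker address, topic, key, and payload are assumed placeholders rather than anything specific to Checkout.com's platform.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class EventPublisher {
    public static void main(String[] args) {
        // Broker address, topic, and payload are illustrative placeholders.
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // acks=all waits for the full in-sync replica set, trading latency for durability.
        props.put(ProducerConfig.ACKS_CONFIG, "all");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Keying by an entity id keeps related events on the same partition, preserving order.
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("payment-events", "payment-123", "{\"status\":\"captured\"}");
            producer.send(record, (metadata, error) -> {
                if (error != null) {
                    error.printStackTrace();
                } else {
                    System.out.printf("Published to %s-%d@%d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
        }
    }
}
```

The delivery callback and the acks setting hint at the kind of reliability and ordering trade-offs this role deals with when helping teams publish events at scale.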