Senior Data Engineer
Job description
The ICOS Development team within our Index business is responsible for the application development and maintenance of key components of the index calculation platform. You will strengthen the ICOS Development team by designing and implementing serverless, event-driven, and microservices-based architectures on GCP. This includes developing and maintaining cloud-native applications, data processing pipelines, and APIs using services such as Cloud Functions, Cloud Run, BigQuery, Pub/Sub, and Cloud Storage. You will play a key role in evolving system and data architecture within a complex business process and data warehouse environment, ensuring solutions are scalable, resilient, and cost-efficient.
Tasks/Responsibilities
- Designing and developing Python-based services and FastAPI-driven APIs in a GCP environment
- Building and maintaining serverless applications and event-driven integrations
- Implementing and optimizing data architectures leveraging BigQuery and other managed GCP services
- Contributing to CI/CD pipelines and infrastructure-as-code practices
- Implementing complex business rules and index concepts
- Mentoring and coaching team members in Python best practices, cloud-native design on GCP, and serverless architecture patterns, fostering high code quality and scalable solution design
- Maintaining and designing a complex, challenging data warehouse environment
- Writing technical design documentation
- Designing and executing test strategies
- Providing first- and second-line application support, primarily for the ICOS systems
Requirements
Do you have experience with Terraform?
The ideal candidate will bring deep expertise in Python development, hands-on experience with Google Cloud Platform (GCP), and a strong understanding of serverless architecture patterns. While familiarity with Oracle PL/SQL is beneficial, the primary focus of this role is designing and building scalable, cloud-native solutions in Python within a modern data and application ecosystem.
- Python mastery: Deep understanding of design patterns, OOP, and Pythonic code
- FastAPI expertise: 4+ years building high-performance APIs
- Data processing: Expert-level Pandas, Polars, and NumPy, including optimization techniques for large datasets
- Cloud platforms: Hands-on experience with GCP (preferred), AWS, or Azure, including containerization and Cloud Functions/serverless compute
- Database proficiency: Excellent Oracle PL/SQL and general RDBMS experience
- Scheduling: Experience with Control-M or other scheduling tools
- ETL knowledge: Exposure to Informatica PowerCenter
- Business domain: Indexing knowledge beneficial
- Deployment: Experience with CI/CD pipelines and infrastructure as code (Terraform)
- High motivation to support and operate a complex, challenging data warehouse environment
- Excellent analytical skills
- Proficiency in written and spoken English
Additionally, you should be a team player with good communication skills, highly motivated, flexible, and willing to learn and adopt new technologies quickly.
#LI-SQ1 #DIRECTOR #IT #STOXX