Data Engineer
Role details
Job location
Tech stack
Job description
* development workflows, testing procedures, and deployment cycles.
* Monitor the health and performance of data applications, pipelines, and databases, utilising observability tools including Prometheus, Grafana, and CloudWatch for proactive issue resolution.
* Collaborate closely with stakeholders across technical and business domains to deliver reliable data services that enable analytical product development and informed decision-making.
* Contribute actively to agile ceremonies such as stand-ups, sprint planning sessions, and retrospectives to align priorities with delivery goals within the engineering team.
* Support the implementation of data engineering best practices by sharing knowledge with colleagues and participating in code reviews to uphold quality standards.
* Troubleshoot complex issues related to data integrity, system performance, or infrastructure reliability while ensuring compliance with regulatory requirements.
* Continuously seek opportunities for process improvement by evaluating emerging technologies or methodologies that can enhance the overall effectiveness of the data engineering function.
Requirements
Qualifications
* Demonstrated professional experience in data engineering or a closely related field, with a proven ability to deliver scalable solutions in production environments.
* Proficiency in Python (including frameworks such as Pandas, Dask, or PySpark) alongside strong SQL skills for effective manipulation of large datasets.
* Hands-on experience building API-driven data platforms using technologies like FastAPI to facilitate seamless integration between systems.
* Extensive background working with AWS (or GCP) cloud services, Snowflake for data warehousing, Kubernetes for container orchestration, and Airflow for workflow management.
* Solid understanding of monitoring and alerting systems such as Prometheus, Grafana, or CloudWatch to ensure system reliability and rapid incident response.
* Expertise in ETL processes and event-streaming architectures, utilising tools like Kafka or Flink for real-time analytics applications.
* Comfortable operating in Linux environments, using command-line tools for system administration and troubleshooting.
* Excellent communication skills enabling effective collaboration with both technical teams and business stakeholders within an agile framework.
* Bachelor's degree in Computer Science, Engineering, Mathematics, or another relevant technical discipline providing strong theoretical foundations.
What's next
If you are ready to take your career in data engineering to new heights within a globally respected organisation committed to inclusion and excellence, this is your moment. Apply today via the link provided and join an inspiring team where your contributions will shape the future of market data analytics.