Senior Data Engineer
Job description
Senior Data Engineer - London
Hybrid working: 3 days a week in the office
Salary: £70,000-£90,000 + bonus and benefits

About the Opportunity
We are partnering with a leading global trading organisation that is transforming into a data-driven business. With a strong footprint in European energy markets, this company is building a modern data platform to unlock insights and create value from complex datasets. If you're passionate about data engineering and want to work on cutting-edge solutions in a dynamic, collaborative environment, this role is for you.

The Role
As a Senior Data Engineer, you will play a key role in designing and implementing a modern data platform based on data lakehouse principles. You'll collaborate with a talented team to deliver scalable, secure, and high-performance solutions that power analytics and trading decisions.

Key Responsibilities
- Drive the implementation of a modern data platform tailored to business needs.
- Design and build data storage solutions using open standards (e.g., Parquet, Apache Iceberg).
- Develop scalable ingestion and transformation workflows with observability and logging.
- Implement data lineage tracking and enforce data quality standards.
- Collaborate on data modeling for reporting and advanced analytics.
- Ensure compliance with governance and security best practices.
- Stay ahead of emerging technologies and propose improvements.
- Mentor team members and contribute to architectural decisions.
- Support monitoring and incident resolution for the data platform.

Requirements
Required Skills & Experience:
- Hands-on experience with data lakehouse implementations (vendor or open source).
- Strong knowledge of Parquet, Apache Iceberg, and data modeling strategies.
- Expertise in Spark or a similar distributed data processing framework.
- Proficiency in SQL and Git-based workflows.
- Understanding of cloud-based architectures and object storage (e.g., Azure ADLS Gen2).

Desirable:
- Experience with Trino, Dremio, Databricks, or Microsoft Fabric.
- Familiarity with Kubernetes, Airflow/Dagster, and streaming platforms such as Kafka.
- Knowledge of ClickHouse and observability tools.
- Strong grasp of data governance and compliance practices.