Senior Data Engineer
Job description
We are seeking a Senior Data Engineer to help evolve and enhance our data platform and capabilities. You'll work on building robust, scalable, and intelligent data systems that power real-time services, insights, and decisions across Dotdigital.
As a Senior Data Engineer, you will be responsible for delivering key data platform features, designing and building data pipelines, and collaborating with product, analytics, and data science teams to unlock the value of data. You will have the opportunity to influence technology choices, optimise processes, and ensure the consistent, reliable, and secure movement and storage of data across the organisation.
Responsibilities:
- Design and build scalable, reliable, and secure data pipelines for streaming, batch, and real-time processing.
- Partner with the Data Science teams to build scalable PySpark workloads that power advanced models.
- Implement and optimise data models and storage solutions using Python and SQL with orchestration tools in a cloud environment.
- Apply AI to automate data processing and engineering workflows.
- Advocate for and uphold best practices in data governance, security, and monitoring.
- Collaborate cross-functionally with engineers, analysts, and data scientists to deliver impactful data solutions.
- Mentor and support junior engineers in data engineering principles and practices.
- Evaluate and recommend new tools and technologies to strengthen data services.
Requirements
Technical Expertise
- Significant experience delivering Python-based projects for data engineering.
- Experience building and tuning Spark pipelines that run at scale across large volumes of data.
- Strong hands-on experience with SQL and NoSQL databases (e.g. SQL Server, MongoDB, Cassandra).
- Proven experience with modern data warehousing and large-scale processing (e.g. Snowflake, dbt, BigQuery, ClickHouse).
- Proficient with data orchestration tools such as Airflow, Dagster, or Prefect.
- Experience with cloud platforms (Azure, AWS, or GCP) for data processing and storage.
- Practical experience with Kafka or equivalent event-driven architectures (e.g. AWS SQS, Azure Event Hubs, AWS Kinesis).
- Good understanding of data modelling for OLAP and OLTP workloads.
- Familiarity with agile methodologies and CI/CD processes in the context of data solutions.
Engineering
- Experienced as a senior team member on complex data engineering projects.
- Able to design and optimise data structures for high-volume systems.
- Experienced in supporting data platform modernisation or migration to the cloud.
- Takes initiative to solve challenging data issues and drive projects forward.
Bonus
- Experience using ClickHouse as part of a data pipeline and analytics solution.
- Experience using Databricks or similar data platforms.
Benefits & conditions
- Medical benefits
- Paid sick leave
- Dotdigital day
- Share reward
- Wellbeing reward
- Wellbeing days
- Loyalty reward