Principal Data Engineer
Job description
We are on the lookout for a Principal Data Engineer to help define and lead the next generation of our data platform and data capabilities. You'll play a key role in building scalable, resilient and intelligent data systems that power real time services, insights, products and decisions across Dotdigital.
As a Principal Data Engineer, you will be instrumental in driving the architecture, development and delivery of our data platform. You will lead key initiatives, provide technical direction and collaborate with product, analytics and data science teams to ensure data value is realised across the entire ecosystem. Working across the entire data lifecycle, you will help shape how data is collected, processed and consumed across Dotdigital.
- Lead the design and implementation of scalable, secure and resilient data systems across streaming, batch and real-time use cases.
- Architect data pipelines, modelling and storage solutions that power analytical and product use cases, primarily using Python and SQL via orchestration tooling that runs workloads in the cloud.
- Leverage AI to automate both data processing and engineering processes.
- Assure and drive best practices relating to data infrastructure, governance, security and observability.
- Work with technologists across multiple teams to deliver coherent features and data outcomes.
- Support the data team in adopting data engineering principles.
- Identify, validate and promote new tools and technologies that improve the performance and stability of data services.
Requirements
Technical Expertise
- Extensive experience delivering Python-based projects in the data engineering space.
- Extensive experience working with SQL and NoSQL database technologies (e.g. SQL Server, MongoDB & Cassandra).
- Proven experience with modern data warehousing and large-scale data processing tools (e.g. Snowflake, dbt, BigQuery, ClickHouse).
- Hands-on experience with data orchestration tools such as Airflow, Dagster or Prefect.
- Experience using cloud environments (e.g. Azure, AWS, GCP) to process, store and surface large scale data.
- Experience using Kafka or similar event-based architectures (e.g. Pub/Sub via AWS SQS, Azure Event Hubs, AWS Kinesis).
- Strong grasp of data architecture and data modelling principles for both OLAP and OLTP workloads.
- Proficient across the wider software development lifecycle, including agile ways of working and continuous integration/deployment of data solutions.
Engineering Leadership
- Experience as a Lead or Principal Engineer on large-scale data initiatives or product builds.
- Demonstrated ability to architect data systems and data structures for high volume, high throughput systems.
- Proven experience leading data platform modernisation or cloud migration projects.
- Comfortable taking ownership of difficult data problems and driving them to resolution.
Bonus
- Experience using ClickHouse as part of a data pipeline and analytics solution.
- Experience using Databricks or similar data platforms.
Benefits & conditions
- UK; London; Birmingham; Manchester; Glasgow; Liverpool; Leeds; Edinburgh; Bristol; Cardiff; Nottingham
- Contract
- Competitive
- Parental leave
- Medical benefits
- Paid sick leave
- Dotdigital day
- Share reward
- Wellbeing reward
- Wellbeing Days
- Loyalty reward