Lead Data Engineer
Job description
We are seeking a Lead Data Engineer to join our Marketing Intelligence & Reporting team, responsible for overseeing our data engineering function and its associated workstreams across Omnicom's global client portfolio.
The ideal candidate will possess a strong technical background in the data engineering sector, with proven expertise in the collection, processing, and analysis of large-scale media datasets. Experience in designing, deploying, and managing data warehouses on cloud platforms is essential. This is an excellent opportunity for someone keen to collaborate with both regional and global clients, and to drive best practices and product innovation across our network of agencies.
Key Responsibilities
- Engage, upskill and nurture the Data Engineering team to drive and foster a high-engagement culture.
- Be the key voice on the core design of our data engineering product and data architecture, whilst leading the team to implement the plan.
- Be a key stakeholder on our cloud-based infrastructure, providing expertise and recommendations on the best opportunities to innovate, adhere to best practices and develop the underlying technology.
- Work with the BI team to design functional, standardised data solutions that can be rolled out across Annalect EMEA's clients.
- Lead in designing, building and maintaining data pipeline architecture for ELT/ETL.
- Explore the use of AI within our data engineering solutions.
- Participate in running a community of data engineering specialists across the EMEA region.
- Contribute to pitch content and scope for new work and solutions for prospective and current clients.
Requirements
· 5+ years' experience in data engineering teams, ideally in a global or fast-paced environment.
· 2+ years' experience leading data engineering teams.
· 5+ years' experience with modern data platforms in the cloud. Experience with Google Cloud Platform is preferred; Microsoft Azure or AWS will also be considered.
· 5+ years' experience in designing and implementing data architecture, ETL/ELT processes, and DevOps pipelines (including Docker, CI/CD, etc).
· 6+ years working with SQL and Python.
· 2+ years working with dbt, using advanced macros and templating.
· Knowledge of the media industry is essential.
· Ability to translate business requirements into technical specifications and communicate these to various stakeholders.
· Ability to manage Git repositories, use the GitFlow workflow (ideally through GitLab), and handle deployments into production environments.
Build the platforms. Lead the thinking. Shape the future of data for our clients.