Data Engineer
Job description
As a Data Engineer in the WPP Enterprise Data Group, you will play a key role in maintaining and enhancing one of our client's reporting solutions. This includes building and supporting data pipelines that deliver accurate, timely, and auditable information, enabling stakeholders to make informed business decisions and meet their reporting obligations.
You will design and implement scalable data solutions, focusing on ingestion, transformation, and delivery of client datasets. This role requires building and maintaining robust, high-quality data pipelines to ensure reporting outputs are reliable, consistent, and aligned with contractual commitments.
As part of the team, your work will centre on developing and optimising reporting platforms using Azure Databricks and other Azure-based services, covering batch and streaming data workloads. You will collaborate closely with data architects, analysts, and business stakeholders to ensure reporting solutions are fit-for-purpose and scalable.
You will be joining a group of Data Engineers passionate about delivering high-impact data products for clients and committed to a culture of collaboration, knowledge sharing, and continuous improvement.
What you'll be doing:
- Design and build data ingestion pipelines from diverse sources (including APIs) to support reporting requirements.
- Develop and maintain processing pipelines using PySpark/SQL within Azure Databricks to transform raw datasets into structured, reportable formats.
- Design and deliver data warehousing solutions and core data engineering workstreams.
- Ensure data pipelines and products are reliable, high-quality, and accessible for reporting and analysis.
- Maintain and optimise pipelines to ensure scalability, performance, and resilience.
- Facilitate the secure exposure of data to third-party platforms when required.
- Propose technical designs and develop integrations to support evolving client reporting needs.
- Collaborate closely with cross-functional teams (Data Analysts, Business Analysts, PMO, and stakeholders) to deliver accurate and timely client reporting solutions.
- Be a persuasive leader who can guide and influence outcomes, tailoring your approach to the audience and stakeholder group.
- Exhibit strong stakeholder management skills, ensuring expectations are clearly set and then met.
- Make effective decisions using judgement, evidence and expert knowledge to provide responsive solutions in a timely manner.
- Keep up to date with industry developments and anticipate future opportunities and risks for your work and the wider organisation.
- Be an ambassador for the X profession/X team, acting as a subject matter expert and trusted advisor.
- Deliver long-term, sustainable solutions that offer value for money and use best commercial and procurement practices.
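To make the ingestion-and-transformation responsibilities above concrete, here is a minimal sketch of the pattern: pull records from a paginated REST API and normalise them into a flat, reportable shape. The endpoint, auth header, and field names are all hypothetical; in this role the transformation step would typically be expressed in PySpark within Azure Databricks, with the output landing in ADLS/Delta rather than a Python list.

```python
import json
from urllib.request import Request, urlopen


def fetch_page(base_url: str, token: str, page: int) -> list[dict]:
    """Fetch one page of records from a (hypothetical) paginated REST API."""
    req = Request(
        f"{base_url}?page={page}",
        headers={"Authorization": f"Bearer {token}"},  # auth scheme is an assumption
    )
    with urlopen(req) as resp:
        return json.load(resp)["items"]


def normalise(record: dict) -> dict:
    """Flatten one raw API record into a structured, reportable row."""
    return {
        "id": record["id"],
        "client": record.get("client", {}).get("name"),  # tolerate missing nesting
        "amount": round(float(record["amount"]), 2),     # coerce strings to numerics
    }


def ingest(base_url: str, token: str, pages: int) -> list[dict]:
    """Pull and normalise all pages; in production this would write to Delta Lake."""
    rows: list[dict] = []
    for page in range(1, pages + 1):
        rows.extend(normalise(r) for r in fetch_page(base_url, token, page))
    return rows
```

The pure `normalise` step is where the PySpark equivalent (column expressions over a DataFrame) would live; keeping it separate from the I/O makes the pipeline easy to unit-test.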
Requirements
- Experience in Python and SQL
- Experience using Databricks
- Experience with Microsoft Azure data services (ADLS Gen2, Azure Key Vault, Data Factory)
- Proven experience with API integrations for data ingestion
- Ideally, experience with Delta Lake and PySpark
- Exposure to data science / ML is a plus