Senior Data Engineer
Job description
The Senior Data Engineer supports the Product Data Domain teams. You will build ETL pipelines to ingest and transform data, developing the data products that power key value use cases across the BBC. You will work in an agile, multi-disciplinary team alongside product analytics developers, product data managers, data modellers and data operations managers, ensuring that all work delivers maximum value to the BBC. The role and responsibilities comprise:
- Develop robust and scalable data pipelines to ingest, transform, and analyse large volumes of structured and unstructured data from diverse sources; pipelines must be optimised for performance, reliability, and scalability in line with the BBC's scale (a minimal sketch of such a pipeline follows this list).
- Contribute to initiatives that enhance data quality, governance and security across the organisation, ensuring compliance with BBC guidelines and industry best practice.
- Break down stakeholders' requirements into smaller tasks and identify the best solution.
- Build innovative solutions for acquiring and enriching data from a variety of sources.
- Work on one or more projects, guiding other team members on designing, developing, testing, and building automation workflows.
- Conduct logical and physical database design, including key and indexing schemes and partitioning strategies.
- Participate in building and testing business continuity and disaster recovery procedures as required.
- Evaluate and provide feedback on future technologies and new releases/upgrades, drawing on market knowledge of the domain, when asked to do so.
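For illustration only, here is a minimal sketch of the kind of ingest-transform-load pipeline described above; the file paths, field names (user_id, timestamp, action) and partition layout are hypothetical assumptions, not BBC code:

```python
import json
from collections import defaultdict
from pathlib import Path


def ingest_events(source: Path) -> list[dict]:
    """Read newline-delimited JSON events, skipping malformed lines."""
    events = []
    with source.open() as fh:
        for line in fh:
            try:
                events.append(json.loads(line))
            except json.JSONDecodeError:
                continue  # a real pipeline would route these to a dead-letter store
    return events


def transform(events: list[dict]) -> dict[str, list[dict]]:
    """Keep only the fields downstream models need, bucketed by event date."""
    partitions: dict[str, list[dict]] = defaultdict(list)
    for event in events:
        if "user_id" not in event or "timestamp" not in event:
            continue  # drop records missing required fields
        partitions[event["timestamp"][:10]].append(
            {"user_id": event["user_id"], "action": event.get("action", "unknown")}
        )
    return partitions


def load(partitions: dict[str, list[dict]], target: Path) -> None:
    """Write one file per date partition, a common layout for warehouse ingestion."""
    for date, rows in partitions.items():
        out = target / f"date={date}" / "events.jsonl"
        out.parent.mkdir(parents=True, exist_ok=True)
        out.write_text("\n".join(json.dumps(row) for row in rows))


if __name__ == "__main__":
    load(transform(ingest_events(Path("events.jsonl"))), Path("output"))
```

Partitioning the output by date keeps downstream warehouse loads incremental, which matters at the data volumes this role describes.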
Requirements
Essential Skills
- Extensive (3+ years) experience in a data engineering or analytics engineering role, preferably in digital products, building ETL pipelines and ingesting data from a diverse set of sources (including event streams and various forms of batch processing)
- Excellent SQL and Python skills, with experience deploying and scheduling code bases in a data development environment using technologies such as Airflow (see the sketch after this list)
- Good working knowledge of cloud-based data warehousing technologies (such as AWS Redshift, GCP BigQuery or Snowflake)
- Demonstrable experience of working in cross-functional teams, interacting with Product Managers, Infrastructure Engineers, Data Scientists, and Data Analysts
- Strong stakeholder management skills, with the ability to prioritise, a structured approach, and the ability to bring others on the journey
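To illustrate the Airflow-style scheduling mentioned above, here is a hedged sketch of a daily DAG; the DAG id, task names and callables are hypothetical placeholders, not an actual BBC workflow:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Placeholder: pull the latest events from the source system."""


def transform_and_load():
    """Placeholder: reshape the events and load them into the warehouse."""


# A daily pipeline: extract must succeed before transform_and_load runs.
with DAG(
    dag_id="product_events_daily",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,  # don't automatically backfill missed runs
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(
        task_id="transform_and_load", python_callable=transform_and_load
    )
    extract_task >> load_task  # extract runs first, then transform_and_load
```

Deploying code to a scheduler in this way, rather than running it ad hoc, is what makes pipelines repeatable and observable at scale.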
Desirable Skills
- Ability to listen to others' ideas and build on them
- Ability to clearly communicate to both technical and non-technical audiences
- Ability to collaborate effectively, working alongside other team members towards the team's goals, and enabling others to succeed, where possible
- Strong attention to detail
If you can bring some of these skills and experience, along with transferable strengths, we'd love to hear from you and encourage you to apply.
Before your start date, you may need to disclose any unspent convictions or police charges, in line with our Contracts of Employment policy. This allows us to discuss any support you may need and assess any risks. Failure to disclose may result in the withdrawal of your offer.