Data Engineer
DCS Recruitment
Holme Valley, United Kingdom
Role details
Contract type: Permanent contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Compensation: £60K
Job location: Holme Valley, United Kingdom
Tech stack
API
Airflow
Amazon Web Services (AWS)
Azure
Google BigQuery
Cloud Computing
Continuous Integration
Information Engineering
ETL
Data Security
Data Systems
Data Warehousing
Document Management Systems
Python
SQL Databases
Google Cloud Platform
Snowflake
Spark
Data Lake
Apache Flink
Kafka
Spark Streaming
Data Management
Terraform
Data Pipelines
Databricks
Job description
- Design, develop, and deploy scalable, secure, and reliable data pipelines using modern cloud and data engineering tools.
- Consolidate data from internal systems, APIs, and third-party sources into a unified data warehouse or data lake environment.
- Build and maintain robust data models to ensure accuracy, consistency, and accessibility across the organization.
- Collaborate with Data Analysts, Data Scientists, and business stakeholders to translate data requirements into effective technical solutions.
- Optimize data systems to deliver fast and accurate insights supporting dashboards, KPIs, and reporting frameworks.
- Implement monitoring, validation, and quality checks to ensure high levels of data accuracy and trust.
- Support compliance with relevant data standards and regulations, including GDPR.
- Maintain strong data security practices relating to access, encryption, and storage.
- Research and recommend new tools, technologies, and processes to improve performance, scalability, and efficiency.
- Contribute to migrations and modernization projects across cloud and data platforms (e.g., AWS, Azure, GCP, Snowflake, Databricks).
- Create and maintain documentation aligned with internal processes and change management controls.
Technologies:
- Airflow
- AWS
- Azure
- BigQuery
- CI/CD
- Cloud
- Data Warehouse
- Databricks
- ETL
- Flink
- GCP
- Kafka
- Python
- SQL
- Security
- Snowflake
- Spark
- Terraform
- dbt
Requirements
- Proven hands-on experience as a Data Engineer or in a similar data-centric role.
- Strong proficiency in SQL and Python.
- Solid understanding of ETL/ELT pipelines, data modeling, and data warehousing principles.
- Experience working with cloud platforms such as AWS, Azure, or GCP.
- Exposure to modern data tools such as Snowflake, Databricks, or BigQuery.
- Familiarity with streaming technologies (e.g., Kafka, Spark Streaming, Flink) is an advantage.
- Experience with orchestration and infrastructure tools such as Airflow, dbt, Prefect, CI/CD pipelines, and Terraform.
Benefits & conditions
We offer a competitive salary of up to £60,000 per annum plus benefits. You will work on a hybrid model, spending three days per week in the office. This is a chance to lead and mentor within our growing team while receiving professional development and training support.