AWS Data Engineer

Spait Infotech Private Limited
Leeds, United Kingdom
2 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Intermediate
Compensation
£ 55K – £ 110K

Job location

Remote
Leeds, United Kingdom

Tech stack

API
Amazon Web Services (AWS)
Software as a Service
Databases
Data as a Service
ETL
Data Security
Data Warehousing
Document Management Systems
GitHub
Identity and Access Management
Python
Scrum
SQL Databases
Data Streaming
Data Logging
Snowflake
Spark
State Machines
Electronic Medical Records
AWS Lambda
Git
CloudFormation
Data Lake
PySpark
Data Management
Machine Learning Operations
Terraform
Software Version Control
Data Pipelines
Redshift
Databricks

Job description

  • Design, build, and maintain scalable, secure, and efficient data pipelines on AWS.
  • Develop ETL/ELT workflows using services such as AWS Glue, AWS Lambda, EMR, Step Functions, and Kinesis.
  • Integrate data from APIs, databases, SaaS platforms, and streaming sources.
  • Build and manage data lakes using Amazon S3 and associated Lake Formation/Data Catalog components.
  • Develop optimized data models, dimensional models, and analytical datasets for downstream consumption.
  • Work with data warehousing technologies such as Amazon Redshift or, optionally, Snowflake.
  • Collaborate with architects to design scalable, cost-effective AWS data architectures.
  • Optimize performance of ETL pipelines, storage layers, and compute clusters.
  • Implement best practices for monitoring, logging, and cloud resource utilization.
  • Implement data quality checks, validation rules, and automated monitoring.
  • Ensure security compliance using IAM, encryption, data access control, and AWS governance tools.
  • Maintain documentation, lineage, and metadata for all data assets.
  • Partner with data analysts, data scientists, BI teams, and product stakeholders.
  • Translate business needs into technical design specifications.
  • Participate in sprint planning, design reviews, and technical workshops.

Requirements

Do you have experience with Terraform? Candidates will be considered at junior-mid, mid-senior, or senior levels depending on experience.

Core Technical Skills

  • Hands-on experience with AWS data services, such as:
      • AWS Glue (ETL / PySpark)
      • Amazon S3
      • AWS Lambda
      • Amazon EMR / Spark
      • Amazon Redshift or Redshift Spectrum
      • AWS Step Functions
      • Amazon Kinesis or MSK
  • Strong SQL skills and proficiency in Python (PySpark experience preferred).
  • Experience designing and maintaining data pipelines for batch and/or streaming workloads.
  • Expertise in data modelling, data warehousing concepts, and data lake architectures.
  • Experience with version control (Git) and CI/CD pipelines (AWS CodePipeline, GitHub Actions, etc.).
  • Experience with Terraform or CloudFormation for IaC.
  • Knowledge of Snowflake, Databricks, or other cloud data platforms.
  • Exposure to MLOps / data orchestration frameworks.
  • AWS certifications (e.g., AWS Data Analytics - Specialty, AWS Solutions Architect) are a plus.

Benefits & conditions

Job Types: Full-time, Permanent

Pay: £55,000.00-£110,000.00 per year

Benefits:

  • Work from home
