AWS Data Engineer
Red - The Global SAP Solutions Provider
Charing Cross, United Kingdom
Role details
Contract type: Temporary contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Experience level: Intermediate
Job location: Remote (Charing Cross, United Kingdom)
Tech stack
Airflow
Amazon Web Services (AWS)
Data analysis
ETL
Data Structures
Python
Microsoft SQL Server
Oracle Applications
PL/SQL
SQL Databases
Data Lake
PySpark
Data Pipelines
Job description
- Design, develop, and maintain robust data pipelines and ETL processes.
- Build and optimise data workflows using Python.
- Manage workflow orchestration with Apache Airflow (MWAA).
- Perform data testing, validation, and produce data quality reports.
- Conduct data exploration and analysis to understand data structures prior to ETL development.
- Collaborate with system owners and stakeholders to gather requirements and deliver solutions.
- Monitor, troubleshoot, and ensure reliability and performance of data pipelines.
- Maintain clear documentation of data workflows, processes, and configurations.
Requirements
- 4+ years' experience as a Data Engineer.
- Strong SQL and PL/SQL skills across Microsoft SQL Server and Oracle.
- Extensive hands-on experience coding in Python.
- Solid understanding of ETL concepts and data pipeline architecture.
- Knowledge of data lakes and associated architectures.
- Experience with Apache Airflow (MWAA).
- Familiarity with AWS Athena and PySpark (AWS Glue).