AWS Data Engineer
Job description
* Design and develop scalable data pipelines on the AWS Cloud platform
* Build and maintain ETL/ELT workflows using Databricks, Spark, and AWS services
* Develop real-time and batch data processing solutions
* Create and optimize Data Lake and Data Warehouse architectures
* Work with Snowflake, Redshift, and S3-based data platforms
* Implement data governance and metadata management using Unity Catalog
* Develop and maintain CI/CD pipelines for data engineering workflows
* Monitor and troubleshoot data pipelines using CloudWatch and EventBridge
* Collaborate with cross-functional teams, including Data Analysts, Architects, and Business stakeholders
* Ensure data quality, performance optimization, and security best practices
* Create detailed technical documentation and testing artifacts
Requirements
Experience with Delta Lake, Apache Iceberg, and Unity Catalog
Strong experience with Snowflake
Strong SQL and Python programming skills
Experience with CI/CD pipelines and Git
Familiarity with Infrastructure as Code tools such as Terraform
Strong understanding of Data Warehousing and Dimensional Modeling concepts
Experience with Apache Airflow and dbt
Experience building batch and real-time streaming pipelines
Strong testing, debugging, and documentation skills
Experience with healthcare, finance, or enterprise-scale data platforms