Data Engineer - Data Lake to AWS Lakehouse Migration
Dizer Corp
Dallas, United States of America
Role details
Contract type: Permanent contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Job location: Dallas, United States of America
Tech stack
Java
Apache HTTP Server
Big Data
Continuous Integration
Data Migration
Hadoop
Hadoop Distributed File System
Hive
JSON
Python
Standard SQL
Parquet
SQL Optimization
Snowflake
Spark
Kubernetes
Avro
Kafka
Requirements
- Python or Java
- Apache Spark, Kafka, SQL
- Snowflake & Apache Iceberg
- Hadoop (HDFS/Hive)
- Data migration, reconciliation & data quality validation
- SCD Type 2 / Temporal Data Modeling
- Kubernetes (K8s), CI/CD
- JSON, Avro, Parquet
- Strong experience in SQL optimization, schema evolution, partitioning & large-scale data processing