AWS Data Engineer with Databricks and Lakehouse

Amazon.com, Inc.
Santa Clara, United States of America
16 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
$200K

Job location

Santa Clara, United States of America

Tech stack

Amazon Web Services (AWS)
Data Architecture
Information Engineering
Data Governance
ETL
Database Queries
Software Debugging
Distributed Computing Environment
Identity and Access Management
Python
Performance Tuning
Cloud Services
Data Processing
Data Lake
PySpark
Data Pipelines
Databricks

Requirements

  • 7+ years of experience in data engineering or related roles.

  • Strong hands-on experience with the Databricks platform.

  • Proficiency in Python for data processing and pipeline development.

  • Strong experience with PySpark and distributed data processing.

  • Deep understanding of ETL/ELT pipeline design and orchestration.

  • Experience with Databricks Unity Catalog and data governance practices.

  • Strong knowledge of AWS cloud services, especially:

      • S3

      • IAM

      • VPC

  • Exposure to Glue/Lambda is a plus.

  • Solid understanding of data lake / lakehouse architecture patterns.

  • Experience building dashboards and supporting analytics use cases.

  • Strong SQL skills and performance tuning expertise.

  • Experience in data modeling and schema design.

  • Good problem-solving and debugging skills.

  • Strong communication and stakeholder management abilities.
