Data Engineer
Prudent Technologies and Consulting
San Francisco, United States of America
yesterday
Role details
Contract type: Permanent contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Experience level: Senior
Job location: San Francisco, United States of America
Tech stack
Airflow
Amazon Web Services (AWS)
Azure
Continuous Integration
Information Engineering
ETL
Dataspaces
Data Systems
Data Visualization
Dimensional Modeling
Github
Monitoring of Systems
Python
SQL Databases
Systems Integration
Datadog
Freeform SQL
Google Cloud Platform
Cloud Platform System
Snowflake
Spark
Kafka
Data Pipelines
Docker
Jenkins
Databricks
Job description
- Partner with technical and non-technical colleagues to understand data and reporting requirements
- Work with engineering teams to collect required data from internal and external systems
- Design table structures and create data pipelines to build performant data solutions that are reliable and scalable in a fast growing data ecosystem
- Develop automated data quality checks
- Develop and maintain ETL routines using ETL and orchestration tools such as Airflow
- Perform ad hoc analysis as necessary
- Perform SQL and ETL tuning as necessary
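To illustrate the kind of automated data quality check the role describes, here is a minimal Python sketch; the column names, rules, and sample rows are hypothetical examples, and in practice a check like this would typically run as a task inside an Airflow DAG between extract and load steps.

```python
def check_rows(rows, required_fields=("order_id", "amount")):
    """Return a list of human-readable problems found in rows (empty list = pass)."""
    problems = []
    if not rows:
        problems.append("no rows extracted")
    for i, row in enumerate(rows):
        # Rule 1: every required field must be present and non-null.
        for field in required_fields:
            if row.get(field) is None:
                problems.append(f"row {i}: missing {field}")
        # Rule 2: amounts must be non-negative (a hypothetical business rule).
        amount = row.get("amount")
        if isinstance(amount, (int, float)) and amount < 0:
            problems.append(f"row {i}: negative amount {amount}")
    return problems


good = [{"order_id": 1, "amount": 19.99}, {"order_id": 2, "amount": 5.0}]
bad = [{"order_id": 3, "amount": -1.0}, {"amount": 4.0}]

print(check_rows(good))  # []
print(check_rows(bad))   # ['row 0: negative amount -1.0', 'row 1: missing order_id']
```

Returning a list of problems (rather than raising on the first failure) lets the orchestrator log every violation in one run before failing the task.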
Requirements
- Data Engineering skills using:
  - Complex or advanced SQL queries
  - Python
  - Spark
  - Snowflake
  - Databricks
  - Airflow or Prefect
- Experience with at least one cloud platform (AWS / Azure / Google Cloud Platform)
Nice to have skills:
- Familiarity with tools like Datorama / Improvado / FiveTran for integrating, harmonizing, and visualizing data across platforms.
- Data Modeling (e.g., Dimensional Modeling)
- Experience working with Kafka
- Familiarity with CI/CD tools (e.g., Jenkins, GitHub Actions) and Docker containers
- Exposure to monitoring tools like Datadog