Data Engineer - III
Job description
The Data Engineer, with 5+ years of experience, will support a strategic data platform migration initiative, building the data pipelines and warehouse infrastructure required to transition business-critical operations from a third-party SaaS CRM (Salesforce) onto an internal entity-modeled data platform. The engineer will design and maintain bidirectional ETL flows between Salesforce, distributed data warehouses, and operational entity stores, ensuring data consistency, freshness, and auditability throughout the migration. Work spans pipeline development, schema design, data quality engineering, and orchestration of scheduled batch and streaming jobs.
Requirements
· Experience with Hack/PHP coding and GraphQL
· Strong proficiency in SQL
· Comfort with ORM frameworks
· Hands-on experience building and operating large-scale ETL/ELT pipelines in distributed data warehouse environments (Hive, Spark, Presto/Trino, BigQuery, Snowflake, or Databricks).
· Familiarity with distributed event-streaming systems (Kafka, Pulsar, Kinesis, or equivalent) for real-time data ingestion.
· Hands-on experience using AI coding assistants (Claude Code, Cursor, GitHub Copilot, etc.) as part of a daily development workflow.
· Strong understanding of data quality engineering: testing, validation, monitoring, and reconciliation patterns.
· Experience with data migrations between heterogeneous systems is highly desirable.
· Strong verbal and written communication, problem-solving, customer service, and interpersonal skills.
· Ability to work independently and manage time effectively.
· Ability to troubleshoot data issues and make system changes as needed to resolve them.
Education/Experience:
Bachelor's degree in computer science, data engineering, information systems, or a related field required.
3+ years of professional data engineering experience preferred.
Must Haves:
· Entity/schema modeling and GraphQL
· ORM-framework & MySQL
· Real time data propagation (Kafka, Pulsar, Kinesis, or equivalent)
Nice to have:
· Salesforce Knowledge
· AI development workflows
· Events and Subscription handling (Iris)
Benefits & conditions
3.3 out of 5 stars · Remote · $65 - $70 an hour · Contract