Data Platform Engineer

Gravity Hair Salon, LLC
1 month ago

Role details

Contract type
Temporary to permanent
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior
Compensation
$150K

Job location

Remote

Tech stack

Airflow
Data Streaming
Kafka
Amazon Web Services (AWS)

Job description

Own the multi-stage data pipeline layer that ingests from external sources (BigQuery direct queries, vendor file feeds, APIs) into a governed lake/warehouse. Deliver scaled event-driven integrations: manage ETL for downstream integrations, an event bus for user engagement services, and API/query access for billions of data points.

Responsibilities

  • Design and implement ingestion/connectors (BigQuery direct, CSV/JSON, REST) and normalization into standardized data models.
  • Build event-driven jobs and services (Python/Node) that enrich, dedupe, and apply rules; ensure idempotency and safe replays.
  • Define data contracts, schema evolution, backfill strategy, and cutover plans; partner with stakeholders on acceptance criteria.
  • Operate AWS workloads (EC2, Lambda, AppRunner, RDS, Redshift) with Terraform; secure secrets, roles, and least-privilege access.
  • Optimize SQL for MPP systems (Redshift, Snowflake, or similar); profile queries, partition/cluster, and tune materializations.
  • Implement observability (logs, metrics, tracing, lineage) and incident response; drive postmortems and remediation.
  • Maintain concise documentation of architecture, workflows, standards, and governance.

Requirements

  • 7-10+ years of backend/data engineering experience with production ownership of large event-driven data systems.
  • Proficient in Python, Node, and similar toolsets for ETL job implementation; strong testing and reliability practices.
  • Deep AWS experience and Terraform-based IaC; CI/CD for data and application deployments.
  • Expert SQL and performance tuning on Redshift, Snowflake, or equivalent.
  • Experience delivering idempotent pipelines, restaging/reconciliation, and parity validation against legacy systems.
  • Experience in highly governed environments (HIPAA, GLBA, PCI, etc.).
  • Solid security and compliance practices, supporting internal and external audits (SOC 2, ISO 27001, etc.) and remediation.
  • Orchestration experience (Airflow or similar), streaming (Kinesis/Kafka/SQS), and dbt or warehouse-centric modeling (preferred).
  • Data quality frameworks, lineage/metadata tooling, and SLA/SLO design (preferred).
  • Exposure to real-time enrichment and rules engines; familiarity with warehouse-native features (tasks/streams) (preferred).

Apply for this position