Senior Software Engineer, Data Engineering

REMOTE HAND
30 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Remote

Tech stack

Airflow
Amazon Web Services (AWS)
Big Data
Google BigQuery
Databases
Information Engineering
Data Infrastructure
ETL
Data Warehousing
Relational Databases
Hadoop
Hadoop Distributed File System
HBase
PostgreSQL
MySQL
Apache Oozie
Scala
Spark
Luigi
Kafka
Data Pipelines
Redshift

Job description

  1. About the Opportunity: The Senior Software Engineer, Data Engineering role is focused on advancing the organization's data capabilities to enhance guest and host experiences. This position involves building data strategies, developing robust data models and pipelines, and providing technical leadership. The role directly impacts critical product areas such as search, checkout, and host personalization by enabling data-driven decisions and solutions. It plays a key role in maintaining high-quality data infrastructure that supports both operational excellence and strategic initiatives.
  • Design, build, and maintain efficient data pipelines from multiple sources, including user interactions and external feeds
  • Develop and optimize data models for effective analysis and merchandising improvements
  • Build scalable data pipelines using SparkSQL, Scala, and Airflow
  • Collaborate with Data Scientists, Product Managers, and Engineers to define requirements and deliver data-driven solutions
  • Contribute to the data engineering community to improve tooling, standards, and productivity
  • Enhance code and data quality through internal tools for automated issue detection and resolution

Requirements

  • 5+ years of relevant industry experience with a BS or Master's degree, or 2+ years with a PhD
  • Experience with distributed processing frameworks such as Hadoop, Spark, Kafka, and storage systems like HDFS or S3
  • Proven ability to analyze large data sets for quality, gaps, and actionable insights
  • Expertise with ETL schedulers such as Apache Airflow, Luigi, Oozie, or AWS Glue
  • Strong understanding of data warehousing and experience with relational databases (PostgreSQL, MySQL) and columnar databases (Redshift, BigQuery, HBase, ClickHouse)
  • Excellent written and verbal communication skills

Benefits & conditions

  1. Pay Range and Compensation Package:
  • Compensation for this role will be determined based on the candidate's experience, skills, and other relevant factors.

About the company

1. About Our Client: The organization operates in the global short-term lodging and experience marketplace, connecting millions of hosts and guests worldwide. It addresses the challenge of providing authentic and unique travel experiences by enabling hosts to offer diverse stays and activities across almost every country. The platform supports a community of over 5 million hosts and more than 2 billion guest arrivals, focusing on creating seamless and engaging connections between travelers and local communities.

Apply for this position