Cloud Data Engineer

Hapimag Ag
Zug, Switzerland
8 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English, German
Experience level
Intermediate

Job location

Remote
Zug, Switzerland

Tech stack

Airflow
Big Data
Google BigQuery
Cloud Computing
Cloud Storage
Computer Programming
Data as a Service
Information Engineering
Data Fusion
ETL
Data Security
Data Systems
Data Flow Control
Identity and Access Management
Python
SQL Databases
Data Logging
Data Processing
Google Cloud Platform
Data Ingestion
Cloud Monitoring
Spark
Infrastructure as Code (IaC)
Information Technology
Terraform
Data Pipelines

Job description

We are a sharing community for vacation rentals. Together we own 56 resorts with over 5,000 vacation rentals in popular destinations in Europe - by the sea, in the mountains or in cities. Why? Because we want to enjoy a relaxed and responsible vacation.

Our team is highly committed to simplifying the vacation experience of our shareholders and members, from booking to the stay to planning the next vacation. All processes for guests and employees should be designed so intuitively and efficiently that our guests not only look forward to their next stay, but also take away many wonderful memories and enjoy coming back.

To fulfill our mission, we are looking for an enthusiastic and energetic person to join our IT team!

What you can expect

  • Development and maintenance of scalable, high-performance data pipelines on the Google Cloud Platform (GCP).

  • Design and implementation of native GCP data solutions, adhering to best practices for performance, security, and cost-efficiency.

  • Working with GCP services such as BigQuery, Cloud Storage, Dataflow, and Cloud Data Fusion for data ingestion, processing, and storage.

  • Implementation of ETL/ELT processes and data modeling.

  • Collaboration with data scientists, analysts, and stakeholders to understand data requirements and deliver efficient data products.

  • Ensuring data quality and availability.

  • Implementation of data security best practices, IAM roles, and encryption techniques.

  • Monitoring and troubleshooting of data pipelines using tools such as cloud logging and cloud monitoring.

  • Fostering a data-driven corporate culture.

Requirements

  • A degree in Computer Science, Data Engineering, Data Science, or a comparable technical field.

  • Several years of professional experience in Data Engineering, with at least two years focus on GCP solutions.

  • Comprehensive knowledge of GCP data services (BigQuery, Cloud Storage, Dataflow, Cloud Composer, Cloud Data Fusion).

  • Strong programming skills in Python and SQL for data processing.

  • Experience with Big Data technologies such as Apache Airflow and Apache Spark.

  • Experience implementing CI/CD pipelines and Infrastructure as Code (IaC) with tools such as Terraform/OpenTofu.

  • Strong analytical skills and a solution-oriented approach.

  • Excellent communication skills and team player qualities.

  • Fluent in German and English.

Benefits & conditions

  • A modern open-plan office at Hapimag headquarters in Steinhausen (ZG) with good transport links, a friendly atmosphere, and a dedicated, agile team with a great sense of humor
  • A hybrid work model (up to 40% remote work) and the option to occasionally work from one of our resorts
  • Attractive employee benefits for holidays at Hapimag resorts and other perks, such as public transport benefits
  • We promote personal responsibility and the professional development of our employees

Benefits at Hapimag

  • 30% discounted membership
  • 30% discount in restaurants
  • Work across Europe, regardless of the season
  • Workation within the EU/EFTA region
  • Learn languages for free
  • Exciting team events and activities

Apply for this position