Data Engineer (Remote from Switzerland)

Jobgether
3 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Intermediate

Job location

Remote

Tech stack

API
Azure
Google BigQuery
Cloud Computing
Databases
Continuous Integration
Data Architecture
Information Engineering
Data Infrastructure
Data Integrity
ETL
Data Transformation
Data Warehousing
Relational Databases
Python
PostgreSQL
Uptime
Azure SQL
MySQL
SQL Databases
T-SQL
System Availability
Snowflake
GitLab
Git
PySpark
Data Management
Software Version Control
Data Pipelines
Jenkins
Redshift

Job description

This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Data Engineer in Switzerland.

In this role, you will be responsible for building, maintaining, and optimizing the data pipelines and infrastructure that support critical business operations and analytics. You will collaborate closely with Revenue Operations, BI, and Product teams to ensure accurate, accessible, and reliable data across the organization. The position requires hands-on experience with ETL/ELT pipelines, cloud data warehouses, and API integrations, alongside strong analytical and problem-solving skills. You will help shape data architecture best practices, improve data integrity, and drive scalable solutions. The role provides exposure to cloud technologies, modern data tools, and a collaborative environment where your contributions directly impact decision-making and operational efficiency.

Accountabilities:

  • Partner with BI Analysts, Operations, Product, and Engineering teams to define and assess data requirements.
  • Design, implement, and maintain ETL/ELT pipelines and data integrations.
  • Build and manage data architecture, including relational and dimensional databases or cloud data warehouses.
  • Monitor data infrastructure performance, ensuring high availability, reliability, and uptime.
  • Recommend improvements to data architecture, ETL processes, and workflows to increase scalability and efficiency.
  • Ensure data integrity, compliance, and alignment with organizational business objectives.
  • Collaborate on API querying, data sourcing, and cross-functional data initiatives.

Requirements

  • Minimum of 2 years in a data engineering or data management role.
  • Strong SQL skills across relational databases (PostgreSQL, MySQL, T-SQL, etc.).
  • Experience with Python and PySpark for data transformation and processing.
  • ETL/ELT pipeline development and maintenance expertise.
  • Knowledge of cloud data warehousing (AWS Redshift, Azure SQL, Snowflake, GCP BigQuery, etc.).
  • Familiarity with CI/CD processes and source control (Git, GitLab, Azure DevOps, Jenkins).
  • Understanding of data architecture concepts (dimensional, relational) and data privacy/compliance.
  • Strong analytical, organizational, and problem-solving skills, with attention to detail.
  • Excellent communication and collaboration skills in a remote, cross-functional environment.

Benefits:

  • Competitive compensation and benefits package.
  • Fully remote working arrangement.
  • Opportunities for professional development, including Pluralsight, conferences, and certifications.
  • Supportive and inclusive work culture promoting work-life balance.

Apply for this position