Software Engineer III

Cybrient Technologies SA
Geneva, Switzerland
5 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Geneva, Switzerland

Tech stack

Java
Third Normal Form
Artificial Intelligence
Airflow
Amazon Web Services (AWS)
Computer Programming
Data Architecture
ETL
Data Vault Modeling
Data Warehousing
Identity and Access Management
Python
OAuth
Software Engineering
Snowflake
Backend
Adobe
Kafka
Data Management
Terraform

Requirements

Dagster or Airflow.
Implement infrastructure automation with Terraform for Snowflake resources.
Integrate secure access controls using OAuth-based and OIDC authentication.
Develop and maintain data models in dbt, implementing both dimensional (star/snowflake) and Data Vault approaches.
Optimize Snowflake workloads, ensuring cost-efficient and performant solutions.
Ingest and transform data from multiple internal and external systems.
Collaborate with engineers, analysts, and architects to deliver reliable data services.
Monitor, troubleshoot, and continuously improve data reliability and platform stability.

Technical Expertise

Education: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field (or equivalent practical experience).
Experience: 5+ years of experience in data platform engineering or cloud data architecture, with strong expertise in Snowflake.
Skills:
Extensive hands-on experience with Snowflake (development, optimization, and administration fundamentals).
Proficiency with dbt for model development, testing, data quality checks, and deployment.
Understanding of Data Vault 2.0 and dimensional (star/3NF) modeling principles.
Terraform for infrastructure automation (especially Snowflake roles, warehouses, and integrations).
Practical use of Dagster or Airflow for orchestration.
Working knowledge of AWS services (particularly S3 and IAM).
Familiarity with OAuth and secure access patterns.
Strong programming skills in Python and/or Java.
Strong understanding of ETL, data warehousing, and data lifecycle management concepts.

Nice to have skills

Familiarity with Denodo.
Experience with Kafka and real-time ingestion patterns.
Knowledge of commodities markets or capital markets.

Soft Skills

Strategic mindset with the ability to design scalable data platforms rather than only execute technical tasks.
Proactive approach to improving platform governance, security, and automation.
Ability to collaborate across data, engineering, and application teams.
Attention to detail with a focus on data reliability and quality.
Clear communication and documentation skills.

Apply for this position