DATA ENGINEER

Expleo Gruppe
Geneva, Switzerland
7 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English, French
Experience level
Senior

Job location

Geneva, Switzerland

Tech stack

Amazon Web Services (AWS)
Data analysis
Azure
Information Engineering
ETL
Python
SQL Databases
Scripting (Bash/Python/Go/Ruby)
Spark
PySpark
Information Technology
Data Management
Front End Software Development
Data Pipelines
Databricks

Job description

  • Technical and human support for each project and effective career management
  • Training to develop your professional skills
  • Take part in special dedicated events
  • Join a dynamic team

Responsibilities: As part of the Data Engineering team, you will play a crucial role in maintaining and optimizing data pipelines within the Databricks platform. Your primary responsibilities will encompass addressing evolving business requirements, refining ETL processes, and ensuring the seamless flow of energy data across our systems.

  1. Design, Develop, and Maintain Robust Data Workflows:
  • Create and maintain scalable data workflows on Databricks integrated with AWS.
  • Collaborate closely with cloud and frontend teams to unify data sources and establish a coherent data model.
  2. Ensure Data Pipeline Reliability and Performance:
  • Guarantee the availability, integrity, and performance of data pipelines.
  • Proactively monitor workflows to maintain high data quality.
  3. Collaborate for Data-Driven Insights:
  • Engage with cross-functional teams to identify opportunities for data-driven enhancements and insights.
  • Analyze platform performance, identify bottlenecks, and recommend improvements.
  4. Documentation and Continuous Learning:
  • Develop and maintain comprehensive technical documentation for ETL implementations.
  • Stay abreast of the latest Databricks/Spark features and best practices, contributing to the continuous improvement of our data management capabilities.

Requirements

  • Experience in Spark.
  • Bachelor's degree in Computer Science, Information Technology, or a related field.

Essential skills:

  • Strong expertise in PySpark.
  • Proficiency in SQL and scripting languages (e.g., Python).
  • Excellent analytical and problem-solving skills.
  • Strong communication skills in French (both written and verbal) and fluency in English.

Desired skills:

  • Familiarity with industry-specific regulations and compliance requirements.
  • Previous experience in the energy trading domain is a nice-to-have.
Personal attributes:
  • Ability to work effectively in a fast-paced, collaborative environment.
  • Detail-oriented with effective task prioritization skills.
  • Demonstrated adaptability and a keen willingness to learn new technologies and tools.
  • Strong customer orientation.

Experience: Minimum of 5 years as a Data Engineer, with a proven track record of implementing pipelines in Databricks. Experience in cloud environments (AWS or Azure) is a plus. Fluent English is required, and advanced French is mandatory.

About the company

Expleo offers a unique range of integrated engineering, quality and strategic consulting services for digital transformation. At a time of unprecedented technological acceleration, we are the trusted partner of innovative companies. We help them develop a competitive advantage and improve the daily lives of millions of people. Joining Expleo Switzerland means working alongside 19,000 people in 30 countries.
