Senior Data Engineer with Big Data Experience
Role details
Job location
Tech stack
Job description
We are seeking a skilled and experienced Senior Data Engineer specializing in Big Data to join our team at our prominent banking client. The ideal candidate will be responsible for designing, implementing and maintaining data structures and pipelines in a high-stakes financial environment. You'll play a critical role in driving data transformation, ensuring the availability of high-quality data for business intelligence, reporting and advanced analytics. This position requires hands-on expertise with Big Data technologies, exceptional problem-solving abilities and agility in bridging technical and business needs. We offer a hybrid work model with a mix of remote and on-site work at our client's office in Zürich.
- Design, develop and maintain scalable data pipelines and ETL workflows to process large-scale financial data
- Collaborate with cross-functional teams to define and implement the bank's Enterprise Data Strategy and Common Data Model
- Integrate, maintain and troubleshoot Big Data platforms, ensuring optimal performance and reliability
- Implement and promote the use of Data Platforms and best practices to enable analytics and machine learning initiatives
- Model and manage JSON-based schemas and metadata, establishing reusable templates and standards
- Work closely with stakeholders to identify and clarify data requirements, ensuring alignment with business goals
- Support data scientists by preparing and testing data structures for ML Ops and analytics purposes
- Address and resolve data-related issues promptly to maintain data integrity across systems
Requirements
- Proven hands-on experience with Big Data technologies such as S3, Hive, Spark, Trino, MinIO, Kubernetes (K8s) and Kafka
- Expertise in SQL across mixed environments, including traditional data warehouses and distributed systems
- Proficient in Python scripting for automation and troubleshooting using tools like Jupyter Notebooks or common IDEs
- Strong understanding of data quality frameworks and ability to apply them within architectural designs
- Background in integrating Data Science platforms such as Knime, Cloudera or Dataiku
- Experience in managing large-scale data migration projects, particularly with on-premise Data Lakes
- Familiarity with data visualization tools like Tableau, Power BI or Python-based solutions for analytics reporting
- Excellent communication skills in English and demonstrated ability to manage cross-functional stakeholders effectively; German proficiency is a plus
Benefits & conditions
- 5 weeks of vacation
- EPAM Employee Stock Purchase Plan (ESPP)
- Enhanced parental leave
- Extended pension plan
- Daily sickness allowance insurance
- Employee assistance program
- Global business travel medical and accident insurance
- Learning and development opportunities including in-house training and coaching, professional certifications, over 22,000 courses on LinkedIn Learning Solutions and much more
- All benefits and perks are subject to certain eligibility requirements
Please note that any offers will be subject to appropriate background checks
We do not accept CVs from recruiting or staffing agencies
For this position, we are able to consider applications from the following:
- Swiss nationals
- EU/EFTA nationals
- Third-country nationals based in Switzerland with an appropriate work permit
- Displaced people from Ukraine who are currently in Switzerland and hold, or have already applied for, S permits