Data Engineer
W3global Eu Inc
Role details
Contract type: Permanent contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English, German
Experience level: Senior
Job location:
Tech stack
Agile Methodologies
Amazon Web Services (AWS)
Data analysis
Big Data
Data Visualization
Data Warehousing
Hive
JSON
Python
KNIME
Open Source Technology
Scrum
Power BI
Cloudera
SQL Databases
Tableau
Jupyter Notebook
Spark
Data Strategy
Data Lake
Kubernetes
Kafka
Machine Learning Operations
Dataiku
Data Pipelines
Job description
- Prepare datasets and data pipelines, support the business, and troubleshoot data issues
- Collaborate closely with Data & Analytics Program Management and stakeholders to co-design the Enterprise Data Strategy and the Common Data Model
- Implement and promote the Data Platform, transformative data processes, and services
- Develop data pipelines and structures for Data Scientists and test them to ensure they are fit for use
- Maintain and model JSON-based schemas and metadata for reuse across the organization with central tools (see the illustrative sketch after this list)
- Resolve and troubleshoot data-related issues and queries
- Cover all processes from enterprise reporting to data science (incl. MLOps)
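For illustration only (not part of the formal responsibilities), a minimal Python sketch of the kind of JSON-schema-driven metadata work described above; the schema, field names, and the use of the jsonschema package are assumptions of this example, not requirements of the role:

# Illustrative sketch: validating a record against a centrally maintained
# JSON schema before it enters a data pipeline. Schema and field names are
# hypothetical and not taken from the job description.
import json
from jsonschema import validate, ValidationError

# A hypothetical, centrally managed schema for a transaction record.
TRANSACTION_SCHEMA = {
    "type": "object",
    "properties": {
        "transaction_id": {"type": "string"},
        "amount": {"type": "number"},
        "currency": {"type": "string", "pattern": "^[A-Z]{3}$"},
        "booked_at": {"type": "string", "format": "date-time"},
    },
    "required": ["transaction_id", "amount", "currency", "booked_at"],
}

def validate_record(raw: str) -> dict:
    """Parse a JSON record and validate it against the shared schema."""
    record = json.loads(raw)
    validate(instance=record, schema=TRANSACTION_SCHEMA)
    return record

if __name__ == "__main__":
    sample = '{"transaction_id": "t-001", "amount": 12.5, "currency": "EUR", "booked_at": "2024-01-31T10:00:00Z"}'
    try:
        print(validate_record(sample))
    except ValidationError as exc:
        print(f"Record rejected: {exc.message}")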
Requirements
- Hands-on Big Data experience using common open-source components (S3, Hive, Spark, Trino, MinIO, Kubernetes, Kafka); a minimal sketch follows after this list
- Experience in stakeholder management in heterogeneous business/technology organizations
- Experience in banking or financial services, including handling sensitive data across regions
- Experience in large data migration projects with on-prem Data Lakes
- Hands-on experience in integrating Data Science Workbench platforms (e.g. KNIME, Cloudera, Dataiku)
- Track record in Agile project management and methods (e.g., Scrum, SAFe).
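For illustration only, a minimal PySpark sketch of the hands-on Big Data work the requirements refer to; the endpoint, bucket names, and paths are placeholders, and a real deployment against S3/MinIO, Hive, or Trino would be configured differently:

# Illustrative sketch: a small PySpark ingest job over object storage.
# All names, paths, and the MinIO endpoint are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("transactions-ingest")  # hypothetical job name
    # In a real deployment the s3a settings would point at S3 or a MinIO endpoint.
    .config("spark.hadoop.fs.s3a.endpoint", "http://minio.example.internal:9000")
    .getOrCreate()
)

# Read raw JSON events from object storage (path is illustrative).
raw = spark.read.json("s3a://raw-zone/transactions/2024/01/")

# Light cleansing and derivation of a partition column.
cleaned = (
    raw.dropDuplicates(["transaction_id"])
       .withColumn("booking_date", F.to_date("booked_at"))
)

# Persist as partitioned Parquet for downstream Hive/Trino access.
cleaned.write.mode("overwrite").partitionBy("booking_date") \
       .parquet("s3a://curated-zone/transactions/")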
Skills
- Knowledge of reference architectures, especially concerning integrated, data-driven landscapes and solutions
- Expert SQL skills, preferably in mixed environments (i.e. both classic DWH and distributed engines)
- Hands-on automation and troubleshooting experience in Python, using Jupyter Notebooks or common IDEs
- Data preparation for reporting/analytics and visualization tools (e.g. Tableau, Power BI, or Python-based); see the illustrative sketch at the end of this list
- Applying a data quality framework within the architecture
- Good knowledge of German is beneficial; an excellent command of English is essential
- Higher education (e.g. a Fachhochschule degree, Wirtschaftsinformatik / business informatics)
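For illustration only, a minimal sketch of combined SQL and Python data preparation of the kind the skills above describe; the in-memory SQLite database, table, and column names are stand-ins for a real data warehouse or distributed SQL engine:

# Illustrative sketch: aggregate with SQL, then produce a BI-friendly extract
# with pandas. All table and column names are invented for this example.
import sqlite3
import pandas as pd

# An in-memory database stands in for a classic DWH or distributed engine.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE transactions (transaction_id TEXT, region TEXT, amount REAL);
    INSERT INTO transactions VALUES
        ('t-001', 'EU', 120.0),
        ('t-002', 'EU',  80.0),
        ('t-003', 'US', 200.0);
""")

# Aggregate with SQL, then hand the result to pandas.
report = pd.read_sql(
    "SELECT region, COUNT(*) AS tx_count, SUM(amount) AS total_amount "
    "FROM transactions GROUP BY region",
    conn,
)

# A CSV extract that a tool such as Tableau or Power BI could consume.
report.to_csv("regional_transactions.csv", index=False)
print(report)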