DATA ENGINEER SCALA - AMSTERDAM - 6 MONTHS CONTRACT

Global Enterprise Partners
Amsterdam, Netherlands
5 days ago

Role details

Contract type
Temporary contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Intermediate

Job location

Amsterdam, Netherlands

Tech stack

Artificial Intelligence
Databases
Continuous Delivery
Continuous Integration
Data Structures
Data Warehousing
Hadoop
HBase
Machine Learning
NoSQL
Software Engineering
SQL Databases
Data Streaming
Unstructured Data
Parquet
Data Processing
Cloud Platform System
Spark
Data Lake
Information Technology
Kafka
Spark Streaming
Data Pipelines
Docker

Job description

As a Data Engineer you build data pipelines, industrialise machine learning and operations research models, and replace legacy data warehousing systems with state-of-the-art data lake solutions. You do this as part of our central Data, OR and AI department, from where you work for the Finance business as part of a product team. Finance will use the scalable, future-proof solutions you design, making use of structured and unstructured data. Once a solution is built, you ensure it goes into operation and stays there, all of this in an agile environment.

Requirements

We are looking for a passionate and talented Data Engineer. With everything you do, you keep the end goal in mind. You are capable of analysing and identifying the core problem in no time, and you provide the best solution for it. You are also able to explain these issues in an understandable and clear way to stakeholders at every level of the organisation. You coach your junior teammates on a technical level. You look for opportunities, turn them into actions and convince the decision makers.

  • Preferably a Bachelor's degree or higher in Computer Science, Software Engineering or another relevant field (less important in light of work experience)

  • At least 4 years of experience building production-grade data processing systems as a Data Engineer

  • In-depth knowledge of:

    • The Hadoop ecosystem (we are migrating to GCP)

    • Building applications with Apache Spark

    • Columnar storage solutions like Parquet and Apache HBase, including knowledge of data modelling for columnar storage

    • Key-value databases like HBase and other NoSQL databases

  • Experience with event streaming platforms like Apache Kafka or Spark Streaming

  • Development on a cloud platform, preferably GCP

  • At least 3 years of experience with Scala

  • Understanding of common algorithms and data structures

  • Experience with databases and SQL

  • Knowledge of continuous integration/continuous deployment techniques

  • Affinity with Machine Learning and/or Operations Research concepts

  • Experience with distributed databases and distributed computing

  • Knowledge of Kubernetes or Docker (nice to have)

About the company

Are you interested in this project and do you meet the requirements? Please get in touch with Marco Eindhoven of Global Enterprise Partners by telephone or e-mail.

Important: job fraud

Unfortunately, job fraud is becoming more common. Beware of such scams:

  • We will never ask for personal information (such as a copy of your ID, bank details, or social security number) via WhatsApp or during a video call.

  • If you doubt the authenticity of a vacancy or contact person, always contact us directly via the official contact details on our website.

Apply for this position