DATA ENGINEER SCALA - AMSTERDAM - 6-MONTH CONTRACT
Job description
As a Data Engineer you build data pipelines, industrialize machine learning and operations research models, and replace legacy data warehousing systems with state-of-the-art data lake solutions. You do this as part of our central Data, OR and AI department, from where you work for the Finance business as part of a product team. Finance will use the scalable, future-proof solutions you design, making use of structured and unstructured data. Once a solution is built, you ensure it becomes operational and stays that way! All of this, of course, in an agile environment.
Requirements
We are looking for a passionate and talented Data Engineer. In everything you do, you keep the end goal in mind. You analyse problems, identify their root cause in no time, and deliver the best solution. You also explain these issues in an understandable and clear way to stakeholders at every level of the organization. You coach your junior teammates on a technical level. You look for opportunities, turn them into actions, and convince the decision makers.

- Preferably a Bachelor's degree or higher in Computer Science, Software Engineering, or another relevant field (less important in light of work experience)
- At least 4 years of experience building production-grade data processing systems as a Data Engineer
- In-depth knowledge of:
  - The Hadoop ecosystem (we are migrating to GCP)
  - Building applications with Apache Spark
  - Columnar storage solutions like Parquet and Apache HBase, including data modelling for columnar storage
  - Key-value and NoSQL databases such as HBase
  - Event streaming platforms like Apache Kafka or Spark Streaming
  - Development on a cloud platform, preferably GCP
- At least 3 years of experience with Scala
- Understanding of common algorithms and data structures
- Experience with databases and SQL
- Knowledge of continuous integration/continuous deployment techniques
- Affinity with Machine Learning and/or Operations Research concepts
- Experience with distributed databases and distributed computing
- Knowledge of Kubernetes and/or Docker (nice to have)