Data Engineer
STAFIDE
Amsterdam, Netherlands
6 days ago
Role details
Contract type: Permanent contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Experience level: Senior
Job location: Amsterdam, Netherlands
Tech stack
Java
Artificial Intelligence
Airflow
Google BigQuery
Cloud Computing
Cloud Engineering
Computer Programming
Databases
Data Systems
Relational Databases
Python
NoSQL
Power BI
SQL Databases
Data Streaming
Parquet
Google Cloud Platform
Data Ingestion
Avro
Google Cloud Functions
Data Analytics
Data Management
Tools for Reporting
Terraform
Software Version Control
Data Pipelines
Job description
- Build and migrate existing data solutions to Google Cloud Platform (GCP) to support data analytics and ML/AI products.
- Develop new data products on GCP using DBT and BigQuery.
- Collaborate with Solution Architects and Business Experts to design and implement relevant and scalable data models.
- Integrate and synchronize data from multiple sources using Cloud Functions and Python (see the sketch following this section).
- Build and maintain end-to-end data pipeline components using Terraform, Cloud Workflows, and other GCP services.
What we offer
- A collaborative, technology-driven environment focused on innovation and continuous improvement.
- Opportunities to work with modern cloud architectures and advanced analytics solutions.
- A dynamic team culture where your ideas and expertise contribute directly to impactful data products.
- A platform to grow your cloud engineering and ML/AI data capabilities.
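To give a concrete flavor of the Cloud Functions and Python integration work named in the responsibilities, below is a minimal, hypothetical sketch of an HTTP-triggered Cloud Function that pulls JSON records from an upstream API and streams them into BigQuery. The source URL, project, dataset, and table names are illustrative assumptions, not details of this role.

```python
# Minimal sketch: HTTP-triggered Cloud Function that ingests JSON records
# into BigQuery. The source URL and table name are hypothetical.
import functions_framework
import requests
from google.cloud import bigquery

# Hypothetical target table; replace with a real project/dataset/table.
TABLE_ID = "my-project.analytics.raw_events"

bq_client = bigquery.Client()

@functions_framework.http
def ingest(request):
    # Fetch records from an assumed upstream REST endpoint.
    resp = requests.get("https://example.com/api/events", timeout=30)
    resp.raise_for_status()
    rows = resp.json()  # expected: a list of JSON objects

    # Stream rows into BigQuery; insert_rows_json returns a list of errors.
    errors = bq_client.insert_rows_json(TABLE_ID, rows)
    if errors:
        return (f"Insert failed: {errors}", 500)
    return (f"Inserted {len(rows)} rows", 200)
```

In practice a function like this would be deployed via Terraform and triggered by Cloud Scheduler or Pub/Sub rather than called ad hoc.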
Requirements
- 6-8 years of overall experience, including hands-on work as a Data Engineer on cloud-based data platforms or in advanced analytics environments.
- Strong expertise in GCP services such as BigQuery, Cloud Run, Cloud Functions, Pub/Sub, and Cloud Composer.
- Proficiency in SQL, DBT, Python, and infrastructure-as-code tools like Terraform.
- Experience with CI/CD pipelines and version control.
- Knowledge of Power BI or similar reporting tools.
- Experience working with various data formats such as Avro and Parquet (see the sketch after this list).
- Familiarity with both NoSQL and RDBMS databases.
- Programming experience in a data-centric context using Python, Java, or Scala.
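As a quick illustration of the two file formats named above, here is a minimal Python sketch that reads each one. The file paths are hypothetical, and it assumes the commonly used pyarrow and fastavro libraries.

```python
# Minimal sketch: reading the two data formats named in the requirements.
# File paths are hypothetical; assumes pyarrow and fastavro are installed.
import pyarrow.parquet as pq
from fastavro import reader

# Parquet: columnar format, read into an Arrow table.
table = pq.read_table("events.parquet")
print(table.schema)

# Avro: row-oriented format with an embedded schema.
with open("events.avro", "rb") as f:
    for record in reader(f):
        print(record)  # each record is a dict
        break
```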
You should possess the ability to:
- Design, deploy, and maintain scalable cloud-native data pipelines on GCP (a minimal orchestration example follows this list).
- Analyze complex business and technical requirements and translate them into robust data solutions.
- Optimize data flows for performance, reliability, and cost efficiency.
- Work collaboratively across multidisciplinary engineering and analytics teams.
- Troubleshoot issues across data ingestion, transformation, and workflow orchestration layers.
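Since Airflow (via Cloud Composer) appears in the tech stack, below is a minimal, hypothetical sketch of the kind of pipeline orchestration implied here: a daily DAG that runs one BigQuery transformation. The SQL, table names, and schedule are illustrative assumptions.

```python
# Minimal sketch: a daily Airflow DAG (as run on Cloud Composer) executing
# a BigQuery transformation. Table names, SQL, and schedule are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_events_rollup",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Aggregate raw events into a daily summary table.
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_events",
        configuration={
            "query": {
                "query": """
                    SELECT DATE(event_ts) AS day, COUNT(*) AS events
                    FROM `my-project.analytics.raw_events`
                    GROUP BY day
                """,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "analytics",
                    "tableId": "daily_events",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```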