Data Analytics Engineer (ID:3416)

Stafide
Amstelveen, Netherlands
4 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Amstelveen, Netherlands

Tech stack

Data Analysis
Data Architecture
Information Engineering
Data Integrity
Data Transformation
Data Systems
Data Vault Modeling
Data Warehousing
Python
Operational Databases
Performance Tuning
Raw Data
DataOps
PL/SQL
SQL Databases
Technical Data Management Systems
Snowflake
Data Build Tool (dbt)
Data Analytics
Data Pipelines

Job description

  • Bridge the gap between data engineering, data architecture, and data analysis by delivering clean, reliable, and analytics-ready data.
  • Design, build, and maintain robust and scalable data pipelines to support repeatable and accessible data consumption.
  • Transform raw data into structured, high-quality datasets suitable for analysis and reporting.
  • Develop and maintain complex data models that represent business processes and entities.
  • Implement flexible Data Vault models in Snowflake to support large-scale analytics and business intelligence (see the sketch after this list).
  • Write, optimize, and maintain complex SQL queries with a focus on performance, scalability, and data integrity.
  • Monitor, troubleshoot, and proactively resolve issues in production data pipelines.
  • Automate repetitive data processes using Python and scripting tools to improve efficiency and scalability.
  • Collaborate closely with Data Engineers, Data Architects, Data Scientists, and Product Managers to deliver integrated data solutions.
  • Contribute to the design and development of data products, enhancing existing components or creating new ones as needed.
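To make the Data Vault work above more concrete, the sketch below shows, in Python, one common way hub hash keys are derived from business keys before records are loaded into a Snowflake staging table. It is an illustrative sketch only; the entity (customer), the column names, and the choice of MD5 hashing are assumptions for illustration, not details of this role's actual environment.

```python
# Illustrative sketch: deriving Data Vault hub hash keys in Python
# before loading raw records into a staging table.
# The entity and column names below are assumed placeholders.
import hashlib
from datetime import datetime, timezone


def hub_hash_key(*business_keys: str) -> str:
    """Return a deterministic hash key from one or more business keys."""
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()


def to_hub_row(raw: dict) -> dict:
    """Map a raw source record to a hub-shaped row (hash key plus load metadata)."""
    return {
        "hub_customer_hk": hub_hash_key(raw["customer_id"]),
        "customer_id": raw["customer_id"].strip(),
        "load_dts": datetime.now(timezone.utc).isoformat(),
        "record_source": raw.get("record_source", "UNKNOWN"),
    }


if __name__ == "__main__":
    sample = {"customer_id": " 12345 ", "record_source": "CRM"}
    print(to_hub_row(sample))
```

Normalizing the business key before hashing (trimming and upper-casing here) is what keeps the hub key deterministic across source systems; the same idea carries over whether the hashing is done in Python, SQL, or a dbt macro.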

Requirements

  • 6-8 years of experience in data analytics engineering, data engineering, or advanced analytics roles.
  • Strong expertise in SQL and PL/SQL for data transformation and performance optimization.
  • Hands-on experience with Snowflake and modern cloud data warehouses.
  • Solid experience implementing Data Vault modelling techniques.
  • Proficiency in Python for automation and data workflow orchestration.
  • Experience with dbt (Data Build Tool) for data transformation and modelling (see the sketch after this list).
  • Strong understanding of data warehousing concepts and data modelling principles.
  • Proven ability to work with complex and high-volume datasets.
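As a rough illustration of the Python automation and dbt experience listed above, the sketch below triggers a dbt run from a Python script and surfaces failures to the caller. The model selector and project directory are assumed placeholders, and it presumes the dbt CLI is installed; real orchestration would depend on the team's tooling.

```python
# Rough illustration only: invoking dbt from a Python automation script.
# The model selector and project directory below are assumed placeholders.
import subprocess
import sys


def run_dbt(selector: str = "tag:daily", project_dir: str = ".") -> None:
    """Run the selected dbt models and raise if the run fails."""
    result = subprocess.run(
        ["dbt", "run", "--select", selector, "--project-dir", project_dir],
        capture_output=True,
        text=True,
    )
    print(result.stdout)
    if result.returncode != 0:
        print(result.stderr, file=sys.stderr)
        raise RuntimeError(f"dbt run failed for selector '{selector}'")


if __name__ == "__main__":
    run_dbt()
```

Failing loudly on a non-zero return code keeps downstream consumers from silently reading stale data, which is the kind of reliability concern the pipeline-monitoring responsibilities above point to.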

You should be able to

  • Translate business requirements into scalable, technical data solutions.
  • Design and maintain analytics-ready datasets and reusable data models.
  • Optimize data workflows for performance, reliability, and scalability.
  • Automate data operations to improve efficiency and consistency.
  • Influence design decisions aligned with architectural and engineering standards.
  • Adapt to evolving technologies, tools, and analytics best practices.

Apply for this position