Data Engineer (Tech, Python)

Making Science
Municipality of Vitoria-Gasteiz, Spain
8 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English

Job location

Remote
Municipality of Vitoria-Gasteiz, Spain

Tech stack

Agile Methodologies
Airflow
Amazon Web Services (AWS)
Google BigQuery
Cloud Computing
Cloud Storage
Data Governance
Data Systems
Data Warehousing
DevOps
Amazon DynamoDB
Identity and Access Management
Python
SQL Databases
Google Cloud Platform
State Machines
CloudFormation
Data Lake
Google Cloud Functions
Functional Programming
CloudWatch
Terraform
Redshift
Programming Languages

Requirements

Strong experience with AWS data and compute services (AWS Glue, S3, Lambda, Step Functions, Redshift, Athena, DynamoDB).
Solid SQL skills.
Proficiency in Python or another major programming language.
Understanding of data lake, data warehouse, and cloud-native data architectures.
Experience with Infrastructure as Code (Terraform or CloudFormation).
Familiarity with CI/CD pipelines, DevOps practices, monitoring, and observability (CloudWatch).
Knowledge of data governance, security best practices, IAM, and compliance requirements.
Strong communication skills in English for working with international teams and customers.
Experience working with Agile methodologies.

Recommended: AWS certifications (Data Engineer Associate, Solutions Architect, or Developer). Experience with Google Cloud Platform, particularly BigQuery, Cloud Storage, Cloud Functions, Dataform, or Cloud Composer. Understanding of how to translate or migrate data workflows between AWS and GCP. Familiarity with multi-cloud or hybrid cloud architectures.

Responsibilities

Design and implement cloud-based data architectures on AWS.
Build, maintain, and optimize ETL/ELT pipelines using AWS Glue, Lambda, Step Functions, and related services.
Develop and manage data lakes and data warehouses using Amazon S3, Redshift, DynamoDB, and Athena.
Create and maintain data models and schemas for analytics, reporting, and operational use cases.
Ensure data quality through validation, monitoring, and automated error handling.
Optimize data solutions for performance, reliability, and cost efficiency.
Collaborate with data analysts, data scientists, DevOps, and other engineering teams.
Participate in Agile ceremonies, helping estimate tasks, identify risks, and plan technical work.
Troubleshoot data issues and support customer-facing teams with technical insights.
Document architectural decisions, data workflows, and implementation guidelines.

What we offer

Complete stability with a permanent contract.
A fixed salary in line with your experience, plus a performance-based bonus.
A Flexible Payment Plan to get the most out of your salary (restaurant ticket, transportation ticket, day-care ticket, and medical insurance).
Flexible working hours and one day a week working from home.
You will never stop learning with us: subsidized training, free language classes, learning capsules, an e-learning platform, and much more.

About the company

The First Party Data/DataOps Business Unit is an expert provider of tailored information technology solutions. We are pioneers in applying Big Data, Cloud, and Machine Learning technologies to real-time business challenges, and we continuously innovate in these areas, evaluating and incorporating the latest tools. The technical environment is demanding: our solutions are deployed in very large companies, supporting billions of events processed per day and millions of concurrent user sessions. As a very early adopter, the company has been leveraging some of the latest technologies in Big Data and machine learning over the past 10 years.

Apply for this position