IT Developer / Desarrollador
Qaracter | Beyond your Challenge
Madrid, Spain
5 days ago
Role details
Contract type: Permanent contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English, Spanish
Experience level: Intermediate
Job location: Madrid, Spain
Tech stack
Airflow
Amazon Web Services (AWS)
Azure
Big Data
Google BigQuery
Communications Protocols
Continuous Integration
Data as a Service
Data Architecture
Information Engineering
Data Governance
Data Integration
ETL
Data Warehousing
Relational Databases
Python
PostgreSQL
Metadata
Microsoft SQL Server
Oracle Database
SQL Databases
Data Ingestion
Snowflake
GitLab
Data Lake
Apache Nifi
Data Management
Data Pipelines
Docker
Jenkins
Job description
Are you passionate about connecting complex systems, building high-quality data pipelines, and enabling real operational impact? This role is ideal for someone who thrives at the intersection of OT (Operational Technology) and Data Engineering, helping organizations unlock value from industrial and corporate data sources. As a Data Integration Engineer, you will:
Data Integration & Processing (ETL / ELT)
- Design, build, and maintain robust end-to-end data pipelines
- Work with ETL / ELT tools such as Apache Airflow, Apache NiFi, or custom Python pipelines (a minimal Airflow sketch follows this list)
- Clean, standardize, validate, and consolidate data from industrial and corporate systems
- Automate workflows, ensure data quality, and orchestrate data ingestion processes
- Integrate with relational databases (PostgreSQL, SQL Server, Oracle) and modern data platforms (S3, ADLS, BigQuery, Snowflake)
- Interpret tags, signals, and historical data
- Apply knowledge of industrial communication protocols to integrate PLC/RTU-generated data into big data or cloud platforms
- Troubleshoot typical industrial data quality issues (lag, dropouts, noisy signals, irregular sampling)
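For illustration only, here is a minimal sketch of the kind of pipeline this role involves, written with Airflow's TaskFlow API (Airflow 2.4+; pandas and pyarrow assumed installed). The tag names, file paths, and schedule are hypothetical placeholders, not details from this posting:

```python
# Illustrative only: a minimal daily ETL DAG using Airflow's TaskFlow API.
# All table/tag names and paths below are hypothetical placeholders.
from datetime import datetime

import pandas as pd
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def sensor_etl():
    @task
    def extract() -> str:
        # In a real pipeline this step would query PostgreSQL / SQL Server /
        # Oracle, or pull tag readings from an OT historian.
        df = pd.DataFrame(
            {
                "tag": ["PT-101", "PT-101"],
                "ts": ["2024-01-01T00:00:00", "2024-01-01T00:05:00"],
                "value": [4.2, None],  # None simulates a sensor dropout
            }
        )
        path = "/tmp/sensor_readings_raw.parquet"
        df.to_parquet(path)
        return path

    @task
    def transform(path: str) -> str:
        # Clean and standardize: enforce timestamp types, drop dropouts.
        df = pd.read_parquet(path)
        df["ts"] = pd.to_datetime(df["ts"])
        df = df.dropna(subset=["value"])
        out = "/tmp/sensor_readings_clean.parquet"
        df.to_parquet(out)
        return out

    @task
    def load(path: str) -> None:
        # In practice: COPY into Snowflake / BigQuery, or upload to S3 / ADLS.
        print(f"Would load {path} into the curated zone")

    load(transform(extract()))


sensor_etl()
```

The TaskFlow decorators pass file paths between tasks via XCom, keeping each extract/transform/load step independently retryable.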
Data Architecture & Modelling
- Design operational and analytical data models
- Structure and organize data lakes and data warehouses to enable use cases such as leakage detection, demand forecasting, energy optimization, and asset analytics
- Work with metadata, cataloging, and data governance frameworks
- Develop transformation logic and connectors in Python (a minimal connector sketch follows this list)
- Use Docker and CI/CD tools (GitLab, Jenkins, Azure DevOps) to streamline deployments
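By way of example, a small connector in the spirit of the bullets above might look like the following sketch. The connection string, table name, and lake path are hypothetical, and writing directly to an s3:// URL additionally assumes s3fs is installed:

```python
# Illustrative only: a small Python connector that lands a relational
# table in a data lake as Parquet. The connection string, table name,
# and lake path below are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine

PG_URL = "postgresql+psycopg2://user:password@localhost:5432/plant"  # hypothetical
LAKE_PATH = "s3://example-lake/landing/assets.parquet"  # hypothetical; needs s3fs


def sync_table(table: str = "assets") -> None:
    engine = create_engine(PG_URL)
    # Full reload for simplicity; a production connector would chunk large
    # tables and track a high-watermark column for incremental loads.
    df = pd.read_sql_table(table, engine)
    df.to_parquet(LAKE_PATH, index=False)


if __name__ == "__main__":
    sync_table()
```

A connector like this would typically be containerized with Docker and deployed through a GitLab, Jenkins, or Azure DevOps pipeline.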
Requirements
- 3+ years of experience working in data integration, data engineering, or ETL processes
- Strong Python skills
- Experience building and maintaining data pipelines in industrial or corporate environments
- Solid SQL knowledge and experience with relational databases
- Familiarity with cloud data services (Azure, AWS, or GCP)
- Experience collaborating with cross-functional teams (IT, Operations, OT)
- Full professional proficiency in Spanish and at least B2/C1-level English