Data Engineer
Torry Harris Integration Solutions
Municipality of Alicante, Spain
Role details
Contract type: Permanent contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Experience level: Intermediate
Job location: Municipality of Alicante, Spain
Tech stack
Airflow
Amazon Web Services (AWS)
Data analysis
Azure
Business Intelligence
Program Optimization
Databases
Information Engineering
ETL
Data Systems
Data Warehousing
Relational Databases
IBM InfoSphere DataStage
Hive
Python
Microsoft SQL Server
Oracle Business Intelligence Enterprise Edition
Oracle Applications
Scrum
SQL Databases
Tableau
Spark
PySpark
Information Technology
QlikView
Data Pipelines
Job description
A Data Engineer is responsible for designing, building, and maintaining large-scale data systems that enable data-driven decision-making.
They work with various stakeholders to understand data requirements, design data architectures, and implement data pipelines to support business intelligence, analytics, and data science initiatives.
Data Pipeline Development
- Build and maintain data pipelines to extract, transform, and load (ETL) data from various sources.
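For illustration only, a minimal sketch of the kind of ETL pipeline this role builds and maintains, assuming hypothetical source and target paths and column names; real sources, schemas, and tooling depend on the project:

# Minimal PySpark ETL sketch (illustrative; paths and columns are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw CSV files from a hypothetical landing zone.
raw = spark.read.option("header", True).csv("s3://landing-zone/orders/*.csv")

# Transform: cast types, drop rows with missing keys, stamp the load time.
orders = (
    raw.withColumn("order_amount", F.col("order_amount").cast("double"))
       .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .filter(F.col("order_id").isNotNull())
       .withColumn("loaded_at", F.current_timestamp())
)

# Load: write partitioned Parquet to a hypothetical curated warehouse location.
orders.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://warehouse/curated/orders/"
)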
Data Quality & Reliability
- Implement data quality checks and validation processes.
- Ensure data accuracy, consistency, and reliability across systems.
System Optimization & Performance
- Optimize data systems for performance, scalability, and reliability.
- Troubleshoot and resolve data-related issues and system problems.
Collaboration & Requirements Gathering
- Work closely with data scientists, analysts, architects, and other stakeholders to understand data requirements and deliver appropriate solutions.
Continuous Learning & Innovation
- Stay up to date with emerging trends and technologies in data engineering.
- Explore advancements in predictive and prescriptive modelling to drive continuous improvement.
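As a sketch of how such a pipeline might be orchestrated and gated on data quality, assuming Airflow 2.4+ and the hypothetical ETL job above (task names, schedule, and threshold are illustrative):

# Illustrative Airflow DAG sketch: schedule the ETL and apply a simple quality gate.
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_pipeline():

    @task
    def run_etl() -> int:
        # In practice this would submit the PySpark job (e.g. on EMR or Glue);
        # here it only returns a hypothetical row count.
        return 42_000

    @task
    def check_quality(row_count: int) -> None:
        # Minimal data quality check: fail the run if nothing was loaded.
        if row_count == 0:
            raise ValueError("No rows loaded; failing the pipeline run")

    check_quality(run_etl())

orders_pipeline()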
Requirements
- Bachelor's Degree in MIS/Engineering/Computer Science
- Data Warehousing / BI certification is a plus
- Advanced proficiency in SQL
- 5+ years of experience working with at least two of the following ETL tools: DBT, PWC, ODI, DataStage (mandatory)
- 3+ years working with cloud environments (AWS, Azure)
- 3+ years working with Apache Airflow
- 5+ years working in an IT function
- 5+ years of BI development, analysis, data modelling, and support experience
- 5+ years with relational databases (Oracle, SQL Server) or columnar databases (Redshift)
- 3+ years with Spark, Glue, EMR
- 3+ years of experience in scalable Python development (PySpark, Spark SQL)
- 2+ years of experience working with at least two of the following BI tools: Tableau, Qlik, OBIEE
- Flexible and willing to quickly learn different technologies
Other Requirements
- English C1
- Ability to cooperate and work in a multicultural environment
- Communication- and teaching-oriented, with knowledge-transfer ability
- Multi-tasking ability - handling multiple activities in parallel
- Organized and structured
- Up to date on Scrum methodology
- Proactive, flexible, and results-driven, with a "can do" attitude, attention to detail, and problem-solving skills