Data Engineer (Remote)

T-Systems Iberia
Madrid, Spain

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English

Job location

Remote
Madrid, Spain

Tech stack

Artificial Intelligence
Airflow
Google BigQuery
Cloud Computing
Data Integration
DNS
Hadoop
Hive
Identity and Access Management
Virtual Private Networks (VPN)
Python
Machine Learning
Network Architecture
Systems Integration
Data Processing
Google Cloud Platform
File Transfer Protocol (FTP)
Load Balancing
Firewalls
Amazon Web Services (AWS)
GitLab
PySpark
GitLab CI
Kubernetes
Terraform
Data Pipelines

Job description

- Infrastructure Deployment & Management: Efficiently deploy and manage infrastructure on Google Cloud Platform (GCP), including network architectures (Shared VPC, Hub-and-Spoke), security implementations (IAM, Secret Manager, firewalls, Identity-Aware Proxy), DNS configuration, VPN, and Load Balancing.
- Data Processing & Transformation: Use the Hadoop cluster with Hive for querying data and PySpark for data transformations; implement job orchestration using Airflow (minimal sketches of both follow this list).
- Core GCP Services Management: Work extensively with services such as Google Kubernetes Engine (GKE), Cloud Run, BigQuery, Compute Engine, and Composer, all managed through Terraform.
- Application Implementation: Develop and implement Python applications for various GCP services.
- CI/CD: Integrate and manage GitLab Magenta CI/CD pipelines for automating cloud deployment, testing, and configuration of diverse data pipelines.
- Security: Implement comprehensive security measures; manage IAM policies and secrets using Secret Manager; enforce identity-aware policies.
- Data Integration: Handle integration of data sources from CDI, Datendrehscheibe (FTP servers), TARDIS APIs, and Google Cloud Storage (GCS).
- Multi-environment Deployment: Create and deploy workloads across Development (DEV), Testing (TEST), and Production (PROD) environments.
- AI/ML: Implement AI solutions using Google's Vertex AI for building and deploying machine learning models.
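To make the data-processing bullet concrete, here is a minimal PySpark sketch of a Hive-backed transformation. The table and column names (raw.orders, curated.daily_order_totals, status, created_at, amount) are purely illustrative assumptions, not part of the actual project:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hive-backed SparkSession; table and column names below are illustrative.
spark = (
    SparkSession.builder
    .appName("daily-orders-transform")
    .enableHiveSupport()
    .getOrCreate()
)

# Read a raw Hive table, apply a typical cleansing/aggregation pass,
# and write the result back as a partitioned managed table.
orders = spark.table("raw.orders")

daily_totals = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date", "country")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("amount").alias("total_amount"),
    )
)

(
    daily_totals.write
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("curated.daily_order_totals")
)
```

Because enableHiveSupport() is set, Spark resolves raw.orders and curated.daily_order_totals through the Hive metastore rather than through explicit file paths.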
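The Airflow orchestration mentioned above could then wrap such a job in a scheduled DAG. This is only a sketch: the DAG id, schedule, application path, and connection id are assumptions, and a real Cloud Composer deployment would follow the team's own conventions.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

# Hypothetical daily DAG that submits the PySpark transformation job.
with DAG(
    dag_id="daily_order_totals",   # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",             # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    transform = SparkSubmitOperator(
        task_id="transform_orders",
        application="/opt/jobs/daily_order_totals.py",  # assumed script path
        conn_id="spark_default",                        # assumed Spark connection id
    )
```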

Requirements

- Certified GCP Cloud Architect or Data Engineer
- Strong proficiency in Google Cloud Platform (GCP)
- Expertise in Terraform for infrastructure management
- Skilled in Python for application implementation
- Experience with GitLab CI/CD for automation
- Deep knowledge of network architectures, security implementations, and management of core GCP services (see the Secret Manager sketch below)
- Proficiency with data processing tools such as Hive and PySpark, and orchestration tools such as Airflow
- Familiarity with managing and integrating diverse data sources

Hybrid work model (telework/face-to-face). Continuous training. Please send your CV in English.
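As a small illustration of the Secret Manager skills asked for above, the following Python sketch reads a secret version with the google-cloud-secret-manager client; the project and secret ids are placeholders, not values from this posting.

```python
from google.cloud import secretmanager

def read_secret(project_id: str, secret_id: str, version: str = "latest") -> str:
    """Fetch a secret payload from GCP Secret Manager."""
    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{project_id}/secrets/{secret_id}/versions/{version}"
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("UTF-8")

# Placeholder ids; a real deployment would resolve these per environment
# (DEV/TEST/PROD) rather than hard-coding them.
db_password = read_secret("my-gcp-project", "db-password")
```

Resolving secrets at runtime like this keeps credentials out of code and lets the same workload run unchanged across the DEV, TEST, and PROD environments.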

About the company

At T-Systems, you will find groundbreaking projects that contribute to social and ecological well-being. It doesn't matter when or where you work; what matters is doing meaningful work that advances society. For this reason, we will do everything possible to ensure you have every development opportunity, providing a support network, excellent technology, a modern work environment, and the freedom to work autonomously. T-Systems is a team of around 28,000 employees worldwide, making us one of the leading global providers of end-to-end integrated solutions. We develop hybrid cloud and artificial intelligence solutions and drive the digital transformation of companies, industries, the public sector, and, ultimately, society as a whole.

Project Description: We are looking for highly skilled Data Engineers to join our team in DBIZ. The Technik Value Stream teams are responsible for data ingest on ODE, development of the relevant data products, and operation of those data products on ODE.
