Senior Data Engineer
Job description
As a Senior Data Engineer at Kyndryl's AI Innovation Hub, you will architect, build, and optimize large-scale data platforms that power advanced analytics and AI-driven solutions. You'll be responsible for designing robust data pipelines, ensuring data quality, and enabling seamless integration across cloud and on-premises environments. Working closely with data scientists, architects, and business stakeholders, you will transform complex data requirements into scalable, governed, and actionable data assets that drive enterprise innovation.
Your Mission
- Design, implement, and optimize ETL/ELT pipelines and data models for structured and unstructured data at scale.
- Build and maintain scalable data architectures supporting batch and real-time processing across cloud (AWS, Azure, GCP) and hybrid environments.
- Ensure data quality, lineage, and governance through robust validation frameworks and policies.
- Develop and maintain APIs and integrations for data access, supporting AI and analytics use cases.
- Collaborate with cross-functional teams to deliver data solutions aligned with business and AI needs.
- Mentor junior engineers and champion best practices in DataOps, CI/CD, and pipeline orchestration.
- Evaluate and integrate emerging data technologies, tools, and frameworks to enhance platform capabilities.
Who You Are
Requirements
- 4+ years of experience building and maintaining large-scale data warehouses and data pipelines in enterprise environments.
- Strong programming skills in Python and SQL, with hands-on experience in ETL/ELT tools (e.g., Airflow, dbt, Databricks, Kafka).
- Expertise in data modeling, distributed systems, and cloud-native data services (AWS, Azure, GCP).
- Experience with BI tools such as Looker or PowerBI for data visualization and reporting.
- Knowledge of data governance, lineage, and observability tools.
- Experience with data pipeline tooling (Databricks, Cloudera, Teradata, Snowflake).
- Familiarity with DevOps/DataOps practices and CI/CD pipelines for data workflows.
- Fluent or native in Spanish; strong communication skills.
Education & Certifications
- Bachelor's or Master's degree in Computer Science, Mathematics, Engineering, or a related technical field.
- Postgraduate studies in Data Engineering, Cloud Computing, or Applied Data Science are valued.
- Professional certifications in cloud platforms (AWS, Azure, GCP) or data engineering tools are a plus.
- Demonstrated commitment to continuous learning and staying current with advances in data engineering and cloud technologies.
Preferred Skills
- Experience in regulated industries or environments with stringent compliance requirements.
- Knowledge of containerization and orchestration (Docker, Kubernetes).
- Familiarity with semantic modeling, metadata management, and emerging AI/ML data integration patterns.
- Cloud platform certification.
- Experience mentoring and leading technical teams.
Soft Skills
- Analytical and structured thinking, with a focus on solving complex data challenges.
- Clear and adaptive communication, able to explain technical concepts to both technical and business audiences.
- Collaborative spirit, thriving in multidisciplinary teams.
- Proactive and innovative mindset, continuously exploring new tools and methodologies.
- Commitment to quality, reliability, and continuous improvement.
#AgenticAI