Senior Data Engineer (Talend & Snowflake)
Job description
As a Senior Data Engineer (Talend and Snowflake), you will play a pivotal role in designing, developing, and maintaining scalable data pipelines and leading the migration from Talend-based ETL processes to a Snowflake-based modern data architecture. You'll collaborate closely with data architects, product teams, analysts, and data scientists to implement robust data solutions that power analytics and reporting across the business.
The ideal candidate will have strong experience in Talend Studio, SQL, Snowflake, Java, Python/shell scripting, data migration strategies, and cloud-based data engineering. You'll be part of a dynamic and innovative team, working with cutting-edge technologies to deliver impactful data products for our retail chain.
Responsibilities
- Data Migration Leadership: Drive the migration of legacy Talend ETL workflows to Snowflake, ensuring minimal disruption and high data integrity.
- ETL/ELT Pipeline Development: Design, optimize, and maintain ETL/ELT pipelines using Talend (current state) and Snowflake/DBT (future state).
- Performance Optimization: Monitor and troubleshoot production data pipelines, ensuring reliability and scalability during and after migration, and improve job performance through parallel processing and resource management.
- Mentorship & Best Practices: Mentor junior engineers and lead technical discussions to establish best practices for modern data engineering.
- Query Optimization & Data Transformation: Write and optimize SQL queries for Snowflake, ensuring performance and scalability.
- Data Vault Modeling: Design effective schemas for data warehouses and lakes, and implement flexible Data Vault models in Snowflake to support large-scale analytics and BI.
- Cross-Team Collaboration: Work with Data Engineers, Product Managers, and Data Scientists to deliver solutions that enable data-driven insights.
- Stakeholder Engagement: Translate business requirements into technical solutions that add measurable value.
- Data Quality & Governance: Enforce governance and quality processes across Talend and Snowflake environments, using Talend's DQ components to profile, cleanse, and validate data.
- Cloud & Infrastructure Support: Build and maintain scalable data solutions on AWS/Azure using DBT, Terraform, and Airflow.
- Continuous Improvement: Identify opportunities to modernize data systems and processes during migration and beyond.
Requirements
- Talend Expertise: Hands-on ETL development and data integration in Talend Studio and Talend Cloud, including job design, components (input/output, transformations, mapping), project organization, and metadata management.
- Data Integration: Skilled at designing efficient data flows and connecting diverse sources such as databases, APIs, and files.
- SQL Proficiency: Proficient in writing complex queries, interacting with databases, performing validation, and developing ETL logic.
- Data Quality: Able to profile, cleanse, and validate data using Talend's DQ components.
- Performance Optimization: Focused on improving job performance, parallel processing, and resource management.
- Data Modeling: Capable of designing effective schemas for data warehouses and lakes.
- Cloud Data Warehouses: Strong experience in Snowflake architecture, query optimization, and DBT for transformation workflows.
- Data Warehousing & ETL/ELT Methodologies: Strong grasp of ETL and ELT concepts and best practices; skilled in designing and implementing data warehouse solutions.
- Data Migration: Proven track record of migrating ETL processes and data pipelines from legacy tools to modern cloud platforms.
- Java: Used for creating custom routines, components, and advanced debugging in Talend.
- Python/Shell Scripting: Utilized for automation, orchestration, and managing environments.
- Cloud Platforms: Familiarity with AWS/Azure and cloud-native data solutions.
- Cloud Data Products: Experience in designing and implementing data products and solutions on cloud-based architectures.
- Version Control: Proficiency in GitHub for code collaboration and CI/CD workflows.
- DevOps & CI/CD: Experienced in managing reliable deployments.
- Data Governance and Compliance: Expertise in implementing data governance frameworks in Alation, including data quality management and compliance with industry regulations, along with a strong understanding of governance and security principles.
- Problem-Solving & Analytical Thinking: Strong problem-solving, analytical thinking, and time-management abilities.
- Effective Communication and Collaboration: Excellent communication skills for interacting with stakeholders, presenting technical concepts, and collaborating with cross-functional teams.
Desirable Skills:
- Data Visualization Tools: Experience with tools such as MicroStrategy or Power BI.
- Infrastructure as Code: Knowledge of Terraform and Terragrunt, along with CI/CD and DevOps principles.
- GenAI Exposure: Understanding of Generative AI technologies for future innovation.
- Familiarity with YAML/JSON for configuration files.
- Basic knowledge of scripting for deployment automation.
- Exposure to large-scale data environments.
Benefits & conditions
This is a hybrid role: you can work remotely in the UK and will attend the London office 3 days per week.
This is a 6+ month temporary contract starting ASAP.
Day rate: competitive market rate.