Data Engineer
Intelligent Steps
Charing Cross, United Kingdom
Role details
Contract type
Temporary contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Compensation
£72K
Job location
Remote
Charing Cross, United Kingdom
Tech stack
API
Airflow
Amazon Web Services (AWS)
Azure
Cloud Computing
Databases
Continuous Integration
Data Governance
Data Infrastructure
ETL
Data Security
Data Systems
Data Warehousing
DevOps
Document-Oriented Databases
Power BI
Standard SQL
SQL Databases
Data Streaming
Tableau
Talend
Informatica PowerCenter
Delivery Pipeline
Snowflake
Kafka
Stream Processing
Looker Analytics
Data Pipelines
Job description
- Design, develop, and maintain robust data pipelines and ETL/ELT processes
- Build and optimize data models within Snowflake for performance and scalability
- Ingest data from various sources (APIs, databases, streaming platforms, etc.)
- Ensure data quality, integrity, and governance across systems
- Collaborate with data analysts, scientists, and business stakeholders to deliver data solutions
- Monitor and troubleshoot data workflows and pipeline performance
- Implement best practices for data security, privacy, and compliance
- Document data architecture, processes, and workflows
Technologies:
- Airflow
- AWS
- Azure
- CI/CD
- Cloud
- DevOps
- ETL
- Fivetran
- GCP
- Informatica
- Support
- Kafka
- Looker
- Power BI
- SQL
- Security
- Snowflake
- Tableau
- Talend
- dbt

We are looking for a skilled Data Engineer with strong experience in Snowflake to join our growing data team. You will be responsible for designing, building, and maintaining scalable data pipelines and architectures that support analytics, reporting, and data-driven decision-making across the organization. This is a 6-month initial contract (Outside IR35), and you can work remotely with occasional travel into London (1 day per month).
Requirements
- Proven experience as a Data Engineer or in a similar role
- Strong hands-on experience with Snowflake data platform
- Proficiency in SQL and data modeling techniques
- Experience with ETL/ELT tools (e.g., dbt, Apache Airflow, Talend, Informatica)
- Experience with cloud platforms (AWS, Azure, or GCP)
- Familiarity with data warehousing concepts and best practices
- Understanding of data governance and data quality principles
- Experience with modern data stack tools (e.g., dbt, Fivetran, Kafka) (preferred)
- Knowledge of CI/CD pipelines and DevOps practices (preferred)
- Experience working with large-scale or real-time data processing systems (preferred)
- Familiarity with BI tools (e.g., Power BI, Tableau, Looker) (preferred)
- Snowflake certification (preferred)