Data Engineer
Job description
This is an exciting opportunity for a Data Engineer to join our Data, Analytics and Architecture team, where you'll work on fast-paced, innovative data projects on our new Snowflake platform.
We're looking to work closely with our business colleagues to develop our data capabilities on the Snowflake platform, utilising some of its cutting-edge features to drive rapid business insights and build stronger analytical and data science capabilities. Opportunities like this don't come along often!
Don't worry, you don't need to be a Snowflake expert; we'll help you broaden your skill set in that department. What you will have is a background in Data Warehouse development, experience designing and implementing ETL pipelines on a variety of technologies, and ideally some Data Modelling experience.
You'll be an enthusiastic learner, keen and inquisitive about new technologies and methods, and we'll support your fearless mindset, encouraging you to "have a go". Your positive outlook will also rub off on your colleagues and peers, as you are clearly a great team player.
On top of the technical challenges, you will also have the opportunity to develop your broader data skills as Zurich looks to further leverage the exciting new technologies being developed on our new cloud-based ecosystem.
As an Agile-focused organisation, we put our customers at the heart of everything we do, so you'll be comfortable working with a broad base of business stakeholders in a highly collaborative environment.
Many of our employees work flexibly in a variety of different ways, including part-time, flexible hours, job share, an element of working from home or compressed hours. This is because we want the best people for our roles, and we recognise that sometimes those people aren't available full-time. Please talk to us at interview about the flexibility you may need.
What will you be doing?
- Take a key role in the design, build and implementation of data pipelines onto the Snowflake platform from input data sources with a variety of formats, frequencies and latencies
- Work with key business consumers to prototype, build and enhance data interfaces, reporting data marts and analytical models
- Lead the charge on innovation and experimentation, driving an automation-first mindset
- Work closely with our Data Science and Analytics team to leverage opportunities within our data for insight and business value
- Proactively drive sprint planning and the creation of tasks to help understand burndown, identify and remove blockers and drive continual improvement
Requirements
Personal skills
- Excellent problem-solving skills and able to deal with ambiguity.
- Ability to work under pressure and shift priorities depending on business need.
- Comfortable providing technical support and direction to colleagues.
Technical Skills
- Experienced in ETL/ELT techniques for integrating data into Data Warehouse solutions.
- Solid understanding of Relational Databases & Data Warehouse methodologies, with some understanding of concepts such as Kimball, Inmon & Data Vault.
- Knowledge of architectures and methodologies for metadata management, performance management and handling data quality issues.
- Experience working in a DevOps environment using Agile project management methodologies such as Scrum or Kanban.
- Experience developing in cloud environments such as Azure, Snowflake or AWS.
Benefits & conditions
Salary: Up to £60,000 depending on experience plus an excellent benefits package