Data Analyst/Reporting & Analytics Engineer
Hamilton Barnes
Warwick Civil Parish, United Kingdom
Role details
Contract type: Contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Compensation: £130K
Job location: Warwick Civil Parish, United Kingdom
Tech stack
Adaptable Database Systems
Artificial Intelligence
Business Analytics Applications
Azure
Data Architecture
Data Governance
Data Integration
ETL
Python
Performance Tuning
Power BI
Standard SQL
Snowflake
Microsoft Fabric
Machine Learning Operations
Data Pipelines
Job description
Location: Warwick (Hybrid - 3 days onsite)
Contract: 6 months
Rate: £500 per day
We are looking for an experienced Data Analyst/Reporting & Analytics Engineer to support the delivery of modern, scalable data solutions. You will work with tools such as Power BI, Snowflake, and Azure to enable data-driven decision-making and self-service analytics across the business.
Key Responsibilities
- Design and maintain scalable data models (warehouse, lakehouse, data fabric)
- Develop Power BI dashboards, datasets, and reports aligned to business KPIs
- Build and optimise ETL/ELT pipelines and data integration processes
- Support data architecture using Snowflake and Azure services
- Apply Data Fabric and Data Product principles
- Implement data governance, quality, and metadata practices
- Work with stakeholders to translate business needs into technical solutions
Required Skills
- Strong Power BI experience (data modelling, DAX, performance tuning)
- Hands-on experience with Snowflake or similar platforms
- Azure data services knowledge (e.g. Data Factory, Microsoft Fabric)
- Strong SQL and Python (or similar)
- Experience with ETL/ELT and modern data architectures
Desirable
- DataOps.live or similar tools
- Exposure to Promethium or similar platforms
- Experience with AI/ML integration