Data Warehouse Engineer
Job description
The Senior Data Warehouse Engineer role encompasses full-stack BI development skills, from stakeholder requirements gathering through specification, data mart design, report building, and quality management. You will work closely with the analytics, data engineering, and platform teams to deliver cutting-edge data solutions and help shape the organisation's data-driven future. As a Senior Data Warehouse Engineer, you will be a seasoned professional leading many initiatives whilst continuing to learn, grow, and further develop your skills, acting as an authority within the responsibilities laid out below.

Job responsibilities:
- Lead the architectural redesign of the existing data warehouse to align with modern ELT best practices.
- Share responsibility, along with the rest of the team, for the continued availability of the existing UNiDAYS data platform for both reporting and Insights: monitoring, responding to support tickets, and fixing issues.
- Optimise data models in Amazon Redshift for performance, scalability, and cost efficiency.
- Ensure data quality, lineage, and documentation standards are enforced across pipelines.
- Collaborate with data analysts, engineers, and stakeholders to capture requirements and validate solutions.
- Set up CI/CD practices for dbt deployments and promote version-controlled transformations.
- Support incremental loading strategies and design effective partitioning and snapshotting techniques.
- Assist with production rollout, monitoring setup, and knowledge transfer to internal teams.
- Support colleagues through career development, coaching, and co-development/co-design.
- Data Quality & Governance: Monitor, test, and ensure the reliability, compliance, and security of data systems; respond to support tickets and fix issues; and maintain lineage standards.
- Collaboration & Handoff: Translating business/analyst needs into engineering solutions; clear documentation (data design specifications, etc.); co-development and co-design; peer reviews; and team discussions.
- Project & Stakeholder Management: Independently scoping, estimating, prioritising, and delivering data projects across teams; representing the team in cross-department contexts; and integrating into Agile practices.
- Influence & Leadership: Setting best practices, mentoring, driving platform decisions, aligning across teams, supporting career development and coaching, and proposing improvements.
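As a flavour of the incremental loading work mentioned in the responsibilities above, a minimal high-water-mark sketch might look like the following. The table and column names (`id`, `updated_at`) are hypothetical illustrations, not UNiDAYS schemas:

```python
from datetime import datetime

def incremental_load(source_rows, last_loaded_at):
    """Return only rows changed since the previous load, plus the new watermark.

    Hypothetical sketch: each row is a dict with an 'updated_at' timestamp;
    only rows newer than the stored high-water mark are picked up.
    """
    new_rows = [r for r in source_rows if r["updated_at"] > last_loaded_at]
    # Advance the watermark to the newest row seen, or keep the old one
    # if nothing changed.
    new_watermark = max((r["updated_at"] for r in new_rows), default=last_loaded_at)
    return new_rows, new_watermark

# Example: two of three rows are newer than the stored watermark.
rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "updated_at": datetime(2024, 1, 5)},
    {"id": 3, "updated_at": datetime(2024, 1, 9)},
]
loaded, watermark = incremental_load(rows, datetime(2024, 1, 3))
```

In practice this pattern is typically expressed as a dbt incremental model or an orchestrated SQL job rather than application code; the sketch only illustrates the watermark logic.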
Job benefits: This is your opportunity to join the world's largest student verification network and help shape the future of how brands connect with the next generation. We engage a community of 23M+ verified students and graduates across 115 markets, making us a truly global platform. Our brand is a powerhouse in the UK, and we're rapidly accelerating our presence worldwide, especially in the US, Germany, India, France, Canada, and Australia.
We partner with 850+ of the world's biggest brands, bringing their products and services to the hearts and minds of tomorrow's professionals, driving engagement, building affinity, and delivering real results.
We offer a fast-paced, fun, and social working environment where you can truly make an impact. We believe work should enhance and complement your life, which is why we provide a flexible hybrid working model. While there are expectations to attend the London campus periodically, you and your manager will determine the most appropriate balance together, ensuring both your needs and the needs of the business are met.
Requirements
- Job requirements: Data Warehouse maintenance and schema design - Star, Snowflake, normalisation, de-normalisation
- SQL development
- ETL & ELT development - Talend tools and/or dbt preferable, or equivalent
- Supporting report developers - Tableau preferable, or equivalent
- End to end BI development (requirements to report delivery)
- AWS technology stack development - S3, Redshift, RDS, Athena, etc
- Big Data platform development - BigQuery
- Working with Insight Analysts/Data Scientists and their environments - e.g. R, Jupyter, Scala
- Some form of non-BI programming experience - e.g. application, website, systems development
- Experience of developing in, or learning, industry useful programming languages - e.g. Java and Python
- Platform familiarity and understanding - Linux and/or Unix command line, including Bash/shell scripting
- Experience with version-controlled repositories - Git-based workflows
- Experience with data pipeline orchestration and scheduling
- Nice to have: knowledge of data governance, lineage, and quality monitoring practices.
- Nice to have: experience with Airflow, Fivetran, and Google BigQuery.
- Data Architecture & Engineering: Independently building scalable pipelines, modelling data flows, a strong understanding of warehouse/lake architectures such as the star schema, and designing and configuring batch/streaming systems to optimise processing.
- Programming & Automation: Expert proficiency in SQL plus one of dbt, Python, shell scripting, etc., for pipeline automation; ETL development; testing, QA, and proof of correctness; production releases and deployments; optimising data models for performance, scalability, and cost efficiency; setting up and following CI/CD practices; and implementing partitioning and snapshotting techniques.
Benefits & conditions
- Competitive salaries
- 4pm finishes every Friday
- Company pension scheme
- Private health insurance (BUPA)
- Dental Insurance (BUPA)
- Income protection policy
- Life assurance policy
- Employee Assistance Program
- Enhanced parental leave pay
- Regular team building activities
- £150 towards your home office set-up