LMRe Analytics Engineer
Job description
The Analytics Engineer will design, build and optimise scalable, well-governed data models and solutions that power analytics and enterprise reporting. You will code primarily in SQL and Python, working with large, complex datasets in cloud environments to deliver trusted, timely data to stakeholders across Actuarial, Finance, Operations and Underwriting. Partnering closely with Data Architects, Data Engineers, BI Developers and business teams, you will lead and contribute to complex data change initiatives, from requirements and solution design through to delivery, documentation and ongoing optimisation. The goal is to support better decision-making within the business by leveraging data and software solutions. This will involve using current technologies such as GitHub and SQL, and exploring new Microsoft releases and other tools as they become available.

Data modelling and solution design
o Design, implement and maintain canonical, reusable data models (e.g., star/snowflake dimensional models, data marts) to support MI, actuarial analytics and self-service reporting.
o Translate business requirements into functional and technical specifications, including source-to-target mappings, lineage and definitions.
o Ensure models adhere to enterprise architecture standards, data governance, privacy and security policies.

Engineering and delivery
o Build robust, testable data transformations and pipelines using SQL and Python; leverage orchestration and CI/CD tooling for repeatable delivery.
o Optimise models and queries for performance and cost in cloud environments (Azure or AWS).
o Implement data quality checks, unit/integration tests, monitoring and alerting; troubleshoot data issues and drive root-cause resolution (a minimal sketch of this pattern appears after this description).

Collaboration and stakeholder engagement
o Work closely with Data Architects on target-state design, standards and patterns; partner with Data Engineers on ingestion, storage and performance considerations.
o Collaborate with Actuarial and Finance teams to deliver accurate, complete and timely datasets for valuation, reserving, pricing and MI.
o Engage with BI Developers to ensure models are analytics-ready for tools such as Power BI, including semantic layer design and performance tuning.
o Build strong relationships with internal stakeholders and external vendors; communicate progress, risks and impacts clearly and proactively.

Governance, standards and change
o Champion version control, code review, documentation and environment management using GitHub and agreed branching strategies.
o Contribute to and help embed data standards, best practices and reusable patterns across teams in the UK and India.
o Lead or support complex data change initiatives, managing backlogs, RAID logs and delivery plans in Agile frameworks.
o Produce and maintain comprehensive technical documentation, data dictionaries and runbooks.

Continuous improvement and innovation
o Identify and prioritise opportunities for standardisation, automation and cost/performance optimisation.
o Research and evaluate new data sources, features and tools to enrich data products.
o Promote a culture of data literacy and self-service through enablement and knowledge sharing.
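By way of illustration only, the sketch below shows the kind of testable transformation with a data quality gate described under Engineering and delivery. It is a minimal example, not the team's actual codebase; all table and column names (premium_gbp, underwriting_year) are hypothetical.

# Minimal sketch: a unit-testable Python transformation with a simple
# data quality check, in the spirit of the duties listed above.
# All dataset, column and function names are illustrative assumptions.
import pandas as pd

def transform_premiums(raw: pd.DataFrame) -> pd.DataFrame:
    """Standardise column names, drop incomplete rows and aggregate
    premium by underwriting year."""
    return (
        raw.rename(columns=str.lower)
           .dropna(subset=["premium_gbp"])
           .groupby("underwriting_year", as_index=False)["premium_gbp"]
           .sum()
    )

def check_no_negative_premiums(df: pd.DataFrame) -> None:
    """Fail fast if any aggregated premium is negative, a simple quality gate
    of the kind that would feed monitoring and alerting."""
    bad = df[df["premium_gbp"] < 0]
    if not bad.empty:
        raise ValueError(f"{len(bad)} underwriting years with negative premium")

if __name__ == "__main__":
    raw = pd.DataFrame({
        "UNDERWRITING_YEAR": [2023, 2023, 2024],
        "PREMIUM_GBP": [100.0, 250.0, None],
    })
    tidy = transform_premiums(raw)
    check_no_negative_premiums(tidy)
    print(tidy)

Because the transformation and the check are plain functions, each can be covered by unit tests and wired into CI/CD, which is the repeatable-delivery pattern the role describes.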
Requirements
Do you have experience in T-SQL?

o Experience in data/analytics engineering or closely related roles within financial services, insurance, reinsurance, pensions or investments.
o Strong data modelling expertise (conceptual, logical, physical) and dimensional design for MI/BI and analytics at scale (see the illustrative sketch after this list).
o Advanced SQL (T-SQL preferred) and Python for data transformation, automation and testing.
o Proven delivery of large, complex data projects in cloud environments (Azure preferred, e.g., Data Lake, Synapse, Databricks, ADF, or AWS equivalents).
o Strong understanding of database design, warehousing and performance tuning; excellent SSMS proficiency.
o Hands-on with version control and collaboration (GitHub), including branching, pull requests, code reviews and documentation.
o Experience implementing data quality frameworks, testing methodologies and monitoring (e.g., unit tests, data reconciliation, lineage).
o Familiarity with ETL/ELT and ELT-as-code approaches and orchestration (e.g., ADF, Airflow, dbt or equivalent).
o Experience supporting or enabling BI and analytics (Power BI highly desirable), including semantic models, DAX optimisation and capacity/performance considerations.
o Background working with Actuarial and Finance stakeholders; comfortable delivering critical datasets on tight timelines with strong controls.
o Excellent stakeholder management and communication skills across technical and non-technical audiences; proven ability to manage complex workloads and competing priorities.
o Knowledge of Agile methodologies and practical experience managing backlogs, sprints and incremental delivery.
o Understanding of security, privacy and compliance in regulated environments; experience with access controls, PII handling and audit readiness.
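As a small illustration of the dimensional design mentioned above, the sketch below joins a hypothetical claims fact table to a date dimension and aggregates by year, the canonical star-schema query pattern behind MI reporting. It uses Python's built-in sqlite3 module only so the example is self-contained and runnable; in the role itself the equivalent query would typically be T-SQL against the warehouse. All table and column names (fact_claims, dim_date, paid_gbp) are illustrative.

# Minimal star-schema sketch: one fact table, one conformed dimension.
# sqlite3 is used purely so this runs anywhere; names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,
    calendar_year INTEGER
);
CREATE TABLE fact_claims (
    date_key INTEGER REFERENCES dim_date(date_key),
    paid_gbp REAL
);
INSERT INTO dim_date VALUES (20240101, 2024), (20250101, 2025);
INSERT INTO fact_claims VALUES (20240101, 1200.0), (20250101, 800.0);
""")

# Aggregate a fact measure by a dimension attribute: the basic MI pattern
# that a well-designed dimensional model makes simple and fast.
for year, total in conn.execute("""
    SELECT d.calendar_year, SUM(f.paid_gbp)
    FROM fact_claims AS f
    JOIN dim_date AS d ON d.date_key = f.date_key
    GROUP BY d.calendar_year
    ORDER BY d.calendar_year
"""):
    print(year, total)

Keeping measures in narrow fact tables and descriptive attributes in dimensions is what lets the same model serve Power BI semantic layers, actuarial analytics and self-service reporting alike.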