Data Modeller
Job description
- Define and maintain conceptual, logical, and physical data models that accurately reflect business processes and support analytics, AI/ML, and operational needs.
- Translate business requirements into robust data entities, attributes, relationships, and constraints; ensure traceability from requirements to models.
- Establish and enforce GDM modelling standards and naming conventions (e.g., normalization, dimensional/star/snowflake patterns, data vault where appropriate).
- Design dimensional models (facts, dimensions, hierarchies, slowly changing dimensions) for BI/analytics and performance at scale (see the DDL sketch after this list).
- Create and manage canonical data models and semantic layers to enable consistent metrics and self-service analytics across domains.
- Ensure data quality by design: define integrity rules, reference/master data relationships, and validation checks embedded in pipelines.
- Optimise models for performance and cost (partitioning, clustering, indexing, compression, surrogate keys, distribution strategies).
- Drive data integration design across sources (CDC, event streaming, APIs), mapping source to target, resolving conflicts, and handling historical changes.
- Support AI/ML readiness by modelling features, aggregations, and histories; collaborate on feature stores and model input/output schemas.
- Embed privacy and security requirements into models (PII classification, minimisation, masking, role-based access, retention, and residency).
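As an illustration of the dimensional and partitioning patterns above, a minimal BigQuery DDL sketch follows; the dataset, table, and column names (banking.fct_payments, banking.dim_customer) are hypothetical, not taken from the role.

```sql
-- Hypothetical fact table: partitioned by event date and clustered on the
-- most common filter keys, so typical BI queries scan less data.
CREATE TABLE banking.fct_payments (
  payment_sk    INT64   NOT NULL,  -- surrogate key
  customer_sk   INT64   NOT NULL,  -- joins to dim_customer.customer_sk
  account_sk    INT64   NOT NULL,
  payment_date  DATE    NOT NULL,
  amount        NUMERIC NOT NULL,
  currency_code STRING  NOT NULL
)
PARTITION BY payment_date
CLUSTER BY customer_sk, account_sk;

-- Hypothetical Type 2 slowly changing dimension: each change to a customer
-- closes the current row and opens a new one, preserving full history.
CREATE TABLE banking.dim_customer (
  customer_sk INT64     NOT NULL,  -- surrogate key, unique per row version
  customer_id STRING    NOT NULL,  -- stable business key
  segment     STRING,
  valid_from  TIMESTAMP NOT NULL,
  valid_to    TIMESTAMP,           -- NULL while the row is current
  is_current  BOOL      NOT NULL
);
```

Surrogate keys keep fact rows stable across dimension changes, and the valid_from/valid_to pair supports point-in-time joins for historical reporting.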
Requirements
- Proven experience delivering conceptual, logical, and physical data models for cloud data platforms, ideally GCP
- Strong hands-on modelling for BigQuery (analytical/columnar patterns, denormalization strategy, partitioning & clustering considerations)
- Expertise in data modelling approaches: 3NF, dimensional (Kimball), Data Vault, and hybrid patterns for Lakehouse designs
- Experience maintaining versioned model artefacts (ERDs, schema scripts, JSON/YAML specs) and change logs, and managing controlled evolution of models
- Ability to translate banking domain requirements (Customer, Accounts, Payments, Credit, Risk, Finance) into scalable canonical models
- Strong understanding of how modelling choices drive BigQuery performance and cost (query patterns, storage, scan costs); see the query sketch after this list
- Experience designing data products for analytics and reporting with trusted definitions (facts, dimensions, SCD, conformed dimensions)
- Strong knowledge of data governance: metadata management, lineage, stewardship, data quality rules, and critical data elements
- Proficiency with data modelling tools such as ER/Studio, PowerDesigner, erwin, SQL Developer Data Modeler, or equivalent cloud-native tools.
- Familiarity with GCP ecosystem integration (e.g., Cloud Storage, Dataflow/Dataproc, Pub/Sub) and how ingestion patterns influence modelling
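To make the cost point concrete: against the hypothetical fct_payments table sketched earlier, a query that filters on the partitioning and clustering columns lets BigQuery prune partitions and clustered blocks, so only a fraction of the table is scanned and billed.

```sql
-- Partition pruning (payment_date filter) plus clustering (customer_sk filter)
-- restrict the scan to a small slice of the table.
SELECT
  payment_date,
  SUM(amount) AS total_amount
FROM banking.fct_payments
WHERE payment_date BETWEEN '2024-03-01' AND '2024-03-31'
  AND customer_sk IN (1001, 1002)
GROUP BY payment_date
ORDER BY payment_date;
```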
Benefits & conditions
- Contract: 6 months initially
- Based: Hybrid/London, 2 days per week
- Rate: market rates per day in GBP (via umbrella company)