Sr. BI & Data Architect [Fixed-Term Contract]
Job description
Cint is seeking a Senior BI & Data Architect (Infrastructure & Optimization) to lead the technical evolution of our data platform on an 8-month fixed-term contract.
This is a hands-on architecture role at the intersection of data engineering and business intelligence - one where your decisions will directly shape how the business operates and makes decisions at scale.
You will own the design and implementation of our Databricks Lakehouse, drive the migration of complex legacy SQL into clean, governed pipelines, and build the semantic layer that powers self-service analytics in Omni. If you are energized by performance optimization, dimensional modeling, and infrastructure that lasts - this role was built for you.
Responsibilities
- Design and implement the Unity Catalog structure - Catalogs, Schemas, and Volumes - to create a governed, secure, and well-documented data environment that serves as a Single Source of Truth across the organization.
- Lead the migration of complex business logic from legacy systems into a unified Databricks Lakehouse, refactoring tightly coupled SQL into modular, maintainable, and performant code.
- Architect our internal transformation framework using open-source tooling (Delta Live Tables or custom Python/SQL Spark pipelines), building scalable pipelines without reliance on managed SaaS platforms.
- Serve as the resident query performance expert - analyze Spark execution plans and Spark UI to diagnose bottlenecks, reduce data skew, and optimize join strategies on large-scale datasets.
- Govern our Databricks compute footprint through strategic application of Z-Ordering, Liquid Clustering, partition design, and Serverless SQL Warehouse configurations to maximize performance per dollar.
- Build and maintain CI/CD pipelines (GitHub Actions or equivalent) to automate testing, validation, and deployment of data models.
- Architect the semantic layer in Omni - designing data models built for self-service reporting with sub-second dashboard latency.
- Occasionally take on the BI Developer role, building executive-level dashboards that surface clear, actionable narratives from complex datasets.
- Partner with cross-functional stakeholders across Finance, Sales, Product, Marketing, and Trust & Safety to translate business questions into scalable data solutions.
- Translate performance and cost metrics into clear recommendations for senior leadership, balancing engineering rigor with business impact.
Requirements
- 8+ years in Data Engineering or Data Architecture, with deep, hands-on experience in the Databricks ecosystem (Unity Catalog, Delta Lake, SQL Warehouses).
- Expert-level SQL and distributed computing skills - you can diagnose exactly why a query is slow and implement the fix across Spark execution plans, joins, and data skew scenarios.
- Demonstrated experience in query optimization and data platform migration, including refactoring legacy SQL and migrating from systems such as Snowflake, Redshift, or SQL Server into a Lakehouse architecture.
- Proven experience building data transformation workflows using Delta Live Tables or custom Python/SQL Spark pipelines.
- Strong command of dimensional modeling (Star and Snowflake schemas) and how to apply these patterns within a Lakehouse environment.
- Hands-on Unity Catalog experience, including permissions management, data lineage, and catalog governance.
- Fluency with Delta Lake internals - OPTIMIZE, Z-Order, Liquid Clustering, VACUUM, and file management strategies.
- Hands-on experience with Omni or a comparable modern BI semantic layer (Looker, Metabase, Superset) - you can build and govern a semantic model, not just consume one.
- Experience with CI/CD pipelines and version control workflows (Git / GitHub Actions or similar).
Essential Qualities
- You think in systems, not just queries - you design for reuse, performance, and longevity.
- You are comfortable navigating ambiguity and can move from whiteboard to working pipeline without waiting for perfect specifications.
- You can distill complex technical trade-offs into language that lands with non-technical senior stakeholders.
- You are self-directed and ownership-driven - you treat the data platform as your own product.
- You can balance the competing needs of different departments while maintaining a unified data standard.
Nice to Have
- Experience with dbt Core (OSS) or a similar SQL-based transformation framework, along with familiarity with transformation-layer best practices.
- Familiarity with Databricks Genie Spaces or AI/BI features for natural language querying.
- Experience supporting embedded analytics or multi-tenant reporting environments.
- Background in the digital insights, market research, or programmatic technology space.