Remote Data Architect

Insight Global
Broomfield, United States of America
yesterday

Role details

Contract type
Temporary to permanent
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Compensation
$155K

Job location

Broomfield, United States of America

Tech stack

API
Data analysis
Azure
Business Systems
Program Optimization
Continuous Integration
Data Architecture
Information Engineering
Data Transformation
Data Sharing
Dataspaces
Data Systems
Data Warehousing
Dimensional Modeling
Python
SQL Databases
Data Processing
Snowflake
Data Management
Data Pipelines
Databricks
Microservices

Job description

In this role, you will design, extend, and support enterprise data architectures primarily on Azure and Databricks, balancing the maintenance of a recently modernized platform with the delivery of new data solutions across multiple high-impact initiatives. You will develop and maintain dimensional data models to support analytics and reporting, write high-quality and performant SQL and Python code, and integrate data from APIs and near-real-time sources as needed. Working closely with engineering, analytics, and business teams, you will translate evolving source systems and business requirements into scalable, well-governed data architectures while supporting multiple concurrent initiatives as both a hands-on contributor and a subject matter expert. Day to day, you will also apply and promote modern best practices in data architecture, data quality, CI/CD, and platform optimization, contributing to continuous improvement efforts as business systems and data needs evolve.

Requirements

  • Proven experience in a Data Architect or senior data engineering role, with real ownership of architecture decisions

  • Strong hands-on experience with Databricks in production environments

  • Extensive experience working in an Azure-based data ecosystem

  • Advanced proficiency in SQL for performant data transformation and querying

  • Strong Python skills, including working with data pipelines, APIs, and data processing logic

  • Deep understanding of dimensional modeling, including star and snowflake schemas

  • Ability to work across multiple initiatives and collaborate effectively with cross-functional teams

  • Experience supporting or leading enterprise system migrations

  • Familiarity with Snowflake as a downstream analytics or data sharing platform

  • Exposure to microservices-oriented data architectures

  • Experience improving or maintaining CI/CD pipelines for data platforms

  • Knowledge of modern data quality monitoring practices and tooling

Benefits & conditions

This position is a 6-month contract-to-hire. Insurance and a 401(k) are offered throughout the contract. The hourly rate for the role is $70-74.50/hr.
