Databricks Architect

Paradigm
Houston, United States of America
1 month ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

Houston, United States of America

Tech stack

Artificial Intelligence
Amazon Web Services (AWS)
Big Data
Cloud Computing
Cloud Engineering
Code Review
Computer Programming
Continuous Delivery
Continuous Integration
Data Governance
Fault Tolerance
GitHub
Python
Machine Learning
Pair Programming
Performance Tuning
Software Engineering
YAML
Data Logging
Spark
Data Lake
Information Technology
Data Management
Machine Learning Operations
Terraform
Data Pipelines
Databricks

Job description

The Principal Architect, Databricks will act as our top technical expert and lead practitioner for the Databricks Platform. This role combines architectural leadership with practical implementation. You will be responsible not only for defining the architectural vision but also for developing core, reusable patterns and reference architectures that will accelerate our teams. The ideal candidate is a master of the Databricks ecosystem who leads by example, demonstrates what's possible through hands-on development, and empowers teams to build robust, scalable data solutions.

  • Develop and Implement Reference Architectures: Design, build, and maintain a library of production-quality reference architectures and reusable patterns that showcase best practices and accelerate development for engineering teams.
  • Architect and Prototype Solutions: Architect and build proofs-of-concept for end-to-end solutions on the Databricks Lakehouse Platform, actively demonstrating feasibility and validating complex designs through hands-on implementation.
  • Advise Through Doing: Serve as the primary consultant for engineering teams on all aspects of Databricks, providing expert guidance that extends beyond diagrams to include code, best practices, and hands-on support.
  • Lead Platform Training: Develop and deliver training sessions for engineers, leading the adoption and implementation of new features such as Unity Catalog, Delta Live Tables, and advanced MLOps capabilities.
  • Establish and Govern Best Practices: Define, document, and evangelize standards for Databricks development, including data modeling, performance tuning, security, and cost management.
  • Mentor and Coach: Mentor engineers and other technical staff through code reviews, pair programming, and design sessions, elevating the overall technical proficiency of the organization within the Databricks ecosystem.

Requirements

  • Bachelor's degree in Computer Science or a related field. Required.
  • 10+ years of experience in software engineering, including significant experience architecting systems. Required.
  • Proven experience acting as a hands-on technical architect and advisor on large-scale data projects. Required
  • Databricks Mastery: Deep, expert-level knowledge of the Databricks Platform, including:
    • Unity Catalog: Designing and implementing data governance and security.
    • Delta Lake & Delta Live Tables: Architecting and building reliable, scalable data pipelines.
    • Performance & Cost Optimization: Expertise in tuning Spark jobs, optimizing cluster usage, and managing platform costs.
    • MLOps: Strong, practical understanding of the machine learning lifecycle on Databricks using tools like MLflow.
    • Databricks SQL: Knowledge of designing and optimizing analytical workloads.
    • Mosaic AI: Knowledge of designing and optimizing AI agents.
  • Cloud & Infrastructure: Deep knowledge of cloud architecture and services on AWS. Strong command of Infrastructure as Code (Terraform, YAML).
  • Software Engineering & Programming: Strong background in software engineering and building large fault-tolerant systems.
  • CI/CD & Automation: Experience with designing and implementing CI/CD pipelines (preferably with GitHub Actions) for data and ML workloads.
  • Observability: Familiarity with implementing monitoring, logging, and alerting for data platforms.
  • Automation: The platform is ephemeral, and all changes are implemented using Terraform and Python; expertise in both is a must.
  • Excellent communication and interpersonal skills, with the ability to influence and guide technical teams and stakeholders effectively.
  • A strategic mindset with a passion for solving complex data challenges and driving business outcomes through technology.
  • The ability to think critically, challenge assumptions, and make clear, well-reasoned architectural decisions.

TRAVEL REQUIRED

Minimal travel is required for this position (up to 10% of the time and on a domestic basis).

Benefits & conditions

  • Medical, Dental & Vision: eligible after 30 days of employment.
  • 401(k): 4% company match (1:1), starting day one; vests after 2 years.
  • 27 days of PTO in a full year. 10 paid holidays.
  • Eligible to participate in vehicle program and performance bonuses