Databricks Admin - INTL India

Insight Global
Pittsburgh, United States of America
1 month ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Intermediate
Compensation
$30K

Job location

Pittsburgh, United States of America

Tech stack

API
Artificial Intelligence
Amazon Web Services (AWS)
Azure
Big Data
Google BigQuery
Continuous Integration
Data as a Service
Information Engineering
Data Governance
ETL
Data Migration
Data Warehousing
DevOps
GitHub
Python
Machine Learning
SQL Databases
Systems Integration
Data Processing
Spark
Data Lake
Data Management
Machine Learning Operations
Data Pipelines
Legacy Systems
Jenkins
Redshift
Databricks

Job description

We are seeking an experienced Databricks Architect to lead the design, implementation, and optimization of our enterprise data and AI platform. The ideal candidate will have strong expertise in Databricks, cloud platforms (Azure/AWS/GCP), and big data ecosystems, along with a proven track record of architecting scalable and secure solutions.

Responsibilities

  • Design, architect, and implement end-to-end data and AI solutions using Databricks and related ecosystem tools.

  • Define architecture standards, best practices, and governance for Databricks-based data platforms.
  • Collaborate with business stakeholders, data engineers, data scientists, and analysts to design efficient and scalable data pipelines.
  • Optimize data processing, performance, and cost-efficiency across cloud and Databricks environments.
  • Lead migration projects from legacy systems to Databricks/Lakehouse architectures.
  • Ensure solutions meet enterprise-grade standards for security, compliance, and data quality.
  • Mentor junior engineers and provide architectural guidance to technical teams.

Requirements

  • 6-9 years of overall IT experience, with at least 3-4 years in Databricks architecture and development.
  • Hands-on expertise with Databricks Lakehouse, Delta Lake, Spark, and MLflow.
  • Strong experience with SQL, Python, or Scala for data engineering and advanced analytics.
  • Proven background in cloud platforms (Azure, AWS, or GCP) with focus on data services (e.g., Azure Data Lake, AWS S3, BigQuery, Synapse, Redshift).
  • Deep understanding of data modeling, ETL/ELT design, and data governance frameworks.
  • Experience integrating Databricks with enterprise systems (data warehouses, BI tools, APIs).
  • Strong communication and stakeholder management skills.

Nice to Have Skills & Experience

  • Experience with CI/CD, DevOps for Data (Databricks Repos, GitHub Actions, Azure DevOps, Jenkins, etc.).
  • Knowledge of machine learning pipelines and operationalizing ML in Databricks.
  • Certifications in Databricks, Azure, AWS, or GCP.
  • Experience in leading large-scale data migration/modernization projects.

Benefits & conditions

Benefit packages for this role will start on the 1st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.

Apply for this position