Databricks Solutions Architect

Bitsoft International, Inc.
New York, United States of America

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Experience level
Senior

Job location

New York, United States of America

Tech stack

Amazon Web Services (AWS)
Azure
Big Data
Cloud Engineering
Data Architecture
Information Engineering
Data Security
Python
SQL Databases
Google Cloud Platform
Data Ingestion
Spark
Data Strategy
Data Lake
Data Management
Machine Learning Operations
Databricks

Job description

Architect and implement end-to-end Lakehouse solutions leveraging Databricks.

  • Drive data platform modernization initiatives and oversee migration to cloud-native architectures.
  • Collaborate with business stakeholders to define data strategy, ensuring alignment with organizational objectives.
  • Develop best practices for data ingestion, processing, and governance in Databricks.
  • Mentor junior engineers and provide technical leadership in solution delivery.

Requirements

We are seeking an experienced Databricks Solutions Architect to lead the design and deployment of scalable Lakehouse architectures. The ideal candidate will have over 10 years of experience in data engineering or architecture, with a proven track record of delivering end-to-end Databricks implementations.

  • Minimum 10 years of experience in data architecture, engineering, or analytics.

  • Hands-on experience architecting and implementing Databricks Lakehouse platforms.
  • Deep understanding of cloud platforms (Azure, AWS, or Google Cloud Platform) and big data tools.
  • Expertise in Python, Spark, and SQL for large-scale data processing.
  • Databricks or relevant cloud certification (preferred).
  • Experience with CI/CD pipelines, data security, and compliance is a plus.

Preferred Experience:

  • Delivered projects at enterprise scale, preferably in Fortune 500 or leading tech organizations.
  • Familiarity with Delta Lake, MLflow, and modern data ecosystem tools.
