Databricks Integration Engineer
Cloudious LLC
Princeton, United States of America
3 days ago
Role details
Contract type: Permanent contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Experience level: Senior
Job location: Princeton, United States of America
Tech stack
Java
Amazon Web Services (AWS)
Application Integration Architecture
User Authentication
Automation of Tests
Azure
Big Data
Continuous Integration
Information Engineering
Software Debugging
DevOps
Distributed Data Store
Identity and Access Management
Java Database Connectivity
OAuth
Open Database Connectivity
Performance Tuning
SQL Databases
Systems Integration
Data Logging
Enterprise Software Applications
Data Ingestion
Spark
Spring-boot
GIT
Build Management
Data Lake
Build Tools
REST
Databricks
Job description
We are looking for a Databricks Integration Engineer with strong Java development experience to design and build secure, scalable connectors that enable enterprise applications to integrate with Databricks platforms. The role involves developing application-to-Databricks connectivity for data ingestion, processing, and analytics use cases.
- Design, develop, and maintain Java-based connectors for application integration with Databricks via JDBC/ODBC, REST APIs, and Delta Lake
- Enable data ingestion and extraction between enterprise applications and Databricks using batch and streaming approaches
- Implement secure authentication and authorization mechanisms (OAuth, PAT, Azure AD/AWS IAM, token-based access)
- Develop reusable Java libraries and services to abstract Databricks connectivity for upstream applications
- Optimize connector performance for high-volume data transfers and low-latency querying
- Handle schema evolution, errors, retries, and logging in connector implementations
- Work closely with data engineering, platform, and DevOps teams to deploy and monitor integrations
- Support CI/CD pipelines and automated testing for connector releases
- Troubleshoot integration issues related to connectivity, performance, and data consistency
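The connector duties above can be sketched in Java. This is a minimal illustration, not a reference implementation: the host and HTTP path values are placeholders, and while the URL property names (`AuthMech=3`, `UID=token`, `httpPath`) follow the Databricks JDBC driver's documented conventions for personal-access-token auth, they should be verified against the driver version in use. The retry helper shows the kind of error handling, retries, and logging the role calls for.

```java
import java.sql.SQLException;
import java.util.concurrent.Callable;

public class ConnectorSketch {

    // Assemble a Databricks JDBC URL; AuthMech=3 with UID=token selects
    // personal-access-token authentication per the driver's conventions.
    // The PAT itself would be supplied separately as the password property.
    static String jdbcUrl(String host, String httpPath) {
        return "jdbc:databricks://" + host + ":443/default"
                + ";transportMode=http;ssl=1"
                + ";AuthMech=3;UID=token"
                + ";httpPath=" + httpPath;
    }

    // Retry a flaky operation up to maxAttempts times with linear backoff,
    // logging each failed attempt before rethrowing the last exception.
    static <T> T withRetry(Callable<T> op, int maxAttempts) throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return op.call();
            } catch (Exception e) {
                last = e;
                System.err.println("attempt " + attempt + " failed: " + e.getMessage());
                Thread.sleep(100L * attempt); // linear backoff between attempts
            }
        }
        throw last;
    }

    public static void main(String[] args) throws Exception {
        // Placeholder host and warehouse path, for illustration only.
        String url = jdbcUrl("adb-1234.5.azuredatabricks.net",
                             "/sql/1.0/warehouses/abc123");
        System.out.println(url);

        // Simulate a transient failure that succeeds on the second attempt.
        final int[] calls = {0};
        String result = withRetry(() -> {
            if (calls[0]++ == 0) throw new SQLException("transient");
            return "ok";
        }, 3);
        System.out.println(result);
    }
}
```

In a real connector the URL would be passed to `DriverManager.getConnection` with the PAT as credentials, and `withRetry` would wrap the query execution rather than a stub.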
Requirements
- Strong hands-on experience in Java (Java 8+), Spring Boot, and RESTful services
- Experience integrating applications with Databricks using:
- Databricks JDBC/ODBC
- Databricks REST APIs
- Delta Lake tables
- Solid understanding of distributed data systems and big data concepts
- Experience with SQL and Spark concepts (jobs, clusters, notebooks)
- Authentication and security knowledge:
- OAuth 2.0, token-based auth
- Cloud IAM (AWS IAM, Azure AD), depending on environment
- Experience with cloud platforms (AWS, Azure, GCP), especially:
- S3, ADLS, GCS
- Familiarity with CI/CD pipelines (Git, Maven, Gradle)
- Strong debugging and performance-tuning skills
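The Databricks REST API requirement above amounts to authenticated HTTPS calls with a bearer token. A minimal sketch using the standard Java 11 `HttpClient` API, building (but not sending) a request to the Jobs API: `/api/2.1/jobs/list` is a real Databricks endpoint, while the host and token values here are placeholders.

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class RestSketch {

    // Build a GET request against the Databricks Jobs API with
    // bearer-token (PAT or OAuth) authorization. The request is
    // constructed only; sending it requires a live workspace.
    static HttpRequest listJobsRequest(String host, String token) {
        return HttpRequest.newBuilder()
                .uri(URI.create("https://" + host + "/api/2.1/jobs/list"))
                .header("Authorization", "Bearer " + token)
                .GET()
                .build();
    }

    public static void main(String[] args) {
        // Placeholder workspace host and token, for illustration only.
        HttpRequest req = listJobsRequest("adb-1234.5.azuredatabricks.net",
                                          "dapiEXAMPLE");
        System.out.println(req.uri());
        System.out.println(req.headers().firstValue("Authorization").orElse(""));
    }
}
```

In practice the request would be sent with `HttpClient.send` and the JSON response deserialized; wrapping this in a reusable service class is the kind of abstraction the role describes.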