Data Architect (Remote)
TechSME Inc
Posted 2 days ago
Role details
- Contract type: Permanent contract
- Employment type: Full-time (> 32 hours)
- Working hours: Regular working hours
- Languages: English
- Job location: Remote
Tech stack
Amazon Web Services (AWS)
Azure
Big Data
Data Architecture
ETL
Data Warehousing
Performance Tuning
Google Cloud Platform
SQL Optimization
Spark
Data Lake
PySpark
Data Pipelines
Databricks
Requirements
- Seeking a Data Engineer with strong hands-on experience in Databricks (must-have, primary skill).
- Expertise in Apache Spark (PySpark/Scala) and Delta Lake for large-scale data processing.
- Proven experience building end-to-end data pipelines in Databricks (ETL/ELT).
- Strong knowledge of Databricks performance tuning, cluster optimization, and cost management.
- Hands-on experience with Palantir Foundry (secondary but required): pipelines, ontology, and data modeling.
- Advanced SQL skills and experience with data warehousing concepts.
- Experience working on cloud platforms (AWS/Azure/Google Cloud Platform) with Databricks integration.
- Ability to handle large-scale data (TB/PB level) and optimize processing.
- Strong understanding of data architecture, governance, and pipeline orchestration.
- Candidates must have hands-on, real-world project experience in Databricks (mandatory) and Palantir Foundry (required).