Platform Engineer - AWS, Databricks and Unity Catalog

StackStudio Digital Ltd.
Charing Cross, United Kingdom
3 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English

Job location

Charing Cross, United Kingdom

Tech stack

Unity Catalog
Amazon Web Services (AWS)
Azure
Cloud Computing Security
Continuous Integration
Data Governance
Data Security
Database Development
Github
Hive
Identity and Access Management
JSON
Python
Metadata
SQL Databases
Parquet
Software Troubleshooting
Gitlab
PySpark
Enterprise Integration
Data Management
Terraform
Databricks

Job description

Design and support secure integration across AWS S3, Databricks and downstream data platforms.
Configure least-privilege access to S3 using IAM roles, bucket policies, KMS permissions and approved access controls.
Support Databricks access patterns where S3-backed data is registered through Hive Metastore and exposed or migrated into Unity Catalog.
Configure and validate Unity Catalog objects, including catalogs, schemas, tables, views, grants, storage credentials and external locations.
Support downstream connectivity to Databricks using SQL Warehouse, tables, views or connector-based integration patterns.
Develop Terraform/IaC for AWS IAM, S3, KMS and Databricks-related configuration.
Troubleshoot access, metadata, schema, query and connectivity issues across AWS, Databricks and downstream integrations.
Document lineage across AWS S3, Hive Metastore, Unity Catalog and downstream consumption layers.
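To give a flavour of the Unity Catalog configuration work described above, here is a minimal sketch in Python that builds the SQL statements for the object types the role covers (catalog, schema, external location, grants). All names used below (analytics, raw, the S3 URL, uc_s3_cred, data_engineers) are illustrative placeholders, not values from this posting; in practice these statements would typically be templated through Terraform or run in a SQL Warehouse.

```python
# Sketch only: generates Databricks Unity Catalog SQL for registering
# S3-backed data and granting governed access. All identifiers are
# hypothetical examples.

def uc_setup_statements(catalog: str, schema: str, location_url: str,
                        credential: str, principal: str) -> list[str]:
    """Build the SQL statements to create a catalog and schema, register
    an S3 path as an external location, and grant read access."""
    return [
        f"CREATE CATALOG IF NOT EXISTS {catalog}",
        f"CREATE SCHEMA IF NOT EXISTS {catalog}.{schema}",
        (f"CREATE EXTERNAL LOCATION IF NOT EXISTS {schema}_loc "
         f"URL '{location_url}' WITH (STORAGE CREDENTIAL {credential})"),
        f"GRANT USE CATALOG ON CATALOG {catalog} TO `{principal}`",
        f"GRANT SELECT ON SCHEMA {catalog}.{schema} TO `{principal}`",
    ]

if __name__ == "__main__":
    for stmt in uc_setup_statements("analytics", "raw",
                                    "s3://example-bucket/raw",
                                    "uc_s3_cred", "data_engineers"):
        print(stmt)
```

The external location plus storage credential pair is what ties the least-privilege S3/IAM setup on the AWS side to governed access on the Databricks side.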

Requirements

Strong hands-on experience with AWS IAM, S3, KMS and secure cloud access patterns.
Hands-on Databricks on AWS experience, including Hive Metastore, Delta tables, SQL Warehouse and workspace access controls.
Practical Unity Catalog experience, including grants, external locations, storage credentials and governed data access.
Terraform or similar Infrastructure as Code experience.
Good understanding of Delta, Parquet, CSV and JSON data formats.
Strong troubleshooting, documentation and stakeholder communication skills.

Desirable Skills

Palantir Foundry experience, especially Databricks connector, Data Connection, datasets, syncs, projects and permissions.
Secure data-sharing, data governance, lineage and access approval experience.
Python, PySpark or SQL development experience.
CI/CD experience using GitLab, GitHub Actions, Azure DevOps or similar.

Key Deliverables

Secure AWS S3 and Databricks access design.
Hive Metastore to Unity Catalog integration approach.
Terraform/IaC modules or scripts for repeatable configuration.
Source-to-target lineage and access-control documentation.
Operational runbook for access, schema, sync and connectivity issues.

Apply for this position