Remote Cloud Data Engineer: Azure Databricks & Data Lake

Arbo, Spain
6 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Shift work
Languages
English
Experience level
Senior
Compensation
€ 70K

Job location

Remote
Arbo, Spain

Tech stack

Azure
Cloud Engineering
Information Engineering
Data Governance
Data Sharing
Data Systems
Github
Industry Standard Architecture
Role-Based Access Control
Enterprise Data Management
Pulumi
Infrastructure as Code (IaC)
Data Lake
Databricks

Job description

Benefits

  • Training budget for certifications
  • Flexible working hours
  • Private medical insurance
  • Flexible remuneration options
  • Team-building activities
  • Internal tech communities

Responsibilities

  • Manage and evolve the enterprise Data Lake architecture on Azure.

  • Implement Infrastructure as Code (IaC) using Pulumi to manage Databricks Unity Catalog securables.
  • Design and deploy secure access architectures, following least-privilege and compliance standards.
  • Automate and resolve technical tickets related to data platform operations.
  • Contribute to the evolution of the company's Data Mesh and Data Products principles.
  • Collaborate with other engineering teams to support data sharing and integration initiatives.
  • Provide guidance for designing and deploying data solutions on Azure with best practices for infrastructure and automation.
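
The least-privilege bullet above boils down to mapping roles to minimal privilege sets on Unity Catalog securables. In practice this would be expressed through an IaC tool such as Pulumi's Databricks provider; the plain-Python sketch below only illustrates the role-to-privilege mapping idea. The role names, principals, and table name are hypothetical, not taken from the posting.

```python
# Hypothetical least-privilege sketch: each role maps to the smallest
# privilege set it needs on a Unity Catalog table (readers never get MODIFY).
ROLE_PRIVILEGES = {
    "reader": ["SELECT"],
    "writer": ["SELECT", "MODIFY"],
}


def build_grants(table: str, assignments: dict[str, str]) -> list[str]:
    """Return Unity Catalog GRANT statements for principal -> role assignments."""
    statements = []
    for principal, role in assignments.items():
        for privilege in ROLE_PRIVILEGES[role]:
            # Unity Catalog GRANT syntax: GRANT <privilege> ON TABLE <name> TO `<principal>`
            statements.append(f"GRANT {privilege} ON TABLE {table} TO `{principal}`;")
    return statements


if __name__ == "__main__":
    # Hypothetical principals and table.
    for stmt in build_grants(
        "main.sales.orders",
        {"analysts@example.com": "reader", "etl-service@example.com": "writer"},
    ):
        print(stmt)
```

Keeping the role matrix as data (rather than hand-written grants) makes access reviewable and easy to diff in CI, which is the main point of managing securables through IaC.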

Requirements

An innovative tech company is seeking a Cloud Engineer to join their Data Solutions team. You will manage and evolve an enterprise data platform, focusing on data lake management and architecture best practices. Essential requirements include 4 years of experience in Cloud Engineering, hands-on Azure and Databricks expertise, and knowledge of data governance. This fully remote role offers flexible working hours, a training budget, and comprehensive benefits including medical insurance.

  • 4 years of experience in Cloud Engineering or Data Engineering.
  • Hands-on experience with Azure Cloud.
  • Azure Databricks experience is mandatory.
  • Experience with Data Lakes, especially ADLS Gen2.
  • Knowledge of Data Governance and RBAC access control.
  • Experience implementing Infrastructure as Code (IaC).
  • Experience with CI/CD pipelines using GitHub and Azure DevOps.
  • Excellent English (B2+ minimum).

Apply for this position