Azure Data Architect - 6 Months - Start ASAP - Remote

Red - The Global SAP Solutions Provider

Role details

Contract type
Temporary contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English, German

Job location

Remote

Tech stack

Azure
Big Data
Cloud Computing
Cloud Computing Security
Software Quality
Continuous Integration
Data Architecture
Information Engineering
ETL
Data Security
Data Systems
Data Warehousing
Hadoop
NoSQL
SQL Databases
Data Streaming
Data Processing
Spark
Git
Microsoft Fabric
Data Lake
PySpark
Real-Time Data
Spark Streaming
Software Version Control
Data Pipelines
Databricks

Job description

  • Lead the architecture design and implementation of advanced analytics solutions using Azure Databricks and Microsoft Fabric. The ideal candidate will have a deep understanding of big data technologies, data engineering, and cloud computing, with a strong focus on Azure Databricks, along with strong SQL skills
  • Work closely with business stakeholders and other IT teams to understand requirements and deliver effective solutions
  • Oversee the end-to-end implementation of data solutions, ensuring alignment with business requirements and best practices
  • Lead the development of data pipelines and ETL processes using Azure Databricks, PySpark, and other relevant tools
  • Integrate Azure Databricks with other Azure services (e.g. Azure Data Lake, Azure Synapse, Azure Data Factory) and with on-premises systems
  • Provide technical leadership and mentorship to the data engineering team, fostering a culture of continuous learning and improvement
  • Ensure proper documentation of architecture, processes, and data flows, while ensuring compliance with security and governance standards
  • Ensure best practices are followed in terms of code quality, data security, and scalability
  • Stay up to date with the latest developments in Databricks and associated technologies to drive innovation

Requirements

Languages: Fluent German (must have) and English

  • Strong experience with Azure Databricks, including cluster management, notebook development, and Delta Lake

  • Proficiency in big data technologies (e.g. Hadoop, Spark) and data processing frameworks (e.g. PySpark)
  • Deep understanding of Azure services such as Azure Data Lake, Azure Synapse, and Azure Data Factory
  • Experience with ETL/ELT processes, data warehousing, and building data lakes
  • Strong SQL skills and familiarity with NoSQL databases
  • Experience with CI/CD pipelines and version control systems such as Git
  • Knowledge of cloud security best practices

Soft Skills

  • Excellent communication skills, with the ability to explain complex technical concepts to non-technical stakeholders
  • Strong problem-solving skills and a proactive approach to identifying and resolving issues
  • Leadership skills with the ability to manage and mentor a team of data engineers

Nice-to-Have Skills

  • Power BI for dashboarding and reporting
  • Microsoft Fabric for analytics and integration tasks
  • Spark Streaming for processing real-time data streams
  • Familiarity with Azure Resource Manager (ARM) templates for infrastructure-as-code (IaC) practices

Skills

Mandatory Skills: Microsoft Azure, Azure Databricks, Big Data, SQL, Azure Data Lake, Azure Data Factory

Telephone interviews with our customer can be arranged at short notice, with a quick decision afterwards.

Apply for this position