Fully Orchestrating Databricks from Airflow
Alan Mazankiewicz - 4 years ago
In this talk we will show how to use Databricks, a popular cloud service for
hosting Apache Spark applications for distributed data processing, in combination
with Apache Airflow, an orchestration framework for batch ETL workflows. After a
brief exploration of the Databricks Workspace and the fundamentals of Airflow, we
will take a deeper look at the functionality Databricks provides in Airflow for
orchestrating its workspace. Afterwards, we will find out how to extend and
customize that functionality to manage virtually every aspect of the Databricks
Workspace from Airflow (see the sketches below).
The talk does not require any prior knowledge of Databricks, Spark or Airflow,
but it does assume familiarity with the fundamentals of the Python programming
language, especially object-oriented programming and REST API requests. The actual
distributed data processing with Apache Spark itself is not the focus of this talk.
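As a taste of the built-in integration the talk covers, here is a minimal sketch
of an Airflow DAG that submits a one-time run to Databricks. It assumes the
apache-airflow-providers-databricks package and Airflow 2.x; the cluster spec,
notebook path and connection id are illustrative placeholders.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.databricks.operators.databricks import (
        DatabricksSubmitRunOperator,
    )

    # Placeholder cluster spec; adjust the Spark version, node type and
    # size to your workspace.
    new_cluster = {
        "spark_version": "7.3.x-scala2.12",
        "node_type_id": "i3.xlarge",
        "num_workers": 2,
    }

    with DAG(
        dag_id="databricks_submit_run_example",
        start_date=datetime(2021, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Submits a one-time run through the Databricks jobs runs-submit
        # REST endpoint and polls until the run reaches a terminal state.
        run_notebook = DatabricksSubmitRunOperator(
            task_id="run_etl_notebook",
            databricks_conn_id="databricks_default",  # holds host + API token
            new_cluster=new_cluster,
            notebook_task={"notebook_path": "/Shared/etl_example"},
        )

The operator builds its requests from the Airflow connection, so no credentials
need to appear in the DAG file itself.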
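The extension idea can likewise be sketched as a custom operator that talks to
the Databricks REST API directly, so that anything the API exposes (clusters,
DBFS, permissions, ...) becomes orchestrable from Airflow. DatabricksApiOperator
and its parameters are hypothetical names used for illustration; the sketch
assumes Airflow 2.x and a connection that stores the workspace URL and a
personal access token in its host and password fields.

    import requests
    from airflow.hooks.base import BaseHook
    from airflow.models import BaseOperator


    class DatabricksApiOperator(BaseOperator):
        """Hypothetical operator calling an arbitrary Databricks REST endpoint."""

        def __init__(self, endpoint, method="GET", payload=None,
                     databricks_conn_id="databricks_default", **kwargs):
            super().__init__(**kwargs)
            self.endpoint = endpoint  # e.g. "api/2.0/clusters/list"
            self.method = method
            self.payload = payload or {}
            self.databricks_conn_id = databricks_conn_id

        def execute(self, context):
            # Assumption: workspace URL in the connection's host field,
            # personal access token in its password field.
            conn = BaseHook.get_connection(self.databricks_conn_id)
            response = requests.request(
                self.method,
                f"https://{conn.host}/{self.endpoint}",
                headers={"Authorization": f"Bearer {conn.password}"},
                json=self.payload,
            )
            response.raise_for_status()
            # Returning the JSON body pushes it to XCom, so downstream
            # tasks can branch on the result.
            return response.json()

A task such as DatabricksApiOperator(task_id="list_clusters",
endpoint="api/2.0/clusters/list") would then surface the workspace's clusters to
downstream tasks, and the same pattern carries over to any other endpoint.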