In this talk we will introduce Databricks, a popular cloud service for hosting Apache Spark applications for distributed data processing, and show how to combine it with Apache Airflow, an orchestration framework for batch ETL workflows. After a brief exploration of the Databricks Workspace and the fundamentals of Airflow, we will take a deeper look at the functionality Databricks provides in Airflow for orchestrating its workspace. Afterwards, we will see how to extend and customize that functionality to manage virtually every aspect of the Databricks Workspace from Airflow.
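To give a flavour of the kind of orchestration covered in the talk, here is a minimal sketch of an Airflow DAG that submits a one-time Spark run to Databricks using the DatabricksSubmitRunOperator from the apache-airflow-providers-databricks package (assuming a recent Airflow 2.x installation); the connection id, cluster spec and notebook path are illustrative placeholders, not part of the talk material:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.databricks.operators.databricks import (
        DatabricksSubmitRunOperator,
    )

    with DAG(
        dag_id="databricks_example",
        start_date=datetime(2024, 1, 1),
        schedule=None,   # trigger manually for this sketch
        catchup=False,
    ) as dag:
        # Submit a one-time run of a notebook on a fresh job cluster.
        # Credentials are taken from the Airflow connection "databricks_default".
        run_notebook = DatabricksSubmitRunOperator(
            task_id="run_notebook",
            databricks_conn_id="databricks_default",
            new_cluster={
                "spark_version": "13.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 2,
            },
            notebook_task={"notebook_path": "/Shared/example_notebook"},
        )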
The talk does not require any prior knowledge of Databricks, Spark or Airflow, but it does assume familiarity with the fundamentals of the Python programming language, especially object-oriented programming and REST API requests. The distributed data processing with Apache Spark itself is not the focus of this talk.