Fully Orchestrating Databricks from Airflow
Alan Mazankiewicz - 4 years ago
In this talk we will introduce how to use the popular cloud service Databricks, which hosts Apache Spark applications for distributed data processing, in combination with Apache Airflow, an orchestration framework for ETL batch workflows. After a brief exploration of the Databricks Workspace and the fundamentals of Airflow, we will take a deeper look at the functionality Databricks provides in Airflow for orchestrating its workspace. Afterwards, we will find out how to extend and customize that functionality to manage virtually every aspect of the Databricks Workspace from Airflow. The talk does not require any prior knowledge of Databricks, Spark, or Airflow, but it does assume familiarity with the fundamentals of the Python programming language, especially object-oriented programming and REST API requests. The actual distributed data processing with Apache Spark itself is not the focus of this talk.
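As a taste of the built-in functionality the talk starts from, here is a minimal sketch of submitting a Databricks notebook run from an Airflow DAG, assuming Airflow's Databricks provider package is installed. The DAG id, connection id, notebook path, and cluster settings are illustrative placeholders; the operator itself ships with the provider.

    # Minimal sketch: trigger a Databricks notebook run from an Airflow DAG.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.databricks.operators.databricks import DatabricksSubmitRunOperator

    with DAG(
        dag_id="databricks_example",          # illustrative
        start_date=datetime(2021, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        run_notebook = DatabricksSubmitRunOperator(
            task_id="run_notebook",
            databricks_conn_id="databricks_default",  # Airflow connection with host + token
            new_cluster={                              # placeholder cluster spec
                "spark_version": "7.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 2,
            },
            notebook_task={"notebook_path": "/Shared/example_notebook"},  # placeholder path
        )

And a sketch of the kind of extension the talk builds towards: a custom operator that reaches any Databricks workspace REST endpoint through the provider's hook. The operator name and the clusters/list endpoint are made up for illustration, and _do_api_call is an internal helper of DatabricksHook, so this assumes a provider version where it is available.

    # Sketch: a custom operator exposing the full Databricks REST API to Airflow tasks.
    from airflow.models import BaseOperator
    from airflow.providers.databricks.hooks.databricks import DatabricksHook

    class DatabricksApiOperator(BaseOperator):  # hypothetical name
        """Call an arbitrary Databricks workspace REST endpoint from an Airflow task."""

        def __init__(self, endpoint, method="GET", payload=None,
                     databricks_conn_id="databricks_default", **kwargs):
            super().__init__(**kwargs)
            self.endpoint = endpoint          # e.g. "api/2.0/clusters/list"
            self.method = method
            self.payload = payload or {}
            self.databricks_conn_id = databricks_conn_id

        def execute(self, context):
            hook = DatabricksHook(databricks_conn_id=self.databricks_conn_id)
            # The hook handles authentication against the workspace URL;
            # the response dict is returned and pushed to XCom by Airflow.
            return hook._do_api_call((self.method, self.endpoint), self.payload)

    # Usage inside a DAG, e.g. to inspect running clusters:
    # list_clusters = DatabricksApiOperator(task_id="list_clusters",
    #                                       endpoint="api/2.0/clusters/list")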