Part 5 - Developing a PySpark Application
This is the fifth and final part of a series of posts showing how you can develop PySpark applications for Databricks with Databricks-Connect and Azure DevOps. All source code can be found here.
Configuration & Releasing

We are now ready to deploy. I'm working on the assumption that we have two further environments to deploy into: UAT and Production.
Deploy.ps1

This script in the root folder does all the work we need to release our Wheel and set up some Databricks Jobs for us.
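The post's Deploy.ps1 is a PowerShell script, but the core idea (upload a Wheel, then create a job that runs it) can be sketched in Python against the Databricks Jobs REST API. This is a minimal illustration, not the actual script: the function names, job name, wheel path, and cluster id are all hypothetical placeholders.

```python
import json
import urllib.request


def build_job_payload(job_name, wheel_path, package_name, cluster_id):
    """Build a Databricks Jobs API 2.1 create-job payload for a wheel task.

    All arguments are illustrative; a real deployment would pull these
    from pipeline variables per environment (UAT, Production).
    """
    return {
        "name": job_name,
        "tasks": [
            {
                "task_key": "main",
                "existing_cluster_id": cluster_id,
                # Run an entry point exposed by the deployed Wheel.
                "python_wheel_task": {
                    "package_name": package_name,
                    "entry_point": "main",
                },
                # Attach the uploaded Wheel as a job library.
                "libraries": [{"whl": wheel_path}],
            }
        ],
    }


def create_job(host, token, payload):
    """POST the payload to /api/2.1/jobs/create and return the new job id."""
    req = urllib.request.Request(
        f"{host}/api/2.1/jobs/create",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["job_id"]
```

A release pipeline would call `build_job_payload` once per environment with that environment's cluster id and wheel location, then `create_job` with the workspace URL and a token held in a pipeline secret.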