I am developing a Python package that will be deployed to a Databricks cluster. We often need a reference to the "spark" and "dbutils" objects within the Python code.
Within a notebook we can access these objects directly as "spark" (e.g. spark.sql(...)). How do we get the "spark" instance (the SparkSession) from Python code inside the package?
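To illustrate the problem, here is a minimal sketch of what happens when package code tries to use the notebook-provided name. The function name `load_data` and the table name `my_table` are hypothetical, just for illustration:

```python
# my_package/queries.py (hypothetical module inside the package)

def load_data():
    # In a Databricks notebook cell, `spark` is a predefined global,
    # so `spark.sql(...)` works out of the box. In plain package code
    # there is no such name, so calling this function raises NameError.
    return spark.sql("SELECT * FROM my_table")  # `my_table` is hypothetical
```

In other words, the question is how package code should obtain the session object that the notebook environment injects automatically.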