I am working on a project in Azure Data Factory, and I have a pipeline that runs a Databricks python script. This script, which lives in the Databricks file system and is run by the ADF pipeline, imports a module from another python script in the same folder (both scripts are located in dbfs:/FileStore/code).
The code below imports the python module fine from a Databricks notebook, but fails when it runs inside the python script:
import sys
sys.path.insert(0, 'dbfs:/FileStore/code/')
import conn_config as Connect
In the cluster logs, I get: ImportError: No module named conn_config
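To narrow it down, I tried a minimal repro outside of Databricks, using a temporary directory as a stand-in for the DBFS folder (the module contents here are made up). It suggests that a URI-style entry like dbfs:/... in sys.path is simply ignored by the import machinery, while a plain directory path works:

```python
import os
import sys
import tempfile

# Stand-in for dbfs:/FileStore/code: a plain directory with a dummy module.
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "conn_config.py"), "w") as f:
    f.write("VALUE = 42\n")

# A URI-style entry is not a filesystem path, so the import fails.
sys.path.insert(0, "dbfs:" + tmp)
try:
    import conn_config
    found_with_uri = True
except ImportError:
    found_with_uri = False

# The plain directory path works.
sys.path.insert(0, tmp)
import conn_config

print(found_with_uri, conn_config.VALUE)  # False 42
```

So it seems the dbfs: prefix may be the problem, but I don't know what the correct path form is for a script run by ADF on the cluster.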
My guess is that the problem is related to the python file failing to recognize the Databricks environment. Any help?