Hi all,
I'm not able to load an MLflow model from a remote server located in Databricks DBFS.
I'm using the Kedro package locally for my project structure, and I run the pipelines using the databricks-connect package to send the computation to my cluster on Databricks.
Currently, I have an MLflow model at a DBFS path which I can retrieve successfully from the Databricks workspace (the Databricks web interface) by running the following in a notebook: mlflow.spark.load_model('dbfs:/<path>')
Unfortunately, when I try to replicate this locally (from my Kedro project) with these lines:
mlflow.set_tracking_uri('databricks')
mlflow.spark.load_model('dbfs:/<path>')
I get the following error message:
"An error occurred while calling z:org.apache.spark.api.python.PythonRDD.runJob.
: java.io.IOException: Illegal file pattern: Illegal/unsupported escape sequence near index 8
sparkml\metadata"
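For what it's worth, the `sparkml\metadata` fragment at the end makes me suspect that the Windows backslash path separator from my local machine is being interpreted as an escape sequence inside a file-matching pattern on the Spark side. A rough, purely illustrative analogue of that failure mode in plain Python (not the actual Spark code path, just the same kind of escape problem):

```python
import re

# A Windows-style relative path uses backslashes as separators.
win_path = r"sparkml\metadata"

# If such a path is fed directly into a pattern matcher, the backslash
# is read as the start of an escape sequence ("\m"), which is invalid --
# much like the "Illegal/unsupported escape sequence" error above.
try:
    re.compile(win_path)
    error = None
except re.error as exc:
    error = str(exc)

print(error)  # a "bad escape" style error, analogous to the Spark one
```

So the problem may be that databricks-connect builds the model's metadata path with the local OS separator before handing it to Spark, but I haven't been able to confirm this.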
Thanks,
Rodrigo.