Issue with MLflow remote tracking server and databricks-connect

Rodrigo Maranzana

Mar 31, 2021, 3:40:00 PM
to mlflow-users
Hi all,

I'm not able to load an MLflow model stored in Databricks DBFS from a remote tracking server.

I'm using the Kedro package locally for my project structure, and I run the pipelines with databricks-connect so that the computation is sent to my cluster on Databricks.
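
In case it's relevant: databricks-connect is already configured (via databricks-connect configure), so the usual SparkSession builder gives me a session that runs its jobs on the remote cluster. Roughly:

from pyspark.sql import SparkSession

# With databricks-connect installed and configured, this session executes
# its jobs on the remote Databricks cluster, not on a local Spark install.
spark = SparkSession.builder.getOrCreate()
spark.range(5).count()  # quick sanity check that the cluster is reachable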

Currently, I have an MLflow model at a DBFS path, which I can load successfully from a notebook in the Databricks workspace (the web interface) using: mlflow.spark.load_model('dbfs:/<path>').

Unfortunately, when I try to replicate this locally (from my Kedro project) with these lines:
mlflow.set_tracking_uri('databricks')
mlflow.spark.load_model('dbfs:/<path>')

I get the following error message:
"An error occurred while calling z:org.apache.spark.api.python.PythonRDD.runJob.
: java.io.IOException: Illegal file pattern: Illegal/unsupported escape sequence near index 8
sparkml\metadata"
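
For completeness, the full local attempt looks roughly like this (credentials are normally picked up from my ~/.databrickscfg profile; the host and token values below are just placeholders):

import os
import mlflow

# Authentication for the 'databricks' tracking URI comes either from a
# Databricks CLI profile (~/.databrickscfg) or from these environment
# variables (placeholder values shown here).
os.environ['DATABRICKS_HOST'] = 'https://<my-workspace-url>'
os.environ['DATABRICKS_TOKEN'] = '<personal-access-token>'

mlflow.set_tracking_uri('databricks')
model = mlflow.spark.load_model('dbfs:/<path>')  # this call raises the IOException above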

Thanks,
Rodrigo.