py4j.Py4JException: Method forPath([class org.apache.spark.sql.SparkSession, class java.lang.String, class java.util.HashMap]) does not exist

Jeferson Machado Santos

Dec 6, 2022, 8:55:08 AM
to Google Cloud Dataproc Discussions
Hi,

I am executing some pyspark jobs on a Dataproc cluster. Everything worked fine until yesterday, but today I started getting the error below when calling DeltaTable.forPath(sparkSession, path) to read and update Delta tables.

Traceback (most recent call last):
  File "/tmp/job-0eb2543e/cohort_ka.py", line 146, in <module>
    main()
  File "/tmp/job-0eb2543e/cohort_ka.py", line 128, in main
    persisted = DeltaTable.forPath(spark, destination)
  File "/opt/conda/default/lib/python3.8/site-packages/delta/tables.py", line 387, in forPath
    jdt = jvm.io.delta.tables.DeltaTable.forPath(jsparkSession, path, hadoopConf)
  File "/usr/lib/spark/python/lib/py4j-0.10.9-src.zip/py4j/java_gateway.py", line 1304, in __call__
  File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/utils.py", line 111, in deco
  File "/usr/lib/spark/python/lib/py4j-0.10.9-src.zip/py4j/protocol.py", line 330, in get_return_value
py4j.protocol.Py4JError: An error occurred while calling z:io.delta.tables.DeltaTable.forPath. Trace:
py4j.Py4JException: Method forPath([class org.apache.spark.sql.SparkSession, class java.lang.String, class java.util.HashMap]) does not exist
    at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:318)
    at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:339)
    at py4j.Gateway.invoke(Gateway.java:276)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:238)
    at java.lang.Thread.run(Thread.java:750)

Configs:
Dataproc cluster image: 2.0-debian10
Delta jar: delta-core_2.12-1.0.0.jar
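
For reference, the java.util.HashMap in the "method does not exist" signature is the hadoopConf argument that delta/tables.py (line 387 in the traceback) passes as a third parameter to the JVM, so the Python package and the delta-core jar on the cluster appear to be calling across different APIs. A minimal sketch to compare the two sides from a pyspark job (the jar directory is an assumption about a typical Dataproc 2.0 layout and may differ on your cluster):

from importlib.metadata import version
import glob

# Python side: the pip-installed delta-spark release, whose tables.py
# builds the forPath(jsparkSession, path, hadoopConf) call seen above
print("delta-spark (pip):", version("delta-spark"))

# JVM side: the delta-core jar actually on the Spark classpath; the
# path below is an assumed location, adjust for your cluster
print("delta-core jar:", glob.glob("/usr/lib/spark/jars/delta-core_*.jar"))

If the pip package reports a 2.x version while the jar is 1.x, the Python wrapper is calling a JVM method signature that the older jar does not have.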

Jeferson Machado Santos

Dec 6, 2022, 10:50:19 AM
to Google Cloud Dataproc Discussions
Already found the problem. There was an update to the delta-spark Python package yesterday, and the version was not pinned at the moment of cluster creation. I just changed the install command to python -m pip install delta-spark==2.1.1 and it worked.
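
For completeness, here is a sketch of pinning the package at cluster creation time instead, using Dataproc's public pip-install initialization action (CLUSTER_NAME and REGION are placeholders; the init-action path and PIP_PACKAGES metadata key are assumptions based on the GoogleCloudDataproc initialization-actions repo):

gcloud dataproc clusters create CLUSTER_NAME \
    --region=REGION \
    --image-version=2.0-debian10 \
    --initialization-actions=gs://goog-dataproc-initialization-actions-REGION/python/pip-install.sh \
    --metadata=PIP_PACKAGES=delta-spark==2.1.1

Pinning an exact version this way keeps a new PyPI release from silently moving the Python wrapper ahead of the delta-core jar already on the cluster.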