Hello community,
I am trying to execute the demo PySpark script for setting up an Iceberg/Nessie/Spark environment:
Unfortunately, I encounter the following error:
24/08/29 10:39:14 INFO __main__: pyspark script logger initialized
24/08/29 10:39:14 INFO __main__: Spark Jars: Vector(spark://bastion-lab31-01.adminhppfsv2.dcpro31.opk.recouv.:37381/jars/iceberg-spark-extensions-3.5_2.13-1.6.1.jar, spark://bastion-lab31-01.adminhppfsv2.dcpro31.opk.recouv.:37381/jars/iceberg-spark-runtime-3.5_2.13-1.6.1.jar, spark://bastion-lab31-01.adminhppfsv2.dcpro31.opk.recouv.:37381/jars/nessie-spark-extensions-3.5_2.13-0.95.0.jar)
Traceback (most recent call last):
File "/home/ac75097972/spark_nessie_exemple/main.py", line 51, in <module>
catalog = jvm.NessieCatalog()
TypeError: 'JavaPackage' object is not callable
I followed the code as demonstrated, without any major modification. I verified with a logger that the required jars are loaded into Spark (see the "Spark Jars" line above).
Has anyone encountered this error?