Stopped SparkContext when re-writing to the same Delta path with a different Spark session

Wassim Maaoui

Jun 15, 2021, 5:07:49 AM
to Delta Lake Users and Developers

Hello,


When trying to write to the same path after re-creating the Spark session, I get:
```
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
```
Env:
Spark 3.1.2
Delta 1.0.0
Scala 2.12

To reproduce in a spark shell:

```
val df = spark.range(0,5)
df.write.format("delta").mode("overwrite").save("/tmp/table")
spark.stop
import org.apache.spark.sql.SparkSession
val spark2 = SparkSession.builder().master("local").getOrCreate()
df.write.format("delta").mode("overwrite").save("/tmp/table")
```

This is a handicap, especially in unit tests, where we re-create the Spark session to keep tests as independent as possible; the sketch below shows the pattern.
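A minimal sketch of that test pattern, assuming ScalaTest (the suite and test names are illustrative):

```
import org.apache.spark.sql.SparkSession
import org.scalatest.BeforeAndAfterAll
import org.scalatest.funsuite.AnyFunSuite

class DeltaWriteSuite extends AnyFunSuite with BeforeAndAfterAll {
  private var spark: SparkSession = _

  // Each suite builds its own session for isolation...
  override def beforeAll(): Unit = {
    spark = SparkSession.builder()
      .master("local")
      .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
      .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
      .getOrCreate()
  }

  // ...and stops it afterwards.
  override def afterAll(): Unit = spark.stop()

  test("overwrite a shared delta path") {
    spark.range(0, 5).write.format("delta").mode("overwrite").save("/tmp/table")
  }
}
```

When two such suites run in the same JVM and write to the same Delta path, the second one hits the stopped-SparkContext error above.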

Mich Talebzadeh

Jun 15, 2021, 6:09:34 AM
to Wassim Maaoui, Delta Lake Users and Developers
Try this (PySpark):

```
from delta import configure_spark_with_delta_pip
from pyspark.sql import SparkSession

app_name = "deltatest"
builder = SparkSession.builder \
    .appName(app_name) \
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension") \
    .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog") \
    .enableHiveSupport()
spark2 = configure_spark_with_delta_pip(builder).getOrCreate()
```
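(`configure_spark_with_delta_pip` comes with the delta-spark pip package; it adds the matching delta-core artifact to `spark.jars.packages` so the session can resolve the Delta classes.)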

HTH


Wassim Maaoui

Jun 15, 2021, 6:21:40 AM
to Mich Talebzadeh, Delta Lake Users and Developers
I am on Scala, so I ran the following:

```
val spark2 = SparkSession.builder.appName("test")
  .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
  .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
  .enableHiveSupport()
  .getOrCreate
```

I also added the extension with `--conf` when running the shell, roughly like this.
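(The exact package coordinates are an assumption based on the versions above.)

```
spark-shell --packages io.delta:delta-core_2.12:1.0.0 \
  --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \
  --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"
```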

But still the same issue.
