Serverless for Apache Spark autoscaling V2

Igor Berman

Aug 12, 2025, 6:27:57 AM
to Google Cloud Dataproc Discussions
Hi all,
I need some clarification about autoscaling strategies for Serverless for Apache Spark. The autoscaling V2 documentation states:
  • Configurable Spark grace decommission and shuffle migration behavior: Autoscaling V2 lets you use standard Spark properties to configure Spark graceful decommissioning and shuffle migration. This feature can help you maintain migration compatibility with your customized Spark properties.
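If I read that correctly, the "standard Spark properties" should be the upstream decommissioning ones from the open-source Spark configuration docs; that is my assumption, since the Dataproc page does not list them explicitly. Something like:

  spark.decommission.enabled=true
  spark.storage.decommission.enabled=true
  spark.storage.decommission.shuffleBlocks.enabled=true
  spark.storage.decommission.rddBlocks.enabled=true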
However, when I try to submit a job with decommissioning properties enabled, I get 400 errors, as if these advanced properties are not supported (see the examples below).
My questions: which mode is supported, how can I make sure that V2 decommissions executors when needed, and which properties does Serverless Spark support to control this behavior?
Thanks in advance,
Igor

Small reproduction tests with SparkPi:

gcloud dataproc batches submit spark \
  --class=org.apache.spark.examples.SparkPi \
  --jars=file:///usr/lib/spark/examples/jars/spark-examples.jar \
  --region=us-central1 \
  --version="2.2" \
  --project=<project> \
  --properties="^#^spark.dataproc.scaling.version=2#spark.decommission.enabled=true#spark.dynamicAllocation.enabled=true#spark.dynamicAllocation.minExecutors=2#spark.dynamicAllocation.maxExecutors=5" \
  --network=<network>
ERROR: (gcloud.dataproc.batches.submit.spark) INVALID_ARGUMENT: Attempted to set unsupported properties: [spark.decommission.enabled]

gcloud dataproc batches submit spark \
  --class=org.apache.spark.examples.SparkPi \
  --jars=file:///usr/lib/spark/examples/jars/spark-examples.jar \
  --region=us-central1 \
  --version="2.2" \
  --project=<project> \
  --properties="^#^spark.dataproc.scaling.version=2#spark.storage.decommission.enabled=true#spark.dynamicAllocation.enabled=true#spark.dynamicAllocation.minExecutors=2#spark.dynamicAllocation.maxExecutors=5" \
  --network=<network>
ERROR: (gcloud.dataproc.batches.submit.spark) INVALID_ARGUMENT: Attempted to set unsupported properties: [spark.storage.decommission.enabled]

gcloud dataproc batches submit spark \
  --class=org.apache.spark.examples.SparkPi \
  --jars=file:///usr/lib/spark/examples/jars/spark-examples.jar \
  --region=us-central1 \
  --version="2.2" \
  --project=<project> \
  --properties="^#^spark.dataproc.scaling.version=2#spark.dynamicAllocation.shuffleTracking.enabled=true#spark.dynamicAllocation.enabled=true#spark.dynamicAllocation.minExecutors=2#spark.dynamicAllocation.maxExecutors=5" \
  --network=<network>
ERROR: (gcloud.dataproc.batches.submit.spark) INVALID_ARGUMENT: Attempted to set unsupported properties: [spark.dynamicAllocation.shuffleTracking.enabled]
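For reference, the baseline I fall back to, with only the V2 scaling flag and dynamic allocation bounds and no decommissioning properties; I have not confirmed whether this is the intended way to let V2 manage decommissioning on its own:

gcloud dataproc batches submit spark \
  --class=org.apache.spark.examples.SparkPi \
  --jars=file:///usr/lib/spark/examples/jars/spark-examples.jar \
  --region=us-central1 \
  --version="2.2" \
  --project=<project> \
  --properties="^#^spark.dataproc.scaling.version=2#spark.dynamicAllocation.enabled=true#spark.dynamicAllocation.minExecutors=2#spark.dynamicAllocation.maxExecutors=5" \
  --network=<network>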