OPTIMIZE with spark.dynamicAllocation.enabled

andrea...@gmail.com

Aug 2, 2022, 3:24:47 PM
to Delta Lake Users and Developers

Running OPTIMIZE on a cluster with spark.dynamicAllocation.enabled doesn't seem to allocate new executors dynamically.

When setting delta.optimize.maxThreads larger than spark.executor.cores, we expected new executors to be allocated, but that isn't the case. Is this a restriction or a bug?
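
For reference, a minimal PySpark sketch of the kind of setup described above (the sizing values are hypothetical, and in OSS Delta the property appears to be spelled spark.databricks.delta.optimize.maxThreads):

    from pyspark.sql import SparkSession

    # Sketch only: the cluster sizing below is made up for illustration.
    spark = (
        SparkSession.builder
        .appName("optimize-dynamic-allocation")
        # Delta Lake wiring
        .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
        .config("spark.sql.catalog.spark_catalog",
                "org.apache.spark.sql.delta.catalog.DeltaSparkSessionCatalog")
        # Dynamic allocation; shuffle tracking is needed without an external shuffle service
        .config("spark.dynamicAllocation.enabled", "true")
        .config("spark.dynamicAllocation.shuffleTracking.enabled", "true")
        .config("spark.executor.cores", "4")
        # Set larger than spark.executor.cores, as described above
        .config("spark.databricks.delta.optimize.maxThreads", "16")
        .getOrCreate()
    )

    spark.sql("OPTIMIZE delta.`/tmp/events`")  # hypothetical table path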

Thanks in advance
Andreas


Shixiong(Ryan) Zhu

Aug 3, 2022, 1:17:37 AM
to andrea...@gmail.com, Delta Lake Users and Developers
Hey Andreas,

"spark.executor.cores" controls how many concurrent tasks can run on an executor. You need to set "delta.optimize.maxThreads" to be greater than the number of cores on your cluster (spark.executor.cores * number of running executors).

Best Regards,

Ryan

