If your pipeline consists of multiple Spark tasks, Spark will typically be configured once in luigi.cfg. However, one or two tasks may need slightly different settings. A neat way to achieve this is to override the corresponding property in a task class derived from PySparkTask or SparkSubmitTask:
@property
def executor_cores(self):  # Override the value from luigi.cfg
    return '1'  # Limit load on the service
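To illustrate why this works, here is a minimal, luigi-free sketch of the mechanism. The base class below is a hypothetical stand-in for SparkSubmitTask, and the dict stands in for luigi.cfg: by default the property reads the pipeline-wide value, and a subclass override shadows it for just that one task.

```python
# Hypothetical stand-in for the pipeline-wide [spark] section of luigi.cfg.
GLOBAL_CONFIG = {"executor_cores": "4"}

class BaseSparkTask:
    """Stand-in for SparkSubmitTask: settings resolve via properties."""

    @property
    def executor_cores(self):
        # Default behaviour: use the shared, pipeline-wide setting.
        return GLOBAL_CONFIG["executor_cores"]

class HeavyTask(BaseSparkTask):
    # No override: inherits the shared config value.
    pass

class LightTask(BaseSparkTask):
    @property
    def executor_cores(self):  # Override the shared config
        return "1"  # Limit load on a downstream service

print(HeavyTask().executor_cores)  # -> 4 (shared config)
print(LightTask().executor_cores)  # -> 1 (per-task override)
```

Because luigi resolves these settings through properties at submit time, the override applies only to the one task class, and every other task in the pipeline keeps using luigi.cfg unchanged.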