Set spark config for one task only


HadoopMarc

Dec 9, 2020, 9:16:48 AM12/9/20
to Luigi
This is not a question, but something I could find neither on this list nor in the docs.

If you have a pipeline of multiple Spark tasks, you will have Spark configured in luigi.cfg. However, one or two tasks might need slightly different configs. Here is a neat way to achieve that.

The following can be done in a task class derived from PySparkTask or SparkSubmitTask:

    @property
    def executor_cores(self):   # Override luigi.cfg
        return '1'              # Limit load on service
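To illustrate why this works: luigi exposes config values as class attributes, and Python lets a subclass shadow an inherited attribute with a @property, so the override applies only to that one task class. The sketch below shows the mechanism with a hypothetical stand-in class instead of the real luigi.contrib.spark.SparkSubmitTask, so it runs without a Spark cluster; the attribute names and default values are illustrative only.

```python
class SparkSubmitTaskStub:
    """Hypothetical stand-in for luigi.contrib.spark.SparkSubmitTask.

    In real luigi these values would come from the [spark] section
    of luigi.cfg; here they are hard-coded for illustration.
    """
    executor_cores = '4'    # as if configured in luigi.cfg
    executor_memory = '8g'  # as if configured in luigi.cfg


class LightweightTask(SparkSubmitTaskStub):
    """One task that needs fewer cores than the pipeline default."""

    @property
    def executor_cores(self):   # shadows the config-level default
        return '1'              # limit load on the external service


task = LightweightTask()
print(task.executor_cores)    # the overridden value: '1'
print(task.executor_memory)   # other settings keep their defaults: '8g'
```

Other tasks in the pipeline keep using the defaults from luigi.cfg; only the class with the @property override is affected.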


