I use the Job DSL plugin to generate a lot of freestyle jobs (each set to run on a different node label) into different folders (basically one folder per department).
What I'd like now is a job inside each folder that, when run, triggers all of the other jobs in that folder. The tests are independent and run on a schedule, but sometimes something goes wrong and you want to re-run everything on the spot. I tried the Bulk Builder plugin, but it doesn't work within folders or with jobs in folders.
So I'd like to use DSL to build a job that runs the other jobs; all the jobs in each folder are generated by DSL anyway. I imagine I'd need to:
* Determine the folder the current job is in.
* Get the names and labels of the other jobs within that folder.
* Define a pipeline job (it doesn't have to be a pipeline, but I gather that's the recommended approach now?).
* Add steps to trigger each job, respecting the label recorded for it.
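For reference, here is roughly the shape I have in mind, as a seed-script sketch. The folder name is an assumption for illustration, and since the freestyle jobs are generated by the same DSL anyway, the target list could equally come straight from the seed script instead of the Jenkins API:

```groovy
// Sketch: for a given folder, generate a "run-all" pipeline job that
// triggers every freestyle job already in that folder. The folder path
// 'qa-department' is a made-up example.
import jenkins.model.Jenkins
import hudson.model.FreeStyleProject

def folderName = 'qa-department'  // assumed folder path
def folder = Jenkins.get().getItemByFullName(folderName)

// Collect (name, label) pairs from the existing freestyle jobs.
def targets = folder.items
    .findAll { it instanceof FreeStyleProject }
    .collect { [name: it.name, label: it.assignedLabelString] }

pipelineJob("${folderName}/run-all") {
    definition {
        cps {
            // Each triggered build runs on its own label automatically,
            // because the label lives in the target job's definition.
            script("""
                def branches = [:]
                ${targets.collect { t ->
                    "branches['${t.name}'] = { build job: '${folderName}/${t.name}', wait: false }"
                }.join('\n')}
                parallel branches
            """.stripIndent())
            sandbox()
        }
    }
}
```

Note that the generated pipeline doesn't need the labels at all for plain re-runs, since `build job:` respects whatever label the target job is configured with; the labels would only matter if the runner job had to do per-label work itself.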
Unfortunately I've given it quite a few shots in different ways over the past week, and I give up: from serialization errors when iterating the folders to just constructing the pipeline job, I've failed at every step of the way.
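For what it's worth, the serialization errors seem to come from holding non-serializable Jenkins API objects (the folder and job items) across pipeline step boundaries. One pattern I've seen suggested is wrapping the iteration in a `@NonCPS` method so only plain strings escape it; a sketch of that (environment-variable handling is my assumption):

```groovy
// Pipeline sketch: list sibling jobs without CPS serialization errors.
// The Jenkins API objects never leave the @NonCPS method; only
// serializable strings do.
import jenkins.model.Jenkins

@NonCPS
List<String> siblingJobNames(String folderPath, String selfName) {
    def folder = Jenkins.get().getItemByFullName(folderPath)
    folder.items
          .findAll { it.name != selfName }   // skip the runner job itself
          .collect { it.fullName }           // plain strings only
}

node {
    // JOB_NAME is "<folder>/<job>", so the folder is everything
    // before the last slash.
    def folderPath = env.JOB_NAME.substring(0, env.JOB_NAME.lastIndexOf('/'))
    for (String jobName : siblingJobNames(folderPath, env.JOB_BASE_NAME)) {
        build job: jobName, wait: false
    }
}
```

But I haven't managed to get an end-to-end version of this working.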
Does anyone have something like this that works?