Templating mechanism for Job DSL


Arno Moonen

unread,
Apr 26, 2016, 6:03:30 AM
to job-dsl-plugin
The company I'm currently working for has a lot of different products for which they want to set up and maintain Jenkins CI build pipelines.
I recommended that they use Job DSL, because it makes the job configuration easier to maintain.

Looking at the Job DSL scripts they currently have, I can safely say that they are about 90% identical.
So, to remove all that redundant configuration and make it more DRY, I would like to use some kind of template.
Basically, I want to define some input variables, configure the jobs with a default configuration, and be able to overwrite or add some configuration afterwards.

The wiki briefly mentions the using command, but the example is rather small. I can't decide whether that is what I need, or whether I should use a different approach.
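For reference, using copies the configuration of an existing job on the Jenkins master into the generated job as a starting point, which is why the template job has to exist first. A minimal sketch (the job and repository names here are made up):

```groovy
// sketch: assumes a job named 'maven-template' already exists on the master
job('product-a-build') {
    using('maven-template')  // start from the template job's config XML
    scm {
        git('ssh://git.example.com/product-a.git')  // hypothetical repo URL
    }
}
```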

In the end I want to split up the scripts in multiple files: a few defining the different templates and a lot for all the specific products.

Arno Moonen

unread,
Apr 26, 2016, 6:09:53 AM
to job-dsl-plugin
After a bit more searching in the mailing list I came across this thread:
https://groups.google.com/d/msg/job-dsl-plugin/1FtjN0cIKrk/WUUMpJSVrlUJ

So I reckon using is the command I'm looking for. However, the example given in that thread does not work for me (I tried it on http://job-dsl.herokuapp.com/ ):
javaposse.jobdsl.dsl.JobTemplateMissingException: The template job with name maven-template does not exist.

- Arno

On Tuesday, April 26, 2016 at 12:03:30 PM UTC+2, Arno Moonen wrote:

Victor Martinez

unread,
Apr 26, 2016, 7:30:16 AM
to job-dsl-plugin
You can also test those examples locally.

Arno Moonen

unread,
Apr 28, 2016, 5:00:02 AM
to job-dsl-plugin
I'm currently using the instructions on the User Power Moves page to run the script locally.
I also have a better understanding now of what I want to do.

Current Situation:
For a few products, a build pipeline has been defined in Job DSL, and these scripts have been used as the starting point for similar products.
For every product, a copy exists of a script that is almost 100% identical to the other scripts, except for some variables defined at the top.
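The per-product variation described above might look like this hypothetical header block (all names are made up for illustration):

```groovy
// hypothetical per-product variables; the rest of the script is identical
def productName = 'product-a'
def gitUrl      = "ssh://git.example.com/${productName}.git"
def viewName    = "${productName}-pipeline"
```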

Wanted Situation / what I've done so far:
I've created a new folder "template" within the folder that holds all the Job DSL scripts.
This folder contains a simple POGO that holds all the variables that used to be at the top of each script.
The POGO also allows me to derive some of the values, since they are based on conventions and are rarely given a value other than what the convention dictates.
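A POGO along those lines might look like this sketch (the class and field names are assumptions, not the poster's actual code):

```groovy
// template/ProductConfig.groovy -- hypothetical POGO with convention-based defaults
class ProductConfig {
    String name
    String gitUrl

    // derive the repository URL from the product name unless explicitly set
    String getGitUrl() {
        gitUrl ?: "ssh://git.example.com/${name}.git"
    }
}
```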

I've created a DSL Factory (as described on the wiki) to generate the main MultiJob configuration as well as the nightly build job.
These are not bound to change often (I do not expect any maintenance on this actually).
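The pattern the wiki describes is to hand the script's DslFactory to such helper classes; a minimal sketch (class, method, and job names are assumptions):

```groovy
import javaposse.jobdsl.dsl.DslFactory

// hypothetical factory that builds the MultiJob from a product config object
class PipelineFactory {
    static void build(DslFactory dsl, def product) {
        dsl.multiJob("${product.name}-pipeline") {
            // ... common MultiJob phases shared by all products ...
        }
    }
}
```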

The next step is to cut the current Job DSL scripts into pieces (breaking the pipeline down into jobs).
Our build pipeline is pretty standard and contains jobs like Build, Static Code Analysis, Unit Test, Integration Test, Publish, Flash, and Feature Test.
Instead of maintaining these in one giant file, I would like to split them up into multiple files (one per job type).
So, within the "template" directory I created "job_build.groovy", "job_sca.groovy", and so on.
I still want these to be "plain" Job DSL files, so my co-workers (who are less experienced with Groovy and Job DSL) can still do some maintenance on them by referencing the Job DSL documentation.
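Each template file could then be an ordinary Job DSL snippet, for example (hypothetical contents, assuming a product variable is in scope):

```groovy
// template/job_build.groovy -- plain Job DSL, readable without Groovy expertise
freeStyleJob("${product.name}-build") {
    scm {
        git(product.gitUrl)
    }
    steps {
        shell('make all')
    }
}
```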

In the top-level directory I still have a PRODUCTNAME.groovy, which contains a definition of the POGO described earlier and a call to the DSL factory (giving it a reference to "this" and the POGO).
The next step is to include the Job DSL files for the specific jobs, so I added:
evaluate readFileFromWorkspace('template/job_build.groovy')
Unfortunately this does not work, since the file is evaluated in a different "context"; therefore the Job DSL methods as well as the POGO I defined are unknown.
So, my next question is: how do I include the other Groovy scripts and still keep the context and variables defined in the script that includes them?
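One workaround (a sketch, untested against this setup) is to evaluate the file with an explicit binding, so the included script reaches the DSL methods and the POGO through named variables instead of the lost context:

```groovy
// sketch: expose the DslFactory ('this' in the seed script) and the POGO by name
def binding = new Binding([dsl: this, product: product])
new GroovyShell(binding).evaluate(readFileFromWorkspace('template/job_build.groovy'))

// template/job_build.groovy would then call the DSL via the binding, e.g.:
// dsl.freeStyleJob("${product.name}-build") { ... }
```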

Thanks in advance,
Arno


On Tuesday, April 26, 2016 at 1:30:16 PM UTC+2, Victor Martinez wrote:

Malcolm Studd

unread,
May 16, 2016, 11:52:45 AM
to job-dsl-plugin
I was trying to do a similar thing, and managed to make it work by mixing the external Groovy classes into the script. For example, a yum update job:

package project

import javaposse.jobdsl.dsl.DslContext
import javaposse.jobdsl.dsl.jobs.FreeStyleJob

class YumJob {
    FreeStyleJob yumUpdateJob(String name) {
        yumUpdateJob(name) { }
    }

    // Creates the job with sensible defaults, then applies the caller's
    // closure so individual scripts can override or extend the configuration.
    FreeStyleJob yumUpdateJob(String name, @DslContext(FreeStyleJob) Closure closure) {
        // freeStyleJob resolves against the DSL script this class is mixed into
        def job = freeStyleJob(name) {
            displayName("System: Yum update")
            logRotator {
                artifactNumToKeep(60)
            }
            quietPeriod(0)
            steps {
                shell("""
sudo yum clean all
sudo yum -y update
""")
            }
        }
        if (closure) {
            job.with(closure)
        }
        job
    }
}

In my main DSL script:
import project.YumJob

// mix YumJob's methods into this script so yumUpdateJob() can be called directly
this.metaClass.mixin YumJob

yumUpdateJob("${folderName}/yum") {
    extendedEmail('m...@example.org')
}
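An alternative that avoids metaClass tricks is to pass the DslFactory explicitly instead of mixing the class in; a sketch of the same idea (class and method names are assumptions):

```groovy
import javaposse.jobdsl.dsl.DslFactory
import javaposse.jobdsl.dsl.jobs.FreeStyleJob

class YumJobs {
    // the seed script calls: YumJobs.yumUpdateJob(this, "${folderName}/yum") { ... }
    static FreeStyleJob yumUpdateJob(DslFactory dsl, String name, Closure closure = null) {
        def job = dsl.freeStyleJob(name) {
            displayName('System: Yum update')
        }
        if (closure) {
            job.with(closure)  // apply per-script overrides on top of the defaults
        }
        job
    }
}
```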


Hope this helps


Malcolm