Multiple pipelines in Jenkinsfile


Bartłomiej Sacharski

May 29, 2016, 5:47:40 PM
to Jenkins Users
I'm really hyped about Jenkinsfiles - they make it much easier to document and preserve project configuration.
However, all the examples I've seen seem to use a single pipeline.
I've tried to define different stages in separate node blocks, but they were still treated as a single pipeline.

Is it possible to define multiple pipelines in a single Jenkinsfile? Or maybe there's undocumented functionality for a .jenkinsfile extension to handle such cases?

Norbert Lange

May 30, 2016, 11:03:10 AM
to Jenkins Users
Sorry, it's not really clear to me what you expect.
1 Build = one Pipeline (execution).
This Pipeline can be flexible, and separated horizontally with stages and vertically with parallel execution.

Further, what ultimately runs can be scripted from multiple scripts (some of which could be "pipelines" in their own right).
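For illustration, that horizontal/vertical separation can be sketched in scripted Pipeline syntax (stage names and shell commands are made up here):

```groovy
// Sketch only: 'Build' runs first, then two test branches run in parallel.
node {
    stage('Build') {
        sh 'make'
    }
    stage('Test') {
        // parallel takes a map of branch name -> closure
        parallel(
            unit:        { sh 'make test-unit' },
            integration: { sh 'make test-integration' }
        )
    }
}
```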

Maybe you just mean the graphical representation?

Bartłomiej Sacharski

May 30, 2016, 1:20:14 PM
to Jenkins Users
Basically, I would like to define multiple builds (different pipelines) with different stages and keep them in Jenkinsfile(s) in the repo.

Norbert Lange

May 30, 2016, 5:42:34 PM
to Jenkins Users
So you have a git repository with multiple pipeline scripts and want to run them all?

It's possible, but the syntax is somewhat messy:

// global variable
def myClosure

node {
    // checkout and load have to happen inside a node block
    checkout scm
    // load the script; it must return a closure (file name is illustrative)
    myClosure = load 'pipeline.groovy'
}
// call the closure
myClosure()

See the section on triggering a manual load:
https://github.com/jenkinsci/pipeline-plugin/blob/master/TUTORIAL.md

I don't think it's possible to execute plain pipeline scripts this way. You always have to define a function (closure) that's stored and later invoked.

Bartłomiej Sacharski

May 30, 2016, 5:45:24 PM
to Jenkins Users
Thanks. Will look into that.

Michael Neale

May 30, 2016, 11:14:53 PM
to Jenkins Users
One Jenkinsfile is one "pipeline" - what you may have done with many jobs in the past can be done with one pipeline. It can be quite rich if you need it to be.

You can call other "jobs" from a Jenkinsfile, but I am not sure if that is what you mean. 

Alex Ehlke

Jun 28, 2016, 5:23:50 PM
to Jenkins Users
We've been interested in having multiple "pipelines" per repo primarily for operational tasks that are independent of delivery. A couple examples: daily logical backups; some jobs that are manually triggered to manage production services in the event of an outage. If it's the case that Jenkinsfile isn't meant to address these types of jobs (which would otherwise make sense to live within the repo whose service they pertain to), then it's disappointing to lose out on its way of defining jobs. It'd be great to have one way to define jobs whether or not they're in a repo's (or branch's) singular pipeline, rather than relying on a combination of Jenkinsfile and Netflix's Job-DSL for other jobs, for instance, and ending up with disparate job DSLs.

Is there something I'm missing? Is there a way to define "standalone" jobs with the same DSL that Jenkinsfile uses? Or are those left to remain outside of source control (or to some entirely different tool)?

Patrick Wolf

Jun 28, 2016, 6:01:45 PM
to jenkins...@googlegroups.com
Alex,

Do you have multiple jobs for every branch in the repo? A different Jenkinsfile for each job type?

There are a couple of options that I can think of:

1. You can use the env.BRANCH_NAME to determine what steps happen in the Pipeline.
2. You could use the commit message on the branch to determine what part(s) of the Pipeline to execute.
3. A quick and dirty hack would be to use the "Replay" feature of the Pipeline Job to change the steps in the Pipeline.

For each of these you could have each different job defined in a separate file and use the 'load' command in your Jenkinsfile to execute the different job based on the criteria of the branch, message, etc.
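A minimal sketch of option 1 combined with the 'load' approach, assuming each per-job script ends by returning a closure (the file names are illustrative):

```groovy
node {
    checkout scm
    // pick the job definition based on the branch being built
    def job
    if (env.BRANCH_NAME == 'master') {
        job = load 'jenkins/deploy.groovy'
    } else {
        job = load 'jenkins/build-and-test.groovy'
    }
    job()   // invoke the closure returned by the loaded script
}
```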

Alternatively, you can have a primary Jenkinsfile that executes your primary pipeline for every branch, and a separate standalone Pipeline job that loads its job definition from a different file in the same repo. This won't create a separate job with a unique job history for every branch in the repo, but it allows you to have multiple Pipeline jobs for a single repo with the job definitions all stored within the repo.




--
You received this message because you are subscribed to the Google Groups "Jenkins Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to jenkinsci-use...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/jenkinsci-users/b4f6e051-1cf7-4455-be12-9c7a52366b9e%40googlegroups.com.

For more options, visit https://groups.google.com/d/optout.



--

Patrick Wolf
Product Director - Jenkins 
CloudBees

Michael Neale

Jun 29, 2016, 1:11:45 AM
to Jenkins Users
If by the same DSL you mean the Pipeline script (DSL) - yes, you can create standalone Pipeline jobs that aren't tied to any specific repo (and aren't multibranch-aware). They can be triggered by various means, take parameters, etc.

But I may be misunderstanding the question. 

Another pattern I have seen in the wild is that there is a separate repo for the more infra-related concerns, with its own Jenkinsfile, that runs as needed, taking the upstream artifacts that other repos have produced (via their respective pipelines). 

Mike Rooney

Jul 12, 2016, 6:16:27 PM
to Jenkins Users
This need makes a lot of sense to us, where we have a couple related sub-projects (as sub directories) in a single repository. It makes sense that they each have their own pipeline jobs and can run on different schedules. I've also seen cases similar to Alex's (hi Alex!) where there are different tasks you want to do with a single repo that don't make sense as one pipeline job that runs together (building/testing versus a nightly cron-type task that runs in the repo).

It is reasonable for a Jenkinsfile to correspond to a single Pipeline job, because a Jenkinsfile is usually associated with, and run by, one Pipeline job, which isn't a logical "parent" for seeding other jobs. However, a great place for this enhancement would be the GitHub Org / Bitbucket plugins, which scan repositories for Jenkinsfiles and are already in the business of creating multiple Pipeline jobs.

My proposal would be: add a configuration option to the GitHub and Bitbucket plugins that scan organizations for Jenkinsfiles. So "Project Recognizers -> Pipeline Jenkinsfile" would get a box for this which defaults to "Jenkinsfile". Some logical configuration examples might be "Jenkinsfiles/*", "**/Jenkinsfile", or "Jenkinsfile-*". Then the GitHub/Bitbucket plugins can be pointed at an org, or just one repository, and multiple Jenkinsfiles can exist which define different Pipeline jobs.

Bartłomiej and Alex, would something like this satisfy your use cases as well?

- Michael

Bartłomiej Sacharski

Jul 12, 2016, 6:33:15 PM
to Jenkins Users

IMO just having an option to specify the name of the Jenkinsfile would be enough - and I would rather try to implement this as a standalone thing, not tied to the Bitbucket/GitHub plugins (we're using Jenkins with a standalone repository, so a source-agnostic solution would be best, methinks). Of course, it should be available in both the single-branch and multi-branch variants of the Pipeline plugins.

Eric Parton

Jul 20, 2016, 5:26:10 PM
to Jenkins Users
I'm in the same boat you are. My organization keeps several projects within a single repository and it would be great to have the ability to give each of them their own Jenkinsfile and (multi-branch) pipeline.

Alex Kessinger

Jul 20, 2016, 8:43:55 PM
to Jenkins Users
Mike, I'd just like to chime in and say that makes a lot of sense to me. As others have noted, there can be times when you want multiple pipelines within a repo. My own specific use case is that I'd like to be able to trigger a rollback pipeline.

Slide

Jul 21, 2016, 12:43:18 AM
to Jenkins Users

The way I am planning on doing this is with the findFiles and load functions. I'll use findFiles in my Jenkinsfile to look for other build files deeper in the repo and create jobs from those to run. The other files won't necessarily have the same setup as a Jenkinsfile, but will use the Pipeline syntax.
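A sketch of that findFiles/load pattern (findFiles comes from the Pipeline Utility Steps plugin; the file name pattern is illustrative, and each discovered script is assumed to return a closure):

```groovy
node {
    checkout scm
    // discover per-project build scripts anywhere in the repo
    def buildFiles = findFiles(glob: '**/jenkins-build.groovy')
    for (def f : buildFiles) {
        def build = load f.path   // each script should return a closure
        build()
    }
}
```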



Wayne Warren

Jul 21, 2016, 4:17:30 PM
to Jenkins Users

Sorry for chiming in late here, but I have recently been evaluating Jenkins Pipeline Plugin for use at my workplace and have considered the very problem you are describing in this thread--what if a given source repo has multiple Pipeline groovy scripts it wants to use for different purposes?

The approach I have come up with actually looks very similar to what I found described in a relatively new series of blog posts:

http://marcesher.com/2016/06/21/jenkins-as-code-registering-jobs-for-automatic-seed-job-creation/

This blog post describes an approach that leans heavily on the use of the Job DSL Plugin to create jobs by using a "mother seed" DSL job that will create new DSL seed jobs for each registered SCM repo (the blog author describes the use of git and I use git myself but it's easy for me to imagine an approach using some alternate SCM).

This approach allows individual SCM repositories to contain a Job DSL script that in turn defines additional jobs specific to that repository. The jobs defined this way can be Pipeline jobs. When defining these Pipeline jobs in the Job DSL script, you can specify the relative path in the repository to the Pipeline script you want to use.
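For illustration, a Job DSL fragment along those lines (the job name, repo URL, and script path are all hypothetical):

```groovy
// Creates a Pipeline job whose definition lives at a relative
// path inside the repository itself.
pipelineJob('myrepo-nightly-backup') {
    definition {
        cpsScm {
            scm {
                git {
                    remote { url('https://example.com/myrepo.git') }
                    branch('master')
                }
            }
            // relative path to the Pipeline script inside the repo
            scriptPath('jenkins/nightly-backup.groovy')
        }
    }
}
```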

As far as I can tell from the description of your situation this should fit your needs perfectly. I recommend starting at the earliest blog post in that series for full context on the approach the author is describing: http://marcesher.com/2016/06/08/jenkins-as-code-creating-jenkins-jobs-with-text-not-clicks/

The blog series appears to still be a work in progress as the author has not yet reached the point where he describes the interaction between Job DSL and Pipeline but it seems to me like this should be obvious. Job DSL defines the Pipeline jobs. Pipeline defines the behavior of those jobs.

Good luck!

alex kessinger

Jul 21, 2016, 4:42:58 PM
to jenkins...@googlegroups.com
I've tried the seed Job DSL method previously. Some automation was better than no automation, but I think the Jenkinsfile-in-repo approach is even better. If I make a change to the Jenkinsfile, that change can be isolated in each environment/branch until it has been promoted to the next step. If I have one master seed job, it's harder for me to control promotion between environments.

One other thing to note: Job DSL and Pipeline are both Groovy, but they are not the same DSL. That may not matter in this case, because you are orchestrating Pipelines with Job DSL.


Mike Rooney

Jul 21, 2016, 10:39:43 PM
to jenkins...@googlegroups.com

On Thu, Jul 21, 2016 at 11:42 AM, alex kessinger <void...@gmail.com> wrote:
I've tried the seed Job DSL method previously. Some automation was better than no automation, but I think the Jenkinsfile-in-repo approach is even better. If I make a change to the Jenkinsfile, that change can be isolated in each environment/branch until it has been promoted to the next step. If I have one master seed job, it's harder for me to control promotion between environments.

I agree - being able to have multiple Jenkinsfiles, or to specify multiple jobs in one, definitely feels like the right and superior option. I suppose you could work around it by having your Jenkinsfile only run on master (either via an option in the job or by checking env.BRANCH_NAME in the Jenkinsfile), and then have that Jenkinsfile run Job DSL, which creates Pipeline jobs that run for every branch/PR as desired. But that will probably be organizationally inferior, and everyone will have to reinvent logic to create and organize these sub-jobs that should be maintained in one place.

Wayne Warren

Jul 22, 2016, 5:43:25 PM
to Jenkins Users


On Thursday, July 21, 2016 at 9:42:58 AM UTC-7, Alex Kessinger wrote:
I've tried the seed Job DSL method previously. Some automation was better than no automation, but I think the Jenkinsfile-in-repo approach is even better. If I make a change to the Jenkinsfile, that change can be isolated in each environment/branch until it has been promoted to the next step. If I have one master seed job, it's harder for me to control promotion between environments.

One other thing to note: Job DSL and Pipeline are both Groovy, but they are not the same DSL. That may not matter in this case, because you are orchestrating Pipelines with Job DSL.

Yeah, this is important to note; just as important is that there's a reason they are different DSLs. Specifically, they run in different contexts. One manages job lifecycles and gives the ability to configure an entire job head-to-toe (as well as views on the Jenkins instance); the other defines the behavior of those jobs. Pipeline is fine when you want to sequence a particular project's test/build/deploy steps in a visually appealing way that is easy to configure as code, but what if you want to trigger more than one Pipeline in parallel based on a change in some common upstream dependency? This is where, as you say, the Pipelines have to be orchestrated with Job DSL.

It's somewhat difficult to imagine unifying those contexts so that you could, say, have one DSL that both manages job lifecycles and multi-job relations and defines their behavior. In my opinion (somewhat biased by my desire to structure the relationships between different project pipelines), it would be a huge win if it were possible, but as it stands Job DSL really does seem to fill in at least some of the blanks left by Pipeline.

Sorin Ionuț Sbârnea

Oct 3, 2016, 9:40:49 AM
to Jenkins Users
Did you finish your implementation? Groovy is far from my native language, and it would be really helpful if you could share a snippet that does the subdirectory Jenkinsfile discovery and loading.

Joshua Noble

Mar 23, 2017, 10:32:01 PM
to Jenkins Users
Has anyone been able to make any progress on this?

I have a new Jenkins 2 cluster up and running, and I've created declarative Jenkinsfile pipelines for each app repo. This works great for building all app branches and PRs. However, we have some more generic parameterized deploy jobs that we'd like to add. The GitHub Organization plugin is also currently used to automatically scan all of our repos; when Jenkinsfiles are found, jobs are automatically built - but under the org/repo/branch namespace.

We have a generic jenkins repo that hosts some legacy Jenkins Job DSL plugin scripts. Ideally, we'd like to add a Jenkinsfile to this repo and have that job define these generic deploy and utility jobs (perhaps via Groovy load calls). I would prefer to use the newer Jenkinsfile-based syntax - declarative or scripted is fine. I am hoping I don't have to fall back to the Jenkins Job DSL plugin, though.

Is there any simple way to accomplish this?

Mike Rooney

Mar 24, 2017, 8:36:43 PM
to jenkins...@googlegroups.com
I have not found a solution yet but I'd love to know if there is one. Current best practice seems to be to have a Jenkinsfile which triggers Job DSL to process DSL files located in a subdirectory, so at least they are all organized in your repo and Jenkinsfile serves as the single point of entry. You can still have Jenkinsfile perform your "primary build" natively in this case, just add a Job DSL stage somewhere in it.
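A sketch of that layout, assuming the Job DSL plugin's jobDsl Pipeline step is available (the build command and the jobs/ directory are illustrative):

```groovy
node {
    checkout scm
    stage('Build') {
        sh 'make'   // the primary build still runs here natively
    }
    stage('Seed jobs') {
        // process every Job DSL script kept under jobs/ in this repo
        jobDsl targets: 'jobs/*.groovy'
    }
}
```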
