[pipeline-plugin] [workflow-plugin] Parallelizing Jenkins Pipelines


Wayne Warren

Jul 1, 2016, 7:15:21 PM
to Jenkins Users

So I have been experimenting with the Jenkins Pipeline plugin recently. One of the requirements I am evaluating it against is the ability to express complex relationships between the build and test steps of different projects, and to parallelize the execution of those steps where possible.

To visualize the kind of relationship I have in mind, and maybe make it more concrete for folks reading this thread, I have attached a simple diagram illustrating the heterogeneous nature of the pipelines we deal with within our engineering organization. To understand the diagram, think of the colored blocks for "Project 1", "Project 2", "Project 3", and "Uber Project" as consisting of "Pipeline" stages defined in Jenkinsfiles found in different git repos.

"Project 1" can be thought of as a library that has no upstream dependencies on any other project in our organization, so it does not have an "integration" test stage. "Project 2" and "Project 3" both depend on a library artifact produced by "Project 1". "Project 2" and "Project 3", while not necessarily libraries themselves, are components of the "Uber Project", which is intended to be built and shipped as a higher-level package, usually at the operating-system level (rpm, deb, msi, etc.). Once that package is built, there are "acceptance" tests that each project may define to run against the new artifact. Once all of those tests pass, the package is ready to be "deployed", which you can think of as being staged either for manual validation (in cases where our QA team has not had a chance to automate tests) or for early access for customers with specific support contracts.

[attachment: diagram of the Project 1/2/3 and Uber Project pipelines]
Ideally, our CI pipelining technology would support the following triggering scenarios:

  1. SCM Event on Project 1 repo
    1. This should trigger the Pipeline for Project 1. 
    2. On successful completion, the Pipelines for both Project 2 and Project 3 should be triggered.
    3. Once both Project 2 Pipeline and Project 3 Pipeline have finished, Uber Project Pipeline should trigger; to be clear, the Uber Project Pipeline should only trigger if both Project 2 and Project 3 Pipelines finish successfully.
  2. SCM Event on Project 2 repo
    1. This should trigger the Pipeline for Project 2. Project 1 and Project 3 pipelines should not be triggered.
    2. On successful completion, the Uber Project Pipeline should trigger; no requirement this time that Project 3 pipeline also run successfully (Uber Project Pipeline should use the last successful artifact from Project 3)
  3. SCM Event on Project 3 repo
    1. This should trigger the Pipeline for Project 3. Project 1 and Project 2 pipelines should not be triggered.
    2. On successful completion, the Uber Project Pipeline should trigger; no requirement this time that Project 2 pipeline also run successfully (Uber Project Pipeline should use the last successful artifact from Project 2)
  4. SCM Event on Uber Project repo
    1. This should trigger the Pipeline for Uber Project. Project 1 and Project 2 and Project 3 pipelines should not be triggered.

There are a couple of problems I am currently seeing in defining a pipeline that enables these scenarios:
  1. It does not seem possible to insert Pipeline stages inside the branches of a parallel block; instead, parallel blocks seem to provide behavior similar to what Matrix Project jobs provided in Jenkins 1.x. In other words, there seems to be a problem of "parallel composability" of Pipeline plugin pipelines.
  2. It is not clear that any one Pipeline job can trigger off of SCM events coming from multiple repos. To build a pipeline like this, it seems each project would need to define, within its own Jenkinsfile, a distinct Pipeline containing ephemeral "copies" of each set of project-specific stages shown above. This assumes, by the way, that it is practical to store base pipeline components/stages as Groovy methods in a global workflow lib and re-use them in different project contexts. (It is also possible that the best way to represent this kind of relationship between Pipelines is as downstream/upstream triggers of one another.)
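To show problem 1 more concretely, here is roughly the shape I was attempting in scripted Pipeline; the branch names and make targets are made up:

```groovy
// Sketch of the shape behind problem 1: trying to compose stages
// inside parallel branches (stage/sh contents are illustrative).
node {
    stage('build') {
        sh 'make'
    }
    parallel(
        'project-2-tests': {
            // ideally this would be a first-class Pipeline stage, but
            // it only behaves as a branch, much like a Matrix axis
            sh 'make test-project-2'
        },
        'project-3-tests': {
            sh 'make test-project-3'
        }
    )
}
```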
I am curious whether anyone else has come across these problems when attempting to organize their Pipeline code. Is this kind of pipeline composability a feature the Jenkins community is interested in working on as a goal for the Pipeline plugin?
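To make the global-workflow-lib re-use idea from point 2 concrete, the kind of thing I have in mind looks roughly like this; the method name, repo URLs, and make targets are hypothetical:

```groovy
// Hypothetical method stored in the global workflow library, so that
// several Jenkinsfiles can instantiate the same set of project stages.
def buildAndTest(String projectName, String repoUrl) {
    stage("${projectName} checkout") {
        git url: repoUrl
    }
    stage("${projectName} build") {
        sh 'make'
    }
    stage("${projectName} unit") {
        sh 'make test'
    }
}

// A downstream Jenkinsfile could then compose ephemeral "copies":
// node {
//     buildAndTest('project-2', 'git@example.com:org/project-2.git')
//     buildAndTest('project-3', 'git@example.com:org/project-3.git')
// }
```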

For what it's worth, I have considered this alternate approach to composing different projects together using Pipeline:

[attachment: diagram of the alternate composition approach]
But it has a number of its own problems:

  1. No one in our engineering organization really thinks of "composing" pipelines in this way.
  2. There will be a need for some kind of persistent state mechanism to check whether or not the individual stages for each pipeline have already run for the current SHA of each project repo (in order to short-circuit project stages that have already run).
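For problem 2, the short-circuit check I imagine would look something like the following sketch. The marker-file scheme and all paths/names here are assumptions for illustration; fingerprints or an external store might be a better fit:

```groovy
// Sketch: skip stages whose work for the current SHA has already run.
// Marker files under $JENKINS_HOME act as crude persistent state.
def alreadyBuilt(String project, String sha) {
    return fileExists("${env.JENKINS_HOME}/ci-state/${project}-${sha}")
}

def markBuilt(String project, String sha) {
    sh "mkdir -p '${env.JENKINS_HOME}/ci-state'"
    writeFile file: "${env.JENKINS_HOME}/ci-state/${project}-${sha}", text: ''
}

node {
    def sha = sh(script: 'git rev-parse HEAD', returnStdout: true).trim()
    if (!alreadyBuilt('project-1', sha)) {
        stage('project-1 build') {
            sh 'make'
        }
        markBuilt('project-1', sha)
    } else {
        echo "project-1 already built at ${sha}; skipping"
    }
}
```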

So if anyone has any advice for how to proceed or insight on upcoming Jenkins Pipeline Plugin work, I'd really appreciate it!
