POTD: Jenkinsfile Runner


Kohsuke Kawaguchi

Mar 1, 2018, 2:23:15 PM
to jenkin...@googlegroups.com
Jenkinsfile Runner is an experiment to package Jenkins pipeline execution as a command line tool. The intended use cases include:
  • Use Jenkins in a Function-as-a-Service context
  • Assist editing Jenkinsfile locally
  • Integration-test shared libraries
Over the past year, I've had some deep-dive 1:1 conversations with Jenkins users, and I felt something like this might move the needle for them in an important way.

I'd love to hear any reactions on your side. Could something like this be important for you? Does it miss any key points? If you mentally picture a perfect version of this, what would it do, and how would you use it?

Let me know!

--
Kohsuke Kawaguchi

Kohsuke Kawaguchi

Mar 1, 2018, 2:23:55 PM
to jenkin...@googlegroups.com
And of course I forgot to include the link to the project! https://github.com/kohsuke/jenkinsfile-runner
--
Kohsuke Kawaguchi

Jesse Glick

Mar 1, 2018, 6:01:00 PM
to Jenkins Dev
For the third use case, what might be useful is an archetype or two analogous to

https://github.com/jenkinsci/archetypes/tree/master/scripted-pipeline/src/main/resources/archetype-resources
https://github.com/jenkinsci/archetypes/tree/master/global-shared-library/src/main/resources/archetype-resources

which use the Docker packaging and take care of all the boilerplate
needed to have a Maven module which, when “built”, runs the specified
`Jenkinsfile` or library against some test inputs in a well-controlled
environment, and presumably passes or fails according to some
conditions. (For example, Groovy validation scripts, akin to Maven
Invoker?) I think the hard part would be stubbing out side effects you
do not want—something handled easily by JenkinsPipelineUnit, at the
cost of running in an unrealistic environment.


For anyone interested in this general topic,

https://issues.jenkins-ci.org/browse/JENKINS-33925

has some longstanding discussion and ideas. (And, wow, 85 votes! Alas
there are no clear acceptance criteria for closing it.)

Kohsuke Kawaguchi

Mar 1, 2018, 9:36:02 PM
to jenkin...@googlegroups.com
Thanks for the pointer.

TBH, I haven't thought much at all about what that 3rd bullet actually means :-) It was suggested by Johannes Schnatterer (I don't know his email address; I hope he's reading this!), and it sounded like an area where this could be relevant.

I can clearly see the value and trade-off of JenkinsPipelineUnit. I think it's a great project.

I suppose one path Jenkinsfile Runner could take is not to stub out side effects. I think that's what Johannes meant by "integration" in "integration test": so that you can actually run some real stuff.

But I'm also intrigued by the thought of Jenkinsfile Runner running Jenkins inside it while stubbing out actual steps. That could certainly enable some additional testing scenarios.

The question I'd like to ask people like Johannes, the folks behind JenkinsPipelineUnit (once again, I don't know their email addresses, even though I met them at Jenkins World!), and Pipeline users in general is whether they would find that valuable.



--
You received this message because you are subscribed to the Google Groups "Jenkins Developers" group.
To unsubscribe from this group and stop receiving emails from it, send an email to jenkinsci-de...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/jenkinsci-dev/CANfRfr3fDig6_-X-Pjnm1jRgv3RFZnC59ezcVpuSsjU4Wrw9sQ%40mail.gmail.com.
For more options, visit https://groups.google.com/d/optout.
--
Kohsuke Kawaguchi

Oleg Nenashev

Mar 2, 2018, 4:55:02 AM
to Jenkins Developers
Hi KK,

I hacked something like that together last summer when I wanted to have an end-to-end flow. Your implementation is much more ready for use, and +100 from me for the proposal. The CLI approach would be totally reasonable, especially for things like PipelineUnit, where you currently have to bootstrap the job loading yourself.

My approach: I eventually switched to another approach to local Pipeline development (there is a short slide deck from Jenkins World): a fully Dockerized, static-config "clone" of the ci.jenkins.io setup, which used the FileSystem SCM plugin to bootstrap Pipelines and Groovy hooks to configure the instance.

Currently I am reworking my demo to support the configuration-as-code plugin and other such goodies, so it will be more aligned with recent JEPs. OTOH, my approach is far from CLI-only, and it definitely does not address all Pipeline testing use cases.


BR, Oleg

Bill Dennis

Mar 2, 2018, 12:26:29 PM
to Jenkins Developers
Hello Kohsuke -

I am a developer using Jenkins pipeline quite a lot where I work.

We are using Jenkins pipelines in two scenarios:
  • For CI building and testing some of our internal components (what Jenkins is traditionally used for)
  • For running / orchestrating complex automation processes (so Jenkins is talking to some external systems using SOAP / REST etc) via tooling and even directly via REST using plugins.
I have mostly used JenkinsPipelineUnit for testing / validation of the pipelines and I have been looking into the direct / live approach that Oleg demonstrated (running Jenkins locally in Docker and getting the pipeline being developed direct from a host file system volume mount).

I think Jenkinsfile Runner would be really useful for developers who don't need or want the overhead of developing tests with JenkinsPipelineUnit. I have worked with some developers wanting to develop Jenkinsfiles for their CI process, and the main problem is knowing whether the Jenkinsfile will work when they commit it to the repo. They go round a loop of commit / fix on the production Jenkins, or use the Jenkins pipeline "replay" feature. It can be a painful process if you are not familiar with Jenkins pipeline and Groovy syntax!

I think some things to consider are:

- How does Jenkinsfile Runner replicate the agent / slave labels on the target Jenkins?
- How to deal with tooling on the target Jenkins (custom tools, JDKs, Gradle, etc)?

I think the perfect Jenkinsfile Runner for me would provide:

- Somehow capture the plugins, tooling and agents on our production Jenkins
- Validate the Jenkinsfile pipeline syntax
- Validate the Jenkinsfile against the plugins and agents / tooling (fail if it refers to some tool or agent not configured for example).
- Run the Jenkinsfile in some sort of "no-op" mode: what would it do if I ran it, without actually doing anything
- Actually run the Jenkinsfile locally so I can know it works completely before committing to source control.
- Run the Jenkinsfile on the target Jenkins master server using the resources of that server (so know it works on the server).

Hope that helps!

Bill Dennis

Kohsuke Kawaguchi

Mar 2, 2018, 12:57:24 PM
to jenkin...@googlegroups.com
On Fri, Mar 2, 2018 at 9:26 AM Bill Dennis <bill....@gmail.com> wrote:
Hello Kohsuke -

I am a developer using Jenkins pipeline quite a lot where I work.

We are using Jenkins pipelines in two scenarios:
  • For CI building and testing some of our internal components (what Jenkins is traditionally used for)
  • For running / orchestrating complex automation processes (so Jenkins is talking to some external systems using SOAP / REST etc) via tooling and even directly via REST using plugins.
I have mostly used JenkinsPipelineUnit for testing / validation of the pipelines and I have been looking into the direct / live approach that Oleg demonstrated (running Jenkins locally in Docker and getting the pipeline being developed direct from a host file system volume mount).

I think Jenkinsfile Runner would be really useful for developers who don't need or want the overhead of developing tests with JenkinsPipelineUnit. I have worked with some developers wanting to develop Jenkinsfiles for their CI process, and the main problem is knowing whether the Jenkinsfile will work when they commit it to the repo. They go round a loop of commit / fix on the production Jenkins, or use the Jenkins pipeline "replay" feature. It can be a painful process if you are not familiar with Jenkins pipeline and Groovy syntax!

This kind of context is really helpful. Thank you!

I think some things to consider are:

- How does Jenkinsfile Runner replicate the agent / slave labels on the target Jenkins?
- How to deal with tooling on the target Jenkins (custom tools, JDKs, Gradle, etc)?

Right, I guess your point is that Jenkinsfile Runner should aim to run the Jenkinsfile in a much more realistic setup, and that doesn't stop at using real Jenkins and real Pipeline plugins; it also needs to include other configurations of Jenkins. I think Jesse made a similar observation. I have a few thoughts:
  • Configuration-as-code could play a role here in terms of letting people define the configuration of Jenkins once and use it both in production and in a setup like Jenkinsfile Runner
  • I'm a fan of making Jenkinsfile itself more portable. For example, if people are already in the mode of using docker images to run builds in, then more of the tooling would be packaged in there, and it should allow Jenkinsfile Runner to run your project in much the same way as your production Jenkins. I'm curious how much of this is already reality vs an ideal that people are working toward.

I think the perfect Jenkinsfile Runner for me would provide:

- Somehow capture the plugins, tooling and agents on our production Jenkins
- Validate the Jenkinsfile pipeline syntax

I think this is already happening as a result of actually running the pipeline -- one of the virtues of actually using the real Pipeline plugins to run!
 
- Validate the Jenkinsfile against the plugins and agents / tooling (fail if it refers to some tool or agent not configured for example).
- Run the Jenkinsfile in some sort of "no-op" mode: what would it do if I ran it, without actually doing anything

This one is interesting. I assumed JenkinsPipelineUnit does this pretty well, though. Can you tell me more about this? I'm imagining you'd want to be able to selectively mock out some steps (e.g., when Jenkinsfile gets to sh "./deploy.sh" don't actually do it and pretend that it succeeded) but more details would be helpful.
 
- Actually run the Jenkinsfile locally so I can know it works completely before committing to source control.

Yeah, this was the first goal for me.
 
- Run the Jenkinsfile on the target Jenkins master server using the resources of that server (so know it works on the server).

This got me thinking that maybe all I needed was a Jenkins CLI command that, behind the scenes, creates a temporary/hidden job on the target Jenkins master and runs the Pipeline. IOW, keep the same development flow as Jenkinsfile Runner today, but don't run Jenkins locally; just do it on your actual Jenkins.
 

Hope that helps!

Bill Dennis


--
Kohsuke Kawaguchi

Jesse Glick

Mar 2, 2018, 2:08:17 PM
to Jenkins Dev
On Fri, Mar 2, 2018 at 12:56 PM, Kohsuke Kawaguchi <k...@kohsuke.org> wrote:
> well, though. Can you tell me more about this? I'm imagining you'd want to
> be able to selectively mock out some steps (e.g., when Jenkinsfile gets to
> sh "./deploy.sh" don't actually do it and pretend that it succeeded)

One suggestion alluded to in JENKINS-33925 was to have a globally
recognized “dry-run” flag (akin to `Main.isUnitTest` I suppose) that
could be checked from various features in core or plugins, so that for
example the `mail` step would know to just print out the mail it
_would_ have sent without actually contacting an SMTP server.

This would not help directly with your example above—since Jenkins
would have no way of knowing whether `deploy.sh` actually had any
externally visible effects, or was just a build command operating
locally in the workspace—but perhaps in dry-run mode a special
environment variable could be set for the whole build which would be
visible to external processes, so your own script could include
something like

if [ "$DRY_RUN" = true ]
then
    echo "Would be deploying to ${SERVER}:"
    ls -lR
    exit 0
fi
# else proceed

Not as flexible as the fine-grained mocking available (as I recall) in
JenkinsPipelineUnit, but perhaps sufficient for many use cases.
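Wrapped in a function, the same convention becomes a runnable sketch. To be clear, `DRY_RUN` and `deploy` here are illustrative assumptions, not an existing Jenkins feature:

```shell
#!/bin/sh
# Sketch of the proposed dry-run convention. DRY_RUN and deploy() are
# illustrative assumptions, not an existing Jenkins feature.
deploy() {
    # $1: target server name
    if [ "${DRY_RUN:-false}" = true ]; then
        # Report what would happen instead of doing it.
        echo "dry-run: would deploy to $1"
        return 0
    fi
    echo "deploying to $1"
}

DRY_RUN=true
deploy staging    # prints "dry-run: would deploy to staging"
```

In a real build, Jenkins would export the variable for the whole run, and any script that consults it gets dry-run behavior for free.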

> This got me thinking that maybe all I needed was a Jenkins CLI command that
> behind the scene creates a temporary/hidden job on the target Jenkins master
> and run the Pipeline. IOW, keep the same development flow as Jenkinsfile
> Runner today, but don't run Jenkins locally, just do it on your actual
> Jenkins.

Already exists (though it uses your real job). See for example

https://jenkins.ci.cloudbees.com/cli/command/replay-pipeline

joha...@schnatterer.info

Mar 4, 2018, 11:13:17 AM
to Jenkins Developers
I think Jenkinsfile Runner brings a lot of opportunities for pipeline developers. The most obvious ones to me are
  1. Pipeline development (Jenkinsfile)
  2. Shared library development

Pipeline development

Right now (as described by others in this thread) pipeline development is either a loop of committing / fixing pipelines on production Jenkins, using pipeline replay on production Jenkins or setting up a local instance manually.

With Jenkinsfile Runner we can get faster feedback without polluting our commit or Jenkins build history and don't have to set up a local instance manually.


Shared library development

Shared library development right now works much in the same way as pipeline development, except that you have the library code and another (often production) Jenkinsfile to maintain, in order to try out (as opposed to automatically test) your Jenkinsfile.

For shared libraries, we thankfully already have JenkinsPipelineUnit, which makes it easier to implement some tests. However (as also mentioned by others in this thread), this is unit testing only. It mocks most of our environment. Often, green unit tests mean nothing for productive use of our shared library. I even gave up test-driven development for shared libraries in favor of '90s trial-and-error-style programming, because most of the time when trying a library with green unit tests in production, it turns out that the real Jenkins environment has some restriction that is beyond the scope of JenkinsPipelineUnit (e.g. sandboxing).
The worst thing about the current state is that we don't have reliable regression tests. A change in a shared library with green unit tests is far from convincing me that the library will continue to work in production.

With Jenkinsfile Runner we could write small Jenkinsfiles within the shared library repo's test folder and run them from the Jenkinsfile of the shared lib, pretty much in the same way as we use the Maven Invoker Plugin (as mentioned by Jesse) when developing Maven plugins.

A first approach to shared library integration testing with Jenkinsfile Runner
My naive first approach was to build a Docker Image that contains Jenkinsfile Runner and all default plugins.

docker run -v ~/src/it/myfunction:/workspace schnatterer/jenkinsfile-runner:1.0-SNAPSHOT-2.108 /workspace

runs the Jenkinsfile in ~/src/it/myfunction using Jenkinsfile Runner with all the default plugins of Jenkins 2.108.

My idea was to eventually do the same in the Jenkinsfile of the shared lib, like so (not tested):

docker.image('schnatterer/jenkinsfile-runner:1.0-SNAPSHOT-2.108').inside {
    sh 'jenkinsfile-runner /app/jenkins /app/plugins src/it/myfunction'
}
or
sh 'docker run --rm -v $(pwd):$(pwd) src/it/myfunction'

It turned out that there are two major problems:
  1. There's no way to add non-default Jenkins plugins.
    My local test for cloudogu/ces-build-lib failed because there was no GitHub Groovy Libraries plugin.
    Here, my hope is that Configuration-as-code plugin might help automate loading plugins.
  2. There's still no way to load a "local" shared library from the same repo. So, we still would have to find a way to configure the shared library in our Jenkinsfile Runner.
    Loading local shared libraries has already been discussed here and there.

Once those issues are solved, we'll have a very basic way of automating integration tests for shared libraries by executing IT Jenkinsfiles from the shared libraries pipeline and failing the build if the IT fails.


Of course, this would be very basic testing. For more sophisticated testing we would want to

  • trigger the ITs from Maven or Gradle,
  • use asserts,
  • get the results as JUnit XML.
So, yes, we're not there yet. But we now have a foundation to build all this upon.
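The basic flow described above could be driven by a small harness script that runs every IT Jenkinsfile and fails on the first broken one. A sketch under stated assumptions: the `RUNNER` variable, the `src/it/` layout, and the demo fixture are all hypothetical; in real use, point `RUNNER` at a jenkinsfile-runner invocation (e.g. the docker run command above):

```shell
#!/bin/sh
# Sketch of an integration-test harness for a shared library repo.
# Assumptions (not from the actual project): ITs live in src/it/<name>/,
# each with its own Jenkinsfile, and RUNNER is the command that executes
# one IT directory (e.g. a jenkinsfile-runner docker run).
RUNNER="${RUNNER:-true}"   # stand-in command; replace with jenkinsfile-runner

run_its() {
    status=0
    for it in src/it/*/; do
        [ -f "${it}Jenkinsfile" ] || continue
        echo "Running integration test: $it"
        if $RUNNER "$it"; then
            echo "PASS: $it"
        else
            echo "FAIL: $it"
            status=1    # remember the failure but run the remaining ITs
        fi
    done
    return $status
}

# Demo fixture so the harness has something to find.
mkdir -p src/it/demo
printf "node { echo 'hello' }\n" > src/it/demo/Jenkinsfile

run_its
```

Failing the shared library build on a nonzero exit status would then give the regression safety net that JenkinsPipelineUnit alone cannot.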

Thanks for that & best regards,

Johannes

Michael Neale

Mar 6, 2018, 4:22:53 AM
to Jenkins Developers
Trying this out, looks like I am hitting JEP-200: 


Need to dig in further (I thought I tried the same version of Jenkins as you). Anyone else seen this? 



java.lang.UnsupportedOperationException: Refusing to marshal io.jenkins.jenkinsfile.runner.SetJenkinsfileLocation for security reasons; see https://jenkins.io/redirect/class-filter/
at hudson.util.XStream2$BlacklistedTypesConverter.marshal(XStream2.java:543)
at com.thoughtworks.xstream.core.AbstractReferenceMarshaller.convert(AbstractReferenceMarshaller.java:69)
at com.thoughtworks.xstream.core.TreeMarshaller.convertAnother(TreeMarshaller.java:58)
at com.thoughtworks.xstream.core.TreeMarshaller.convertAnother(TreeMarshaller.java:43)
at com.thoughtworks.xstream.core.AbstractReferenceMarshaller$1.convertAnother(AbstractReferenceMarshaller.java:88)
at com.thoughtworks.xstream.converters.collections.AbstractCollectionConverter.writeItem(AbstractCollectionConverter.java:64)
at com.thoughtworks.xstream.converters.collections.CollectionConverter.marshal(CollectionConverter.java:74)
at com.thoughtworks.xstream.core.AbstractReferenceMarshaller.convert(AbstractReferenceMarshaller.java:69)
at com.thoughtworks.xstream.core.TreeMarshaller.convertAnother(TreeMarshaller.java:58)
at com.thoughtworks.xstream.core.AbstractReferenceMarshaller$1.convertAnother(AbstractReferenceMarshaller.java:84)
at hudson.util.RobustReflectionConverter.marshallField(RobustReflectionConverter.java:265)
at hudson.util.RobustReflectionConverter$2.writeField(RobustReflectionConverter.java:252)
Caused: java.lang.RuntimeException: Failed to serialize hudson.model.Actionable#actions for class org.jenkinsci.plugins.workflow.job.WorkflowRun

Bill Dennis

Mar 6, 2018, 5:57:34 AM
to Jenkins Developers
Thanks Kohsuke, I tried to give some answers to your questions inline below, if I didn't mess up the reply..

Bill


On Friday, 2 March 2018 17:57:24 UTC, Kohsuke Kawaguchi wrote:


On Fri, Mar 2, 2018 at 9:26 AM Bill Dennis <bill....@gmail.com> wrote:
Hello Kohsuke -

I am a developer using Jenkins pipeline quite a lot where I work.

We are using Jenkins pipelines in two scenarios:
  • For CI building and testing some of our internal components (what Jenkins is traditionally used for)
  • For running / orchestrating complex automation processes (so Jenkins is talking to some external systems using SOAP / REST etc) via tooling and even directly via REST using plugins.
I have mostly used JenkinsPipelineUnit for testing / validation of the pipelines and I have been looking into the direct / live approach that Oleg demonstrated (running Jenkins locally in Docker and getting the pipeline being developed direct from a host file system volume mount).

I think Jenkinsfile Runner would be really useful for developers who don't need or want the overhead of developing tests with JenkinsPipelineUnit. I have worked with some developers wanting to develop Jenkinsfiles for their CI process, and the main problem is knowing whether the Jenkinsfile will work when they commit it to the repo. They go round a loop of commit / fix on the production Jenkins, or use the Jenkins pipeline "replay" feature. It can be a painful process if you are not familiar with Jenkins pipeline and Groovy syntax!

This kind of context is really helpful. Thank you!
 
Happy to give feedback. Thanks for Jenkins and pipeline-as-code; it helped me deliver some projects with Jenkins in a way that I thought would not be possible a few years ago.


I think some things to consider are:

- How does Jenkinsfile Runner replicate the agent / slave labels on the target Jenkins?
- How to deal with tooling on the target Jenkins (custom tools, JDKs, Gradle, etc)?

Right, I guess your point is that Jenkinsfile Runner should aim to run the Jenkinsfile in a much more realistic setup, and that doesn't stop at using real Jenkins and real Pipeline plugins; it also needs to include other configurations of Jenkins. I think Jesse made a similar observation. I have a few thoughts:
  • Configuration-as-code could play a role here in terms of letting people define the configuration of Jenkins once and use it both in production and in a setup like Jenkinsfile Runner
  • I'm a fan of making Jenkinsfile itself more portable. For example, if people are already in the mode of using docker images to run builds in, then more of the tooling would be packaged in there, and it should allow Jenkinsfile Runner to run your project in much the same way as your production Jenkins. I'm curious how much of this is already reality vs an ideal that people are working toward.
Yes, all of this. I have often thought that we need something like declarative pipeline for the configuration of Jenkins as code, instead of going through all those web config pages. Jenkins master as a Docker container seems good. In our environment we are not currently using Docker, but I can see that is the way to go. Getting a larger organisation to adopt the right technology, with the associated costs, is the challenge, so we remain on traditional Jenkins slaves and tooling methods. Hopefully Dockerised soon. We do use Jenkins Enterprise from CloudBees.


I think the perfect Jenkinsfile Runner for me would provide:

- Somehow capture the plugins, tooling and agents on our production Jenkins
- Validate the Jenkinsfile pipeline syntax

I think this is already happening as a result of actually running the pipeline -- one of the virtues of actually using the real Pipeline plugins to run!
 
- Validate the Jenkinsfile against the plugins and agents / tooling (fail if it refers to some tool or agent not configured for example).
- Run the Jenkinsfile in some sort of "no-op" mode: what would it do if I ran it, without actually doing anything

This one is interesting. I assumed JenkinsPipelineUnit does this pretty well, though. Can you tell me more about this? I'm imagining you'd want to be able to selectively mock out some steps (e.g., when Jenkinsfile gets to sh "./deploy.sh" don't actually do it and pretend that it succeeded) but more details would be helpful.

Yes, in JenkinsPipelineUnit I pretty much mock out everything, like calls to tools. Because pipeline is based on Groovy, I thought I could do some unit tests using Spock and Groovy for pipelines, but then I discovered it was already done and shared as JenkinsPipelineUnit, so hat tip to Ozan there. So the Spock unit tests I write confirm that the execution of the tools is correct without actually running them. I also write JenkinsPipelineUnit test code for my shared library code, confirming its behaviour, so actually all my use of special tooling is captured in libraries as DSL-like constructs.

A great feature of JenkinsPipelineUnit is that it can generate the execution callstack of the "mocked" pipeline run. These callstacks can be stored as files, and JenkinsPipelineUnit tests can assert the execution callstack of a test run against a file committed as part of the tests. I do this for all my pipelines and shared library units. So the callstacks actually capture the execution of tooling with command-line params etc., and any time I change the pipeline code I can see what changed by diffing the callstack files. Another thing you can do is run the pipeline jobs with different parameters in the tests (Spock parameterised tests) and, by comparing the callstack files, see how the pipeline behaviour differs between runs and assert the correct behaviour. All this without having to hit a real Jenkins server and wait for long-running processes to complete!

So this is all fine and you can write tests for the pipelines and check the use of tooling etc., but when you come to run the pipeline on the actual server, the tools or agent / slave labels are not configured as expected for the fully tested pipeline. So this is where something like Jenkinsfile Runner will be good to use. My tests are based on code methods in the project I have here: https://github.com/macg33zr/pipelineUnit
 
 
- Actually run the Jenkinsfile locally so I can know it works completely before committing to source control.

Yeah, this was the first goal for me.
 
- Run the Jenkinsfile on the target Jenkins master server using the resources of that server (so know it works on the server).

This got me thinking that maybe all I needed was a Jenkins CLI command that, behind the scenes, creates a temporary/hidden job on the target Jenkins master and runs the Pipeline. IOW, keep the same development flow as Jenkinsfile Runner today, but don't run Jenkins locally; just do it on your actual Jenkins.

Yes, this would be great: an anonymous job where you just pull back the output and console log. A plugin for IntelliJ or whatever IDE using this would be even better :-)

Jesse Glick

Mar 6, 2018, 9:17:40 AM
to Jenkins Dev
On Tue, Mar 6, 2018 at 4:22 AM, Michael Neale <mne...@cloudbees.com> wrote:
> java.lang.UnsupportedOperationException: Refusing to marshal io.jenkins.jenkinsfile.runner.SetJenkinsfileLocation for security reasons; see https://jenkins.io/redirect/class-filter/

As suggested in that link, the JAR probably just needs a
`Jenkins-ClassFilter-Whitelisted: true` manifest header.
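If the runner modules are built with Maven, one way to add that header is via the maven-jar-plugin archive configuration. A sketch, assuming standard jar packaging; this is not taken from the actual project's pom:

```xml
<!-- Sketch: add the JEP-200 whitelist header to the JAR manifest.
     Assumes standard jar packaging; not from the actual project. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <configuration>
    <archive>
      <manifestEntries>
        <Jenkins-ClassFilter-Whitelisted>true</Jenkins-ClassFilter-Whitelisted>
      </manifestEntries>
    </archive>
  </configuration>
</plugin>
```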

Bill Dennis

Mar 6, 2018, 11:00:34 AM
to Jenkins Developers
I too am seeing the JEP-200 issue, with these exceptions when packaging to a Docker image and running it using Docker for Windows:

java.lang.UnsupportedOperationException: Refusing to marshal io.jenkins.jenkinsfile.runner.FileSystemSCM for security reasons; see https://jenkins.io/redirect/class-filter/

java.lang.UnsupportedOperationException: Refusing to marshal io.jenkins.jenkinsfile.runner.SetJenkinsfileLocation for security reasons; see https://jenkins.io/redirect/class-filter/

I'd like to try out the Jenkinsfile Runner which looks like a great idea but I don't have time to figure this out right now (busy writing a plugin..).

--Bill

Oleg Nenashev

Mar 6, 2018, 11:16:04 AM
to JenkinsCI Developers
Yeah, the entire library needs to be whitelisted (or packaged as a Jenkins module).
It's actually a quick win to fix, as long as it is a development tool.

BR, Oleg



joha...@schnatterer.info

Mar 6, 2018, 1:32:21 PM
to Jenkins Developers
Same exception here: java.lang.UnsupportedOperationException: Refusing to marshal io.jenkins.jenkinsfile.runner.SetJenkinsfileLocation for security reasons;

I created an issue with the full stacktrace.
Jesse or Oleg, maybe you could elaborate there on how we could fix this:
which JAR needs the manifest, which library needs to be whitelisted, and how can we achieve this?

Cheers,

Johannes


Oleg Nenashev

Mar 6, 2018, 2:13:43 PM
to Jenkins Developers
I will create a pull request, stay tuned

Oleg Nenashev

Mar 6, 2018, 2:43:26 PM
to JenkinsCI Developers


Kohsuke Kawaguchi

Mar 6, 2018, 4:04:01 PM
to jenkin...@googlegroups.com
Yeah, there are many possible ways to go about something like this, including what you described. That's why I'm trying to hear from Bill what his world looks like. I can use some concrete data points like that.

--
Kohsuke Kawaguchi

Kohsuke Kawaguchi

Mar 6, 2018, 4:23:26 PM
to jenkin...@googlegroups.com
On Sun, Mar 4, 2018 at 8:13 AM <joha...@schnatterer.info> wrote:
I think Jenkinsfile Runner brings a lot of opportunities for pipeline developers. The most obvious ones to me are
  1. Pipeline development (Jenkinsfile)
  2. Shared library development

Pipeline development

Right now (as described by others in this thread) pipeline development is either a loop of committing / fixing pipelines on production Jenkins, using pipeline replay on production Jenkins or setting up a local instance manually.

With Jenkinsfile Runner we can get faster feedback without polluting our commit or Jenkins build history and don't have to set up a local instance manually.

Right. I think we all get this basic picture. Details are where things get interesting!

Shared library development

Shared library development right now works much in the same way as pipeline development, except that you have the library code and another (often production) Jenkinsfile to maintain, in order to try out (as opposed to automatically test) your Jenkinsfile.

For shared libraries, we thankfully already have JenkinsPipelineUnit, that makes it easier to implement some tests. However, (as also mentioned by others in this thread) this is unit testing only. It mocks most of our environment. Often, green unit tests mean nothing for productive use of our share library. I even gave up test-driven development for shared libraries, in favor of 90s try-and-error-style programming. Because most of the time when trying the library with green unit tests in production, it turns out that the real Jenkins environment has some restriction that is beyond the scope of JenkinsPieplineUnit (e.g. sandboxing). 
The worst thing about the current state is that we don't have reliable regression tests. A change in a shared library with green unit tests is far from convincing me that the library will continue to work in production.

With Jenkinsfile Runner we could write small Jenkinsfiles within the shared library repo's test folder and run them from the Jenkinsfile of the shared lib. Pretty much in the same way as we use Maven Invoker Plugin (as mentioned by Jesse) when developing maven plugins.

OK, thank you, this is really helpful. I think the Maven Invoker plugin analogy is a great one for somebody like me. So from this perspective, mocking isn't really high on the priority list.
 

A first approach to shared library integration testing with Jenkinsfile Runner
My naive first approach was to build a Docker image that contains Jenkinsfile Runner and all default plugins.

docker run -v ~/src/it/myfunction:/workspace schnatterer/jenkinsfile-runner:1.0-SNAPSHOT-2.108 /workspace

runs ~/src/it/myfunction/Jenkinsfile using Jenkinsfile Runner with all default plugins of Jenkins 2.108.

My idea was to eventually do the same in the Jenkinsfile of the shared lib, like so (not tested):

docker.image('schnatterer/jenkinsfile-runner:1.0-SNAPSHOT-2.108').inside {
    sh 'jenkinsfile-runner /app/jenkins /app/plugins src/it/myfunction'
}

or

sh 'docker run --rm -v $(pwd):$(pwd) schnatterer/jenkinsfile-runner:1.0-SNAPSHOT-2.108 src/it/myfunction'

So if this is the intended use case, then I have another idea. Instead of running the Jenkinsfile in this CLI and trying to emulate your Jenkins instance as closely as possible (in terms of configuration), we could just run the Jenkinsfile in the current Jenkins, in a place that nobody else can see.

For example, packaged as CLI, you'd run it like:

   $ java -jar jenkins-cli.jar -s https://jenkins.example.com/ run-pipeline ./src/it/myfunction

Or from within Jenkinsfile, could be something like:

  pipeline.sharedlibrary.test './src/it/myfunction'

... and this would completely bypass the challenge of trying to replicate the Jenkins configuration.

 
It turned out that there are two major problems:
  1. There's no way to add non-default Jenkins plugins.
    My local test for cloudogu/ces-build-lib failed because there was no GitHub Groovy Libraries plugin.
    Here, my hope is that the Configuration-as-code plugin might help automate loading plugins.
Yeah, I'm really looking forward to the CaC project filling this gap, and I know they are working on it.
  2. There's still no way to load a "local" shared library from the same repo. So, we would still have to find a way to configure the shared library in our Jenkinsfile Runner.
    Loading local shared libraries has already been discussed here and there.
Good point.
 

Once those issues are solved, we'll have a very basic way of automating integration tests for shared libraries by executing IT Jenkinsfiles from the shared library's pipeline and failing the build if the IT fails.
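A minimal sketch of how the shared library's own Jenkinsfile could drive such ITs, assuming the Docker image mentioned above and the `findFiles` step from the Pipeline Utility Steps plugin (the paths, image tag, and loop are illustrative, not an existing setup):

```groovy
// Run every IT Jenkinsfile under src/it/ through the Jenkinsfile Runner
// image; a non-zero exit code from 'sh' fails the shared-library build.
node {
    checkout scm
    def tests = findFiles(glob: 'src/it/*/Jenkinsfile')
    for (def t in tests) {
        def testDir = t.path - '/Jenkinsfile'
        sh "docker run --rm -v \$(pwd)/${testDir}:/workspace " +
           "schnatterer/jenkinsfile-runner:1.0-SNAPSHOT-2.108 /workspace"
    }
}
```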


That'd be really cool, wouldn't it!?

Of course, this would be very basic testing. For more sophisticated testing we would want to

  • trigger the ITs from maven or gradle,
  • use asserts,
  • get the results as JUnit XML.
So, yes, we're not there yet. But we now have a foundation to build all this upon.

asserts would already work, right? It seems like we can do that relatively easily on its own.

With that aside, I think I understand the picture in your head. You are really approaching it like Maven plugin development. Indeed this would help illustrate the relative strengths of both JenkinsPipelineUnit and "integration tests".
 

Thanks for that & best regards,

Johannes


On Thursday, March 1, 2018 at 8:23:15 PM UTC+1, Kohsuke Kawaguchi wrote:
Jenkinsfile Runner is an experiment to package Jenkins pipeline execution as a command line tool. The intended use cases include:
  • Use Jenkins in a Function-as-a-Service context
  • Assist editing Jenkinsfile locally
  • Integration-test shared libraries
Over the past year, I've done some deep-dive 1:1 conversations with some Jenkins users, and I felt something like this might move the needle for them in an important way.

I'd love to hear any reactions on your side. Could something like this be important for you? Does it miss any key points for you? If you mentally picture a perfect version of this, what would it do, and how would you use it?

Let me know!

--
Kohsuke Kawaguchi



--
Kohsuke Kawaguchi

Kohsuke Kawaguchi

Mar 6, 2018, 4:23:56 PM
to jenkin...@googlegroups.com
Oleg gave us the fix, which I merged to master just now. I think that'll fix the problem.

--
Kohsuke Kawaguchi

Kohsuke Kawaguchi

Mar 6, 2018, 4:33:37 PM
to jenkin...@googlegroups.com
On Tue, Mar 6, 2018 at 2:57 AM Bill Dennis <bill....@gmail.com> wrote:
Thanks Kohsuke, I tried to give some answers to your questions inline below, if I didn't mess up the reply..

Bill

On Friday, 2 March 2018 17:57:24 UTC, Kohsuke Kawaguchi wrote:


On Fri, Mar 2, 2018 at 9:26 AM Bill Dennis <bill....@gmail.com> wrote:
Hello Kohsuke -

I am a developer using Jenkins pipeline quite a lot where I work.

We are using Jenkins pipelines in two scenarios:
  • For CI building and testing some of our internal components (what Jenkins is traditionally used for)
  • For running / orchestrating complex automation processes (so Jenkins is talking to some external systems using SOAP / REST etc) via tooling and even directly via REST using plugins.
I have mostly used JenkinsPipelineUnit for testing / validation of the pipelines and I have been looking into the direct / live approach that Oleg demonstrated (running Jenkins locally in Docker and getting the pipeline being developed direct from a host file system volume mount).

I think Jenkinsfile Runner would be really useful for developers who don't need or want the overhead of developing tests with JenkinsPipelineUnit. I have worked with some developers wanting to develop Jenkinsfiles for their CI process and the main problem is knowing if the Jenkinsfile will work when they commit it to the repo. They go round this loop of commit / fix running in the production Jenkins or using the Jenkins pipeline "replay" feature. It can be a painful process if you are not familiar with Jenkins pipeline and Groovy syntax!

This kind of context is really helpful. Thank you!
 
Happy to feedback. Thanks for Jenkins and pipeline as code, it helped me deliver some projects with Jenkins in a way that I thought would not be possible a few years ago.

My pleasure, and really the portion I can take credit for is getting smaller every day!
 


I think some things to consider are:

- How does the Jenkinsfile Runner replicate the agent / slave identifiers on the target Jenkins?
- How to deal with tooling on the target Jenkins (custom tools, JDKs, Gradle, etc)?

Right, I guess your point is that Jenkinsfile Runner should aim to run the Jenkinsfile in a much more realistic setup, and that doesn't stop at using the real Jenkins and real Pipeline plugins; it also needs to include other configurations of Jenkins. I think Jesse made a similar observation. I have a few thoughts:
  • Configuration-as-code could play a role here in terms of letting people define the configuration of Jenkins once and use it both in production and in a setup like Jenkinsfile Runner
  • I'm a fan of making Jenkinsfile itself more portable. For example, if people are already in the mode of using Docker images to run builds in, then more of the tooling would be packaged in there, and it should allow Jenkinsfile Runner to run your project in much the same way as your production Jenkins. I'm curious how much of this is already reality vs. an ideal that people are working toward.
Yes, all of this. I have often thought that we need something like declarative pipeline for the configuration of Jenkins as code, instead of going into all those web config pages. Jenkins master as a Docker container seems good. In our environment we are not currently using Docker, but I have seen that that is the way to go. Getting a larger organisation to adopt the right technology, and the associated costs of that, is the challenge, so we remain on traditional Jenkins slaves and tooling methods. Hopefully Dockerised soon. We do use Jenkins Enterprise from CloudBees.

"something like declarative pipeline for the configuration of Jenkins as code instead of going into all those web config pages" --- please do check out the linked configuration-as-code effort. I think you'll love it.

And thanks for the explanation of the state of your current Jenkins master.

I think the perfect Jenkinsfile Runner for me would provide:

- Somehow capture the plugins, tooling and agents on our production Jenkins
- Validate the Jenkinsfile pipeline syntax

I think this is already happening as a result of actually running the pipeline -- one of the virtues of actually using the real Pipeline plugins to run it!
 
- Validate the Jenkinsfile against the plugins and agents / tooling (fail if it refers to some tool or agent not configured for example).
- Run the Jenkinsfile in some sort of "no-op" mode: what would it do if I ran it, without actually doing anything

This one is interesting. I assumed JenkinsPipelineUnit does this pretty well, though. Can you tell me more about this? I'm imagining you'd want to be able to selectively mock out some steps (e.g., when Jenkinsfile gets to sh "./deploy.sh" don't actually do it and pretend that it succeeded) but more details would be helpful.

Yes, in JenkinsPipelineUnit I pretty much mock out everything, like calls to tools. Because Pipeline is based on Groovy, I thought I could do some unit tests using Spock and Groovy for pipelines, but then I discovered it was already done and shared as JenkinsPipelineUnit, so hat tip to Ozan there. So the Spock unit tests I write confirm that the execution of the tools is correct without actually running them. I also write JenkinsPipelineUnit test code for my shared library code as well, confirming the behaviour, so actually all my use of special tooling is captured in libraries as DSL-like constructs.

A great feature of JenkinsPipelineUnit is that it can generate the execution callstack of the "mocked" pipeline run. These callstacks can be stored as files, and JenkinsPipelineUnit tests can assert the execution callstack of a test run against a file committed as part of the tests. I do this for all my pipelines and shared library units. So the callstacks actually capture the execution of tooling with command-line params etc., and any time I change the pipeline code I can see what changed by diffing the callstack file changes. Another thing you can do with this is run the pipeline jobs with different parameters in the tests (Spock parameterised tests) and, by comparing the callstack files, see how the pipeline behaviour differs between these runs and assert the correct behaviour. All this without having to hit a real Jenkins server and wait for long-running processes to complete!

So this is all fine and you can write tests for the pipelines and check the use of tooling etc., but when you come to run the pipeline on the actual server, the tools or agent / slave labels are not configured as expected for the fully tested pipeline. So this is where something like Jenkinsfile Runner will be good to use. My tests are based on code methods in the project I have here: https://github.com/macg33zr/pipelineUnit
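The callstack-based testing Bill describes could look roughly like this with JenkinsPipelineUnit (a sketch only; method names follow the JenkinsPipelineUnit README, and exact signatures may differ between versions):

```groovy
import com.lesfurets.jenkins.unit.BasePipelineTest

class MyPipelineTest extends BasePipelineTest {
    void testPipelineWithMockedTools() {
        setUp()
        // Mock out side effects: 'sh' calls are recorded on the callstack
        // instead of actually being executed.
        helper.registerAllowedMethod('sh', [String]) { cmd -> }
        runScript('Jenkinsfile')  // run the pipeline under test
        printCallStack()          // callstack can be saved and diffed
        assertJobStatusSuccess()
    }
}
```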

So just to make sure I understand you correctly, given all that you do with JenkinsPipelineUnit, the point of "integration test" for you is to actually run in real Jenkins, in order to verify that agent labels exist, those agents do have the tools that your 'sh' step uses, etc.

- Actually run the Jenkinsfile locally so I can know it works completely before committing to source control.

Yeah, this was the first goal for me.
 
- Run the Jenkinsfile on the target Jenkins master server using the resources of that server (so know it works on the server).

This got me thinking that maybe all I needed was a Jenkins CLI command that behind the scenes creates a temporary/hidden job on the target Jenkins master and runs the Pipeline. IOW, keep the same development flow as Jenkinsfile Runner today, but don't run Jenkins locally; just do it on your actual Jenkins.

Yes, this would be great, an anonymous job and just pull back the output and console log. A plugin for IntelliJ or whatever IDE using this would be even better :-) 

Great, between your feedback and what I think Johannes is saying, I'm starting to picture the next POTD.

Hope that helps!

Bill Dennis



--
Kohsuke Kawaguchi

Bill Dennis

Mar 7, 2018, 8:46:20 AM
to Jenkins Developers
Kohsuke  - some more answers to your questions inline below.

Also I thought of these other things on my walk to the office this morning that might be of interest:

- Parameters to jobs : most of my pipeline jobs have parameters defined inside. Would the Jenkinsfile Runner have a way to pass in parameters so it doesn't just use default values?

- Environment values for jobs: For our pipeline automation deployments, we set "env" values (key-value pairs as strings) at the global or folder level (using the Jenkins Enterprise Folders plugin), and these define some environment for jobs (things like REST endpoints to some RESTful services we use in the jobs, for example). Would the Jenkinsfile Runner have a way to supply this?

- Script approvals: I have mentioned this below, but some way to define script approvals or get the required approvals would possibly help. I know that if you are hitting script approvals, you are probably trying to do too much in the pipeline code, but it is easy to hit this with simple stuff like Java date functions or JSON parsing.

- Jenkinsfile as a Service idea (JFaaS?): you mentioned this and I really like this idea. The possibility to externally run a supplied Jenkinsfile and get back some results, like build artifacts and console output, without having any jobs or builds appear on the Jenkins master or needing a job configuration. The reason is we tend to think about loading all our automation into Jenkins. But our world is not all inside Jenkins, and we have other automation solutions that can have extensions and customisation. An example is some of the custom services we build, and some off-the-shelf stuff like the Serena Business Process management server we have. But Jenkins is really good at doing some automation, tooling, and spinning / scaling up agent workers in Docker etc., so it would be good to use that from these other services by having them send a Jenkinsfile they own over and getting some results back somehow. Maybe this is something different from Jenkinsfile Runner, though.

Best regards,
--Bill
Yes, some way to integration test against a real Jenkins would really help. This is the main integration test mode. Another thing that trips us up is script approvals, and this is something that can be quite painful if doing certain things with pipeline and shared libraries (and this is why I am looking into plugin development right now for our own custom steps). It would be really good to get some output of script approvals required so that they can be inspected and configured on the server or the Jenkinsfile / library updated.

However, I still like the idea that something like Jenkinsfile Runner can spin up a "scratch" Jenkins locally. This is because I would use it during development when I don't want to consume resources on the deployed master or break something that might affect someone else. I have been looking at something like what Oleg demoed with Jenkins in Docker and the Filesystem SCM plugin for this scenario. This is because my main development involves multiple Jenkinsfile pipelines with some sort of orchestration of pipeline jobs going on (I don't really see Jenkinsfile Runner supporting that). With a Jenkinsfile Runner on this local scratch Jenkins I might also have the opportunity to run it with or without the script approvals bypassed in some way, so I could get pipelines and libraries up and running quickly in the global pipeline library scenario (I don't care too much about script security when running this scratch Jenkins locally).
 

- Actually run the Jenkinsfile locally so I can know it works completely before committing to source control.

Yeah, this was the first goal for me.
 
- Run the Jenkinsfile on the target Jenkins master server using the resources of that server (so know it works on the server).

This got me thinking that maybe all I needed was a Jenkins CLI command that behind the scenes creates a temporary/hidden job on the target Jenkins master and runs the Pipeline. IOW, keep the same development flow as Jenkinsfile Runner today, but don't run Jenkins locally; just do it on your actual Jenkins.

Yes, this would be great, an anonymous job and just pull back the output and console log. A plugin for IntelliJ or whatever IDE using this would be even better :-) 

Great, between your feedback and what I think Johannes is saying, I'm starting to picture the next POTD.

Looking forward to it! 

Jesse Glick

Mar 7, 2018, 4:04:46 PM
to Jenkins Dev
On Tue, Mar 6, 2018 at 4:23 PM, Kohsuke Kawaguchi <k...@kohsuke.org> wrote:
> I have another idea. Instead of
> running Jenkinsfile in this CLI and try to emulate your Jenkins instance as
> closely as possible (in terms of configuration), we could just run
> Jenkinsfile in the current Jenkins, in a place that nobody else can see.

In principle this is possible, by defining a new implementation of
`FlowExecutionOwner` that is not based on a `Run`. We have in the past
discussed the possibility of a Pipeline equivalent to the `/script`
console.

I doubt this would be a very satisfactory solution to the use case at
hand, though. Any step which required a `Run` context (either
mandatory, as in `StepDescriptor.getRequiredContext`, or optionally,
by checking for null from `StepContext.get(Run.class)`) would either
fail or behave in some reduced capacity. If you took care to only
write `Jenkinsfile`s that just used `node` and `sh` and the like and
none of these features, then fine. But a lot of common functionality
does assume it has a `Run` context: anything that looks up
folder-scoped environment variables or credentials; `junit`,
`milestone`, `lock`, `stash`; everything based on `SimpleBuildStep` or
`SimpleBuildWrapper`…a lot. You could create a temporary `WorkflowJob`
in the same folder and hack around with access control so that it is
only visible to the invoking user, which would let most of these
things work (probably with some specialized support for branch
projects), but this seems like it is asking for trouble.

I think it would be far more practical to work with the existing
Replay feature, which was designed for precisely this use case. If the
main objection to using Replay is that you do not want these
experimental builds to “pollute” the general build history, then we
can do some minor UI work (for example in Blue Ocean) to hide these
build records by default. There is already a plan to do something very
similar for restarted stages (JENKINS-48643). We could even stream the
build log to the CLI command (JENKINS-33438) and then add an option to
delete the build as soon as it completes—a very simple and safe
feature.

joha...@schnatterer.info

Mar 11, 2018, 9:43:23 AM
to Jenkins Developers
From a user perspective, I like your idea of a "pipeline.sharedlibrary.test" step and/or a "run-pipeline" CLI command. They would definitely solve configuration issues with plugins, docker, etc.
Now, Jesse mentioned some implementation challenges in his post from Wed, 07 Mar 2018 13:04:46, that I can't comment on, because I'm not that deep into Jenkins internals (yet?).

In the meantime, Jesse also provided a workaround for loading a "local" shared lib. With this I implemented my first (very simple) integration test and ran it successfully from the Jenkinsfile of the shared lib 🎉

You meant asserts should already work? How would you do those in that scenario?

This approach of course has the limitations we talked about regarding the Jenkins configuration.
In addition, these tests would only be run on Jenkins and not when running the build locally with Maven.

So right now, I think a better approach would be to provide a Java library that allows for running Jenkinsfiles.
It could be used to implement e.g. JUnit tests that are run by build tools such as Maven locally (e.g. with the failsafe plugin) and in the same way in the Jenkinsfile.
This library could abstract away the concrete execution, which could be either:
- Jenkinsfile Runner with a local plugin folder
- Jenkinsfile Runner in Docker
- a "pipeline.sharedlibrary.test" step
- a "run-pipeline" CLI command.

The downside would be that such an implementation wouldn't be trivial. And, unfortunately, I can't invest too much of my free time right now.

Michael Neale

Mar 13, 2018, 1:57:43 AM
to Jenkins Developers
As an example of how to use (abuse?) this: 

I was able to take a container with the set of "recommended" plugins, and run it on the "codeship pro" service (which uses docker): 




Kohsuke Kawaguchi

Mar 21, 2018, 8:07:54 PM
to jenkin...@googlegroups.com, Bill Dennis
Sorry for the bit of a delay in coming back to this.

On Wed, Mar 7, 2018 at 5:46 AM Bill Dennis <bill....@gmail.com> wrote:
Kohsuke  - some more answers to your questions inline below.

Also I thought of these other things on my walk to the office this morning that might be of interest:

- Parameters to jobs : most of my pipeline jobs have parameters defined inside. Would the Jenkinsfile Runner have a way to pass in parameters so it doesn't just use default values?

Ah, good catch, this is actually pretty easy to do. Tracking it as #13.

- Environment values for jobs: For our pipeline automation deployments, we set "env" values (key-value pairs as strings) at the global or folder level (using the Jenkins Enterprise Folders plugin), and these define some environment for jobs (things like REST endpoints to some RESTful services we use in the jobs, for example). Would the Jenkinsfile Runner have a way to supply this?

The global settings, I put that into the class of problems to be solved by config-as-code, along with plugin installations.

The folder-level things I haven't thought about. Solving the general case of this is a bit involved.

- Script approvals: I have mentioned this below, but some way to define script approvals or get the required approvals would possibly help. I know that if you are hitting script approvals, you are probably trying to do too much in the pipeline code, but it is easy to hit this with simple stuff like Java date functions or JSON parsing.

I guess the idea here is that you want Jenkinsfile Runner to run the pipeline with exactly the same script approval settings.

I wonder if it'd be nice if Jenkinsfile Runner could do something like the permissive mode of SELinux, where it runs your pipeline entirely and spits out the approvals it needed? That way you don't need to do any trial & error on production Jenkins.


- Jenkinsfile as a Service idea (JFaaS?): you mentioned this and I really like this idea. The possibility to externally run a supplied Jenkinsfile and get back some results, like build artifacts and console output, without having any jobs or builds appear on the Jenkins master or needing a job configuration. The reason is we tend to think about loading all our automation into Jenkins. But our world is not all inside Jenkins, and we have other automation solutions that can have extensions and customisation. An example is some of the custom services we build, and some off-the-shelf stuff like the Serena Business Process management server we have. But Jenkins is really good at doing some automation, tooling, and spinning / scaling up agent workers in Docker etc., so it would be good to use that from these other services by having them send a Jenkinsfile they own over and getting some results back somehow. Maybe this is something different from Jenkinsfile Runner, though.

Yeah, as you nailed it, JFaaS idea is around being able to submit the work and spin up the process to execute it, and store the result somewhere, which you can later browse. There's a bit of similarity to build publisher plugin.

Obviously, Jenkinsfile Runner alone is not going to be able to do that, but the point of it is so that we can talk about it more concretely, like we are doing here.

I'm not sure if I fully grok the portion where you say "But our world is not all inside Jenkins and we have other automation solutions that can have extensions and customisation. An example is ... Serena Business Process management server."

Can you help me understand how those tools and your custom stuff work together in your environment? If you need to get to sensitive stuff, I'm happy to jump onto a call, especially given that you appear to be a CloudBees customer :-)
 

--
Kohsuke Kawaguchi

Kohsuke Kawaguchi

Mar 21, 2018, 8:31:41 PM
to jenkin...@googlegroups.com
On Wed, Mar 7, 2018 at 1:04 PM Jesse Glick <jgl...@cloudbees.com> wrote:
On Tue, Mar 6, 2018 at 4:23 PM, Kohsuke Kawaguchi <k...@kohsuke.org> wrote:
> I have another idea. Instead of
> running Jenkinsfile in this CLI and try to emulate your Jenkins instance as
> closely as possible (in terms of configuration), we could just run
> Jenkinsfile in the current Jenkins, in a place that nobody else can see.

In principle this is possible, by defining a new implementation of
`FlowExecutionOwner` that is not based on a `Run`. We have in the past
discussed the possibility of a Pipeline equivalent to the `/script`
console.

Thanks for the tip. I was actually thinking about creating a full-blown WorkflowJob that just so happens to be invisible to anyone. Do you think that would work better/worse?

I doubt this would be a very satisfactory solution to the use case at
hand, though. Any step which required a `Run` context (either
mandatory, as in `StepDescriptor.getRequiredContext`, or optionally,
by checking for null from `StepContext.get(Run.class)`) would either
fail or behave in some reduced capacity. If you took care to only
write `Jenkinsfile`s that just used `node` and `sh` and the like and
none of these features, then fine. But a lot of common functionality
does assume it has a `Run` context: anything that looks up
folder-scoped environment variables or credentials; `junit`,
`milestone`, `lock`, `stash`; everything based on `SimpleBuildStep` or
`SimpleBuildWrapper`…a lot. You could create a temporary `WorkflowJob`
in the same folder and hack around with access control so that it is
only visible to the invoking user, which would let most of these
things work (probably with some specialized support for branch
projects), but this seems like it is asking for trouble.

IIUC, creating an invisible WorkflowJob would bypass these problems.

I think it would be far more practical to work with the existing
Replay feature, which was designed for precisely this use case. If the
main objection to using Replay is that you do not want these
experimental builds to “pollute” the general build history, then we
can do some minor UI work (for example in Blue Ocean) to hide these
build records by default. There is already a plan to do something very
similar for restarted stages (JENKINS-48643). We could even stream the
build log to the CLI command (JENKINS-33438) and then add an option to
delete the build as soon as it completes—a very simple and safe
feature.

Ah, so instead of an invisible WorkflowJob, we create an invisible WorkflowRun, right?
--
Kohsuke Kawaguchi

Kohsuke Kawaguchi

Mar 21, 2018, 8:45:08 PM
to jenkin...@googlegroups.com
On Sun, Mar 11, 2018 at 6:43 AM <joha...@schnatterer.info> wrote:
From a user perspective, I like your idea of a "pipeline.sharedlibrary.test" step and/or a "run-pipeline" CLI command. They would definitely solve configuration issues with plugins, docker, etc.
Now, Jesse mentioned some implementation challenges in his post from Wed, 07 Mar 2018 13:04:46, that I can't comment on, because I'm not that deep into Jenkins internals (yet?).

In the meantime, Jesse also provided a workaround for loading a "local" shared lib. With this I implemented my first (very simple) integration test and ran it successfully from the Jenkinsfile of the shared lib 🎉

Nice! 

You meant asserts should already work? How would you do those in that scenario?

I was thinking you'd just put assert statements into the Jenkinsfile, and that should already work. Granted, I haven't tried it, so maybe it doesn't work.
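For example, a test Jenkinsfile could call a library step and assert on its return value; a failed Groovy `assert` throws and fails the run (the library and step names here are illustrative):

```groovy
// e.g. src/it/myfunction/Jenkinsfile
@Library('my-shared-lib') _
node {
    def result = myFunction('some input')  // hypothetical step under test
    // Plain Groovy assert: throws AssertionError on mismatch and
    // thereby fails the pipeline run.
    assert result == 'expected output'
}
```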

This approach of course has the limitations we talked about regarding the Jenkins configuration.
In addition, these tests would only be run on Jenkins and not when running the build locally with Maven.

I'm not sure if I follow you here. The two approaches that I was thinking about were:
  • "Jenkinsfile Runner" that runs embedded Jenkins locally and runs Jenkinsfile in there
  • "run-pipeline CLI command" that submits a Jenkinsfile to an existing Jenkins server, runs it there, and sends the result back
"pipeline.sharedlibrary.test" is just a pipeline step that runs tests in some form. In each of the above two versions we can think of their corresponding "pipeline.sharedlibrary.test" step.
 
So right now, I think a better approach would be to provide a Java library that allows for running Jenkinsfiles.
It could be used to implement e.g. JUnit tests that are run by build tools such as Maven locally (e.g. with the failsafe plugin) and in the same way in the Jenkinsfile.
This library could abstract away the concrete execution, which could be either:
- Jenkinsfile Runner with a local plugin folder
- Jenkinsfile Runner in Docker
- a "pipeline.sharedlibrary.test" step
- a "run-pipeline" CLI command.

I was kind of trying to avoid doing 4 different things, each with unique trade-offs that only Jenkins experts would understand.

You have mentioned two primary use cases for you, pipeline development and shared library development. I think we can find one solution that works for those two cases. I just need to think about this.

The downside would be that such an implementation wouldn't be trivial. And, unfortunately, I can't invest too much of my free time right now.

Your feedback has been very helpful, and I appreciate the time you are investing in those.
 

--
Kohsuke Kawaguchi

Kohsuke Kawaguchi

Mar 21, 2018, 8:45:51 PM