Build Flow Plugin and artifacts management in the DSL

3,220 views

Emmanuel Boudrant

Sep 16, 2013, 9:04:02 AM
to jenkins...@googlegroups.com
Hello,

We've been looking at the Build Flow Plugin for parallel execution. Is it possible to manage the artifacts in the plugin?

Example :

parallel (
    // Each job is going to produce one artifact
    { build("job1") },
    { build("job2") },
    ...
    { build("jobN") }
)

// deploy needs access to all the artifacts generated by job1, job2, ..., jobN.
build("deploy")

Our use case is we have plenty of slaves for building the artifacts but only one slave can deploy them.

Thanks,
-emmanuel

Ginga, Dick

Sep 16, 2013, 9:09:06 AM
to jenkins...@googlegroups.com

The Copy Artifact Plugin will allow your “deploy” job to copy artifacts from your other jobs and put them in the deploy workspace.

--
You received this message because you are subscribed to the Google Groups "Jenkins Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to jenkinsci-use...@googlegroups.com.
For more options, visit https://groups.google.com/groups/opt_out.

Emmanuel Boudrant

Sep 16, 2013, 9:34:02 AM
to jenkins...@googlegroups.com
I am not sure it will work. Doesn't the Copy Artifact Plugin only copy the artifacts of the last successful build of a given job?

When I launch N jobs in parallel they are actually the same job with different parameters. Could it copy the artifacts generated by each individual build?

nicolas de loof

Sep 16, 2013, 9:42:57 AM
to jenkins...@googlegroups.com
copy artifact can be used with a fixed build selection passed as a parameter, afaik
so you just have to get the build numbers from the previously executed jobs and pass them as parameters to deploy



Ginga, Dick

Sep 16, 2013, 9:44:23 AM
to jenkins...@googlegroups.com

There are a number of ways to “pick” the artifacts to archive. Use the

 

In your DSL, do

B1 = build("job 1")

B2 = build("job 2")

 

Then pass them to Deploy as parameters.

 

Then, as just stated, your deploy job can pick specific builds.

Ginga, Dick

Sep 16, 2013, 9:45:32 AM
to jenkins...@googlegroups.com

Sorry, I missed the important statement:

 

After each build, get the build number as B1.build.number
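
Putting those two fragments together, a sketch of the whole flow could look like the following (the job names and the JOB1_BUILD/JOB2_BUILD parameter names are illustrative, not from any actual setup):

```groovy
// Run the upstream builds; each build(...) call returns an object
// whose build number is available as <result>.build.number.
B1 = build("job 1")
B2 = build("job 2")

// Hand the concrete build numbers to the deploy job as parameters
// (parameter names here are placeholders).
build("deploy", JOB1_BUILD: B1.build.number, JOB2_BUILD: B2.build.number)
```

The deploy job can then point Copy Artifact's "specific build" selector at ${JOB1_BUILD} and ${JOB2_BUILD}.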

Emmanuel Boudrant

Sep 16, 2013, 9:47:24 AM
to jenkins...@googlegroups.com
Thanks, I will try!
-emmanuel

teilo

Sep 16, 2013, 10:12:58 AM
to jenkins...@googlegroups.com

To be honest I wouldn't pass the build number; I would use a run parameter, so that manual triggering, should you ever need it (e.g. after a failed deploy), would be easier.

Format the build reference as

def job1runparam = build1.project.fullName + '#' + build1.number

and then add that to the parameters:

def myparams = new HashMap()
myparams['job1'] = job1runparam

then in the Copy Artifact step it takes a bit of fudging to use version '${job1_NUMBER}'

/James

Les Mikesell

Sep 16, 2013, 10:51:22 AM
to jenkinsci-users
On Mon, Sep 16, 2013 at 8:44 AM, Ginga, Dick <Dick....@perkinelmer.com> wrote:
> There are a number of ways to “pick” the artifacts to archive. Use the
>
>
>
> In your DSL, do
>
> B1 = build(“job 1”)
>
> B2 = build(“job 2”)
>
>
>
> Then pass them to Deploy as parameters.
>
>
>
> Then, as just stated, your deploy job can pick specific builds.

This approach should work as long as the artifacts are copied by
another job started after the builds complete. Is there any way for
the parent build-flow job to collect (and then archive) them itself
instead of having another job do it? I can't find a way to use the
copy artifact plugin from a build flow job.

--
Les Mikesell
lesmi...@gmail.com

Ginga, Dick

Sep 16, 2013, 10:53:52 AM
to jenkins...@googlegroups.com
You need a way to export the build numbers outside the build flow environment so that they are available to a later archive step.

I tried doing this but could not. I do not know DSL/Groovy/Jenkins well enough to do it.


Schalk Cronjé

Sep 17, 2013, 11:06:30 AM
to jenkins...@googlegroups.com
This blog post might be insightful - http://delivervalue.blogspot.co.uk/2013/06/more-advanced-build-flows-with-jenkins.html

It shows how to run a number of similar jobs in parallel and then send the resulting build numbers to a downstream job.

It does not, however, show how to download artifacts from these builds. This cannot be achieved with the Copy Artifact Plugin alone, as one has to loop over the build numbers (assuming they are from the same job). A simple Gradle build step or a Groovy build step can achieve this for you.

An example Gradle build step could contain something like the following:

buildscript {
    repositories {
        mavenCentral()
        mavenRepo(url: 'http://dl.bintray.com/ysb33r/grysb33r')
    }
    dependencies {
        classpath 'org.ysb33r.gradle:vfs-gradle-plugin:0.2'
        classpath 'commons-httpclient:commons-httpclient:3.1'
    }
}

apply plugin: 'vfs'
project.ext.JENKINS_URL = System.env['JENKINS_URL']

task download << {
  BUILD_NUMBERS.split(',').each { buildNum ->
    vfs.cp "${JENKINS_URL}/job/${SOURCE_JOB}/${buildNum}/artifact/NAME_OF_YOUR_ARTIFACT", new File('LOCAL_NAME_OF_ARTIFACT'), overwrite: true
  }
}

This script can now be called with the parameters -PSOURCE_JOB=YourJenkinsJobName and -PBUILD_NUMBERS=1,2,3,4,5 and the task name download, i.e.

gradle -PSOURCE_JOB=YourJenkinsJobName -PBUILD_NUMBERS=1,2,3,4,5 download

HTH

--
Schalk W. Cronjé

Les Mikesell

Sep 17, 2013, 2:51:59 PM
to jenkinsci-users
On Tue, Sep 17, 2013 at 10:06 AM, Schalk Cronjé <ysb...@gmail.com> wrote:
> This blog post might be insightful -
> http://delivervalue.blogspot.co.uk/2013/06/more-advanced-build-flows-with-jenkins.html
>
> It does show you how to run a number of similar jobs in parallel and then
> send the resulting build numbers to a downstream job.
>
> It does not, however, show you how to download artifacts from these builds.
> This cannot be achieved by Copy Artifact Plugin as one has to loop around
> the build numbers (assuming they are from the same job). A simple Gradle
> build step or a Groovy build step can achieve this for you.

But I don't see 'add build step' in a build flow job - just the flow
dsl and post-build actions. I do have 'copy files back to the job's
workspace on the master node' in the post-build choices that might be
coming from the 'copy to slave' plugin, but none of my jobs run on the
master node so that doesn't make sense. (But maybe I could run the
build flow jobs there if that would make it work...).

Is it possible to do the copy within the build flow dsl itself? It
seems to handle some groovy stuff.

--
Les Mikesell
lesmi...@gmail.com

Schalk Cronjé

Sep 18, 2013, 1:31:09 PM
to jenkins...@googlegroups.com
With the Build Flow Plugin you can only access Groovy modules that the Jenkins system Groovy has access to. You cannot use @Grab to pull in extra modules. This is a limitation of the Build Flow Plugin IMHO, but Nicolas does not agree - https://groups.google.com/forum/#!topic/jenkinsci-users/2XFDB5g7O7A

That is the reason that I do the copying in a separate job - in the end that paid off better for my specific cases.

However, it is possible to get hold of a different plugin within Build Flow, although I am not sure how much success you'll be able to achieve via this route. By using getAction, i.e.

def foobar = build.getAction(hudson.scm.RevisionParameterAction.class) // Replace with appropriate class name

you might be able to play around enough to get something to work.

nicolas de loof

Sep 18, 2013, 1:38:21 PM
to jenkins...@googlegroups.com
this is a deliberate limitation. Even though the flow uses groovy, it's supposed to be a DSL, not a system groovy script. Use scriptler, the groovy plugin, etc. if you want to do such things



Les Mikesell

Sep 18, 2013, 3:51:34 PM
to jenkinsci-users
On Wed, Sep 18, 2013 at 2:37 PM, nicolas de loof
<nicolas...@gmail.com> wrote:
> not inside the flow DSL, but inside jobs. build-flow is an orchestrator,
> it's not here to replace jenkins classic way to share artifacts between
> builds (fingerprints, copyartifact plugin, build number parameter, etc)
>
> collect executed jobs build object
>
> def b = build( "foo" )
>
> then pass build number as parameter to other jobs :
>
> build( "bar", artifactNumber: b.number )
>
> you should not try to use build flow as a groovy script executor. Even it
> don't restrict groovy syntax (maybe it should) it's a job scheduler DSL,
> nothing more.

But that means not thinking of it as a real jenkins job...
Conceptually I'm running the job to build some artifacts - in this
case as a somewhat more intelligent/controllable matrix-job
replacement - so it doesn't make sense to me for the resulting
artifacts to have to end up in some other job's space. Could the DSL
be extended to have a 'copy child artifact' function, so that after the
scheduling happens you can have the results where you'd expect jenkins
to have them?

--
Les Mikesell
lesmi...@gmail.com

nicolas de loof

Sep 18, 2013, 4:06:07 PM
to jenkins...@googlegroups.com
build flow doesn't have a workspace, so there is no place to store child artifacts.
You could trigger a final job that collects the other jobs' artifacts using copy-artifact.



Les Mikesell

Sep 18, 2013, 4:54:21 PM
to jenkinsci-users
On Wed, Sep 18, 2013 at 3:06 PM, nicolas de loof
<nicolas...@gmail.com> wrote:
> build flow don't have a workspace, so no place to store child artifacts.
> you could trigger a final job to collect other ones artifacts using
> copy-artifact

That's, ummm, interesting... I see a workspace in the jenkins job
view containing the svn version checkout that triggered the job and
'archive artifacts' post-build step works to save any of those files
into the master archive (we usually archive the .h files along with
built libraries). Can't it copy into that space?

--
Les Mikesell
lesmi...@gmail.com

Les Mikesell

Sep 18, 2013, 3:32:53 PM
to jenkinsci-users
On Wed, Sep 18, 2013 at 12:38 PM, nicolas de loof
<nicolas...@gmail.com> wrote:
> this is a voluntary limitation. Even flow uses groovy, it's supposed to be a
> DSL, not a system groovy script. Use scriptler/groovy plugin/etc if you want
> to do such thing

OK, but how? I don't see the option to do another build step within
the job and if there were, how would I pass the parameters for the
child jobs and build numbers that build flow had just run? Another
separate job run as part of the flow will work, but it would just seem
neater to me if the top-level controlling job could collect all of the
artifacts into the desired layout by itself. In this particular
case I want it because I want to be able to run a custom 'promotion'
plugin that is going to commit to a subversion tag and it has to be an
atomic operation with everything in one tree, but just in general it
seems like you should be able to run the top-level job and have that
be the place where the artifacts show up (if you want) rather than
having some other job get them.

Or are you saying I should code up the whole sequence directly in
groovy instead of using build flow at all if I need the artifacts?

--
Les Mikesell
lesmi...@gmail.com

nicolas de loof

Sep 18, 2013, 3:37:32 PM
to jenkins...@googlegroups.com
not inside the flow DSL, but inside jobs. build-flow is an orchestrator; it's not there to replace jenkins' classic ways to share artifacts between builds (fingerprints, copyartifact plugin, build number parameter, etc.)

collect the executed job's build object:

def b = build( "foo" )

then pass the build number as a parameter to other jobs:

build( "bar", artifactNumber: b.number )

you should not try to use build flow as a groovy script executor. Even though it doesn't restrict groovy syntax (maybe it should), it's a job scheduler DSL, nothing more.
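
For the parallel case from the original question, a sketch along these lines might work (the "compile" and "deploy" job names, the PLATFORM parameter, and the BUILD_NUMBERS parameter are placeholders; build(...) returning an object exposing build.number is as described earlier in the thread):

```groovy
def linux, windows
parallel(
    // The same job run with different parameters, as in Emmanuel's use case.
    { linux   = build("compile", PLATFORM: "linux") },
    { windows = build("compile", PLATFORM: "windows") }
)
// Forward all build numbers in one parameter for the deploy job to loop over.
build("deploy", BUILD_NUMBERS: "${linux.build.number},${windows.build.number}")
```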



nicolas de loof

Sep 19, 2013, 1:16:32 AM
to jenkins...@googlegroups.com
I removed the workspace in HEAD; previous versions of the plugin extended a classic job



Les Mikesell

Sep 19, 2013, 8:49:24 AM
to jenkinsci-users
On Thu, Sep 19, 2013 at 12:16 AM, nicolas de loof
<nicolas...@gmail.com> wrote:
> I removed workspace in HEAD, previous version of plugin extend a classic job.

Doesn't the triggering svn check need that space? And probably other
triggering options, as well as the ability to archive anything it
produces? I don't understand how a job without a workspace or
archive coordinates with the rest of jenkins.

--
Les Mikesell
lesmi...@gmail.com

nicolas de loof

Sep 19, 2013, 9:28:47 AM
to jenkins...@googlegroups.com
build flow is a lightweight task, supposed to orchestrate jobs, not to archive content or manage a workspace.
It can be triggered by commit hooks, which is the best option.



Les Mikesell

Sep 19, 2013, 11:01:37 AM
to jenkinsci-users
On Thu, Sep 19, 2013 at 8:28 AM, nicolas de loof
<nicolas...@gmail.com> wrote:
> build flow is a flightweight task, supposed to orchestrate jobs, not to
> archive content or manage a workspace.
> It can be triggered by commit hooks, best option.


I guess I don't understand having a jenkins job that isn't really a
jenkins job and intentionally doesn't do the things I want jenkins to
do... Is there some other plugin or approach that would be better to
accomplish this? In general that would be to poll a subversion
repository, and when a change is detected, build the same revision
across several platforms (even if there are subsequent commits before
all builds happen), optionally run some tests, and optionally archive
some of the results for subsequent optional promotion. I, and many
of the other users here, are not java/groovy experts, so your DSL is
appealing, but it doesn't seem to match all of the jobs we need to
describe. Is there some way to import its methods into a generic
groovy build step to be able to use them in a less restricted context?
I don't think triggering a build on every commit would work in our
situation - they are mostly hourly or nightly polls. If there is no
other way to do it, should I think in terms of running one platform's
builds with the stock svn polling, then having that job kick off the
build flow job for the rest? And if I did that, can I pass the
triggering SVN_REVISION through to its child jobs so that the builds
and tests are consistent?

--
Les Mikesell
lesmi...@gmail.com

Slide

Sep 19, 2013, 11:16:45 AM
to jenkins...@googlegroups.com
That's what I do. I have my hourly/nightly checks kick off a job that uses the Build Flow plugin to build other projects. It's worked very well for me so far. I had issues with the Parameterized Trigger and other plugins, but using the Build Flow plugin in the way I described has been a life saver.





Les Mikesell

Sep 19, 2013, 11:31:35 AM
to jenkinsci-users
On Thu, Sep 19, 2013 at 10:16 AM, Slide <slide...@gmail.com> wrote:
> That's what I do. I have my hourly/nightly checks kick off a job that uses
> the Build Flow plugin to build other projects. It's worked very well for me
> so far. I had issues with the trigger parameterized build and other plugins,
> but using the Build Flow plugin in the method I described has been a life
> saver.

I can sort-of see how to patch the different bits of functionality
together, but I think it will confuse our users if they have to
trigger from one job, edit the workflow in another, and have their
results show up in yet another. I'm really looking for something that
works like a matrix job but permits a little more control over the
individual sub-jobs and the layout of the resulting artifacts (e.g.
when building libraries I'd like one snapshot copy of the include
header files plus each platform's .so/.dll/etc.).

--
Les Mikesell
lesmi...@gmail.com

Schalk W. Cronjé

Sep 19, 2013, 11:41:06 AM
to jenkins...@googlegroups.com
The very first time I read about Build Flow I also thought it was a DSL for specifying complex steps in a job. It was only when I started to play with it that I came to realise that it is actually a tool for orchestrating multiple jobs. So you are not alone in your initial perceptions.

As to your other question, I know that if you are using Subversion, then in the simplest case the SVN Tracking plugin will suffice.

If you need something more complex you may need to use a revision parameter on the downstream jobs. You'll then need to append @${REVISION} to your SVN URLs (assuming the param was called REVISION).

An alternative is to write the revision info into a file which is published as an artifact. Utilising the Pre_SCM_BuildStep plugin you can download the file and use the Environment Injector plugin to set the url, revision, etc. As in the previous example you will need to use ${PARAM}s in your SVN setup.




--
Schalk W. Cronjé
[Sent from mobile phone]






Les Mikesell

Sep 19, 2013, 1:22:33 PM
to jenkinsci-users
On Thu, Sep 19, 2013 at 10:41 AM, Schalk W. Cronjé <ysb...@gmail.com> wrote:
> The very first time I read about Build Flow I also thought it to be a DSL
> for specifying complex steps in a job.

Maybe it was wishful thinking, because that seems to be what I need.

>It was only when I started to play
> with it that I came to realise that it was actually a tool for the
> orchestrating multiple jobs. So you are not alone in your I initial
> perceptions.

The perception was boosted by the version I am using having access to
all of the usual build trigger options, a visible workspace, and the
usual post-build options.

> As to answering another question you had, I know that if you are using
> Subversion, then in the simplest case using SVN Tracking plugin will
> suffice.

If that's the plugin I think it is, it says it should be used with commit
hooks, not polling - and I don't quite understand how it would work if
builds of different revisions ran concurrently. If one set of builds
hasn't completed before another revision's builds are started/queued,
I still want each set to complete and stay consistent.

> If you need something more complex you may need to use a revision parameter
> on the downstream jobs. You'll then to append @${REVISION} to your SVN URLs.
> (Assuming param was called REVISION).

Yes, I can do this within build flow with $SVN_REVISION with the
version where the svn polling trigger works. Not sure if later
versions will have an equivalent. I'm beginning to think that what I
really need is some boilerplate groovy code or library to do the same
operations as the build flow plugin but in an ordinary build step.
Would the 'run other jobs in parallel' feature be possible there?

--
Les Mikesell
lesmi...@gmail.com

nicolas de loof

unread,
Sep 19, 2013, 1:36:09 PM9/19/13
to jenkins...@googlegroups.com



2013/9/19 Les Mikesell <lesmi...@gmail.com>

On Thu, Sep 19, 2013 at 10:41 AM, Schalk W. Cronjé <ysb...@gmail.com> wrote:
> The very first time I read about Build Flow I also thought it to be a DSL
> for specifying complex steps in a job.

Maybe it was wishful thinking, because that seems to be what I need.

It's not; it's a tool to orchestrate workflows (hence the name) based on jobs, as a replacement for the parameterized build trigger + join + downstream + ... plugins

If you need to do advanced build scripting, use a groovy system script or the scriptler plugin

 

>It was only when I started to play
> with it that I came to realise that it was actually a tool for the
> orchestrating multiple jobs. So you are not alone in your I initial
> perceptions.

The perception was boosted by the version I am using having access to
all of the usual build trigger options, a visible workspace, and the
usual post-build options.

The first implementation didn't offer this; BuildFlow just extended Job. For simplicity and to avoid code copy/paste it now extends Project, but it isn't an actual Project the way a FreeStyle job is.
 

> As to answering another question you had,  I know that if you are using
> Subversion, then in the simplest case using SVN Tracking plugin will
> suffice.

If that's the plugin I think, it says it should be used with commit
hooks, not polling - and I don't understand quite how it would work if
builds of different revisions ran concurrently.   If one set of builds
hasn't completed before they are started/queued for another revisions,
I still want each set to complete and stay consistent.

This is an issue I'm investigating. I'd like the plugin to be triggered by a git post-commit hook, then pass the incoming revision as a parameter to the other jobs
 

> If you need something more complex you may need to use a revision parameter
> on the downstream jobs. You'll then to append @${REVISION} to your SVN URLs.
> (Assuming param was called REVISION).

Yes, I can do this within build flow with $SVN_REVISION with the
version where the svn polling trigger works.  Not sure if later
versions will have an equivalent.   I'm beginning to think that what I
really need is some boilerplate groovy code or library to do the same
operations as the build flow plugin but in an ordinary build step.
Would the 'run other jobs in parallel' feature be possible there?

--
   Les Mikesell
     lesmi...@gmail.com


Les Mikesell

Sep 19, 2013, 2:03:02 PM
to jenkinsci-users
On Thu, Sep 19, 2013 at 12:36 PM, nicolas de loof
<nicolas...@gmail.com> wrote:
>
>> > The very first time I read about Build Flow I also thought it to be a
>> > DSL
>> > for specifying complex steps in a job.
>>
>> Maybe it was wishful thinking, because that seems to be what I need.
>
>
> It's not, it's a tool to orchestrate workflows (so the name) based on jobs,
> as a replacement for parameterized build trigger + join + downstream + ...
> plugins
>
> If you need to do advanced build script, use groovy system script or
> scriptler plugin

I think that's what I need, but I'm not a groovy expert. Is there
any way to incorporate the flow DSL operations into a generic groovy
build step? And does it have to be 'system'?

>> The perception was boosted by the version I am using having access to
>> all of the usual build trigger options, a visible workspace, and the
>> usual post-build options.
>
>
> First implementation didn't offered this, and BuildFlow just extended Job.
> For simplicity and avoid code copy/paste it extends Project, but isn't an
> actual Project as FreeStyle Job is.

It just seems confusing to me, and I think it would be to others, not
to have the expected triggers and archive capabilities. How did the
version that extended Job mesh the details of running on a slave node
with its system operations - or does that matter?

>> > Subversion, then in the simplest case using SVN Tracking plugin will
>> > suffice.
>>
>> If that's the plugin I think, it says it should be used with commit
>> hooks, not polling - and I don't understand quite how it would work if
>> builds of different revisions ran concurrently. If one set of builds
>> hasn't completed before they are started/queued for another revisions,
>> I still want each set to complete and stay consistent.
>
>
> This is an issue I'm investigating. I'd like plugin to be triggered by a git
> post-commit hook, then pass the incoming revision as parameter to other jobs

I'm not really clear on how matrix jobs handle this either - which is
part of the reason I'm looking for more explicit control of the child
jobs.

--
Les Mikesell
lesmi...@gmail.com

nicolas de loof

Sep 19, 2013, 2:47:53 PM
to jenkins...@googlegroups.com



2013/9/19 Les Mikesell <lesmi...@gmail.com>

On Thu, Sep 19, 2013 at 12:36 PM, nicolas de loof
<nicolas...@gmail.com> wrote:
>
>> > The very first time I read about Build Flow I also thought it to be a
>> > DSL
>> > for specifying complex steps in a job.
>>
>> Maybe it was wishful thinking, because that seems to be what I need.
>
>
> It's not, it's a tool to orchestrate workflows (so the name) based on jobs,
> as a replacement for parameterized build trigger + join + downstream + ...
> plugins
>
> If you need to do advanced build script, use groovy system script or
> scriptler plugin

I think that's what I need, but I'm not a groovy expert.   Is there
any way to incorporate the flow DSL operations into a generic groovy
build step?  And does it have to be 'system'?

it's just a matter of using the appropriate jenkins API:
Jenkins.instance.getItemByFullName("job").scheduleBuild(...)
 

>> The perception was boosted by the version I am using having access to
>> all of the usual build trigger options, a visible workspace, and the
>> usual post-build options.
>
>
> First implementation didn't offered this, and BuildFlow just extended Job.
> For simplicity and avoid code copy/paste it extends Project, but isn't an
> actual Project as FreeStyle Job is.

It just seems confusing to me, and I think it would be to others to
not have the expected triggers and archive capabilities.  How did the
version that extended Job mesh the details of running on a slave node
with its system operations - or does that matter?

As I said, it was done that way to avoid code duplication, but I will roll it back and remove them, as it's indeed confusing.
The archive capability is a generic post-build action; there's no way to filter it out afaik.
 

>> > Subversion, then in the simplest case using SVN Tracking plugin will
>> > suffice.
>>
>> If that's the plugin I think, it says it should be used with commit
>> hooks, not polling - and I don't understand quite how it would work if
>> builds of different revisions ran concurrently.   If one set of builds
>> hasn't completed before they are started/queued for another revisions,
>> I still want each set to complete and stay consistent.
>
>
> This is an issue I'm investigating. I'd like plugin to be triggered by a git
> post-commit hook, then pass the incoming revision as parameter to other jobs

I'm not really clear on how matrix jobs handle this either - which is
part of the reason I'm looking for more explicit control of the child
jobs.

Matrix head has a workspace, and all axes are sub-directories in this workspace

Daniel Beck

Sep 19, 2013, 4:21:40 PM
to jenkins...@googlegroups.com
On 19.09.2013, at 20:47, nicolas de loof <nicolas...@gmail.com> wrote:

> it's just about using the adequate jenkins API
> Jenkins.instance.getItemByFullName("job").scheduleBuild(...)

Let's make this into a real example. Execution of two other jobs (in parallel even) below, plus copying their artifacts, all within a Groovy system build step.

----

hudson.model.queue.QueueTaskFuture build(String fullName) {
    def p = jenkins.model.Jenkins.instance.getItemByFullName(fullName)
    def thisR = Thread.currentThread().executable
    def f = p.scheduleBuild2(p.quietPeriod, new hudson.model.Cause.UpstreamCause(thisR))
    return f
}

def f1 = build('test-artifact-source-1')
def f2 = build('test-artifact-source-2')

// wait for both builds to finish
def b1 = f1.get()
def b2 = f2.get()

println b1.id
println b2.id

// added 'copy artifacts' because it was so easy -- not tested with execution on a slave though, but it should work
b2.builtOn.createPath(b2.artifactsDir.canonicalPath).copyRecursiveTo(Thread.currentThread().executable.workspace)
b1.builtOn.createPath(b1.artifactsDir.canonicalPath).copyRecursiveTo(Thread.currentThread().executable.workspace)

return 0

----

http://javadoc.jenkins-ci.org is a great resource for stuff like this. And if you want to see how others accomplished something, well, almost all plugins, including build flow, are open source...

Regards
Daniel

Les Mikesell

Sep 20, 2013, 1:17:53 PM
to jenkinsci-users
On Thu, Sep 19, 2013 at 3:21 PM, Daniel Beck <m...@beckweb.net> wrote:
>
>> it's just about using the adequate jenkins API
>> Jenkins.instance.getItemByFullName("job").scheduleBuild(...)
>
> Let's make this into a real example. Execution of two other jobs (in parallel even) below, plus copying their artifacts, all within a Groovy system build step.
>
> ----
>
> hudson.model.queue.QueueTaskFuture build(String fullName) {
> def p = jenkins.model.Jenkins.instance.getItemByFullName(fullName)
> def thisR = Thread.currentThread().executable
> def f = p.scheduleBuild2(p.quietPeriod, new hudson.model.Cause.UpstreamCause(thisR))
> return f
> }
>
> def f1 = build('test-artifact-source-1')
> def f2 = build('test-artifact-source-2')
>
> // wait for both builds to finish
> def b1 = f1.get()
> def b2 = f2.get()

Thanks - I think this is going to be the best approach, but not being
a java developer I find the full API pretty intimidating, and it would be
nice to keep the boilerplate code out of sight. Is there any way to
incorporate something that looks like the build flow DSL - or at least
the functions it offers - into a library that would be available in any
groovy build step? That could match the convenience without imposing
the restrictions of a separate plugin.

--
Les Mikesell
lesmi...@gmail.com