On Wed, Sep 21, 2016 at 8:03 AM, <fkp...@gmail.com> wrote:
> I'd like our plugin to support the parallel step (in pipelines).
> I haven't found any documentation on what steps / methods I should implement
> to ensure our support.
There is nothing specific you need to implement. Does it currently
work in a Pipeline build that is not using `parallel` but fail in one
that is? If so, how?
The Pipeline APIs are designed to allow access to contextual objects
using dynamic scopes; each step accepting a block argument potentially
introduces a scope with added/overridden context, and each step may
access a context. For example, in
parallel linux: {
    node('linux') {
        sh 'make'
    }
}, windows: {
    node('windows') {
        bat 'msbuild'
    }
}
the `sh` and `bat` steps each get access to various API objects such
as a `FilePath` and `Launcher`, supplied by the enclosing `node`.
One thing to note: if a script sets environment variables using the syntax

    env.SOME_SERVICE_URL = 'https://someservice.corp/'

that setting takes effect for the remainder of the build, regardless
of scopes. (Such variables also get exposed via the REST API and to
upstream builds using the `build` step, so they can be used to
“export” simple data from a build.) Any variables that might need to
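As a sketch of that export mechanism (the job name `integration-tests` is purely illustrative), the calling build can read such variables off the object returned by the `build` step:

```groovy
// Trigger a downstream build; 'build' returns a RunWrapper for that run
def downstream = build job: 'integration-tests'
// Variables the downstream script set via env.NAME = '...' are exposed here
echo "Service URL was: ${downstream.buildVariables.SOME_SERVICE_URL}"
```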
vary by machine or operating system should generally be set using the
`withEnv` step so they scope properly:
parallel linux: {
    node('linux') {
        withEnv(["PATH=${tool 'gnumake'}/bin:${env.PATH}"]) {
            sh 'make'
        }
    }
}, windows: {
    node('windows') {
        // uses default %Path%
        bat 'msbuild'
    }
}
(Various other block-scoped steps such as `withCredentials` also set
environment variables contextually.)
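For example, assuming a Secret Text credential with the (illustrative) ID `service-token` is configured, `withCredentials` injects a variable only inside its block:

```groovy
node {
    withCredentials([string(credentialsId: 'service-token', variable: 'TOKEN')]) {
        // TOKEN is defined only within this block, and is masked in the log
        sh 'curl -H "Authorization: Bearer $TOKEN" https://someservice.corp/'
    }
    // Outside the block, TOKEN is no longer set
}
```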