Pipeline for multiple projects sharing the same SVN repository

Anton Shepelev

Aug 15, 2020, 6:13:33 PM
to jenkins...@googlegroups.com
Hello, all

We maintain hundreds of interdependent .NET libraries and programs
in a single SVN repository; each program typically references tens
of those libraries. Maybe it is a non-standard, and perhaps not
even a recommended, approach, but creating a repository per
program or library would introduce its own problems, especially
because we use source (project) references by relative path
instead of binary references.

I have inherited a Jenkins configuration from a previous
maintainer, whom I can no longer contact. He configured the
building of each project not by means of pipelines, but entirely
through the web interface, as Freestyle projects with an MSBuild
step. I think that step is not standard Jenkins functionality but a
feature of the MSBuild plugin: https://plugins.jenkins.io/msbuild/ .

He configured Jenkins so that each project is built in the same
local SVN mirror, in order not to check out our unified
repository separately for each of the many projects. He created
a Jenkins global property `SVN_REPO' with the value
`ExecutorRepo_${EXECUTOR_NUMBER}', but I can't understand how
and where it is referenced, expanded, and used instead of the
default working directory of each project. Can you help me find
that out? When I tried to specify that variable as the "Local
module directory" in the pipeline configuration,
${EXECUTOR_NUMBER} was not expanded, perhaps because that
setting is expanded before an executor is chosen.

Is there a way to configure multiple pipelines to reuse the same
global SVN working directory on each node? For example, supposing I
have this SVN repository:

trunk
    programs
        program1
        program2
        ...
        program99
    library
        library1
        library2
        ...
        library99

I want each agent to check out trunk exactly once and build each
program inside that same working directory. Is that possible?
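
To make the question concrete, this is roughly the kind of
per-program pipeline I imagine, with one shared checkout
directory per node; the node label, the path, and the repository
URL are made-up placeholders, and I have not tested any of it:

  // Untested sketch; node label, path, and URL are placeholders.
  pipeline {
    agent {
      node {
        label 'windows'
        // one shared directory per node, instead of a
        // separate workspace for every job
        customWorkspace 'C:\\Jenkins\\SharedTrunk'
      }
    }
    stages {
      stage('Checkout') {
        steps {
          // Subversion plugin checkout into the root of
          // the shared directory
          checkout([$class: 'SubversionSCM',
            locations: [[
              remote: 'https://svn.example.com/repo/trunk',
              local: '.']]])
        }
      }
      stage('Build') {
        steps {
          bat 'msbuild programs\\program1\\program1.csproj'
        }
      }
    }
  }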

jeremy mordkoff

Aug 17, 2020, 10:33:42 AM
to Jenkins Users
I'm having a little trouble understanding exactly what you are asking. I think you are asking if it is possible to trigger multiple builds in a single workspace. If you use a single pipeline, this is trivial. 

So perhaps you should consider writing a single pipeline that iterates over all libraries and projects and builds each one in turn.  
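
Something along these lines (a rough, untested sketch -- the node
label, the URL, and the project list are made up) is what I have
in mind:

  // untested sketch -- adjust the node label, URL, and list
  node('windows') {
    // one checkout of trunk, then build everything in place
    checkout([$class: 'SubversionSCM',
      locations: [[
        remote: 'https://svn.example.com/repo/trunk',
        local: '.']]])
    def programs = ['program1', 'program2', 'program99']
    for (p in programs) {
      stage("build ${p}") {
        bat "msbuild programs\\${p}\\${p}.csproj"
      }
    }
  }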

Anton Shepelev

Aug 19, 2020, 6:47:23 PM
to jenkins...@googlegroups.com
jeremy mordkoff:

> I'm having a little trouble understanding exactly what you
> are asking.

I am sorry for an insufficiently clear description. Let me
try again.

Since our .NET projects are highly interdependent, we
maintain them in a single SVN repository with a structure
like this:

trunk
    programs
        program1
        program2
        ...
        program99
    library
        library1
        library2
        ...
        library999

where each program depends on one or more libraries. I want
to create 99 Jenkins items, using declarative pipelines,
each building one of the 99 programs. We have several nodes
registered on different machines with different
environments. I want each node to check out the SVN trunk
exactly once and use that working copy in each pipeline to
build each individual program. I do not understand how to
set this up in Jenkins, because each pipeline seems to use
its own SVN working directory.

The current configuration, set up by a previous maintainer,
relies on the Jenkins MSBuild plugin, and somehow manages to
retain a single SVN working directory per node and
executor, whose name is stored in a Jenkins global property
SVN_REPO with the value `ExecutorRepo_${EXECUTOR_NUMBER}.'
When I tried to specify this property as the "Local module
directory" for the pipeline, the `${EXECUTOR_NUMBER}' part
was left unexpanded in the actual path to the SVN working
directory. I don't understand how and why it is expanded
with the MSBuild plugin but not with the declarative
pipeline.

> I think you are asking if it is possible to trigger
> multiple builds in a single workspace.

Yes.

> If you use a single pipeline, this is trivial.

As I said above, I prefer one pipeline per program, for the
convenience of having a dedicated page on the Jenkins web
interface for each project and the ability to change the
settings for building individual projects independently of
other projects. If this can be achieved with a single
pipeline, I should like to know how.

> So perhaps you should consider writing a single pipeline
> that iterates over all libraries and projects and builds
> each one in turn.

Is such iteration possible with the declarative syntax, or
shall I switch to the scripted pipeline?
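
For instance, would something like the following be legal in a
declarative pipeline? This is only my guess at the syntax; I
have not tried it.

  // my untested guess at iterating inside a declarative stage
  pipeline {
    agent any
    stages {
      stage('Build programs') {
        steps {
          script {
            for (p in ['program1', 'program2']) {
              bat "msbuild programs\\${p}\\${p}.csproj"
            }
          }
        }
      }
    }
  }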

jeremy mordkoff

Aug 19, 2020, 7:56:57 PM
to Jenkins Users
You need 99 jobs for the 99 programs, but what about the libraries? I assume a program will depend on one or more libraries. Are there dependencies between libraries? (this is common and not a problem if it is understood.) Are there dependencies between programs?  (this would be bad.)

How fast is your job if there are no changes? Are the artifacts available, so that the build system can quickly detect that nothing needs to be done? If so, then you could consider a single Jenkins job that iterated over the libraries in the correct order and then over the programs. This would by default use a single workspace. It has the advantage of only scanning each library once.

Otherwise... there is actually nothing that says the build has to use the workspace checked out by Jenkins. I suspect your predecessor had extra workspaces that he reused for every build. Have you found these so you know they exist? Are there containers involved? It's possible the workspaces are not getting mounted inside the container.

If you are forced to have many Jenkins jobs, then I would put a Jenkinsfile in every program dir. Whether they are scripted or declarative is up to you. I like scripted so much I rewrote everything my predecessor did because he used all declarative (and had terrible program structure).

As far as ${EXECUTOR_NUMBER} not getting expanded, this could be an issue of timing. It's possible that whatever process is doing that expansion runs before an executor is chosen. Do the expansion in a script instead. Like I said, the build doesn't have to use the workspace created by Jenkins.
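
For example, something along these lines (rough and untested)
would do the expansion only after an executor has been assigned:

  // untested -- EXECUTOR_NUMBER only exists once the build is
  // actually running on an executor, i.e. inside node {}
  node('windows') {
    def repoDir = "ExecutorRepo_${env.EXECUTOR_NUMBER}"
    dir(repoDir) {
      echo "working in ${repoDir}"
      // checkout and msbuild calls would go here
    }
  }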

Anton Shepelev

Aug 19, 2020, 8:43:33 PM
to jenkins...@googlegroups.com
jeremy mordkoff:

> You need 99 jobs for the 99 programs, but what about the
> libraries?

Yes, one job per project, and let MSBuild take care of
building the libraries. Thankfully, I do not have to worry
about it in Jenkins. The libraries will be available at
their correct relative paths in the repository and its local
mirror.

> I assume a program will depend on one or more libraries.
> Are there dependencies between libraries? (this is common
> and not a problem if it is understood.)

Yes, and MSBuild takes care of them too, because all the
dependencies are specified as project references in our
project files. Did you want to suggest a manual way to
specify those dependencies (e.g. as in simple Makefiles), or
did you have some automated method in mind? I should be
interested to learn about it in case I need it.

> Are there dependencies between programs? (this would be
> bad.)

No, there are not.

> How fast is your job if there are no changes?

What changes do you mean -- source modifications? For the
time being, my primary concern is not build speed but ease
of configuration and maintenance of Jenkins jobs with
minimal duplication of code between similar jobs (which
deserves a separate thread).

> Are the artifacts available, so that the build system can
> quickly detect that nothing needs to be done?

Yes, the binaries are co-located with the SVN mirror, but
they are local files not under version-control, so that if
several programs depend on the same library the build system
(MSBuild) will find and reuse the same binary of the
library.

> If so, then you could consider a single Jenkins job that
> iterated over the libraries in the correct order and then
> over the programs. This would by default use a single
> workspace. It has the advantage of only scanning each
> library once.

Not very easy, because some of the programs and libraries
have specific settings and build parameters, which I should
like to specify in their individual Jenkinsfiles. Also, as I
wrote in the previous post, I want (for now) Jenkins to host
an individual web page for each program.

> Otherwise... there is actually nothing that says the build
> has to use the workspace checked out by Jenkins.

OK, but I have three questions:

1. Who updates the local copy from the version-control
   repository, and when, if no `checkout' step is present
   in the Jenkinsfile?

2. Does the `checkout' step imply an update (e.g.
   `svn update') after the initial run? I hope it will not
   rewrite an existing mirror every time... (See the sketch
   after question 3 for the step I have in mind.)

3. If I set the same `Local module directory' in all
   jobs, will they properly reuse the same SVN mirror or
   not?
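
For concreteness, the kind of `checkout' step I have in mind
for question 2 is roughly this; the URL is a placeholder, and I
am only guessing that `UpdateUpdater' makes it run `svn update'
instead of a fresh checkout:

  // untested; the URL is a placeholder, and I am guessing that
  // UpdateUpdater means `svn update' rather than a re-checkout
  checkout([$class: 'SubversionSCM',
    locations: [[
      remote: 'https://svn.example.com/repo/trunk',
      local: 'ExecutorRepo']],
    workspaceUpdater: [$class: 'UpdateUpdater']])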

> I suspect your predecessor had extra workspaces that he
> reused for every build.

I doubt it. He used `ExecutorRepo_${EXECUTOR_NUMBER}'
properly expanded and located in the Jenkins directory on
the node machine. It has all the needed files. Why would he
need extra workspaces?

Anyway, only one executor is currently set up for each node,
and if I stick to that, I could use a hard-coded path
instead.

> Have you found these so you know they exist?

No.

> Are there containers involved? It's possible the
> workspaces are not getting mounted inside the container.

No containers involved. In this regard everything is
transparent: executors work directly on our manually created
virtual machines.

> If you are forced to have many Jenkins jobs, ->

Not that I am forced, no, but that is outwardly compatible
with what I have inherited, and I myself think this approach
offers better flexibility regarding the configuration of
individual programs. I fear it will require more work and
knowledge than I currently have to manage them all in a
single pipeline. Furthermore, it would require special
provision not to rebuild everything after every commit, but
only the projects affected by the commit.

> -> then I would put a Jenkinsfile in every program dir.

That's what I am going to do. I already have two projects
set up that way, but only one of them works so far. It is a
simple `pandoc' build for a document with no external
dependencies. That is how far I have gotten in the way of
pipelines :-)

> whether they are scripted or declarative is up to you. I
> like scripted so much I rewrote everything my predecessor
> did because he used all declarative (and had terrible
> program structure).

It is too early for me to make a justified choice. Is the
scripted syntax better documented, more flexible, or does it
allow better code reuse within and between pipelines? Does
the declarative syntax become too convoluted for non-trivial
build processes? What were your reasons for the change?

> As far as ${EXECUTOR_NUMBER} not getting expanded, this
> could be an issue of timing. It's possible that whatever
> process is doing that expansion runs before an executor is
> chosen. Do the expansion in a script instead.

That must be it, but I can't figure out who or what is
responsible for expanding it *after* an executor is chosen
in my current setup. In fact, I have not been able to find
a single reference to that variable, yet it works...

Anton Shepelev

Aug 27, 2020, 11:48:59 AM
to jenkins...@googlegroups.com
jeremy mordkoff:

> Otherwise... there is actually nothing that says the build
> has to use the workspace checked out by Jenkins.

Well, I can't seem to set up an external directory in the
`checkout' step. The option `local' (named "Local module
directory") is required to point to a subdirectory of the
workspace. How can I make it point to C:\JenkinsSvnWorkdir?
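
The only workaround I can think of is to wrap the checkout in a
`ws' step with an absolute path, but that is an untested guess
on my part (the URL is a placeholder):

  // untested guess: ws() relocates the workspace itself, so
  // `local' can remain a relative subdirectory of it
  node('windows') {
    ws('C:\\JenkinsSvnWorkdir') {
      checkout([$class: 'SubversionSCM',
        locations: [[
          remote: 'https://svn.example.com/repo/trunk',
          local: '.']]])
    }
  }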

> I suspect your predecessor had extra workspaces that he
> reused for every build. Have you found these so you know
> they exist?

Yes, the Jenkins MSBuild plugin was set up to use a global
directory for checkout.
