External Artifact Propagation of Dependent Pipelines (aka Version Bumping)


Anthony Abate

Aug 16, 2019, 1:59:11 PM8/16/19
to go-cd
Hello

I am curious what the best practice is for the following scenario:

Library A is used by Program B via an External Artifact Repository (nuget.org, npm, etc)

It seems like a simple scenario involving 2 pipelines:

Pipeline A (Library)
Source -> Build -> Test -> Deploy Artifact

Pipeline B (Program that uses Library A)
Source -> Build -> Test -> Deploy To Production

In this example, there is no 'GoCD dependency' between Pipelines A and B.  If A's source is changed, A.2 is built and eventually deployed to the external artifact repo.  In order for Program B to use it, source code is edited (i.e. a C# PackageReference or package.json) to link to A.2.  A new build B.2 is generated and, if it passes, it is deployed.  The VSM for Pipeline B.2 has no indication that Pipeline A.2 was involved / consumed.
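For concreteness, this is the kind of source-controlled edit involved on the C# side (package and version names here are hypothetical; the same idea applies to a dependency entry in package.json):

```
<!-- Application B's .csproj: the pin moves from A.1 to A.2 -->
<ItemGroup>
  <PackageReference Include="LibraryA" Version="2.0.0" />
</ItemGroup>
```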

It feels like the up / downstream + package repo features, along with the VSM capabilities of GoCD, should be able to handle this scenario.  However, in my experimentation, things break down due to the fact that the build has to modify a file under source control that tells Program B to switch from Library A.1 to A.2.  Changing the file locally during the build stage seems like cheating, since this won't become part of the official source history.  (Version control of the library references is very important.)

Like the example above, all of my pipelines are 'silos'.  If I want to deploy a new version of the top-most dependency, I have to make a lot of source code changes to propagate a version bump downstream manually.  Granted, this is extremely 'safe', since the change might be breaking; it just can be very time-consuming if there are 20 downstream projects.  If this were automated, I could in theory publish a new 'top-level' dependency and have the entire downstream build process detect any breaking changes via build or test failures.

In addition to automation, for any built (or deployed) artifacts I would want to see a VSM that shows the state of all upstream pipelines that were involved / consumed, so I can see at a glance which upstream labels are part of a downstream artifact.

1. Does GoCD have a way of showing / managing these types of dependencies?

2. Does GoCD have a way to bump version references (i.e. change source code) to match upstream labels? Or is this burden on me?

3. Are all upstream pipeline labels available as environment variables to downstream pipelines? 
(I could in theory create a parallel dependency pipeline that does nothing but bump versions. However, the further downstream you go, the wider the scope of version bumping becomes, since C depends on B and A, not just B on A.)

In any case, I was curious whether the current functionality of GoCD somehow addresses the above, or whether this is a separate 'feature / enhancement'.


Thanks
-Anthony

Jason Smyth

Aug 19, 2019, 11:03:36 AM8/19/19
to go-cd
Hi Anthony,

I think the only way to get GoCD to show these types of dependencies is to add Pipeline A as a Material for Pipeline B. This alone wouldn't handle automatically increasing the version of Library A in Application B but it would allow you to see all downstream applications that leverage Library A by looking at Pipeline A's VSM.
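If it helps, here is roughly what that dependency material looks like with GoCD's YAML config plugin (pipeline, stage, and repo names below are hypothetical). The `library-a` material makes Pipeline B a downstream of Pipeline A, so each shows up in the other's VSM:

```
# Pipeline B with Pipeline A as a dependency material (YAML config plugin)
pipelines:
  pipeline-b:
    group: apps
    materials:
      app-b-source:
        git: https://example.com/app-b.git   # Program B's own source
      library-a:
        pipeline: pipeline-a                 # upstream pipeline name
        stage: deploy-artifact               # B triggers after this stage passes
    stages:
      - build:
          jobs:
            build:
              tasks:
                - exec:
                    command: ./build.sh
```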

Depending on your library management solution you could potentially configure Application B to use the latest version of Library A instead of a specific version. I wouldn't recommend this, though, because you would lose the ability to track library version changes in Application B's source code and you would also lose the ability to recreate builds of Application B that were originally created with older versions of Library A.

A better solution for automating version increments might be to add a stage that does the version bump in Application B's source. This stage could exist either at the end of Pipeline A or in a completely separate Pipeline (C) whose sole purpose is to bump versions in applications that depend on Library A. You would need to maintain whatever process this new stage used to ensure it was aware of each new dependency that got added, but I don't expect that would be a lot of work on top of manually adding the new downstream Pipelines.
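A minimal sketch of what such a bump stage might run, assuming an npm-style manifest. GoCD exposes the upstream label to jobs as GO_DEPENDENCY_LABEL_&lt;MATERIAL_NAME&gt; for dependency materials; the material, package, and file names here are hypothetical, and the commit step is left as a comment:

```shell
#!/bin/sh
# Hypothetical version-bump step for Application B's source.
# GoCD sets GO_DEPENDENCY_LABEL_<MATERIAL_NAME> for dependency materials;
# fall back to a fixed version here purely for illustration.
NEW_VERSION="${GO_DEPENDENCY_LABEL_LIBRARY_A:-2.0.0}"

# Stand-in for Application B's checked-out package.json
cat > package.json <<'EOF'
{
  "name": "application-b",
  "dependencies": {
    "library-a": "1.0.0"
  }
}
EOF

# Rewrite the pinned version of library-a to match the upstream label
sed -i.bak "s/\"library-a\": \"[^\"]*\"/\"library-a\": \"${NEW_VERSION}\"/" package.json
rm -f package.json.bak

# In a real pipeline this edit would be committed so it becomes part of the
# official source history, e.g.:
#   git commit -am "Bump library-a to ${NEW_VERSION}" && git push
cat package.json
```

The commit is the important part: it keeps the version change in Application B's history, which addresses the "changing the file locally feels like cheating" concern.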

With all of that said, I think the best solution might be the one you've already got. The existing implementation allows the maintainers of each downstream application to decide for themselves when they want to migrate to a new version of Library A. By extension, this means that nobody has to do any work to ensure that Application Q, which is working just fine in production and hasn't been updated in over a year, works with the latest version of Library A, which may include intentionally breaking changes.

I'm curious what issues you are encountering that are making you want to force library updates in otherwise unchanged applications.

This is just my opinion. I'm no expert but hopefully you find some of this helpful.

Cheers,
Jason

anthon...@gmail.com

Aug 19, 2019, 12:08:16 PM8/19/19
to go...@googlegroups.com

Jason,

Thanks, yes it was helpful, and more or less what I already assumed.

> I'm curious what issues you are encountering that are making you want to force library updates in otherwise unchanged applications.

To better understand this, think of the production application as a pyramid.  At the top of the pyramid are model definitions, contracts, and metadata; in the middle, services; at the bottom, applications.  The top changes less frequently, but changes there can be more drastic if they are breaking. If they do change, I want to force an update from the top (or middle) down.  Most of the time, the changes are backwards compatible, and bumping the version during the build is just another way to verify that the change did not break anything.

In this regard there are two types of releases: an "In-Place Update" and a "Downstream Update".  (When I put it that way, it does seem like 2 separate pipelines.)

Another reason for the downstream updates: during "development phases", changes are frequent, and I really do want all downstream pipelines to update in code for a coordinated release down the road.

It is the "downstream update" that I think GoCD could probably do a better job of automating / providing built-in features for, since it usually requires modifying code files and GoCD "knows" about all pipelines.



anthon...@gmail.com

Aug 19, 2019, 8:11:45 PM8/19/19
to go...@googlegroups.com
Alas, it seems GoCD has some issues when setting up inter-pipeline triggers with different source code materials.


I was originally going to post my question (is it a feature request now?) on GitHub, but the GitHub issue template clearly stated to ask questions on the Google Group first...

anthon...@gmail.com

Aug 22, 2019, 9:42:54 AM8/22/19
to go...@googlegroups.com
It appears that Dependabot (just acquired by GitHub) is similar to my 'feature request': https://dependabot.com
One use case which I forgot for forcing downstream changes: security fixes...


-Anthony
