Visibility of build.go.cd configuration?


james....@datasift.com

Nov 3, 2014, 11:05:32 AM
to go...@googlegroups.com
Is it possible to see the configuration for https://build.go.cd/go/pipelines? It would be great to use as an example of how to do certain things. I was also looking for the build_utilities repo; is that available?

Cheers,
James.

Aravind SV

Nov 3, 2014, 4:08:04 PM
to james....@datasift.com, go...@googlegroups.com
Hi James,

On Mon, Nov 3, 2014 at 11:05 AM, <james....@datasift.com> wrote:
Is it possible to see the configuration for https://build.go.cd/go/pipelines ? It would be great to use as an example for how to do certain things.

Yes, that should be fine. I can make it available here. There are a couple of sensitive parts to it (passwords, etc.) which will need to be removed before it can be shared. I suspect most of it is a little routine, though. :) Is there anything specific you were interested in?
 

I was also looking for the build_utilities repo, is that available?

Unfortunately not. It has a few scripts which upload files from Go to go.cd/download and to bintray, so that they can be downloaded from the website. Again, was there anything specific you were interested in?

I looked at the scripts there as well, and they're all very specific to these use cases. For instance, if you look at the build_utilities script running here, you can see that it uploads files and updates the releases list, so that the release shows up at www.go.cd/download. A part of the script looks like this (I'm pasting snippets here):

=====================================
require 'digest'  # needed for Digest::SHA1 / Digest::MD5 below

def calculate_data file, type_name
  puts "#{Time.now} Calculating checksums of #{file}"

  contents = File.read file

  {
    :sha1sum => Digest::SHA1.hexdigest(contents),
    :md5sum => Digest::MD5.hexdigest(contents),
    :name => File.basename(file),
    :type => type_name
  }
end
=====================================
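For reference, here's what calling that helper produces. This is a runnable sketch (the helper is reproduced verbatim; the file name and contents are made up for illustration):

```ruby
require 'digest'

# The helper quoted above, reproduced so this example runs on its own.
def calculate_data file, type_name
  contents = File.read file
  {
    :sha1sum => Digest::SHA1.hexdigest(contents),
    :md5sum  => Digest::MD5.hexdigest(contents),
    :name    => File.basename(file),
    :type    => type_name
  }
end

# Create a small dummy file and checksum it.
File.write("go-server-14.3.0.zip", "dummy contents")
data = calculate_data("go-server-14.3.0.zip", "package")
# data[:name] => "go-server-14.3.0.zip", data[:type] => "package"
```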

... and ...

=====================================
puts "Updating releases.json with information about this release: #{release_version} - #{git_revision} - #{release_time}"

files_data = []
files_data.concat(compute_file_data_for_agent_and_server(files, "solaris",   /-solaris\.gz$/))
files_data.concat(compute_file_data_for_agent_and_server(files, "mac",       /-osx\.zip$/))
files_data.concat(compute_file_data_for_agent_and_server(files, "linuxDeb",  /\.deb$/))
files_data.concat(compute_file_data_for_agent_and_server(files, "linuxRpm",  /\.rpm$/))
files_data.concat(compute_file_data_for_agent_and_server(files, "package",   /[0-9]\.zip$/))
files_data.concat(compute_file_data_for_agent_and_server(files, "windows",   /\.exe$/))

this_data = {
  :version => release_version,
  :git_revision => git_revision,
  :built_at => build_link,
  :release_time => release_time,
  :release_type => "experimental",
  :files => files_data
}

all_data = existing_data << this_data
=====================================

As you can see, it's standard Ruby code, but with some sensitive information, since it needs to access the download servers.
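To make the last line of that snippet concrete, here's a minimal, hypothetical sketch of how existing_data might be loaded from and written back to a releases.json file. The file name, keys, and values here are assumptions for illustration, not the actual build_utilities script:

```ruby
require 'json'

# Sketch only: load the current releases list (an empty array if the
# file doesn't exist yet), append this release's entry, write it back.
releases_file = "releases.json"
existing_data = File.exist?(releases_file) ? JSON.parse(File.read(releases_file)) : []

this_data = {
  :version      => "14.3.0",
  :release_type => "experimental",
  :files        => []   # would be files_data in the real script
}

all_data = existing_data << this_data
File.write(releases_file, JSON.pretty_generate(all_data))
```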

Regards,
Aravind

james....@datasift.com

Nov 4, 2014, 10:04:51 AM
to go...@googlegroups.com, james....@datasift.com
One thing I was interested in was how you deal with version numbers. It appears that you hard-code the major.minor.patch numbers in Go and keep the pom files fixed at version 1.0. Is that correct? Was there ever any thought of updating the pom files with the correct version number?

I'm also interested in how you deal with the scripts that are used in the pipeline. We currently have a repo for any scripts used in the pipeline and make it a material of the pipeline. The downside of this is that you have to fetch materials for every stage (usually including the source code); the upside is you get nice version control.


Mainly, the things I think would help are not a matter of whether something can be done, but more about best practices.

Cheers,
James.

Aravind SV

Nov 9, 2014, 10:38:58 PM
to james....@datasift.com, go...@googlegroups.com
Hello James,

On Tue, Nov 4, 2014 at 1:04 PM, <james....@datasift.com> wrote:
One thing I was interested in was how you deal with version numbers. It appears that you hard-code the major.minor.patch numbers in Go and keep the pom files fixed at version 1.0. Is that correct? Was there ever any thought of updating the pom files with the correct version number?

Yes, you're (almost) right. The version numbers are not really hardcoded everywhere (I assume you mean the pipeline labels, as shown in the VSM below); maintaining that by hand would be quite tedious. Instead, downstream pipelines use materials in their pipeline labels.

[Inline image: value stream map showing the pipeline labels]

I've attached the config XML in this mail, with a few redactions. What I mean by "uses materials in pipeline labels" is that "build-linux" pipeline has hardcoded 14.3.0 in it (the "labeltemplate" attribute on line 26 in the attached file - #{GO_VERSION}.${COUNT}-${GOCD} ), but the "plugins" pipeline has a labeltemplate of "${GO}" (line 695). In line 701, you can see that the material named "go" just refers to its upstream pipeline, that is, build-linux. So, whatever label build-linux has propagates to the "plugins" pipeline, and every other pipeline beyond these two.

That's the reason that, if you look at a VSM like this one (username: view, password: password), you'll see that all the labels, down to the smallest detail, are the same: 14.3.0.273-719e1417078f01a60fe0cb80bff4c3a529227455
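As a rough illustration of the two labeltemplate styles described above (this is a sketch, not the attached config.xml; the git URL, stage name, and params block are abbreviated guesses at the real structure):

```xml
<!-- Upstream pipeline: version is hardcoded, here via a parameter -->
<pipeline name="build-linux" labeltemplate="#{GO_VERSION}.${COUNT}-${gocd}">
  <params>
    <param name="GO_VERSION">14.3.0</param>
  </params>
  <materials>
    <git url="https://example.org/gocd.git" materialName="gocd" />
  </materials>
  <!-- stages elided -->
</pipeline>

<!-- Downstream pipeline: its label is just the upstream pipeline's label -->
<pipeline name="plugins" labeltemplate="${go}">
  <materials>
    <!-- build-linux's label propagates through this material -->
    <pipeline pipelineName="build-linux" stageName="build" materialName="go" />
  </materials>
  <!-- stages elided -->
</pipeline>
```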


There was some talk about changing the pom versions, but there was a feeling that keeping them all in sync would be too much work. I suspect the correct way is to set them to the real version, but that matters more when those JARs are being pushed to a real Maven repository. In this case, they're just used to build a WAR, and leaving them at 1.0 doesn't seem to hurt.


I'm also interested in how you deal with the scripts that are used in the pipeline. We currently have a repo for any scripts used in the pipeline and make it a material of the pipeline. The downside of this is that you have to fetch materials for every stage (usually including the source code); the upside is you get nice version control.


Mainly, the things I think would help are not a matter of whether something can be done, but more about best practices.

Yeah, most build-related scripts are checked into source control itself, either alongside the rest of the code or in build_utilities. My (personal) preference is to keep them in the main codebase, so that everyone can use the same scripts. Sometimes, because there are deploy-related scripts, they live in a build_utilities or deploy_utilities repo (in some projects). I'd try not to do that unless you really need that separation.

In Go's case, it allows the build_utilities repository to hold slightly more sensitive information: internal IPs and hosts where the artifacts get pushed around before being made available for download, for instance.
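If you do go with a separate scripts repo, as you describe, a typical setup is to add it as a second material with its own destination directory, so both checkouts land side by side in the working directory. A sketch (pipeline and repo names are illustrative, not from the attached config):

```xml
<pipeline name="my-app" labeltemplate="${COUNT}">
  <materials>
    <git url="https://example.org/my-app.git" dest="app" />
    <git url="https://example.org/build_scripts.git" dest="scripts" />
  </materials>
  <!-- stages elided; jobs can then run e.g. scripts/package.sh against app/ -->
</pipeline>
```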

Regards,
Aravind
[Attachment: config.xml]

james....@datasift.com

Nov 26, 2014, 11:46:46 AM
to go...@googlegroups.com, james....@datasift.com
Thanks for this. It's been really useful looking at the config and comparing how you do things to how we do things.

Cheers,
James.