My thoughts about gb


William Kennedy

May 4, 2015, 12:36:16 AM
to go-package...@googlegroups.com
Up until now I have been under the belief that any solution to the dependency management problem required full compatibility with all the Go tooling, including "go get". Over the past year or more, I have tried different solutions based on this belief. Even though I have been successful in creating reproducible builds with several projects, every solution has come at the price of added complexity or tedious management.

We all agree that vendoring is the right solution; the question is how to vendor. I have always set up my code within the scope of a project, and vendored dependencies within the scope of that project. I have also always rewritten the import paths for the vendored code. A project should contain all the code it needs; then, through a release branch, you can have a reproducible version of the project that can always be built, fixed and deployed independently. The biggest problem with rewriting import paths is the long path names it creates, and the mistakes that happen when performing the rename yourself. Tools today that help with this either create a tooling dependency for the project or vendor things in a way that makes the import paths even longer than they have to be. I was also never a big fan of playing GOPATH games, because if you accidentally build the code without the tool, the dependencies could also exist in your GOPATH and the build would succeed for the wrong reason.

I have always thought the "go get" tool was brilliant, and I still believe the idea of "go get" is. But I have to agree with Dave that this tool is its own worst enemy. It is the cause of all the problems we have. There is no way to fix this dependency problem and keep "go get". Solutions that wish to add a commit id or tag to the import URL would actually create an even worse situation than we have today. It is error prone and does not fix the inherent problem: a project needs to own and control its own consistent version of each dependency, period.

I think gb is our best chance to create a tool that can solve this problem for a large set of projects people are building today. The key is that we need to walk away from the existing Go tooling, including "go get". If we do, these are the initial benefits as I see them:
  1. Project Based:
    1. We can create the concept of a project.
    2. A project is a single repo that, when built, produces a binary.
    3. The project can exist anywhere on disk.
    4. GOPATH is determined by the tool based on where the project's code lives.
    5. Only code inside the project can be seen, imported and built.
  2. Vendoring:
    1. There is no longer a need to rewrite import paths.
    2. The project can use the canonical URL in import statements when referencing vendored code.
    3. No more long and convoluted import paths for vendored code.
    4. We can provide support to "pull" code from any DVCS right into the project's vendor folder.
  3. Editor Support:
    1. The tool can allow our editors to work as they do today.
    2. Building, testing, IntelliSense, etc. can all work as if nothing has changed.
One final thought. All that matters is the ability to build and test our code in a reliable, consistent and reproducible way. When the community is doing this all the same way, using the same tooling, everyone wins. Dave's thoughts and ideas on how to structure a project, manipulate the GOPATH, perform vendoring and build the code are valid even outside the gb tool itself. Once you understand that it is OK to walk away from the current Go tooling, the opportunities are endless, even if gb is not the opportunity the community wants to embrace today.

Daniel Theophanes

May 4, 2015, 10:55:00 AM
to go-package...@googlegroups.com
Hi William,

I like your analysis. I would like to note two things that I think I may differ on:
1. Unless absolutely necessary we should avoid fracturing the community. It might be necessary, but a great deal more work needs to be done before that: gb build spec, PR, tooling forks... But we should continuously evaluate if it is necessary.

2. The ultra long build paths are just an artifact of current tooling. I have considered making my vendor tool (kardianos/vendor) rewrite "golang.org/x/net/context" to just "$project/internal/context". If there were naming conflicts, it would just bump up to the next level that provided unique paths. This would be possible due to the information recorded in the vendor file and would still prevent duplicate packages. I haven't yet because I wanted to be conservative with the design initially.
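The shortening scheme described here could be sketched roughly as follows. This is illustrative logic only, not kardianos/vendor's actual code, and the function name is invented:

```go
package main

import (
	"fmt"
	"path"
	"strings"
)

// shorten assigns a canonical import path the shortest trailing
// segment under internal/ that is not already taken, bumping up one
// path level at a time on conflict. taken maps the short path back to
// the canonical path it was assigned to.
func shorten(canonical string, taken map[string]string) string {
	segs := strings.Split(canonical, "/")
	for i := len(segs) - 1; i >= 0; i-- {
		candidate := path.Join("internal", path.Join(segs[i:]...))
		if prev, ok := taken[candidate]; !ok || prev == canonical {
			taken[candidate] = canonical // remember the canonical path
			return candidate
		}
	}
	return path.Join("internal", canonical) // fall back to the full path
}

func main() {
	taken := map[string]string{}
	fmt.Println(shorten("golang.org/x/net/context", taken))  // internal/context
	fmt.Println(shorten("github.com/example/context", taken)) // internal/example/context
}
```

Because the map keeps the canonical path alongside the short one, the mapping is reversible, which is the property the vendor file is meant to preserve.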

I'm not against gb or an alternate build spec. On the contrary, I think it is good to develop multiple options. But we should carefully count the cost of forking every tool that currently uses GOPATH, and the potential for a divided community. Maybe some would upstream the changes. Which tools would require forks and which would accept upstreamed changes is an important question to put numbers behind.

The biggest issue with gb's design is not technical, but political. And that is non-trivial and important.

-Daniel

William Kennedy

May 4, 2015, 1:07:39 PM
to Daniel Theophanes, go-package...@googlegroups.com

> On May 4, 2015, at 7:55 AM, Daniel Theophanes <kard...@gmail.com> wrote:
>
> Hi William,
>
> I like your analysis. I would like to note two things that I think I may differ on:
> 1. Unless absolutely necessary we should avoid fracturing the community. It might be necessary, but a great deal more work needs to be done before that: gb build spec, PR, tooling forks... But we should continuously evaluate if it is necessary.

I agree we should always avoid fracturing of any kind when possible. That being said, once you come to the conclusion that “go get” can’t be fixed, you can logically take the next step and say we need new tooling. I believe new tooling is necessary to provide a solution that works for the general set of projects being built by the community. I like Dave’s proposal and I think it is worth exploring more.

>
> 2. The ultra long build paths are just an artifact of current tooling. I have considered making my vendor tool (kardianos/vendor) rewrite "golang.org/x/net/context" to just "$project/internal/context". If there were naming conflicts, it would just bump up to the next level that provided unique paths. This would be possible due to the information recorded in the vendor file and would still prevent duplicate packages. I haven't yet because I wanted to be conservative with the design initially.

I too have vendored on occasion by removing parts of the original URL. Some team members are OK with this; on other projects they are not. I understand the desire to keep the canonical path intact under the vendor folder.

>
> I'm not against gb or an alternate build spec. On the contrary, I think it is good to develop multiple options. But we should carefully count the cost of forking every tool that currently uses GOPATH, and the potential for a divided community. Maybe some would upstream the changes. Which tools would require forks and which would accept upstreamed changes is an important question to put numbers behind.

I don’t see this as a fork but as a new tool set. This does not mean others can't use the original tooling, and it does not interfere with the language specification at all. The fact that this is a completely new set of tooling, not trying to work with the original tooling, is what could allow it to be successful.

>
> The biggest issue with gb's design is not technical, but political. And that is non-trivial and important.

Yes, but what are we really doing if we build this new tool? We are solving a problem that was left up to us to solve. The language team has made that clear: it is on us. They have always said the community will figure this out. We have tried for several years to build new tooling on top of the existing tooling. It has not worked. So now it is time to try something new.

>
> -Daniel

Daniel Theophanes

May 4, 2015, 1:17:15 PM
to William Kennedy, go-package...@googlegroups.com
Hi William,

If these are new tools, make sure you have enough people, time, and effort to fork and keep up to date: go-vim, go-sublime, liteide, oracle, eg, gomovepkg, godebug, delve, etc., and that your tools work with gogcc, gollvm, gopherjs, etc. If these efforts aren't taken or the changes up-streamed, I personally would not be able to recommend that a coworker use this build method. So long as you have this in your cost model, then cool :).

Regarding shorter build paths: yes, it would be per project and an optional opt-in. I would implement it as a persistent option in the vendor file, initially set with the "vendor init -sort" command. I think if people got used to just looking up where a package came from with a tool or by inspecting the json file, they might be fine with the shorter path.

-Daniel

William Kennedy

May 4, 2015, 1:24:29 PM
to Daniel Theophanes, go-package...@googlegroups.com

> On May 4, 2015, at 12:17 PM, Daniel Theophanes <kard...@gmail.com> wrote:
>
> Hi William,
>
> If these are new tools, make sure you have enough people, time, and effort to fork and keep up to date: go-vim, go-sublime, liteide, oracle, eg, gomovepkg, godebug, delve, etc., and that your tools work with gogcc, gollvm, gopherjs, etc. If these efforts aren't taken or the changes up-streamed, I personally would not be able to recommend that a coworker use this build method. So long as you have this in your cost model, then cool :).

I am with you 100% and have already been talking with Dave about options to support the different editor environments. The tooling must work with the editors in an unobtrusive way; in other words, no one should experience anything different or need to change their workflow. This is definitely on the radar screen.

Dave Cheney

May 4, 2015, 2:54:33 PM
to Daniel Theophanes, William Kennedy, go-package...@googlegroups.com
Daniel,

I know you're only playing the devil's advocate, but these arguments are
dangerously close to a sunk cost fallacy. You're arguing to _not_ do
something because of a cost already spent. What have the last 3 years
of trying to work out convoluted solutions to work _with_ the go tool
bought us? Nothing, save a promise of more of the same.

gb is deliberately different because incremental change was not a
solution. You'll note this was also the manifesto of Go itself.

Thanks

Dave
> --
> You received this message because you are subscribed to the Google Groups
> "Go Package Management" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to go-package-manag...@googlegroups.com.
> To post to this group, send email to go-package...@googlegroups.com.
> For more options, visit https://groups.google.com/d/optout.

Nathan Youngman

May 5, 2015, 10:52:50 PM
to Dave Cheney, Daniel Theophanes, William Kennedy, go-package...@googlegroups.com
Hi,

Just finally watched David's Reproducible Builds talk.

I've been using godep for the past few months **without** the -r (rewrite) option.

At first glance gb doesn't seem incredibly different from that model, but maybe I'm missing something. 

With godep I end up bouncing between my global mirror of GitHub and the project space. The fact that godep falls back to the global package space has certainly bitten me when forgetting to run a godep save/update before pushing a branch.

Doing away with the global package space and just having projects sounds like an advantage in that regard.

I've yet to try out gb. Perhaps the most interesting thing is the statement (in the source code) that we can use the Go compilers without the traditional go toolchain, if that will solve problems that a wrapper like godep can't.

Nathan. 


--
Nathan Youngman 
Email: he...@nathany.com
Web: http://www.nathany.com

Nathan Youngman

May 5, 2015, 11:11:08 PM
to Dave Cheney, Daniel Theophanes, William Kennedy, go-package...@googlegroups.com

One important difference I glossed over is having a separate pkg folder per project to fix stale build issues like this one: https://github.com/golang/go/issues/10509

Nathan.

William Kennedy

May 6, 2015, 12:10:21 AM
to Nathan Youngman, Dave Cheney, Daniel Theophanes, go-package...@googlegroups.com
Nathan,

Imagine a world without "go get" where an import statement is just a location on disk where you find a package of code. It is not tied to anything else. Then imagine a folder anywhere on your disk that contains a project you are working on. It has all the source code it needs to build, test and deploy the project.

Now imagine a way to pull code from any DVCS right inside your project without the need to rewrite import paths or worry about the code being somewhere else and potentially accessible. Then imagine all import paths being relative to the project. It is referenced like "vendor/github.com/goinggo/work". Never any longer or more complicated. Building the code happens at the project level only. The project provides code containment.

When you walk away from "go get", things suddenly become smaller, simpler and more flexible. Isn't this what we always teach? I think years of trying to build tooling on top of "go get" has taught us that there will never be a tool that doesn't come with some level of warts the user needs to be willing to live with. The community will weigh these warts differently and never come to any consensus, always having to choose the lesser of different evils.

The language spec does not include the tooling. The tooling can be what we make it. Building on top of the existing tooling has only gotten us so far. Why not take what we have learned and see what we can accomplish when not bound by the constraints of "go get"?

-- Bill

Nathan Youngman

May 6, 2015, 12:46:53 AM
to William Kennedy, Dave Cheney, Daniel Theophanes, go-package...@googlegroups.com

Hey Bill,

When you walk away from "go get", things suddenly become smaller, simpler and more flexible.

That is a different way of looking at it. Dave Cheney openly admits that

gb does nothing more than GOPATH=$PROJECT/src:$PROJECT/vendor/src go build


It's not that gb does more than the go toolchain, but that it does less?

I really like how Dave managed to side-step the whole dependency metadata file that Brad Fitzpatrick was asking for and nobody could agree on.

From reading these recent threads, it sounds like some of the frustrations with Godep are due to it straddling two workflows (with and without rewrites). In the non-rewrite mode, godep also keeps the original $GOPATH (global mirror of GitHub) rather than just the project and its vendored dependencies.

Part of the reason for import rewrites in godep is to make cmd line tools go gettable (without the need for godep). Doing away with go get and that requirement is one less reason for rewrites. (I don't personally have enough experience using rewrites to like or dislike them).

What I would find most interesting, and as far as I know is still unresolved, is flattening dependencies of dependencies down to the single vendor folder (ensuring no dups). https://github.com/constabulary/gb/issues/20
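The flattening step in question might look something like this sketch. The types and function here are hypothetical, and gb's eventual answer to issue #20 may well differ:

```go
package main

import "fmt"

// Dep pins one canonical import path to a revision.
type Dep struct {
	Path     string
	Revision string
}

// flatten merges the dependency lists of each dependency into one
// vendor set, rejecting the same import path pinned at two different
// revisions (the "no dups" requirement).
func flatten(lists ...[]Dep) (map[string]string, error) {
	vendor := map[string]string{}
	for _, list := range lists {
		for _, d := range list {
			if rev, ok := vendor[d.Path]; ok && rev != d.Revision {
				return nil, fmt.Errorf("conflict: %s pinned at %s and %s", d.Path, rev, d.Revision)
			}
			vendor[d.Path] = d.Revision
		}
	}
	return vendor, nil
}

func main() {
	a := []Dep{{"golang.org/x/net/context", "abc123"}}
	b := []Dep{{"golang.org/x/net/context", "abc123"}, {"github.com/lib/pq", "def456"}}
	v, err := flatten(a, b)
	fmt.Println(len(v), err) // 2 <nil>
}
```

The hard part the sketch dodges is what to do on conflict: error out, pick the newer revision, or ask the user — which is exactly what the linked issue is about.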

Nathan.

Nicolas Grilly

May 14, 2015, 12:24:36 PM
to go-package...@googlegroups.com, kard...@gmail.com, he...@nathany.com, da...@cheney.net
Hi William,


On Wednesday, May 6, 2015 at 6:10:21 AM UTC+2, William Kennedy wrote:
Now imagine a way to pull code from any DVCS right inside your project without the need to rewrite import paths or worry about the code being somewhere else and potentially accessible. Then imagine all import paths being relative to the project. It is referenced like "vendor/github.com/goinggo/work".

You wrote "imagine a way to pull code [...] without the need to rewrite import paths", but later you wrote "it is referenced like vendor/github.com/goinggo/work". I don't understand your point here. Adding the prefix "vendor" is import path rewriting.

William Kennedy

May 14, 2015, 12:44:55 PM
to Nicolas Grilly, go-package...@googlegroups.com, kard...@gmail.com, he...@nathany.com, da...@cheney.net
Thanks to gb, the import statement for the vendored code inside the project uses the canonical path. There is no need to rewrite the import paths of the vendored code.

Nicolas Grilly

May 14, 2015, 12:58:06 PM
to William Kennedy, go-package...@googlegroups.com, kard...@gmail.com, he...@nathany.com, da...@cheney.net
On Thu, May 14, 2015 at 6:44 PM, William Kennedy <bi...@thekennedyclan.net> wrote:
Thanks to gb, the import statement for the vendored code inside the project uses the canonical path. There is no need to rewrite the import paths of the vendored code.

Agreed. We can avoid import path rewriting by using gb, or by doing something similar to gb manually.

Daniel Theophanes

May 15, 2015, 12:08:32 AM
to go-package...@googlegroups.com, kard...@gmail.com, bi...@thekennedyclan.net
Hi Nathan,

I have two comments regarding this:
 
I really like how Dave managed to side-step the whole dependency metadata file that Brad Fitzpatrick was asking for and nobody could agree on.

1. The majority of commentators on that thread didn't actually appear to want to do import path rewriting. They may have had good intentions, but the result was one big troll.

2. There are a few people who are vendor tool authors working together to get a spec in place. I think it will happen.

One of the coolest things about Go is that it doesn't need a project file to build or analyze code. The next biggest thing is that package main can live anywhere. In fact, LiteIDE used to have a project file; however, I think it was a great improvement when it was removed. I love how the LiteIDE author has a set of working folders, and you can set an arbitrary package to build regardless of what you are editing at the time. It has some nice features. I use it with package rewriting and it works great.

Package rewriting can also be super easy and reversible; all you have to keep track of is the canonical path and tools can do really intelligent things when combined with GOPATH.

I'm not trying to convince you not to use gb, just not to give up on GOPATH just yet :). I think it has many good properties.
Daniel

William Kennedy

May 15, 2015, 12:55:52 AM
to Daniel Theophanes, go-package...@googlegroups.com
Daniel,

Many of the editor plugins today require a single GOPATH to work. This restriction really hampers the flexibility to set a GOPATH for each project and to move around to different projects under a single configuration. One of the things Dave’s solution does is remove this issue, since the tool figures out the GOPATH on the fly and is still backwards compatible with the existing GOPATH structure. Basically, gb and non-gb projects can co-exist, and all the editors work.

I have thought about the Go tool being able to locate a vendor folder within the scope of the GOPATH for code you are building as well. This would allow you to vendor code without the need to rewrite import paths. But, you still need to walk away from “go get” since the import paths in the non-vendored code would contain the canonical path and “go get” will bring down that code into your GOPATH as before. I just can’t find a way to vendor code without rewriting import paths and keep “go get”.

This idea of a vendor file really has me confused. If the idea is to list the commit or tag you want “go get” to fetch, and that code is still being placed inside the GOPATH as it works today, I’m not sure how this solves the problem. If “go get” via this vendor file is going to vendor the code based on the location of the vendor file, then the GOPATH still needs to be adjusted on the fly when building, or the imports need to be rewritten.

I would like to know what I am missing when it comes to the use and mechanics of this vendor file.

— Bill

Daniel Theophanes

May 15, 2015, 9:24:44 AM
to William Kennedy, go-package...@googlegroups.com
Hi Bill,

I'm hearing you say:
1. Some of the existing code helpers only work when GOPATH=PATHA, but don't work with GOPATH=PATHA:PATHB.
2. You'd like to set a separate GOPATH per project.
3. Import path rewriting doesn't work with "go get".
4. You don't understand the role of a vendor file. How will that help anything? Won't the GOPATH still need adjustment on the fly?

If I misinterpreted your questions let me know. Otherwise I'll work on answering them.
1. Every tool I've used works with a combined GOPATH *if* I'm using it correctly. Making a separate GOPATH entry per project is bound to result in problems. If a plugin doesn't actually handle multiple GOPATH entries correctly, I'm sure they would welcome a PR.

2. A number of version pinning tools require a separate GOPATH per project. This is very similar to how Python and Ruby work, I believe. I would consider this a hack and cannot recommend it for Go.

3. The tool "go get" works very well for what it does. It was never meant to be a replacement for a legal and technical department to decide what code can be used by a company. My particular vendor tool expects someone to arrange the version they want to use on disk; it then pulls from disk, not the network, and notes the version. In a later version, if I add a network fetch/update option, I can do a shallow fetch at that revision into a temporary folder, rewrite the import paths, and copy it into the final location. There is goodwill to modify "go get" to support versions. None of this requires modifying GOPATH.

4. Import path rewriting by default loses information. It loses repository information and it loses the canonical path information. There are two approaches: a) always keep the entire repository of all dependencies and never move the location of the dependencies so nothing ever gets lost, b) write down what you need to know.

...

I'd like you to consider the following three methods of "taking in code". This of course omits legal and technical reviews, so insert them as needed:
A. The company you work for uses an internal repository of all external code, including Go code. When code is "taken in", it is rewritten under an internal name space. All projects never reference github.com, they reference the internal name space. 

B. The company you work for uses a shared location for external code that is used by some of their projects. When code is "taken in" it is rewritten under an internal name space. A portion of projects in that department all reference that code.

C. The company you work for stores dependencies with each project. When code is "taken in" it is rewritten under an internal project name space and that project will use that code.

Option A has the least trouble from legal and technical reviews; all projects use the same dependencies. But Option C is also an option if the need arises. This is made possible with import path rewriting. The vendor file just notes relevant information so it can be effectively managed by tools.

I will note gb supports C and maybe B. Side note: could gb compile Go itself? What would it take to compile Go with gb? The Go project uses the "go" tool to compile Go (after a bootstrap). I don't see technical issues with import path rewriting. In fact, when taking in code, the process of copying is easy and 100% mechanical (and, if the canonical path is written down, reversible). Technical and legal review should be your biggest concern.

Daniel

Nicolas Grilly

May 15, 2015, 3:59:20 PM
to Daniel Theophanes, go-package...@googlegroups.com, William Kennedy
Hi Daniel!

On Fri, May 15, 2015 at 6:08 AM, Daniel Theophanes <kard...@gmail.com> wrote:
One of the coolest things about Go is that it doesn't need a project file to build or analyze code. The next biggest thing is that package main can live anywhere. In fact, LiteIDE used to have a project file; however, I think it was a great improvement when it was removed.

Why do you think that not needing a project file is a good thing?

In Go, it's true that you don't need to create a "project file" at the root of each project, but you need to set up your GOPATH and put all your projects into it. I would say that the "configuration overhead" of both solutions is relatively similar.

There is the same dichotomy in desktop applications: some are able to open and save files anywhere in the file system, others are designed to store all their files in a dedicated directory or a dedicated database (this latter design is typical of Mac applications that store their files in iCloud). Go reminds me of the second approach. The question is: what are the pros and cons of each approach?
 
Package rewriting can also be super easy and reversible; all you have to keep track of is the canonical path and tools can do really intelligent things when combined with GOPATH.

You are progressively convincing me that rewriting import paths is less harmful than I initially thought, because it's fully reversible :-)

I've been thinking of another advantage of import path rewriting and "centralizing" all projects in one global GOPATH: we could imagine a tool that checks out the latest version of a package in its canonical path and compares its API with the vendored version of the package using the rewritten path. This becomes possible because both versions coexist in the same GOPATH, one with its canonical path and the other with the rewritten path. It could be very powerful to use such a tool before upgrading a dependency. But I'm just brainstorming here, and maybe this is a silly idea, or maybe this is overkill :-)

Nicolas Grilly

May 15, 2015, 5:04:03 PM
to William Kennedy, Daniel Theophanes, go-package...@googlegroups.com
Hi William,

On Fri, May 15, 2015 at 6:55 AM, William Kennedy <bi...@thekennedyclan.net> wrote:
I just can’t find a way to vendor code without rewriting import paths and keep “go get”.

Most people agree that staying compatible with `go get` is not mandatory. Even members of the Go team have made it clear that `go get` should be considered a commodity, a tool to bootstrap a project or to ease the learning of Go, but not an industrial strength tool.

My point is that a strong distinction should be made between `go get` and the `go` tool as a whole. Divorcing from `go get` is an easy decision. Divorcing from the whole `go` tool requires more thought (and may be necessary or not -- that's the whole point of `gb`).
 
This idea of a vendor file really has me confused. If the idea is to list the commit or tag you want “go get” to fetch, and that code is still being placed inside the GOPATH as it works today, I’m not sure how this solves the problem. If “go get” via this vendor file is going to vendor the code, based on the location of the vendor file, then the GOPATH still needs to adjusted on the fly when building or the imports needs to be re-written.

The vendor file is not used at build time. Its role is just to memorize the source and the revision (and other similar metadata) of vendored dependencies. A vendor file is really useful, I'd even say necessary, when dependencies are vendored, with or without import rewriting.

Nicolas Grilly

May 15, 2015, 5:18:12 PM
to Daniel Theophanes, William Kennedy, go-package...@googlegroups.com
On Fri, May 15, 2015 at 3:24 PM, Daniel Theophanes <kard...@gmail.com> wrote:
2. A number of version pinning tools require a separate GOPATH per project. This is very similar to how Python and Ruby work, I believe.

Yes, I confirm this approach is similar to what most Python and Ruby developers do.
 
I would consider this a hack and cannot recommend it for Go.

Why do you consider this a hack???

3. [...] None of this requires modifying GOPATH.

Well said :-)
 
4. Import path rewriting by default loses information. It loses repository information and it loses the canonical path information. There are two approaches: a) always keep the entire repository of all dependencies and never move the location of the dependencies so nothing ever gets lost, b) write down what you need to know.

Losing repository information is not specific to import path rewriting; it is a characteristic of vendoring in general, with or without import path rewriting.
 

I'd like you to consider the following three methods of "taking in code". This of course omits legal and technical reviews, so insert them as needed:
A. The company you work for uses an internal repository of all external code, including Go code. When code is "taken in", it is rewritten under an internal name space. All projects never reference github.com, they reference the internal name space. 

B. The company you work for uses a shared location for external code that is used by some of their projects. When code is "taken in" it is rewritten under an internal name space. A portion of projects in that department all reference that code.

C. The company you work for stores dependencies with each project. When code is "taken in" it is rewritten under an internal project name space and that project will use that code.

Option A has the least trouble from legal and technical reviews; all projects use the same dependencies. But Option C is also an option if the need arises. This is made possible with import path rewriting. The vendor file just notes relevant information so it can be effectively managed by tools.

I will note gb supports C and maybe B. Side note: could gb compile Go itself? What would it take to compile Go with gb? The Go project uses the "go" tool to compile Go (after a bootstrap). I don't see technical issues with import path rewriting. In fact, when taking in code, the process of copying is easy and 100% mechanical (and, if the canonical path is written down, reversible). Technical and legal review should be your biggest concern.

Could you give examples of import path for A, B and C, to clarify?

Daniel Theophanes

May 15, 2015, 5:51:17 PM
to Nicolas Grilly, William Kennedy, go-package...@googlegroups.com
Hi Nicolas,

From your previous email:
I consider not needing a project file a good thing because I can just create a folder and then a file anywhere in the GOPATH and I'm coding. Done. My other foot is in Java and C#, where devs are constantly stepping on toes in the project file, and before you do a new thing, you have to create a new project.

Also from your previous email: I think your idea of a tool command that compares the vendor copy with the canonical-path copy is nifty; I'd be for it.


I would consider this a hack and cannot recommend it for Go.

Why do you consider this a hack???


I've used a single (or near single) GOPATH for years. It just works. It is how it was designed. Kinda like using car keys to open a bottle: it works, but you are more liable to cut yourself.
 
Could you give examples of import path for A, B and C, to clarify?

A) This is what Google does, as do a number of other companies. You might rewrite the context package from "golang.org/x/net/context" and store it under "$GOPATH/src/third_party/context" or maybe "$GOPATH/src/third_party/golang.org/x/net/context".

B) Let's take an open source project, say coreos and related tools, as an example. Say they create a repo "github.com/coreos/third_party/{context,html,...}". Then the packages "github.com/coreos/etcd" and "github.com/coreos/rkt" both reference packages from the "third_party" repo.

C) This is what "github.com/coreos/etcd" does today. The vendored packages for that command are stored within the same repo. This is what gb supports.

-Daniel


Dave Cheney

May 15, 2015, 6:26:34 PM
to Nicolas Grilly, William Kennedy, Daniel Theophanes, go-package...@googlegroups.com
> My point is that a strong distinction should be made between `go get` and
> the `go` tool as a whole. Divorcing from `go get` is an easy decision.
> Divorcing from the whole `go` tool requires more thoughts (and may be
> necessary or not -- that's the whole point of `gb`).

Could you give some specifics of what moving away from the "whole go
tool" means to you and why it requires more thought ? This is a
serious question, I've been asked a similar one by several people, but
they have not been able to give me specifics about their concerns. Is
your concern, for example

- moving away from the go tool means moving away from GOPATH, and the
documentation and training implicit in it.
- moving away from the go tool means changing the way that Go programs
are compiled.
- moving away from the go tool means losing access to tools which
consume Go code.
- moving away from the go tool means losing access to IDEs that
understand GOPATH
- moving away from the go tool means starting a turf war for mindshare
between the gbites and gophers.

These are some of the objections I have _inferred_ so far, I would
love to have them made concrete.

William Kennedy

unread,
May 15, 2015, 11:19:22 PM5/15/15
to Daniel Theophanes, go-package...@googlegroups.com
Yes, I think you did but your response was helpful.


On May 15, 2015, at 9:24 AM, Daniel Theophanes <kard...@gmail.com> wrote:

Hi Bill,

I'm hearing you say:
1. Some of the existing code helpers only work when GOPATH=PATHA, but don't work with GOPATH=PATHA:PATHB.

From what I have seen from the Go plugins in editors like Sublime, I can’t change the value of GOPATH dynamically from within the editor. So using a single GOPATH is the easiest way to go, especially when you need to jump around to different projects.

2. You'd like to set a separate GOPATH per project.

If there was an intuitive way to do this, yes. I think “gb” does this brilliantly.

3. Import path rewriting doesn't work with "go get”.

Rewriting import paths is compatible with “go get”. It is one of the reasons we need to do it.

4. You don't understand the role of a vendor file. How will that help anything? Won't the GOPATH still need adjustment on the fly?

No, I don’t understand two things here:

1) The role of the vendor file as it relates to how “go get” could use it in a safe, consistent and reproducible way.

2a) Why people feel the need to have some record of the code that has been vendored. To me, once a piece of code has been vendored, the source of truth for that code now exists in the project. It is irrelevant at that point what commit id or tag from the canonical repo it was pulled from.

2b) I also don’t believe re-writing the import path for the vendored code alters the code in any meaningful way. The import path is a directive that tells the compiler where the code for the package is; since the source of truth now exists inside the project, that is the code I want.

When you don’t vendor, the source of truth still exists at the canonical path. Having “go get” potentially pull different “versions” of the code using this same import path is the whole reason we are in this mess.


If I misinterpreted your questions let me know. Otherwise I'll work on answering them.
1. Every tool I've used works with a combined GOPATH *if* I'm using it correctly. Making a separate GOPATH entry per project is bound to result in problems. If a plugin doesn't actually handle multiple GOPATH entries correctly, I'm sure they would welcome a PR.

“gb” is the only tool I have seen to solve this problem because it basically removes GOPATH from the equation. It is no longer a point of configuration, therefore no longer an issue or stumbling block.


2. A number of version pinning tools require a separate GOPATH per project. This is very similar to how python and ruby work I believe. I would consider this a hack and cannot recommend it for Go.

Agree.


3. The tool "go get" works very well for what it does. It was never meant to be a replacement for a legal and technical department deciding what code can be used by a company. My particular vendor tool expects someone to arrange the version they want to use on disk; then it pulls it from disk, not the network, and notes the version. In a later version, if I add a network fetch update option, I can do a shallow fetch at that revision into a temporary folder, rewrite the import paths, and copy to the final location. There is good will to modify "go get" to support versions. None of this requires modifying GOPATH.

_The following is not talking about your tool specifically, just in general for this type of solution._

However, this type of solution attempts to create a project with vendored code that still supports “go get”. This need to support a pre-configured GOPATH and “go get” means the tooling has to make decisions around the lesser of different evils. It is not really solving the problem, but working around the flaws of GOPATH and “go get”.


4. Import path rewriting by default loses information. It loses repository information and it loses the canonical path information. There are two approaches: a) always keep the entire repository of all dependencies and never move the location of the dependencies so nothing ever gets lost, b) write down what you need to know.

This is where I disagree with you and go back to my comments above about where the source of truth now is for this vendored code.


...

I'd like you to consider the following three methods of "taking in code". This of course omits legal and technical reviews, so insert them as needed:
A. The company you work for uses an internal repository of all external code, including Go code. When code is "taken in", it is rewritten under an internal name space. All projects never reference github.com, they reference the internal name space. 

This just moves the problem from one repo to another as it relates to having reproducible builds.


B. The company you work for uses location for external code that is used by some of their projects. When code is "taken in" it is rewritten under an internal name space. A portion of projects in that department all reference that code.

Again, the code at the internal name space can change every time you pull it down without you knowing about it.


C. The company you work for stores dependencies with each project. When code is "taken in" it is rewritten under an internal project name space and that project will use that code.

This is the only solution that provides reproducible builds for the binary that is being deployed.


Option A has the least trouble from legal and technical reviews; all projects use the same dependencies. But Option C is also an option if the need arises. This is made possible with import path rewriting. The vendor file just notes relevant information so it can be effectively managed by tools.

There is nothing to manage.


I will note gb supports C and maybe B.

Side note, could gb compile Go itself? What would it take to compile Go with gb?

It is not trying to solve that problem and does not need to.

The Go project uses the "go" tool to compile Go (after a bootstrap). I don't see technical issues with import path rewriting. In fact, when taking in code, the process of copying it is easy and 100% mechanical (and, if the canonical path is written down, reversible). Technical and legal review should be your biggest concern.

Reproducible builds right now should be the biggest concern. Companies today need to be focused on this and not feel they need to rely on “go get” or be 100% compliant with the Go tooling to make that happen.

Unfortunately, each company, project and team needs to come together and ask what works for them and their needs, choosing the lesser of different evils. I think what “gb” is attempting to do is make that decision consistent across many teams deploying code written in Go. Maybe at some point, that meeting never needs to happen again.

— Bill

Peter Bourgon

unread,
May 16, 2015, 3:13:40 AM5/16/15
to William Kennedy, Daniel Theophanes, go-package...@googlegroups.com
>> 2. A number of version pinning tools require a separate GOPATH per project.
>> This is very similar to how python and ruby work I believe. I would consider
>> this a hack and cannot recommend it for Go.
>
> Agree.

After watching these discussions for a very long time, it is
increasingly becoming my opinion that this might be the least bad
option. Maybe it's worth revisiting these categorical dismissals?

Dave Cheney

unread,
May 16, 2015, 3:31:34 AM5/16/15
to Peter Bourgon, William Kennedy, Daniel Theophanes, go-package...@googlegroups.com


On 16 May 2015 5:13 pm, "Peter Bourgon" <pe...@bourgon.org> wrote:
>
> >> 2. A number of version pinning tools require a separate GOPATH per project.
> >> This is very similar to how python and ruby work I believe. I would consider
> >> this a hack and cannot recommend it for Go.
> >
> > Agree.
>
> After watching these discussions for a very long time, it is
> increasingly becoming my opinion that this might be the least bad
> option. Maybe it's worth revisiting these categorical dismissals?
>

I'm sorry, I've completely lost context here, so I don't know which position you are for. Could you clarify, please?

Peter Bourgon

unread,
May 16, 2015, 3:35:06 AM5/16/15
to Dave Cheney, William Kennedy, Daniel Theophanes, go-package...@googlegroups.com
On Sat, May 16, 2015 at 8:31 AM, Dave Cheney <da...@cheney.net> wrote:
>
> On 16 May 2015 5:13 pm, "Peter Bourgon" <pe...@bourgon.org> wrote:
>>
>> >> 2. A number of version pinning tools require a separate GOPATH per
>> >> project.
>> >> This is very similar to how python and ruby work I believe. I would
>> >> consider
>> >> this a hack and cannot recommend it for Go.
>> >
>> > Agree.
>>
>> After watching these discussions for a very long time, it is
>> increasingly becoming my opinion that this might be the least bad
>> option. Maybe it's worth revisiting these categorical dismissals?
>
> I'm sorry, I've completely lost context here, so I don't know which position
> you are for. Could you clarify please.

It is increasingly becoming my opinion that a separate GOPATH per
project (or similar) might be the least bad option.

Dave Cheney

unread,
May 16, 2015, 3:41:35 AM5/16/15
to Peter Bourgon, William Kennedy, Daniel Theophanes, go-package...@googlegroups.com

Thanks for clarifying. My next question is obviously, do you categorise gb under the GOPATH per project category?

Peter Bourgon

unread,
May 16, 2015, 3:43:04 AM5/16/15
to Dave Cheney, William Kennedy, Daniel Theophanes, go-package...@googlegroups.com
On Sat, May 16, 2015 at 8:41 AM, Dave Cheney <da...@cheney.net> wrote:
> Thanks for clarifying. My next question is obviously, do you categorise gb
> under the GOPATH per project category?

Yes, basically.

chris....@gmail.com

unread,
May 16, 2015, 2:14:26 PM5/16/15
to go-package...@googlegroups.com, kard...@gmail.com
On Friday, May 15, 2015 at 11:19:22 PM UTC-4, William Kennedy wrote:
> 2a) Why people feel the need to have some record of the code that has been vendored. To me, once a piece of code has been vendored, the source of truth for that code now exists in the project. It is irrelevant at that point what commit id or tag from the canonical repo it was pulled from.

Yes, the vendored code is the source of truth for purposes of reproducible builds, but there are other use cases to consider.

As a project matures there usually comes a time when you need to update the vendored code to incorporate bug fixes or performance enhancements made upstream. A record of which revision of the vendored code is currently in use by a project can help project maintainers determine what has changed in the upstream since the code was vendored and whether an update is required. It may also help a tool automatically apply the update in a consistent manner.

Another use case arises when a bug in the vendored code is discovered by the project maintainers. Knowing which revision they have can help them accurately report the bug upstream. It may also help when sending a pull request upstream if they make a local fix and want to be good citizens.

I realize that these use cases are not currently within the scope of gb, but gb-vendor should at least pay attention to these use cases and help minimize friction when these situations arise.

Chris

William Kennedy

unread,
May 16, 2015, 2:19:10 PM5/16/15
to chris....@gmail.com, go-package...@googlegroups.com, kard...@gmail.com
Excellent. I never considered these scenarios. You have given me something to think about. Thanks!!

Nicolas Grilly

unread,
May 18, 2015, 1:14:08 PM5/18/15
to Daniel Theophanes, William Kennedy, go-package...@googlegroups.com
Hi Daniel,

On Fri, May 15, 2015 at 11:51 PM, Daniel Theophanes <kard...@gmail.com> wrote:
I consider not needing a project file a good thing because I can just create a folder and then a file anywhere in the GOPATH and I'm coding. Done. My other foot is in Java, C# where devs are constantly stepping on toes in the project file and before you do a new thing, you have to create a new project.

I agree. It's a big advantage. I always disliked project files (same experience as yours in Java).

But to be fair, it's possible to implement a per-project GOPATH without any project file. This is what `gb` does. But you have to create some intermediate directories to respect the `gb` layout, which is equivalent to a project file in some way.

I've used a single (or near single) GOPATH for years. It just works. It is how it was designed. Kinda like using car keys to open a bottle, it works, but you are more liable to cut yourself.

Ok, this is not a fully rational argument, but now I understand what you meant by "hack" :-)
 

Could you give examples of import path for A, B and C, to clarify?

A) This is what Google does and a number of other companies. You might rewrite the context package from "golang.org/x/net/context" and store it under "$GOPATH/src/third_party/context" or maybe "$GOPATH/src/third_party/golang.org/x/net/context".

B) Let's take an open source project, say coreos and related tools and use that as an example. Say they create a repo "github.com/coreos/third_party/{context,html,...}"
Then the packages "github.com/coreos/etcd" and "github.com/coreos/rkt" both reference packages from the "third_party" repo.

C) This is what "github.com/coreos/etcd" does today. The vendored packages for that command are stored within that same repo. This is what gb supports.


I think I get it now. Let me rephrase to make sure I understand:

A) The company uses a single monolithic company-wide repository which contains everything, including the vendored dependencies. (This is what I do by the way)

B) The company uses one repository for each app and a "shared" repository containing all the vendored dependencies.

C) The company uses one repository for each app, which includes the vendored dependencies of the app.

Correct? ;-)

Nicolas Grilly

unread,
May 18, 2015, 1:41:52 PM5/18/15
to Dave Cheney, William Kennedy, Daniel Theophanes, go-package...@googlegroups.com
Hi Dave,

You've quite well inferred my concerns :)

My answers follow.

On Sat, May 16, 2015 at 12:26 AM, Dave Cheney <da...@cheney.net> wrote:
> My point is that a strong distinction should be made between `go get` and
> the `go` tool as a whole. Divorcing from `go get` is an easy decision.
> Divorcing from the whole `go` tool requires more thoughts (and may be
> necessary or not -- that's the whole point of `gb`).

Could you give some specifics of what moving away from the "whole go
tool" means to you and why it requires more thought ? This is a
serious question, I've been asked a similar one by several people, but
they have not been able to give me specifics about their concerns. Is
your concern, for example

- moving away from the go tool means moving away from GOPATH, and the
documentation and training implicit in it.

No, it's not a big issue in my opinion. `gb` looks simple to understand and use. The learning curve should be ok.
 
- moving away from the go tool means changing the way that Go programs
are compiled.

I'm not sure I understand, but if you mean replacing one command based on `go build/install` with another command based on `gb build`, then I don't think this is an issue.
 
- moving away from the go tool means losing access to tools which
consume Go code.

Yes, I think this one is an issue. A lot of tools rely on GOPATH. This is where the documentation and the training can become an issue because people will need to use these other tools. Maybe the transition can be smoothed by adding a command `gb exec COMMAND COMMAND_ARG...` that runs COMMAND with an environment containing the correct GOPATH?
 
- moving away from the go tool means losing access to IDEs that
understand GOPATH

Yes, I think this is similar to the previous concern.
 
- moving away from the go tool means starting a turf war for mindshare
between the gbites and gophers.

Yes, this is something that bothers me a little. To be accurate, I'm not afraid of a "war". I'm afraid of the duplicated effort. I'm afraid of the incompatibilities. I'm afraid of the third-party tools that will work with one toolset but not the other. I'm afraid of the subtle differences in behavior between both toolsets. I'm afraid of the `go` tool gaining new features and `gb` playing catchup. I'm afraid of the code base of `gb` growing day after day and the project needing more human resources to be sustainable. But to be honest, I have no rationale to back up this fear. Maybe it will be okay :)

Nicolas Grilly

unread,
May 18, 2015, 1:48:21 PM5/18/15
to William Kennedy, Daniel Theophanes, go-package...@googlegroups.com
On Sat, May 16, 2015 at 5:19 AM, William Kennedy <bi...@thekennedyclan.net> wrote:
2b) I also don’t believe re-writing the import path for the vendored code, alters the code in any way. The import path is a directive to tell the compiler where the code for the package is, since the source of truth now exists inside the project, that is the code I want.

Yes, but if $GOPATH/src/mycorp/vendor/github.com/somedev/somelib depends on $GOPATH/src/mycorp/vendor/github.com/otherdev/otherlib, then you have to rewrite the import path in github.com/somedev/somelib from:

    import "github.com/otherdev/otherlib"

to:

    import "mycorp/vendor/github.com/otherdev/otherlib"

So, it definitely alters the code in some way. But as Daniel said in this discussion, this alteration is fully deterministic and reversible.

For those that don't want to store an "altered" dependency, another approach is to vendor the dependency as-is, without any modification, without any import path rewriting, and use a Makefile that copies and rewrites import paths at build time, just before calling `go build`.

Nicolas Grilly

unread,
May 18, 2015, 1:53:20 PM5/18/15
to William Kennedy, Daniel Theophanes, go-package...@googlegroups.com
On Sat, May 16, 2015 at 5:19 AM, William Kennedy <bi...@thekennedyclan.net> wrote:
“gb” is the only tool I have seen to solve this problem because it basically removes GOPATH from the equation. It is no longer a point of configuration, therefore no longer an issue or stumbling block.

Yes, but instead of relying on GOPATH, `gb` relies on the current working directory or the -R flag. This is another point of configuration. I'd call that a draw. `gb` also relies on respecting a specific directory structure.

Daniel Theophanes

unread,
May 18, 2015, 3:37:43 PM5/18/15
to Nicolas Grilly, William Kennedy, go-package...@googlegroups.com

Ok, this is not a fully rational argument, but now I understand what you meant by "hack" :-)

Fair enough :) 

I think I get it now. Let me rephrase to make sure I understand:

A) The company uses a single monolithic company-wide repository which contains everything, including the vendored dependencies. (This is what I do by the way)

B) The company uses one repository for each app and a "shared" repository containing all the vendored dependencies.

C) The company uses one repository for each, which includes the vendored dependencies of the app.

Correct? ;-)

Exactly. Those are the different places people can store dependencies. I haven't seen (B) much, but I have seen (A) and (C) a lot. If you support (A), then you'll also support (B). I think any end-game solution should support all three.
