the go command


Russ Cox

unread,
Feb 7, 2012, 11:38:12 AM2/7/12
to golang-nuts
Recent weekly snapshots have a new command, called “go”, that
automates the downloading, building, installation, and testing of Go
packages and commands. It replaces both goinstall and make. I
thought it would help to say a little bit about why we wrote a new
command, what it is, what it's not, and how to use it.

Motivation

You might have seen early Go talks in which Rob jokes that the idea
for Go arose while waiting for a large Google server to compile. That
really was the motivation for Go: to build a language that worked well
for building the large software that Google writes and runs. It was
clear from the start that such a language must provide a way to
express dependencies between code libraries clearly, hence the package
grouping and the explicit import blocks. It was also clear from the
start that you might want arbitrary syntax for describing the code
being imported; this is why import paths are string literals.

An explicit goal for Go from the beginning was to be able to build Go
code using only the information found in the source itself, not
needing to write a makefile or one of the many modern replacements for
makefiles. If Go needed a configuration file to explain how to build
your program, then Go would have failed.

At first, there was no Go compiler, and the initial development
focused on building one and then building libraries for it. For
expedience, we postponed the automation of building Go code by using
make and writing makefiles. When compiling a single package involved
multiple invocations of the Go compiler, we even used a program to
write the makefiles for us. You can find it if you dig through the
repository history.

The purpose of the new go command is our return to this ideal, that Go
programs should compile without configuration or additional effort on
the part of the developer beyond writing the necessary import
statements.

Configuration versus convention

The way to achieve the simplicity of a configuration-free system is to
establish conventions. The system works only to the extent that the
convention is followed. When we first launched Go, many people
published packages that had to be installed in certain places, under
certain names, using certain build tools, in order to be used. That's
understandable: that's the way it works in most other languages. Over
the last few years we have consistently reminded people about the
goinstall command and its conventions: first, that the import path is
derived in a known way from the URL of the source code; second, that
the place to store the sources in the local file system is
derived in a known way from the import path; third, that each
directory in a source tree corresponds to a single package; and
fourth, that the package is built using only information in the source
code. Today, the vast majority of packages I see mentioned are
'goinstallable': they follow the conventions. The Go ecosystem is
simpler and more powerful for it.

We received many requests to allow a makefile in a package directory
to override goinstall's defaults, to provide just a little extra
configuration beyond what's in the source code. But that would have
introduced new rules. Because we did not accede to such requests, we
were able to write the new go command and eliminate our use of make or
any other build system.

It is important to understand that the go command is not a general
build tool. It cannot be configured and it does not attempt to build
anything but Go packages. These are important simplifying
assumptions: they simplify not only the implementation but also, more
important, the use of the tool itself.

Go's conventions

The go command retains the conventions established by the goinstall command.

First, the import path is derived in a known way from the URL of the
source code. For Bitbucket, GitHub, Google Code, and Launchpad, the
root directory of the repository is identified by the repository's
main URL, without the http:// prefix. Subdirectories are named by
adding to that path. For example, the supplemental networking
libraries for Go are obtained by running

hg clone http://code.google.com/p/go.net

and thus the import path for the root directory of that repository is
"code.google.com/p/go.net". The websocket package is stored in a
subdirectory, so its import path is
"code.google.com/p/go.net/websocket".

These paths are on the long side, but in exchange we get an
automatically managed name space for import paths and the ability for
a tool like the go command to look at an unfamiliar import path and
deduce where to obtain the source code.

Second, the place to store sources in the local file system is derived
in a known way from the import path. Specifically, the first choice
is $GOPATH/src/<import-path>. If $GOPATH is unset, the go command
will fall back to storing source code alongside the standard Go
packages, in $GOROOT/src/pkg/<import-path>. If $GOPATH is set to a
list of paths, the go command tries <dir>/src/<import-path> for each
of the directories in that list.

Each of those trees contains, by convention, a top-level directory
named "bin", for holding compiled executables, and a top-level
directory named "pkg", for holding compiled packages that can be
imported, and the "src" directory, for holding package source files.
Imposing this structure lets us keep each of these directory trees
self-contained: the compiled form and the sources are always near each
other.
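
As a rough sketch (the package path here is just the codesearch example
used later; on a Linux/amd64 machine the compiled packages land under an
OS-and-architecture subdirectory), a populated tree looks something like:

$GOPATH/
    bin/                                         compiled executables
    pkg/linux_amd64/
        code.google.com/p/codesearch/index.a     a compiled, importable package
    src/
        code.google.com/p/codesearch/index/      the package's source files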

These naming conventions also let us work in the reverse direction,
from a directory name to its import path. This mapping is important
for many of the go command's subcommands, as we'll see below.

Third, each directory in a source tree corresponds to a single
package. By restricting a directory to a single package, we don't have
to create hybrid import paths that specify first the directory and
then the package within that directory. Also, most file management
tools and UIs work on directories as fundamental units. Tying the
fundamental Go unit - the package - to file system structure means
that file system tools become Go package tools. Copying, moving, or
deleting a package corresponds to copying, moving, or deleting a
directory.
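
As a minimal sketch (the import path and file names here are made up),
every Go file in a directory declares the same package, and the directory
alone identifies the importable unit:

// $GOPATH/src/example.com/you/stack/stack.go
package stack

// Stack is a simple LIFO of ints.
type Stack struct{ items []int }

// Push adds n to the top of the stack.
func (s *Stack) Push(n int) { s.items = append(s.items, n) }

// $GOPATH/src/example.com/you/stack/pop.go: a second file in the same
// directory, and therefore necessarily part of the same package "stack".
package stack

// Pop removes and returns the top element, reporting whether one existed.
func (s *Stack) Pop() (int, bool) {
	if len(s.items) == 0 {
		return 0, false
	}
	n := s.items[len(s.items)-1]
	s.items = s.items[:len(s.items)-1]
	return n, true
}

Copying or moving the stack directory within a src tree copies or moves
the package; there is no separate metadata to update.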

Fourth, each package is built using only the information present in
the source files. This makes it much more likely that the tool will
be able to adapt to changing build environments and conditions. For
example, if we allowed extra configuration like compiler flags or
command line recipes, then that configuration would need to be updated
each time the build tools changed; it would also be inherently tied
to the use of a specific toolchain.
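
One concrete example of build information carried by the source itself (a
sketch; the precise rules are documented with the go/build package):
platform-specific code is selected by file naming, so a package can
provide per-OS implementations without any makefile:

// lock_linux.go in some package: compiled only when GOOS=linux.
package lockfile

const defaultPath = "/var/lock/example.lock"

// lock_windows.go in the same directory: compiled only when GOOS=windows.
package lockfile

const defaultPath = `C:\Temp\example.lock`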

Getting started with the go command

Finally, a quick tour of how to use the go command. Assuming you want
to keep your source code separate from the Go distribution source
tree, the first step is to set $GOPATH, the one piece of global
configuration that the go command needs. The $GOPATH can be a list of
directories, but by far the most common usage should be to set it to a
single directory. In particular, you do not need a separate entry in
$GOPATH for each of your projects. One $GOPATH can support many
projects.

Here’s an example. Let’s say we decide to keep our Go code in the
directory $HOME/mygo. We need to create that directory and set
$GOPATH accordingly.

$ mkdir $HOME/mygo
$ export GOPATH=$HOME/mygo
$

Into this directory, we now add some source code. Suppose we want to
use the indexing library from the codesearch project along with a
left-leaning red-black tree. We can install both with the “go get”
subcommand:

$ go get code.google.com/p/codesearch/index
$ go get github.com/petar/GoLLRB/llrb
$

Both of these projects are now downloaded and installed into our
$GOPATH directory. The one tree now contains the two directories
src/code.google.com/p/codesearch/index/ and
src/github.com/petar/GoLLRB/llrb/, along with the compiled packages
(in pkg/) for those libraries and their dependencies.

Because we used version control systems (Mercurial and Git) to check
out the sources, the source tree also contains the other files in the
corresponding repositories, such as related packages. The “go list”
subcommand lists the import paths corresponding to its arguments, and
the pattern “./...” means start in the current directory (“./”) and
find all packages below that directory (“...”):

$ go list ./...
code.google.com/p/codesearch/cmd/cgrep
code.google.com/p/codesearch/cmd/cindex
code.google.com/p/codesearch/cmd/csearch
code.google.com/p/codesearch/index
code.google.com/p/codesearch/regexp
code.google.com/p/codesearch/sparse
github.com/petar/GoLLRB/example
github.com/petar/GoLLRB/llrb
$

We can also test those packages:

$ go test ./...
? code.google.com/p/codesearch/cmd/cgrep [no test files]
? code.google.com/p/codesearch/cmd/cindex [no test files]
? code.google.com/p/codesearch/cmd/csearch [no test files]
ok code.google.com/p/codesearch/index 0.239s
ok code.google.com/p/codesearch/regexp 0.021s
? code.google.com/p/codesearch/sparse [no test files]
? github.com/petar/GoLLRB/example [no test files]
ok github.com/petar/GoLLRB/llrb 0.231s
$

If a go subcommand is invoked with no paths listed, it operates on the
current directory:

$ cd $GOPATH/src/code.google.com/p/codesearch/regexp
$ go list
code.google.com/p/codesearch/regexp
$ go test -v
=== RUN TestNstateEnc
--- PASS: TestNstateEnc (0.00 seconds)
=== RUN TestMatch
--- PASS: TestMatch (0.01 seconds)
=== RUN TestGrep
--- PASS: TestGrep (0.00 seconds)
PASS
ok code.google.com/p/codesearch/regexp 0.021s
$ go install
$

That “go install” subcommand installs the latest copy of the package
into the pkg directory. Because the go command can analyze the
dependency graph, “go install” also installs any packages that this
package imports but that are out of date, recursively.

Notice that “go install” was able to determine the name of the import
path for the package in the current directory, because of the
convention for directory naming. It would be a little more convenient
if we could pick the name of the directory where we kept source code,
and we probably wouldn't pick such a long name, but that ability would
require additional configuration and complexity in the tool. Typing an
extra directory name or two is a small price to pay for the increased
simplicity and power.

As the example shows, it’s fine to work with packages from many
different projects at once within a single $GOPATH root directory.

Limitations

As mentioned above, the go command is not a general-purpose build
tool. In particular, it does not have any facility for generating Go
source files during a build. Instead, if you want to use a tool like
yacc or the protocol buffer compiler, you will need to write a
makefile (or a configuration file for the build tool of your choice)
to generate the Go files and then check those generated source files
into your repository. This is more work for you, the package author,
but it is significantly less work for your users, who can use “go get”
without needing to obtain and build any additional tools.
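
A minimal sketch of that workflow, assuming the goyacc tool and a
hypothetical gram.y grammar (any code generator works the same way): the
author regenerates locally and commits the result, so "go get" users
never run the generator themselves:

$ goyacc -o gram.go gram.y
$ hg add gram.go
$ hg commit -m 'regenerate parser from gram.y'
$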

More information

There is not yet a full manual for the go command. For more
information, run “go help” or visit http://tip.golang.org/cmd/go/.

Peter Bourgon

unread,
Feb 7, 2012, 12:13:54 PM2/7/12
to r...@golang.org, golang-nuts
Russ, thanks for the detailed explanation. "Motivation" docs like this
are helpful. One point, however:

> Here’s an example.  Let’s say we decide to keep our Go code in the
> directory  $HOME/mygo.

The implication here is that "my Go code" is somehow distinct from "my
C code" (or whatever). This implication is false: I don't draw
partitions in this way. I keep "my Go code" with all my other code:
cloned into ~/src. I expect Go packages and binaries to be built "in
place" in those cloned subdirectories, just like all my other
software/language stacks. This expectation is reinforced by the fact
that the Go toolchain produces [nearly] pure, statically-linked
binaries, requiring no runtime framework. And, "gb" manages to satisfy
it. (It does so by resolving import dependencies into the $GOROOT
tree, which is fine by me.)

It could be that I'm being stubborn, or my use-case is atypical. If
so, I hope some people on this list will point it out and (gently)
correct me :) But, at least for now, I'm not sure that I'm all that
weird in having these expectations. The "go" tool feels more like the
weird element in the equation.

Gustavo Niemeyer

unread,
Feb 7, 2012, 12:16:27 PM2/7/12
to r...@golang.org, golang-nuts
On Tue, Feb 7, 2012 at 14:38, Russ Cox <r...@golang.org> wrote:
> thought it would help to say a little bit about why we wrote a new
> command, what it is, what it's not, and how to use it.

Thanks for the great write up Russ.

I'll take this chance to highlight one minor point:

The new "goinstall" is "go get".

The "install" subcommand of the "go" tool will build and install a
local package only.

--
Gustavo Niemeyer
http://niemeyer.net
http://niemeyer.net/plus
http://niemeyer.net/twitter
http://niemeyer.net/blog

-- I'm not absolutely sure of anything.

Gustavo Niemeyer

unread,
Feb 7, 2012, 12:36:30 PM2/7/12
to Peter Bourgon, r...@golang.org, golang-nuts
On Tue, Feb 7, 2012 at 15:13, Peter Bourgon <pe...@bourgon.org> wrote:
> The implication here is that "my Go code" is somehow distinct from "my
> C code" (or whatever). This implication is false: I don't draw
> partitions in this way. I keep "my Go code" with all my other code:
> cloned into ~/src. I expect Go packages and binaries to be built "in
> place" in those cloned subdirectories, just like all my other

I use ~/src as well, and I have GOPATH=$HOME. Works fine.

> It could be that I'm being stubborn, or my use-case is atypical. If
> so, I hope some people on this list will point it out and (gently)
> correct me :)

I don't think your use case is atypical, and I don't see why you might
be called stubborn in this regard. Just use ~/src, and set up the go
tool so it uses it.

> But, at least for now, I'm not sure that I'm all that
> weird in having these expectations. The "go" tool feels more like the
> weird element in the equation.

I disagree here, though. You seem to like gb. Have you realized that
one of the key reasons why you can use another build tool which is
non-standard with pretty much every single package out there is
because those packages don't depend on a build tool at all? Try to use
an autoconf package without autoconf, or a make-based package without
make, or an ant-based package without ant. The concept behind the go
tool is what allows you to easily use go packages without the tool
itself.

Peter Bourgon

unread,
Feb 7, 2012, 1:02:28 PM2/7/12
to Gustavo Niemeyer, r...@golang.org, golang-nuts
On Tue, Feb 7, 2012 at 6:36 PM, Gustavo Niemeyer <gus...@niemeyer.net> wrote:
> On Tue, Feb 7, 2012 at 15:13, Peter Bourgon <pe...@bourgon.org> wrote:
>> The implication here is that "my Go code" is somehow distinct from "my
>> C code" (or whatever). This implication is false: I don't draw
>> partitions in this way. I keep "my Go code" with all my other code:
>> cloned into ~/src. I expect Go packages and binaries to be built "in
>> place" in those cloned subdirectories, just like all my other
>
> I use ~/src as well, and I have GOPATH=$HOME. Works fine.

Except (as mentioned in IRC) this populates my ~/bin and ~/pkg subdirs
with artifacts, rather than dropping them in place. In my view, this
is undesirable.

> I disagree here, though. You seem to like gb. Have you realized that
> one of the key reasons why you can use another build tool which is
> non-standard with pretty much every single package out there is
> because those packages don't depend on a build tool at all? Try to use
> an autoconf package without autoconf, or a make-based package without
> make, or an ant-based package without ant. The concept behind the go
> tool is what allows you to easily use go packages without the tool
> itself.

Oh, don't get me wrong: I'm totally in favor of convention over
configuration. And for the most part, I'm totally fine with the
conventions that goinstall (and/or the go tool) established.

My problem is in the details: specifically, that the go tool both
requires extra convention ($GOPATH) and imposes extra restrictions
(mandating a home for "my Go code"; binaries necessarily to
$GOPATH/bin; packages necessarily to $GOPATH/pkg), both of which gb
manages to avoid.

I'm trying to figure out the cost that gb pays to achieve this (in my
view) clear usability win, but I'm not seeing it.

Gustavo Niemeyer

unread,
Feb 7, 2012, 1:17:35 PM2/7/12
to Peter Bourgon, r...@golang.org, golang-nuts
> Oh, don't get me wrong: I'm totally in favor of convention over
> configuration. (...)

> I'm trying to figure out the cost that gb pays to achieve this (in my
> view) clear usability win, but I'm not seeing it.

It's just a different taste. I personally appreciate the fact that the
source tree remains clean after a build, and also that I can drop
binaries onto my ~/bin with "go get" or "go install". To me that's a
win, but it may be a different convention than the one you prefer. The
only way for both camps to be happy would be with configuration, and
we both seem to agree that convention wins over configuration.

Peter Bourgon

unread,
Feb 7, 2012, 1:20:33 PM2/7/12
to Gustavo Niemeyer, r...@golang.org, golang-nuts
> It's just a different taste. I personally appreciate the fact that the
> source tree remains clean after a build, and also that I can drop
> binaries onto my ~/bin with "go get" or "go install". To me that's a
> win, but it may be a different convention than the one you prefer. The
> only way for both camps to be happy would be with configuration, and
> we both seem to agree that convention wins over configuration.

All else being equal, the solution which requires no extra environment
variables should be preferred.

Aram Hăvărneanu

unread,
Feb 7, 2012, 1:20:34 PM2/7/12
to Gustavo Niemeyer, Peter Bourgon, r...@golang.org, golang-nuts
Gustavo Niemeyer wrote:
> I use ~/src as well, and I have GOPATH=$HOME. Works fine.

Me too, I already had all my code, Go or not, in ~/src. GOPATH=$HOME
is perfect for me.

Peter Bourgon wrote:
> Except (as mentioned in IRC) this populates my ~/bin and ~/pkg subdirs
> with artifacts, rather than dropping them in place. In my view, this
> is undesirable.

Go putting stuff in ~/bin is a very nice bonus to me, I already had
~/bin added to my $PATH, so I didn't have to do anything extra to
accommodate myself to the new tool. ~/pkg wasn't there, but it's just
a directory I usually ignore; why is this an issue? It has to go
somewhere, I don't mind it in my home.

--
Aram Hăvărneanu

Miek Gieben

unread,
Feb 7, 2012, 1:26:53 PM2/7/12
to golang-nuts
[ Quoting <r...@golang.org> at 11:38 on Feb 7 in "[go-nuts] the go com..." ]

> Recent weekly snapshots have a new command, called “go”, that
> automates the downloading, building, installation, and testing of Go
> packages and commands. It replaces both goinstall and make. I
> thought it would help to say a little bit about why we wrote a new
> command, what it is, what it's not, and how to use it.

Thanks. This really helps.

> Limitations

<SNIP>

> into your repository. This is more work for you, the package author,
> but it is significantly less work for your users, who can use “go get”
> without needing to obtain and build any additional tools.

I will be watching the src/cmd/goyacc directory to see how that will
actually work in that case :)

grtz Miek


Stefan Nilsson

unread,
Feb 7, 2012, 1:28:37 PM2/7/12
to golan...@googlegroups.com, r...@golang.org
"Third, each directory in a source tree corresponds to a single package."

How do you handle a package x_test that goes together with a package x?
Should it still go into directory x or should it have its own x_test directory?

If you put them in the same directory, it seems that you need to give the full path for x within the x_test package. That's inconvenient if you want to move your code.

Paul Borman

unread,
Feb 7, 2012, 1:29:25 PM2/7/12
to Aram Hăvărneanu, Gustavo Niemeyer, Peter Bourgon, r...@golang.org, golang-nuts
Perhaps it is a belief that $GOPATH/src must only contain Go sources.  That is not the case.  The only convention enforced is that there are src, bin, and pkg directories in $GOPATH (with the latter two being optional).

In the end, there cannot be a tool that satisfies everyone's every desire.  A simple tool will enforce some sort of convention, and a complex tool will negate the desire that some have for simplicity.  Some battles are not worth fighting.

    -Paul

Paul Borman

unread,
Feb 7, 2012, 1:30:29 PM2/7/12
to golan...@googlegroups.com, r...@golang.org
There is no "test" package.  Just the *_test.go sources associated with the package in the same directory.  The go command does the right thing when you us "go test".

    -Paul

Stefan Nilsson

unread,
Feb 7, 2012, 1:40:48 PM2/7/12
to golan...@googlegroups.com, r...@golang.org
Sure, you could put your test code in the same package. But common practice seems to be to put the test code for package x in package x_test. For example:

http://tip.golang.org/src/pkg/sort/sort_test.go

Peter Bourgon

unread,
Feb 7, 2012, 2:08:43 PM2/7/12
to Gustavo Niemeyer, r...@golang.org, golang-nuts
And another point, which I overlooked:

> I use ~/src as well, and I have GOPATH=$HOME. Works fine.

Beyond populating ~/bin and ~/pkg, this has the additional restriction
of making the "natural" home for my projects the tail end of a rather
long chain of subdirectories. For example, my project "goop" moves
from the logical home ~/src/goop to
~/src/github.com/peterbourgon/goop. That's totally unnatural, and not
something I'm keen to abide.

I remain open to arguments that the go tool's way is better. What I'm
less enthusiastic about is the notion that it's somehow no more
difficult, or not meaningfully different, than gb's way. The
convention of the go tool—i.e. its mandated structure—imposes
significant cognitive load on the developer when compared to "the
competition" (viz. gb), and for no concrete reason or benefit that I
can discern.

(At least, this is my opinion.)

John Asmuth

unread,
Feb 7, 2012, 2:14:24 PM2/7/12
to golan...@googlegroups.com, Gustavo Niemeyer, r...@golang.org


On Tuesday, February 7, 2012 1:02:28 PM UTC-5, Peter Bourgon wrote:

I'm trying to figure out the cost that gb pays to achieve this (in my
view) clear usability win, but I'm not seeing it.

The cost is that with the go tool, you can run "go build" from any subdirectory within the workspace and it knows exactly what to do. With gb, it is not so straightforward (unless you set $GOPATH, which gb knows about and deals with). Without $GOPATH, you have to tell gb where the workspace root is by creating a file gb.cfg and specifying it there. Each directory you want to be able to run gb from needs to have this. Of course, gb will create (or modify, if they already exist) all the gb.cfg files for you if you run "gb --workspace" from the root. You then have to be careful about moving subdirectories around, because that can invalidate the workspace location if you used a relative path to describe it (you can just run "gb --workspace" again, so it's not so bad).

It's a trade-off. Both have their advantages.

Steve McCoy

unread,
Feb 7, 2012, 2:18:54 PM2/7/12
to golan...@googlegroups.com, r...@golang.org
Nice article. I was wary of a future where I'd be checking in yacc output, but you've persuaded me to suck it up.

Paul Borman

unread,
Feb 7, 2012, 2:41:26 PM2/7/12
to golan...@googlegroups.com
I believe that is an artifact.  It is better and normal practice to put them all in the same package.  The go command does the right thing.  The tests will not be included in your final package.  You can test this by adding a "foo_test.go" file in your sources.  Make it contain errors.  "go install" will still work but "go test" will fail.

The only way you can test private functions is to have the tests be part of the same package.

dhconnelly

unread,
Feb 7, 2012, 2:44:42 PM2/7/12
to golang-nuts
This is great. Can it be posted to the Go blog?

Paul Borman

unread,
Feb 7, 2012, 2:46:27 PM2/7/12
to golan...@googlegroups.com
Checking in yacc output has the advantage that you don't have to worry that there is more than one yacc.  But it is annoying for development.  Change the yacc grammar and now you have to manually regenerate the .go file, *or* you go back to something like make and eventually have make call "go" (but with what dependencies?!)

I was working on a project where I was writing a Go program that produced Go code that was then built into the final product.  For development I would sure like the right thing to happen if I modify the first Go program.

However, my case is less common than most and there is a benefit to the simplicity of the go command.  It is the 90% solution.  The 100% solution apparently is not possible (ant, make, and scons demonstrate that none of them are 100%).

Gustavo Niemeyer

unread,
Feb 7, 2012, 2:50:28 PM2/7/12
to golan...@googlegroups.com
Hi Stefan,

On Tue, Feb 7, 2012 at 16:40, Stefan Nilsson
<trollerip...@gmail.com> wrote:
> Sure, you could put your test code in the same package. But common practice
> seems to be to put the test code for package x in package x_test. For
> example:

Indeed. The go tool continues to support the common practice of having
a test package with the suffix _test within the same directory as the
actual package. Nothing changed in that regard.

Kyle Lemons

unread,
Feb 7, 2012, 2:51:30 PM2/7/12
to golan...@googlegroups.com
If you put them in the same directory, it seems that you need to give the full path for x within the x_test package. That's inconvenient if you want to move your code.

The _test package is only intended to break dependency cycles incurred by test code.  Since the _test files are ignored by normal builds, they don't count as another package, and go test understands them.
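
A hedged sketch, with a made-up import path: the normal tests stay in
package x, and an x_test file sits in the same directory but must import
x by its full path, like any other client would:

// $GOPATH/src/example.com/you/x/x.go
package x

func Double(n int) int { return 2 * n }

// $GOPATH/src/example.com/you/x/double_test.go: same package, so it can
// exercise unexported names directly.
package x

import "testing"

func TestDouble(t *testing.T) {
	if got := Double(2); got != 4 {
		t.Fatalf("Double(2) = %d, want 4", got)
	}
}

// $GOPATH/src/example.com/you/x/double_ext_test.go: external test package,
// only needed to break an import cycle.
package x_test

import (
	"testing"

	"example.com/you/x"
)

func TestDoubleExternal(t *testing.T) {
	if got := x.Double(3); got != 6 {
		t.Fatalf("x.Double(3) = %d, want 6", got)
	}
}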

Yohann Coppel

unread,
Feb 7, 2012, 2:58:11 PM2/7/12
to golan...@googlegroups.com, Gustavo Niemeyer, r...@golang.org
I may be wrong, but I don't think you *have* to use the src/github.com/peterbourgon/goop directory to write your code.

A first option is to put your code in ~/src/goop/, then symlink ~/src/github.com/peterbourgon/goop to ~/src/goop, and use "github.com/...." as the import path. That would solve everything.

If you still don't want to do that, you can git clone your project in ~/src/goop. Then use import "goop/...." as you are doing now. I don't see the problem with that.

If goop has multiple packages though, and imports "parts of itself" (e.g. import "goop/maths"), then users of your package will have to do the same: git clone github/...../ in their src directory, and done.
In fact, what I would do is have a ~/third_party/src where I clone other people's packages, e.g. ~/third_party/src/goop, and add ~/third_party/ to GOPATH. That way my src directory also stays clean of third-party packages.
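
A hedged sketch of that arrangement (the directory names are just examples):

$ export GOPATH=$HOME/mygo:$HOME/third_party
$

With that setting, the go command will find packages under either
$HOME/mygo/src or $HOME/third_party/src, per the lookup rule Russ described.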

I personally really like the fact that the source directory stays clean as well, and that no config is needed.
And finally, I guess gb is still a great tool that I have also been using until now, so it should still be possible to use it, right?

Russ Cox

unread,
Feb 7, 2012, 3:05:09 PM2/7/12
to golan...@googlegroups.com, Gustavo Niemeyer
On Tue, Feb 7, 2012 at 14:58, Yohann Coppel <yoh...@gmail.com> wrote:
> I may be wrong, but I don't think you *have* to use the
> src/github.com/peterbourgon/goop directory to write your code.
>
> A first option is put your code is ~/src/goop/, then symlink
> ~/src/github.com/peterbourgon/goop to ~/src/goop, and use "github.com/...."
> as import path. That would solve everything.

cd ~/src/goop
go install -v

See where it gets installed.

Russ

Yohann Coppel

unread,
Feb 7, 2012, 3:07:04 PM2/7/12
to golan...@googlegroups.com, Gustavo Niemeyer, r...@golang.org
what about 
go install -v
?

Yohann Coppel

unread,
Feb 7, 2012, 3:49:49 PM2/7/12
to golan...@googlegroups.com, Gustavo Niemeyer, r...@golang.org
This worked perfectly with one of my packages.

- Git clone in src/package
- linked github.com/yohcop/package to src/package
- used either by importing "github.com/yohcop/package" OR "package".

Here is the whole thing:

mkdir -p ~/go/src
export GOPATH=~/go

cd ~/go/src
go fix readflags.go/readflags
// ...one more manual fix....
mkdir -p ~/go/src/github.com/yohcop
cd ~/go/src/github.com/yohcop
ln -s ~/go/src/readflags.go readflags.go
cd ~/go/src
mkdir foo
vi foo/foo.go
// Type the following:
package main
import (
  "flag"
  "fmt"

  "github.com/yohcop/readflags.go/readflags" // resolved via the symlink above
)
var fooFlag = flag.Int("foo", 3, "An int")
func main() {
  readflags.ReadFlagsFromString("foo = 5")
  fmt.Println("Flag is", *fooFlag)
}


:wq

cd foo
go install -v
../../bin/foo
// TADAAA prints "Flag is 5"

Note that I can even change my code to import "readflags.go/readflags" if I want to.

I finally ran
cd ~/go/src/readflags.go/readflags
go install -v
to have it installed as readflags.go/readflags as well, although it wasn't really necessary.

And to summarize, after I ran all those commands.
$ tree ~/go/
├── bin
│   └── foo
├── pkg
│   └── linux_amd64
│       ├── github.com
│       │   └── yohcop
│       │       └── readflags.go
│       │           └── readflags.a
│       └── readflags.go
│           └── readflags.a
└── src
    ├── foo
    │   └── foo.go
    ├── github.com
    │   └── yohcop
    │       └── readflags.go -> /home/ycoppel/go/src/readflags.go
    └── readflags.go
        ├── readflags
        │   ├── readflags.go
        │   ├── readflags_test.go
        │   └── testdata
        │       ├── err.txt
        │       ├── f1.txt
        │       ├── f2.txt
        │       └── subdir
        │           ├── f3.txt
        │           └── f4.txt
        └── README


I think that proves that you can use ~/src/goop, and not worry about github.com/peterbourgon/goop

:)

Paul Ruane

unread,
Feb 7, 2012, 3:57:58 PM2/7/12
to golang-nuts
Thanks for this guide, it's very informative.

As there are conventions that need to be followed for the tooling to
operate as desired, do you think it would be useful (obviously not for
Go 1) for the 'go' command to have a facility for spitting out a basic
project structure that follows these conventions (a la Maven
archetypes or the Ruby on Rails 'rails new')?

E.g.

> export GOPATH=$HOME/mygo
> go create helloworld

Stefan Nilsson

unread,
Feb 7, 2012, 5:09:55 PM2/7/12
to golan...@googlegroups.com
On Tuesday, February 7, 2012 8:51:30 PM UTC+1, Kyle Lemons wrote:
If you put them in the same directory, it seems that you need to give the full path for x within the x_test package. That's inconvenient if you want to move your code.

The _test package is only intended to break dependency cycles incurred by test code.  Since the _test files are ignored, they don't count as another package normally, and go test understands them.

Let's say my package lives at  code.google.com/p/my-go-project/x. What's the best way to import this package from within a test package x_test living in the same directory? The following works (assuming you install the package at this path):

package x_test

import "code.google.com/p/my-go-project/x"
Is there, and should there be, some other way?

Kyle Lemons

unread,
Feb 7, 2012, 5:22:09 PM2/7/12
to golan...@googlegroups.com
You shouldn't import it at all.  You should just be in the same package (which, by the way, may not be "x", though it conventionally will be).  If you have a dependency cycle, then yes, that is what you must do.

Adam Logghe

unread,
Feb 7, 2012, 6:04:55 PM2/7/12
to golan...@googlegroups.com, r...@golang.org
How are library versions handled? I would like to pin remote libraries in github at a particular version for example.

Kyle Lemons

unread,
Feb 7, 2012, 6:56:23 PM2/7/12
to golan...@googlegroups.com
How are library versions handled? I would like to pin remote libraries in github at a particular version for example.

I don't know for certain, but from my use so far, it seems that only go get will update the local version, so avoid that :).

John Asmuth

unread,
Feb 7, 2012, 7:09:34 PM2/7/12
to golan...@googlegroups.com, r...@golang.org
"go get" will only retrieve code from the repository if none exists, or if you give it a flag (it was -u with goinstall, haven't messed around with go get much yet).

If you want a particular version, go to the local source and check out the correct version.
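
A hedged sketch of that, using the GoLLRB package from Russ's walkthrough
(the revision name is a placeholder you would fill in yourself):

$ cd $GOPATH/src/github.com/petar/GoLLRB
$ git checkout <known-good-tag-or-commit>
$ go install ./...
$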

Adam Logghe

unread,
Feb 7, 2012, 7:10:05 PM2/7/12
to golan...@googlegroups.com

Heh, so just don't walk over the open manhole and you'll be fine eh?

John Asmuth

unread,
Feb 7, 2012, 7:20:06 PM2/7/12
to golan...@googlegroups.com, r...@golang.org
I'm probably wrong about this, from reading other posts. Perhaps I should experiment before talking.

Andrew Gerrand

unread,
Feb 7, 2012, 8:54:24 PM2/7/12
to golan...@googlegroups.com, r...@golang.org
On 8 February 2012 11:20, John Asmuth <jas...@gmail.com> wrote:
> I'm probably wrong about this, from reading other posts. Perhaps I should
> experiment before talking.

No, you're quite right. From "go help get":

The -u flag instructs get to use the network to update the named packages
and their dependencies. By default, get uses the network to check out
missing packages but does not use it to look for updates to existing packages.
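
For example (using the GoLLRB package from Russ's walkthrough), updating
an already-downloaded copy and its dependencies would be:

$ go get -u github.com/petar/GoLLRB/llrb
$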

Andrew

Nickos Ventouras

unread,
Feb 7, 2012, 9:31:01 PM2/7/12
to golang-nuts
On Feb 7, 8:02 pm, Peter Bourgon <pe...@bourgon.org> wrote:

> > I use ~/src as well, and I have GOPATH=$HOME. Works fine.
>
> Except (as mentioned in IRC) this populates my ~/bin and ~/pkg subdirs
> with artifacts, rather than dropping them in place. In my view, this
> is undesirable.

That.

And also no thoughts at all about version management? It would be a
disappointment to have to use yet another tool like "rvm" and/or
"virtualenv" in the future because "go" wouldn't cut it.

Jessta

unread,
Feb 7, 2012, 10:28:24 PM2/7/12
to Nickos Ventouras, golang-nuts
On Wed, Feb 8, 2012 at 1:31 PM, Nickos Ventouras <mit...@gmail.com> wrote:
> And also no thoughts at all about version management? It would be a
> disappointment to have to use yet another tool like "rvm" and/or
> "virtualenv" in the future because "go" wouldn't cut it.

Static linking means that this is only an issue if you happen to be
developing multiple applications that require different versions of a
library, which is unlikely.
virtualenv and rvm are symptoms of the problems that dynamic linking creates.


--
=====================
http://jessta.id.au

Gustavo Niemeyer

unread,
Feb 7, 2012, 10:29:48 PM2/7/12
to Nickos Ventouras, golang-nuts
On Wed, Feb 8, 2012 at 00:31, Nickos Ventouras <mit...@gmail.com> wrote:
> And also no thoughts at all about version management? It would be a
> disappointment to have to use yet another tool like "rvm" and/or
> "virtualenv" in the future because "go" wouldn't cut it.

The foundation was put in place 10 minutes ago. Let's wait until
it dries so we can build on it.

Nikos Ventouras

unread,
Feb 7, 2012, 11:14:00 PM2/7/12
to golang-nuts
On Feb 8, 2012, at 5:28 AM, Jessta wrote:

> On Wed, Feb 8, 2012 at 1:31 PM, Nickos Ventouras <mit...@gmail.com> wrote:
>> And also no thoughts at all about version management? It would be a
>> disappointment to have to use yet another tool like "rvm" and/or
>> "virtualenv" in the future because "go" wouldn't cut it.
>
> Static linking means that this is only an issue if you happen to be
> developing multiple applications that require different versions of a
> library, which is unlikely.


With libraries in a state of flux, as they especially are for a new language, that's actually quite a plausible scenario.

A common case is wanting to keep working on an older version of an app that uses an older version of a lib
(say, just to add bugfixes). Consider supporting the 1.x and 2.x versions of a shipped product. Is it far-fetched to
consider such uses?

Besides, that's just about linking. What about how the binaries are installed? Can someone use both version 1.x
and version 2.x of the same "go" installable app?

@Gustavo Niemeyer <gus...@niemeyer.net>

> The foundation has been put in place 10 minutes ago. Let's wait until
> it dries so we can build on it.

Sure. Just some points to consider.

Nathan Trimble

unread,
Feb 8, 2012, 2:42:54 AM2/8/12
to golang-nuts
On Tue, Feb 7, 2012 at 9:13 AM, Peter Bourgon <pe...@bourgon.org> wrote:
>
> Russ, thanks for the detailed explanation. "Motivation" docs like this
> are helpful. One point, however:

>
> > Here’s an example.  Let’s say we decide to keep our Go code in the
> > directory  $HOME/mygo.
>
> The implication here is that "my Go code" is somehow distinct from "my
> C code" (or whatever). This implication is false: I don't draw
> partitions in this way. 

This.  I don't have C code or Go code or HTML code; I have a project that's made up of lots of different inputs.  It happens to have some source in a variety of languages, plus even some "resource" files, be it images, HTML, etc.  You run svn co on the project and build the thing where you checked it out.  The C code gets compiled using qmake, and the build currently also runs a shell or batch script that in turn calls gc and gl.  I'd love to have a cross-platform tool that'll just compile the Go code, and I've been waiting to see how the go command turns out.  Now I'm starting to think it won't work for us.  It's too different from the way we work.  I need my binaries to get put in a specific spot to be packaged into RPMs and MSIs.  I need the images, HTML files, JavaScript, etc. to get copied along with them.  And I need it to Just Work on Windows (I'm looking at you, make).  So it sounds like maybe I should take a close look at gb, as I've been hearing a lot about it lately.

-Nate

Rob 'Commander' Pike

unread,
Feb 8, 2012, 2:53:50 AM2/8/12
to Nathan Trimble, golang-nuts
If you need a general compilation tool, use make. You had it before so you have it now.

-rob


John Asmuth

unread,
Feb 8, 2012, 6:27:00 AM2/8/12
to golan...@googlegroups.com
On Tuesday, February 7, 2012 11:14:00 PM UTC-5, Nickos Ventouras wrote:

What about how the binaries are installed? Can someone use both version 1.x
and version 2.x of the same "go" installable app?

Yes, they'd just need to have two different import paths. 

André Moraes

unread,
Feb 8, 2012, 6:27:19 AM2/8/12
to golang-nuts
Just to add my 2 cents,

If somebody doesn't want to use the go tool and needs a directory
organization different from the one expected by "go <name your command>",
they can use gb.

gb works with GOPATH, so you don't pollute your $GOROOT installation
and don't need to write Makefiles at all.

> If you need a general compilation tool, use make. You had it before so you
> have it now.

With make, gb, and gorun installed together, you can probably build
anything anybody would want to.

--
André Moraes
http://andredevchannel.blogspot.com/

dhconnelly

unread,
Feb 8, 2012, 7:01:26 AM2/8/12
to golang-nuts
So: go is a great tool that uses a single convention for zero-configuration
building and dependency management.
If you don't like those conventions, there are existing tools to use.

I think this is great. I just moved all my non-Go projects into
$GOPATH/src, since go ignores directories with no Go code. No problem
at all, and my code is still all in the same place.

Russ Cox

unread,
Feb 8, 2012, 10:33:03 AM2/8/12
to André Moraes, golang-nuts
On Wed, Feb 8, 2012 at 06:27, André Moraes <and...@gmail.com> wrote:
> If somebody don't want to use the go tool and need a directory
> organization different from the expected by "go <name your command>"
> you can use gb.

If you want to write config files and not be compatible with
people using the go command, sure.

> gb works with GOPATH, so you don't polute your $GOROOT installation
> and don't need to write Makefiles at all.

All this is true of the go command too.

Russ

Ingo Oeser

unread,
Feb 9, 2012, 12:10:47 AM2/9/12
to golan...@googlegroups.com, Gustavo Niemeyer, r...@golang.org
Thanks for that great summary.

That also solves one of the build tool issues faced in corporate environments with more complex build artefact management.
Convention: the developer just imports things like readflags.go/readflags and is done with it.

The part about "readflags.go" can be more complex like namespace/api-version/package and the rest of it can be 
managed in a more complex way via symlinks and checkouts for those who need it.

Perfect world, BUT code in contributed libraries doesn't work that way. Let's take a realistic example:
  1. liblazy comes from a lazy maintainer, updating his code rarely
  2. libbleeding comes from an active developer updating his library very often (bleeding edge) and breaking its API in unpredictable ways.
  3. liblazy depends on libbleeding
  4. appgreat wants to use liblazy and does so. Thus it implicitly uses libbleeding too, because it has no other choice.
  5. after a year a bug needs to be fixed in appgreat, which depends on a fix in libbleeding, and appgreat is in maintenance mode
  6. libbleeding is on its third API rewrite already; libbleeding in its current version simply doesn't work any more with liblazy
  7. liblazy became orphaned.
  8. the classic solution is to backport the fix or provide an alternative fix to an old version of libbleeding
This case is not supported. Simpler cases are:
  • moving code from one VCS provider to another (e.g. Launchpad to Google Code)
  • using another fork on github.
  • temporarily using another copy to build, because main VCS provider is down (e.g. as happened with git.kernel.org)

All those require import path patching. At the current state of the go tool we need to patch import paths on ALL upstream 
packages before building the first source file and the go tool doesn't even provide support for this extra stage.

What about providing a way to optionally remap package paths arbitrarily to support this?
Two extra options to the go tool should be enough for Go 1:
  1. "--remap-packages-via-regex-file=filename" - provides a file with regular expressions and substitutions for simple cases (fast path solution)
  2. "--remap-packages-via-command=commandname" - provides a command expecting one package name per line outputting one package name per line in return (slow path, hard cases)
Both are extremely easy to implement and will make code maintaining teams very, very happy in the long run.

Thanks for reading this far; if the idea is OK, I can sketch up an example implementation supporting it.

Best Regards

Ingo Oeser

Andrew Gerrand

unread,
Feb 9, 2012, 1:17:41 AM2/9/12
to Ingo Oeser, golan...@googlegroups.com, Gustavo Niemeyer, r...@golang.org

I think you'd be better off implementing a separate tool that rewrites
import paths in all go files in a directory tree, rather than giving
the go tool more responsibilities.

Andrew

Yohann Coppel

unread,
Feb 9, 2012, 1:23:10 AM2/9/12
to golan...@googlegroups.com, Ingo Oeser, Gustavo Niemeyer, r...@golang.org
I think so too.
here is one:

perl -p -i -e "s/<old package>/<new package>/g" `find . -name '*.go'`

;)

Volker Dobler

unread,
Feb 9, 2012, 3:42:54 AM2/9/12
to golang-nuts


On Feb 9, 6:10 am, Ingo Oeser <nightly...@googlemail.com> wrote:
> Perfect world, BUT code in contributed libraries doesn't work that way.
> Lets take a realistic example like:
>
>    1. liblazy from a lazy maintainer, updating his code rarely
>    2. libbleeding from a active developer updating his library very often
>    (bleeding edge) and breaking API often in unpredictable ways.
>    3. liblazy depends on libbleeding
>    4. appgreat wants to use liblazy and does so. Thus it implicitly uses
>    libbleeding too, because it has no other choice.
>    5. after a year a bug needs to be fixed in appgreat, which depends on a
>    fix in libbleeding and appgreat is in maintaining mode
>    6. libbleeding has the third API rewrite already, libbleeding in it's
>    current version simply doesn't work any more with liblazy
>    7. liblazy became orphan.
>    8. classic solution is to backport the fix or provide an alternative fix
>    to and old version of libbleeding
>
> This case is not supported.

I got stuck on this too at first sight...

The solution is pretty simple.
First note that no tooling whatsoever can help you fix appgreat:
you must patch other people's packages (lazy or bleeding).

From here on the path to the solution is clear: if you ship appgreat
and maintain it afterwards, keep the sources of lazy and bleeding
and use those kept sources to build appgreat (and not the
online/current packages). Yes, you won't benefit from development
in bleeding, but you must not anyway in production code: your
appgreat production code must rely on a certain bleeding and
lazy (not only the API, the full implementation), so you check them
in with your sources.

Instead of a single go install it is a handful more commands.
If you decide to upgrade to a new bleeding or lazy you just
update your copy (check them in locally) and rebuild appgreat.
It is pretty much the same procedure used for a dependency in
Maven: at a certain point you switch to a new version of the
dependency and from that point on you use just this version and
nothing else.

Volker

eike...@gmail.com

unread,
Feb 9, 2012, 2:20:53 PM2/9/12
to golan...@googlegroups.com, r...@golang.org
On Tuesday, February 7, 2012 5:54:24 PM UTC-8, Andrew Gerrand wrote:
 The -u flag is fine for projects with a single developer but wouldn't work for teams. For that you need each new member to be able to get the same versions easily. As far as I can tell this would require using makefiles in combination with 'go' to check out the right commit-id/version.



Kyle Lemons

unread,
Feb 9, 2012, 3:12:16 PM2/9/12
to eike...@gmail.com, golan...@googlegroups.com, r...@golang.org
 The -u flag is fine for projects with a single developer but wouldn't work for teams. For that you need each new member to be able to get the same versions easily. As far as I can tell this would require using makefiles in combination with 'go' to check out the right commit-id/version.

I've been considering that problem.  The go tool already has some branch/tag code in it, so I wonder if a new "go sync [<pkg> [<branch/tag>]]" might not be appropriate. 

Dorival Pedroso

unread,
Feb 9, 2012, 5:35:25 PM2/9/12
to golan...@googlegroups.com, r...@golang.org
+10

Adam Logghe

unread,
Feb 9, 2012, 9:13:45 PM2/9/12
to golan...@googlegroups.com, eike...@gmail.com, r...@golang.org
My original concern was for distributing source from github that others will compile.

It would be good if we could specify the version of dependencies and the go tool would retrieve the -correct- versions.

This would save me from having to pull things into my git as submodules etc.

The version issue seems unimportant now when Go is young, and the libraries immature anyway, but will likely begin to bite in the medium term.

Adam

Yohann Coppel

unread,
Feb 9, 2012, 9:17:07 PM2/9/12
to golan...@googlegroups.com, eike...@gmail.com, r...@golang.org
How would you resolve the following:

- Your project depends on lib1 version 1.1 and lib2 version 2.2
- lib1 depends on lib3 v3.0
- lib2 depends on lib3 v3.2

Yohann Coppel

unread,
Feb 9, 2012, 9:26:52 PM2/9/12
to golan...@googlegroups.com, eike...@gmail.com, r...@golang.org
And I think you should follow Volker's advice anyway:

That is, keep the sources of the packages you depend on in your repos. Git submodules are great for that, since they link to a specific revision of another/remote git repository.

You can have your project depending on lib1 @ 34bf7ac7.....
When lib1 is updated, you can pull the new version, and update your code.
Do that with all the dependencies, and you always have a working tree.
You can come back in time, compile your project like it was 2 years ago, with all the dependencies like they were at this exact moment.
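
A hedged sketch of that setup, assuming the project keeps its own
GOPATH-style tree (the paths are illustrative):

$ cd ~/myproject                  # a directory listed in $GOPATH, containing src/
$ git submodule add git://github.com/petar/GoLLRB.git src/github.com/petar/GoLLRB
$ git submodule update --init
$ git commit -m 'pin GoLLRB at its current revision'
$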

For my previous example with lib1, lib2 and lib3, you just update your working tree to a version that works for all the libs. But a tool can hardly resolve that for you.
You may have to patch one of the dependencies anyway.

Hope that helps

Adam Logghe

unread,
Feb 9, 2012, 9:35:04 PM2/9/12
to golan...@googlegroups.com, eike...@gmail.com, r...@golang.org
It does help in general, it's really good advice for production code.

I guess I'm hoping for something easier for trying to give away code on github etc.

Adam

Ingo Oeser

unread,
Feb 9, 2012, 7:58:01 PM2/9/12
to golang-nuts
ok, found a possible solution, which is nearly implemented, so I can
answer my own question and thus share the research results.

TL;DR: Using "go list" with a template for the rewriting, for dependency
tracking, and feeding this into something similar to
http://code.google.com/p/go/source/browse/src/cmd/go/test.go seems to
solve the issue for me.

Details and discussion summary below.

Using the go template system and go list to generate the rewriting
code from some free-form rules on the fly, compiling this, and building
everything afterwards seems to work for me.
That way you only lose the download+update part of the Go build
system, which is exactly what I wanted to do.
The actual build scripts for a known dependency chain are the easy
part.

Other solutions suggested:
- rewriting source code for every package I need -> not doable at
large scale, so I consider this a funny comment
- having local versions checked out and maintaining them -> nearly,
but you still have to patch source code for simple repository
movements for every upstream change and monitor those, too.
- separate tool -> good idea, but needs support for dependency
tracking and resolution, which the go tool has already. The core of
it, made available by go list, is actually the way to go here.

The go tool has basic ideas for version support sketched out, but they
are not yet fleshed out or specified.
http://code.google.com/p/go/source/browse/src/cmd/go/pkp.go shows a
TODO remark on the Package.Version field here.

So thanks, everyone. I will sketch out a basic implementation with some
people and provide it somewhere. Stay tuned :-)
@rsc: Great and wise design!

Best Regards

Ingo

Russ Cox

unread,
Feb 10, 2012, 11:47:37 AM2/10/12
to Adam Logghe, golan...@googlegroups.com, eike...@gmail.com
Versioning is incredibly hard, and usually a mistake.
The go tool is not trying to solve this problem.

I would strongly encourage package authors to take the
same approach we have been for Go 1: if at all possible,
over the lifetime of your package, keep older clients working
by avoiding backwards-incompatible changes.

Conversely, if you find yourself in a situation where your
code needs version 3.1 of a library and cannot work with
newer versions, you need to make your own copy of that
library: it's moved on without you.

We did put considerable effort into making sure that you
can link two copies of a library into a binary, with different
import paths, of course, and they will be kept separate.
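
As an illustration (the frozen-copy import path here is made up), a single
program can import both copies under distinct paths and they remain
entirely separate packages:

package main

// Both copies link into the same binary and are kept separate; the blank
// imports are only to show the two distinct import paths side by side.
import (
	_ "example.com/you/frozen/GoLLRB/llrb" // frozen copy of an older version (made-up path)
	_ "github.com/petar/GoLLRB/llrb"       // the current upstream
)

func main() {}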

Russ

Brian Ketelsen

unread,
Feb 10, 2012, 11:56:30 AM2/10/12
to Adam Logghe, golan...@googlegroups.com, eike...@gmail.com, r...@golang.org

There is no option for versioning that doesn't (d)evolve into RubyGems/Bundler/Maven at some point. I don't want that for Go. My plan is to fork any libraries that I intend to use in production and maintain them myself.

Paul Borman

unread,
Feb 10, 2012, 12:25:21 PM2/10/12
to Brian Ketelsen, Adam Logghe, golan...@googlegroups.com, eike...@gmail.com, r...@golang.org
I was in the Unix-derived OS vending business for most of my career: the first 10 years on an internally used OS and then another 10 years for BSD/OS.  I was involved with both VxWorks and Linux following that.

During that time I have learned that a project that uses third-party solutions for personal use is very different from a supported project that does.  Utilizing open source solutions is a double-edged sword.  It greatly increases the tools at your disposal, but it also greatly complicates maintenance, as many open source projects do not do what Russ has described for Go 1.

I cannot recommend the "copy and maintain yourself" approach.  In the long run you are much better off keeping current and not depending on libraries or packages that do not (perhaps you should help keep those things current if the owners will accept the help).  If you do copy and maintain, you will almost certainly find yourself with an ever more difficult job of back-porting.  It was also painfully learned that, except in the most critical of situations, if you cannot push fixes or improvements upstream, then don't do them.  Trying to bring patches forward also becomes increasingly difficult with time.  At BSDi we migrated to a state where we made as few changes as possible to any contributed package.

Copying simple code is one thing.  You can likely prove it correct and not need to worry about keeping up with the community (until someone produces a better algorithm, of course).  Forking larger code bases should only be done because the owners of the original code are going off into left field or have more or less ceased to responsibly maintain their code.

I hope that the Go tool and the Go project dashboard will decrease the desire of people to fork and maintain.

    -Paul

mar...@google.com

unread,
Feb 10, 2012, 1:01:42 PM2/10/12
to golan...@googlegroups.com, r...@golang.org
There are a number of cases where there are combinations of tools and libraries that are packaged and distributed together. A simple example would be David Symonds's gomock project, which consists of the gomock package as well as the mockgen tool. That doesn't seem to fit well into the go model.

Is the recommendation for something like that to create two separate repositories, one for the gomock package and another for the mockgen tool? While I do understand the need for convention, this seems like too common a use case to ignore. (Or maybe I've missed how to properly do it -- the better outcome in this case.)

Here's what I tried:

can't load package: github.com/dsymonds/gomock: $GOPATH/src/github.com/dsymonds/gomock: no Go source files


$ go install ./..
can't load package: ./..: $GOPATH/src/github.com/dsymonds: no Go source files
$ go install ./gomock
$ go install ./mockgen
$ go test ./gomock






Adam Logghe

unread,
Feb 10, 2012, 1:04:46 PM2/10/12
to golan...@googlegroups.com, Adam Logghe, eike...@gmail.com, r...@golang.org
After thinking about it last night, I decided that if I wanted the go tool to do this I might be just flat approaching it wrong by wanting yet another layer.

I think I'll work on just keeping current as best I can and being explicit about known working versions.

Thanks guys.

Russ, your answer is good and might profitably be turned into a FAQ, especially for people coming from other language ecosystems.

Kyle Lemons

unread,
Feb 10, 2012, 2:51:59 PM2/10/12
to mar...@google.com, golan...@googlegroups.com, r...@golang.org
David probably hasn't had time to fully convert gomock to the go tool.  I developed an RPC stub generator that you can install by simply doing "go get github.com/kylelemons/go-rpcgen/protoc-gen-go" and it will download and install both its dependencies and the dependencies of its generated code.  If the generated code had a dependency that the binary did not, I would add it in a separate file called something like "depend.go" that only had that dependency imported with a blank identifier, along with a comment explaining its existence.
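
A hedged sketch of such a depend.go (the package and import path are hypothetical):

// depend.go: this file records a dependency of the generated code, not of
// the generator itself, so that "go get" of the generator also downloads
// and builds that package.
package main

import _ "example.com/hypothetical/rpcruntime" // used only by generated code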

Brian Ketelsen

unread,
Feb 10, 2012, 3:47:31 PM2/10/12
to Paul Borman, Adam Logghe, golan...@googlegroups.com, eike...@gmail.com, r...@golang.org
The good news is that `gofix` remains the most powerful tool in a Go programmer's toolchain.

Brian


John Eikenberry

unread,
Feb 10, 2012, 5:56:42 PM2/10/12
to golan...@googlegroups.com
Russ Cox wrote:

> I would strongly encourage package authors to take the
> same approach we have been for Go 1: if at all possible,
> over the lifetime of your package, keep older clients working
> by avoiding backwards-incompatible changes.

Maybe this behaviour could be encouraged by, for example, adding a 'Stable API'
column to the package dashboard or just repeating it in the docs where
appropriate.

--

John Eikenberry
[ j...@zhar.net - http://zhar.net ]
[ PGP public key @ http://zhar.net/jae_at_zhar_net.gpg ]
________________________________________________________________________
"Perfection is attained, not when no more can be added, but when no more
can be removed." -- Antoine de Saint-Exupery


ron minnich

unread,
Feb 11, 2012, 12:46:56 AM2/11/12
to John Eikenberry, golan...@googlegroups.com
But without versioning, how would we be able to get great error
messages like this:

patch 2.6 is incompatible with these scripts. Please install either
version 2.5.9 (or earlier) or version 2.6.1 (or later.)

I would miss them so :-)

ron

DisposaBoy

unread,
Feb 11, 2012, 6:23:28 AM2/11/12
to golan...@googlegroups.com, John Eikenberry
From the app itself, the same as in your example.