Hi,
The ideal unit test is one where you stub/mock the BigQuery response, testing how you handle specific responses and validating that your requests are well formed. If you are using the BigQuery client from the
code.google.com/p/google-apis-go-client project, you can launch an httptest.Server with a handler that returns mocked responses serialized using the structs from the client package, which gives you a small test environment. The client lets you override the server base URL to point at the test server.
If you plan to test BigQuery the same way you test a regular App Engine app, using the local development server, I don't know of a good solution from upstream. What I did in the past for a Java app was to write a thin wrapper around the BigQuery API calls and, in testing/development, back that wrapper with an in-memory SQL implementation, so I could test load/query operations. For Go, one way to build such a wrapper is to define an interface for your calls and write a stub implementation with the help of the
https://github.com/cznic/ql package. This approach has a caveat: a query that works against your test backend may have different syntax, or be invalid, in production, so it is not the most elegant solution, but it works for simple query cases.
Finally, if you are willing to write some integration tests, you can always set up a project in the Cloud Console and provide a service account for your tests to use. We use this approach to test our app's behavior with the dev server: our BigQuery client setup checks an env var for the credentials of a service account to use, and otherwise falls back to the App Engine service account. This procedure costs some $$, so if you don't have a budget allocated for Q.A. apps it may not be an option.
Hope this helps!