What causes a "BigQuery error in load operation"?

Edward Hartwell Goose

Apr 23, 2012, 11:49:25 AM4/23/12
to bigquery...@googlegroups.com
I'm updating a set of data (which currently exists as files on Google Cloud Storage).

I'm aware some of the data is wrong (the header and the last two lines), but as far as I can tell the rest fits the schema (all string, nullable). I have to set max_bad_records quite high to get the data imported at all. Is it possible to determine what causes those errors?


Michael Manoochehri

Apr 23, 2012, 12:41:24 PM4/23/12
to bigquery...@googlegroups.com
Hi Ed:

Just to start, how are you currently ingesting the data: via the bq tool, the web UI, or by accessing the API directly?

A good start would be:
  • In the Web UI, click on "Recent jobs" - click on one of the erroneous jobs and check the error details.
  • If you are using the bq tool, adding the flag --apilog=- will dump the JSON response to standard out, allowing you to see more details about the error.
If these steps aren't uncovering a helpful error response, please send your job_id to bigqu...@google.com and we can investigate further.
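For example, a load invocation with that flag added might look like the following (the dataset, table, bucket, and column names here are hypothetical placeholders, not values from this thread):

```shell
# Hypothetical names throughout. --apilog=- dumps the raw API
# request/response JSON to stdout so error details are visible,
# and --max_bad_records lets the load tolerate some bad rows.
bq --apilog=- load --max_bad_records=100 \
    mydataset.mytable \
    gs://my-bucket/data.csv \
    col1:string,col2:string
```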

- Michael

Michael Sheldon

Apr 23, 2012, 12:56:24 PM4/23/12
to bigquery...@googlegroups.com
FYI: The --apilog=- flag may produce more output than you care to read through. You can see the full server response for just the job resource by using:

bq --format=prettyjson show -j <job_id>

This will show all the details of load job <job_id>, including the individual errors encountered.
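The output of that command is JSON, so the per-record errors can also be pulled out programmatically. A minimal sketch, assuming the job resource reports errors under "status.errors" (the payload below is hand-made for illustration, not real job output):

```python
import json

# Illustrative sample of what `bq --format=prettyjson show -j <job_id>`
# might print for a failed load job; the line numbers and messages
# here are invented for the example.
raw = """
{"status": {"state": "DONE", "errors": [
  {"reason": "invalid", "location": "Line 7", "message": "Too many columns"},
  {"reason": "invalid", "location": "Line 9", "message": "Too many columns"}
]}}
"""

job = json.loads(raw)
# Each entry in status.errors describes one rejected record.
errors = job.get("status", {}).get("errors", [])
for err in errors:
    print(err["reason"], err.get("location", ""), err["message"])
```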


--Michael Sheldon

Edward Hartwell Goose

Apr 24, 2012, 8:07:06 AM4/24/12
to bigquery...@googlegroups.com
Thanks both - I've used a combination of your responses to discover the issue. It was stray commas inside fields of a comma-separated file, which created too many columns in the affected rows.
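A quick way to catch rows like that before loading is to compare each row's field count against the schema width. A minimal sketch (the helper name and the in-memory sample data are made up for illustration):

```python
import csv
import io

def find_bad_rows(csv_text, expected_cols):
    """Return (line_number, field_count) for each CSV row whose
    field count differs from the expected schema width."""
    bad = []
    for lineno, row in enumerate(csv.reader(io.StringIO(csv_text)), start=1):
        if len(row) != expected_cols:
            bad.append((lineno, len(row)))
    return bad

# The unquoted comma in the third row splits it into four fields.
sample = "a,b,c\n1,2,3\nx,y,z,extra\n"
print(find_bad_rows(sample, 3))  # [(3, 4)]
```

Note that csv.reader respects quoting, so a comma inside a properly quoted field does not inflate the count; only unquoted delimiters do, which is the same behavior that tripped up the load here.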
