403 (Quota Exceeded)

Jaka Jancar

Jan 11, 2011, 7:36:31 PM
to BigQuery discuss
I'm getting 403 (Quota Exceeded).

Can I ask what the quotas are for queries and for imports?

Jaka Jancar

Jan 11, 2011, 7:50:29 PM
to BigQuery discuss
I also get 417 (Rate Limit Exceeded) and 403 (Rate Limit Exceeded)
occasionally.

I'm aware I'm not being the nicest to the API, but it would be good to
know the limits so I know how much to optimize.

Siddartha Naidu

Jan 11, 2011, 8:17:01 PM
to bigquery...@googlegroups.com
Hi Jaka,

Currently the limits for imports are:

RATE_LIMIT: 5 imports / 10 mins
QUOTA_LIMIT: 150 imports / day

The 403/417 codes help distinguish short-term throttling from the long-term (daily) quota.

We do modify these limits based on our backend capacity, so they may change in the future. I will follow up with an update to the docs to document these limits.
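
For illustration, a client might branch on those two cases roughly like this. This is only a sketch: the endpoint URL, payload handling, and retry delay are assumptions, not documented behaviour.

<?php
// Sketch: reacting to the two kinds of limit errors described above.
// The URL passed in is a placeholder, not the real BigQuery import endpoint.
function submitImport($url, $payload, $retriesLeft = 3) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $payload);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $body   = curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    if ($status == 403 || $status == 417) {
        $error   = json_decode($body);
        $message = $error->error->message;
        if ($message == 'Rate Limit Exceeded' && $retriesLeft > 0) {
            // Short-term throttle (RATE_LIMIT: 5 imports / 10 mins): wait and retry.
            sleep(120);
            return submitImport($url, $payload, $retriesLeft - 1);
        }
        if ($message == 'Quota Exceeded') {
            // Daily quota (QUOTA_LIMIT: 150 imports / day): retrying today won't help.
            throw new Exception('Daily import quota exhausted');
        }
    }
    return $body;
}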

- siddarth

Jaka Jancar

Jan 11, 2011, 8:32:05 PM
to BigQuery discuss
Thanks Siddartha!

One thing, though: unless this was added very recently, it does not
seem to be enforced. I think I did around 4k (tiny) imports in 2 hours
today, with up to 100 tasks in parallel (each task includes uploading
to Google Storage and 2 queries, so likely ~20 imports running at any
one time).

Also, are similar limits planned to stay in place after launch, or is
this just during testing?

While I don't intend to run anywhere near this many imports, I was
hoping to make an import every 5 minutes (288 imports / day) to keep
the data near real-time, which is fine for RATE_LIMIT but over
QUOTA_LIMIT.
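
To put numbers on that (assuming the limits quoted above are unchanged):

<?php
// Quick arithmetic against the quoted limits.
$minutesPerDay = 24 * 60;                 // 1440
$plannedPerDay = $minutesPerDay / 5;      // one import every 5 min = 288 per day
$plannedPer10  = 10 / 5;                  // = 2 imports per 10 min, under RATE_LIMIT (5 / 10 min)
$quotaPerDay   = 150;                     // QUOTA_LIMIT
echo "planned $plannedPerDay imports/day vs quota of $quotaPerDay\n";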

Siddartha Naidu

Jan 11, 2011, 8:42:36 PM
to bigquery...@googlegroups.com
Hi Jaka,

As I mentioned, we have varied these quota limits over time. We did just set them lower, so that is likely the point at which you started seeing issues.

We do intend to revise the limits post-launch, and also as we make changes in our backend.

As for your feedback on the specific limits: we will adjust the quota limit based on service usage and maintain fair usage of resources across trusted testers. If we have sufficient capacity to allow higher limits, we will definitely raise them.

- siddarth

Torre Wenaus

Jan 13, 2011, 8:33:35 PM
to bigquery...@googlegroups.com
Hi,
After uploading only a few GB to BigTable today I hit 'Quota Exceeded', and found out why when googling brought me here. As has been noted, there is no documentation of such a limit. The documentation that does exist advertises, for example, 240 BILLION ROW applications.
But a few million rows in and the door is slammed in my face. So much for testing whether BigQuery can give us better scalability than our present Oracle-based archive. 150 imports a day? No scaling test possible, or needed. You need to document these limits so people don't waste their time. I think it is pretty natural to have higher expectations than this for a Google (Google!) data service, even a pre-release developer version. I certainly did.
  - Torre Wenaus

Jaka Jancar

Jan 25, 2011, 1:19:17 PM
to BigQuery discuss
Do these limits apply only to starting new imports or to all requests?

I start new imports with at least 2 min between each, and while
they're in progress, I check their status every 15 seconds. After 2 or
3 imports I'll always start getting 403s.
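
For context, the request pattern is roughly the following, so if status checks count against the same limits the totals add up quickly (the function names here are just placeholders for the actual API calls):

<?php
// Sketch of the import-then-poll pattern described above.
// startImport() / checkImportStatus() are placeholders, not documented calls.
$importId = startImport($payload);          // a new import at most every 2 minutes
do {
    sleep(15);                              // status check every 15 seconds
    $state = checkImportStatus($importId);  // ~40 status checks per 10 minutes
} while ($state == 'PENDING' || $state == 'RUNNING');
// If status checks count: ~5 starts + ~40 checks = ~45 requests / 10 min,
// far above a 5-requests / 10 min reading of the rate limit.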

Michael Sheldon

Jan 25, 2011, 1:49:23 PM
to bigquery...@googlegroups.com
Hi Jaka,

There might be other reasons to get a 403 response. When you see these, what error message is included in the body of the response?

Also, if you can give me a time range for when you hit this unexpected 403 rate-limit error, I can try to find more information for you.

Thank you,

--Michael Sheldon

Jaka Jancar

Jan 25, 2011, 3:09:55 PM
to BigQuery discuss
I got one just now:

stdClass Object
(
    [error] => stdClass Object
        (
            [errors] => Array
                (
                    [0] => stdClass Object
                        (
                            [domain] => global
                            [reason] => usageLimits
                            [message] => Rate Limit Exceeded
                        )

                )

            [code] => 403
            [message] => Rate Limit Exceeded
        )

)
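
Assuming that object is just json_decode() applied to the response body, the fields Michael asked about are reachable as:

<?php
// Pulling the relevant fields out of the decoded error body shown above.
$decoded = json_decode($responseBody);             // the object printed above
$code    = $decoded->error->code;                  // 403
$reason  = $decoded->error->errors[0]->reason;     // usageLimits
$message = $decoded->error->errors[0]->message;    // Rate Limit Exceeded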

Jaka Jancar

Jan 25, 2011, 3:20:52 PM
to BigQuery discuss
Got another one. And as I said, I'm throttling to 1 import / 2
minutes:

+----------------------------+
| submissionAttemptTimestamp |
+----------------------------+
| 2011-01-25 20:02:24        |
| 2011-01-25 20:04:24        |
| 2011-01-25 20:06:24        |
| 2011-01-25 20:08:24        |
| 2011-01-25 20:10:24        |
| 2011-01-25 20:12:24        |
| 2011-01-25 20:14:25        |
| 2011-01-25 20:16:26        |
| 2011-01-25 20:18:28        |
+----------------------------+
9 rows in set (0.00 sec)

Michael Sheldon

Jan 26, 2011, 7:10:09 PM
to bigquery...@googlegroups.com
Hello Jaka,

There does seem to be a problem here, but I'm still investigating to find the root cause. I'll follow up when I know more.

Thanks for bringing this to our attention!

--Michael Sheldon

Jaka Jancar

Feb 2, 2011, 6:59:13 PM
to BigQuery discuss
Hi Michael,

Which requests count against the quota limit? Only starting new
imports, or checking their status too?

I'm hitting the limit with 50 imports in the last day.

Michael Sheldon

Feb 3, 2011, 8:17:24 PM
to bigquery...@googlegroups.com
Hi Jaka,

We've figured out the problem with the quota and are working on a fix. We were double-counting quota requests during imports. This explains why you are effectively getting 2.5 requests per 10 minutes (you should get 5) and 50 per day (you should get 100 requests, each with a maximum of 5 GB of data).
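
In other words, with each import counted twice the effective numbers are simply halved (a sketch using the figures above):

<?php
// The double-counting in plain arithmetic (numbers from the message above).
$nominalPer10Min   = 5;
$nominalPerDay     = 100;                      // each request up to 5 GB of data
$effectivePer10Min = $nominalPer10Min / 2;     // 2.5, matching what Jaka observed
$effectivePerDay   = $nominalPerDay / 2;       // 50, matching the ~50 imports/day ceiling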

We will let you know when the fix is available.

Thank you,

--Michael Sheldon

Jaka Jancar

Feb 3, 2011, 8:58:50 PM
to BigQuery discuss
Great!