Task Queue Quota Errors, but I have enough quota


John Wheeler

Feb 8, 2012, 9:31:30 PM
to google-a...@googlegroups.com
Hello,

I am experiencing the error: The API call taskqueue.BulkAdd() required more quota than is available.

I am sure I have enough quota. I have 1 GB allocated to task queue stored task bytes (the green meter on the task queue page stays under the halfway mark), and I'm nowhere near even the free stored-task-count or API-call limits.

This seems to be a new problem I've been experiencing (last few days).

Any ideas?

John

Robert Kluin

Feb 9, 2012, 2:05:54 AM
to google-a...@googlegroups.com
Hey John,
Just wondering, are you seeing those when inserting really large
batches or just a few tasks?

Robert

> --
> You received this message because you are subscribed to the Google Groups
> "Google App Engine" group.
> To view this discussion on the web visit
> https://groups.google.com/d/msg/google-appengine/-/x6RZezOEFz4J.
> To post to this group, send email to google-a...@googlegroups.com.
> To unsubscribe from this group, send email to
> google-appengi...@googlegroups.com.
> For more options, visit this group at
> http://groups.google.com/group/google-appengine?hl=en.

Johan Euphrosine

Feb 9, 2012, 9:20:44 AM
to google-a...@googlegroups.com
Hi John,

How often are you calling BulkAdd per minute?

--
Johan Euphrosine (proppy)
Developer Programs Engineer
Google Developer Relations

John Wheeler

Feb 9, 2012, 3:45:47 PM
to google-a...@googlegroups.com
Hi Robert and Johan, I am not calling BulkAdd at all. However, I am using MapReduce, and that might be calling BulkAdd. Inside my MapReduce job, I am just adding tasks one at a time to the task queue.

I might add as many as a few thousand tasks per minute.

MapReduce job -> each entity -> 10-3000 tasks

I do not remember having this problem before, but my app has been growing in size.

Please advise

Robert Kluin

Feb 10, 2012, 1:37:17 AM
to google-a...@googlegroups.com
Hi John,
BulkAdd is just a level down in the taskqueue stack, so you are calling it, just not directly. I've also occasionally hit this quota in short bursts, and then it goes away. Are you seeing this for a prolonged period, or just periodic small bursts of the error in your logs?

Are you adding 3000 tasks one at a time or in batch inserts? And
are you saying that each entity produces 10 to 3000 tasks or that
there will be between 10 and 3000 tasks in total?

If you're inserting very large numbers of tasks rapidly, what
happens if you throttle that back a little?
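Robert's batching-and-throttling suggestion can be sketched in plain Python. This is a hedged illustration, not App Engine API: the `add_fn` callback stands in for a real `Queue.add()` call, the `enqueue_batched` helper is invented for this sketch, and the 100-task cap is assumed as the per-call batch limit.

```python
import time

MAX_TASKS_PER_ADD = 100  # assumed per-call batch limit for a single add/BulkAdd


def chunk(items, size):
    """Yield successive slices of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]


def enqueue_batched(tasks, add_fn, pause=0.0):
    """Insert tasks in batches; `add_fn` stands in for Queue.add(batch).

    Batching turns N single-task inserts into ceil(N / 100) API calls,
    and an optional pause between calls crudely throttles the burst rate.
    Returns the number of add calls made.
    """
    calls = 0
    for batch in chunk(tasks, MAX_TASKS_PER_ADD):
        add_fn(batch)
        calls += 1
        if pause:
            time.sleep(pause)  # spread the calls out instead of bursting
    return calls
```

For example, inserting 250 tasks this way makes three add calls instead of 250, which is exactly the kind of "throttle that back a little" change being suggested.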

Robert


Brandon Wirtz

Feb 10, 2012, 1:47:16 AM
to google-a...@googlegroups.com
Did you tweak your bucket size and token rate? I have never tried to see how far I
could take it.


Mike Wesner

Feb 10, 2012, 12:55:02 PM
to Google App Engine
The bucket and rates are for controlling execution. They don't limit
adding tasks to a queue.

-mike

Brandon Wirtz

Feb 10, 2012, 3:52:58 PM
to google-a...@googlegroups.com
> The bucket and rates are for controlling execution. They don't limit
> adding tasks to a queue.

Doesn't the token bucket?

Also, there is a max queue size in MB you can specify in the YAML. I don't know
what the limit is.

My rough understanding was that:

Token rate was how many tasks/s you could add.

Process rate was how fast tasks would be processed.

Queue size was the number of tasks that could be pending.

Time out was the time to expire.


If your token rate is 5 per second,
and your process rate is 2 per second,
and your queue size is 5,000,
and your time out is 75 minutes:

If 18 people wanted to make 10 new tasks each within 3 seconds, the token bucket
would empty and only 15 tasks would be created.

If 500 people an hour wanted to create 20 tasks each (10,000 tasks over 3,600
seconds, which yields 2.78 tasks/s), the token bucket would not go dry. The process
bucket would spin up enough instances to process the tasks, and it would try
to space them out so that it would take 5,000 seconds.
But you would lose some of those tasks, because the task queue would not have
processed all of them before it hit the 5,000-task limit (7,200-ish
would complete).
And because you are limiting tasks to 75 minutes, while it would take 1 hour
23 minutes to process them, a portion would get dropped.

Mike Wesner

Feb 10, 2012, 4:47:03 PM
to Google App Engine
The payloads of the tasks count towards the stored task bytes quota. So
that does limit how many you can add to the queue, but that is not the
issue here.

The bucket/token stuff doesn't impact adding to the queue.

from the docs...

"""The task queue uses token buckets to control the rate of task
execution. Each named queue has a token bucket that holds a certain
number of tokens, defined by the bucket_size directive. Each time your
application executes a task, it uses a token. Your app continues
processing tasks in the queue until the queue's bucket runs out of
tokens. App Engine refills the bucket with new tokens continuously
based on the rate that you specified for the queue."""
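As a rough model of the behavior the docs describe, here is a toy token bucket in plain Python. The class name and interface are invented for illustration; the real mechanism runs server-side inside App Engine, and this sketch only mirrors the documented semantics: executing a task spends a token, and tokens refill at the configured rate up to `bucket_size`.

```python
class TokenBucket:
    """Toy model of the execution-rate control described in the docs."""

    def __init__(self, bucket_size, refill_rate):
        self.capacity = bucket_size
        self.tokens = bucket_size   # bucket starts full, allowing an initial burst
        self.rate = refill_rate     # tokens added per second

    def refill(self, elapsed_seconds):
        """Add tokens for elapsed time, never exceeding the bucket size."""
        self.tokens = min(self.capacity, self.tokens + self.rate * elapsed_seconds)

    def try_execute(self):
        """Executing a task consumes one token; fail if the bucket is dry."""
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

With `bucket_size=5` and `rate=2`, a burst of 8 ready tasks runs only 5 immediately; after one second of refill, 2 more can run. Note this models task *execution*, which is Mike's point: none of it gates adding tasks to the queue.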

John Wheeler

Feb 10, 2012, 4:52:05 PM
to google-a...@googlegroups.com
Hi, I switched my tasks from enqueuing more tasks to using fan-out with a cursor, because it is production and I don't have time to experiment. Besides, I've been meaning to convert the thing to fan-out for a while now, so this happened to be a good time to do it. Sorry I can't help troubleshoot it further.

I'm not sure what caused the problem, but I run my MapReduce job twice a day. I only noticed the problem over the last few days, and sometimes it would not occur for a job. FYI
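For illustration, one way to read a cursor-based approach like John's is a chain of continuation tasks: each task processes one page of entities, then re-enqueues itself with the next cursor instead of spraying thousands of individual tasks. This plain-Python sketch simulates the task queue with a list; the function names, page size, and driver loop are all hypothetical, not John's actual code.

```python
PAGE_SIZE = 100  # assumed page size per continuation task


def process_page(entities, cursor, handle, enqueue):
    """One task's work: handle a page, then re-enqueue from the next cursor."""
    page = entities[cursor:cursor + PAGE_SIZE]
    for entity in page:
        handle(entity)
    next_cursor = cursor + len(page)
    if next_cursor < len(entities):
        enqueue(next_cursor)  # one continuation task instead of N new tasks


def run_fanout(entities, handle):
    """Drive the task chain synchronously, simulating the queue with a list."""
    pending = [0]
    while pending:
        cursor = pending.pop()
        process_page(entities, cursor, handle, pending.append)
```

The appeal over per-entity enqueuing is that the insert rate is bounded: at most one task is added per page processed, so bursts of thousands of adds per minute never happen.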

Robert Kluin

Feb 11, 2012, 1:13:30 AM
to google-a...@googlegroups.com
Hey Brandon,
No offense, but there's almost nothing correct in this. I'm also
not sure where some of the terms come from -- I couldn't find
references to them in the Python or Java docs.

Tasks will basically be run as fast as your settings allow. There
are three main settings: rate, bucket_size, and
max_concurrent_requests. The bucket size controls the "token bucket"
behavior. Rate controls the frequency at which tasks may be selected
from the queue. The concurrent request limit gives you more fine
grained control of a queue (handy with the new pricing).
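For reference, those three settings can be sketched in a queue.yaml like the following. The queue name and all values here are purely illustrative, and `total_storage_limit` is assumed to be the separate top-level directive that caps stored task bytes:

```yaml
# Illustrative queue.yaml; names and numbers are made up.
total_storage_limit: 500M   # caps stored task bytes across all queues

queue:
- name: fanout-queue
  rate: 20/s                  # how fast tasks may be pulled for execution
  bucket_size: 40             # burst allowance (the token bucket)
  max_concurrent_requests: 10 # fine-grained cap on simultaneous task requests
```

None of these directives limit how many tasks you can *add*; they only shape execution, which is the distinction being made in this thread.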

The only limits to how many tasks you can have, that I'm aware of, are
the stored task count/bytes quotas, and possibly the API calls quota.

Tasks shouldn't ever be lost. If an app is actually losing tasks,
that is a bug in either the app's code or App Engine's.

I'm not sure what timeout you're referring to. You can specify a
delay before a task will be run (the eta or countdown params), and for
pull queues a lease duration.


Robert


Brandon Wirtz

Feb 11, 2012, 1:50:06 AM
to google-a...@googlegroups.com
I could be totally off my rocker. This is how they were explained to me by
someone on SO a while back. I'm traveling, so I don't have the thread on hand.

I have only ever tweaked the rate tasks are processed.

Simon Knott

Feb 11, 2012, 4:52:00 AM
to google-a...@googlegroups.com
I always liked this picture by Ikai for explaining Task Queue configurations - http://twitpic.com/3y5814/full