Just released: Python SDK 1.2.3


Jason (Google)

Jun 18, 2009, 7:52:03 PM
to Google App Engine
A new release of the Python SDK was made available earlier today. In
addition to oft-requested support for Django 1.0 and asynchronous URL
Fetch, this release introduces the experimental Task Queue API, which
allows you to perform offline processing on App Engine by scheduling
bundles of work (tasks) for automatic execution in the background
without having to worry about managing threads or polling.
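
To give a flavor of the API, enqueueing a task takes just a couple of
lines (a minimal sketch; the /worker URL and the params are
placeholders):

from google.appengine.api.labs import taskqueue

# Enqueue a unit of work; App Engine will POST the params to /worker
# in the background and retry automatically on failure.
taskqueue.add(url='/worker', params={'key': 'some-key'})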

You can download the newest SDK directly from the Downloads page at
http://code.google.com/appengine/downloads.html.

Check out the official blog post, release notes, and documentation for
more on the newest features of the SDK. And, as always, please feel
free to share your questions, suggestions, and comments on the group.

http://googleappengine.blogspot.com/2009/06/new-task-queue-api-on-google-app-engine.html
http://code.google.com/p/googleappengine/wiki/SdkReleaseNotes
http://code.google.com/appengine/docs/python/taskqueue/

Cheers!
- Jason

风笑雪

Jun 18, 2009, 9:03:45 PM
to google-a...@googlegroups.com
Great job and many thanks!

These 3 updates are very useful for me:

Task Queue support available as google.appengine.api.labs.taskqueue.
Django 1.0 support. You must install Django locally on your machine for the SDK but no longer need to upload it to App Engine.
Urlfetch supports asynchronous requests.
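
For reference, the asynchronous URL Fetch flow looks roughly like this
(a minimal sketch; the URL is a placeholder):

from google.appengine.api import urlfetch

rpc = urlfetch.create_rpc(deadline=5)
urlfetch.make_fetch_call(rpc, 'http://example.com/')  # start the fetch

# ... do other work while the fetch is in flight ...

result = rpc.get_result()  # block until the fetch completes
if result.status_code == 200:
    text = result.content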


Takashi Matsuo

Jun 18, 2009, 10:42:37 PM
to google-a...@googlegroups.com
Thank you very much for your great work!

I've tried asynchronous urlfetch. As far as I can tell, it works well
on App Engine.

I've got a feeling that async urlfetch doesn't actually run
asynchronously on the SDK. The SDK is single threaded, so that is
understandable.

But if that's true, it might be better to mention this difference in
the urlfetch documentation.

http://code.google.com/appengine/docs/python/urlfetch/asynchronousrequests.html

I also noticed that the second example in this document lacks a line
invoking the make_fetch_call() method. I guess the 2nd example should
be like the following:

rpcs = []
for url in urls:
    rpc = urlfetch.create_rpc()
    # Bind rpc as a default argument so each callback sees its own RPC
    # rather than the loop variable's final value.
    rpc.callback = lambda rpc=rpc: handle_result(rpc)
    urlfetch.make_fetch_call(rpc, url)
    rpcs.append(rpc)
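
Presumably the loop would then be followed by waiting on each RPC, so
every callback gets a chance to run before the handler exits; a sketch:

for rpc in rpcs:
    rpc.wait()  # blocks until this fetch finishes and its callback fires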

Regards,

-- Takashi Matsuo

cz

Jun 19, 2009, 2:11:32 AM
to Google App Engine
Thank you!
These are really great new features.

Sylvain

Jun 19, 2009, 4:35:13 AM
to Google App Engine
Cool, great job. Very good release!

Would it be possible to have an example of "bucket_size" usage?
How does it work?
Currently, it is not clear to me.
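
For what it's worth, bucket_size is configured per queue in queue.yaml.
A minimal sketch, assuming the documented token-bucket model where the
bucket refills at "rate" and caps bursts at "bucket_size":

queue:
- name: default
  rate: 5/s        # tokens are added to the bucket at 5 per second
  bucket_size: 10  # at most 10 tasks can run in a burst when the bucket is full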

Regards

Barry Hunter

Jun 19, 2009, 4:54:59 AM
to google-a...@googlegroups.com
Excellent!

Are there any limits on the 'params' structure in the task queue?

Can we (should we!?!) pass around really big data this way, or would
it be best to store it in memcache (for example) and pass just the key?

Paul Kinlan

Jun 19, 2009, 5:51:31 AM
to google-a...@googlegroups.com
Barry,

I believe you treat each task as a web request, and at the moment
there is a 10K limit
(http://code.google.com/appengine/docs/python/taskqueue/overview.html)
on the size of task items. I believe the best course of action is to
stash the data in memcache (although I am sure you may get instances
where it has been evicted from memcache). From what I understand,
enqueuing onto the task queue is a lot faster than storing a temp
object in the datastore, and depending on your reason for using the
queue, persisting the object to the datastore might negate some of its
usefulness.
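
A minimal sketch of that pattern (the key scheme, job_id, and /worker
URL are made up):

from google.appengine.api import memcache
from google.appengine.api.labs import taskqueue

def enqueue_big_job(job_id, big_payload):
    # Stash the payload in memcache and pass only its key through the queue.
    key = 'payload:%s' % job_id
    memcache.set(key, big_payload, time=3600)
    taskqueue.add(url='/worker', params={'payload_key': key})

# In the /worker handler, something like:
#   payload = memcache.get(self.request.get('payload_key'))
#   if payload is None:
#       ...  # evicted from memcache; re-fetch from the source or give up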

I think some experimentation is needed.

Paul


Ubaldo Huerta

Jun 19, 2009, 7:05:08 AM
to Google App Engine
Regarding Django support.

Is it 1.0.2 support or just 1.0 support?

I'm currently using zip import (which slows things down significantly
when the app instance is cold). The release notes say that Django
needs to be installed. But where? Is 0.96 removed?


Sylvain

Jun 19, 2009, 8:08:20 AM
to Google App Engine
Django 1.0.2.

Check http://code.google.com/intl/fr/appengine/docs/python/tools/libraries.html#Django
for the version.

from google.appengine.dist import use_library
use_library('django', '1.0')
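
(Note: this goes at the very top of your handler script, before
anything imports Django.)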
---

Neves

Jun 19, 2009, 12:47:03 PM
to Google App Engine
Is TaskQueue the Google solution for long-running tasks, or will there
be another new API?

What about Big File Storage? I heard nothing about it at Google I/O.

Great work guys!


CaiSong

Jun 20, 2009, 12:38:15 AM
to Google App Engine
-------------------------------------------------------------------------------------------------
from google.appengine.dist import use_library
use_library('django', '1.0')

import logging, os

# Must set this env var *before* importing any part of Django.
os.environ['DJANGO_SETTINGS_MODULE'] = 'settings'

# Google App Engine imports.
from google.appengine.ext.webapp import util

# Force Django to reload its settings.
from django.conf import settings
settings._target = None

import django.core.handlers.wsgi
import django.core.signals
import django.db

def log_exception(*args, **kwds):
    logging.exception('Exception in request:')

# Log errors. (Django 1.0 signals API: connect directly on the signal.)
django.core.signals.got_request_exception.connect(log_exception)

# Unregister the rollback event handler.
django.core.signals.got_request_exception.disconnect(
    django.db._rollback_on_exception)

def main():
    # Re-add Django 1.0 archive to the path, if needed.
    # if django_path not in sys.path:
    #     sys.path.insert(0, django_path)

    # Create a Django application for WSGI.
    application = django.core.handlers.wsgi.WSGIHandler()

    # Run the WSGI CGI handler with that application.
    util.run_wsgi_app(application)

if __name__ == '__main__':
    main()
-------------------------------------------------------------------------------------------------

Thomas Winningham

Jun 20, 2009, 11:00:11 AM
to Google App Engine
Some initial thoughts on using the Task Queue API:

1. It is very easy to create a chain reaction if you don't know what
you are doing :P

2. Using the queues with dev_appserver.py is very nice, in that
you can test things out and see how things get queued.

3. Would like to see a flush-queue option (or something) in the
production server, as well as a way to look at the queue.

4. My (horrible) first try at queues with production data spawned a
lot of tasks, most of which now I wish I could just remove and start
over.

5. It seemed like I generated 10x the tasks I was expecting. Not sure
if that is my mistake, but it didn't seem to have this order of
magnitude when I tried with development data, so I am not sure if that
is my fault or what.

6. Currently my queue is stuck and not progressing, again, not sure if
that is my fault or not.

Thanks again, the API itself is drop-dead simple and fun.

-t

mdipierro

Jun 21, 2009, 12:56:07 PM
to Google App Engine
Congratulations!

It also works great with web2py.

web2py has a built-in cron-like mechanism, but it does not work on
GAE. Now, thanks to the Task Queue API, we may be able to port our
cron API (or some part of it) to GAE.
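
A minimal sketch of how such a recurring job might work (the /cron-tick
URL, the 60-second interval, and do_periodic_work are all made up):

from google.appengine.api.labs import taskqueue
from google.appengine.ext import webapp

class CronTick(webapp.RequestHandler):
    def post(self):
        do_periodic_work()  # hypothetical: whatever the scheduled job does
        # Re-enqueue ourselves; countdown delays execution by 60 seconds.
        taskqueue.add(url='/cron-tick', countdown=60)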

Massimo


Jon McAlister

Jun 22, 2009, 6:18:20 AM
to google-a...@googlegroups.com
On Sat, Jun 20, 2009 at 5:00 PM, Thomas <winni...@gmail.com> wrote:
>
> Some initial thoughts using the task queue api:
>
> 1. It is very easy to create a chain reaction if you don't know what
> you are doing :P

Indeed it is :-P

> 2. Using the queues with dev_appserver.py is very nice, in that
> you can test things out and see how things get queued.
>
> 3. Would like to see a flush-queue option (or something) in the
> production server, as well as a way to look at the queue.

We don't have this right now, but will certainly add it eventually.
In the meantime, I would encourage you to file this on the issue
tracker.

> 4. My (horrible) first try at queues with production data spawned a
> lot of tasks, most of which now I wish I could just remove and start
> over.

One thing you can do is pause the queue. Another is to push a new
version of your app that has a very simple handler for the URL the
tasks are using; that way it will quickly eat through all of the
dangling tasks.
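
A minimal sketch of such a drain handler (the /worker URL is a
placeholder):

from google.appengine.ext import webapp
from google.appengine.ext.webapp import util

class DrainHandler(webapp.RequestHandler):
    def post(self):
        # Return 200 without doing any work, so each task is marked
        # done and removed from the queue.
        pass

application = webapp.WSGIApplication([('/worker', DrainHandler)])

def main():
    util.run_wsgi_app(application)

if __name__ == '__main__':
    main()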

> 5. It seemed like I generated 10x the tasks I was expecting. Not sure
> if that is my mistake, but it didn't seem to have this order of
> magnitude when I tried with development data, so I am not sure if that
> is my fault or what.
>
> 6. Currently my queue is stuck and not progressing, again, not sure if
> that is my fault or not.
>
> Thanks again, the API itself is drop-dead simple and fun.

Glad to hear!
