API call datastore_v3.Put() required more quota?


@zghanv/-

Sep 13, 2011, 10:21:11 PM
to Google App Engine
I'm trying to delete testing data from my app via Datastore Admin,
but it gives the following error. My High Replication Data quota is at
100%, and now I want to reset all data. How can I remove it?

Thanks.

---

Delete Job Status

There was a problem kicking off the jobs. The error was:

The API call datastore_v3.Put() required more quota than is available.

---

Dashboard status ...

CPU Time
30% 1.95 of 6.50 CPU hours
Outgoing Bandwidth
2% 0.02 of 1.00 GBytes
Incoming Bandwidth
0% 0.00 of 1.00 GBytes
Total Stored Data
0% 0.00 of 1.00 GBytes
Recipients Emailed
0% 0 of 2,000
High Replication Data
100% 0.50 of 0.50 GBytes This resource is currently experiencing a
short-term quota limit.
Backend Usage
0% $0.00 of $0.72

@zghanv/-

Sep 15, 2011, 9:04:56 AM
to Google App Engine
Any expert?

sofia

Sep 16, 2011, 12:24:14 PM
to google-a...@googlegroups.com
I have the same problem: datastore quota at 100% and unable to delete entities either through the admin console or using map/reduce. What I did was set up a script to delete x records at a time. I've managed to decrease the data by 45% in 2 days, but then I hit the CPU quota, so I'm guessing it's going to take a few more days until I'm able to delete all the data. Here is the script I set up, bulkdelete.py:

#!/usr/bin/env python
# -*- coding: UTF-8 -*-

import time

from google.appengine.ext import db
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app

from lib.model import Documents


class BulkDelete(webapp.RequestHandler):
    def get(self):
        self.response.headers['Content-Type'] = 'text/plain'
        mod = self.request.get('m')
        if not mod:
            return  # guard parameter missing: do nothing

        try:
            while True:
                # Keys-only query: delete 200 entities per batch.
                q = db.GqlQuery("SELECT __key__ FROM Documents ORDER BY date ASC")
                assert q.count()  # raises AssertionError when nothing is left
                db.delete(q.fetch(200))
                time.sleep(0.5)
        except Exception, e:
            self.response.out.write(repr(e) + '\n')


application = webapp.WSGIApplication([('/bulkdelete', BulkDelete)], debug=True)


def main():
    run_wsgi_app(application)


if __name__ == '__main__':
    main()

and then call it every 5 minutes through cron.yaml.
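A minimal cron.yaml entry matching the setup sofia describes might look like the following. The URL path comes from the handler mapping above; the `m=1` guard value and the description text are assumptions for illustration:

```yaml
cron:
- description: delete Documents entities in batches
  url: /bulkdelete?m=1
  schedule: every 5 minutes
```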

Gerald Tan

Sep 16, 2011, 12:51:28 PM
to google-a...@googlegroups.com
From my experience (at least with Java) I believe sleep() continues to eat cpu time despite the cpu not really doing anything. My cpu times skyrocketed when I used sleep() with long-held http connections to implement http push back before channel api was available. I recommend breaking the work into taskqueue tasks instead.
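The task-chaining pattern Gerald suggests can be sketched in plain Python, independent of the App Engine SDK. The in-memory `store` set and `deque` here are stand-ins for the datastore and the task queue; the names and batch size are illustrative, not App Engine APIs:

```python
from collections import deque

def delete_batch(store, queue, batch_size=200):
    """Delete up to batch_size keys, then chain a new task if work remains."""
    for key in list(store)[:batch_size]:
        store.discard(key)
    if store:
        # Instead of sleeping inside one long request, enqueue a fresh task.
        queue.append(delete_batch)

def run(store):
    # The deque plays the role of the task queue, running tasks one by one.
    queue = deque([delete_batch])
    while queue:
        task = queue.popleft()
        task(store, queue)

store = set(range(1000))  # 1000 fake entity keys
run(store)
print(len(store))  # prints 0
```

Each short task does a bounded amount of work and re-enqueues itself, so no single request holds the CPU while idle.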

@zghanv/-

Sep 24, 2011, 1:48:02 PM
to Google App Engine
Is any official Google engineer present in this group?


Brandon Wirtz

Sep 24, 2011, 2:20:38 PM
to google-a...@googlegroups.com

Sleep in non-GAE environments means "do other things and check back on me in X time."

With instance-hour billing, sleep would only save you money if you have concurrency to handle multiple tasks, if that actually freed CPU cycles, AND if you weren't sleeping longer than the thing you were waiting to have happen.


djidjadji

Sep 29, 2011, 6:55:23 AM
to google-a...@googlegroups.com
If you delete many entities, it's faster to use a cursor. An entity is
only marked as deleted at first and is physically removed from the
datastore at a later time, and it takes time to skip those marked
entities when executing a fresh query each iteration.
The loop can be modified as below. There is no need to use count(), as
it also costs CPU and time; if the result set is empty, there are no
entities left.

try:
    cursor = None
    while True:
        q = Documents.all(keys_only=True)
        if cursor:
            # Resume where the previous batch left off instead of
            # re-scanning over entities already marked as deleted.
            q.with_cursor(cursor)
        result = q.fetch(200)
        if not result:
            break
        cursor = q.cursor()
        db.delete(result)
except Exception, e:
    self.response.out.write(repr(e) + '\n')

On 16 September 2011 at 18:24, sofia <sofiac...@gmail.com> wrote:
