I'm using dbfpy to read records from a Blobstore entry, and I can't get through 24K records before hitting the 10-minute wall (the handler runs in a task queue). Here's my code:
    def get(self):
        count = 0
        cols = ['R_MEM_NAME', 'R_MEM_ID', 'R_EXP_DATE', 'R_STATE', 'R_RATING1', 'R_RATING2']
        blobkey = self.request.get('blobkey')
        blob_reader = blobstore.BlobReader(blobkey)
        dbf_in = dbf.Dbf(blob_reader, True)
        try:
            if dbf_in.fieldNames[0] == 'R_MEM_NAME':
                pass
        except:
            logging.info("Invalid record type: %s", dbf_in.fieldNames[0])
            return
        mysql = mysqlConnect.connect('ratings')
        db = mysql.db
        cursor = db.cursor()
        for rec in dbf_in:
            count = count + 1
            if count == 1:
                continue
            continue
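To narrow down where the time goes, one option (a diagnostic sketch of my own, not something from the post) is to log elapsed time at regular checkpoints while iterating; if the per-checkpoint time keeps growing, the reader itself slows down as it advances through the blob. `record_source` below is a hypothetical stand-in for the `dbf_in` iterator:

```python
import time

def profile_iteration(record_source, checkpoint=1000):
    """Record elapsed wall-clock time every `checkpoint` records.

    Returns a list of (record_count, seconds_since_start) pairs so the
    growth rate can be inspected: roughly linear growth means constant
    per-record cost, while a super-linear curve points at the reader
    getting slower the deeper it goes into the file.
    """
    timings = []
    start = time.time()
    for count, rec in enumerate(record_source, 1):
        if count % checkpoint == 0:
            timings.append((count, time.time() - start))
    return timings
```

In the handler this would be called as `profile_iteration(dbf_in)` (with the results written via `logging.info`), replacing the bare counting loop while debugging.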
This simple loop should finish in seconds; instead it gets through a few thousand records and then hits the wall.
Note the trailing "continue", which I added to bypass the MySQL inserts (previously my prime suspect).
I'm stumped and stuck.
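One thing worth ruling out (an assumption on my part, not confirmed by the post): dbfpy issues many small read/seek calls per record, and each of those against `BlobReader` may carry App Engine overhead. Buffering the whole blob into memory first collapses them into one bulk read. The `CountingReader` below is a hypothetical stand-in for `BlobReader` that just counts how often it is asked for data, to illustrate the effect:

```python
import io

class CountingReader(object):
    """File-like wrapper that counts read() calls on the wrapped stream."""

    def __init__(self, raw):
        self.raw = raw
        self.read_calls = 0

    def read(self, size=-1):
        # Every call here models one (potentially expensive) trip to the
        # underlying Blobstore-backed reader.
        self.read_calls += 1
        return self.raw.read(size)

def buffer_in_memory(reader):
    """Pull the entire stream into memory with a single bulk read,
    returning a seekable in-memory file to hand to the DBF parser."""
    return io.BytesIO(reader.read())
```

With this approach the handler would do something like `dbf_in = dbf.Dbf(buffer_in_memory(blob_reader), True)`, assuming the DBF file fits comfortably in the instance's memory; otherwise raising `BlobReader`'s buffer size is a smaller-scale variant of the same idea.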