(<class 'sqlite3.OperationalError'> parser stack overflow) when writing a row with 140 fields

209 views

Jaison Raj

Jan 18, 2018, 9:31:49 PM
to web2py-users
Hi guys,

 I am trying to write a row with 140 fields into the database, with the following script:

controller:
WCEL={item1:value1,item2:value2,.......item140:value140}

db.WCEL.update_or_insert(**(WCEL))

model:
db = DAL('sqlite://storage.sqlite1')

db.define_table('WCEL',
                Field('item1'),
                Field('item2'),
                ......
                Field('item139'),
                Field('item140')
                               )

 but I am getting the following error:

<class 'sqlite3.OperationalError'> parser stack overflow



 File "C:\web2py\gluon\packages\dal\pydal\adapters\base.py", line 412, in execute
rv = self.cursor.execute(command, *args[1:], **kwargs)
OperationalError: parser stack overflow

Function argument list

(self=<pydal.adapters.sqlite.SQLite object>, *args=('SELECT "WCEL"."id", "WCEL."it.."."item140" = \'0\')) LIMIT 1 OFFSET 0;',), **kwargs={})



I tried the same set of 140 fields by importing them from a CSV file, and it was successful.

db.WCEL.import_from_csv_file(open('WCEL.csv', 'r'))

Is there a way to rectify this issue?
I am actually extracting data from an XML file and writing it to the DB; if I have to write from XML to CSV and then import into the DB, that is not efficient.
Please advise.

Regards,
Jaison

Anthony

Jan 22, 2018, 11:24:31 AM
to web2py-users
On Thursday, January 18, 2018 at 9:31:49 PM UTC-5, Jaison Raj wrote:
I tried the same set of 140 fields by importing them from a CSV file, and it was successful.

db.WCEL.import_from_csv_file(open('WCEL.csv', 'r'))

The problem is not with writing the records but with attempting to check for existing records (.import_from_csv_file works because it does not check whether each record to be inserted matches an existing record).

If you use .insert() instead of .update_or_insert(), you shouldn't have a problem. When you use .update_or_insert(), the DAL first attempts to select a record that matches all of the fields, and it does so with a query like:

(db.mytable.field1 == 'value 1') & (db.mytable.field2 == 'value 2') & ... & (db.mytable.field140 == 'value 140')

The problem is that when the DAL joins multiple queries, it places them in nested sets of parentheses -- with just 3 queries, you get SQL that looks something like:

(((mytable.field1 = 'value 1') AND (mytable.field2 = 'value 2')) AND (mytable.field3 = 'value 3'))

With 140 fields, you get queries nested 140 levels deep in parentheses, which SQLite cannot handle. Of course, all of the nested parentheses are not really necessary, so if you want to first check for the existence of a record, one option is to avoid using .update_or_insert() and instead write your own code to check for record existence. Your code would have to manually construct the "WHERE" clause of the SQL query. The logic then might look something like:

record = db(manually_generated_raw_SQL).select(db.WCEL.id, limitby=(0, 1)).first()
if not record:
    db.WCEL.insert(**WCEL)
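For comparison, here is a self-contained sketch using the stdlib sqlite3 module directly (not the DAL; the WCEL dict here is stand-in data for the 140 extracted values) showing that a flat, parameterized WHERE clause avoids the nesting entirely:

```python
# Illustrative sketch with the stdlib sqlite3 module (not the DAL);
# WCEL is stand-in data for the 140 extracted values.
import sqlite3

WCEL = {"item%d" % i: str(i) for i in range(1, 141)}

conn = sqlite3.connect(":memory:")
cols = ", ".join("%s TEXT" % f for f in WCEL)
conn.execute("CREATE TABLE WCEL (id INTEGER PRIMARY KEY, %s)" % cols)

# Flat "item1 = ? AND item2 = ? AND ..." -- no parentheses at all,
# so the parser handles it no matter how many fields there are.
where = " AND ".join("%s = ?" % f for f in WCEL)
params = list(WCEL.values())

row = conn.execute("SELECT id FROM WCEL WHERE %s LIMIT 1" % where,
                   params).fetchone()
if row is None:
    conn.execute("INSERT INTO WCEL (%s) VALUES (%s)"
                 % (", ".join(WCEL), ", ".join("?" for _ in WCEL)),
                 params)
conn.commit()
```

Using `?` placeholders also means the values are passed to SQLite separately from the SQL text, so quoting is handled for you.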

Anthony

Anthony

Jan 22, 2018, 11:32:02 AM
to web...@googlegroups.com

Note, one simple way to generate the raw SQL query is something like:

str(reduce(lambda a, b: a & b, [db.WCEL[f] == v for f, v in iteritems(WCEL)])).replace('(', '').replace(')', '')

That generates the same query .update_or_insert would generate, converts it to SQL via str(), and then simply removes all the parentheses.
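As a standalone illustration of that trick (plain Python, with a hand-built query string standing in for the output of str(query)):

```python
# Nested SQL of the kind the DAL generates for three ANDed conditions,
# flattened by stripping every parenthesis.
nested = "(((WCEL.item1 = '1') AND (WCEL.item2 = '2')) AND (WCEL.item3 = '3'))"
flat = nested.replace('(', '').replace(')', '')
print(flat)  # WCEL.item1 = '1' AND WCEL.item2 = '2' AND WCEL.item3 = '3'
```

One caveat worth noting: this string replace assumes none of the field values themselves contain a parenthesis; if one did, the replace would strip it from the data as well.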

Anthony