big data "out of memory" error

Alex Glaros

Aug 13, 2019, 12:33:41 AM8/13/19
to web2py-users
Not sure if I'm qualified to give tips here, but I had a one-time job that timed out with an out-of-memory error. To fix that, I cloned my site (admin console: manage, pack all), keeping the same database, then in the clone's db.py replaced every instance of 'reference my_big_tables' with 'integer'. The job ran quickly, and I then discarded the clone. If this helps anyone, then good.

Questions: 

Is it generally accepted that, with big data tables, it's best to enforce referential integrity in the controller at record-creation time, rather than adding any rules to the database itself?

My clone kept the requires validators (requires = IS_IN_DB(db, 'my_big_tables.id' ...) and still ran fast, so can the 'reference my_big_tables' field type be safely and completely eliminated from db.py? The 'reference' type really slows things down.
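For anyone following along, here is a rough sketch of the two definitions being compared, using web2py's DAL. Table and field names here (my_big_tables, child_table, parent_id) are made up for illustration; only IS_IN_DB and the field types come from the post:

```python
# Original db.py: a true foreign key. The DAL emits a database-level
# REFERENCES constraint, which the database checks on every insert/update.
db.define_table('child_table',
    Field('parent_id', 'reference my_big_tables'))

# Alternative used in the clone: a plain integer column, so no database
# constraint is created. Integrity is enforced only by the validator,
# which runs in the application layer when records are validated.
# (Use one definition or the other, not both.)
db.define_table('child_table',
    Field('parent_id', 'integer',
          requires=IS_IN_DB(db, 'my_big_tables.id')))
```

The trade-off: the validator only fires on validated inserts (e.g. through forms or validate_and_insert), so raw inserts and out-of-band deletes can leave dangling ids that a database constraint would have rejected.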

thanks,

Alex Glaros