What is the prescribed method for doing large bulk inserts? I am using SQLite as my backend.
Is it OK to just write a Python script that talks to the db directly, or should the DAL be used? The book says the bulk_insert method is no more advantageous than a for loop.
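For concreteness, here is roughly what I mean by the two DAL approaches, as a minimal sketch. It assumes a script run in the web2py environment (e.g. "python web2py.py -S myapp -M") and a hypothetical 'thing' table defined in the model:

    # 'thing' is a hypothetical table; db comes from the app's model
    rows = [dict(name='row %d' % i) for i in range(10000)]

    # Option 1: a plain for loop of inserts through the DAL
    for r in rows:
        db.thing.insert(**r)

    # Option 2: bulk_insert with the same list of dicts; with the SQLite
    # adapter this still issues one INSERT per row, which is why the book
    # says it is no faster than the loop
    db.thing.bulk_insert(rows)

    db.commit()  # a single commit keeps everything in one transaction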
I searched this group for an answer but didn't find anything definitive.
Thanks.
PS Kudos to the web2py creator and contributors. Every day I am struck by how elegant, easy and well-designed it is.
Thanks.
So is it correct to say that:
1. There is no compelling reason to do this without the DAL.
2. My options in the DAL are bulk_insert, inserting rows in a loop, and CSV import (see the sketch below), and that performance-wise they're similar?
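For reference, the CSV route would look something like this sketch, again assuming the hypothetical 'thing' table; data.csv is assumed to have a header row naming the fields:

    # import rows from a CSV file through the DAL
    with open('data.csv', 'r') as f:
        db.thing.import_from_csv_file(f)
    db.commit()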
Is it possible that we add a "native bulk insert" function which is coded up in each adapter? Even bulk_insert goes one row at a time over ODBC, which is slow for big files. I need to load huge files all the time and I am writing custom modules to do this with a native loader (rough sketch below). Should this be a DAL option? Worth noting that this type of operation is a batch, back-end thing; I wouldn't do this for an end-user web app.
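By "custom module" I mean something along these lines: bypass the DAL and talk to the DB-API driver directly (sqlite3 here, since the OP is on SQLite; PostgreSQL would want COPY, MySQL LOAD DATA, and so on). Table, column, and file names below are hypothetical:

    import csv
    import sqlite3

    def native_bulk_load(db_path, csv_path):
        conn = sqlite3.connect(db_path)
        with open(csv_path, 'r') as f:
            reader = csv.reader(f)
            next(reader)  # skip the header row
            # executemany lets the driver run the statement over the whole iterator
            conn.executemany(
                'INSERT INTO thing (name, value) VALUES (?, ?)', reader)
        conn.commit()
        conn.close()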
I would expect that each DBMS needs different info to start a bulk load, so the interface may be tricky, or just pass a dict and let the adapter work it out.
What do you think?
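A purely hypothetical sketch of the dict-driven idea: the caller passes whatever options its backend needs, each adapter interprets the ones it understands, and anything without a native loader falls back to plain DAL inserts. The native_bulk_load hook does not exist in the DAL today; it is just the shape I have in mind:

    def bulk_load(db, table, rows, **options):
        adapter = db._adapter                                 # the DAL's backend adapter
        loader = getattr(adapter, 'native_bulk_load', None)   # hypothetical per-adapter hook
        if loader is not None:
            return loader(table, rows, **options)             # e.g. COPY / LOAD DATA / .import
        for row in rows:                                      # generic fallback, one row at a time
            table.insert(**row)
        db.commit()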