I have found the same problem using executemany() when trying to
insert into Access databases. My workaround has been to use a for-loop
around execute() and commit after every single insert. This speeds up
inserts drastically, but it is still much slower than using the GUI on
Access itself. For example, duplicating a 1GB database into another
Access database takes about 20 minutes. With executemany() I have let it
run for hours without it finishing.
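For reference, the workaround looks roughly like the sketch below. It uses
sqlite3 (Python stdlib) so it runs anywhere, but pyodbc exposes the same
DB-API connection/cursor calls, so the pattern carries over directly. The
table and column names here are made up for illustration; whether the
per-insert commit actually helps depends on the driver (it did for my
Access/pyodbc case, but it is not a general speedup).

```python
# Illustration of the workaround: loop over execute() and commit after
# every insert, instead of a single executemany() call.
# sqlite3 is used here only so the example is self-contained; with pyodbc
# you would create the connection via pyodbc.connect(...) instead.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stats (item_id INTEGER, value REAL)")

rows = [(i, i * 0.5) for i in range(1000)]

# Instead of: cur.executemany("INSERT INTO stats VALUES (?, ?)", rows)
for row in rows:
    cur.execute("INSERT INTO stats (item_id, value) VALUES (?, ?)", row)
    conn.commit()  # commit after every single insert

cur.execute("SELECT COUNT(*) FROM stats")
print(cur.fetchone()[0])  # 1000
```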
On Feb 13, 10:48 am, Johan <johan2s...@gmail.com> wrote:
> Hi,
>
> I'm calculating statistics for some 40000 items (about 10000
> datapoints per item) and every 1000th item I'm dumping the stats to a
> database. Unfortunately saving the data to the database is several
> orders of magnitude slower than calculating it. I'm on a Sandy Bridge
> quad core machine with a Vertex 3 SSD and the database is a SQL Server
> running locally. There must be something wrong here. For a completely
> different take on the same issue, see this post over at stackoverflow:
>
> http://stackoverflow.com/questions/5693885/pyodbc-very-slow-bulk-inse...