Depending on your use case, there may be an easier (or more efficient) way to do this. Both MySQL and Postgres have syntax that lets you load an entire CSV into a database table as a one-liner. To do that from within Django, you need to be able to run an external shell command, and the best way to do that is (often) with Fabric. And since this is something you'll likely want to do on a regular basis, it makes sense to wrap it all up in a Django management command.
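For reference, the one-liners themselves look roughly like this (the table name and file path are placeholders):

```sql
-- Postgres (server-side; the psql client also has a client-side \copy variant)
COPY courses_tmp FROM '/path/to/courses.csv' CSV HEADER;

-- MySQL
LOAD DATA INFILE '/path/to/courses.csv'
INTO TABLE courses_tmp
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
IGNORE 1 LINES;
```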
Here's an excerpt of some code I use to do something similar:
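A minimal sketch of a command along these lines (the table name `courses_tmp`, the database name, and the file paths are illustrative assumptions, not the original code):

```python
# Sketch of someapp/management/commands/import_courses.py.
# The table name (courses_tmp), database name, and paths are assumptions;
# Fabric 1.x's local() runs each command in a local shell.
import os


def psql_commands(db, sql_dir, data_dir):
    """Build the shell commands for the drop / recreate / import steps."""
    schema = os.path.join(sql_dir, "importer_courses.sql")
    csv_path = os.path.join(data_dir, "courses.csv")
    return [
        'psql {0} -c "DROP TABLE IF EXISTS courses_tmp"'.format(db),
        "psql {0} -f {1}".format(db, schema),
        'psql {0} -c "\\copy courses_tmp FROM \'{1}\' CSV HEADER"'.format(
            db, csv_path),
    ]


# Wired into a management command, roughly:
#
# from django.conf import settings
# from django.core.management.base import BaseCommand
# from fabric.api import local  # Fabric 1.x
#
# class Command(BaseCommand):
#     help = "Load courses.csv into a temporary Postgres table"
#
#     def handle(self, *args, **options):
#         for cmd in psql_commands("mydb", "sql", settings.IMPORTER_DATA_DIR):
#             local(cmd)
```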
This is located in `someapp/management/commands/import_courses.py`. With this in place I can run
./manage.py import_courses
at any time. On each run, it:
- Drops a temporary table
- Re-creates that temporary table according to a predefined Postgres schema (contained in `importer_courses.sql`)
- Imports the CSV into that temporary table
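For illustration, `importer_courses.sql` would hold a plain `CREATE TABLE` statement whose columns match the CSV's layout; something like this (the columns here are made up):

```sql
-- Hypothetical importer_courses.sql: columns must match the CSV layout
CREATE TABLE courses_tmp (
    code       varchar(16),
    title      text,
    instructor text,
    credits    integer
);
```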
In my case, I needed to be able to run queries across the CSV data *before* importing some of it into my real models, but this turned out to be a wonderfully efficient way of handling CSV data in general. You don't have to drop and recreate the temp table each time; I chose to do it that way for my use case, but season to taste.
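For example, once the data is sitting in the temp table you can filter or join it in SQL before it ever touches your real models (the table and column names here are hypothetical):

```sql
-- Copy only plausible-looking rows into the real Django table
INSERT INTO someapp_course (code, title)
SELECT code, title
FROM courses_tmp
WHERE code IS NOT NULL;
```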
To use this technique you'll need to `pip install fabric`, point a setting (`IMPORTER_DATA_DIR`) at the directory where your CSVs live, and read up a bit on Fabric and Django management commands. Then you'll need to modify it to handle multiple CSVs rather than just one.
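One way to generalize to multiple CSVs is to glob the data directory and derive one temp table per file. A sketch (the file-stem-plus-`_tmp` naming convention is an assumption):

```python
import glob
import os


def copy_commands(data_dir, db="mydb"):
    """One psql \\copy command per CSV; table name is the file stem + '_tmp'."""
    cmds = []
    for path in sorted(glob.glob(os.path.join(data_dir, "*.csv"))):
        table = os.path.splitext(os.path.basename(path))[0] + "_tmp"
        cmds.append(
            'psql {0} -c "\\copy {1} FROM \'{2}\' CSV HEADER"'.format(
                db, table, path))
    return cmds
```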