Please see this enhancement request:
Unlike what Russ has suggested, I'm pretty sure that a single UPDATE query carrying a large number (thousands or millions) of per-row changes will be significantly faster than issuing a separate SQL UPDATE query for each row. If more people on the list feel this is not going to be the case, I will happily run a test against PostgreSQL and confirm the results either way.
Assuming, however, that the performance benefit is significant, should we look at contributing a patch? If so, what should the API look like?
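To make the idea concrete, one way a single-query bulk update can be expressed in PostgreSQL is an UPDATE joined against a VALUES list. The helper below is purely illustrative (the function name and the table/column names are my own, not an existing Django API); it only builds the SQL and parameter list, which you would then execute once through a cursor:

```python
def build_bulk_update_sql(table, pk_col, value_col, rows):
    """Build one PostgreSQL UPDATE applying a different value to each row.

    rows is a list of (pk, value) pairs; returns (sql, params) suitable
    for cursor.execute(sql, params). Depending on the column types,
    PostgreSQL may need explicit casts on the VALUES list (e.g.
    ``v.val::timestamp``) -- omitted here for brevity.
    """
    values_clause = ", ".join(["(%s, %s)"] * len(rows))
    sql = (
        f"UPDATE {table} AS t SET {value_col} = v.val "
        f"FROM (VALUES {values_clause}) AS v (pk, val) "
        f"WHERE t.{pk_col} = v.pk"
    )
    # Flatten the (pk, value) pairs into a single parameter list.
    params = [p for pair in rows for p in pair]
    return sql, params

# Example: two rows, each getting its own timestamp value.
sql, params = build_bulk_update_sql(
    "requests", "id", "updated_at",
    [(1, "2014-01-01 10:00"), (2, "2014-01-01 10:05")],
)
# The resulting statement hits the table once for all rows.
```

Whether this beats N separate UPDATEs by a wide margin is exactly the question above; the round-trip and per-statement overhead it avoids grows with the number of rows.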
Use case:
For each row in a table we send a request to a server. The server sends back individual updates informing us of the status of each request, and each update corresponds to a row in the table. We want to store the datetime of each update but do not wish to hit the database every time (we have seen a performance impact, since the table is huge). We use memcache to batch the updates and issue a single Django ORM .update() call, which works well but sets all rows to a common datetime. Ideally, we want to update each row with its own datetime of receipt of the request.
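A sketch of how that batching could flush with per-row datetimes, using a single UPDATE with a CASE expression instead of one common value. The `pending` dict, function names, and table/column names (`requests`, `id`, `updated_at`) are illustrative assumptions, not part of any existing API:

```python
pending = {}  # pk -> datetime received from the server, accumulated in memory


def record_update(pk, received_at):
    """Buffer one server update instead of hitting the database."""
    pending[pk] = received_at


def build_flush_sql(table="requests", pk_col="id", col="updated_at"):
    """Return (sql, params) for one UPDATE covering all pending rows.

    Each row gets its own datetime via CASE, so a single statement
    replaces thousands of individual UPDATEs.
    """
    whens = " ".join(["WHEN %s THEN %s"] * len(pending))
    in_clause = ", ".join(["%s"] * len(pending))
    sql = (
        f"UPDATE {table} SET {col} = CASE {pk_col} {whens} END "
        f"WHERE {pk_col} IN ({in_clause})"
    )
    # CASE parameters (pk, datetime pairs), then the IN-list of pks.
    params = [x for pk, ts in pending.items() for x in (pk, ts)]
    params += list(pending)
    return sql, params
```

The resulting statement could be run with `cursor.execute(sql, params)` (clearing `pending` afterwards); as with any CASE-based approach, very large batches may be better split into chunks.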
Also, if any Django/PostgreSQL experts can suggest a way to run a large number of updates concurrently on a table, or a better design for this, I would like to hear suggestions, though I can move that discussion over to the django-users mailing list.