In my particular use case, I was crashing when executing a South data
migration which ingested large datasets into my PostGIS database.
There is a monkey patch available on SO:
http://stackoverflow.com/a/7769117/1545769
After including this monkey patch in each of my data migration files, the
migrations completed successfully.
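For context, the underlying problem is that the debug cursor appends every executed query to an in-memory list for the lifetime of the connection. The linked answer's exact code may differ; this is a self-contained illustration of the pattern (hypothetical stand-ins, not Django's actual classes), namely clearing that list periodically during a long-running loop:

```python
# Illustration only: mimics how a debug cursor accumulates queries in a
# per-connection list, and how periodically clearing that list keeps
# memory bounded during a long data-migration loop.

class FakeConnection:
    """Hypothetical stand-in for a DB connection with a debug cursor."""
    def __init__(self):
        self.queries = []  # grows without bound while DEBUG is on

    def execute(self, sql):
        self.queries.append({"sql": sql, "time": "0.000"})

def reset_queries(connection):
    """Analogue of django.db.reset_queries(): drop the accumulated log."""
    connection.queries = []

def ingest(rows, connection, reset_every=1000):
    for i, row in enumerate(rows, start=1):
        connection.execute("INSERT INTO t VALUES (%d)" % row)
        if i % reset_every == 0:
            reset_queries(connection)  # keeps the buffer from exhausting RAM

conn = FakeConnection()
ingest(range(10_000), conn, reset_every=1000)
# The log never holds more than reset_every entries at a time.
assert len(conn.queries) <= 1000
```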
Here is why I am filing this ticket:
1) There isn't a very helpful message when the process is terminated; you
have to dig through the PostgreSQL logs or /var/log/syslog to discover
that low memory was the culprit.
2) There is no setting in settings.py to disable the CursorDebugWrapper
(e.g. a settings.DISABLE_DEBUG_CURSOR boolean).
3) I wonder whether Django could trap out-of-memory errors and present an
informative Python exception, and/or...
4) ...smartly manage the connection.queries buffer to prevent it from
using too much RAM in the first place?
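Point 4 could be as simple as capping the query log. A minimal sketch using a bounded deque (plain Python, no Django imports; the class and attribute names are only illustrative):

```python
from collections import deque

class BoundedQueryLog:
    """Keeps only the most recent `maxlen` queries instead of all of them."""
    def __init__(self, maxlen=1000):
        # deque with maxlen evicts the oldest entry automatically on append
        self.queries = deque(maxlen=maxlen)

    def log(self, sql, time):
        self.queries.append({"sql": sql, "time": time})

log = BoundedQueryLog(maxlen=100)
for n in range(100_000):
    log.log("INSERT INTO t VALUES (%d)" % n, "0.000")

assert len(log.queries) == 100  # memory use is O(maxlen), not O(rows)
```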
No. 2 would be an easy-pickings fix (here:
https://github.com/django/django/blob/master/django/db/backends/__init__.py),
but it wouldn't solve No. 1, and developers still might not realize that
RAM is the issue. So I think No. 3/No. 4 should be considered.
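The "easy pickings" shape of No. 2 is just consulting a flag before choosing the debug wrapper. A sketch of what such a change could look like (plain Python stand-ins for the wrapper classes; DISABLE_DEBUG_CURSOR is the hypothetical setting proposed above, not a real Django setting):

```python
class CursorWrapper:
    """Stand-in for the plain cursor wrapper: executes without logging."""
    def __init__(self, connection):
        self.connection = connection

class CursorDebugWrapper(CursorWrapper):
    """Stand-in for the debug wrapper: would log every query to RAM."""

class Settings:
    DEBUG = True
    DISABLE_DEBUG_CURSOR = True  # the proposed opt-out flag

def make_cursor(connection, settings):
    # Use the debug wrapper only when DEBUG is on AND the opt-out
    # flag is absent or false.
    if settings.DEBUG and not getattr(settings, "DISABLE_DEBUG_CURSOR", False):
        return CursorDebugWrapper(connection)
    return CursorWrapper(connection)

cursor = make_cursor(object(), Settings())
assert type(cursor) is CursorWrapper  # debug logging suppressed
```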
--
Ticket URL: <https://code.djangoproject.com/ticket/21245>
Django <https://code.djangoproject.com/>
The Web framework for perfectionists with deadlines.
* status: new => closed
* needs_docs: => 0
* resolution: => duplicate
* needs_tests: => 0
* needs_better_patch: => 0
Comment:
Duplicate of #12581
--
Ticket URL: <https://code.djangoproject.com/ticket/21245#comment:1>