The second is to set draft (another field on "post") to True. I also
want to call a Celery task.
My first solution was to put something like this in the post model's save
method:

def save(self, *args, **kwargs):
    if not self.thumbnail:
        self.draft = True
        thumbnailtask.delay(self.pk)  # starts a celery task, not really relevant
    super(post, self).save(*args, **kwargs)
That works great when I manually save an object.
But when an object's thumbnail gets deleted, that save() isn't called.
There's no way to start a task, or run any kind of logic, when a
ForeignKey target gets deleted. ForeignKey.SET() doesn't seem to have
access to the post's PK, so I can't say something like
ForeignKey.SET(thumbnailtask_returns_null(pk)).
I think that ideally, ForeignKey.SET would simply call the post's save
function.
--
Ticket URL: <https://code.djangoproject.com/ticket/21640>
Django <https://code.djangoproject.com/>
The Web framework for perfectionists with deadlines.
* type: Uncategorized => Bug
--
Ticket URL: <https://code.djangoproject.com/ticket/21640#comment:1>
* status: new => closed
* resolution: => wontfix
* type: Bug => New feature
Comment:
I disagree that calling save() is inherently the right approach here.
There are several examples in Django where model methods are not invoked
when the efficient implementation doesn't route through individual
models. For example, when you call a bulk delete on a queryset, delete()
isn't invoked on the individual models. This is because the efficient
implementation of a bulk deletion is to issue "DELETE FROM table WHERE
[condition]", not to call delete() on each model in the queryset. If you
delete a queryset of 1000 items, invoking delete() on each of those 1000
items would be computationally expensive -- even if the delete() method
itself were a no-op (because function calls aren't free).
Similarly, a behaviour like on_delete=SET_NULL can be implemented at a
database level (i.e., the column definition automatically causes the
database to set the field to NULL), or it can be implemented with a bulk
update (UPDATE table SET field=NULL WHERE [delete condition]). Requiring
a save on every object related to a deleted object would have a similar
CPU overhead -- if you deleted a queryset with 1000 items, you'd be
invoking 1000 save() methods.
I understand your use case, but IMHO the unintended side effects of your
proposed feature outweigh the benefits.
You have a couple of options here as a workaround.
* You could tackle this at the database level with a stored procedure.
* You could write a "my_delete()" method (pick a better name, though!)
that does the thumbnail task generation as a pre-processing step before
issuing the delete. It's a little ugly.
* If you're only ever dealing with single object deletion, provide an
implementation of delete() that does the pre-processing step. This won't
help you if you run Post.objects.filter(…).delete() (because Post.delete()
won't be invoked), but if you're only ever deleting individual posts, it
will work fine.
* Write your own implementation of SET_NULL. Although it looks like a
constant, it's actually a function that defines what processing should be
performed on deleted objects. I haven't looked into this approach in
detail, but you may find it possible to hook in the pre-delete processing
as well. See
[https://github.com/django/django/blob/master/django/db/models/deletion.py
the source] for details.
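The last suggestion might be sketched roughly as follows. This is an assumption based on reading deletion.py, not documented API: on_delete callables appear to receive (collector, field, sub_objs, using), and the built-in SET_NULL simply calls collector.add_field_update(field, None, sub_objs). The factory name SET_NULL_WITH_HOOK and the pre_hook parameter are hypothetical, invented here for illustration:

```python
def SET_NULL_WITH_HOOK(pre_hook):
    """Build an on_delete handler that behaves like SET_NULL, but first
    runs pre_hook(obj) on each related object about to have its FK nulled."""
    def handler(collector, field, sub_objs, using):
        for obj in sub_objs:
            pre_hook(obj)  # e.g. queue the celery task with obj.pk
        # Same bulk field update the built-in SET_NULL performs
        collector.add_field_update(field, None, sub_objs)
    return handler
```

On the model, something like on_delete=SET_NULL_WITH_HOOK(lambda post: thumbnailtask.delay(post.pk)) might then do what the ticket asks for, though I haven't verified this against a real deletion cascade.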
--
Ticket URL: <https://code.djangoproject.com/ticket/21640#comment:2>
Comment (by anonymous):
Yeah, I was worried it would be something like that.
I hadn't considered rewriting SET_NULL, and it seems like I might be able
to get it to work.
It would be nice if there was some easy way to run real code when a
ForeignKey target gets deleted, but for my purposes rewriting SET_NULL
should do.
Thanks.
--
Ticket URL: <https://code.djangoproject.com/ticket/21640#comment:3>