Trying to run jobs in the background using multiprocessing


Pau Creixell

Nov 27, 2013, 2:58:13 PM
to django...@googlegroups.com
Hi there!

I am trying to run jobs in the background using the following (minimal) code:

Under views.py:

def submit(request):
   model = MyModel()
   model.RunInAThread(ReferenceSeparator, MutationSeparator)

Under models.py:

class MyModel(models.Model):
   def RunInAThread(self, ReferenceSeparator, MutationSeparator):
           print 1
           Thread = multiprocessing.Process(target = self.Run, args=(self, ReferenceSeparator, MutationSeparator))
           print 2
           Thread.daemon = True
           print 3
           Thread.start()
           print 4

   def Run(self, ReferenceSeparator, MutationSeparator):
           print 5


But I get the error below, and it only prints 1, 2, 3 and 4, not 5:

TypeError at /submit
'str' object is not callable
...
Exception Location: /sw/lib/python2.6/multiprocessing/forking.py in __init__, line 98
...

I have tried many different variations (e.g. Daemon = False, leaving "self" out of "args"), but I don't really understand why this would not work, unless Django doesn't permit multiprocessing (which some people online seem to suggest).
Any help would be highly appreciated.

Best wishes,
Pau

Timothy W. Cook

Nov 27, 2013, 3:11:20 PM
to django...@googlegroups.com
When I asked about this, the consensus seemed to be that the best solution is to use
Celery in combination with Django. There seems to be quite a bit of
good experience on this list with using them together.

HTH,
Tim



--
MLHIM VIP Signup: http://goo.gl/22B0U
============================================
Timothy Cook, MSc +55 21 94711995
MLHIM http://www.mlhim.org
Like Us on FB: https://www.facebook.com/mlhim2
Circle us on G+: http://goo.gl/44EV5
Google Scholar: http://goo.gl/MMZ1o
LinkedIn Profile: http://www.linkedin.com/in/timothywaynecook

Avraham Serour

Nov 27, 2013, 7:29:18 PM
to django...@googlegroups.com
Maybe your web server won't permit multiprocessing; it depends on how you are running Django.

In any case I would also recommend Celery. With Celery you have control over how many workers are running; having each web request open yet another process means you have no control over how many processes are running, which can easily take your server down.

With Celery you just add a task to the queue, and you can run the workers on a different server if you would like.
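Something like this minimal sketch (the module names, broker URL and task name are made up for illustration, not taken from your project; it assumes Celery 3.x and a Redis broker):

# tasks.py -- define the Celery app and a task (illustrative names)
from celery import Celery

app = Celery('myproject', broker='redis://localhost:6379/0')

@app.task
def run_job(reference_separator, mutation_separator):
    # this runs inside a Celery worker process, not inside the web request
    print 'running job', reference_separator, mutation_separator

# views.py -- queue the task and return immediately
from django.http import HttpResponse
from tasks import run_job

def submit(request):
    # ReferenceSeparator / MutationSeparator come from wherever your current view gets them
    run_job.delay(ReferenceSeparator, MutationSeparator)
    return HttpResponse('job queued')

You start the workers separately (e.g. "celery -A tasks worker"), so the web process never blocks on the job.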

Success,
Avraham


Doug Blank

Nov 27, 2013, 10:01:25 PM
to django-users
On Wed, Nov 27, 2013 at 2:58 PM, Pau Creixell <paucr...@gmail.com> wrote:
> Hi there!
>
> I am trying to run jobs in the background using the following (minimal)
> code:
>
> Under views.py:
>
> def submit(request):
>    model = MyModel()
>    model.RunInAThread(ReferenceSeparator, MutationSeparator)
>
> Under models.py:
>
> class MyModel(models.Model):
>    def RunInAThread(self, ReferenceSeparator, MutationSeparator):
>            print 1
>            Thread = multiprocessing.Process(target = self.Run, args=(self, ReferenceSeparator, MutationSeparator))

I suspect that if you are setting the target to be self.Run, then the
args should not include self:

Thread = multiprocessing.Process(target = self.Run, args=(ReferenceSeparator, MutationSeparator))
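For reference, the whole method with that change would look something like this (just a sketch; it assumes "import multiprocessing" at the top of models.py):

def RunInAThread(self, ReferenceSeparator, MutationSeparator):
    # self.Run is a bound method, so "self" travels with it automatically;
    # only the remaining arguments belong in args.
    Thread = multiprocessing.Process(target=self.Run,
                                     args=(ReferenceSeparator, MutationSeparator))
    Thread.daemon = True
    Thread.start()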

-Doug

>            print 2
>            Thread.daemon = True
>            print 3
>            Thread.start()
>            print 4
>
>    def Run(self, ReferenceSeparator, MutationSeparator):
>            print 5
>
>
> But I get the error below, and it only prints 1, 2, 3 and 4, not 5:
>
> TypeError at /submit
> 'str' object is not callable
> ...
> Exception Location: /sw/lib/python2.6/multiprocessing/forking.py in
> __init__, line 98
> ...
>
> I have tried many different variations (e.g. Daemon = False, leaving "self"
> out of "args"), but I don't really understand why this would not work, unless
> Django doesn't permit multiprocessing (which some people online seem to
> suggest).
> Any help would be highly appreciated.
>
> Best wishes,
> Pau
>

Pau Creixell

Nov 28, 2013, 2:42:47 AM
to django...@googlegroups.com
Hi again,

Thanks for all your answers.
Yes, I also found that many people recommend Celery, but I was hoping for a "simpler" solution, if there is one...
Doug, I already tried it without self in the arguments, but unfortunately got the same error.

I guess if there isn't another simpler way, I'll have to dig into Celery.

Cheers,
Pau

Javier Guerra Giraldez

Nov 28, 2013, 3:32:15 AM
to django...@googlegroups.com
On Thu, Nov 28, 2013 at 2:42 AM, Pau Creixell <paucr...@gmail.com> wrote:
> I guess if there isn't another simpler way, I'll have to dig into Celery.


there _are_ simpler alternatives to Celery:

- worker threads: similar to your proposal, but needs very carefully
thought-out process control. pros: no extra process. cons: very hard to
get even half right, very easy to kill your web workers under even light
load. probably unsupported by hosting providers.

- ghetto queues: a DB table + a cron process (a minimal sketch follows
this list). pros: simple, easy to debug. cons: has a hard ceiling; if the
queue grows beyond some (hard to estimate) limit, it grinds to a halt.

- uWSGI spool: extra uWSGI processes that work through a queue of tasks.
pros: easy to use, you get process management from uWSGI, no extra
tools. cons: limited semantics, might not be supported by hosts.

- simple brokers: for example pub/sub or other structures in Redis (a
sketch using a Redis list is further below). pros: easier to use and
understand than Celery, might be lighter on memory. cons: no
standardization, needs a broker process (at least if you're not already
running Redis), might not be supported by hosting providers.

as you can see, in most cases the only reasonable answers are Celery
or ghetto queues. the latter are an easy way out and, if done well,
can serve a surprisingly high load; but if you strike gold, they will
fall over. this scheme was a big part of Twitter's growing pains.
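
and for the simple-broker option, the most basic structure is a Redis list used as a queue rather than pub/sub; a sketch (the queue name and payload are arbitrary, and it assumes the redis-py package and a local Redis server):

# worker.py -- a long-running consumer started outside the web server
import json
import redis

r = redis.Redis()               # assumes Redis on localhost:6379

while True:
    _, raw = r.brpop('jobs')    # blocks until a job arrives
    job = json.loads(raw)
    print 'processing', job

# the view only pushes a job description onto the list:
#   redis.Redis().lpush('jobs', json.dumps({'ref_sep': '/', 'mut_sep': ':'}))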


--
Javier

Pau Creixell

Nov 28, 2013, 6:03:06 AM
to django...@googlegroups.com
Cool, thanks Javier.
I will look into the different solutions you suggest.
Cheers,
Pau