On 19 Jul 2012, at 04:41, Amit Saha wrote:
>
> Found a solution for this. So that it is useful, I shall be a little
> descriptive.
>
> Problem: You have an existing Python application with standard logging
> to a file. You want to keep it as it is when using celery instead of
> allowing celery to "hijack" it to its task log.
>
> Solution: Using celery.signals.worker_process_init (from
> http://stackoverflow.com/a/6193082/59634)
>
> Basically, set this up in the top-level package and use away!
>
> I did not need to do anything else.
>
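
For reference, that recipe amounts to roughly the following (a sketch;
'app.log' and the 'myapp' logger name are placeholders for whatever
your application already configures):

import logging
from celery.signals import worker_process_init

def setup_app_logging(**kwargs):
    # Hypothetical example: reattach the file handler the
    # application normally uses, so records keep going there.
    handler = logging.FileHandler('app.log')
    handler.setFormatter(logging.Formatter(logging.BASIC_FORMAT))
    logging.getLogger('myapp').addHandler(handler)

worker_process_init.connect(setup_app_logging)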
Note that the answer is slightly outdated for 3.0, where you're
discouraged from using task.get_logger.
The best practice now is to add a module-level logger:
from celery import task
from celery.utils.log import get_task_logger

logger = get_task_logger(__name__)

@task
def add(x, y):
    logger.debug('add: %r %r', x, y)
    return x + y
In 3.0 all task loggers inherit from a common logger called 'celery.task',
which you can import as 'task_logger' from 'celery.utils.log',
so if you want to augment the configuration you can do:
import logging

from celery.utils.log import task_logger  # the 'celery.task' logger
from celery.signals import after_setup_task_logger

@after_setup_task_logger.connect
def augment_task_logging(**kwargs):
    logger = logging.getLogger('myapp.tasks')
    if not logger.handlers:
        # assuming a stream handler here; use whatever handler
        # fits your setup.
        handler = logging.StreamHandler()
        formatter = logging.Formatter(logging.BASIC_FORMAT)
        handler.setFormatter(formatter)
        logger.addHandler(handler)
        logger.propagate = 0
(propagate is set to 0 there because some versions seem to have
problems if it's set to True or False).
Also, the worker_process_init signal is only sent for each pool child process
when using the multiprocessing pool, so the logging won't be set up in the
main process (which would be a problem when using other pools, like eventlet).
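If you need this to apply regardless of pool, one alternative (a sketch)
is to connect to the setup_logging signal: when that signal has a
receiver, Celery won't configure logging at all, so your own
configuration is left untouched:

import logging.config

from celery.signals import setup_logging

@setup_logging.connect
def configure_logging(**kwargs):
    # Celery skips its own logging setup when setup_logging has a
    # receiver, so the configuration loaded here stays in effect.
    # 'logging.conf' is a hypothetical example config file.
    logging.config.fileConfig('logging.conf')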
--
Ask Solem
twitter.com/asksol | +44 (0)7713357179