The error stack from the worker's log:
[2016-01-07 11:32:18,225] {jobs.py:455} INFO - Getting list of tasks to skip for active runs.
[2016-01-07 11:32:18,227] {jobs.py:470} INFO - Checking dependencies on 18 tasks instances, minus 0 skippable ones
[2016-01-07 11:32:18,371] {jobs.py:633} INFO - Done queuing tasks, calling the executor's heartbeat
[2016-01-07 11:32:18,371] {jobs.py:636} INFO - Loop took: 0.364169 seconds
[2016-01-07 11:32:18,373] {models.py:222} INFO - Finding 'running' jobs without a recent heartbeat
[2016-01-07 11:32:18,373] {models.py:228} INFO - Failing jobs without heartbeat after 2016-01-07 11:30:03.373523
[2016-01-07 11:32:23,011] {jobs.py:507} INFO - Prioritizing 0 queued jobs
[2016-01-07 11:32:23,035] {jobs.py:632} ERROR - (pickle.PicklingError) Can't pickle <class 'celery.utils.log.ProcessAwareLogger'>: it's not found as celery.utils.log.ProcessAwareLogger [SQL: u'INSERT INTO dag_pickle (pickle, created_dttm, pickle_hash) VALUES (%s, now(), %s)'] [parameters: [{'pickle_hash': -811270095097316367, 'pickle': <DAG: jordy>}]]
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/airflow/jobs.py", line 629, in _execute
    self.process_dag(dag, executor)
  File "/usr/local/lib/python2.7/dist-packages/airflow/jobs.py", line 436, in process_dag
    pickle_id = dag.pickle(session).id
  File "/usr/local/lib/python2.7/dist-packages/airflow/models.py", line 2389, in pickle
    session.commit()
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/session.py", line 813, in commit
    self.transaction.commit()
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/session.py", line 392, in commit
    self._prepare_impl()
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/session.py", line 372, in _prepare_impl
    self.session.flush()
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/session.py", line 2027, in flush
    self._flush(objects)
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/session.py", line 2145, in _flush
    transaction.rollback(_capture_exception=True)
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/util/langhelpers.py", line 60, in __exit__
    compat.reraise(exc_type, exc_value, exc_tb)
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/session.py", line 2109, in _flush
    flush_context.execute()
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/unitofwork.py", line 373, in execute
    rec.execute(self)
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/unitofwork.py", line 532, in execute
    uow
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/persistence.py", line 174, in save_obj
    mapper, table, insert)
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/persistence.py", line 800, in _emit_insert_statements
    execute(statement, params)
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 914, in execute
    return meth(self, multiparams, params)
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/sql/elements.py", line 323, in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1010, in _execute_clauseelement
    compiled_sql, distilled_params
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1078, in _execute_context
    None, None)
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1341, in _handle_dbapi_exception
    exc_info
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/util/compat.py", line 200, in raise_from_cause
    reraise(type(exception), exception, tb=exc_tb)
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1073, in _execute_context
    context = constructor(dialect, self, conn, *args)
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/default.py", line 582, in _init_compiled
    param.append(processors[key](compiled_params[key]))
  File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/sql/sqltypes.py", line 1241, in process
    value = dumps(value, protocol)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 192, in dumps
    dump(obj, file, protocol, byref, fmode, recurse)#, strictio)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 186, in dump
    pik.dump(obj)
  File "/usr/lib/python2.7/pickle.py", line 224, in dump
    self.save(obj)
  File "/usr/lib/python2.7/pickle.py", line 331, in save
    self.save_reduce(obj=obj, *rv)
  File "/usr/lib/python2.7/pickle.py", line 419, in save_reduce
    save(state)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 748, in save_module_dict
    StockPickler.save_dict(pickler, obj)
  File "/usr/lib/python2.7/pickle.py", line 649, in save_dict
    self._batch_setitems(obj.iteritems())
  File "/usr/lib/python2.7/pickle.py", line 681, in _batch_setitems
    save(v)
  File "/usr/lib/python2.7/pickle.py", line 331, in save
    self.save_reduce(obj=obj, *rv)
  File "/usr/lib/python2.7/pickle.py", line 396, in save_reduce
    save(cls)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 1102, in save_type
    StockPickler.save_global(pickler, obj)
  File "/usr/lib/python2.7/pickle.py", line 748, in save_global
    (obj, module, name))
StatementError: (pickle.PicklingError) Can't pickle <class 'celery.utils.log.ProcessAwareLogger'>: it's not found as celery.utils.log.ProcessAwareLogger [SQL: u'INSERT INTO dag_pickle (pickle, created_dttm, pickle_hash) VALUES (%s, now(), %s)'] [parameters: [{'pickle_hash': -811270095097316367, 'pickle': <DAG: jordy>}]]
[2016-01-07 11:32:23,043] {jobs.py:653} ERROR - This Session's transaction has been rolled back due to a previous exception during flush. To begin a new transaction with this Session, first issue Session.rollback(). Original exception was: (pickle.PicklingError) Can't pickle <class 'celery.utils.log.ProcessAwareLogger'>: it's not found as celery.utils.log.ProcessAwareLogger [SQL: u'INSERT INTO dag_pickle (pickle, created_dttm, pickle_hash) VALUES (%s, now(), %s)'] [parameters: [{'pickle_hash': -811270095097316367, 'pickle': <DAG: jordy>}]]
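
Reading the trace: the scheduler's process_dag() calls dag.pickle(session) (jobs.py:436 → models.py:2389), which serializes the whole DAG with dill for the INSERT into the dag_pickle table. The dump fails because something reachable from the DAG's state references celery.utils.log.ProcessAwareLogger, and pickle reports "it's not found as celery.utils.log.ProcessAwareLogger", i.e. the class is not importable under that name (celery appears to define it at runtime). A quick way to narrow down which DAG attribute drags the logger in is to dill-pickle each attribute on its own. The following is only a diagnostic sketch, not Airflow's own code path; the DagBag lookup and the DAG id 'jordy' are taken from the log above, everything else is an assumption:

    # Hypothetical diagnostic, run in a Python shell on the scheduler host.
    # Tries to dill-pickle each attribute of the DAG separately so the
    # unpicklable one can be identified.
    import dill
    from airflow.models import DagBag

    dag = DagBag().get_dag('jordy')  # DAG id taken from the error above

    for name, value in vars(dag).items():
        try:
            dill.dumps(value)  # same serializer the dag_pickle column uses
        except Exception as exc:  # e.g. pickle.PicklingError
            print('unpicklable attribute %r: %s' % (name, exc))

Whatever turns up is typically a logger, module, or similar runtime object captured in the DAG definition (for example in default_args or a callback); removing that reference from the DAG file should let the dag_pickle INSERT succeed.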