Furthermore, the DAG as a whole doesn't seem to work. The child DAG can't run, let alone complete, and failed tasks log errors like this:
[2016-03-15 04:23:56,021] {models.py:974} INFO - Executing <Task(SubDagOperator): child_dag> on 2016-03-13 00:00:00
[2016-03-15 04:23:56,032] {models.py:1041} ERROR - [Errno 35] Resource temporarily unavailable
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/airflow/models.py", line 1000, in run
    result = task_copy.execute(context=context)
  File "/usr/local/lib/python2.7/site-packages/airflow/operators/subdag_operator.py", line 45, in execute
    executor=self.executor)
  File "/usr/local/lib/python2.7/site-packages/airflow/models.py", line 2472, in run
    job.run()
  File "/usr/local/lib/python2.7/site-packages/airflow/jobs.py", line 165, in run
    self._execute()
  File "/usr/local/lib/python2.7/site-packages/airflow/jobs.py", line 711, in _execute
    executor.start()
  File "/usr/local/lib/python2.7/site-packages/airflow/executors/local_executor.py", line 59, in start
    w.start()
  File "/usr/local/Cellar/python/2.7.11/Frameworks/Python.framework/Versions/2.7/lib/python2.7/multiprocessing/process.py", line 130, in start
    self._popen = Popen(self)
  File "/usr/local/Cellar/python/2.7.11/Frameworks/Python.framework/Versions/2.7/lib/python2.7/multiprocessing/forking.py", line 121, in __init__
    self.pid = os.fork()
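For what it's worth, my understanding is that [Errno 35] (EAGAIN) from os.fork() above means the OS refused to create another process, which on macOS is usually the per-user process limit. A quick diagnostic sketch (just inspection, not a fix):

```shell
# os.fork() raising "[Errno 35] Resource temporarily unavailable" (EAGAIN)
# usually means the per-user process limit was reached.
ulimit -u   # show the current max user processes; it can be raised per shell, e.g. `ulimit -u 2048`
```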
On a later run (2016-03-16) the backfill hangs: the task is kicked off but the backfill keeps waiting on it, and the scheduler eventually fails the job for missing heartbeats:

[2016-03-16 13:48:44,507] {models.py:974} INFO - Executing <Task(SubDagOperator): child_dag> on 2016-03-01 00:00:00
[2016-03-16 13:48:44,568] {base_executor.py:34} INFO - Adding to queue: airflow run parent_dag.child_dag parent_dag.child_dag.print_task 2016-03-01T00:00:00 --local -s 2016-03-01T00:00:00
[2016-03-16 13:48:49,521] {jobs.py:802} INFO - [backfill progress] waiting: 1 | succeeded: 0 | kicked_off: 1 | failed: 0 | wont_run: 0
[2016-03-16 13:48:49,524] {local_executor.py:29} INFO - LocalWorker running airflow run parent_dag.child_dag parent_dag.child_dag.print_task 2016-03-01T00:00:00 --local -s 2016-03-01T00:00:00
[2016-03-16 13:48:54,527] {jobs.py:788} ERROR - The airflow run command failed at reporting an error. This should not occur in normal circumstances. Task state is 'running', reported state is 'success'. TI is <TaskInstance: parent_dag.child_dag.parent_dag.child_dag.print_task 2016-03-01 00:00:00 [running]>
[2016-03-16 13:48:54,527] {jobs.py:802} INFO - [backfill progress] waiting: 1 | succeeded: 0 | kicked_off: 1 | failed: 0 | wont_run: 0
[2016-03-16 14:41:35,878] {models.py:222} INFO - Finding 'running' jobs without a recent heartbeat
[2016-03-16 14:41:35,879] {models.py:228} INFO - Failing jobs without heartbeat after 2016-03-16 14:39:20.879235
[2016-03-16 14:41:37,950] {__init__.py:36} INFO - Using executor LocalExecutor
[2016-03-16 14:41:37,953] {__init__.py:36} INFO - Using executor LocalExecutor

The run command itself dies with this traceback (it appears twice in the log):

Traceback (most recent call last):
  File "/usr/local/bin/airflow", line 15, in <module>
    args.func(args)
  File "/usr/local/lib/python2.7/site-packages/airflow/bin/cli.py", line 134, in run
    raise AirflowException(msg)
airflow.utils.AirflowException: DAG [parent_dag.child_dag] could not be found in /Users/thinderson/airflow/dags

I've tried both ways of importing SubDagOperator:

from airflow.operators.subdag_operator import SubDagOperator
from airflow.operators import SubDagOperator

Running the task under the SequentialExecutor fails the same way:

Logging into: /Users/thinderson/airflow/logs/parent_dag/child_dag/2016-03-11T00:00:00
[2016-03-16 15:49:13,725] {__init__.py:36} INFO - Using executor SequentialExecutor
[2016-03-16 15:49:24,579] {__init__.py:36} INFO - Using executor SequentialExecutor
Traceback (most recent call last):
  File "/usr/local/bin/airflow", line 4, in <module>
    __import__('pkg_resources').run_script('airflow==1.6.2', 'airflow')
  File "/usr/local/lib/python2.7/site-packages/pkg_resources/__init__.py", line 726, in run_script
    self.require(requires)[0].run_script(script_name, ns)
  File "/usr/local/lib/python2.7/site-packages/pkg_resources/__init__.py", line 1484, in run_script
    exec(code, namespace, namespace)
  File "/usr/local/lib/python2.7/site-packages/airflow-1.6.2-py2.7.egg/EGG-INFO/scripts/airflow", line 15, in <module>
    args.func(args)
  File "/usr/local/lib/python2.7/site-packages/airflow-1.6.2-py2.7.egg/airflow/bin/cli.py", line 167, in run
    raise AirflowException(msg)
airflow.utils.AirflowException: DAG [parent_dag.child_dag] could not be found in /Users/thinderson/airflow/dags
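In case it's relevant: my understanding of Airflow 1.x is that a SubDagOperator's child DAG must have a dag_id of the form <parent_dag_id>.<task_id>, which matches the parent_dag.child_dag id in the errors above. A minimal sketch of that naming convention (plain Python; the helper function is my own illustration, not an Airflow API):

```python
# Airflow 1.x SubDagOperator convention: the child DAG's dag_id must be
# "<parent_dag_id>.<subdag_task_id>", otherwise the worker reports
# "DAG [...] could not be found". This helper only builds that id string;
# it is an illustration of the convention, not part of Airflow itself.

def subdag_dag_id(parent_dag_id, subdag_task_id):
    """Return the dag_id a SubDagOperator expects its child DAG to use."""
    return "{0}.{1}".format(parent_dag_id, subdag_task_id)

# For a parent "parent_dag" with a SubDagOperator task "child_dag":
print(subdag_dag_id("parent_dag", "child_dag"))  # parent_dag.child_dag
```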