Sorry one more, using quickstart "OSError: [Errno 2] No such file or directory" when running [fxudply@auaeuap070wbcr2 bin]$ ./airflow webserver -p 100


Garry van Diggele

Oct 18, 2015, 10:08:24 PM
to Airflow
Hi

As mentioned in a previous post, we had to build everything from scratch, including python. We believe we have it all built correctly, but have tried to switch to the out-of-the-box sequential executor and db just to get up and running.


[fxudply@auaeuap070wbcr2 bin]$ ./airflow -h
usage: airflow [-h]
               {backfill,clear,run,test,task_state,webserver,scheduler,initdb,resetdb,upgradedb,list_dags,list_tasks,worker,serve_logs,flower,version,kerberos}
               ...

positional arguments:
  {backfill,clear,run,test,task_state,webserver,scheduler,initdb,resetdb,upgradedb,list_dags,list_tasks,worker,serve_logs,flower,version,kerberos}
                        sub-command help
    backfill            Run subsections of a DAG for a specified date range
    clear               Clear a set of task instance, as if they never ran
    run                 Run a single task instance
    test                Test a task instance. This will run a task without
                        checking for dependencies or recording it's state in
                        the database.
    task_state          Get the status of a task instance.
    webserver           Start a Airflow webserver instance
    scheduler           Start a scheduler scheduler instance
    initdb              Initialize the metadata database
    resetdb             Burn down and rebuild the metadata database
    upgradedb           Upgrade metadata database to latest version
    list_dags           List the DAGs
    list_tasks          List the tasks within a DAG
    worker              Start a Celery worker node
    serve_logs          Serve logs generate by worker
    flower              Start a Celery Flower
    version             Show version
    kerberos            Start a kerberos ticket renewer

optional arguments:
  -h, --help            show this help message and exit
[fxudply@auaeuap070wbcr2 bin]$ ./airflow version
  ____________       _____________
 ____    |__( )_________  __/__  /________      __
____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
 _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
   v1.5.1


We successfully initialise the DB without error. When we try to start the airflow server we get:

[fxudply@auaeuap070wbcr2 bin]$ ./airflow webserver -p 10002
  ____________       _____________
 ____    |__( )_________  __/__  /________      __
____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
 _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/

2015-10-19 13:06:46,003 - root - INFO - Filling up the DagBag from /home/fxudply/fxudply/airflow/dags
2015-10-19 13:06:46,005 - root - INFO - Importing /opt/fxudply/python/current/lib/python2.7/site-packages/airflow-1.5.1-py2.7.egg/airflow/example_dags/example_bash_operator.py
2015-10-19 13:06:46,023 - root - INFO - Loaded DAG <DAG: example_bash_operator>
2015-10-19 13:06:46,032 - root - INFO - Importing /opt/fxudply/python/current/lib/python2.7/site-packages/airflow-1.5.1-py2.7.egg/airflow/example_dags/example_branch_operator.py
2015-10-19 13:06:46,042 - root - INFO - Loaded DAG <DAG: example_branch_operator>
2015-10-19 13:06:46,051 - root - INFO - Importing /opt/fxudply/python/current/lib/python2.7/site-packages/airflow-1.5.1-py2.7.egg/airflow/example_dags/example_http_operator.py
2015-10-19 13:06:46,053 - root - INFO - Loaded DAG <DAG: example_http_operator>
2015-10-19 13:06:46,058 - root - INFO - Importing /opt/fxudply/python/current/lib/python2.7/site-packages/airflow-1.5.1-py2.7.egg/airflow/example_dags/example_python_operator.py
2015-10-19 13:06:46,065 - root - INFO - Loaded DAG <DAG: example_python_operator>
2015-10-19 13:06:46,074 - root - INFO - Importing /opt/fxudply/python/current/lib/python2.7/site-packages/airflow-1.5.1-py2.7.egg/airflow/example_dags/example_xcom.py
2015-10-19 13:06:46,084 - root - INFO - Loaded DAG <DAG: example_xcom>
2015-10-19 13:06:46,091 - root - INFO - Importing /opt/fxudply/python/current/lib/python2.7/site-packages/airflow-1.5.1-py2.7.egg/airflow/example_dags/tutorial.py
2015-10-19 13:06:46,099 - root - INFO - Loaded DAG <DAG: tutorial>
Running the Gunicorn server with 4 workers on host 10.5.75.231 and port 10002...
Traceback (most recent call last):
  File "./airflow", line 4, in <module>
    __import__('pkg_resources').run_script('airflow==1.5.1', 'airflow')
  File "build/bdist.linux-x86_64/egg/pkg_resources/__init__.py", line 735, in run_script
  File "build/bdist.linux-x86_64/egg/pkg_resources/__init__.py", line 1652, in run_script
  File "/opt/sw/calypso/software/fxudply/python/Python-2.7.10/lib/python2.7/site-packages/airflow-1.5.1-py2.7.egg/EGG-INFO/scripts/airflow", line 17, in <module>
    args.func(args)
  File "/opt/fxudply/python/current/lib/python2.7/site-packages/airflow-1.5.1-py2.7.egg/airflow/bin/cli.py", line 265, in webserver
    args.hostname + ':' + str(args.port), 'airflow.www.app:app'])
  File "/opt/fxudply/python/current/lib/python2.7/subprocess.py", line 710, in __init__
    errread, errwrite)
  File "/opt/fxudply/python/current/lib/python2.7/subprocess.py", line 1335, in _execute_child
    raise child_exception
OSError: [Errno 2] No such file or directory

Any ideas?
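
The OSError here is raised from the subprocess call in cli.py's webserver handler, which execs a separate gunicorn binary rather than importing it. A rough reconstruction of that call from the traceback above (only the bind target and the app path are visible in the trace; the worker flags are assumptions):

# Approximately what airflow/bin/cli.py line 265 does, reconstructed from the
# traceback above; the exact flags other than the bind target are assumptions.
import subprocess

sp = subprocess.Popen([
    'gunicorn',
    '-w', '4',                   # worker count, per the "with 4" log line
    '-b', '10.5.75.231:10002',   # args.hostname + ':' + str(args.port)
    'airflow.www.app:app',       # the WSGI app gunicorn is asked to load
])
sp.wait()

# Popen with an argument list does not go through a shell, so 'gunicorn' must
# be resolvable on the PATH of the process running `airflow webserver`. If it
# is not, execvp() fails in the child and Python 2.7 re-raises it in the
# parent as: OSError: [Errno 2] No such file or directory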

Garry van Diggele

Oct 20, 2015, 8:51:52 PM
to Airflow
Fixed: there were some file permission and underlying path issues, which have now been resolved.

Sasha Kacanski

Dec 14, 2015, 2:56:05 PM
to Airflow
  File "/cube/opt/python27/bin/airflow", line 17, in <module>
    args.func(args)
  File "/cube/opt/python27/lib/python2.7/site-packages/airflow/bin/cli.py", line 338, in webserver
    'airflow.www.app:cached_app()'])
  File "/cube/opt/python27/lib/python2.7/subprocess.py", line 710, in __init__
    errread, errwrite)
  File "/cube/opt/python27/lib/python2.7/subprocess.py", line 1335, in _execute_child

    raise child_exception
OSError: [Errno 2] No such file or directory


Same issue here, possibly with permissions.
The DB is created fine (sqlite). I actually installed all the optional modules for airflow, but I can't get the web server up and running.
I tried as a non-privileged user and with sudo ...



so what gives...?

Maxime Beauchemin

Dec 16, 2015, 1:04:37 AM
to Airflow
It looks like gunicorn isn't working.

What happens when you just run `gunicorn`? `No such file or directory`?
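
If running `gunicorn` by hand behaves differently from the webserver (different shell init, sudo, etc.), a quick stdlib check of what this interpreter's environment can actually resolve (just a sketch; it only checks PATH lookup, not exec permissions):

# Check whether the gunicorn executable resolves from this Python's
# environment (Python 2.7 stdlib; PATH lookup only, not permissions).
from distutils.spawn import find_executable

print(find_executable('gunicorn'))   # full path if found, None otherwise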


Sasha Kacanski

Dec 16, 2015, 10:04:35 AM
to Airflow
Ok Maxime,
that did it:

export PATH=/cube/opt/python27:$PATH

I did not realize that gunicorn is a binary and that airflow passes the app to it to launch the web service.
It works now...
I would add this to the documentation, as a bunch of us developers use a variety of python interpreters and most of those are completely virtual or local in scope from the perspective of the user env...

Thanks for the help ...
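
For the record, that is plain subprocess behaviour rather than anything Airflow-specific: the gunicorn child inherits the parent's environment, so the PATH in effect when `airflow webserver` starts is the PATH the exec is resolved against. A minimal sketch of the same fix done from Python (the bin directory below is an assumption about this layout; use whatever find_executable('gunicorn') reports):

# Popen children inherit os.environ, so making the directory that holds the
# gunicorn script visible on PATH before launching the webserver is enough.
import os
import subprocess

# Assumed location for this layout; adjust to where gunicorn actually lives.
os.environ['PATH'] = '/cube/opt/python27/bin' + os.pathsep + os.environ.get('PATH', '')

subprocess.Popen(['gunicorn', '--version']).wait()   # succeeds once PATH resolves it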

Maxime Beauchemin

Dec 16, 2015, 12:54:23 PM
to Airflow
What distro puts python under `/cube/opt`? Are all python executables (CLIs) not on the PATH?

Seems to me like it's more of a problem with the distro/environment...

Max

Sasha Kacanski

Dec 16, 2015, 2:33:27 PM
to Airflow
There is no distro that will set up /cube/opt;
that is my own dev spindle slice.
This is resolved and no issue here. I was just not aware that gunicorn is a binary and not a module invoked via the interpreter ...
I don't know about you and the rest of the community, but I roll my own python installations and they are not advertised to the user env; rather, wrapper scripts take care of anything necessary to satisfy the python binaries and/or modules I use in my development.

Regards,
