DAG isn't available in the web server's DAG BAG object


eliseop...@gmail.com

May 19, 2018, 6:49:44 PM
to cloud-composer-discuss
After I upload a new DAG, it does not appear to be parsed properly. The only hint in the UI is:

This DAG isn't available in the web server's DAG BAG object. It shows up in this list because the scheduler marked it as active in the metadata database.

I can't find any log that explains why the DAG cannot be activated or is not parsed properly (no schedule, no buttons in the UI, no status circles).

How can I track down the problem?

Crystal Qian

May 19, 2018, 8:48:34 PM
to eliseop...@gmail.com, cloud-compo...@googlegroups.com
Hi!

Make sure that the DAG is definitely in the /dags directory of your GCS bucket.
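For example, you can confirm the file actually landed there with gsutil (the bucket placeholder below follows the same convention as the commands underneath):
gsutil ls gs://[your environment bucket]/dags/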

The following command should show whether there are any syntax errors in your DAG:
gcloud beta composer environments run [your environment] list_dags
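For example, with made-up environment values (the --location flag may be required depending on your gcloud version):
gcloud beta composer environments run my-environment --location us-central1 list_dags
If the file fails to import, the Python traceback usually shows up in this command's output.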

Finally, you can run the following for each task in your DAG to verify that it works properly:
gcloud alpha composer environments run [your environment] test -- [dag name] [task name] [task run date]
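For example (DAG id, task id, and date are placeholders; airflow test executes the task without recording state in the database):
gcloud alpha composer environments run my-environment --location us-central1 test -- my_dag my_task 2018-05-19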

Hope this helps!


eliseop...@gmail.com

May 21, 2018, 8:01:59 AM
to cloud-composer-discuss
Thank you, that helped. I had been wondering how to run `airflow` commands from the command line.

I had to spin up a local environment and test my DAG there. The DAG import was failing, even though the DAG appeared in the metadata database.
I quickly found out that, perhaps surprisingly, I could not

from google.cloud import storage

unless I installed the google-cloud client libraries via PyPI. I ended up using the GoogleCloudStorageHook instead.
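For reference, this is roughly what I ran locally to surface the import error, plus the hook-based replacement (paths, bucket, and object names are made up):

# Print any DAG import errors from a local dags/ folder.
from airflow.models import DagBag

dag_bag = DagBag(dag_folder='dags/')
for filename, error in dag_bag.import_errors.items():
    print(filename, error)

# Read a GCS object through the contrib hook instead of google.cloud.storage;
# it uses the google_cloud_default connection that Composer sets up.
from airflow.contrib.hooks.gcs_hook import GoogleCloudStorageHook

hook = GoogleCloudStorageHook()
content = hook.download(bucket='my-bucket', object='path/to/file.csv')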

Separately, is there a way to run `airflow resetdb` and clean out previous DAG runs?




Maria Janczak

May 22, 2018, 6:34:07 PM
to eliseop...@gmail.com, cloud-compo...@googlegroups.com
Hi Elsie,

I would advise against running resetdb, since it will wipe out the connections set up by Composer.
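If the goal is just to clean out old runs, clearing task instances over a date range should be a safer option (DAG id and dates below are placeholders, untested on my end):
gcloud beta composer environments run [your environment] clear -- [dag name] -s 2018-05-01 -e 2018-05-21 --no_confirm
Individual DAG runs can also be deleted from Browse > DAG Runs in the Airflow UI.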

-Maria

