We are using Cloud Composer to run log-data load jobs. Recently we started using it with a BigQuery dataset that is not in the US or EU (asia-southeast1), and one of our DAGs stopped working with errors like the one below:
[2018-12-27 04:22:18,884] {models.py:1736} ERROR - ('BigQuery job status check failed. Final error was: %s', 404)
Traceback (most recent call last):
  File "/usr/local/lib/airflow/airflow/contrib/hooks/bigquery_hook.py", line 981, in run_with_configuration
    jobId=self.running_job_id).execute()
  File "/usr/local/lib/python3.6/site-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/googleapiclient/http.py", line 851, in execute
    raise HttpError(resp, content, uri=self.uri)
googleapiclient.errors.HttpError: <HttpError 404 when requesting https://www.googleapis.com/bigquery/v2/projects/perx-production/jobs/job_lvpf7lmKyR92vFxdEzJ0sH4cnZpx?alt=json returned "Not found: Job perx-production:job_lvpf7lmKyR92vFxdEzJ0sH4cnZpx">
After some digging, I figured out that the error occurs because the Airflow BigQuery hook doesn't pass the required "location" parameter when polling for the results of jobs created in a location other than the US or EU.
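
To illustrate, here is a minimal sketch of what the polling call would need to look like. The project ID, job ID, and region are taken from the traceback above; this mirrors the jobs.get request the hook makes internally, just with the location added:

    from googleapiclient.discovery import build

    # Uses application default credentials, same as inside Composer.
    service = build('bigquery', 'v2')

    # jobs.get accepts an optional "location" parameter; without it,
    # jobs created outside the US/EU multi-regions return 404.
    job = service.jobs().get(
        projectId='perx-production',
        jobId='job_lvpf7lmKyR92vFxdEzJ0sH4cnZpx',
        location='asia-southeast1',  # region where the dataset lives
    ).execute()
    print(job['status'])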
Any help would be appreciated.