BigQuery loading hits quotaExceeded error


bill li

Mar 3, 2021, 11:18:10 AM
to Google Cloud Developers
Starting March 1st, 2021, whenever I load data into Google BigQuery I get the error below:

        "reason": "quotaExceeded", 
        "message": "Quota exceeded: Your project exceeded quota for imports per project. For more information, see https://cloud.google.com/bigquery/troubleshooting-errors", 
        "location": "load_job"

If I reduce the number of concurrent load jobs, or simply retry, the load succeeds. In the Google Cloud Console I can see the request count soaring from around 90,000 to about 3,160,000, mostly with 200 responses. And on the quota page I see no exceeded-quota notice.
(attachments: 123.png, 789.png)

How can I resolve this? Have I been hacked? SOS

Jun (Cloud Platform Support)

Mar 4, 2021, 3:53:42 PM
to Google Cloud Developers
Hey, 

While loading data into BigQuery, please keep the quotas and limits for load jobs at [1] in mind, as these apply to your load jobs (based on the error message you provided, you are likely hitting the limit on load jobs per project per day described at [1]). You can also use the Cloud Logging explorer [2] to get more information on the errors you're seeing.
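Since retrying eventually succeeds, one generic mitigation is to wrap the load call in a retry loop with exponential backoff. Below is a minimal sketch in plain Python, not using the BigQuery client; `load_table` and `QuotaExceededError` are hypothetical stand-ins for your actual load call and the error you catch:

```python
import random
import time


class QuotaExceededError(Exception):
    """Hypothetical stand-in for a load job failing with quotaExceeded."""


def load_with_backoff(load_table, max_attempts=5, base_delay=1.0):
    """Call load_table(), retrying on QuotaExceededError with exponential backoff."""
    for attempt in range(max_attempts):
        try:
            return load_table()
        except QuotaExceededError:
            if attempt == max_attempts - 1:
                raise  # out of attempts, surface the error
            # Sleep base, 2*base, 4*base, ... plus jitter so concurrent
            # jobs don't all retry at the same instant.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))


# Demo: a fake load job that fails twice, then succeeds.
attempts = {"n": 0}

def fake_load():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise QuotaExceededError("quotaExceeded")
    return "done"

result = load_with_backoff(fake_load, base_delay=0.01)
print(result, attempts["n"])  # done 3
```

Backoff only smooths out bursts; if the project is genuinely over its daily load-job limit, batching more files into fewer load jobs is the structural fix.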

In the meantime, please note that Google Groups is reserved for general product discussion only; for technical questions you can reach out on Stack Overflow. To get better support you should post to the relevant forum, so please read the Community Support article for more details.

bill li

Mar 22, 2021, 3:02:03 AM
to Google Cloud Developers
Hi, traffic peaks every weekend, but most of my cron jobs are not scheduled on weekends. Is there any way to find out which dataset's or table's data inserts are causing this?

nibrass

Mar 22, 2021, 8:39:04 AM
to Google Cloud Developers

Hello,

I understand that you would like to know how to determine which tables or datasets were causing the quota exceeded errors.

You can use the following filter [1] in Cloud Logging to get the requests that exceeded quota. In each request you will be able to see the table or dataset for which the error occurred.
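The referenced filter did not survive in the archived thread, but a Cloud Logging filter along these lines can surface failing load jobs. This is a sketch based on the legacy BigQuery AuditData log format; the exact field paths are an assumption and may differ depending on which audit-log format your project emits:

```
resource.type="bigquery_resource"
protoPayload.serviceData.jobCompletedEvent.job.jobStatus.error.message:"Quota exceeded"
```

Each matching entry includes the job configuration, whose destination table identifies the dataset and table the failing load targeted.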

You can also monitor usage in order to avoid or anticipate the errors associated with exceeding a limit:

1) You can query information about jobs and other resources out of the box using SQL via the INFORMATION_SCHEMA metadata views [2]. 
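As a sketch of what such a query might look like, assuming the US region qualifier and the JOBS_BY_PROJECT view (adjust the region to match your datasets), this counts the last day's load jobs per destination table, so the tables driving the quota usage stand out at the top:

```sql
SELECT
  destination_table.dataset_id,
  destination_table.table_id,
  COUNT(*) AS load_jobs
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE job_type = 'LOAD'
  AND creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
GROUP BY 1, 2
ORDER BY load_jobs DESC
```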

2) There are also metrics and audit logs, which can be exported for querying [3].

If you want to reset or raise a BigQuery quota, please contact Google Cloud Platform Support [4].

Best Regards,
Nibrass
========================

[1]