Where to put Utilities.sleep(1000);


Sander

Feb 4, 2015, 10:38:21 AM
to adwords...@googlegroups.com
Hi all,

I'm trying to use this script:

In the log I get an error saying I should use Utilities.sleep(1000).
I've tried placing it in a few places, but apparently not the correct one.

Can anyone tell me where to put it?

It says line 320 is the problem. That's this part:
function getUrlStatus(url) {
  url = encodeURI(url);
  var response = UrlFetchApp.fetch(url, { muteHttpExceptions: true});
  
  return response.getResponseCode();
}

It doesn't seem like the right place for it, though?

Alexander Wang

Feb 4, 2015, 2:06:20 PM
to adwords...@googlegroups.com
Hi Sander,

Generally, that error means you've made too many requests to UrlFetchApp and hit the quota for that hour/day. You'll want to place the sleep call before the UrlFetchApp.fetch call. You can see all of the limits (including the UrlFetchApp limits) here.

Something like this:
function getUrlStatus(url) {
  url = encodeURI(url);

  try {
    // Try and fetch the specified url.
    var response = UrlFetchApp.fetch(url, { muteHttpExceptions: true });
    return response.getResponseCode();
  } catch (e) {
    // If it fails, it's likely because we've hit the rate limit for UrlFetchApp.
    // Sleep for one second before making another request.
    Utilities.sleep(1000);
    response = UrlFetchApp.fetch(url, { muteHttpExceptions: true });
    return response.getResponseCode();
  }
}

This is a quick and dirty way to adjust your script. It won't help if the fetch fails multiple times in a row (you may want to increase the sleep time from 1 second to 2 or 3 seconds). I also found a slightly more sophisticated way of handling this on Stack Overflow that will retry more than once:
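
The gist of that approach is a retry loop, roughly like this (a sketch only; fetchWithRetries and maxRetries are my own names, not from that post):

function fetchWithRetries(url, maxRetries) {
  for (var i = 0; i < maxRetries; i++) {
    try {
      // On success we're done.
      return UrlFetchApp.fetch(url, { muteHttpExceptions: true });
    } catch (e) {
      // Likely rate limited; wait a second before the next attempt.
      Utilities.sleep(1000);
    }
  }
  // One final attempt; if this one fails too, let the error propagate.
  return UrlFetchApp.fetch(url, { muteHttpExceptions: true });
}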

Hope this helps.

Cheers,
Alex

Sander

Feb 5, 2015, 3:58:09 AM
to adwords...@googlegroups.com
Thanks Alexander!

Unfortunately I still get the error. I'll try it with 2000 now (I assume it's in milliseconds, so 2000 is 2 seconds?).

Alexander Wang

Feb 5, 2015, 3:31:04 PM
to adwords...@googlegroups.com
Yes, it's in milliseconds. Increasing it makes sense and is worth a shot. You may also want to rewrite the code to be more resilient/intelligent (more than one retry, increasing wait times for subsequent retries, etc.). I stumbled onto a JavaScript implementation of exponential backoff here:
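
The idea behind exponential backoff, sketched in Apps Script terms (fetchWithBackoff is a made-up name, and the doubling schedule is just the usual convention, not the exact code from that link):

function fetchWithBackoff(url, maxRetries) {
  for (var i = 0; i < maxRetries; i++) {
    try {
      return UrlFetchApp.fetch(url, { muteHttpExceptions: true });
    } catch (e) {
      // Wait 1s, then 2s, then 4s, doubling after each failed attempt.
      Utilities.sleep(1000 * Math.pow(2, i));
    }
  }
  throw new Error('Giving up on ' + url + ' after ' + maxRetries + ' attempts.');
}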

Probably the easiest thing to do, though, is to limit the number of calls you make. If you have multiple scripts checking urls, you will hit your daily and hourly quotas much more quickly. I believe these quotas are per user, so if someone else (or a different login) schedules/executes the scripts, they will count against different quotas.

You can also adjust the script so that it only checks enabled keywords that belong to enabled campaigns/ad groups. The last thing you can try is a first pass that collects all of the urls you want to check, deduplicates them, and THEN checks them. That way you don't end up checking the same url several times (it's possible you have several keywords/ads with the same destination url). A rough sketch of that is below.
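
Something along these lines, assuming your script reads destination urls off keywords (getDestinationUrl and the getUrlStatus call are my assumptions about your script, so adjust as needed):

function main() {
  var seen = {};
  var urls = [];
  // Only look at enabled keywords in enabled campaigns/ad groups.
  var keywords = AdWordsApp.keywords()
      .withCondition('Status = ENABLED')
      .withCondition('CampaignStatus = ENABLED')
      .withCondition('AdGroupStatus = ENABLED')
      .get();
  while (keywords.hasNext()) {
    var url = keywords.next().getDestinationUrl();
    // Skip empty urls and urls we've already queued.
    if (url && !seen[url]) {
      seen[url] = true;
      urls.push(url);
    }
  }
  // Each unique url gets checked exactly once.
  for (var i = 0; i < urls.length; i++) {
    Logger.log(urls[i] + ' -> ' + getUrlStatus(urls[i]));
  }
}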

Cheers,
Alex