Understanding Apps Script and tracking quotas


Larry Keen

Mar 13, 2025, 3:26:20 PM
to Google Apps Script Community
I have been creating multiple Apps Script projects that use UrlFetchApp, and I am now getting this error: "Exception: Service invoked too many times for one day: premium urlfetch." I am aware of the quotas put in place to prevent abuse of the system, but I am at a loss as to how to start tackling this issue.

My first guess is to assign the scripts either all to one project in the Google Cloud Console or each to its own GCC project, which seems like it would let me track the usage. Would this be a good plan of action?

If anyone could advise on the best steps for me to move forward, it would be greatly appreciated.

Ed Robinson

Mar 14, 2025, 11:41:03 AM
to Google Apps Script Community
Hi Larry,
You're right - the first step is understanding your usage.

It looks like the current quota for URL Fetch calls is 20,000 per day for a free (consumer) account and 100,000 per day for a paid Workspace account.

Assuming you're on a Workspace account, the 100K daily max works out to about 1.16 URL fetches per second (100,000 / 86,400 seconds).

BTW: from reading the article, it sounds like the quota is per user. If that's the case, then separating the scripts into different projects won't help.
It's not clear how to see what your current usage is, but it would be simple to add your own instrumentation to track it.
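For example, something roughly like this, kept in a shared library and called in place of UrlFetchApp.fetch (the function name and property key are just placeholders I made up):

// Sketch: wrap UrlFetchApp.fetch and keep a per-day call counter in Script Properties.
// Counts may undercount slightly under concurrent executions unless you add LockService.
function countedFetch(url, params) {
  var props = PropertiesService.getScriptProperties();
  var key = 'urlfetch-count-' + Utilities.formatDate(new Date(), 'GMT', 'yyyy-MM-dd');
  var count = Number(props.getProperty(key) || 0) + 1;
  props.setProperty(key, String(count));
  return UrlFetchApp.fetch(url, params || {});
}

Reading the stored properties back at the end of the day tells you roughly how many fetches each project made.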

A couple of questions:
1. Do you actually expect to use more than 100K URL fetches per day, or is something misbehaving in the scripts?
2. Are the URL fetches retrieving data that could be cached and served from a cache?

Larry Keen

Mar 14, 2025, 12:03:30 PM
to Google Apps Script Community
First, thanks for the reply.

After posting this discussion, I put together a sheet listing all of my scripts that increments a count each time URL fetch is called. It tracked roughly 25,000 calls to URL fetch in some capacity. (That covers about 2/3 of a day's usage, based on when I turned it on.)

I am using a Workspace account, so I am assuming the 100K limit is the one that applies here.

To answer your questions: 
1. Judging from the tracker, I don't expect that I am surpassing 100K, but since I have been learning these systems from scratch over the past 9 months, I most likely have some less-than-efficient code.
2. Since I am trying to build dashboards that are as close to real time as possible, there is only a handful of data that can be cached, such as data that doesn't change very often or only changes daily. (I just started working caching into my code because of this issue; see the sketch below.)
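What I have started with for that slow-changing data is roughly this (simplified, with a placeholder cache key and a made-up endpoint):

// Sketch: serve daily-changing data from the script cache instead of re-fetching it.
// CacheService entries are limited to 100 KB and a maximum TTL of 6 hours (21600 s).
function getDailyData() {
  var cache = CacheService.getScriptCache();
  var cached = cache.get('daily-data');
  if (cached) {
    return JSON.parse(cached);
  }
  var text = UrlFetchApp.fetch('https://example.com/api/daily').getContentText();
  cache.put('daily-data', text, 21600);
  return JSON.parse(text);
}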

One big note to add here: I created a library with a sort of "base API call function," so that I can just add the library to other scripts and call the function with the necessary parameters rather than building the entire URL fetch every single time.
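For reference, the shape of that base function is roughly this (simplified, with placeholder names and a made-up endpoint; it assumes the APIs return JSON):

// Sketch of the shared "base call" the library exposes; names are illustrative.
// It is also the natural single place to hook in a call counter and cache checks.
function apiCall(endpoint, queryParams, options) {
  var query = Object.keys(queryParams || {}).map(function (k) {
    return encodeURIComponent(k) + '=' + encodeURIComponent(queryParams[k]);
  }).join('&');
  var url = 'https://example.com/api/' + endpoint + (query ? '?' + query : '');
  var response = UrlFetchApp.fetch(url, Object.assign({ muteHttpExceptions: true }, options || {}));
  return JSON.parse(response.getContentText());
}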

I guess my question now is: should I rethink creating live dashboards altogether, or is there some best practice I just haven't found yet for dealing with live data like this?