"Bandwidth quota exceeded" error in Community Connector — What are the limits and best practices?


Hashcrypt

Apr 29, 2026, 3:24:42 AM
to Google Apps Script Community
Hi,

  We are running a Looker Studio Community Connector that fetches data from our
  external API using UrlFetchApp.fetch() and UrlFetchApp.fetchAll(). We
  frequently encounter the following error:

  Data fetch failed: Bandwidth quota exceeded:
  https://our-api-domain.com/api/... Try reducing the rate of data transfer

  About our connector:
  - It connects to 6 different advertising/analytics platforms (similar to a
  cross-platform reporting tool)
  - In getConfig() (Step 2), it makes 6 API calls to load account lists from
  each platform
  - In getData(), it fetches data for multiple accounts across multiple
  platforms — each account requires a separate API call
  - We use UrlFetchApp.fetchAll() to batch requests per platform, and fall back
  to individual UrlFetchApp.fetch() calls if the batch fails
  - API responses can be large (ad account data with 100+ metrics per row,
  across date ranges)
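
  For reference, here is a minimal sketch of our fetch pattern (function and
  parameter names are simplified from our real code; the fetchers are passed
  in so the control flow is visible on its own — in the connector they are
  UrlFetchApp.fetchAll and UrlFetchApp.fetch):

  ```javascript
  // Batch all of one platform's requests with fetchAll(); if the whole batch
  // throws (e.g. a bandwidth/rate error), retry each request individually so
  // one bad request does not sink the entire platform.
  function fetchPlatformBatch(requests, fetchAllFn, fetchFn) {
    try {
      // One HTTP request per entry, issued concurrently.
      return fetchAllFn(requests);
    } catch (e) {
      // Fallback: sequential per-request fetches.
      return requests.map(function (req) {
        try {
          return fetchFn(req.url, req);
        } catch (err) {
          return null; // mark this account's fetch as failed
        }
      });
    }
  }

  // In Apps Script:
  //   fetchPlatformBatch(requests,
  //       function (rs) { return UrlFetchApp.fetchAll(rs); },
  //       function (url, params) { return UrlFetchApp.fetch(url, params); });
  ```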

Questions:
  1. What exactly is the bandwidth quota for UrlFetchApp in Community
  Connectors? Is it per-script, per-user, per-project, or per-execution?
  2. Is there official documentation on bandwidth limits specific to Looker
  Studio Community Connectors (vs. regular Apps Script)?
  3. Does UrlFetchApp.fetchAll() count differently towards bandwidth than
  individual fetch() calls?
  4. Are there any recommended patterns from Google for connectors that need to
  aggregate data from multiple external APIs in a single getData() call?
  5. Is there a way to increase this quota (e.g., Google Workspace Enterprise,
  Cloud project billing, etc.)?

  We've attached a simplified version of our connector code below showing the
  fetch pattern.

  Thanks for any guidance!
apps_script_sanitized.txt

Kildere S Irineu

Apr 30, 2026, 9:39:35 AM
to Google Apps Script Community

Here’s the practical answer:

  1. Documented quota: Apps Script documents URL Fetch calls at 20,000/day for consumer accounts and 100,000/day for Google Workspace accounts, per user, with the quota window resetting 24 hours after the first request. It also documents a 50 MB maximum response size per call.
  2. Bandwidth quota: I don’t see an official public number for a separate UrlFetchApp bandwidth quota specific to Looker Studio Community Connectors. The “Bandwidth quota exceeded” error appears to be an internal transfer/rate protection limit, not a listed numeric quota.
  3. Community Connector vs regular Apps Script: Community Connectors run on Apps Script, so Apps Script quotas apply. Looker Studio docs describe Community Connectors as Apps Script-based connectors to internet-accessible data sources.
  4. fetchAll(): fetchAll() makes multiple URL fetch requests; Google documents it as “fetch multiple URLs,” not as a special quota bucket. Treat each request inside fetchAll() as consuming URL Fetch quota and bandwidth.
  5. Increasing quota: Google Workspace raises the documented daily URL Fetch calls from 20k to 100k, but there is no documented way to buy/enable a higher UrlFetchApp bandwidth limit via Cloud billing. Trial domains may have stricter limits until payment/age conditions are met.
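
Since each request inside fetchAll() counts individually (point 4), one mitigation is to split a large request list into smaller fetchAll() batches and pace them, rather than firing everything in one burst. A minimal sketch (the batch size of 10 and the 500 ms pause are illustrative guesses, not documented values):

```javascript
// Split a request list into fixed-size batches for fetchAll().
function chunkRequests(requests, batchSize) {
  var batches = [];
  for (var i = 0; i < requests.length; i += batchSize) {
    batches.push(requests.slice(i, i + batchSize));
  }
  return batches;
}

// In the connector this would drive the fetch loop:
//   var responses = [];
//   chunkRequests(allRequests, 10).forEach(function (batch) {
//     responses = responses.concat(UrlFetchApp.fetchAll(batch));
//     Utilities.sleep(500); // pace transfers between batches
//   });
```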

Recommended architecture: don’t aggregate 6 platforms × many accounts directly inside getData(). Move ingestion to Cloud Run/Cloud Functions + BigQuery or your own backend cache, then have the connector read a small, filtered result. Also cache getConfig() account lists with CacheService/Properties and reduce API payloads to only requested fields/date ranges.
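
The getConfig() caching could look like this (the cache is injected here to show the logic standalone; in Apps Script it would be CacheService.getScriptCache(), and loadAccountsFn is a placeholder for your per-platform API call — note CacheService caps values at 100 KB and expirations at 6 hours / 21600 s):

```javascript
// Return the platform's account list from cache, calling the API only on a miss.
function getCachedAccounts(platform, cache, loadAccountsFn) {
  var key = 'accounts:' + platform;
  var hit = cache.get(key);
  if (hit !== null) {
    return JSON.parse(hit); // cache hit: no UrlFetchApp call at all
  }
  var accounts = loadAccountsFn(platform); // one API call per platform per TTL
  cache.put(key, JSON.stringify(accounts), 21600); // max allowed TTL: 6 hours
  return accounts;
}
```

With this in place, getConfig() makes its 6 account-list calls at most once per cache TTL instead of on every configuration load.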
