How can I persist file upload progress in a Chrome Extension popup when it’s closed and reopened?


Vishnu Gaddime

Nov 3, 2025, 8:31:02 AM
to Chromium Extensions

Hi everyone,

I’m developing a Chrome Extension where users upload large files.
Since the uploads can take a long time, I want to achieve the following behavior:

  • When a user starts uploading files from the extension popup, the upload begins.

  • If the user clicks somewhere else or closes the popup, the upload should continue running in the background.

  • When the user clicks the extension icon again to reopen the popup, the UI should show the current upload progress if uploads are still in progress.

  • If no upload is in progress, it should show the default start screen.

In short — I want the popup to reopen in the correct state depending on whether uploads are ongoing or not.

My questions are:

  1. Is this possible to achieve in a Chrome Extension (Manifest V3)?

  2. For implementing this kind of reactive UI, should I build the extension using Vite + React, or can this be done effectively with plain HTML/JS popup?

Any guidance, examples, or best practices for setting this up would be really helpful.

Thanks!

Oliver Dunk

Nov 3, 2025, 8:39:54 AM
to Vishnu Gaddime, Chromium Extensions
Hi Vishnu,

Great question!

Since you want the upload to continue even when the popup is closed, you should perform the actual upload in your extension's service worker. I believe we keep the service worker alive for the outgoing part of a fetch request, so you shouldn't need to worry about the service worker being stopped, but if you run into any issues there you can use a keepalive as documented here: https://developer.chrome.com/docs/extensions/develop/migrate/to-service-workers#keep_a_service_worker_alive_until_a_long-running_operation_is_finished
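To make that split concrete, here is a minimal sketch: the popup sends a single message and the service worker performs the fetches. The message shape, `UPLOAD_URL`, and the helper names are illustrative assumptions, not a prescribed API.

```javascript
const UPLOAD_URL = 'https://example.com/upload'; // assumed endpoint

// Service worker side; register with:
//   chrome.runtime.onMessage.addListener(handleMessage);
function handleMessage(msg, sender, sendResponse) {
  if (msg.cmd !== 'startUpload') return;
  uploadAll(msg.items).then(() => sendResponse({ ok: true }));
  return true; // keep the message channel open for the async response
}

// Upload items one at a time; the outgoing fetches run in the worker,
// so they are unaffected by the popup closing.
async function uploadAll(items) {
  for (const item of items) {
    await fetch(UPLOAD_URL, { method: 'POST', body: item.data });
  }
}
```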

One tricky part will be that when the user selects a file from the popup, I expect the handle to the file will only be valid until the popup closes. So you might need to find a way to read the entire file immediately, before you upload it. For example, you could request the `unlimitedStorage` permission and then save the file locally in `chrome.storage.local`. Then you can upload by reading the data from there rather than from the file directly.
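A sketch of that approach, assuming `unlimitedStorage` is in the manifest. Since `chrome.storage` only stores JSON-serializable values, the bytes are encoded as base64; the `'file:'` key prefix is just an illustrative convention.

```javascript
// Read the file fully while the popup is still open, then persist the
// bytes so the service worker can upload them after the popup closes.
async function blobToBase64(blob) {
  const bytes = new Uint8Array(await blob.arrayBuffer());
  let binary = '';
  for (const b of bytes) binary += String.fromCharCode(b);
  return btoa(binary);
}

async function persistFile(file) {
  const data = await blobToBase64(file);
  await chrome.storage.local.set({
    ['file:' + file.name]: { name: file.name, type: file.type, data },
  });
}
```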

To keep the popup state persistent as it is closed and reopened, you'll want to have a way for the popup to get the latest state. You could do this by having the popup send a message to the background service worker or by having the background service worker periodically save its state in an API like `chrome.storage.session` that can also be read from the popup.
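A sketch of the second option, with both halves shown. The `uploadState` key and the `render` callback are illustrative names.

```javascript
// Service worker: call after each file (or chunk) completes.
function reportProgress(done, total) {
  return chrome.storage.session.set({ uploadState: { done, total } });
}

// Popup, on load: render() receives null when no upload is in progress,
// which is the cue to show the default start screen.
async function restoreUi(render) {
  const { uploadState } = await chrome.storage.session.get('uploadState');
  render(uploadState ?? null);
  chrome.storage.onChanged.addListener((changes, area) => {
    if (area === 'session' && changes.uploadState) {
      render(changes.uploadState.newValue);
    }
  });
}
```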

With all of this, your choice of framework and build tool isn't important. You can absolutely use Vite + React if you like but it would work just as well in plain HTML/JS.

I hope this helps! If you have more questions, feel free to ask.
Oliver Dunk | DevRel, Chrome Extensions | https://developer.chrome.com/ | London, GB


To view this discussion visit https://groups.google.com/a/chromium.org/d/msgid/chromium-extensions/c9aa727d-2fa9-4ad0-b53e-2caaa2866ec9n%40chromium.org.

Vishnu Gaddime

Nov 4, 2025, 12:27:23 AM
to Chromium Extensions, Oliver Dunk

Hi Oliver,

Thank you for your response. I also wanted to understand the limitations of Chrome extensions in terms of file uploads.

If I attempt to upload around 1,000 files, each approximately 10 MB in size, would there be any performance issues or inherent limitations I should be aware of?

Appreciate your guidance on this.

Thank you

Oliver Dunk

Nov 4, 2025, 6:31:34 AM
to Vishnu Gaddime, Chromium Extensions
In practice, I imagine that however you tried to do this, Chrome would impose some throttling. Regardless, you should probably avoid uploading that many files simultaneously, to avoid hogging resources and the network connection.

My advice would be to have some logic so you start the uploads in batches.
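That batching logic could be sketched as follows; the batch size and the `uploadOne` callback are placeholders for whatever performs a single upload.

```javascript
// Upload in fixed-size batches so at most batchSize requests are in
// flight at once; the next batch starts only after the previous one
// finishes.
async function uploadInBatches(files, batchSize, uploadOne) {
  for (let i = 0; i < files.length; i += batchSize) {
    await Promise.all(files.slice(i, i + batchSize).map(uploadOne));
  }
}
```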
Oliver Dunk | DevRel, Chrome Extensions | https://developer.chrome.com/ | London, GB

woxxom

Nov 4, 2025, 7:23:10 AM
to Chromium Extensions, Oliver Dunk, Vishnu Gaddime
  1. You'll need to read all the files from disk immediately, before uploading them, because once the popup is closed the browser won't allow reading them. Use Web platform messaging to send each file as a Blob, which only takes about a millisecond per file, since the Blob is just a pointer to the actual data.

    for (const f of inputElem.files) {
      progressElem.textContent = 'Reading ' + f.name;
      await new Promise(setTimeout); // yield so the progress text can repaint
      navigator.serviceWorker.controller.postMessage({cmd: 'upload', file: new Blob([f], {type: f.type})});
    }

    The background script:

    self.onmessage = e => e.data.cmd === 'upload' && uploadFile(e.data.file);
    const uploadQueue = [];
    async function uploadFile(file) {
      if (uploadQueue.push(file) > 1) return; // a loop below is already draining the queue
      while (uploadQueue.length) {
        await fetch(SERVER_URL, { method: 'POST', body: uploadQueue[0] }); // POST is required when sending a body
        uploadQueue.shift();
      }
    }

  2. To report the upload progress within one file, either look up a "workaround for fetch file upload progress" or do the upload in a chrome.offscreen document using XMLHttpRequest.
  3. If you use an offscreen document, you can pass the Blob directly from the popup, without messaging:

    let offscreen;
    while (true) {
      offscreen = chrome.extension.getViews().find(v => v.location.pathname.endsWith('/offscreen.html'));
      if (offscreen) break;
      await chrome.offscreen.createDocument({
        url: 'offscreen.html',
        reasons: [chrome.offscreen.Reason.BLOBS],
        justification: 'Upload files while the popup is closed',
      });
    }
    for (const f of inputElem.files) {
      progressElem.textContent = 'Reading ' + f.name;
      await new Promise(setTimeout); // show progress
      offscreen.uploadFile(f);
    }

    The offscreen script:

    const uploadQueue = [];
    self.uploadFile = uploadFile; // exposing the global in case this code is inside an IIFE or module
    async function uploadFile(file) {
      const blob = new Blob([file], {type: file.type}); // adopting the file handle
      if (uploadQueue.push(blob) > 1) return; // a loop below is already draining the queue
      while (uploadQueue.length) {
        await xmlhttpRequestUpload(SERVER_URL, { body: uploadQueue[0] }); // xmlhttpRequestUpload is pseudo code for promisified XMLHttpRequest
        uploadQueue.shift();
      }
    }
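The `xmlhttpRequestUpload` pseudo code above could be fleshed out like this; the progress callback and the error handling are my own illustrative additions, not part of the original sketch.

```javascript
// Promisified XMLHttpRequest upload with a per-file progress callback,
// the piece fetch() currently lacks. onProgress receives a 0..1 fraction.
function xmlhttpRequestUpload(url, { body, onProgress }) {
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.open('POST', url);
    xhr.upload.onprogress = (e) => {
      if (e.lengthComputable && onProgress) onProgress(e.loaded / e.total);
    };
    xhr.onload = () =>
      xhr.status < 400 ? resolve(xhr) : reject(new Error('HTTP ' + xhr.status));
    xhr.onerror = () => reject(new Error('network error'));
    xhr.send(body);
  });
}
```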

Juraj M.

Nov 4, 2025, 10:31:10 AM
to Chromium Extensions, woxxom, Oliver Dunk, Vishnu Gaddime
I'm sorry to barge in here but we need to talk about that line...
await new Promise(setTimeout);

Just to clarify: as the "delay" argument, we are passing the `reject` function.

And converting function to a number creates NaN, right?
new Promise((resolve, reject) => console.log(+reject))
// NaN

I'm not happy about the NaN, but I guess browsers will simply fall back to 0, although I wouldn't be surprised if Safari exploded on this :D.
Anyway, brilliant code, I love it! :)
It reminds me when I first discovered:
switch (true) {

woxxom

Nov 4, 2025, 12:28:47 PM
to Chromium Extensions, Juraj M., woxxom, Oliver Dunk, Vishnu Gaddime
NaN is fine because the same coercion happens with `undefined` in `setTimeout(func)`, which is used a lot. If it fails in any relevant browser (including Safari), it'll be a major bug, which would arguably be caught by automatic tests before a bad commit even lands in the code base.

woxxom

Nov 4, 2025, 12:37:53 PM
to Chromium Extensions, woxxom, Juraj M., Oliver Dunk, Vishnu Gaddime
I was wrong about the `undefined` coercion: according to Chromium's source code, it explicitly checks for `undefined` and uses 0 as the default. That means I'm relying on a non-explicitly-specified aspect, but I still think that if it ever breaks in a published browser it'll be a major bug that gets fixed within hours, because JavaScript is type-coercive by design.