Intent to Ship: Token Binding


Nick Harper

Apr 25, 2016, 5:37:05 PM
to blin...@chromium.org, net-dev, awha...@chromium.org, van...@chromium.org

Contact emails

nha...@chromium.org, awha...@chromium.org, van...@chromium.org


Specs

http://datatracker.ietf.org/doc/draft-ietf-tokbind-protocol/
http://datatracker.ietf.org/doc/draft-popov-tokbind-negotiation/
http://datatracker.ietf.org/doc/draft-ietf-tokbind-https/

Summary

Token Binding allows a site to cryptographically bind bearer tokens (such as cookies) to the TLS layer, so that if a cookie is stolen, it can't be replayed by an attacker unless the attacker also has possession of the user's Token Binding private key for that site. This continues the work of the already-launched Channel ID feature in Chrome. Token Binding has been behind a flag since M50 and the subject of a Finch experiment on Canary and Dev in M51, and the plan is to turn it on by default in M52.
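
As a rough sketch of the idea (the helper names and the 'bound_key_id' field below are illustrative, not from the drafts or from Chrome; the real signed structure is defined in draft-ietf-tokbind-protocol): the client signs keying material exported from the TLS connection with its per-site private key, and the server only honors a cookie that was issued to that same key.

    # Illustrative sketch only, not Chrome or real server code.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import ec

    def key_id(public_key):
        # Hypothetical key identifier: a digest of the key's SPKI encoding.
        spki = public_key.public_bytes(
            serialization.Encoding.DER,
            serialization.PublicFormat.SubjectPublicKeyInfo)
        h = hashes.Hash(hashes.SHA256())
        h.update(spki)
        return h.finalize()

    def cookie_is_valid(cookie, client_public_key, signature, ekm):
        # 'ekm' stands in for keying material exported from the TLS layer
        # (RFC 5705); the client signed it with its Token Binding private key.
        try:
            client_public_key.verify(signature, ekm, ec.ECDSA(hashes.SHA256()))
        except InvalidSignature:
            return False  # no proof of possession; a stolen cookie stops here
        # 'bound_key_id' is a hypothetical field recording which key the cookie
        # was issued to when the server set it.
        return cookie.bound_key_id == key_id(client_public_key)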


Link to “Intent to Implement” blink-dev discussion

https://groups.google.com/a/chromium.org/forum/#!msg/blink-dev/jTwWj2Y_IPM/7tOHWa34C6EJ


Is this feature supported on all six Blink platforms (Windows, Mac, Linux, Chrome OS, Android, and Android WebView)?

Yes.


Debuggability

Token Binding includes an HTTP request header, which can be viewed in the Network tab of DevTools. Nothing else is provided for developers to debug Token Binding.
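
For example, a minimal server-side sketch for checking whether requests arrive with the header (the header name Sec-Token-Binding comes from draft-ietf-tokbind-https; the middleware itself is purely illustrative):

    # Illustrative WSGI middleware: log whether the Sec-Token-Binding request
    # header (a base64url-encoded TokenBindingMessage) is present.
    def log_token_binding(app):
        def wrapped(environ, start_response):
            tb = environ.get("HTTP_SEC_TOKEN_BINDING")
            print("Sec-Token-Binding present:", bool(tb), "length:", len(tb or ""))
            return app(environ, start_response)
        return wrapped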


Interoperability and Compatibility Risk

Low risk. Token Binding is nearing working group last call in the IETF - editorial changes are still being made, but the technical details have settled down. If future changes are made to the spec, it has version negotiation built in to accommodate that.
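
To illustrate why version skew is low-risk, here is a sketch of the selection logic only (the real negotiation happens in a TLS extension defined by draft-popov-tokbind-negotiation, and the names below are made up): if the two sides share no version or key parameter, Token Binding is simply not used and tokens stay unbound.

    # Illustrative negotiation logic, not the actual TLS extension encoding.
    def negotiate_token_binding(client_versions, client_key_params,
                                server_versions, server_key_params):
        versions = [v for v in client_versions if v in server_versions]
        params = [p for p in client_key_params if p in server_key_params]
        if not versions or not params:
            return None  # no binding negotiated; cookies still work, just unbound
        # highest mutually supported version; client's preference order for params
        return max(versions), params[0]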


OWP launch tracking bug

Feature bug: crbug.com/467312

Launch bug: crbug.com/596699


Entry on the feature dashboard

https://www.chromestatus.com/feature/5097603234529280


Ryan Sleevi

Apr 25, 2016, 8:17:53 PM
to Nick Harper, blink-dev, net-dev, awha...@chromium.org, van...@chromium.org
On Mon, Apr 25, 2016 at 2:36 PM, Nick Harper <nha...@chromium.org> wrote:

> Debuggability
>
> Token Binding includes an HTTP request header, which can be viewed in the Network tab of DevTools. Nothing else is provided for developers to debug Token Binding.
>
>
> Interoperability and Compatibility Risk
>
> Low risk. Token Binding is nearing working group last call in the IETF - editorial changes are still being made, but the technical details have settled down. If future changes are made to the spec, it has version negotiation built in to accommodate that.


I'm largely echoing concerns I've shared privately with the team, so I hope none of this comes as a surprise, or as a feeling that I'm trying to submarine this at the last minute - I'm merely making sure that Blink OWNERS have all the available data in front of them. While I fully support the efforts to reduce the risk of bearer tokens, the tradeoffs are something I'm personally very uncomfortable with, as is the security model. However, I know other teams at Google are very invested in this and would like to see it shipped, so please do not interpret this as a "Google position".

Token Binding (as does Channel Binding) creates compatibility risk with debugging scenarios, and may facilitate a more restrictive Web model that works to harm, rather than help, users. To date, the Web Platform provides no means for guaranteeing an end-to-end cryptographic assertion. This is both a weakness and a strength - a weakness, because it limits the possibility of improving web security, but a strength, because it necessarily prohibits anti-user technologies from being effectively deployed.

- Because Token Binding relies on end-to-end assertions, this limits the ability to use local proxies, such as Fiddler, to inspect or modify traffic.
- Because Token Binding does away with the notion of bearer tokens, this limits the ability to move cookies from one browser to another, or from one application to another (for example, using DevTools to generate a curl or wget request, with cookies, to simulate and further debug).
- Because Token Binding relies on end-to-end assertions, it adds a discernible overhead to making connections. While Token Binding, being based on an HTTP header that can be used on coalesced connections (as in HTTP/2), is a significant improvement over Channel ID, which was connection-oriented (and thus prohibited connection pooling and coalescing, and added non-negligible overhead to establishing connections), this overhead still exists, and is by design.
- Other vendors and implementations, including both Windows and Android, are looking to make Token Binding keys non-extractable from the device. With this property, one could use this as a form of rudimentary DRM on these implementations. There is a risk - and there has already been significant pressure from within Google - to do the same within Chrome, and that pressure will continue through shipping. That is, one could impose the "number of authenticated devices" limits that we see in various native DRM platforms (such as Silverlight and Flash) by binding the notion of "authenticated device" to "registered keys", knowing that the user has no recourse to change user agents or keys without hassle. That is, even if a utility like curl or wget supported Token Binding, there's no guarantee (and in fact, a desired non-guarantee) that it would be able to reuse the cookies, and as a result, sites can use that as an additional signal to discourage users from clearing keys and cookies, by treating them as 'unique devices'. While this is not intrinsic to the notion of Token Binding, the risks are similar to those within FIDO, and it is fundamentally hypothetical, it is still worth highlighting as a risk to platform agility.
- Because of this desire, the security properties of Token Binding have been cryptographically weakened in order to accommodate hardware binding on older devices. In particular, the WG has decided to permit the use of RSA-2048 PKCS#1 v1.5, which arguably should not be used in new standards (as compared with RSA-PSS or ECDSA).
- Because of this disagreement over security policy, there is a compatibility risk that efforts to remove cryptographic algorithms we are uncomfortable with may create interoperability hurdles with servers, either at launch or over time. That is, if a cookie is bound to an RSA-2048 PSS key, but we decide that's below an acceptable security threshold, we would need to delete the cookie as well as the key. While cookie eviction is admittedly non-deterministic, it does couple systems in ways that may be surprising, but is wholly intentional.
- Token Binding necessarily impacts and interacts with the Fetch API in ways that are not clear. During the I2I, Anne van Kesteren of Mozilla referenced https://github.com/whatwg/fetch/issues/30, which does not seem to have progressed.
- Because Token Binding effectively creates segments around cookie stores, it means that existing Chrome APIs, such as WebView and Extensions, will be impacted in some form. While Token Binding does attempt to address this at a protocol level, specifically with the notion of federation, we don't have the implementation experience with this to know if the protocol matches the API. Without launching, it's hard to get good feedback, but once launched, it becomes harder to change the protocol. We've already run into issues with migrating from Channel ID to Token Binding, with questions about whether existing keys should continue or not, and whether to clear cookies or not.
- Because Token Binding works at the protocol level, it further decreases the probability that connections can be effectively pooled or preconnected. Under today's model, unless you're using TLS client authentication, we can effectively preconnect a socket and complete the TLS handshake without lumping it into a 'cors-anonymous' or 'cors-credentials' bucket until the first request has been sent (either with or without cookies). With Token Binding (as with Channel ID, to be clear), the 'credentials' mode moves up to the TLS handshake, and as such, such sockets must be segregated - a connection with a Token Binding key should not be used to service cors-anonymous requests, and vice-versa (see the sketch after this list).
- Token Binding is actively hostile to Enterprise MITM scenarios, particularly for mobile scenarios (laptops, mobile devices). This can be argued as a feature by some security purists, but in thinking about the priority of constituencies and in terms of the Immutable Laws of Security, it's accepted (and part of Chrome's Security FAQ) that we cannot and do not defend against the device owner. For example, one can no longer use a mobile device at work where traffic is MITM'd on the work connection (e.g. wireless) but not on another connection (e.g. public wifi, at home). Similarly, you cannot enable local AV inspection ('self-mitm') conditionally. When I say "cannot", I mean "not without breaking the Token Binding assertions and thus invalidating your cookies". We don't know what impact that will have on users.
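
As a minimal sketch of the pooling concern above (invented names; this is not Chrome's socket pool code): once the binding key is part of the handshake, the pool key has to carry the credentials decision, so warm sockets can no longer be shared across anonymous and credentialed requests.

    # Illustrative only: how a connection-pool key might differ.
    def pool_key(origin, credentials_mode, uses_token_binding):
        if uses_token_binding:
            # credentials are decided at TLS handshake time, so anonymous and
            # credentialed requests need separate warm sockets
            return (origin, credentials_mode)
        # credentials are decided per request; one warm socket can serve both
        return (origin,)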

While I value the incredible effort put forward in Token Binding, my own experiences with Channel ID lead me to believe it's not a good fit for the Web. While it offers defense in depth, it's worth reiterating that many of the concerns Token Binding sets out to address are either already addressed or addressable through other means (Secure cookies, CSP, proper server security), or explicitly out of scope for Chrome's threat model (locally-installed, same-or-higher privileged malware). At best, it seems a useful feature for Enterprises - a number of which I know are excited about the possibilities of both this and FIDO - but that's largely derived from the ability to run a wholly hermetic instance.

With respect to public signals, it's worth noting that Edge is presently advertising draft support for this extension, in some form, I believe as part of their "Windows Hello" attempt at identity.

With all of that said, do I believe there's a path by which to ship Token Binding in Chrome? As I mentioned earlier, I think it's a bad fit for the Web, even though it does set out to solve a hard problem, and even though it's now the fourth attempt by Chrome to do so (although the first for which we've used the Blink process, rather than just turning it on). I suspect proponents of this would likely agree that any form of whitelisting, such as for Enterprise needs, will be unsatisfactory for the use case. So I don't know if there is - but I acknowledge that some of this may only be borne out by actually shipping the thing and seeing what breaks, what happens next, and whether it does ruin everything. As such, it may be useful to know what impact there would be from disabling this feature later, or how to quantify whether this feature is a worthwhile addition to the platform.

There's clearly a desire from teams at Google to see this feature launched, and while I previously participated in those conversations, I've decreased my input and involvement to the point that I can no longer fairly or accurately summarize the arguments in favor of this. But I believe this does represent a platform and compatibility risk, and that it may be unavoidable by design and arguably intentional; I think the Blink OWNERS should consider that when evaluating this feature.

Anne van Kesteren

Apr 26, 2016, 4:20:02 AM
to Nick Harper, blink-dev, net-dev, awha...@chromium.org, van...@chromium.org
On Mon, Apr 25, 2016 at 11:36 PM, Nick Harper <nha...@chromium.org> wrote:
> Interoperability and Compatibility Risk
>
> Low risk. Token Binding is nearing working group last call in the IETF -
> editorial changes are still being made, but the technical details have
> settled down. If future changes are made to the spec, it has version
> negotiation built in to accommodate that.

Ryan pointed this out as well, but how this integrates with the
networking in the browser is still rather unclear. Not sure why
https://github.com/whatwg/fetch/issues/30 ended up ignored.


--
https://annevankesteren.nl/

Nick Harper

Apr 26, 2016, 12:44:31 PM
to Ryan Sleevi, blink-dev, net-dev, awha...@chromium.org, van...@chromium.org
On Mon, Apr 25, 2016 at 5:17 PM, Ryan Sleevi <rsl...@chromium.org> wrote:
> - Because of this desire, the security properties of Token Binding have been cryptographically weakened in order to accommodate hardware binding on older devices. In particular, the WG has decided to permit the use of RSA-2048 PKCS#1 v1.5, which arguably should not be used in new standards (as compared with RSA-PSS or ECDSA).

Although the spec allows use of RSA-2048 PKCS#1 v1.5, I have only implemented support for ECDSA P-256. If there are servers that only support RSA, I would consider adding support to Chrome for RSA-PSS, but I have no intention of adding support for PKCS#1 v1.5.
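
For concreteness, a sketch of the kind of key and signature involved, using the Python cryptography library (illustrative only, not Chrome code; the exact bytes being signed are defined by draft-ietf-tokbind-protocol):

    # Generate a per-site ECDSA P-256 key and sign placeholder exported
    # keying material; the real signed structure also covers the token
    # binding type and key parameters.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec

    private_key = ec.generate_private_key(ec.SECP256R1())
    ekm = b"placeholder exported keying material"
    signature = private_key.sign(ekm, ec.ECDSA(hashes.SHA256()))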

Vinod Anupam

Apr 26, 2016, 12:49:00 PM
to Anne van Kesteren, Nick Harper, blink-dev, net-dev, awha...@chromium.org, van...@chromium.org
Hi Anne,

Sorry, we got spread too thin trying to address multiple fronts and missed making progress on this.
We will pick this up ASAP.

cheers,
-Anupam


Nick Harper

Apr 27, 2016, 4:47:13 PM
to blink-dev, Vinod Anupam, Anne van Kesteren, net-dev, awha...@chromium.org, van...@chromium.org
I'm going to pause this Intent to Ship for now while we incubate this and work on resolving the concerns raised here, including the Fetch spec integration. When this is resumed, I'll include links to a doc discussing the concerns and mitigation strategies, and a doc discussing the threat model for Token Binding. The resumed Intent to Ship will also better describe the compatibility risks if we were to remove support for Token Binding further down the road.