Proposal: Marking HTTP As Non-Secure


Chris Palmer

Dec 12, 2014, 7:46:36 PM
to public-w...@w3.org, blink-dev, security-dev, dev-se...@lists.mozilla.org
Hi everyone,

Apologies to those of you who are about to get this more than once, due to the cross-posting. I'd like to get feedback from a wide variety of people: UA developers, web developers, and users. The canonical location for this proposal is: https://www.chromium.org/Home/chromium-security/marking-http-as-non-secure.

Proposal


We, the Chrome Security Team, propose that user agents (UAs) gradually change their UX to display non-secure origins as affirmatively non-secure. We intend to devise and begin deploying a transition plan for Chrome in 2015.


The goal of this proposal is to more clearly display to users that HTTP provides no data security.


Request


We’d like to hear everyone’s thoughts on this proposal, and to discuss with the web community how different transition plans might serve users.


Background


We all need data communication on the web to be secure (private, authenticated, untampered). When there is no data security, the UA should explicitly display that, so users can make informed decisions about how to interact with an origin.


Roughly speaking, there are three basic transport layer security states for web origins:


  • Secure (valid HTTPS, other origins like (*, localhost, *));

  • Dubious (valid HTTPS but with mixed passive resources, valid HTTPS with minor TLS errors); and

  • Non-secure (broken HTTPS, HTTP).


For more precise definitions of secure and non-secure, see Requirements for Powerful Features and Mixed Content.
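As a rough illustration, the three-way taxonomy above can be read as a classifier over a few origin properties. This is a Python sketch of our own; the predicate names and exact rules are an approximation, not part of the proposal:

```python
from enum import Enum

class State(Enum):
    SECURE = "secure"          # valid HTTPS, or a trustworthy origin like localhost
    DUBIOUS = "dubious"        # valid HTTPS with passive mixed content or minor TLS errors
    NON_SECURE = "non-secure"  # plain HTTP, or HTTPS with broken TLS

def classify(scheme: str, host: str, tls_valid: bool = False,
             passive_mixed: bool = False, minor_tls_errors: bool = False) -> State:
    """Approximate the Secure / Dubious / Non-secure split described above."""
    if scheme == "https":
        if not tls_valid:
            return State.NON_SECURE          # broken HTTPS
        if passive_mixed or minor_tls_errors:
            return State.DUBIOUS
        return State.SECURE
    if host in ("localhost", "127.0.0.1"):   # local origins are treated as secure
        return State.SECURE
    return State.NON_SECURE                  # plain HTTP
```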


We know that active tampering and surveillance attacks, as well as passive surveillance attacks, are not theoretical but are in fact commonplace on the web.


RFC 7258: Pervasive Monitoring Is an Attack

NSA uses Google cookies to pinpoint targets for hacking

Verizon’s ‘Perma-Cookie’ Is a Privacy-Killing Machine

How bad is it to replace adSense code id to ISP's adSense ID on free Internet?

Comcast Wi-Fi serving self-promotional ads via JavaScript injection

Erosion of the moral authority of transparent middleboxes

Transitioning The Web To HTTPS


We know that people do not generally perceive the absence of a warning sign. (See e.g. The Emperor's New Security Indicators.) Yet the only situation in which web browsers are guaranteed not to warn users is precisely when there is no chance of security: when the origin is transported via HTTP. Here are screenshots of the status quo for non-secure domains in Chrome, Safari, Firefox, and Internet Explorer:


Screen Shot 2014-12-11 at 5.08.48 PM.png


Screen Shot 2014-12-11 at 5.09.55 PM.png


Screen Shot 2014-12-11 at 5.11.04 PM.png


ie-non-secure.png


Particulars


UA vendors who agree with this proposal should decide how best to phase in the UX changes given the needs of their users and their product design constraints. Generally, we suggest a phased approach to marking non-secure origins as non-secure. For example, a UA vendor might decide that in the medium term, they will represent non-secure origins in the same way that they represent Dubious origins. Then, in the long term, the vendor might decide to represent non-secure origins in the same way that they represent Bad origins (e.g. broken HTTPS).


Ultimately, we can even imagine a long term in which secure origins are so widely deployed that we can leave them unmarked (as HTTP is today), and mark only the rare non-secure origins.


There are several ways vendors might decide to transition from one phase to the next. For example, the transition plan could be time-based:


  1. T0 (now): Non-secure origins unmarked

  2. T1: Non-secure origins marked as Dubious

  3. T2: Non-secure origins marked as Non-secure

  4. T3: Secure origins unmarked


Or, vendors might set thresholds based on telemetry that measures the ratios of user interaction with secure origins vs. non-secure. Consider this strawman proposal:


  1. Secure > 65%: Non-secure origins marked as Dubious

  2. Secure > 75%: Non-secure origins marked as Non-secure

  3. Secure > 85%: Secure origins unmarked


The particular thresholds or transition dates are very much up for discussion. Additionally, how to define “ratios of user interaction” is also up for discussion; ideas include the ratio of secure to non-secure page loads, the ratio of secure to non-secure resource loads, or the ratio of total time spent interacting with secure vs. non-secure origins.
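For illustration only, the strawman telemetry thresholds above can be expressed as a simple lookup. This hypothetical Python sketch just encodes the numbers from the strawman; the phase descriptions are ours:

```python
def ui_phase(secure_ratio: float) -> tuple:
    """Map the measured fraction of secure user interaction (0.0-1.0) to
    (treatment of non-secure origins, treatment of secure origins)."""
    if secure_ratio > 0.85:
        return ("marked as Non-secure", "unmarked")
    if secure_ratio > 0.75:
        return ("marked as Non-secure", "marked as Secure")
    if secure_ratio > 0.65:
        return ("marked as Dubious", "marked as Secure")
    return ("unmarked", "marked as Secure")   # status quo
```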


We’d love to hear what UA vendors, web developers, and users think. Thanks for reading!

Eduardo Robles Elvira

Dec 12, 2014, 8:18:12 PM
to Chris Palmer, public-w...@w3.org, blink-dev, security-dev, dev-se...@lists.mozilla.org
Hello Chris et al:

I'm a web developer. I did some vendor security-related development inside Konqueror some time ago. Some first thoughts:

* In principle, the proposal makes sense to me. Who doesn't want a more secure web? Kudos for making it possible with ambitious proposals like this.

* The biggest problem I see is that, traditionally, you needed to pay to get an accepted certificate. This was a show-stopper for TLS certs on small websites. Mozilla, the EFF, Cisco, and Akamai are trying to fix that [1], and StartSSL already gives free certificates. Just stating the obvious: either "secure" certificates become easy and free to get, or this proposal is going to make some webmasters angry.


Regards,
--
[1] https://www.eff.org/deeplinks/2014/11/certificate-authority-encrypt-entire-web
--
Eduardo Robles Elvira     @edulix             skype: edulix2
http://agoravoting.org       @agoravoting     +34 634 571 634

Chris Palmer

Dec 12, 2014, 8:32:28 PM
to Eduardo Robles Elvira, public-w...@w3.org, blink-dev, security-dev, dev-se...@lists.mozilla.org
On Fri, Dec 12, 2014 at 5:17 PM, Eduardo Robles Elvira <edu...@agoravoting.com> wrote:

* The biggest problem I see is that, traditionally, you needed to pay to get an accepted certificate. This was a show-stopper for TLS certs on small websites. Mozilla, the EFF, Cisco, and Akamai are trying to fix that [1], and StartSSL already gives free certificates. Just stating the obvious: either "secure" certificates become easy and free to get, or this proposal is going to make some webmasters angry.


Oh yes, absolutely. Obviously, Let's Encrypt is a great help, and SSLMate's ease-of-use and low price is great, and CloudFlare's free SSL helps too. 

Hopefully, as operations like those ramp up, it will get increasingly easier for web developers to switch to HTTPS. We (Chrome) will weigh changes to the UX very carefully, and with a close eye on HTTPS adoption.

Alex Gaynor

Dec 12, 2014, 8:47:00 PM
to Chris Palmer, public-w...@w3.org, blink-dev, security-dev, dev-se...@lists.mozilla.org
Fantastic news, I'm very glad to see the Chrome Security Team taking initiative on this.

Existing browsers' behavior of defaulting to, and using a "neutral" UI for, HTTP is fundamentally an assumption about what users want. And it's not an assumption that is grounded in data.

No ordinary user's mental model of communication on the net includes a lack of authenticity, integrity, or confidentiality. Plaintext is a blight on the internet, and this is a fantastic step towards making reality match users' (TOTALLY REASONABLE!) expectations.

Cheers,
Alex

To unsubscribe from this group and stop receiving emails from it, send an email to security-dev...@chromium.org.

Hanno Böck

Dec 12, 2014, 9:17:01 PM
to securi...@chromium.org, Chris Palmer
On Fri, 12 Dec 2014 16:46:34 -0800, "'Chris Palmer' via Security-dev" <securi...@chromium.org> wrote:

> Apologies to those of you who are about to get this more than once,
> due to the cross-posting. I'd like to get feedback from a wide
> variety of people: UA developers, web developers, and users. The
> canonical location for this proposal is:

First of all: ++ from me, I like this very much.

But there's one thing I feel increasingly uneasy about and I'd like to
bring that up: I think much more stuff should be moved into the
"dubious" category.

I feel that a "de-facto standard" for secure HTTPS configs has emerged
from discussions around TLS in recent years; some large
webpages and some security-conscious people follow it, but most ignore it
(including e.g. banks and others who should care). E.g. I'd consider
HSTS pretty much required for a secure setup. Also agl wrote a few days
ago "everything less than TLS 1.2 with an AEAD cipher suite is
cryptographically broken". Of course that's completely reasonable given
the amount of CBC/MAC-then-encrypt-related attacks we've seen in the
past. So I'm asking: Why do webpages that only support
"cryptographically broken" protocols get a green lock? They shouldn't.
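For concreteness, a server operator can enforce the "TLS 1.2 with an AEAD cipher suite" floor agl describes. Here is a hedged sketch using Python's standard ssl module (Python 3.7+); the particular OpenSSL cipher string is our choice of AEAD-only suites, not a quote from agl:

```python
import ssl

# Build a server-side context that refuses anything below TLS 1.2 and,
# for TLS 1.2, offers only AEAD suites (AES-GCM, ChaCha20-Poly1305) with ECDHE.
ctx = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
ctx.set_ciphers("ECDHE+AESGCM:ECDHE+CHACHA20")  # OpenSSL cipher-list syntax

# Every remaining suite is now an AEAD suite.
names = [c["name"] for c in ctx.get_ciphers()]
```

A site still offering CBC suites or TLS 1.0/1.1 would fail to handshake against such a policy, which is exactly the class of configuration being questioned here.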

I feel this is the part that should be discussed alongside.

--
Hanno Böck
http://hboeck.de/

mail/jabber: ha...@hboeck.de
GPG: BBB51E42

Ryan Sleevi

Dec 12, 2014, 9:23:54 PM
to Hanno Böck, security-dev, Chris Palmer

Indeed, and expect a separate discussion of that. You can already see some of the discussion on the security-dev@ list regarding requiring OCSP stapling or modern ciphersuites for EV, and one can naturally assume that will migrate to DV.

That is, just as EV moves to DV when deployed dangerously, so too should DV move to dubious.

But, as you note, that's something to be discussed alongside.

Chris Palmer

Dec 12, 2014, 9:25:45 PM
to Ryan Sleevi, Hanno Böck, security-dev
The SHA-1 deprecation plan that Ryan and I put up (http://googleonlinesecurity.blogspot.com/2014/09/gradually-sunsetting-sha-1.html) was a first take at moving old and busted crypto into Dubious. See also https://code.google.com/p/chromium/issues/detail?id=102949 for broken things we simply un-supported, as well.

Mike West

Dec 13, 2014, 6:53:18 AM
to Ryan Sleevi, Hanno Böck, security-dev, Chris Palmer

That is, just as EV moves to DV when deployed dangerously, so too should DV move to dubious.

Ryan, for clarity, I think you mean that DV should move to dubious iff deployed dangerously, right? Not as a blanket statement? :)

-mike

Igor Bukanov

Dec 13, 2014, 11:56:22 AM
to Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
Free SSL certificates help, but another problem is that activating SSL not only generates warnings, it can outright break a site due to links to insecure resources. Just consider old pages with a few YouTube videos served using http iframes. Accessing those pages over https stops the videos from working, as browsers block access to active insecure content. In YouTube's case one can fix that, but for other resources it may not be possible.

So what is required is the ability to refer to insecure content from HTTPS pages without harming the user experience. For example, there should be a way to insert an http iframe into an https site. Similarly, it would be nice if a web developer could refer to scripts, images, etc. over http as long as the script/image tag is accompanied by a secure hash of the known content.
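The "secure hash of the known content" idea can be made concrete: the page author embeds a digest of the exact bytes they expect, and the browser verifies the fetched bytes against it before use. A minimal Python sketch of that check follows; the function names and the "sha384-" encoding are our illustration, and a real browser would also have to decide how such resources count for origin purposes:

```python
import base64
import hashlib

def content_digest(resource: bytes) -> str:
    """Digest an exact resource body, in a 'sha384-<base64>' form."""
    raw = hashlib.sha384(resource).digest()
    return "sha384-" + base64.b64encode(raw).decode("ascii")

def integrity_ok(fetched: bytes, expected: str) -> bool:
    """True only if the fetched bytes match the digest the author embedded."""
    return content_digest(fetched) == expected

# The author computes the digest once, at publish time...
published = b"console.log('hello');"
expected = content_digest(published)

# ...and the browser would check it at load time, even over plain http:
tampered = b"console.log('evil');"
```

Note this protects integrity (a network attacker cannot substitute the script) but not confidentiality: the request and response are still visible on the wire.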

_______________________________________________
dev-security mailing list
dev-se...@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security

Mathias Bynens

Dec 13, 2014, 12:33:42 PM
to Chris Palmer, public-w...@w3.org, blink-dev, security-dev, dev-se...@lists.mozilla.org
On Sat, Dec 13, 2014 at 1:46 AM, 'Chris Palmer' via blink-dev <blin...@chromium.org> wrote:

We know that people do not generally perceive the absence of a warning sign. (See e.g. The Emperor's New Security Indicators.) Yet the only situation in which web browsers are guaranteed not to warn users is precisely when there is no chance of security: when the origin is transported via HTTP. Here are screenshots of the status quo for non-secure domains in Chrome, Safari, Firefox, and Internet Explorer:




For completeness' sake, here's what a non-secure site looks like in Opera:


[Opera screenshot]

Christian Heutger

Dec 13, 2014, 2:06:02 PM
to pal...@google.com, edu...@agoravoting.com, public-w...@w3.org, blin...@chromium.org, securi...@chromium.org, dev-se...@lists.mozilla.org
I see a big danger in the current trend. If we expect everyone to have a free "secure" certificate and require HTTPS everywhere, nothing is won. DV certificates (similar to DANE) say absolutely nothing about the website operator. They ensure encryption, so I can then be phished and scammed … encrypted. Big advantage!^^ Pushing real validation (e.g. EV with the green address bar and details validated by an independent third party, with no breakable, spoofable automatism) vs. no validation is much more important and should be the focus. This "change" could still come with marking HTTP as non-secure, but simply stating that HTTPS is secure is the completely wrong signal and will result in more confusion, and in losing any trust in any kind of browser padlock, than before.

Just a proposal:

Mark HTTP as Non-Secure (similar to self-signed) e.g. with a red padlock or sth. similar.
Mark HTTPS as Secure (and only secure in favor of encrypted) e.g. with a yellow padlock or sth. similar
Mark HTTPS with Extended Validation (encrypted and validated) as it is with a green padlock or sth. similar

This would be a good road for more security on the web.

Chris Palmer

Dec 14, 2014, 12:59:23 PM
to Igor Bukanov, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On Sat, Dec 13, 2014 at 8:56 AM, Igor Bukanov <ig...@mir2.org> wrote:

Free SSL certificates help, but another problem is that activating SSL not only generates warnings, it can outright break a site due to links to insecure resources. Just consider old pages with a few YouTube videos served using http iframes. Accessing those pages over https stops the videos from working, as browsers block access to active insecure content. In YouTube's case one can fix that, but for other resources it may not be possible.

Yes, unfortunately we have a collective action problem. (http://en.wikipedia.org/wiki/Collective_action#Collective_action_problem) But just because it's hard doesn't mean we shouldn't try. I'd suggest that embedders ask embeddees to at least make HTTPS available, even if not the default.

Also, keep in mind that this proposal is only to mark HTTP as non-secure — HTTP will still work, and you can still host your site over HTTP.
 
So what is required is the ability to refer to insecure content from HTTPS pages without harming the user experience.

No, because that reduces or eliminates the security guarantee of HTTPS.
 
For example, there should be a way to insert an http iframe into an https site. Similarly, it would be nice if a web developer could refer to scripts, images, etc. over http as long as the script/image tag is accompanied by a secure hash of the known content.

Same thing here. The security guarantee of HTTPS is the combination of server authentication, data integrity, and data confidentiality. It is not a good user experience to take away confidentiality without telling users. And, unfortunately, we cannot effectively communicate that nuance. We have enough trouble effectively communicating the secure/non-secure distinction as it is.

Alex Gaynor

Dec 14, 2014, 1:01:00 PM
to Chris Palmer, Igor Bukanov, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
Chris,

Is there a plan for HTTP to eventually have an interstitial, the way HTTPS with a bogus cert does?

Alex


Chris Palmer

Dec 14, 2014, 1:17:08 PM
to Christian Heutger, edu...@agoravoting.com, public-w...@w3.org, blin...@chromium.org, securi...@chromium.org, dev-se...@lists.mozilla.org
On Sat, Dec 13, 2014 at 11:05 AM, Christian Heutger <chri...@heutger.net> wrote:

I see a big danger in the current trend. If we expect everyone to have a free "secure" certificate and require HTTPS everywhere, nothing is won. DV certificates (similar to DANE) say absolutely nothing about the website operator.

Reducing the number of parties you have to trust from [ the site operator, the operators of all networks between you and the site operator ] to just [ the site operator ] is a huge win.
 
They ensure encryption, so I can then be phished and scammed … encrypted. Big advantage!^^ Pushing real validation (e.g. EV with the green address bar and details validated by an independent third party, with no breakable, spoofable automatism) vs. no validation is much more important and should be the focus.

I think you'll find EV is not as "extended" as you might be hoping.

But more importantly, the only way to get minimal server auth, data integrity, and data confidentiality on a mass scale is with something at least as easy to deploy as DV. Indeed, you'll see many of the other messages in this thread are from people concerned that DV isn't easy enough yet! So requiring EV is a non-starter.

Additionally, the web origin concept is (scheme, host, port). Crucially, EV-issued names are not distinct origins from DV-issued names, and proposals to enforce such a distinction in browsers have not gotten any traction because they are not super feasible (for a variety of reasons).
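The origin triple is easy to make concrete. A small Python sketch using the standard urllib follows; the default-port normalization is our simplification of the origin rules:

```python
from urllib.parse import urlsplit

DEFAULT_PORTS = {"http": 80, "https": 443}

def origin(url: str) -> tuple:
    """Return the (scheme, host, port) web-origin triple for a URL."""
    p = urlsplit(url)
    return (p.scheme, p.hostname, p.port or DEFAULT_PORTS.get(p.scheme))

# Certificates do not enter into it: an EV cert and a DV cert for the same
# host and port serve the *same* origin...
same = origin("https://example.com/a") == origin("https://example.com/b")

# ...while changing the scheme (or port) produces a distinct origin.
different = origin("http://example.com/") != origin("https://example.com/")
```

This is why an EV/DV distinction cannot be enforced at the origin level: the triple simply has no slot for validation type.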
 
However, this "change" could come with marking HTTP as non-secure, but simply stating that HTTPS is secure is the completely wrong signal and will result in more confusion, and in losing any trust in any kind of browser padlock, than before.

HTTPS is the bare minimum requirement for secure web application *transport*. Is secure transport by itself sufficient to achieve total *application-semantic* security? No. But a browser couldn't determine that level of security anyway. Our goal is for the browser to tell as much of the truth as it can programmatically determine at run-time.

Igor Bukanov

Dec 14, 2014, 1:34:26 PM
to Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On 14 December 2014 at 18:59, Chris Palmer <pal...@google.com> wrote:

Yes, unfortunately we have a collective action problem. (http://en.wikipedia.org/wiki/Collective_action#Collective_action_problem) But just because it's hard doesn't mean we shouldn't try. I'd suggest that embedders ask embeddees to at least make HTTPS available, even if not the default.

Also, keep in mind that this proposal is only to mark HTTP as non-secure — HTTP will still work, and you can still host your site over HTTP.

If serving content over HTTPS produces broken pages, the incentive to enable encryption is very low. As was already mentioned, a solution is to allow serving encrypted pages over http://, so that pages referring to unencrypted elements would not break but would just produce warnings. Such encrypted http:// would also allow fewer warnings for a page whose content is all available over a self-signed, key-pinned certificate, as that is strictly more secure than plain HTTP.

Chris Palmer

Dec 14, 2014, 1:34:56 PM
to Alex Gaynor, Igor Bukanov, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On Sun, Dec 14, 2014 at 10:00 AM, Alex Gaynor <alex....@gmail.com> wrote:

Is there a plan for HTTP to eventually have an interstitial, the way HTTPS with a bogus cert does?

We (Chrome) have no current plan to do that. In the Beautiful Future, when some huge percentage of pageviews are shown via secure transport, it might or might not make sense to show an interstitial for HTTP. I kind of doubt that it will be a good idea, but who knows. We'll see.

Chris Palmer

Dec 14, 2014, 1:40:58 PM
to Igor Bukanov, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On Sun, Dec 14, 2014 at 10:34 AM, Igor Bukanov <ig...@mir2.org> wrote:

If serving content over HTTPS produces broken pages, the incentive to enable encryption is very low.

That's the definition of a collective action problem, yes.

I think that the incentives will change, and are changing, and people are becoming more aware of the problems of non-secure transport. There is an on-going culture shift, and more and more publishers are going to offer HTTPS. For example, http://open.blogs.nytimes.com/author/eitan-konigsburg/?_r=0.

As was already mentioned, a solution is to allow serving encrypted pages over http://, so that pages referring to unencrypted elements would not break but would just produce warnings. Such encrypted http:// would also allow fewer warnings for a page whose content is all available over a self-signed, key-pinned certificate, as that is strictly more secure than plain HTTP.

But, again, consider the definition of the origin. If it is possible for securely-transported code to run in the same context as non-securely transported code, the securely-transported code is effectively non-secure.

Igor Bukanov

Dec 14, 2014, 1:48:08 PM
to Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On 14 December 2014 at 19:40, Chris Palmer <pal...@google.com> wrote:

But, again, consider the definition of the origin. If it is possible for securely-transported code to run in the same context as non-securely transported code, the securely-transported code is effectively non-secure.

Yes, but the point is that the page would be shown with the same warnings as a plain http page, rather than as a broken page.

Igor Bukanov

Dec 14, 2014, 1:53:30 PM
to Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
I.e. just consider that currently a hosting provider has no option to unconditionally encrypt the pages they host for modern browsers, as that may break their users' pages. With encrypted http:// they would get such an option, delegating the job of fixing warnings about insecure content to the content producers, as it should be.

Chris Palmer

Dec 14, 2014, 2:08:32 PM
to Igor Bukanov, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On Sun, Dec 14, 2014 at 10:53 AM, Igor Bukanov <ig...@mir2.org> wrote:

I.e. just consider that currently a hosting provider has no option to unconditionally encrypt the pages they host for modern browsers, as that may break their users' pages. With encrypted http:// they would get such an option, delegating the job of fixing warnings about insecure content to the content producers, as it should be.

I'm sorry; I still don't understand what you mean. Do you mean that you want browsers to treat some hypothetical encrypted HTTP protocol as if it were a secure origin, but still allow non-secure embedded content in these origins?

I would argue strongly against that, and so far not even the "opportunistic encryption" advocates have argued for that.

Igor Bukanov

Dec 14, 2014, 2:26:45 PM
to Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
I would like to see some hypothetical encrypted http:// where a browser presents a page as if it were over https:// if everything is from a secure origin, and as if it were served over plain http if not. That is, if a future browser shows warnings for plain http, it will show the same warnings for encrypted http:// with insecure resources.

The point of such encrypted http:// is to guarantee that *enabling encryption never degrades the user experience* compared with plain http. This would allow a particular installation to start serving everything encrypted independently of the job of fixing the content. And since the page is still served as http://, users' training/expectations about https:// sites no longer apply.

Michal Zalewski

Dec 14, 2014, 2:48:08 PM
to Igor Bukanov, Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
> I would like to see some hypothetical encrypted http:// where a browser
> presents a page as if it were over https:// if everything is from a secure
> origin, and as if it were served over plain http if not. That is, if a
> future browser shows warnings for plain http, it will show the same
> warnings for encrypted http:// with insecure resources.

Browsers have flirted with designs along the lines of your proposal, with
non-blocking mixed content icons. Unfortunately, websites are not
static, so the net effect was that if you watched the address bar
constantly, you'd eventually get notified that previously-entered
data you thought would be visible only to a "secure" origin had
already been leaked to / exposed to network attackers.

The main point of having a visible and stable indicator for encrypted
sites is to communicate to the user that the site offers a good degree
of resilience against the examination or modification of the exchanged
data by network attackers. (It is a complicated property and it is
often misunderstood as providing clear-cut privacy assurances for your
online habits, but that's a separate topic.)

Any changes that make this indicator disappear randomly at unexpected
times, or make the already-complicated assurances more fragile and
even harder to explain, are probably not the right way to go.

/mz

Igor Bukanov

Dec 14, 2014, 3:04:05 PM
to Michal Zalewski, Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On 14 December 2014 at 20:47, Michal Zalewski <lca...@google.com> wrote:
The main point of having a visible and stable indicator for encrypted
sites is to communicate to the user that the site offers a good degree
of resilience against the examination or modification of the exchanged
data by network attackers.

Then the browser should show absolutely no indication of a secure origin for encrypted http://. The idea is that the encrypted http:// experience would be equivalent to the current http experience: no indications of security and no warnings. However, encrypted http:// with insecure elements would start to produce warnings in the same way a future browser will show warnings for plain http.

Without something like this I just do not see how a lot of sites could ever start enabling encryption unconditionally. Currently, enabling https often requires modifying content in a significant way. I would like a site operator to have the option of enabling encryption unconditionally without touching the content.

Michal Zalewski

Dec 14, 2014, 3:07:46 PM
to Igor Bukanov, Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
> Then the browser should show absolutely no indication of a secure origin
> for encrypted http://. The idea is that the encrypted http:// experience
> would be equivalent to the current http experience: no indications of
> security and no warnings. However, encrypted http:// with insecure
> elements would start to produce warnings in the same way a future
> browser will show warnings for plain http.

As mentioned in my previous response, this gets *really* hairy because
the "has insecure elements" part is not a static property that can be
determined up front; so, you end up with the problem of sudden and
unexpected downgrades and notifying the user only after the
confidentiality or integrity of the previously-stored data has been
compromised.

/mz

Christian Heutger

Dec 14, 2014, 4:41:39 PM
to Chris Palmer, edu...@agoravoting.com, public-w...@w3.org, blin...@chromium.org, securi...@chromium.org, dev-se...@lists.mozilla.org
> Reducing the number of parties you have to trust from [ the site operator, the operators of all networks between you and the site operator ] to just [ the site operator ] is a huge win.

But how can I trust him, and who is he? No WHOIS records, no imprint, everything spoofable; so what should I trust? If there is a third party who confirms that the details given are correct and offers a warranty against misinformation, that's something I could trust. When shopping online I also look at customer reviews, but I don't treat them as fully trustworthy, since they may be faked; if the shop has a seal like Trusted Shops with a money-back guarantee, I feel good and shop there.
 
> I think you'll find EV is not as "extended" as you might be hoping.

I know, but it’s the best we currently have. And DV is much worse, ultimately losing any trust in HTTPS, scaling it down to encryption and nothing else.

> But more importantly, the only way to get minimal server auth, data integrity, and data confidentiality on a mass scale is with something at least as easy to deploy as DV. Indeed, you'll see many of the other messages in this thread are from people concerned that DV isn’t
> easy enough yet! So requiring EV is a non-starter.

I agree on data confidentiality, and maybe also on integrity, although DV without effort or cost may break that too. But "server auth" by itself says little: whatever server endpoint I call is exactly what gets authenticated, nothing more. However, I support the idea of mass encryption; but before confusing and damaging end users' understanding of internet security, there needs to be a clear differentiation between mere encryption and encryption with valid authentication.

> HTTPS is the bare minimum requirement for secure web application *transport*. Is secure transport by itself sufficient to achieve total *application-semantic* security? No. But a browser couldn't determine that level of security anyway. Our goal is for the browser to tell
> as much of the truth as it can programmatically determine at run-time.

But wasn’t that the idea of certificates? Seals on websites can be spoofed, WHOIS records can be spoofed, imprints can be spoofed, but spoofing EV certificates, e.g. in combination with solutions like pinning, is a hard job. If there were no browser warning for self-signed certificates, I would not see any advantage in mass-issued (ultimately fully automated) DV certificates. It’s a bit back to the old days, I remember, when some website operators offered their self-signed root for installation in the browser to remove the warning.

Peter Bowen

Dec 14, 2014, 5:57:44 PM
to Chris Palmer, Igor Bukanov, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On Sun, Dec 14, 2014 at 11:08 AM, 'Chris Palmer' via Security-dev
<securi...@chromium.org> wrote:
> On Sun, Dec 14, 2014 at 10:53 AM, Igor Bukanov <ig...@mir2.org> wrote:
>
>> I.e. just consider that currently a hosting provider has no option to
>> unconditionally encrypt pages they host for modern browsers as that may
>> break pages of the users. With encrypted http:// they get such option
>> delegating the job of fixing warnings about insecure context to the content
>> producers as it should.
>
>
> I'm sorry; I still don't understand what you mean. Do you mean that you want
> browsers to treat some hypothetical encrypted HTTP protocol as if it were a
> secure origin, but still allow non-secure embedded content in these origins?

I'm also not clear on what Igor intended, but there is a real issue
with browser presentation of URLs using TLS today. There is no way to
declare "I know that this page will have insecure content, so don't
consider me a secure origin" such that the browser will show a
"neutral" icon rather than a warning icon. I think there is a strong
impression that a closed lock is better than neutral, but a yellow
warning sign over the lock is worse than neutral. Today this prevents
sites from using HTTPS unless they have very high confidence that
all resources on the page will come from secure origins.

Thanks,
Peter

cha...@yandex-team.ru

unread,
Dec 15, 2014, 4:04:44 AM12/15/14
to Christian Heutger, pal...@google.com, edu...@agoravoting.com, public-w...@w3.org, blin...@chromium.org, securi...@chromium.org, dev-se...@lists.mozilla.org
(Ouch. maintaining the cross-posting ;( ).
 
15.12.2014, 11:57, "Christian Heutger" <chri...@heutger.net>:
However, this „change“ could come with marking HTTP as Non-Secure, but just stating HTTPS as secure is the completely wrong sign and will result in more confusion and losing any trust in any kind of browser padlocks than before.
 
The message someone wrote about "I understand I am using insecure mixed content, please don't make me look worse than someone who doesn't understand that" strikes a chord - I think there is value in supporting the use case.
 
Just a proposal:
 
Mark HTTP as Non-Secure (similar to self-signed) e.g. with a red padlock or sth. similar.
Mark HTTPS as Secure (and only secure in favor of encrypted) e.g. with a yellow padlock or sth. similar
Mark HTTPS with Extended Validation (encrypted and validated) as it is with a green padlock or sth. similar
 
Just a "nit pick" - please don't EVER rely only on colour distinction to communicate anything important. There are a lot of colour-blind people, and on top there are various people who rely on high-contrast modes and the like which effectively strip the colour out.
 
cheers
 
Chaals
 
--
Charles McCathie Nevile - web standards - CTO Office, Yandex
cha...@yandex-team.ru - - - Find more at http://yandex.com
 

Igor Bukanov

unread,
Dec 15, 2014, 4:16:46 AM12/15/14
to Peter Bowen, Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On 14 December 2014 at 23:57, Peter Bowen <pzb...@gmail.com> wrote:
 I think there is a strong
impression that a closed lock is better than neutral, but a yellow
warning sign over the lock is worse than neutral.

The problem is not just a warning sign.

Browsers prevent any active content, including iframes served over http, from loading. Thus showing a page with YouTube and other videos over https is not an option unless one fixes the page. Now consider that it is not a matter of running sed on a set of static files, but rather of patching content stored in a database or fixing the JS code that inserts the video, and the task of enabling https becomes non-trivial and very content-dependent.

So indeed an option is needed to declare that, despite proper certificates and encryption, the site should be treated as an insecure origin. This way the page will be shown as if it were served, as before, over plain http, with no changes in user experience. But then it cannot be an https site, since many users still consider that https is enough to assume a secure site. Hence the idea of encrypted http:// or something that makes the user's experience with an encrypted page absolutely the same as with plain http://, down to the browser stripping http:// from the URL.

After considering this, I think it would even be fine for a future browser to show a warning for such properly-encrypted-but-explicitly-declared-as-insecure pages in the same way a warning will be shown for plain http. And it would be really nice if a site operator, after activating such user-invisible encryption, could receive reports from the browser about any violation of the secure origin policy, in the same way violations of CSP are reported today. This would give a nice possibility of activating encryption without breaking anything, collecting reports of violated secure origin policy, fixing content, and finally declaring the site explicitly https-only.
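Igor's reporting idea maps closely onto how CSP violation reports already work today: the browser POSTs a small JSON document to an operator-chosen endpoint. A minimal sketch of such a collector (the port and the `/csp-report` path are arbitrary assumptions; the field names follow the CSP 1.0 `csp-report` JSON format):

```python
# Minimal sketch of an endpoint collecting CSP violation reports, so a
# site operator can find insecure sub-resources before flipping to https.
# Port and path below are illustrative assumptions, not a standard.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def parse_csp_report(body):
    """Extract (blocked-uri, violated-directive) from a CSP report body."""
    report = json.loads(body)["csp-report"]
    return report["blocked-uri"], report["violated-directive"]

class ReportCollector(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        blocked, directive = parse_csp_report(self.rfile.read(length))
        # Log which insecure resource the browser flagged, and why.
        print(blocked, "violated", directive)
        self.send_response(204)
        self.end_headers()

# HTTPServer(("", 8080), ReportCollector).serve_forever()  # run locally
```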

Jeffrey Walton

unread,
Dec 15, 2014, 4:29:54 AM12/15/14
to Christian Heutger, public-w...@w3.org, blin...@chromium.org, securi...@chromium.org, dev-se...@lists.mozilla.org
On Sat, Dec 13, 2014 at 2:05 PM, Christian Heutger
<chri...@heutger.net> wrote:
> I see a big danger in the current trend.

Surely you haven't missed the big danger in plain text traffic. That
traffic gets intercepted and fed into systems like XKeyscore for Tailored
Access Operations (TAO). In layman's terms, adversaries are using the
information gathered to gain unauthorized access to systems.

> Expecting everyone having a free
> „secure“ certificate and being in requirement to enable HTTPS it will result
> in nothing won. DV certificates (similar to DANE) do finally say absolutely
> nothing about the website operator.

The race to the bottom among CAs is to blame for the quality of
verification by the CAs.

With companies like StartCom, Cacert and Mozilla offering free
certificates, there is no barrier to entry.

Plus, I don't think a certificate needs to say anything about the
operator. They need to ensure the server is authenticated. That is,
the public key bound to the DNS name is authentic.

> They ensure encryption, so I can then be
> phished, be scammed, … encrypted. Big advantage!^^

As I understand it, phishers try to avoid TLS because they count on
the plain text channel to avoid all the browser warnings. Peter
Gutmann discusses this in his Engineering Security book
(https://www.cs.auckland.ac.nz/~pgut001/pubs/book.pdf).

> Pushing real validation
> (e.g. EV with green adressbar and validated details by an independent third
> party, no breakable, spoofable automatism) vs. no validation is much more
> important and should be focussed on.

You should probably read Gutmann's Engineering Security. See his
discussion of "PKI me harder" in Chapter 1 or 6 (IIRC).

> However, this „change“ could come with
> marking HTTP as Non-Secure, but just stating HTTPS as secure is the
> completely wrong sign and will result in more confusion and losing any
> trust in any kind of browser padlocks than before.

Security engineering studies seem to indicate most users don't
understand the icons. It would probably be better if the browsers did
the right thing, and took the users out of the loop. Gutmann talks
about it in detail (with lots of citations).

> Just a proposal:
>
> Mark HTTP as Non-Secure (similar to self-signed) e.g. with a red padlock or
> sth. similar.

+1. In the browser world, plaintext was (still is?) held in higher
esteem than opportunistic encryption. Why the browsers chose to
indicate things this way is a mystery.

> Mark HTTPS as Secure (and only secure in favor of encrypted) e.g. with a
> yellow padlock or sth. similar
> Mark HTTPS with Extended Validation (encrypted and validated) as it is with
> a green padlock or sth. similar

Why green for EV (or why yellow for DV or DANE)? EV does not add any
technical controls. From a security standpoint, DV and EV are
equivalent.

If DNS is authentic, then DANE provides stronger assurances than DV or
EV since the domain operator published the information and the
veracity does not rely on others like CAs (modulo DBOUND).

Not relying on a CA is a good thing since it's usually advantageous to
minimize trust (for some definition of "trust"). Plus, CAs don't
really warrant anything, so it's not clear what exactly they are
providing to relying parties (they are providing a signature for
money to the applicant).

Open question: do you think the browsers will support a model other
than the CA Zoo for rooting trust?

Michal Zalewski

unread,
Dec 15, 2014, 4:30:23 AM12/15/14
to Igor Bukanov, Peter Bowen, Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
> So indeed an option to declare that despite proper certificates and
> encryption the site should be treated as of insecure origin is needed. This
> way the page will be shown as if it was served as before with plain http
> with no changes in user experience. But then it cannot be a https site
> since many users still consider that https is enough to assume a secure
> site. Hence the idea of encrypted http:// or something that makes user
> experience with an encrypted page absolutely the same as she has with plain
> http:// down to the browser stripping http:// from the URL.

Sounds like you're essentially proposing a flavor of opportunistic
encryption for http://, right?

That seems somewhat tangential to Chris' original proposal, and there
is probably a healthy debate to be had about this; it may be also
worthwhile to look at SPDY and QUIC. In general, if you're comfortable
with not providing users with a visible / verifiable degree of
transport security, I'm not sure how the proposal changes this?

By the way, note that nothing is as simple as it seems; opportunistic
encryption is easy to suggest, but it's pretty damn hard to iron out
all the kinks. If there is genuinely no distinction between plain old
HTTP and opportunistically encrypted HTTP, the scheme can be
immediately rendered useless by any active attacker, and suffers from
many other flaws (for example, how do you link from within that
opportunistic scheme to other URLs within your application without
downgrading to http:// or upgrading to "real" https://?). Establishing
a new scheme solves that, but doesn't really address your other
concern - you still need to clean up all links. On top of that, it
opens a whole new can of worms by messing around with SOP.

/mz

Ryan Sleevi

unread,
Dec 15, 2014, 4:38:43 AM12/15/14
to Jeffrey Walton, security-dev, blink-dev, dev-se...@lists.mozilla.org, Christian Heutger, public-w...@w3.org

From an SOP point of view, this is true.
However, it is increasingly less true if you're willing to ignore the (near cataclysmic) SOP failure, as EV gains technical controls such as certificate transparency and potentially mandatory stronger security settings (e.g. secure ciphersuites in modern TLS, OCSP stapling, etc.). Additionally, there are other technical controls (validity periods, key processing) that do offer distinction.

That is, it is not all procedural changes, and UAs can detect and differentiate. While the hope is that these will be able to apply to all sites in the future, any change of this scale takes time.

> If DNS is authentic, then DANE provides stronger assurances than DV or
> EV since the domain operator published the information and the
> veracity does not rely on others like CAs (modulo DBOUND).
>
> Not relying on a CA is a good thing since it's usually advantageous to
> minimize trust (for some definition of "trust"). Plus, CAs don't
> really warrant anything, so it's not clear what exactly they are
> providing to relying parties (they are providing a signature for
> money to the applicant).
>
> Open question: do you think the browsers will support a model other
> than the CA Zoo for rooting trust?

Chromium has no plans for this, particularly those based on DNS/DANE, which are empirically less secure and more operationally fraught with peril. I would neither take it as foregone that the CA system cannot improve nor am I confident that any of the known alternatives are either practical or comparable in security to CAs, let alone superior.

Igor Bukanov

unread,
Dec 15, 2014, 5:03:15 AM12/15/14
to Michal Zalewski, Peter Bowen, Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On 15 December 2014 at 10:30, Michal Zalewski <lca...@google.com> wrote:
That seems somewhat tangential to Chris' original proposal, and there
is probably a healthy debate to be had about this; it may be also
worthwhile to look at SPDY and QUIC. In general, if you're comfortable
with not providing users with a visible / verifiable degree of
transport security, I'm not sure how the proposal changes this?


Chris' original proposal is a stick. I want to give a site operator also a carrot. That can be an option to activate encryption that is not visible to the user and *receive* from the browser all reports about violations of secure origin policy. This way the operator will know that they can activate HTTPS without worsening user experience and have information that helps to fix the content.

If there is genuinely no distinction between plain old
HTTP and opportunistically encrypted HTTP, the scheme can be
immediately rendered useless by any active attacker

I am not proposing that a user-invisible encryption should stay forever. Rather it should be treated just as a tool to help site operators to transition to the proper https so at no stage the user experience would be worse than continuing to serve pages with plain http.

mof...@gmail.com

unread,
Dec 15, 2014, 9:41:39 AM12/15/14
to securi...@chromium.org
On Saturday, December 13, 2014 at 3:46:36 AM UTC+3, Chris Palmer wrote:
> [quoted proposal snipped]

This is the worst idea ever. There is no point in encrypting ANYTHING. Why encrypt cat pictures? There are public websites, static websites - there is no need to encrypt their traffic. They are public, open, and not related to anything that needs to be hidden. HTTPS adds an additional point of failure and an additional thing to monitor for system administrators and web developers.
Also, it's not always possible to use HTTPS. For many websites it's much easier to just stay HTTP-only.

You basically want all websites to run on HTTPS, and that's a stupid idea.

If something like this is implemented in Chromium, I'll cast a vote to prevent it from growing.

Daniel Veditz

unread,
Dec 15, 2014, 12:55:00 PM12/15/14
to Igor Bukanov, Michal Zalewski, Peter Bowen, Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On 12/15/14 2:03 AM, Igor Bukanov wrote:
> Chris' original proposal is a stick. I want to give a site operator also
> a carrot. That can be an option to activate encryption that is not
> visible to the user and *receive* from the browser all reports about
> violations of secure origin policy. This way the operator will know that
> they can activate HTTPS without worsening user experience and have
> information that helps to fix the content.

Serve the HTML page over http: but load all sub-resources over https: as
expected after the transition. Add the following header:

Content-Security-Policy-Report-Only: default-src https:; report-uri <me>

(add "script-src https: 'unsafe-inline' 'unsafe-eval';" if necessary)

This doesn't give you the benefit of encrypting your main HTML content
during the transition as you requested, but it is something that can be
done today. When the reports come back clean enough you can switch the
page content to https too.
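As a sketch, the header described above could be attached server-side like this (Python stdlib; the port, page body, and `/csp-report` path are illustrative assumptions, and a real deployment would set the header in the web server or framework config):

```python
# Serve a page over plain http while attaching a report-only CSP that
# flags any sub-resource that would not load over https. Report-only
# means nothing is blocked; the browser only POSTs violation reports
# to the report-uri.
from http.server import BaseHTTPRequestHandler, HTTPServer

REPORT_ONLY_POLICY = (
    "default-src https:; "
    "script-src https: 'unsafe-inline' 'unsafe-eval'; "
    "report-uri /csp-report"
)

class TransitionHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Security-Policy-Report-Only",
                         REPORT_ONLY_POLICY)
        self.end_headers()
        self.wfile.write(b"<html><body>transition page</body></html>")

# HTTPServer(("", 8000), TransitionHandler).serve_forever()  # run locally
```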

-Dan Veditz

simonand...@gmail.com

unread,
Dec 15, 2014, 6:21:18 PM12/15/14
to blin...@chromium.org, public-w...@w3.org, securi...@chromium.org, dev-se...@lists.mozilla.org
I'm a webmaster and I switched a good 2 months ago (my website is really not huge, around 100K unique monthly visitors; it's a one-man website).
I would say switching was "interesting".
It was easy until I realized I had mixed content on some of my pages (a visitor tipped me off; it was embarrassing).
Finding, analyzing, and solving that tiny problem was hard. I spent 99% of the switching time working on that one little problem.

My biggest problem with switching to HTTPS is that it costs time and money, and as an independent, non-venture-backed webmaster, my resources are very limited.

I'm not sure there is a positive ROI in switching for a small webmaster like me.
My visitors don't care about and/or understand what HTTPS is and why it is better for them.

However, flagging non-secure websites would definitely generate more awareness. 

I definitely - and I think a lot of the small guys, who switched already - support this 100%.


On Saturday, December 13, 2014 1:46:39 AM UTC+1, Chris Palmer wrote:
[quoted proposal and screenshots snipped]

ferdy.c...@gmail.com

unread,
Dec 15, 2014, 6:28:38 PM12/15/14
to blin...@chromium.org, public-w...@w3.org, securi...@chromium.org, dev-se...@lists.mozilla.org
I'm a small website owner and I believe this proposal will upset a lot of small hosters and website owners. In particular for simple content websites, https is a burden in time and cost, while not adding much value. I don't need to be convinced of the security advantages of this proposal; I'm just looking at the practical aspects of it. Furthermore, as mentioned here, there is the issue of mixed content and of plugins you don't own or control. I sure hope any such warning message is very subtle, otherwise a lot of traffic will be driven away from websites.

Adrienne Porter Felt

unread,
Dec 15, 2014, 6:50:36 PM12/15/14
to ferdy.c...@gmail.com, blink-dev, public-w...@w3.org, security-dev, dev-se...@lists.mozilla.org
If someone thinks their users are OK with their website not having integrity/authentication/privacy, then why is it problematic that Chrome will start telling users about it? Presumably these users would still be OK with it after Chrome starts making the situation more obvious. (And if the users start disliking it, then perhaps they really were never OK with it in the first place?)

Alex Gaynor

unread,
Dec 15, 2014, 6:53:02 PM12/15/14
to Adrienne Porter Felt, ferdy.c...@gmail.com, blink-dev, public-w...@w3.org, security-dev, dev-se...@lists.mozilla.org
Indeed, the notion that users don't care is based on an ill-founded premise of informed consent.

Here's a copy-pasted comment I made elsewhere on this topic:

To respect a user's decision, their decision needs to be an informed one, and it needs to be a choice. I don't think there's a reasonable basis to say either of those is the case for users using HTTP in favor of HTTPS:

First, is it a choice: given that browsers default to HTTP when no protocol is explicitly selected, and that many users will access the site via external links that they don't control, I don't think it's fair to say that users choose HTTP; they simply get HTTP.

Second, if we did say they'd made a choice, was it an informed one? We, as an industry, have done a very poor job of educating users about the security implications of actions online. I don't believe most non-technical users have an understanding of the implications of the loss of Authentication, Integrity, or Confidentiality that comes with preferring HTTP to HTTPS.

Given the fact that most users don't proactively consent to having their content spied upon or mutated in transit, and insofar as they do, it is not informed consent, I don't believe website authors have any obligation to provide access to content over dangerous protocols like HTTP.


Alex


ferdy.c...@gmail.com

unread,
Dec 15, 2014, 7:10:21 PM12/15/14
to blin...@chromium.org, ferdy.c...@gmail.com, public-w...@w3.org, securi...@chromium.org, dev-se...@lists.mozilla.org, fe...@chromium.org
"If someone thinks their users are OK with their website not having integrity/authentication/privacy"

That is an assumption that doesn't apply to every website. Many websites don't even have authentication.

"Presumably these users would still be OK with it after Chrome starts making the situation more obvious."

Or perhaps it doesn't, and it scares them away. Just like with the cookie bars, where now every user believes all cookies are evil. You assume users are able to make an informed decision based on such warnings, and I doubt that.



Donald Stufft

unread,
Dec 15, 2014, 7:12:28 PM12/15/14
to ferdy.c...@gmail.com, blin...@chromium.org, public-w...@w3.org, securi...@chromium.org, dev-se...@lists.mozilla.org, fe...@chromium.org

On Dec 15, 2014, at 7:10 PM, ferdy.c...@gmail.com wrote:

"If someone thinks their users are OK with their website not having integrity/authentication/privacy"

That is an assumption that doesn't apply to every website. Many websites don't even have authentication.

"Presumably these users would still be OK with it after Chrome starts making the situation more obvious."

Or perhaps it doesn't, and it scares them away. Just like with the cookie bars, where now every user believes all cookies are evil. You assume users are able to make an informed decision based on such warnings, and I doubt that.

"Presumably these users would still be OK with it after Chrome starts making the situation more obvious"

If users are unable to make an informed choice, then I personally believe it’s up to the User Agent to try and pick the choice the user most likely wants. I have a hard time imagining that most users, if given the choice between allowing anyone in the same coffee shop to read what they are reading and not allowing it, would willingly choose HTTP over HTTPS.

---
Donald Stufft
PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA

ferdy.c...@gmail.com

unread,
Dec 15, 2014, 7:12:41 PM12/15/14
to blin...@chromium.org, fe...@chromium.org, ferdy.c...@gmail.com, public-w...@w3.org, securi...@chromium.org, dev-se...@lists.mozilla.org, alex....@gmail.com
"To respect a user's decision, their decision needs to be an informed one, and it needs to be a choice. I don't think there's a reasonable basis to say either of those is the case for users using HTTP in favor of HTTPS:"

What if a decision doesn't apply, or what if the user is unable to be informed and make a proper decision? I don't agree that every action should be decided upon by the end-user; I am much more in favor of educating website owners.

ferdy.c...@gmail.com

unread,
Dec 15, 2014, 7:15:14 PM12/15/14
to blin...@chromium.org, ferdy.c...@gmail.com, public-w...@w3.org, securi...@chromium.org, dev-se...@lists.mozilla.org, fe...@chromium.org, donald...@gmail.com
I think a choice between HTTP and HTTPS by the user doesn't make sense. The real choice is different: it is the choice between staying on an HTTP-only site or leaving it altogether.

Ryan Sleevi

unread,
Dec 15, 2014, 7:19:00 PM12/15/14
to ferdy.c...@gmail.com, blink-dev, public-w...@w3.org, security-dev, dev-se...@lists.mozilla.org, Adrienne Porter Felt
On Mon, Dec 15, 2014 at 4:10 PM, <ferdy.c...@gmail.com> wrote:
"If someone thinks their users are OK with their website not having integrity/authentication/privacy"

That is an assumption that doesn't apply to every website. Many websites don't even have authentication. 

I think there may be some confusion.

"Authentication" here does not refer to "does the user authenticate themselves to the site" (e.g. do they log in), but "is the site you're talking to the site you expected" (or, put differently, "does the server authenticate itself to the user").

Without authentication in this sense (i.e. talking to whom you think you're talking to), anyone can trivially impersonate a server and alter the responses. This is not that hard; here are a few examples of why authentication is important, even for sites without logins:


This is why it's important to know you're talking to the site you're expecting (Authentication), and that no one has modified that site's contents (Integrity).

Christian Heutger

unread,
Dec 15, 2014, 8:11:55 PM12/15/14
to nolo...@gmail.com, public-w...@w3.org, blin...@chromium.org, securi...@chromium.org, dev-se...@lists.mozilla.org
>Surely you haven't missed the big danger in plain text traffic. That
>traffic gets intercepted and fed into systems like XKeyscore for Tailored
>Access Operations (TAO). In layman's terms, adversaries are using the
>information gathered to gain unauthorized access to systems.

With DV (weak validation) it then goes to them encrypted; I don't see the
advantage. The magic bullet Tor, to prevent being monitored, has also
shown that the expected privacy may be broken. It's a good idea, but
stepping back from the value of PKIX for it is the wrong way in my
opinion.

>The race to the bottom among CAs is to blame for the quality of
>verification by the CAs.

Right, so DV needs to be deprecated, or set to a recognizably lower level,
clearly stating that it's only encryption, nothing else.

>With companies like StartCom, Cacert and Mozilla offering free
>certificates, there is no barrier to entry.

And no barrier to breaking the value of certificate authorities vs.
self-signed certificates (CAcert is the only good exception; for a good
reason, their approach is different).

>Plus, I don't think a certificate needs to say anything about the
>operator. They need to ensure the server is authenticated. That is, the
>public key bound to the DNS name is authentic.

If a certificate doesn't tell, what should? How should I be sure to
be on www.onlinebanking.de and not www.onlínebanking.de (see the accent),
without getting spoofed or phished? It's the same for Facebook.com vs.
Facebo0k.com, ...
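The look-alike domain above is, to DNS, simply a different name carried as an ASCII punycode label. A quick sketch in Python (using the illustrative domain names from the paragraph above):

```python
# The accented 'í' makes this a completely different domain in DNS,
# where it travels as an ASCII "xn--" (punycode) label. A fully automated
# DV issuance process would validate the look-alike just as readily as
# the genuine name - which is the concern being raised here.
spoofed = "www.onlínebanking.de".encode("idna")
genuine = "www.onlinebanking.de".encode("idna")
print(spoofed)   # an xn-- label, visually nothing like the genuine name
print(genuine)
```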

>As I understand it, phishers try to avoid TLS because they count on the
>plain text channel to avoid all the browser warnings. Peter Gutmann
>discusses this in his Engineering Security book
>(https://www.cs.auckland.ac.nz/~pgut001/pubs/book.pdf).

If there is a free certificate for everyone and everything is HTTPS, which
browser warnings would occur?

>Why green for EV (or why yellow for DV or DANE)? EV does not add any
>technical controls. From a security standpoint, DV and EV are equivalent.

That's what certificates are for. If we only wanted encryption, there
would never have been any requirement for certificates; browsers and
servers handle cipher suites, handshakes, etc. The certificate is the
digital equivalent of an authorized identity card, and there DV and EV
certainly differ. Security is about confidentiality, integrity and
availability: confidentiality is the encryption, integrity is the
validation.

>If DNS is authentic, then DANE provides stronger assurances than DV or EV
>since the domain operator published the information and the veracity does
>not rely on others like CAs (modulo DBOUND).

From the pure technical standpoint, yes; from the validation standpoint,
no. DANE has the hassle of compatibility, and it also struggles with the
mandatory enforcement of restrictions (online or offline key material,
key sizes, algorithms, reissuance after the Debian bug or Heartbleed, ...
all the topics that arose recently). For pinning validated (EV)
certificates, though, it's the best solution compared to pinning or
transparency.

>Not relying on a CA is a good thing since its usually advantageous to
>minimize trust (for some definition of "trust"). Plus, CAs don¹t really
>warrant anything, so its not clear what exactly they are providing to
>relying parties (they are providing a a signature for money to the
>applicant).

As there is no internet governance, they are the only available
alternative. Like other agencies that exist worldwide, they take money
for validation services and warrant against mis-validation. Strict rules
dictate how they must operate, and they are audited to prove they follow
those rules. That's how auditing currently works in many places, and
although it's not the optimal system, it's the one currently available.

>Open question: do you think the browsers will support a model other than
>the CA Zoo for rooting trust?

If a reliable, usable and manageable concept is established, for sure.
But note that e.g. ISO 27001 establishes the same model: a company is
paid to state that what it audited is correct and to issue a seal (being
ISO 27001 certified) that end users are supposed to trust.

Ryan Sleevi

unread,
Dec 15, 2014, 8:20:54 PM12/15/14
to Christian Heutger, nolo...@gmail.com, public-w...@w3.org, blin...@chromium.org, securi...@chromium.org, dev-se...@lists.mozilla.org
There's a lot of information here that isn't quite correct, but I would hate to rathole on a discussion of CA security vs the alternatives, which inevitably arises when one discusses HTTPS in public fora.

I think the discussion here is somewhat orthogonal to the proposal at hand, and thus might be best if kept to a separate thread.

The question of whether HTTP is equivalent to HTTPS-DV in security or authenticity is simple: they aren't at all equivalent; HTTPS-DV provides vastly superior value over HTTP. Whether or not UAs embrace HTTPS-EV is a separate question. But as a UA vendor, I would say HTTPS-DV has far more appeal for protecting users and providing security than HTTPS-EV, for many of the reasons you've heard on this thread from others (e.g. the challenge of mixed HTTPS+HTTP is the same as that of mixed HTTPS-EV + HTTPS-DV).

So, assuming we have HTTP vs. HTTPS-EV/HTTPS-DV, how best should UAs communicate to the user the lack of security guarantees from HTTP?


Christian Heutger

unread,
Dec 15, 2014, 8:50:37 PM12/15/14
to rsl...@chromium.org, nolo...@gmail.com, public-w...@w3.org, blin...@chromium.org, securi...@chromium.org, dev-se...@lists.mozilla.org
> So, assuming we have HTTP vs. HTTPS-EV/HTTPS-DV, how best should UAs communicate to the user the lack of security guarantees from HTTP?

I would recommend here as mentioned:

- No padlock, red bar or red strike, ... => no encryption [and no validation], e.g. similar to SHA-1 deprecation in the worst case
- HTTP vs. HTTPS only: padlock => everything fine and not red, "normal" address bar behavior
- With EV differentiation: padlock, yellow bar, yellow signal, ... => encryption only, e.g. similar to current mixed content
- EV: validation information, padlock, green bar, no extras, ... => similar to current EV

Red-yellow-green is recognized all over the world; all traffic signals work like this, and an explanation of what each signal means can be added to the dialog on click. The (red) strike, (yellow) signal, and (green) additional validation information also follow the idea that people who cannot differentiate colors can still understand what is happening.

Peter Kasting

unread,
Dec 15, 2014, 9:00:22 PM12/15/14
to Christian Heutger, rsl...@chromium.org, nolo...@gmail.com, public-w...@w3.org, blin...@chromium.org, securi...@chromium.org, dev-se...@lists.mozilla.org
Please don't try to debate actual presentation ideas on this list. How UAs present various states is something the individual UA's design teams have much more context and experience with, so debating that sort of thing here just takes everyone's time to no benefit, and is likely to rapidly become a bikeshed in any case.

As the very first message in the thread states, the precise UX changes here are up to the UA vendors.  What's more useful is to debate the concept of displaying non-secure origins as non-secure, and how to transition to that state over time.

PK

Igor Bukanov

unread,
Dec 16, 2014, 12:29:27 AM12/16/14
to Daniel Veditz, Michal Zalewski, Peter Bowen, Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On 15 December 2014 at 18:54, Daniel Veditz <dve...@mozilla.com> wrote:
Serve the HTML page over http: but load all sub-resources over https: as
expected after the transition. Add the following header:

Content-Security-Policy-Report-Only: default-src https:; report-uri <me>

This is a nice trick! However, it does not work in general due to the use of protocol-relative links starting with // . Or should those be discouraged?
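The reason scheme-relative links trip up the report-only trick is that they inherit the scheme of the embedding page: on the still-HTTP page they resolve to http:// and get flagged, even though they would resolve to https:// after the migration. A quick illustration with Python's urllib.parse (hostnames are hypothetical):

```python
from urllib.parse import urljoin

# The same scheme-relative reference resolves differently depending on
# the scheme of the page that embeds it.
ref = "//cdn.example.com/app.js"

assert urljoin("http://example.com/page", ref) == "http://cdn.example.com/app.js"
assert urljoin("https://example.com/page", ref) == "https://cdn.example.com/app.js"
```

So a `default-src https:` report-only policy on an HTTP page reports these references as violations even though they would be secure after the switch.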

Ryan Sleevi

unread,
Dec 16, 2014, 12:35:28 AM12/16/14
to Igor Bukanov, Daniel Veditz, Michal Zalewski, Peter Bowen, Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
Sounds like a CSP-bug to me; scheme-relative URLs are awesome, and we should encourage them (over explicit http://-schemed URLs) 

Igor Bukanov

unread,
Dec 16, 2014, 1:17:51 AM12/16/14
to Ryan Sleevi, ferdy.c...@gmail.com, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, Adrienne Porter Felt, security-dev
On 16 December 2014 at 01:18, Ryan Sleevi <rsl...@chromium.org> wrote:
"Authentication" here does not refer to "Does the user authenticate
themselves to the site" (e.g. do they log in), but "Is the site you're
talking to the site you expected" (or, put differently, "Does the server
authenticate itself to the user").

With protocols like SRP or J-PAKE, authentication in the first sense (logging in) also provides authentication in the second sense (the protocols ensure mutual authentication between the user and the server without leaking passwords). I wish there were at least some support in browsers for these protocols, so one could avoid certificates and the related problems in many useful cases.

Andy Wingo

unread,
Dec 16, 2014, 4:14:45 AM12/16/14
to Ryan Sleevi, Igor Bukanov, Daniel Veditz, Michal Zalewski, Peter Bowen, Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On Tue 16 Dec 2014 06:35, Ryan Sleevi <rsl...@chromium.org> writes:

> scheme-relative URLs are awesome, and we should encourage them (over
> explicit http://-schemed URLs)

Isn't it an antipattern to make a resource available over HTTP if it is
available over HTTPS? In all cases you could just use HTTPS; no need to
provide an insecure option.

The one case that I know of when scheme-relative URLs are useful is when
HTTPS is not universally accessible, e.g. when the server only supports
TLSv1.2 and so is not reachable from old Android phones, among other
UAs. In that case scheme-relative URLs allow you to serve the same
content over HTTPS to browsers that speak TLSv1.2 but also have it
available insecurely to older browsers.

If there is mention of scheme-relative URLs in a "Marking HTTP as
Non-Secure" set of guidelines for authors and site operators, it should
be to avoid them in favor of explicitly using the HTTPS scheme.

Andy

Sigbjørn Vik

unread,
Dec 16, 2014, 8:59:26 AM12/16/14
to Chris Palmer, public-w...@w3.org, blink-dev, security-dev, dev-se...@lists.mozilla.org
I am happy to see this initiative, I consider the current standard
browser UI broken and upside-down. Today, plain http is not trustworthy,
but it still has the "normal" look in browsers. We ought to change this.

A few thoughts:

Users expect that when they come to a site that looks like Facebook, it
is Facebook. They expect any problems to be flagged, and unless there is
a warning, that everything is OK. They do not understand what most
icons/colors and dialogs mean, and are confused by the complexity of
security (and the web in general). A good UI should present the web the
way the user expects it to be presented. Expecting users to spend their
time learning and memorizing various browser UIs (user education) is
arrogant. Starting this discussion from the implementation details is
starting it at the wrong end.

One example of an experimental browser UI is Opera Coast. It goes much
further than cleaning up the security symbols, it removes the entire
address field. It uses a lot of extra background checks, with the aim to
allow users to browse without having to check addresses. If something
seems wrong, it will warn the user ahead of time. This seems to me to be
the ideal, where security is baked into the solution, not tacked on top.
From a user's perspective, it just works. I think revamping address bars
and badges should take the long term goal into consideration as well.
(I'll happily discuss Coast's solutions, but please start a new thread
if so.)

Browsers normally have 3-5 different visual security states in the UI:
normal (no security), DV, and EV. Some browsers have special visual
indicators for various types of broken security (dubious, bad, etc). In
addition there are a multitude of corner cases. Although I can see the
use of three states, to support gradual degradation via the middle
state, more than three states is confusing, and the ideal should be
none, as in the above example.

Given three states for now, the question is how we want to display them.
We need one for general unsecured contents. We want one for top
security, i.e. all the latest encryption standards and EV. Then general
encryption would go into the last bucket. Encryption standards will have
to change over time. From a user perspective, a natural way to mark
three states would be as insecure (red/warning), normal (neutral/no
marking) and secure (green/padlock).

There is no need to distinguish unsecured from dubiously secured, they
can just go into the same bucket. There isn't even any need to warn
users about certificate errors, the UI is just downgraded to insecure,
as a self-signed site is no less secure than an http site. There are
technical reasons for the warnings, but those can be bug-fixed. Active
attacks (e.g. certificate replacement to an invalid one, HSTS failure,
revoked certificates, ...) might still be hard-blocked, but note that
this constitutes a fourth state, and the UI is becoming very complicated
already - there are probably better ways to map such cases into the
insecure state, but that is a separate discussion.

One issue is that browser UI is and should be a place for innovation,
not rigid specifications. At the same time, users would clearly benefit
from consistent and good UI. Diverging from the de-facto UI standard
towards a better one comes with a cost for browsers, and they might not
have the incentive to do so. A coordinated move towards a better future
would be good, as long as we avoid the hard limitations. Regardless of
this discussion, we do need better coordination for removing old crypto
standards (SHA-1, SSLv3, RC4, ...) from the "secure" bucket in the UI.
In short, I am all for a coordinated move, but there needs to be space
for browsers to innovate as well.
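Coordinated deprecation ultimately lands in client code as TLS policy. A sketch of what removing old crypto from the "secure" bucket looks like on the client side, using Python's ssl module (the cipher string is illustrative, not a recommendation):

```python
import ssl

# Start from the platform's hardened defaults (certificate verification
# on, SSLv2/SSLv3 already disabled).
ctx = ssl.create_default_context()

# Refuse legacy protocol versions outright.
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# Exclude weak or broken ciphers explicitly (illustrative OpenSSL string).
ctx.set_ciphers("HIGH:!RC4:!3DES:!aNULL")
```

A connection through this context to a server that only speaks SSLv3 or RC4 then fails during the handshake, rather than silently succeeding with weak crypto.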

In terms of the transition plan, I think a date-based plan is the only
thing which will work. This gives all parties time to prepare, they know
when the next phase will start, and nobody will be arguing if we have
reached a milestone or not. It also avoids any deadlocks where the next
phase is needed to push the web to the state where the next phase will
begin. Any ambitious timeline will fail to get all players on board. A
multi-year plan is still better than the resulting user confusion if
browsers move on their own.

BTW, have you explicitly contacted other browser teams?

--
Sigbjørn Vik
Opera Software

Hanno Böck

unread,
Dec 16, 2014, 12:12:47 PM12/16/14