Proposal: Marking HTTP As Non-Secure


Chris Palmer

Dec 12, 2014, 7:46:36 PM
to public-w...@w3.org, blink-dev, security-dev, dev-se...@lists.mozilla.org
Hi everyone,

Apologies to those of you who are about to get this more than once, due to the cross-posting. I'd like to get feedback from a wide variety of people: UA developers, web developers, and users. The canonical location for this proposal is: https://www.chromium.org/Home/chromium-security/marking-http-as-non-secure.

Proposal


We, the Chrome Security Team, propose that user agents (UAs) gradually change their UX to display non-secure origins as affirmatively non-secure. We intend to devise and begin deploying a transition plan for Chrome in 2015.


The goal of this proposal is to more clearly display to users that HTTP provides no data security.


Request


We’d like to hear everyone’s thoughts on this proposal, and to discuss with the web community how different transition plans might serve users.


Background


We all need data communication on the web to be secure (private, authenticated, untampered). When there is no data security, the UA should explicitly display that, so users can make informed decisions about how to interact with an origin.


Roughly speaking, there are three basic transport layer security states for web origins:


  • Secure (valid HTTPS, other origins like (*, localhost, *));

  • Dubious (valid HTTPS but with mixed passive resources, valid HTTPS with minor TLS errors); and

  • Non-secure (broken HTTPS, HTTP).


For more precise definitions of secure and non-secure, see Requirements for Powerful Features and Mixed Content.


We know that active tampering and surveillance attacks, as well as passive surveillance attacks, are not theoretical but are in fact commonplace on the web.


RFC 7258: Pervasive Monitoring Is an Attack

NSA uses Google cookies to pinpoint targets for hacking

Verizon’s ‘Perma-Cookie’ Is a Privacy-Killing Machine

How bad is it to replace adSense code id to ISP's adSense ID on free Internet?

Comcast Wi-Fi serving self-promotional ads via JavaScript injection

Erosion of the moral authority of transparent middleboxes

Transitioning The Web To HTTPS


We know that people do not generally perceive the absence of a warning sign. (See e.g. The Emperor's New Security Indicators.) Yet the only situation in which web browsers are guaranteed not to warn users is precisely when there is no chance of security: when the origin is transported via HTTP. Here are screenshots of the status quo for non-secure domains in Chrome, Safari, Firefox, and Internet Explorer:


[Screenshots: the status quo for non-secure origins in Chrome, Safari, Firefox, and Internet Explorer]


Particulars


UA vendors who agree with this proposal should decide how best to phase in the UX changes given the needs of their users and their product design constraints. Generally, we suggest a phased approach to marking non-secure origins as non-secure. For example, a UA vendor might decide that in the medium term, they will represent non-secure origins in the same way that they represent Dubious origins. Then, in the long term, the vendor might decide to represent non-secure origins in the same way that they represent Bad origins.


Ultimately, we can even imagine a long term in which secure origins are so widely deployed that we can leave them unmarked (as HTTP is today), and mark only the rare non-secure origins.


There are several ways vendors might decide to transition from one phase to the next. For example, the transition plan could be time-based:


  1. T0 (now): Non-secure origins unmarked

  2. T1: Non-secure origins marked as Dubious

  3. T2: Non-secure origins marked as Non-secure

  4. T3: Secure origins unmarked


Or, vendors might set thresholds based on telemetry that measures the ratios of user interaction with secure origins vs. non-secure. Consider this strawman proposal:


  1. Secure > 65%: Non-secure origins marked as Dubious

  2. Secure > 75%: Non-secure origins marked as Non-secure

  3. Secure > 85%: Secure origins unmarked


The particular thresholds or transition dates are very much up for discussion. Additionally, how to define “ratios of user interaction” is also up for discussion; ideas include the ratio of secure to non-secure page loads, the ratio of secure to non-secure resource loads, or the ratio of total time spent interacting with secure vs. non-secure origins.
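In code, the telemetry-based strawman could read roughly like this (an illustrative sketch only; the function name, the returned labels, and the exact thresholds are placeholders for discussion, not anything any browser has implemented):

```python
def ui_phase(secure_ratio: float) -> str:
    """Map the measured share of secure user interactions to a UI
    treatment for non-secure (HTTP) origins, per the strawman above."""
    if secure_ratio > 0.85:
        return "secure-unmarked"   # secure origins no longer marked
    if secure_ratio > 0.75:
        return "non-secure"        # HTTP marked affirmatively non-secure
    if secure_ratio > 0.65:
        return "dubious"           # HTTP marked like Dubious origins
    return "unmarked"              # status quo: HTTP unmarked
```

The same shape works for the time-based plan, with dates in place of ratios.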


We’d love to hear what UA vendors, web developers, and users think. Thanks for reading!

Eduardo Robles Elvira

Dec 12, 2014, 8:18:12 PM
to Chris Palmer, public-w...@w3.org, blink-dev, security-dev, dev-se...@lists.mozilla.org
Hello Chris et al:

I'm a web developer. I did some security-related development some time ago inside Konqueror. Some first thoughts:

* In principle, the proposal makes sense to me. Who doesn't want a more secure web? Kudos for making it possible with ambitious proposals like this.

* The biggest problem I see is that, traditionally, you needed to pay to get an accepted certificate. This was a show-stopper for TLS certs on small websites. Mozilla, the EFF, Cisco, and Akamai are trying to fix that [1], and StartSSL already gives out free certificates. Just stating the obvious: either "secure" certificates become easy and free, or this proposal is going to make some webmasters angry.


Regards,
--
[1] https://www.eff.org/deeplinks/2014/11/certificate-authority-encrypt-entire-web
--
Eduardo Robles Elvira     @edulix             skype: edulix2
http://agoravoting.org       @agoravoting     +34 634 571 634

Chris Palmer

Dec 12, 2014, 8:32:28 PM
to Eduardo Robles Elvira, public-w...@w3.org, blink-dev, security-dev, dev-se...@lists.mozilla.org
On Fri, Dec 12, 2014 at 5:17 PM, Eduardo Robles Elvira <edu...@agoravoting.com> wrote:

* The biggest problem I see is that, traditionally, you needed to pay to get an accepted certificate. This was a show-stopper for TLS certs on small websites. Mozilla, the EFF, Cisco, and Akamai are trying to fix that [1], and StartSSL already gives out free certificates. Just stating the obvious: either "secure" certificates become easy and free, or this proposal is going to make some webmasters angry.


Oh yes, absolutely. Obviously, Let's Encrypt is a great help, SSLMate's ease of use and low price are great, and CloudFlare's free SSL helps too.

Hopefully, as operations like those ramp up, it will get increasingly easier for web developers to switch to HTTPS. We (Chrome) will weigh changes to the UX very carefully, and with a close eye on HTTPS adoption.

Alex Gaynor

Dec 12, 2014, 8:47:00 PM
to Chris Palmer, public-w...@w3.org, blink-dev, security-dev, dev-se...@lists.mozilla.org
Fantastic news, I'm very glad to see the Chrome Security Team taking initiative on this.

Existing browsers' behavior of defaulting to, and using a "neutral" UI for, HTTP is fundamentally an assumption about what users want. And it's not an assumption that is grounded in data.

No ordinary user's mental model of communication on the net includes a lack of authenticity, integrity, or confidentiality. Plaintext is a blight on the internet, and this is a fantastic step towards making reality match users' (TOTALLY REASONABLE!) expectations.

Cheers,
Alex

To unsubscribe from this group and stop receiving emails from it, send an email to security-dev...@chromium.org.

Hanno Böck

Dec 12, 2014, 9:17:01 PM
to securi...@chromium.org, Chris Palmer
On Fri, 12 Dec 2014 16:46:34 -0800,
"'Chris Palmer' via Security-dev" <securi...@chromium.org> wrote:

> Apologies to those of you who are about to get this more than once,
> due to the cross-posting. I'd like to get feedback from a wide
> variety of people: UA developers, web developers, and users. The
> canonical location for this proposal is:

First of all: ++ from me, I like this very much.

But there's one thing I feel increasingly uneasy about and I'd like to
bring that up: I think much more stuff should be moved into the
"dubious" category.

I feel that there is a "de-facto standard" for secure HTTPS configs
that emerged from the discussions around TLS in recent years; some large
webpages and some security-conscious people follow it, while most ignore it
(including e.g. banks and others who should care). E.g. I'd consider
HSTS pretty much required for a secure setup. Also, agl wrote a few days
ago that "everything less than TLS 1.2 with an AEAD cipher suite is
cryptographically broken". Of course that's completely reasonable given
the number of CBC/MAC-then-Encrypt-related attacks we've seen in the
past. So I'm asking: why do webpages that only support
"cryptographically broken" protocols get a green lock? They shouldn't.

I feel this is the part that should be discussed alongside.

--
Hanno Böck
http://hboeck.de/

mail/jabber: ha...@hboeck.de
GPG: BBB51E42

Ryan Sleevi

Dec 12, 2014, 9:23:54 PM
to Hanno Böck, security-dev, Chris Palmer

Indeed, and expect a separate discussion of that. You can already see some of the discussion on the security-dev@ list regarding requiring OCSP stapling or modern ciphersuites for EV, and one can naturally assume that will migrate to DV.

That is, just as EV moves to DV when deployed dangerously, so too should DV move to dubious.

But, as you note, that's something to be discussed alongside.

Chris Palmer

Dec 12, 2014, 9:25:45 PM
to Ryan Sleevi, Hanno Böck, security-dev
The SHA-1 deprecation plan that Ryan and I put up (http://googleonlinesecurity.blogspot.com/2014/09/gradually-sunsetting-sha-1.html) was a first take at moving old and busted crypto into Dubious. See also https://code.google.com/p/chromium/issues/detail?id=102949 for broken things we simply un-supported, as well.

Mike West

Dec 13, 2014, 6:53:18 AM
to Ryan Sleevi, Hanno Böck, security-dev, Chris Palmer

That is, just as EV moves to DV when deployed dangerously, so too should DV move to dubious.

Ryan, for clarity, I think you mean that DV should move to dubious iff deployed dangerously, right? Not as a blanket statement? :)

-mike

Igor Bukanov

Dec 13, 2014, 11:56:22 AM
to Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
Free SSL certificates help, but another problem is that activating SSL not only generates warnings, but outright breaks sites due to links to insecure resources. Just consider the case of old pages with a few YouTube videos served in http iframes. Accessing those pages over https stops the videos from working, as browsers block access to active insecure content. In the case of YouTube one can fix that, but for other resources it may not be possible.

So what is required is the ability to refer to insecure content from HTTPS pages without harming the user experience. For example, there should be a way to insert an http iframe into an https site. Similarly, it would be nice if a web developer could refer to scripts, images, etc. over http as long as the script/image tag is accompanied by a secure hash of the known content.
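The hash idea in the last sentence can be sketched as a pinned-digest check (an illustrative sketch only; the function and variable names are placeholders, and browsers would implement this natively rather than in Python):

```python
import hashlib

def integrity_ok(resource_bytes: bytes, pinned_sha256_hex: str) -> bool:
    """Accept a resource fetched over plain http only if its SHA-256
    digest matches the hash the embedding page pinned in advance."""
    return hashlib.sha256(resource_bytes).hexdigest() == pinned_sha256_hex

# Hypothetical usage: the page author computes the hash at publish time
# and embeds it next to the script/image reference.
script = b"console.log('hello');"
pinned = hashlib.sha256(script).hexdigest()
```

Note that such a pin guarantees integrity only; it does nothing for confidentiality of the fetch itself.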

_______________________________________________
dev-security mailing list
dev-se...@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-security

Mathias Bynens

Dec 13, 2014, 12:33:42 PM
to Chris Palmer, public-w...@w3.org, blink-dev, security-dev, dev-se...@lists.mozilla.org
On Sat, Dec 13, 2014 at 1:46 AM, 'Chris Palmer' via blink-dev <blin...@chromium.org> wrote:

We know that people do not generally perceive the absence of a warning sign. (See e.g. The Emperor's New Security Indicators.) Yet the only situation in which web browsers are guaranteed not to warn users is precisely when there is no chance of security: when the origin is transported via HTTP. Here are screenshots of the status quo for non-secure domains in Chrome, Safari, Firefox, and Internet Explorer:


[screenshots snipped]


For completeness’ sake, here’s what a non-secure origin looks like in Opera:

[Opera screenshot]

Christian Heutger

Dec 13, 2014, 2:06:02 PM
to pal...@google.com, edu...@agoravoting.com, public-w...@w3.org, blin...@chromium.org, securi...@chromium.org, dev-se...@lists.mozilla.org
I see a big danger in the current trend. If everyone has a free „secure“ certificate and is required to enable HTTPS, nothing is won. DV certificates (similar to DANE) ultimately say absolutely nothing about the website operator. They ensure encryption, so I can then be phished, be scammed, … encrypted. Big advantage!^^ Pushing real validation (e.g. EV with a green address bar and details validated by an independent third party, with no breakable, spoofable automatism) vs. no validation is much more important and should be focused on. However, this „change“ could come with marking HTTP as Non-Secure; but just stating HTTPS as secure is completely the wrong signal and will result in more confusion, and in losing any trust in any kind of browser padlock, than before.

Just a proposal:

Mark HTTP as Non-Secure (similar to self-signed) e.g. with a red padlock or sth. similar.
Mark HTTPS as Secure (and only secure in favor of encrypted) e.g. with a yellow padlock or sth. similar
Mark HTTPS with Extended Validation (encrypted and validated) as it is with a green padlock or sth. similar

This would be a good road for more security on the web.

Chris Palmer

Dec 14, 2014, 12:59:23 PM
to Igor Bukanov, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On Sat, Dec 13, 2014 at 8:56 AM, Igor Bukanov <ig...@mir2.org> wrote:

Free SSL certificates help, but another problem is that activating SSL not only generates warnings, but outright breaks sites due to links to insecure resources. Just consider the case of old pages with a few YouTube videos served in http iframes. Accessing those pages over https stops the videos from working, as browsers block access to active insecure content. In the case of YouTube one can fix that, but for other resources it may not be possible.

Yes, unfortunately we have a collective action problem. (http://en.wikipedia.org/wiki/Collective_action#Collective_action_problem) But just because it's hard doesn't mean we shouldn't try. I'd suggest that embedders ask embeddees to at least make HTTPS available, even if not the default.

Also, keep in mind that this proposal is only to mark HTTP as non-secure — HTTP will still work, and you can still host your site over HTTP.
 
So what is required is the ability to refer to insecure content from HTTPS pages without harming the user experience.

No, because that reduces or eliminates the security guarantee of HTTPS.
 
For example, there should be a way to insert an http iframe into an https site. Similarly, it would be nice if a web developer could refer to scripts, images, etc. over http as long as the script/image tag is accompanied by a secure hash of the known content.

Same thing here. The security guarantee of HTTPS is the combination of server authentication, data integrity, and data confidentiality. It is not a good user experience to take away confidentiality without telling users. And, unfortunately, we cannot effectively communicate that nuance. We have enough trouble effectively communicating the secure/non-secure distinction as it is.

Alex Gaynor

Dec 14, 2014, 1:01:00 PM
to Chris Palmer, Igor Bukanov, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
Chris,

Is there a plan for HTTP to eventually have an interstitial, the way HTTPS with a bogus cert does?

Alex


Chris Palmer

Dec 14, 2014, 1:17:08 PM
to Christian Heutger, edu...@agoravoting.com, public-w...@w3.org, blin...@chromium.org, securi...@chromium.org, dev-se...@lists.mozilla.org
On Sat, Dec 13, 2014 at 11:05 AM, Christian Heutger <chri...@heutger.net> wrote:

I see a big danger in the current trend. If everyone has a free „secure“ certificate and is required to enable HTTPS, nothing is won. DV certificates (similar to DANE) ultimately say absolutely nothing about the website operator.

Reducing the number of parties you have to trust from [ the site operator, the operators of all networks between you and the site operator ] to just [ the site operator ] is a huge win.
 
They ensure encryption, so I can then be phished, be scammed, … encrypted. Big advantage!^^ Pushing real validation (e.g. EV with a green address bar and details validated by an independent third party, with no breakable, spoofable automatism) vs. no validation is much more important and should be focused on.

I think you'll find EV is not as "extended" as you might be hoping.

But more importantly, the only way to get minimal server auth, data integrity, and data confidentiality on a mass scale is with something at least as easy to deploy as DV. Indeed, you'll see many of the other messages in this thread are from people concerned that DV isn't easy enough yet! So requiring EV is a non-starter.

Additionally, the web origin concept is (scheme, host, port). Crucially, EV-issued names are not distinct origins from DV-issued names, and proposals to enforce such a distinction in browsers have not gotten any traction because they are not super feasible (for a variety of reasons).
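The (scheme, host, port) tuple can be sketched as follows (illustrative only; real browsers compute origins per RFC 6454, not by URL parsing like this):

```python
from urllib.parse import urlsplit

DEFAULT_PORTS = {"http": 80, "https": 443}

def origin(url: str) -> tuple:
    """Return the (scheme, host, port) web-origin tuple for a URL."""
    parts = urlsplit(url)
    # An omitted port falls back to the scheme's default.
    port = parts.port or DEFAULT_PORTS.get(parts.scheme)
    return (parts.scheme, parts.hostname, port)
```

An EV-issued https://example.com and a DV-issued https://example.com yield the identical tuple, which is the point: the certificate class never enters the origin.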
 
However, this „change“ could come with marking HTTP as Non-Secure; but just stating HTTPS as secure is completely the wrong signal and will result in more confusion, and in losing any trust in any kind of browser padlock, than before.

HTTPS is the bare minimum requirement for secure web application *transport*. Is secure transport by itself sufficient to achieve total *application-semantic* security? No. But a browser couldn't determine that level of security anyway. Our goal is for the browser to tell as much of the truth as it can programmatically determine at run-time.

Igor Bukanov

Dec 14, 2014, 1:34:26 PM
to Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On 14 December 2014 at 18:59, Chris Palmer <pal...@google.com> wrote:

Yes, unfortunately we have a collective action problem. (http://en.wikipedia.org/wiki/Collective_action#Collective_action_problem) But just because it's hard doesn't mean we shouldn't try. I'd suggest that embedders ask embeddees to at least make HTTPS available, even if not the default.

Also, keep in mind that this proposal is only to mark HTTP as non-secure — HTTP will still work, and you can still host your site over HTTP.

If serving content over HTTPS produces broken pages, the incentive to enable encryption is very low. As was already mentioned, a solution to that is to allow serving encrypted pages over http://, so that pages referring to unencrypted elements would not break but would just produce warnings. Such encrypted http:// would also allow fewer warnings for a page where all content is available over a self-signed, key-pinned certificate, as that is strictly more secure than plain HTTP.

Chris Palmer

Dec 14, 2014, 1:34:56 PM
to Alex Gaynor, Igor Bukanov, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On Sun, Dec 14, 2014 at 10:00 AM, Alex Gaynor <alex....@gmail.com> wrote:

Is there a plan for HTTP to eventually have an interstitial, the way HTTPS with a bogus cert does?

We (Chrome) have no current plan to do that. In the Beautiful Future when some huge percentage of pageviews are shown via secure transport, it might or might not make sense to interstitial HTTP then. I kind of doubt that it will be a good idea, but who knows. We'll see.

Chris Palmer

Dec 14, 2014, 1:40:58 PM
to Igor Bukanov, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On Sun, Dec 14, 2014 at 10:34 AM, Igor Bukanov <ig...@mir2.org> wrote:

If serving content over HTTPS produces broken pages, the incentive to enable encryption is very low.

That's the definition of a collective action problem, yes.

I think that the incentives will change, and are changing, and people are becoming more aware of the problems of non-secure transport. There is an on-going culture shift, and more and more publishers are going to offer HTTPS. For example, http://open.blogs.nytimes.com/author/eitan-konigsburg/?_r=0.

As was already mentioned, a solution to that is to allow serving encrypted pages over http://, so that pages referring to unencrypted elements would not break but would just produce warnings. Such encrypted http:// would also allow fewer warnings for a page where all content is available over a self-signed, key-pinned certificate, as that is strictly more secure than plain HTTP.

But, again, consider the definition of the origin. If it is possible for securely-transported code to run in the same context as non-securely transported code, the securely-transported code is effectively non-secure.

Igor Bukanov

Dec 14, 2014, 1:48:08 PM
to Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On 14 December 2014 at 19:40, Chris Palmer <pal...@google.com> wrote:

But, again, consider the definition of the origin. If it is possible for securely-transported code to run in the same context as non-securely transported code, the securely-transported code is effectively non-secure.

Yes, but the point is that the page will be shown with the same warnings as a plain http page, rather than as a broken page.

Igor Bukanov

Dec 14, 2014, 1:53:30 PM
to Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
I.e., just consider that currently a hosting provider has no option to unconditionally encrypt the pages they host for modern browsers, as that may break users' pages. With encrypted http:// they would get such an option, delegating the job of fixing warnings about insecure content to the content producers, as it should be.

Chris Palmer

Dec 14, 2014, 2:08:32 PM
to Igor Bukanov, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On Sun, Dec 14, 2014 at 10:53 AM, Igor Bukanov <ig...@mir2.org> wrote:

I.e., just consider that currently a hosting provider has no option to unconditionally encrypt the pages they host for modern browsers, as that may break users' pages. With encrypted http:// they would get such an option, delegating the job of fixing warnings about insecure content to the content producers, as it should be.

I'm sorry; I still don't understand what you mean. Do you mean that you want browsers to treat some hypothetical encrypted HTTP protocol as if it were a secure origin, but still allow non-secure embedded content in these origins?

I would argue strongly against that, and so far not even the "opportunistic encryption" advocates have argued for that.

Igor Bukanov

Dec 14, 2014, 2:26:45 PM
to Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
I would like to see some hypothetical encrypted http:// where a browser presents a page as if it were over https:// if everything is of secure origin, and as if it were served over plain http if not. That is, if a future browser shows warnings for plain http, it will show the same warnings for encrypted http:// with insecure resources.

The point of such encrypted http:// is to guarantee that *enabling encryption never degrades the user experience* compared with plain http. This would allow a particular installation to start serving everything encrypted independently of the job of fixing the content. And as the page is still served as http://, users' training/expectations about https:// sites no longer apply.

Michal Zalewski

Dec 14, 2014, 2:48:08 PM
to Igor Bukanov, Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
> I would like to see some hypothetical encrypted http:// where a browser
> presents a page as if it were over https:// if everything is of secure origin,
> and as if it were served over plain http if not. That is, if a future browser
> shows warnings for plain http, it will show the same warnings for
> encrypted http:// with insecure resources.

Browsers have flirted with ideas along the lines of your proposal, with
non-blocking mixed-content icons. Unfortunately, websites are not
static - so the net effect was that, even if you watched the address bar
constantly, you'd only eventually get notified that previously-entered
data you thought would be visible only to a "secure" origin had
already been leaked to / exposed to network attackers.

The main point of having a visible and stable indicator for encrypted
sites is to communicate to the user that the site offers a good degree
of resilience against the examination or modification of the exchanged
data by network attackers. (It is a complicated property and it is
often misunderstood as providing clear-cut privacy assurances for your
online habits, but that's a separate topic.)

Any changes that make this indicator disappear randomly at unexpected
times, or make the already-complicated assurances more fragile and
even harder to explain, are probably not the right way to go.

/mz

Igor Bukanov

Dec 14, 2014, 3:04:05 PM
to Michal Zalewski, Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On 14 December 2014 at 20:47, Michal Zalewski <lca...@google.com> wrote:
The main point of having a visible and stable indicator for encrypted
sites is to communicate to the user that the site offers a good degree
of resilience against the examination or modification of the exchanged
data by network attackers.

Then the browser should show absolutely no indication of a secure origin for encrypted http://. The idea is that the encrypted http:// experience would be equivalent to the current http experience, with no indications of security and no warnings. However, encrypted http:// with insecure elements would start to produce warnings, in the same way that a future browser will show warnings for plain http.

Without something like this, I just do not see how a lot of sites could ever start enabling encryption unconditionally. I.e., currently enabling https requires modifying content, often in a significant way. I would like a site operator to have the option of enabling encryption unconditionally without touching the content.

Michal Zalewski

Dec 14, 2014, 3:07:46 PM
to Igor Bukanov, Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
> Then the browser should show absolutely no indication of a secure origin for
> encrypted http://. The idea is that the encrypted http:// experience would be
> equivalent to the current http experience, with no indications of security
> and no warnings. However, encrypted http:// with insecure elements would
> start to produce warnings, in the same way that a future browser will show
> warnings for plain http.

As mentioned in my previous response, this gets *really* hairy because
the "has insecure elements" part is not a static property that can be
determined up front; so, you end up with the problem of sudden and
unexpected downgrades and notifying the user only after the
confidentiality or integrity of the previously-stored data has been
compromised.

/mz

Christian Heutger

Dec 14, 2014, 4:41:39 PM
to Chris Palmer, edu...@agoravoting.com, public-w...@w3.org, blin...@chromium.org, securi...@chromium.org, dev-se...@lists.mozilla.org
> Reducing the number of parties you have to trust from [ the site operator, the operators of all networks between you and the site operator ] to just [ the site operator ] is a huge win.

But how can I trust him, and who is he? No WHOIS records, no imprint, all spoofable, so what should I trust then? If there is a third party who confirms to me that the details given are correct, and who offers a warranty against misinformation, that’s something I could trust. When shopping online I also look at customer reviews, but I do not treat them as fully trustworthy, as they may be spoofed; if the shop has a seal like Trusted Shops with a money-back guarantee, I feel good and shop there.
 
> I think you'll find EV is not as "extended" as you might be hoping.

I know, but it’s the best we currently have. And DV is much worse, finally losing any trust in HTTPS, scaling it down to encryption and nothing else.

> But more importantly, the only way to get minimal server auth, data integrity, and data confidentiality on a mass scale is with something at least as easy to deploy as DV. Indeed, you'll see many of the other messages in this thread are from people concerned that DV isn’t
> easy enough yet! So requiring EV is a non-starter.

I agree on data confidentiality, and maybe also on integrity, although DV without effort or cost may break that too. But server auth by itself says little: whatever server endpoint I called is what I get; nothing more is authenticated. However, I support the idea of mass encryption, but before confusing and damaging end users’ understanding of internet security, there needs to be a clear differentiation between just encryption and encryption with valid authentication.

> HTTPS is the bare minimum requirement for secure web application *transport*. Is secure transport by itself sufficient to achieve total *application-semantic* security? No. But a browser couldn't determine that level of security anyway. Our goal is for the browser to tell
> as much of the truth as it can programmatically determine at run-time.

But wasn’t that the idea of certificates? Seals on websites can be spoofed, WHOIS records can be spoofed, imprints can be spoofed, but spoofing EV certificates, e.g. in combination with solutions like pinning, is a hard job. If there were no browser warning for self-signed certificates, I would not see any advantage in mass-deployed DV certificates (which ultimately require a fully automated process). It’s a bit of a return to the old days when, I remember, some website operators offered their self-signed root for installation in the browser to remove the browser warning.

Peter Bowen

Dec 14, 2014, 5:57:44 PM
to Chris Palmer, Igor Bukanov, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On Sun, Dec 14, 2014 at 11:08 AM, 'Chris Palmer' via Security-dev
<securi...@chromium.org> wrote:
> On Sun, Dec 14, 2014 at 10:53 AM, Igor Bukanov <ig...@mir2.org> wrote:
>
>> I.e., just consider that currently a hosting provider has no option to
>> unconditionally encrypt the pages they host for modern browsers, as that may
>> break users' pages. With encrypted http:// they would get such an option,
>> delegating the job of fixing warnings about insecure content to the content
>> producers, as it should be.
>
>
> I'm sorry; I still don't understand what you mean. Do you mean that you want
> browsers to treat some hypothetical encrypted HTTP protocol as if it were a
> secure origin, but still allow non-secure embedded content in these origins?

I'm also not clear on what Igor intended, but there is a real issue
with browser presentation of URLs using TLS today. There is no way to
declare "I know that this page will have insecure content, so don't
consider me a secure origin" such that the browser will show a
"neutral" icon rather than a warning icon. I think there is a strong
impression that a closed lock is better than neutral, but a yellow
warning sign over the lock is worse than neutral. This prevents
sites today from using HTTPS unless they have very high confidence that
all resources on the page will come from secure origins.

Thanks,
Peter

cha...@yandex-team.ru

unread,
Dec 15, 2014, 4:04:44 AM12/15/14
to Christian Heutger, pal...@google.com, edu...@agoravoting.com, public-w...@w3.org, blin...@chromium.org, securi...@chromium.org, dev-se...@lists.mozilla.org
(Ouch. maintaining the cross-posting ;( ).
 
15.12.2014, 11:57, "Christian Heutger" <chri...@heutger.net>:
However, this „change“ could come with marking HTTP as Non-Secure, but just stating HTTPS as secure is the completely wrong sign and will result in more confusion and losing any trust in any kind of browser padlock than before.
 
The message someone wrote about "I understand I am using insecure mixed content, please don't make me look worse than someone who doesn't understand that" strikes a chord - I think there is value in supporting the use case.
 
Just a proposal:
 
Mark HTTP as Non-Secure (similar to self-signed) e.g. with a red padlock or sth. similar.
Mark HTTPS as Secure (and only secure in favor of encrypted) e.g. with a yellow padlock or sth. similar
Mark HTTPS with Extended Validation (encrypted and validated) as it is with a green padlock or sth. similar
 
Just a "nit pick" - please don't EVER rely only on colour distinction to communicate anything important. There are a lot of colour-blind people, and on top there are various people who rely on high-contrast modes and the like which effectively strip the colour out.
 
cheers
 
Chaals
 
--
Charles McCathie Nevile - web standards - CTO Office, Yandex
cha...@yandex-team.ru - - - Find more at http://yandex.com
 

Igor Bukanov

unread,
Dec 15, 2014, 4:16:46 AM12/15/14
to Peter Bowen, Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On 14 December 2014 at 23:57, Peter Bowen <pzb...@gmail.com> wrote:
 I think there is a strong
impression that a closed lock is better than neutral, but a yellow
warning sign over the lock is worse than neutral.

The problem is not just a warning sign.

Browsers prevent any active content, including iframes served over http, from loading. Thus showing a page with YouTube and other videos over https is not an option unless one fixes the page. Now consider that this is not a matter of running sed on a set of static files, but rather of patching content stored in a database or fixing the JS code that inserts the video, and the task of enabling https becomes non-trivial and very content-dependent.
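Finding these embedded http:// references ahead of time can at least be mechanized. A minimal sketch (the regex and helper name are illustrative, not any standard tool, and a real audit would also need to cover URLs assembled in JS or stored in a database) that flags subresources which would be blocked as active mixed content on an https page:

```python
import re

# Attribute values of the form src="http://..." or href="http://..."
# are the references that mixed-content blocking would stop on an
# https page. https:// URLs deliberately do not match.
INSECURE_REF = re.compile(
    r'''(?:src|href)\s*=\s*["'](http://[^"']+)["']''', re.IGNORECASE)

def find_insecure_refs(html):
    """Return the http:// subresource URLs embedded in a page."""
    return INSECURE_REF.findall(html)

page = '''
<img src="https://cdn.example.com/logo.png">
<iframe src="http://www.youtube.com/embed/abc123"></iframe>
<script src="http://example.com/app.js"></script>
'''

print(find_insecure_refs(page))
# ['http://www.youtube.com/embed/abc123', 'http://example.com/app.js']
```

The image is already fine (https), but the iframe and script would be blocked as active mixed content and need fixing before the page can be served over https without breakage.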

So indeed, an option is needed to declare that, despite proper certificates and encryption, the site should be treated as an insecure origin. This way the page will be shown as if it were served over plain http, with no changes in user experience. But then it cannot be an https site, since many users still consider https enough to assume a secure site. Hence the idea of encrypted http://, or something that makes the user experience with an encrypted page absolutely the same as with plain http://, down to the browser stripping http:// from the URL.

After considering this, I think it would even be fine for a future browser to show a warning for such properly-encrypted-but-explicitly-declared-as-insecure sites, in the same way a warning will be shown for plain http. And it would be really nice if a site operator, after activating such user-invisible encryption, could receive reports from the browser about any violation of the secure-origin policy, in the same way violations of CSP are reported today. This would give a nice path for activating encryption without breaking anything: collect reports of violated secure-origin policy, fix the content, and finally declare the site explicitly https-only.
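Something close to this reporting mechanism already exists in CSP itself: a report-only policy lets a server learn about would-be violations without enforcing anything. A hedged sketch of a header an operator might use to discover non-https subresource loads before committing to https (the report endpoint path is hypothetical; the 'unsafe-inline'/'unsafe-eval' keywords are included only to keep inline-script noise out of the reports):

```
Content-Security-Policy-Report-Only: default-src https: 'unsafe-inline' 'unsafe-eval'; report-uri /mixed-content-reports
```

With this header the browser still loads everything as before, but POSTs a JSON violation report to the given endpoint for each resource that would fail a https:-only policy, giving the operator the inventory of content to fix.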

Jeffrey Walton

unread,
Dec 15, 2014, 4:29:54 AM12/15/14
to Christian Heutger, public-w...@w3.org, blin...@chromium.org, securi...@chromium.org, dev-se...@lists.mozilla.org
On Sat, Dec 13, 2014 at 2:05 PM, Christian Heutger
<chri...@heutger.net> wrote:
> I see a big danger in the current trend.

Surely you haven't missed the big danger in plain-text traffic. That
traffic gets captured and fed into systems like XKeyscore for Tailored
Access Operations (TAO). In layman's terms, adversaries are using the
information gathered to gain unauthorized access to systems.

> Expecting everyone having a free
> „secure“ certificate and being in requirement to enable HTTPS it will result
> in nothing won. DV certificates (similar to DANE) do finally say absolute
> nothing about the website operator.

The race to the bottom among CAs is to blame for the quality of
verification by the CAs.

With companies like StartCom, CAcert, and Mozilla offering free
certificates, there is no barrier to entry.

Plus, I don't think a certificate needs to say anything about the
operator. They need to ensure the server is authenticated. That is,
the public key bound to the DNS name is authentic.

> They ensure encryption, so I can then be
> phished, be scammed, … encrypted. Big advantage!^^

As I understand it, phishers try to avoid TLS because they count on
the plain text channel to avoid all the browser warnings. Peter
Gutmann discusses this in his Engineering Security book
(https://www.cs.auckland.ac.nz/~pgut001/pubs/book.pdf).

> Pushing real validation
> (e.g. EV with green adressbar and validated details by an independent third
> party, no breakable, spoofable automatism) vs. no validation is much more
> important and should be focussed on.

You should probably read Gutmann's Engineering Security. See his
discussion of "PKI me harder" in Chapter 1 or 6 (IIRC).

> However, this „change“ could come with
> marking HTTP as Non-Secure, but just stating HTTPS as secure is the
> completely wrong sign and will result in more confusion and losing any
> trust in any kind of browser padlocks than before.

Security engineering studies seem to indicate most users don't
understand the icons. It would probably be better if the browsers did
the right thing, and took the users out of the loop. Gutmann talks
about it in detail (with lots of citations).

> Just a proposal:
>
> Mark HTTP as Non-Secure (similar to self-signed) e.g. with a red padlock or
> sth. similar.

+1. In the browser world, plaintext was (still is?) held in higher
esteem than opportunistic encryption. Why the browsers chose to
indicate things this way is a mystery.

> Mark HTTPS as Secure (and only secure in favor of encrypted) e.g. with a
> yellow padlock or sth. similar
> Mark HTTPS with Extended Validation (encrypted and validated) as it is with
> a green padlock or sth. similar

Why green for EV (or why yellow for DV or DANE)? EV does not add any
technical controls. From a security standpoint, DV and EV are
equivalent.

If DNS is authentic, then DANE provides stronger assurances than DV or
EV since the domain operator published the information and the
veracity does not rely on others like CAs (modulo DBOUND).

Not relying on a CA is a good thing, since it's usually advantageous to
minimize trust (for some definition of "trust"). Plus, CAs don't
really warrant anything, so it's not clear what exactly they are
providing to relying parties (they are providing a signature for
money to the applicant).

Open question: do you think the browsers will support a model other
than the CA Zoo for rooting trust?

Michal Zalewski

unread,
Dec 15, 2014, 4:30:23 AM12/15/14
to Igor Bukanov, Peter Bowen, Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
> So indeed an option to declare that despite proper certificates and
> encryption the site should be treated as of insecure origin is needed. This
> way the page will be shown as if it was served as before with plain http
> with no changes in user experience. But then it cannot be a https site
> since many users still consider that https is enough to assume a secure
> site. Hence the idea of encrypted http:// or something that makes user
> experience with an encrypted page absolutely the same as she has with plain
> http:// down to the browser stripping http:// from the URL.

Sounds like you're essentially proposing a flavor of opportunistic
encryption for http://, right?

That seems somewhat tangential to Chris' original proposal, and there
is probably a healthy debate to be had about this; it may be also
worthwhile to look at SPDY and QUIC. In general, if you're comfortable
with not providing users with a visible / verifiable degree of
transport security, I'm not sure how the proposal changes this?

By the way, note that nothing is as simple as it seems; opportunistic
encryption is easy to suggest, but it's pretty damn hard to iron out
all the kinks. If there is genuinely no distinction between plain old
HTTP and opportunistically encrypted HTTP, the scheme can be
immediately rendered useless by any active attacker, and suffers from
many other flaws (for example, how do you link from within that
opportunistic scheme to other URLs within your application without
downgrading to http:// or upgrading to "real" https://?). Establishing
a new scheme solves that, but doesn't really address your other
concern - you still need to clean up all links. On top of that, it
opens a whole new can of worms by messing around with SOP.

/mz

Ryan Sleevi

unread,
Dec 15, 2014, 4:38:43 AM12/15/14
to Jeffrey Walton, security-dev, blink-dev, dev-se...@lists.mozilla.org, Christian Heutger, public-w...@w3.org

From an SOP point of view, this is true.
However, it is increasingly less true if you're willing to ignore the (near cataclysmic) SOP failure, as EV gains technical controls such as certificate transparency and potentially mandatory stronger security settings (e.g. secure ciphersuites in modern TLS, OCSP stapling, etc). Additionally, there are other technical controls (validity periods, key processing) that do offer distinction.

That is, it is not all procedural changes, and UAs can detect and differentiate. While the hope is that these will be able to apply to all sites in the future, any change of this scale takes time.

> If DNS is authentic, then DANE provides stronger assurances than DV or
> EV since the domain operator published the information and the
> veracity does not rely on others like CAs (modulo DBOUND).
>
> Not relying on a CA is a good thing since its usually advantageous to
> minimize trust (for some definition of "trust"). Plus, CAs don't
> really warrant anything, so its not clear what exactly they are
> providing to relying parties (they are providing a a signature for
> money to the applicant).
>
> Open question: do you think the browsers will support a model other
> than the CA Zoo for rooting trust?

Chromium has no plans for this, particularly those based on DNS/DANE, which are empirically less secure and more operationally fraught with peril. I would neither take it as foregone that the CA system cannot improve nor am I confident that any of the known alternatives are either practical or comparable in security to CAs, let alone superior.

Igor Bukanov

unread,
Dec 15, 2014, 5:03:15 AM12/15/14
to Michal Zalewski, Peter Bowen, Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On 15 December 2014 at 10:30, Michal Zalewski <lca...@google.com> wrote:
That seems somewhat tangential to Chris' original proposal, and there
is probably a healthy debate to be had about this; it may be also
worthwhile to look at SPDY and QUIC. In general, if you're comfortable
with not providing users with a visible / verifiable degree of
transport security, I'm not sure how the proposal changes this?


Chris' original proposal is a stick. I want to give site operators a carrot as well. That could be an option to activate encryption that is not visible to the user and *receive* from the browser all reports about violations of the secure-origin policy. This way operators will know that they can activate HTTPS without worsening the user experience, and they will have information that helps them fix the content.

If there is genuinely no distinction between plain old
HTTP and opportunistically encrypted HTTP, the scheme can be
immediately rendered useless by any active attacker

I am not proposing that a user-invisible encryption should stay forever. Rather it should be treated just as a tool to help site operators to transition to the proper https so at no stage the user experience would be worse than continuing to serve pages with plain http.

mof...@gmail.com

unread,
Dec 15, 2014, 9:41:39 AM12/15/14
to securi...@chromium.org
On Saturday, December 13, 2014 at 3:46:36 AM UTC+3, Chris Palmer wrote:
> Hi everyone,
>
>
> Apologies to those of you who are about to get this more than once, due to the cross-posting. I'd like to get feedback from a wide variety of people: UA developers, web developers, and users. The canonical location for this proposal is: https://www.chromium.org/Home/chromium-security/marking-http-as-non-secure.
>
>
>
> Proposal
>
> We, the Chrome Security Team, propose that user agents (UAs) gradually change their UX to display non-secure origins as affirmatively non-secure. We intend to devise and begin deploying a transition plan for Chrome in 2015.
>
> The goal of this proposal is to more clearly display to users that HTTP provides no data security.
>
> Request
>
> We’d like to hear everyone’s thoughts on this proposal, and to discuss with the web community about how different transition plans might serve users.
>
> Background
>
> We all need data communication on the web to be secure (private, authenticated, untampered). When there is no data security, the UA should explicitly display that, so users can make informed decisions about how to interact with an origin.
>
> Roughly speaking, there are three basic transport layer security states for web origins:
>
> Secure (valid HTTPS, other origins like (*, localhost, *));
> Dubious (valid HTTPS but with mixed passive resources, valid HTTPS with minor TLS errors); and
> Non-secure (broken HTTPS, HTTP).
>
> For more precise definitions of secure and non-secure, see Requirements for Powerful Features and Mixed Content.
>
> We know that active tampering and surveillance attacks, as well as passive surveillance attacks, are not theoretical but are in fact commonplace on the web.
>
> RFC 7258: Pervasive Monitoring Is an Attack
> NSA uses Google cookies to pinpoint targets for hacking
> Verizon’s ‘Perma-Cookie’ Is a Privacy-Killing Machine
> How bad is it to replace adSense code id to ISP's adSense ID on free Internet?
> Comcast Wi-Fi serving self-promotional ads via JavaScript injection
> Erosion of the moral authority of transparent middleboxes
> Transitioning The Web To HTTPS
>
> We know that people do not generally perceive the absence of a warning sign. (See e.g. The Emperor's New Security Indicators.) Yet the only situation in which web browsers are guaranteed not to warn users is precisely when there is no chance of security: when the origin is transported via HTTP. Here are screenshots of the status quo for non-secure domains in Chrome, Safari, Firefox, and Internet Explorer:
>
> [screenshots of the status quo omitted]
>
> Particulars
>
> UA vendors who agree with this proposal should decide how best to phase in the UX changes given the needs of their users and their product design constraints. Generally, we suggest a phased approach to marking non-secure origins as non-secure. For example, a UA vendor might decide that in the medium term, they will represent non-secure origins in the same way that they represent Dubious origins. Then, in the long term, the vendor might decide to represent non-secure origins in the same way that they represent Bad origins.
>
> Ultimately, we can even imagine a long term in which secure origins are so widely deployed that we can leave them unmarked (as HTTP is today), and mark only the rare non-secure origins.
>
> There are several ways vendors might decide to transition from one phase to the next. For example, the transition plan could be time-based:
>
> T0 (now): Non-secure origins unmarked
> T1: Non-secure origins marked as Dubious
> T2: Non-secure origins marked as Non-secure
> T3: Secure origins unmarked
>
> Or, vendors might set thresholds based on telemetry that measures the ratios of user interaction with secure origins vs. non-secure. Consider this strawman proposal:
>
> Secure > 65%: Non-secure origins marked as Dubious
> Secure > 75%: Non-secure origins marked as Non-secure
> Secure > 85%: Secure origins unmarked
>
> The particular thresholds or transition dates are very much up for discussion. Additionally, how to define “ratios of user interaction” is also up for discussion; ideas include the ratio of secure to non-secure page loads, the ratio of secure to non-secure resource loads, or the ratio of total time spent interacting with secure vs. non-secure origins.
>
> We’d love to hear what UA vendors, web developers, and users think. Thanks for reading!

This is the worst idea ever. There is no point in encrypting ANYTHING. Why encrypt cat images? There are public websites, static websites; there is no need to encrypt their traffic. They are public, open, and not related to anything that needs to be hidden. HTTPS adds an additional point of failure and an additional thing to monitor for system administrators and web developers.
Also, it's not always possible to use HTTPS. For many websites it's much easier to just stay HTTP-only.

You basically want all websites to run on HTTPS, and that's a stupid idea.

If something like this lands in Chromium, I'll cast a vote against it.

Daniel Veditz

unread,
Dec 15, 2014, 12:55:00 PM12/15/14
to Igor Bukanov, Michal Zalewski, Peter Bowen, Chris Palmer, Eduardo Robles Elvira, dev-se...@lists.mozilla.org, blink-dev, public-w...@w3.org, security-dev
On 12/15/14 2:03 AM, Igor Bukanov wrote:
> Chris' original proposal is a stick. I want to give a site operator also
> a carrot. That can be an option to activate encryption that is not
> visible to the user and *receive* from the browser all reports about
> violations of secure origin policy. This way t