Tightening up after the Lenovo and Comodo MITM certificates.


John Nagle

Feb 23, 2015, 4:40:15 PM
to dev-secur...@lists.mozilla.org
With the Lenovo and Comodo disclosures, the restrictions
on loading new certificates into Firefox clients need to be tightened.

Add-on policy is moving to a new system where add-ons cannot be
added to a production version of Firefox without approval from AMO.
SSL certificates should get the same treatment. If you can't install a
new add-on unless it's been signed by AMO, why should you be able to
install a new SSL certificate without having it signed?

John Nagle
SiteTruth

Clint Wilson

Feb 23, 2015, 5:14:20 PM
to mozilla-dev-s...@lists.mozilla.org
Lots of enterprises and organizations have very legitimate requirements to add their own internal root CA to the NSS store. In fact, Mozilla even answers the question of how to do this: https://wiki.mozilla.org/CA:FAQ#How_do_I_import_a_root_cert_into_NSS_on_our_organization.27s_internal_servers.3F

What type of "signing" of these internal Root certificates should be required?

Matt Palmer

Feb 23, 2015, 5:29:37 PM
to dev-secur...@lists.mozilla.org
On Mon, Feb 23, 2015 at 02:14:13PM -0800, Clint Wilson wrote:
> Lots of Enterprises and organizations have very legitimate requirements to
> add their own internal root CA to the NSS store.

I suspect John's point is that lots of enterprises and organisations (I
remember a time when those were the same thing...) have very legitimate
requirements to add their own internal add-ons to Firefox, and he is simply
calling out an apparent inconsistency in Mozilla's policies on these two
object types. (John, if I'm misrepresenting your position, please feel free
to correct me).

However, the two situations aren't the same, and thus can't be compared so
simplistically. Mozilla's signature on an add-on says, "we're reasonably
sure this add-on isn't going to do Bad Things to your browser", because they
can wave automated scanning tools over the code to look for dodgy stuff.

The closest thing I can think of for CA certificates would be if Mozilla
OK'd only technically-constrained CA certs -- say, only for domains and IP
ranges which the applicant was known to own. However, that is exactly the
sort of thing that existing trusted root CAs do. I suspect that existing
trusted root CAs would be unhappy if Mozilla took on this task. It would
also be a significant cost to Mozilla, because determining authority to
issue certs for an entire portion of the DNS space is a lot more manual
effort than running a code analysis tool over an add-on.

- Matt

--
"The user-friendly computer is a red herring. The user-friendliness of a
book just makes it easier to turn pages. There's nothing user-friendly about
learning to read."
-- Alan Kay

Richard Barnes

Feb 23, 2015, 6:55:54 PM
to Matt Palmer, dev-secur...@lists.mozilla.org, Daniel Veditz
On Mon, Feb 23, 2015 at 5:28 PM, Matt Palmer <mpa...@hezmatt.org> wrote:

> On Mon, Feb 23, 2015 at 02:14:13PM -0800, Clint Wilson wrote:
> > Lots of Enterprises and organizations have very legitimate requirements
> to
> > add their own internal root CA to the NSS store.
>
> I suspect John's point is that lots of enterprises and organisations (I
> remember a time when those were the same thing...) have very legitimate
> requirements to add their own internal add-ons to Firefox, and he is simply
> calling out an apparent inconsistency in Mozilla's policies on these two
> object types. (John, if I'm misrepresenting your position, please feel
> free
> to correct me).
>

If I understand correctly (dveditz CC'ed to correct me), the current add-on
signing tool has a provision for signing add-ons that are not published
through AMO. They still need to be submitted to AMO to be scanned and
signed, but they're not published.



> However, the two situations aren't the same, and thus can't be compared so
> simplistically. Mozilla's signature on an add-on says, "we're reasonably
> sure this add-on isn't going to do Bad Things to your browser", because
> they
> can wave automated scanning tools over the code to look for dodgy stuff.
>
> The closest thing I can think of for CA certificates would be if Mozilla
> OK'd only technically-constrained CA certs -- say, only for domains and IP
> ranges which the applicant was known to own. However, that is exactly the
> sort of thing that existing trusted root CAs do. I suspect that existing
> trusted root CAs would be unhappy if Mozilla took on this task. It would
> also be a significant cost to Mozilla, because determining authority to
> issue certs for an entire portion of the DNS space is a lot more manual
> effort than running a code analysis tool over an add-on.
>

I think the benefit here would be more transparency than quality. If we
only allowed changes to the root store by signed add-ons, then (1) Mozilla
would at least have internal visibility into all the MitM roots being
deployed, and (2) we could use the add-on blacklist facility to block
things like Superfish once they were detected. These both seem beneficial
in terms of mitigating risk due to MitM.

However, it could be challenging to implement this control. In addition to
the in-browser UI for adding roots (which could easily be disabled), certs
can also be added to the NSS databases directly, even while the browser
isn't active. To counter this risk, we would have to periodically snapshot
the database and check that nothing else had changed it. I'm not sure the
incremental benefit merits the level of development required. Nonetheless,
if we can reinforce the idea that addons are the way to install roots by
simply turning off the UI that exists, that could be beneficial.
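
To illustrate (a rough sketch only, not a proposal for how Firefox should
actually do it): such a check could be as simple as hashing the profile's
certificate database files and comparing against a stored snapshot. The
file names (cert9.db for the SQLite format, cert8.db for the legacy one)
and the snapshot location are assumptions made for the example.

# Sketch only: detect out-of-band changes to a profile's NSS certificate
# database by hashing the database files and comparing against a stored
# snapshot. File names (cert9.db / legacy cert8.db) and the snapshot
# location are assumptions for the example.
import hashlib
import json
import os
import sys

DB_FILES = ("cert9.db", "cert8.db")
SNAPSHOT = "certdb_snapshot.json"     # hypothetical snapshot file

def digest(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def current_state(profile_dir):
    return {name: digest(os.path.join(profile_dir, name))
            for name in DB_FILES
            if os.path.exists(os.path.join(profile_dir, name))}

def check(profile_dir):
    state = current_state(profile_dir)
    if not os.path.exists(SNAPSHOT):
        with open(SNAPSHOT, "w") as f:
            json.dump(state, f)
        return "snapshot created"
    with open(SNAPSHOT) as f:
        previous = json.load(f)
    return "unchanged" if previous == state else "changed outside the browser"

if __name__ == "__main__":
    print(check(sys.argv[1]))

(Where the snapshot itself lives, so that an injector can't simply rewrite
it too, is of course its own problem.)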

(Also, this is more of a Firefox discussion than a CA program discussion,
so it might be more appropriate for dev.tech.crypto.)

--Richard

Juergen Christoffel

Feb 24, 2015, 7:11:42 AM
to dev-secur...@lists.mozilla.org
On 23.02.15 22:39, John Nagle wrote:
> With the Lenovo and Comodo disclosures, the restrictions
> on loading new certificates into Firefox clients need to be tightened.

The MITM ad/malware installed its certificate into the Windows certificate
store and not into browsers, so I cannot follow your conclusion.

My conclusion is that, after the 2011 incident[*] and now PrivDog,
Comodo cannot be trusted and their Root Certificates need to be removed
from browsers.

We still need to be able to install (in a controlled way) our own
self-signed certificates for our own CAs into browsers and operating
systems. CAcert is one example. Besides our public CA, Fraunhofer uses a
private, self-signed CA in a well-regulated way for less sensitive internal
authentication, for example. And last but not least, I personally use a
private CA myself to give family members and friends authenticated access
to a private server.
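
For what it's worth, creating such a private root is straightforward. A
minimal sketch with the Python "cryptography" package follows; the name,
validity period, and output file are placeholders, and distributing and
installing the root on clients is a separate step.

# Sketch: generating a private, self-signed root CA certificate of the
# kind described above. All names and lifetimes are placeholders.
import datetime
from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.x509.oid import NameOID

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, u"Example Family CA")])
now = datetime.datetime.utcnow()

cert = (
    x509.CertificateBuilder()
    .subject_name(name)
    .issuer_name(name)                                  # self-signed: issuer == subject
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(now)
    .not_valid_after(now + datetime.timedelta(days=3650))
    .add_extension(x509.BasicConstraints(ca=True, path_length=None), critical=True)
    .sign(key, hashes.SHA256())
)

with open("family-root.pem", "wb") as f:
    f.write(cert.public_bytes(serialization.Encoding.PEM))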

Regards, JC

[*]
https://nakedsecurity.sophos.com/2011/03/24/fraudulent-certificates-issued-by-comodo-is-it-time-to-rethink-who-we-trust/

Hubert Kario

Feb 24, 2015, 11:51:14 AM
to dev-secur...@lists.mozilla.org, Matt Palmer, Daniel Veditz, Richard Barnes
This is doubly problematic, as some people may want to preserve roots that
were removed (see the recent removals of 1024-bit roots)...

It's a big can of worms no matter how you approach it.
--
Regards,
Hubert Kario
Quality Engineer, QE BaseOS Security team
Web: www.cz.redhat.com
Red Hat Czech s.r.o., Purkyňova 99/71, 612 45, Brno, Czech Republic

Daniel Veditz

Feb 24, 2015, 12:40:15 PM
to Richard Barnes, Matt Palmer
On 2/23/15 3:55 PM, Richard Barnes wrote:
> If I understand correctly (dveditz CC'ed to correct me), the current add-on
> signing tool has a provision for signing add-ons that are not published
> through AMO. They still need to be submitted to AMO to be scanned and
> signed, but they're not published.

Yes.

> I think the benefit here would be more transparency than quality. If we
> only allowed changes to the root store by signed add-ons, then (1) Mozilla
> would at least have internal visibility into all the MitM roots being
> deployed, and (2) we could use the add-on blacklist facility to block
> things like Superfish once they were detected. These both seem beneficial
> in terms of mitigating risk due to MitM.

I don't think we can restrict it to add-ons since external programs like
Superfish (and the Lenovo removal tool, for that matter) write directly
into the NSS profile database. It would be a bunch of work for precisely
zero win.

Could we make the "real" and only root accepted by Firefox be a Mozilla
root, which cross-signs all the built-in NSS roots as well as any
corporate roots submitted via this kind of program? I thought pkix gave
us those kinds of abilities.

Or we could reject any added root that wasn't logged in CT, and then put
a scanner on the logs looking for self-signed CA=true certs. Of course
that puts the logs in the crosshairs for spam and DOS attacks.
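
The per-certificate test such a scanner would apply is simple enough; here
is a rough sketch using the Python "cryptography" package. Fetching and
decoding the log entries themselves (RFC 6962 get-entries) is omitted, and
comparing subject to issuer is only a cheap stand-in for actually verifying
the self-signature.

# Sketch of the per-certificate test a CT-log scanner might apply,
# flagging self-signed certificates that assert CA=true. Fetching and
# decoding the log entries themselves is not shown.
from cryptography import x509

def is_self_signed_ca(der_bytes):
    cert = x509.load_der_x509_certificate(der_bytes)
    try:
        bc = cert.extensions.get_extension_for_class(x509.BasicConstraints).value
    except x509.ExtensionNotFound:
        return False
    # subject == issuer is a cheap stand-in for verifying the self-signature.
    return bc.ca and cert.subject == cert.issuer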

> However, it could be challenging to implement this control. In addition to
> the in-browser UI for adding roots (which could easily be disabled), certs
> can also be added to the NSS databases directly, even while the browser
> isn't active. To counter this risk, we would have to periodically snapshot
> the database and check that nothing else had changed it.

We have this problem with add-ons which can also be added directly while
Firefox isn't running. Where do you store the snapshot such that the
injector can't just tweak it while injecting?

> if we can reinforce the idea that addons are the way to install roots by
> simply turning off the UI that exists, that could be beneficial.

Would it? I haven't heard of any widespread problems with people fooling
others into installing a root via the UI. Meanwhile it would do zero to
stop the actual way unwanted roots get added.

> (Also, this is more of a Firefox discussion than a CA program discussion,
> so it might be more appropriate for dev.tech.crypto.)

It's not a technical crypto issue either; I suggest something more
general like dev-security or dev-firefox.

-Dan Veditz

Brian Smith

Feb 24, 2015, 1:05:57 PM
to Daniel Veditz, Matt Palmer, mozilla-dev-s...@lists.mozilla.org
Daniel Veditz <dve...@mozilla.com> wrote:
> I don't think we can restrict it to add-ons since external programs like
> Superfish (and the Lenovo removal tool, for that matter) write directly
> into the NSS profile database. It would be a bunch of work for precisely
> zero win.

mozilla::pkix makes it so that you can ignore the NSS profile
database, if you wish to do so.

> Could we make the "real" and only root accepted by Firefox be a Mozilla
> root, which cross-signs all the built-in NSS roots as well as any
> corporate roots submitted via this kind of program?

This is effectively what the built-in roots module already does,
except the Mozilla root CA certificate is implied instead of explicit.

> I thought pkix gave us those kinds of abilities.

mozilla::pkix offers a lot of flexibility in terms of how certificate
trust is determined.

> Or we could reject any added root that wasn't logged in CT, and then put
> a scanner on the logs looking for self-signed CA=true certs. Of course
> that puts the logs in the crosshairs for spam and DOS attacks.

Those spam and DoS attacks are why logs are specified (required?
recommended?) to not accept those certificates.

If Mozilla wanted to, it is totally possible to make an extension API
that allows an extension, when it is not disabled, to provide PSM with
a list of roots that should be accepted as trust anchors. And, it is
totally possible for PSM to aggregate those lists of
extension-provided trust anchors and use that list, in conjunction
with the read-only built-in roots module, to determine certificate
trust, while ignoring the read/write profile certificate database.

Whether or not that is a good idea is not for me to decide. But, it
would not be a huge amount of work to implement.
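
Purely as an illustration of the aggregation logic (none of this
corresponds to a real Firefox/PSM or extension API; every name below is
hypothetical):

# Hypothetical sketch only. The effective trust anchors are the read-only
# built-in roots plus roots contributed by enabled, non-blocklisted
# extensions; the read/write profile cert database is never consulted.
from dataclasses import dataclass

@dataclass(frozen=True)
class Extension:
    ext_id: str
    enabled: bool
    trust_anchors: frozenset   # DER-encoded roots contributed by the extension

def effective_trust_anchors(builtin_roots, extensions, blocklist):
    anchors = set(builtin_roots)                 # built-in roots module (read-only)
    for ext in extensions:
        if ext.enabled and ext.ext_id not in blocklist:
            anchors |= ext.trust_anchors         # add this extension's anchors
    return frozenset(anchors)                    # profile DB intentionally ignored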

Cheers,
Brian

Peter Kurrasch

Feb 24, 2015, 6:54:44 PM
to Brian Smith, Daniel Veditz, Matt Palmer, mozilla-dev-s...@lists.mozilla.org
I think focusing on the trusted root store as a way to resolve this problem is (or will be) less than ideal. I understand the desire to look there but I don't think it will necessarily end well.

That said I don't have a great alternative myself but I do have some questions:

1) I saw a quote attributed to Richard that Mozilla will blacklist or is considering blacklisting the cert(s) generated by Superfish, Komodia, and so forth. Can you comment here at all, Richard?

2) Has any analysis been performed on the "quality" of the end certs generated by the Komodia proxy? Are there obvious oversights that could be checked or enforced within Mozilla? How much work would that involve?

3) Am I right to presume that affected users did not notice that sites with EV certs were no longer showing a green address bar, and that EV itself provided no protection in this situation?

4) In this situation, sites using HSTS got no additional benefit, but did cert pinning make a difference?

Thanks.

Tyler Szabo

Feb 24, 2015, 8:31:41 PM
to Peter Kurrasch, Brian Smith, Daniel Veditz, Matt Palmer, mozilla-dev-s...@lists.mozilla.org
This thread seems fairly focused on a technical solution, whereas I see
this problem as more of an informed-consent situation.

I'm reminded of the discussion leading up to the "know your rights" toolbar
and another discussion with respect to whether or not to display HTTP
connections with explicit insecure visual feedback.

Those discussions leaned towards informing the user. In this case, that
would mean informing them that they have pre-loaded 3rd-party extensions or
certificates. With the right wording a power user could think "I do!? Let's
investigate this." and a regular user could think "Interesting. I wonder
why they'd bother to tell me - better get on with my day."

Gervase Markham

Feb 25, 2015, 4:28:18 AM
to Juergen Christoffel
On 24/02/15 12:10, Juergen Christoffel wrote:
> My conclusion is that, after the 2011 incident[*] and now PrivDog,
> Comodo cannot be trusted and their Root Certificates need to be removed
> from browsers.

That's a pretty major conclusion to reach on pretty shaky evidence. Have
you actually tested a version of PrivDog _as shipped by Comodo_ and
detected this problem?

Gerv

Peter Kurrasch

Feb 25, 2015, 8:59:34 AM
to Tyler Szabo, Brian Smith, Daniel Veditz, Matt Palmer, mozilla-dev-s...@lists.mozilla.org
I'm not sure I totally follow here because informed consent requires the ability to inform, and I don't think we have that yet.

The way any attacker operates is to find gaps in a system and make use of them. In my questions I'm trying the same approach: what are some gaps in the Komodia solution and how might we exploit them ourselves?

Phillip Hallam-Baker

Feb 25, 2015, 9:52:34 AM
to Peter Kurrasch, Matt Palmer, Tyler Szabo, mozilla-dev-s...@lists.mozilla.org, Brian Smith, Daniel Veditz
On Wed, Feb 25, 2015 at 8:59 AM, Peter Kurrasch <fhw...@gmail.com> wrote:

> I'm not sure I totally follow here because informed consent requires the
> ability to inform, and I don't think we have that yet.
>
> The way any attacker operates is to find gaps in a system and make use of
> them. In my questions I'm trying the same approach: what are some gaps in
> the Komodia solution and how might we exploit them ourselves?
>

There are multiple problems here. One of them is that what is obvious to
folk in the PKI community is not necessarily obvious to folk in the
Anti-Virus community. Another problem is that following the advice given
out by Harvard Business School and setting up separate arms-length
companies to work on speculative 'disruptive' products means that they are
operating without the usual QA processes you would expect of a larger
company.

I don't want to get into specifics at this point.

We can do finger-pointing and blamestorming, but what we really need is a
solution. I think the lack of informed consent is a major part of the problem.


Malware and crapware are a real problem. My problem with what Lenovo did
isn't just that the code they installed had bugs, it is that they installed
the stuff at all. If I pay $1,000 for a laptop, I do not expect the
manufacturer to fill the inside of the case with manure. It is clearly
worse if the manure carries a disease but the solution to the problem is to
not ship the manure at all rather than trying to pasteurize it.

So one part of the solution here is the Windows Signature Edition program,
which guarantees customers the crapware-free computer they paid for.


Fixing the AV hole is harder. The problem as the Anti-Virus people see it
is how to scan data for potentially harmful content, whether it is mail or
documents or web pages. The AV world regards itself as being a part of the
trusted computing base and thus entitled to have full access to all data in
unencrypted form. AV code has from the start had a habit of hooking
operating system routines at a very low level and taking over the machine.

Now we in the PKI world have a rather different view here. We see the root
store as the core of the trusted computing base, and the 'user' should be
the only party making changes to it. We do not accept the excuse that an AV
product is well-intentioned. However, recall that it was Symantec that
bought VeriSign, not the other way round. We don't necessarily have the
leverage here.


The most readily changeable aspect of the current model for managing the
root store is its lack of accountability and provenance. As a user I have
tools that tell me what roots are in the store, but I have no idea how they
got there. In the Windows store (which I am most familiar with), I don't
have any way to distinguish between roots from the Microsoft program and
those added by other programs.

One quick fix here would be for all trust root managers to use the CTL
mechanism defined by Microsoft (pretty much a de facto standard) to specify
the trusted roots in their program. That would enable people to write tools
that make it easy to see that this version of Firefox has the 200+ keys
from the program plus these other five that are not in the program.
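
As a rough sketch of such a tool for an NSS-backed profile: the snippet
below shells out to NSS's certutil, assumes the program's root fingerprints
are available locally as one hex SHA-256 per line (that file and its format
are assumptions, not an existing artifact), and parses certutil's output
only approximately.

# Rough sketch: list the roots in an NSS-backed profile and flag any whose
# SHA-256 fingerprint is not in the vendor's root program list. Assumes
# NSS's certutil is on PATH.
import base64
import hashlib
import subprocess
import sys

def nicknames(profile_dir):
    out = subprocess.run(["certutil", "-L", "-d", "sql:" + profile_dir],
                         capture_output=True, text=True, check=True).stdout
    names = []
    for line in out.splitlines():
        parts = line.rstrip().rsplit(None, 1)
        # Data lines look like "<nickname>   <trust flags>"; skip headers/blanks.
        if len(parts) != 2 or "Trust Attributes" in line:
            continue
        names.append(parts[0].strip())
    return names

def sha256_fingerprint(profile_dir, nickname):
    pem = subprocess.run(["certutil", "-L", "-d", "sql:" + profile_dir,
                          "-n", nickname, "-a"],
                         capture_output=True, text=True, check=True).stdout
    der = base64.b64decode("".join(l for l in pem.splitlines() if "-----" not in l))
    return hashlib.sha256(der).hexdigest()

if __name__ == "__main__":
    profile_dir, program_file = sys.argv[1], sys.argv[2]
    program = {l.strip().lower() for l in open(program_file) if l.strip()}
    for nick in nicknames(profile_dir):
        fp = sha256_fingerprint(profile_dir, nick)
        if fp not in program:
            print("not in program:", nick, fp)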


Right now it takes a great deal of expertise to even tell if a machine has
been jiggered or not. That is the first step to knowing if the jiggering is
malicious or not and done competently or not.

Hanno Böck

Feb 26, 2015, 7:27:35 AM
to dev-secur...@lists.mozilla.org
On Wed, 25 Feb 2015 09:26:57 +0000
Gervase Markham <ge...@mozilla.org> wrote:

> That's a pretty major conclusion to reach on pretty shaky evidence.
> Have you actually tested a version of PrivDog _as shipped by Comodo_
> and detected this problem?

As I'm the one who found these issues, I think I can give some facts here:
* No, the TLS validation issue does not affect the Comodo-shipped
version. It uses a browser plugin, which is different and doesn't do a
TLS MITM.
* There are personal ties between Comodo and PrivDog, and Comodo has
been and still is actively advertising PrivDog (that is, they link to
the PrivDog webpage and suggest you should install the product).
* There is another issue: PrivDog sends all URLs in clear text home to
a server from Adtrust Media. They told me they'll change this to HTTPS
in the future, but they still send out every URL. This issue applies
to both the Comodo-shipped version and the standalone version [1].

I will let others decide what to make of this; I just wanted to
contribute the facts. If there are questions I can answer, please ask.

[1]
https://blog.hboeck.de/archives/866-PrivDog-wants-to-protect-your-privacy-by-sending-data-home-in-clear-text.html

--
Hanno Böck
http://hboeck.de/

mail/jabber: ha...@hboeck.de
GPG: BBB51E42

Elbart

Feb 26, 2015, 9:58:01 AM
to mozilla-dev-s...@lists.mozilla.org
On 24.02.2015 13:10, Juergen Christoffel wrote:
> On 23.02.15 22:39, John Nagle wrote:
>> With the Lenovo and Comodo disclosures, the restrictions
>> on loading new certificates into Firefox clients need to be tightened.
>
> The MITM ad/malware installed its certificate into the Windows certificate
> store and not into browsers, so I cannot follow your conclusion.

Lenovo is not only offering removal instructions for the Windows
Certificate Store[1], but also for Mozilla products[2].

So apparently Superfish is or was installing its certificate into
browsers[3].

[1] http://support.lenovo.com/us/en/product_security/superfish_uninstall#ie
[2]
http://support.lenovo.com/us/en/product_security/superfish_uninstall#firefox
[3] http://blog.erratasec.com/2015/02/some-notes-on-superfish.html ;
second image under "How to uninstall the software?"

Peter Gutmann

Feb 28, 2015, 3:34:39 AM
to br...@briansmith.org, dve...@mozilla.com, fhw...@gmail.com, mpa...@hezmatt.org, mozilla-dev-s...@lists.mozilla.org
Peter Kurrasch <fhw...@gmail.com> writes:

>I think focusing on the trusted root store as a way to resolve this problem
>is (or will be) less than ideal. I understand the desire to look there but I
>don't think it will necessarily end well.

I think focusing on the browser as a whole is less than ideal. The browser
vendors have chosen to adopt a "security" model where anything that can refer
back to a particular string of bits in a config file is regarded as ultimately
trusted and good. It can be loaded with malware, claim to be Bank of America
but be hosted in the Ukraine, and exhibit any number of other suspicious
behaviours, but as long as there's a certificate present the browsers regard
it as OK. Given
the magic keys-to-the-kingdom approach adopted by the browser vendors, it's
not surprising that attackers are targeting those magic keys, because that's
all they need to do for the browsers to indicate that the site is fine.

So the solution isn't to look for relief from the browser vendors, because if
they were interested they'd have fixed this years ago. Instead, we need to
rely on external agents like anti-malware apps and integrity checkers that
keep an eye on the keys to the kingdom and make sure that they're not tampered
with. Waiting for the browser vendors to fix things is just an exercise in
frustration, while I'm pretty sure the A/V vendors added signatures for it
within hours of finding out.

Peter.

Chris Palmer

Feb 28, 2015, 4:45:57 PM
to Peter Gutmann, mpa...@hezmatt.org, fhw...@gmail.com, mozilla-dev-s...@lists.mozilla.org, br...@briansmith.org, Dan Veditz
Peter, how will AV running as SYSTEM or otherwise with high privilege
secure itself and lower-privilege applications from badware also running as
SYSTEM?

The game Core War is a lesson about the limits of security engineering, not
a model for a good approach.