Debian Weak Key Problem


Gervase Markham

Jun 4, 2008, 11:44:21 AM
[Please respect the Followup-To header, set to mozilla.dev.security]

Many of you will know about the problem with weak keys generated on
Debian or Debian-derivative systems between certain dates.[0] This
affects SSL certificates generated on those systems. CAs trusted by
Firefox have signed, and therefore endorsed, many such keys.

Once a hacker finds a server using a weak key, they can derive the
private key from the public key and then perfectly MITM or spy on
traffic to that server. All the relevant security indicators will be
present. This negates all security in the SSL model.

Current data indicates that there may be up to 26,000 such keys out
there being used on public websites.[1] We are attempting, through
various channels, to get a firmer number and a list of vulnerable sites.

It seems that CAs are not bothering to contact their customers with weak
keys[1], although they are of course revoking the keys of customers who
ask, and reissuing certificates.[2] Firefox 3 has caching OCSP turned on
by default (with "soft-fail") and so would detect the revocation of
these keys. Firefox 2 does not have OCSP turned on by default. Both
browsers support CRLs, but do not have the capability to download CRLs
from URLs listed in certificates.

Here are some possible options (not mutually exclusive) for what to do:

- Do nothing

Once Firefox 3 is released, many people will upgrade. This means they
will be using soft-fail OCSP, and so will detect attempts to use revoked
keys.

However, because CAs are not actively contacting their customers, many
weak keys will still be being used, and we'd have no way to tell if
there was an MITM going on in that case.

This might be a reason to try and expedite pushing Firefox 3 into the
Firefox 2 update channel.

- Modify NSS to detect weak keys

Lists of all the weak keys exist - we could create a database of the
first 8 bytes or so of the hash of each and get NSS to check it when it
sees a key. This would involve a code change to NSS and a security
update. We'd probably want to download the list of weak keys rather than
ship it with the update. NSS would then return an error to the
application if a weak key was found, and the connection would abort.[3]
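As a sketch of that truncated-hash idea (the key bytes and blacklist entries below are invented placeholders; a real table would be built from the published weak-key lists):

```python
import hashlib

def hash_prefix(der_public_key: bytes) -> bytes:
    """First 8 bytes of the SHA-1 hash of a DER-encoded public key."""
    return hashlib.sha1(der_public_key).digest()[:8]

# Illustrative stand-ins for the real corpus of known weak keys.
known_weak_keys = [b"weak-key-one", b"weak-key-two"]
weak_prefixes = {hash_prefix(k) for k in known_weak_keys}

def is_weak(der_public_key: bytes) -> bool:
    """What the NSS-side check would do: reject keys whose prefix is listed."""
    return hash_prefix(der_public_key) in weak_prefixes
```

Truncating to 8 bytes keeps the table small; with only a few hundred thousand weak keys, an accidental prefix collision with a strong key is vanishingly unlikely.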

- Modify NSS/Firefox to detect weak sites

If we can get a fairly complete list of vulnerable sites (which might be
smaller than the hashes of a large number of keys) and their key
fingerprints we could modify Firefox or NSS to refuse to connect to them
until the key changed.

- Add the sites to the malware or phishing feeds

Quite a drastic option, but we could even leverage existing code by
adding the vulnerable sites to the feeds of phishing or malware sites.

- Contact CAs to put pressure on for them to contact their customers

We could use our contacts with CAs to try and convince them to change
their position on customer contact.

- Publish a "CA hall of shame"

If and when we get good data on the extent of the problem, we could try
and shame CAs into doing something. "3% of FooCA-signed certificates are
totally insecure. Is yours one of them?"

- Attempt to contact site owners

But 26,000 sites is a lot. We could try cross-referencing the sites with
Alexa data to get a priority order.

- Turn on OCSP in the next Firefox 2 security release

This might help in the short term, with the same limitations as listed
under "Do Nothing" for the Firefox 3 update. But Firefox 2's OCSP
support is not as complete; it doesn't do caching, and it hard-fails. So
we might melt the OCSP servers, and that would severely degrade the user
experience because no-one could make SSL connections.

- Ship a Firefox 2 update with some built-in CRLs

If OCSP is not an option, the next Firefox update could include static
copies of the various different major CAs' revocation lists from the
period concerned. These would probably be of significant size.

- Something else

Suggestions?

Gerv


[0] http://www.us.debian.org/security/2008/dsa-1571
[1] https://bugzilla.mozilla.org/show_bug.cgi?id=435082#c22
[2]
http://wiki.debian.org/SSLkeys#head-5450db0076b3d85650f72117a9884f89d2349032
[3] https://bugzilla.mozilla.org/show_bug.cgi?id=435082


Eddy Nigg (StartCom Ltd.)

Jun 4, 2008, 4:09:57 PM
to Gervase Markham, dev-se...@lists.mozilla.org
Hi Gerv,

[Off-topic] For one, I must note the incredible inconvenience of
working with Bugzilla and this mailing list. It often happens that the
same issue is discussed and tracked in different bugs in parallel. I'm
CC'd on bug 434128 and only just became aware of bug 435082. Can you
tell me the best way to find out about related bugs that might interest
me? I can't spend my time searching for new bugs all day long. I guess
there is a formula or something...?


[On-topic] Here is how I evaluate the situation. Debian users are
generally more aware of and better informed about problems such as
these than users of other server products (which obviously includes all
brands). The situation doesn't always apply to shared hosting. StartCom
offers two ways of creating server certificates: one is to create one's
own private key and submit a CSR; in the other, the Certificates Wizard
creates the key on behalf of the client. The majority of CSR
submissions are for IIS servers, with a small minority being for
Apache; the rest have their keys created at StartCom, and those keys
aren't subject to this or similar bugs. Of the Apache users who create
their own CSR (and hence private key), Debian users are again a
minority, at most 10%. Judging from the revocation requests we
received, at least one third of those requested revocation. About
another third didn't succeed in installing the certificate or for other
reasons never made use of it (for example, after realizing the need for
a dedicated IP address, etc.). That leaves about one third which might
still be using a weak key. This boils down to very few still-affected
sites, probably less than 1.66%.

Since all certificates issued by StartCom are valid for one year only,
our risk assessment didn't warrant a full scan of all public keys,
considering the effort that would have to be put into it. I expect the
situation to be similar at most CAs. See also my inline comments.


Gervase Markham:


> Firefox 2 does not have OCSP turned on by default. Both
> browsers support CRLs, but do not have the capability to download CRLs
> from URLs listed in certificates.
>

This is a known shortcoming of FF2 and carries higher risks than weak
keys. That's because if a certificate is revoked because of a weak key,
it was most likely the subscriber himself who requested the revocation,
and he wouldn't continue using the weak key anyway. Hence this would
only make sense if CAs revoked such certificates unilaterally.

> However, because CAs are not actively contacting their customers, many
> weak keys will still be being used, and we'd have no way to tell if
> there was an MITM going on in that case.
>

That's correct.

> Lists of all the weak keys exist - we could create a database of the
> first 8 bytes or so of the hash of each and get NSS to check it when it
> sees a key. This would involve a code change to NSS and a security
> update. We'd probably want to download the list of weak keys rather than
> ship it with the update. NSS would then return an error to the
> application if a weak key was found, and the connection would abort.[3]
>
> - Modify NSS/Firefox to detect weak sites
>

I would cite privacy concerns with such a scenario.

> If we can get a fairly complete list of vulnerable sites
>

How do you intend to find them?

> We could use our contacts with CAs to try and convince them to change
> their position on customer contact.
>
> - Publish a "CA hall of shame"
>

And what if a CA refuses to comment or provide this information?

> - Turn on OCSP in the next Firefox 2 security release
>
> This might help in the short term, with the same limitations as listed
> under "Do Nothing" for the Firefox 3 update. But Firefox 2's OCSP
> support is not as complete; it doesn't do caching, and it hard-fails. So
> we might melt the OCSP servers, and that would severely degrade the user
> experience because no-one could make SSL connections.
>

Yes, that's not a good option really.

> - Ship a Firefox 2 update with some built-in CRLs
>

Again, see above: this only makes sense if an affected site owner would
refuse to replace the certificate after somebody detected a weak key. I
haven't encountered such a situation yet, so it doesn't make much
sense.

> Suggestions?
>
>

Even if it doesn't sound so good, doing nothing is the right thing to
do, I think.


Regards
Signer: Eddy Nigg, StartCom Ltd. <http://www.startcom.org>
Jabber: star...@startcom.org <xmpp:star...@startcom.org>
Blog: Join the Revolution! <http://blog.startcom.org>
Phone: +1.213.341.0390

Boris Zbarsky

Jun 4, 2008, 5:16:36 PM
Eddy Nigg (StartCom Ltd.) wrote:
> This is a known shortcoming of FF2 and carries higher risks than weak
> keys. That's because if a certificate is revoked because of a weak key,
> it was most likely the subscriber himself who requested the revocation,
> and he wouldn't continue using the weak key anyway.

But the MITM attacker could use it to impersonate the site, which is the whole
point.

> Hence this would only make sense if
> CAs revoked such certificates unilaterally.

I don't think that's correct; see above.

>> - Modify NSS/Firefox to detect weak sites
>
> I would cite privacy concerns with such a scenario.

Like what?

>> If we can get a fairly complete list of vulnerable sites
>>
>
> How do you intend to find them?

web-crawlers are not exactly rocket science. So the real question is: given an
SSL handshake, how does one tell whether the site is vulnerable? I believe
there are ways to detect this, based on other mails I've seen going through.
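In outline, the per-site step of such a crawl might look like this (an illustrative sketch using the Python standard library's `ssl` helpers, not anyone's actual crawler; the comparison against the weak-key lists is left abstract):

```python
import hashlib
import ssl

def cert_fingerprint_from_der(der_cert: bytes) -> str:
    """SHA-1 fingerprint (hex) of a DER-encoded certificate."""
    return hashlib.sha1(der_cert).hexdigest()

def fetch_site_fingerprint(host: str, port: int = 443) -> str:
    """Perform the SSL handshake, grab the server certificate, and
    fingerprint it; the result can then be checked against the
    published weak-key lists."""
    pem = ssl.get_server_certificate((host, port))  # does the handshake
    der = ssl.PEM_cert_to_DER_cert(pem)
    return cert_fingerprint_from_der(der)
```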

>> - Publish a "CA hall of shame"
>>
> And what if a CA refuses to comment or provide this information?

Provide what information? If there is a list of vulnerable sites, there is a
corresponding list of CAs, since the site certificate says who the CA is.

They can of course refuse to comment when someone says they're not doing their
job (assuming that's the implication of such a "hall of shame"). That's their
prerogative. ;)


>> - Ship a Firefox 2 update with some built-in CRLs
>
> Again, see above: this only makes sense if an affected site owner
> would refuse to replace the certificate after somebody detected a
> weak key.

Again, I don't think that's correct.

> Even if it doesn't sound so good, do nothing is the right thing to do I
> think.

That's the perspective of the CAs (including yourself), sure. We know that already.

-Boris

Gervase Markham

Jun 4, 2008, 5:22:25 PM
Eddy Nigg (StartCom Ltd.) wrote:
> [Off-topic] For one I must notice the incredible inconvenience in
> working with Bugzilla and this mailing list. It happens many times that
> the same issue is discussed and tracked at different bugs in parallel.
> I'm a CC bug 434128 and just got aware of bug 435082. Can you tell me
> the best way how to KNOW about such bugs which are related and might
> interest me? I can't spend my time searching all day long and on a daily
> basis for new bugs. I guess there is a formula or something...?

The situation is unusual. Related bugs should be connected with a
dependency relationship, or duped against each other. I'm not sure why
that hasn't happened in this situation.

Bugzilla is not a discussion forum, hence the move here.

> dedicated IP address, etc.). That leaves about one third which might
> still be using a weak key. This boils down to very few still-affected
> sites, probably less than 1.66%.

But 1.66% of 800,000 is still a lot of sites.

> Since all certificates issued by StartCom are valid for one year only,
> our risk assessment didn't warrant a full scan of all public keys,
> considering the effort that would have to be put into it. I expect the
> situation to be similar at most CAs. See also my inline comments.

Because attackers won't bother to exploit the problem until the year has
passed?

Also, won't people just get the same key signed again for another year?
Or is that not possible?

>> If we can get a fairly complete list of vulnerable sites
>
> How do you intend to find them?

Crawling the web.

>> We could use our contacts with CAs to try and convince them to change
>> their position on customer contact.
>>
>> - Publish a "CA hall of shame"
>>
> And what if a CA refuses to comment or provide this information?

We generate the list from the results of our crawl.

Gerv

Eddy Nigg (StartCom Ltd.)

Jun 4, 2008, 6:23:54 PM
to Gervase Markham, dev-se...@lists.mozilla.org
Hi Gerv,

Gervase Markham:


>
> The situation is unusual. Related bugs should be connected with a
> dependency relationship, or duped against each other. I'm not sure why
> that hasn't happened in this situation.
>
> Bugzilla is not a discussion forum, hence the move here.
>

Right! It happened to me on a different bug as well: the discussion was
held (perhaps on purpose) in a different bug, which really annoyed me.
I and many others were left without the chance to weigh in in
time... But I understand that there isn't some global subscribe option
or formula for bugs.


[Finish Off-topic]

>> dedicated IP address, etc.). That leaves about one third which might
>> still be using a weak key. This boils down to very few still-affected
>> sites, probably less than 1.66%.
>>
>
> But 1.66% of 800,000 is still a lot of sites.
>

Yes, indeed.

>
> Because attackers won't bother to exploit the problem until the year has
> passed?
>

No, obviously not, but the scope is limited and the threat diminishes
in the foreseeable future. The same threat has already existed for some
time (with nobody taking responsibility, btw).

> Also, won't people just get the same key signed again for another year?
> Or is that not possible?
>

Yes, it's possible. But obviously the number will get much smaller as
software gets updated, information spreads, and so on.

>
> Crawling the web.
>
>

OK, good idea! But you'd need quite some resources for such a crawler
in order to find all SSL-enabled sites and test their public keys in a
reasonable amount of time. Perhaps you can buy the list from Netcraft
to speed up the process.

> We generate the list from the results of our crawl.
>
>

We'd certainly cooperate on the results of such a list.

Eddy Nigg (StartCom Ltd.)

Jun 4, 2008, 7:31:40 PM
to dev-se...@lists.mozilla.org
Boris Zbarsky:

>
> But the MITM attacker could use it to impersonate the site, which is the whole
> point.
>

Yes, in case the attacker managed to get a copy of the previously used
and signed key. No, in case the subscriber managed to change his cert
beforehand.

>>> - Modify NSS/Firefox to detect weak sites
>>>
>> I would cite privacy concerns with such a scenario.
>>
>
> Like what?
>

I wouldn't like Mozilla to know which sites I'm visiting (including
non-public ones... and, eheeem, all the others ;-) )

>> How do you intend to find them?
>>
>

> web-crawlers are not exactly rocket science.

Nope, but it needs quite some resources in order to get valuable
results within a reasonable time.

> So the real question is: given an SSL handshake, how does one tell whether the site is vulnerable? I believe
> there are ways to detect this, based on other mails I've seen going through.
>

Yes, certainly, but even this might require quite some CPU cycles.

>> And what if a CA refuses to comment or provide this information?
>>
>

> Provide what information?

Whatever they decided to do with respect to this threat.

> If there is a list of vulnerable sites, there is a
> corresponding list of CAs, since the site certificate says who the CA is.
>

Correct, but it's a big if for now.


>> Again, see above: this only makes sense if an affected site owner
>> would refuse to replace the certificate after somebody detected a
>> weak key.
>>
>
> Again, I don't think that's correct.
>

Well, actually the Debian folks are rather security conscious...
relatedly, they are also the ones preferring Iceweasel and CAcert
because it ain't free enough, with purified OpenSSL for the topping ;-)

>
> That's the perspective of the CAs (including yourself), sure. We know that already.
>
>

I had no clue what other CAs decided in that respect; I offered our
estimates and decisions on this subject. That's not something
coordinated. I'm open to suggestions, as always.

--

Boris Zbarsky

Jun 5, 2008, 1:02:29 AM
Eddy Nigg (StartCom Ltd.) wrote:
> Yes, in case the attacker managed to get a copy of the previously used
> and signed key. No, in case the subscriber managed to change his cert
> beforehand.

Couldn't they maybe try to brute-force the old key until they come up
with a forged certificate that an SSL library accepts? The whole point
is that all the weak keys come from a limited keyspace, right? In any
case, I'm not going to argue this point, since I've now said everything
I can usefully say about it.

>>>> - Modify NSS/Firefox to detect weak sites
>>>>
>>> I would cite privacy concerns with such a scenario.
>>
>> Like what?
>
> I wouldn't like Mozilla to know which sites I'm visiting

Who said anything about "Mozilla" knowing? The idea here would be to have the
browser detect it and refuse to go to the site or something; no need to
communicate anything to "Mozilla".

>> web-crawlers are not exactly rocket science.
>
> Nope, but it needs quite some resources in order to get valuable
> results within a reasonable time.

Yes, it does.

> Yes, certainly, but even this might require quite some CPU cycles.

Sure.

>>> And what if a CA refuses to comment or provide this information?
>>
>> Provide what information?
>
> Whatever they decided to do with respect to this threat.

I see no problem with explicitly listing CAs who won't say whether they're
actually doing something. In a different group from the ones who say they're
doing nothing, probably.

>> If there is a list of vulnerable sites, there is a
>> corresponding list of CAs, since the site certificate says who the CA is.
>
> Correct, but it's a big if for now.

The premise (and a not unreasonable one) is that such a list can be generated if
needed.

> That's not something coordinated.

I didn't mean to imply that it was, by any means.

-Boris

Gervase Markham

Jun 5, 2008, 7:25:25 AM
Eddy Nigg (StartCom Ltd.) wrote:
> Yes, in case the attacker managed to get a copy of the previously used
> and signed key. No, in case the subscriber managed to change his cert
> beforehand.

Right. But I'm not going to bet against the possibility that there are
bad guys even now downloading the public keys from as many SSL servers
as they can find, so that they can later compare them with the weak
keys. If they get a hit, they can impersonate that site from now until
the time the cert expires.

> I wouldn't like Mozilla to know which sites I'm visiting (including
> non-public ones... and, eheeem, all the others ;-) )

As Boris says, modifying NSS or Firefox to detect weak keys does not
involve sending any data anywhere. Check the bug.

Gerv

Eddy Nigg (StartCom Ltd.)

Jun 5, 2008, 7:49:41 AM
to Boris Zbarsky, dev-se...@lists.mozilla.org
Boris Zbarsky:

>
> Couldn't they maybe try to brute-force the old key until they come up
> with a forged certificate that an SSL library accepts?

No, not really. It requires possession of the certificate with the weak
key signed by a CA.

> The whole point is that all the weak
> keys come from a limited keyspace, right?

That's correct. It allows one to find the right key among the ~100,000
possible keys per key size.
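To illustrate why that limited keyspace matters, here is a toy model (not real RSA, and not Debian's actual generator; the seeded-RNG "keypair" below is purely illustrative). Because everything is determined by the PID, an attacker can precompute the whole keyspace once and then recover private keys by table lookup:

```python
import random

def weak_keypair(pid: int):
    """Toy stand-in for the broken generation: all 'randomness'
    comes from the process ID, so the keypair is fully determined."""
    rng = random.Random(pid)          # the seed is just the PID
    private = rng.getrandbits(128)    # toy stand-in for key material
    public = private ^ 0xDEADBEEF     # toy 'public half' derived from it
    return private, public

# Precompute all ~32k possibilities once, indexed by public key.
lookup = {}
for pid in range(1, 32768):
    priv, pub = weak_keypair(pid)
    lookup[pub] = priv

def recover_private(public):
    """O(1) recovery: no factoring needed, just a table lookup."""
    return lookup.get(public)
```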

>
> Who said anything about "Mozilla" knowing? The idea here would be to have the
> browser detect it and refuse to go to the site or something; no need to
> communicate anything to "Mozilla".
>

Oh, that would not be technically possible, I guess. Searching for such
keys "dynamically" could take hours per key, hence previously created
keys are used. They would need to be hosted somewhere and compared
against. That's why Mozilla would know (at the least) which public key
was used.

>
> The premise (and a not unreasonable one) is that such a list can be generated if
> needed.
>

I expect that Mozilla will not come up with the resources for it.

Gervase Markham

Jun 5, 2008, 9:22:18 AM
Eddy Nigg (StartCom Ltd.) wrote:
> Oh, that would not be technically possible, I guess. Searching for such
> keys "dynamically" could take hours per key, hence previously created
> keys are used. They would need to be hosted somewhere and compared
> against. That's why Mozilla would know (at the least) which public key
> was used.

As https://bugzilla.mozilla.org/show_bug.cgi?id=435082 explains, we
would have a locally-stored blacklist.

> I expect that Mozilla will not come up with the resources for it.

What makes you expect that?

Such a list of weak keys already exists, anyway.
http://metasploit.com/users/hdm/tools/debian-openssl/

Gerv

Juergen Schmidt

Jun 5, 2008, 9:55:48 AM
to Juergen Schmidt
Gervase Markham wrote:

> - Do nothing
>
> Once Firefox 3 is released, many people will upgrade. This means they
> will be using soft-fail OCSP, and so will detect attempts to use revoked
> keys.


As far as I can see, there is no way around having at least an optional
extension that allows checking for weak keys against a blacklist. The
only place to mitigate this problem is on the user side -- i.e. in the
browser or its crypto provider. All the other strategies leave
significant attack surface.

# Revocation:

- As far as I know, not all CAs support OCSP, and a lot of older
certificates provide no OCSP URI. Those will stay attackable without a
blacklist. But I have no statistics on that yet.
- Revocation goes *really* slowly: we checked Verisign's revocation
statistics, and in the second half of May they had even fewer
revocations than in the second half of April or March. As of June 1st
we still found (in Germany) about 3 percent of 4381 servers with valid
certificates using weak certificates, including a lot of online shops
and even payment providers. This will not simply go away anytime soon
-- especially if browsers don't complain.


# Detection of weak sites: Given that revocation does not work with
current browsers, every site that ever published a weak certificate can
be spoofed. But putting a site on the "list of shame" when it revoked
its weak certificate soon after publication of the problem is not
really fair. So this only works with functional online revocation
checks -- but see above.


# Expiry: there is a weak certificate for a big German bank out there.
It is valid until 2011. And it is surely not the only one.


Having the blacklists downloaded after installation makes sure that
they don't increase the download size of the browser or the security
update.

It also addresses an additional problem that has not been mentioned
yet: the blacklists are going to grow significantly if you strive for
completeness. The current lists only cover the common situation of
1024/2048-bit keys generated on 32-bit Intel systems. There will be
other key lengths, additional keys on 64-bit systems, systems with
other endianness, and maybe even a couple more side effects influencing
key generation. So you want a prioritised list of blacklists: the
common ones covering 90% + X for the masses, and more exotic ones for
the paranoid among us. And don't forget that blacklists might need to
be updated; there is a chance that a blacklist is not correct.
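The prioritised-blacklist lookup described above might be sketched like this (list names and hash prefixes are invented for illustration): check the small, common lists first and fall through to the exotic ones only on a miss.

```python
# Ordered common-first: small lists covering the typical 32-bit Intel
# 1024/2048-bit keys, then larger lists for other sizes/architectures.
blacklists = [
    ("common-1024-2048-x86", {b"\x01" * 8, b"\x02" * 8}),
    ("exotic-other-arches",  {b"\x03" * 8}),
]

def which_blacklist(prefix: bytes):
    """Name of the first blacklist containing this 8-byte hash prefix,
    or None if the key is not known to be weak."""
    for name, entries in blacklists:
        if prefix in entries:
            return name
    return None
```

Keeping the lists separate also means each can be updated or corrected independently, per the point above about blacklists needing updates.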


BTW: If somebody is working on such an extension/patch/update, I would
appreciate it if they could notify me. We are thinking about building
in a blacklist check.


bye, ju

--
Juergen Schmidt, Editor-in-Chief, heise Security, www.heisec.de
Heise Zeitschriften Verlag, Helstorferstr. 7, D-30625 Hannover
Tel. +49 511 5352 300 FAX +49 511 5352 417 EMail j...@heisec.de
GPG-Key: 0x38EA4970, 5D7B 476D 84D5 94FF E7C5 67BE F895 0A18 38EA 4970

Eddy Nigg (StartCom Ltd.)

Jun 5, 2008, 10:24:23 AM
to Gervase Markham, dev-se...@lists.mozilla.org
Gervase Markham:

> Eddy Nigg (StartCom Ltd.) wrote:
>
>> Oh, that would not be technically possible, I guess. Searching for such
>> keys "dynamically" could take hours per key, hence previously created
>> keys are used. They would need to be hosted somewhere and compared
>> against. That's why Mozilla would know (at the least) which public key
>> was used.
>>
>
> As https://bugzilla.mozilla.org/show_bug.cgi?id=435082 explains, we
> would have a locally-stored blacklist.
>

Locally stored where, exactly? Do you have any idea how big a list
covering just the most commonly used key sizes would be? It doesn't
sound feasible to me, hence I thought you were talking about some kind
of lookup service.


> What makes you expect that?
>
> Such a list of weak keys already exists, anyway.
> http://metasploit.com/users/hdm/tools/debian-openssl/
>
>

Yes, I know, obviously. That's exactly why I think it's not in the cards.

Juergen Schmidt

Jun 5, 2008, 11:29:25 AM
Eddy Nigg (StartCom Ltd.) wrote:

> Do you have any idea how big a list
> covering just the most commonly used key sizes would be?

The current chksslkey lists, which cover the vast majority of the weak
keys, take 13 MByte -- no problem for most PCs.

bye, ju

Juergen Schmidt

Jun 5, 2008, 11:55:51 AM
Juergen Schmidt wrote:

> - As far as I know, not all CAs support OCSP, and a lot of older
> certificates provide no OCSP URI. Those will stay attackable without a
> blacklist. But I have no statistics on that yet.

A quick check shows that of 225 weak certificates we collected in the
wild, only 43 had an OCSP URI. That is less than 20%.

Note: we counted only explicit "OCSP - URI" declarations in the
extensions section. If there are other ways of declaring a valid OCSP
responder, we did not count them. For example, some certificates have a
"Netscape Revocation Url"; I don't know what that means.

Gervase Markham

Jun 5, 2008, 1:19:52 PM
Eddy Nigg (StartCom Ltd.) wrote:
> Locally stored where, exactly? Do you have any idea how big a list
> covering just the most commonly used key sizes would be? It doesn't
> sound feasible to me, hence I thought you were talking about some kind
> of lookup service.

Read the bug, Eddy! :-) There are size estimates in it. If you think
they are wrong, post better figures, with your working.

Gerv

Jean-Marc Desperrier

Jun 6, 2008, 7:17:59 AM
Eddy Nigg (StartCom Ltd.) wrote:
> Boris Zbarsky:
>> Couldn't they maybe try to brute-force the old key until they come up
>> with a forged certificate that an SSL library accepts?
>
> No, not really. It requires possession of the certificate with the
> weak key signed by a CA.

I really don't think that "they would need to have accessed the site
before it changed its certificate" is a significant mitigating factor
for such a high risk.

I like the blacklist approach. It would be good to web-crawl to refine
the estimate, but I think around 99% of sites are using standard key
sizes. And the people who are knowledgeable enough to have used values
different from the standard ones in the scripts have certainly already
changed their cert.
