Filippo Valsorda writes:
> in practice it’s hard to imagine a real-world system that can
> survive—as a whole—compromise of its RBG just because its KEM hashes
> the RBG output.

The ecosystem is big. Presumably Dual EC is still deployed.

If the argument here is that the KEM isn't the only way RNG outputs are
leaked: sure, but it's very easy to imagine real-world systems where the
other leaks have been plugged or didn't exist in the first place.
(I'm reminded of how TLS for a long time pointed to unencrypted DNS as
an excuse to not encrypt host names, and vice versa.)
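
For concreteness, here's a minimal Python sketch of the pattern under
discussion, modeled on the round-3 Kyber encapsulation step that feeds
the 32-byte RBG output through SHA3-256 before anything else sees it
(the function name is my choice, not from any spec):

   import hashlib, os

   def encaps_coins():
       m = os.urandom(32)   # raw RBG output
       # Defensive layer: everything derived downstream depends on
       # the RBG output only through this hash.
       return hashlib.sha3_256(m).digest()

With this layer in place, values leaked from the KEM reveal
information about SHA3-256(m), not directly about the raw RBG output.
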
The FIPS 203 draft instead makes the narrower claim that hashing is
unnecessary with (current, not Dual EC) "NIST-approved randomness
generation". But NIST-approved randomness generation is a mess of
different options without clearly defined quantitative security claims
and without a full analysis. An extra hashing layer reduces risks.

I commented in an earlier message that
https://eprint.iacr.org/2018/349
did some analysis of NIST's RNG _modes_ (not the full RNGs) and found
various flaws. Here's an example: Section 7.1 of the paper breaks NIST's
claim that HMAC-DRBG provides forward secrecy. This is a fairly narrow
attack, mattering for applications that want forward secrecy, but
it's still an illustration of the importance of security review. Hashing
the RNG output stops the stated attack, and more broadly means that any
break of forward secrecy has to recover full RNG outputs (if the hashing
is strong), not just some fragmentary information about the outputs.
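
To make the stated attack concrete, here's a self-contained Python
sketch. The DRBG code is my simplification of SP 800-90A HMAC-DRBG
(SHA-256, no reseed counter, no additional input), and the final
check is my rendering of the paper's observation:

   import hmac, hashlib, os

   def H(key, data):
       return hmac.new(key, data, hashlib.sha256).digest()

   def update(K, V, data=b""):
       # HMAC_DRBG_Update, simplified.
       K = H(K, V + b"\x00" + data)
       V = H(K, V)
       if data:
           K = H(K, V + b"\x01" + data)
           V = H(K, V)
       return K, V

   def instantiate(seed):
       return update(b"\x00" * 32, b"\x01" * 32, seed)

   def generate(K, V, nbytes):
       # HMAC_DRBG_Generate with no additional input.
       out = b""
       while len(out) < nbytes:
           V = H(K, V)
           out += V
       K, V = update(K, V)   # final state update
       return out[:nbytes], K, V

   # The victim draws 64 secret bytes; afterwards the DRBG state
   # (K2, V2) is compromised.
   K, V = instantiate(os.urandom(48))
   out, K2, V2 = generate(K, V, 64)

   # Forward secrecy would make past outputs look random given the
   # current state. But the new V is HMAC(K2, last output block),
   # so the compromised state confirms a guess of the last 32
   # output bytes:
   guess = out[-32:]
   assert H(K2, guess) == V2

The assert passes: the state by itself verifies a candidate for the
final output block, i.e., leaks information about past outputs.
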
Meanwhile, what I still haven't seen from NIST is a statement of how
the hashing is supposed to be a _problem_.

Keeping cryptosystem specifications stable for security review is
important. NIST has noted that changes "after the third round ... may
not receive as much public scrutiny and analysis" and has said that it
wants to "minimize changes introduced". What exactly is the issue with
the hashing that's supposed to outweigh this?

---D. J. Bernstein