True. But if he had been using 40 bits, he'd have seen he needed a
delay of 173 nanoseconds (ignoring his misunderstanding about how many
attempts are actually needed), and then he would presumably have
realised that you don't need to add a software delay of 173 nanoseconds.
> The problem is that the application may be eg. about user-defined
> passwords, which are usually relatively short.
>
If you have lower-case and upper-case letters, digits and a couple of
punctuation marks, that's 26 + 26 + 10 + 2 = 64 values - 6 bits of
"key" per password character, since 64 = 2^6. (In practice, 5 bits is
more realistic as many characters are rarely picked.) Even with short
passwords it's not hard to beat 20 bits of variation - four characters
at 5 bits each already gets you there.
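
To put rough numbers on it, here is a quick back-of-the-envelope
sketch in Python (assuming the full 6 bits per character; the 5-bit
figure just scales everything down accordingly):

import math

alphabet_size = 64                        # 26 + 26 + 10 + 2
bits_per_char = math.log2(alphabet_size)  # 6.0
for length in range(4, 11):
    print(f"{length} chars: {length * bits_per_char:.0f} bits of key space")
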
However, when you use user-defined passwords, you don't use the password
directly for the cryptographic key - you add salt and pass it through a
hash. Run it through SHA256 and you have a 256-bit key. The source
does not have 256 bits of entropy, of course - even if the password
rules insist on 10 characters including an upper-case letter, a
lower-case letter, a digit, and
a punctuation character, you might have, say, 30 bits of entropy. But
you can't work backwards from the SHA values - you have to work
forwards. That means you have to enumerate all the possible passwords
and generate the hashes and try them. Tables exist of commonly used
passwords, with algorithms to generate variations (like "Secret1!",
"Secret2!", etc.). But the job is much harder than simply trying all
values from 0 to 2 ^ 30.
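
To make the "work forwards" point concrete, here is a rough sketch in
Python (hashlib only; the function name and the tiny candidate list
are made up for illustration, and a real system would use a
deliberately slow KDF such as PBKDF2 rather than a single SHA-256):

import hashlib
import os

def derive_key(password: str, salt: bytes) -> bytes:
    # Salted SHA-256: always a 256-bit key, whatever the password length.
    return hashlib.sha256(salt + password.encode("utf-8")).digest()

# The defender stores the salt and the derived key, not the password.
salt = os.urandom(16)
stored_key = derive_key("Secret1!", salt)

# The attacker can't invert the hash, so he enumerates candidate
# passwords, derives each key and compares - working forwards.
for guess in ["password", "letmein", "Secret1!", "Secret2!"]:
    if derive_key(guess, salt) == stored_key:
        print("found:", guess)
        break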