Hi,
I bet $2050 that no quantum computer will break the RSA-2048 factoring challenge before 2050. Any takers?
https://en.wikipedia.org/wiki/RSA_Factoring_Challenge
The scientific wager is a noble art form practiced by, e.g., Stephen Hawking and Richard Feynman.
https://en.wikipedia.org/wiki/Scientific_wager
It would be valuable to have an estimate from NIST/NSA of when the US government believes a Cryptanalytically Relevant Quantum Computer (CRQC) will actually be built. The Australian Signals Directorate (ASD) believes currently stored encrypted data holdings will remain secure for however long the data remains sensitive [1]. I don't know Australian classification rules by heart, but that seems to indicate 50-100 years. I know some countries have information where it can be up to 150 years before you can even request declassification. For US national security systems, a CRQC built in 2070 would likely be problematic, but that timeline should not apply to industry unless necessary. Very little information in the private sector needs to be protected for very long times.
So far, the estimate from ASD seems to be the best public estimate. I have zero trust in statements from the quantum industry. As Professor Scott Aaronson (director of the University of Texas Quantum Information Center) accurately said:
“claims that we know how to get near-term speedups for optimization, machine learning, etc. are >95% BS!”[2]
Cheers,
John Preuß Mattsson
Expert Cryptographic Algorithms and Security Protocols
> RSA-2048 is considered secure only until 2030
RSA-2048 should already have been phased out for most use cases. Because of classical computers, _not_ because of quantum computers. NIST and ANSSI only allow RSA-2048 (and FFDH2048) if the protected asset does not have to be protected after 2030. BSI states that RSA-2048 (and FFDH2048) can be used up until the year 2022.
It would be much more interesting to have government guidelines for RSA-4096. The CNSA 2.0 timelines make total sense for NSS use cases that need to protect TOP SECRET information for a century.
But unless NSA is doing this transition many, many decades too late, the same timelines should likely not apply to other use cases that do not need to protect TOP SECRET information for a century…
Cheers,
John
--
You received this message because you are subscribed to the Google Groups "pqc-forum" group.
To unsubscribe from this group and stop receiving emails from it, send an email to
pqc-forum+...@list.nist.gov.
To view this discussion on the web visit
https://groups.google.com/a/list.nist.gov/d/msgid/pqc-forum/0e3fded5-e3af-43e4-829b-6856485602d5n%40list.nist.gov.
If you like charts and tables, RSA-2048 appears to provide a “security strength” of up to “112 bits”.
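That 112-bit figure comes from NIST SP 800-57 Part 1's strength table. As a rough cross-check, the uncalibrated GNFS asymptotic L_N[1/3, 1.923] lands in the same ballpark; this is only a sketch of the asymptotic, not NIST's calibrated method, and it overshoots NIST's assessed values by several bits:

```python
import math

def gnfs_security_bits(modulus_bits: int) -> float:
    """Rough classical security of an n-bit RSA modulus from the heuristic
    GNFS cost exp(1.923 * (ln N)^(1/3) * (ln ln N)^(2/3)).
    Uncalibrated asymptotic, not NIST's assessed numbers."""
    ln_n = modulus_bits * math.log(2)  # ln N for an n-bit modulus N
    exponent = 1.923 * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3)
    return exponent / math.log(2)      # convert nats to bits

for n in (1024, 2048, 3072):
    # NIST SP 800-57 assesses these at 80 / 112 / 128 bits respectively
    print(n, round(gnfs_security_bits(n)))
```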
I think NIST SP 800-57 Part 1 is an excellent document. It specifies in detail how to work with algorithm lifetimes and how to transition between algorithms with different strengths, i.e., exactly what we need for the PQC migration. An essential idea in NIST SP 800-57 is the security life of the data the algorithm is protecting. This makes a great deal of sense and means that a use case requiring 50 years of protection needs to migrate to a stronger algorithm 40 years before a use case that only requires 10 years of protection.
Suggestions that all use cases need to migrate to PQC at the same time make no sense at all and go completely against NIST SP 800-57. I really hope NIST will keep the excellent SP 800-57 Part 1 also for the PQC migration.
https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-57pt1r5.pdf
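The timing arithmetic behind the security-life idea is simple enough to sketch. The CRQC year below is a placeholder assumption for illustration, not a prediction:

```python
def latest_safe_use_year(threat_year: int, security_life_years: int) -> int:
    """SP 800-57-style reasoning: data protected in year Y must stay secure
    for its whole security life, so the last safe year to apply an algorithm
    is the year the threat arrives minus the data's security life."""
    return threat_year - security_life_years

ASSUMED_CRQC_YEAR = 2050  # placeholder assumption, illustrative only

for life in (10, 50):
    print(f"{life}-year security life -> stop using the algorithm by "
          f"{latest_safe_use_year(ASSUMED_CRQC_YEAR, life)}")
```

With these placeholder numbers the 50-year use case must migrate 40 years before the 10-year one, which is exactly the gap described above.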
I've thought about making various bets about post-quantum cryptography like this, but I would suggest the proceeds go to charity.
D. J. Bernstein wrote:
>'John Mattsson' via pqc-forum writes:
>> I bet $2050 that no quantum computer will break the RSA2048 factoring
>> challenge before 2050. Any takers?
>
>Happy to take you up on that. I expect the factorization of RSA-2048 to
>be publicly announced, with the factors, and plausibly claimed to be
>done by quantum computing, long before 2050. (I'm already on record
>betting on 2032; see https://blog.cr.yp.to/20220129-plagiarism.html.)
Great, then it's on like Donkey Kong!
Wow, I didn't know you had so many ongoing bets. I agree that you will win your JPEG bet; that will probably still be true in 2050. I think your second bet will be a draw, and I do of course think you will lose your third bet ;)
>A quantum circuit running a Shor-type algorithm needs enough physical
>qubits, a low enough per-qubit error rate, enough error correction to
>build logical qubits, and a quantum modular-exponentiation algorithm;
>so it's good to watch reported advances in each of these four areas.
>Advances in the number of qubits and the error rate are plotted in
>
>
>but beware that the graph incorrectly portrays error correction and
>exponentiation algorithms as unchanging when in fact they've been
>improving---and presumably will continue to do so, for example with
>faster multiplication algorithms and with techniques to encode multiple
>logical qubits more efficiently than encoding each qubit separately.
Yes, Samuel Jaques's website is indeed a great resource, and algorithms have definitely improved.
However, I do not think the physical aspects of quantum computers will continue to improve as fast as they have in recent years. Any near-term applications for quantum computers are highly uncertain. As stated in the National Academies consensus study report "Quantum Computing Progress and Prospects":
"a “noisy intermediate-scale quantum computer,” or NISQ computer, probably isn’t going to be of much practical use. “There are at present no known algorithms/applications that could make effective use of this class of machine,” says the committee. That might change. Or it might not. And if it doesn’t, it seems unlikely that industry will keep investing in quantum computing long enough for the technology to pay dividends.”
https://nap.nationalacademies.org/catalog/25196/quantum-computing-progress-and-prospects#toc
In the current economy, money is not free anymore, and investors are increasingly reluctant to fund loss-making activities. I think we will see a very drastic reduction in the quantum computing industry. The only thing stopping that would be if they find practical use cases soon and start making money; my estimate is that they will not.
The quantum computing industry is fundamentally different from the classical computing industry when it started. For classical computing there were many profitable use cases from the start and always very strong demand for more and more transistors. The quantum computing industry currently has no practical use cases, and it is unclear when there will be practical money-making use cases.
The quantum industry is suffering from what Google's former head of quantum computing hardware John Martinis called "quantity hype". The end station of quantity hype is the death of the quantum computing industry, illustrated with a skull in the linked article.
>> So far, the estimate from ASD seems to be the best public estimate. I
>> have zero trust in statements form the quantum industry.
>
>Google, IBM, etc. certainly have an incentive to hype the advances, but
>insisting on quantification and on scientific papers makes it hard for
>them to make the overall progress sound more rapid than it actually is.
>If you're looking for a survey of expert risk assessments rather than
>extrapolating the numbers yourself, you can find a 67-page report
>
>
>that documents its methodology, _quantifies_ the risk assessments, and
>includes systematic comparisons to previous risk assessments.
I am not as fond of the "2022 Quantum Threat Timeline Report" as you seem to be. Much of the report is based on a survey among researchers and companies working on quantum computers.
I don't think that is a good way to do a risk assessment at all. The sample risks being very biased, and people are often way too optimistic about their own work. If you had asked fusion researchers in 1930, they would also have said working fusion reactors were very likely in the next 20 years. The probabilities in the report do not align with what I have heard from researchers in other fields of quantum research who are not building quantum computers themselves. One professor told me he thought CRQCs would never be built, or at least not in the next 100 years…
Given that the survey is likely sampled from the same set of people responsible for the >95% BS claims about near-term speedups, a reasonable assumption would be that the time and probability estimates might also be >95% BS.
I think the National Academies consensus study report "Quantum Computing Progress and Prospects" from 2019 is much better. It concludes that the emergence of a CRQC during the next decade would be highly unexpected.
https://nap.nationalacademies.org/catalog/25196/quantum-computing-progress-and-prospects#toc
>Meanwhile, if you're looking at incentives, you have to consider the
>fact that large-scale attackers are recording RSA/ECC ciphertexts today,
>are working on building quantum computers, and have a strong incentive
>to convince people to delay post-quantum rollout as long as they can.
>
>When ASD says it "does not expect it will possible within these lengths
>of time to build a quantum computer that can break the algorithms and
>key sizes described in the ISM", do we have a clear statement of what
>"these lengths of time" are? Is ASD saying what the basis for its
>(claimed) expectation is? Are the details provided for public review,
>giving an opportunity for errors to be discovered and corrected?
No, given that ASD is Australia's signals intelligence agency, the public information is, as usual, minimal. I am not sure how long documents can be classified in Australia, but in the US and many other countries there is no firm end date; in the US, however, you need special permission to classify information for longer than 75 years.
https://www.govinfo.gov/content/pkg/WCPD-1995-04-24/pdf/WCPD-1995-04-24-Pg634.pdf
>When NSA's FAQ poses the question "Isn't quantum computing a long way
>off?" and answers it by talking only about "systems that will be used
>many decades in the future" and not about _attackers recording data
>right now_, here are two different hypotheses to consider:
>
> * This is NSA honestly believing, for reasons so obvious that they
> don't require any explanation or references, that the probability
> of attackers having quantum computers in time to exploit any of
> today's data is so close to 0 as to not even be worth mentioning.
>
> * This is NSA trying to mislead the public, as a tiny part of its
> quarter-billion-dollar-per-year program to "covertly influence
> and/or overtly leverage" cryptography to make it "exploitable".
I don't think that is an exhaustive list. I absolutely think NSA believes there is a risk that a CRQC is built in time frames to exploit today's data. I wish NSA would share more of their estimates here, but NSA does not really like to share information unless they find it necessary. The risk does not at all have to be close to 0. An estimated 25% risk that a CRQC is built within 50 years would likely make NSA act exactly as it does right now: forcing NSS to move to PQC as soon as possible.
I have not seen any signs that NSA is trying to mislead, but NSA definitely has an interest in influencing the industry to tag along quickly in the PQC migration, to get cheap off-the-shelf PQC products for NSS.
>Some user data needs long-term confidentiality. This is even a legal
>right (depending on the country) for, e.g., patients talking to doctors,
>clients talking to lawyers, whistleblowers talking to journalists, etc.
>From a risk-management perspective, we have to assume that quantum
>computers are built within the timeframe where today's data still has to
>be kept confidential. This means that we already have a security
>disaster today. Every day that we fail to act is giving away more user
>data to attackers.
>
>> For US national security systems, a CRQC built 2070 would likely be
>> problematic, but that timeline should not apply to industry unless
>> necessary. Very little information in the private sector needs to be
>> protected for very long times.
>
>The U.S. government intercepted private activities of Martin Luther
>King, Jr., in the 1960s, used those intercepts for extortion, and is
>continuing to seal the records---through 2027, last I heard. That's a
>longer timeframe than you're talking about. I doubt that the family
>thinks it'll be okay to release the intercepts in 2027.
>
>I expect that Hoover-style attackers recording information in 2023 will
>find considerable volumes that remain useful for extortion in 2070, and,
>more to the point, in the much shorter timeframe when I expect the
>attackers to have large quantum computers.
Yes, there is definitely personal information that requires very long confidentiality as well, but most information doesn't. I hope NIST will continue to announce, long in advance, end years after which data can no longer be actively protected with certain algorithms. This allows people to continue using NIST SP 800-57 Part 1 to determine how long certain algorithms can be used for their use cases, based on their need for long-term confidentiality.
Note that NIST announcing, e.g., that P-384 can only be used as key exchange for data that does not need to be confidential longer than 2050 would have very drastic implications. It would imply that NIST believes huge amounts of US TOP SECRET information protected using the CNSA 1.0 suite would be readable by hostile nation states by that time. Much of that information would still be classified. It would mean that NIST thinks NSA failed to see the risk in time and updated CNSA way too late.
>> As Professor Scott Aaronson (director of the University of Texas
>> Quantum Information Center) accurately said:
>> “claims that we know how to get near-term speedups for optimization,
>> machine learning, etc. are >95% BS!”
>
>My understanding is that he's criticizing exaggeration of (1) the range
>of algorithms with known quantum speedups and (2) the practical value of
>baby quantum computers that are too noisy to support logical qubits.
>
>---D. J. Bernstein
Cheers,
John
Hi,
If I counted correctly, Daniel J. Bernstein, John Sahhar, Daniel Apon, and Michele Mosca accepted my bet. I am accepting the four "takers" and closing the bet to further takers. I would be happy to let the money go to charity, such as the U.S. National Science Foundation. I am also fine with formalizing the bet by signing a contract if anybody wants that; maybe that could be arranged at some NIST workshop.
The bet is already mentioned on Wikipedia and got a whole article in New Scientist.
https://en.wikipedia.org/wiki/Scientific_wager
Seems like quite a lot of people are fed up with the quantum hype. At the recent RSA Conference, Adi Shamir said:
“If I’m trying to characterize what has been delivered in practice in quantum computing. I must say that the main thing which has been delivered is more promises”
"it won’t be for another 30 or 40 years until they are able to pose a risk."
And in the New Scientist article Paul Hoffman says:
"quantum computers will crack encryption in 2060 “at the very earliest”, and that it is entirely possible that the moment will never come."
The two main reasons I expect a << 50% chance that quantum computers will break RSA-2048 before 2050 are:
- Unless there are drastic improvements in algorithms, quantum computers need 6 orders of magnitude more qubits while keeping or slightly improving the error rate. I don't see that happening before 2050. The technical difficulties are enormous, and I don’t believe there will be funding.
- It does not seem like security agencies around the world believe in CRQCs before 2050; if they did, they would panic a lot more. The existence of a CRQC before 2050 would mean that most security agencies have failed to protect their nations' classified information. The exception would be Sweden, which never trusted public-key crypto (too much structure) and continued to rely solely on symmetric crypto :) NSA expects the transition to QR algorithms for NSS to be complete by 2035, and earlier transitions like Suite B took significantly longer than expected.
Given that US NSS is planning to migrate over the next 12 years, most other use cases can wait several more decades, following the standard for algorithm migration, NIST SP 800-57. DNSSEC has already decided not to transition to PQC now, which makes a lot of sense to me.
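This kind of planning argument is often summarized with Mosca's inequality: if migration time x plus data shelf life y exceeds the time z until a CRQC exists, recorded traffic is exposed. A minimal sketch with placeholder numbers (none of these are predictions):

```python
def at_risk(migration_years: float, shelf_life_years: float,
            years_until_crqc: float) -> bool:
    """Mosca's inequality: data is exposed when x + y > z, i.e. when the
    migration plus the required confidentiality period outlasts the
    window remaining before a CRQC arrives."""
    return migration_years + shelf_life_years > years_until_crqc

# Placeholder numbers, illustrative only:
print(at_risk(12, 10, 27))  # short-lived data: 12 + 10 < 27
print(at_risk(12, 50, 27))  # long-lived data:  12 + 50 > 27
```

The same migration schedule is fine for short-lived data and already too late for long-lived data, which is the point of use-case-specific timelines.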
Cheers,
John
Hello Tony,
I like the idea of betting on a meaningful shorter-term milestone.
1)
Some comments on your suggestion:
i)
Since Shor's algorithm runs in slightly more than quadratic time, without a significant advance in quantum factoring algorithms we should expect (very roughly) a bit more than 40x more resources to go from factoring 330-bit semiprimes to 2048-bit semiprimes, e.g. >5x memory and >8x time.
So the difference is a bit more than 5 bits of security.
That's a lot smaller than the gap between the best classical algorithms for RSA100 and for RSA110, or between RSA300 and 2048 bit RSA.
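As a back-of-the-envelope check on those factors, here is a crude model that takes qubit count as roughly linear and circuit time as roughly quadratic in the modulus size, so the spacetime volume grows a bit more than quadratically; real costs depend on the circuit construction:

```python
import math

n_small, n_large = 330, 2048
size_ratio = n_large / n_small        # ~6.2x larger modulus

# Crude model: spacetime volume (qubits x time) grows roughly with the
# square of the modulus size, matching the "bit more than 40x" estimate.
spacetime_ratio = size_ratio ** 2
extra_bits = math.log2(spacetime_ratio)

print(f"~{spacetime_ratio:.0f}x resources, ~{extra_bits:.1f} bits harder")
```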
ii)
Implementing Shor's algorithm for RSA100 based on what we know today requires fault-tolerant logical qubits (see https://arxiv.org/abs/1902.01448 for an assessment of some alternative approaches that have been proposed, and why we don't think they represent meaningful cryptanalysis benchmarks).
Most likely (in my opinion), by the time a quantum computer factors RSA100, the major advances/breakthroughs needed to break 2048-bit RSA with a quantum computer will have been achieved. So RSA100 is not a good mid-way point.
[I might be wrong, of course.... maybe eventually available technology and fault-tolerant error correction will allow us to cram just enough logical qubits on available chips and in available cryostats to break RSA100, but will require significant advances (e.g. entangling across different chips, or significant miniaturization) to break 2048-bit RSA.]
iii)
Checking for no cheating will be non-trivial. If one produces the factors of a 2048-bit RSA challenge, we'll be pretty confident they didn't factor it classically and then use the quantum computer to pretend to factor it and just output the (already known) answer; i.e., there are fewer reasons to need to test or validate the implementation.
But factoring 100-digit numbers is easy to do classically, so one would have to be an expert in quantum computing, with intimate access to the code and hardware, to check whether the quantum computer actually did the factorization versus just ran a noisy instance of Shor's algorithm on a machine rigged to output a string that, when post-processed, leads to a factorization.
2)
In our threat timeline survey, we asked experts for what would be a good "Next experimental milestone to demonstrate the feasibility of a cryptographically-relevant quantum computer".
Most answers were related to fault-tolerance.
We tried "scalable fault-tolerant logical qubit" in a previous year, but it's hard to converge on a precise notion of scalable.
Further complicating matters, some platforms may be much easier to scale than others once a fault-tolerant logical qubit is achieved.
3)
Let me reiterate the point that it would be helpful to make more precise statements about the risk.
And a 50% chance isn't, in most cases, the right number when discussing what date to assume for planning purposes; e.g., a 5% chance is closer to what might be considered a borderline risk tolerance.
So if wanting to back up one's opinion with a bet, please offer a date with 19:1 odds.
Also, if trying to convey a bottom-line risk (which betting odds can capture), then vague qualifiers like "Unless something unexpected happens" or "Unless there are drastic improvements in algorithms" don't realistically allow that.
Deriving a bottom-line risk estimate would require fine-grained estimates like a 2% chance of a 10000x improvement, a 10% chance of a 1000x improvement, a 20% chance of a 100x improvement, a 50% chance of a 10x improvement, etc., and estimates of the likelihood of the hardware advancing sufficiently in each of those scenarios (and that's a major simplification of the overall challenge of trying to estimate the threat timeline).
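Putting the two halves of that sentence together, the bottom-line risk is a probability-weighted sum over the scenarios. The improvement-scenario probabilities below are the ones quoted above (taken as disjoint); the hardware conditionals are invented purely for illustration, and the implied probability of 19:1 odds is included as a reference point:

```python
# Disjoint algorithmic-improvement scenarios, each paired with an
# invented, illustrative chance that hardware also advances enough:
scenarios = [
    # (improvement factor, P(improvement), P(hardware suffices | improvement))
    (10_000, 0.02, 0.50),
    (1_000,  0.10, 0.25),
    (100,    0.20, 0.10),
    (10,     0.50, 0.02),
]

bottom_line_risk = sum(p_alg * p_hw for _, p_alg, p_hw in scenarios)
odds_19_to_1 = 1 / (1 + 19)  # implied probability of a bet at 19:1 odds

print(f"bottom-line risk: {bottom_line_risk:.3f} "
      f"(vs {odds_19_to_1:.0%} at 19:1 odds)")
```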
Best,
Michele
What about elliptic curve discrete logarithms, won't the quantum-classical break-even be achieved before RSA?
Not all optimizations developed in this paper are directly applicable to arithmetic in elliptic curve groups. It is an interesting topic for future research to study to what extent the optimizations developed in this paper may be adapted to optimize such arithmetic operations (see Section 4.5). This paper should not be perceived to indicate that the RSA integer factoring problem and the DLP in finite fields is in itself less complex than the DLP in elliptic curve groups on quantum computers. The feasibility of optimizing the latter problem must first be properly studied.
Many of the optimization techniques that we use in this paper generalize to other contexts where arithmetic is performed. In particular, consider the Toffoli count for computing discrete logarithms over elliptic curves reported in [74]. It can likely be improved substantially by using windowed arithmetic. On the other hand, because of the need to compute modular inverses, it is not clear if the coset representation of modular integers is applicable. Which of our optimizations can be ported over, and which ones cannot? How much of an improvement would result? These are interesting questions for future research work.