Dear PQC forum,
Following NIST's request for comments, I would like to raise serious concerns regarding the PQC process. I have specifically chosen to write under a pseudonym because I want to discuss the process openly while putting biases aside.
NIST's PQC team has an important task. The team is working on a tight schedule to complete it, and we all agree that they are doing their best. Unfortunately, although many researchers are working hard to scrutinize the proposed candidates, the circumstances are such that post-quantum cryptography is not yet stable. Despite that, the process's deadlines are pushing us toward standardization very soon.
We need to have an urgent discussion, not about a specific flaw or misconduct, but about the state of the whole process and how it should continue. The finish line is close, and crossing it in the current state might have disastrous consequences. We must put security first, even if that means prolonging the process.
As the process is coming to its end, our possibilities have narrowed in two ways:
1. We are left with very few signature schemes – essentially just Dilithium and Falcon.
2. We have effectively become reliant on lattice schemes.
Each of these is problematic on its own, but the reliance on lattices compounds the shortage of signature schemes, making it much worse.
Over the last few months, we have seen many strong attacks published against signature schemes in the process [4, 7, 9, 14], as well as more general concerns such as [12]. Seeing such a trend so late in the process is very worrying, and I see no reason for it to stop just because the process itself is ending. On the contrary, I fear that the worst is still ahead of us and that strong attacks, specifically relevant to lattices, should be expected in the near future.
NIST recognized the problems caused by the lack of diversity in the signature category and suggested two countermeasures – standardizing SPHINCS+ in the third round and opening another call for proposals in the fourth round. That is a healthy and welcome discussion, but sadly not a sufficient one – neither Dilithium nor Falcon should be announced as a winner at the current deadline of January 2022.
First, I am concerned that Dilithium may be more vulnerable to future attacks than is currently recognized. Due to the especially large gap in its lattice, it may be open to an additional set of attacks that exploit such gaps in algebraic lattices. While the effect of gaps on BKZ is considered in the core-SVP analysis, results such as [6] demonstrate that there are classes of attacks that can be more effective in this setting. These attacks are still being improved upon. For instance, Bernstein et al. recently generalized these methods in [3], and the exact implications of those results are still far from clear. Until we have a more stable understanding of this topic, I believe we should err on the side of caution.
Second, I believe that as far as signatures are concerned, we can take (a bit) more time. Post-quantum KEMs answer a fairly urgent need – protecting against "record now, exploit later" attacks. Indeed, some PKI clients are hard to update, but that is exactly what makes deprecating bad crypto from them twice as hard. Look at SHA-1, for example – it took Microsoft almost 10 years to complete its deprecation [13]. It is good that we have started preparing for the migration early, but making changes to the PKI impulsively could lead to severe problems with enormous recovery costs.
An even more concerning underlying problem is the fixation on lattices. Not only are lattices the building block of most finalists, but NIST has also declared its intention to choose lattice-based proposals for standardization in both categories. I don't think we should feel comfortable enough with lattices to support that choice.
Improvements to lattice reduction techniques are still being discovered. This April, a new quantum lattice sieving algorithm based on quantum random walks was published [5]. The algorithm reduces the core-SVP dimension-to-bits conversion factor from 0.265 to 0.257, and as such it calls for a reevaluation of the security of all lattice-based schemes.
The responses to this paper have been underwhelming so far. Not one candidate has updated its security claims accordingly. To advance and renew the discussion, I have published a simpler algorithm that achieves a similar (if slightly less impressive) improvement. The advantage of this approach is that it uses only the tools already present in Laarhoven's original 2^{0.265d} LSF-based algorithm [11] and can therefore be verified more easily.
The main idea of this algorithm is as follows: In LSF-based nearest-neighbor search algorithms, one separates the list of vectors into buckets, where each vector may be contained in more than one bucket. One then searches for close pairs of vectors that lie in the same bucket. When searching for neighbors of a vector v, the original quantum algorithm iterates classically over the buckets containing v, producing a list of potential neighbors. It is possible to skip this list-generation phase by using Grover search with a circuit that samples a vector from the potential neighbors. This essentially applies the square-root speedup to another part of the process, solving an exact SVP instance of dimension d in time 2^{0.2571d+o(d)} (only slightly worse than the 2^{0.257d+o(d)} claimed by [5]). For more details about this algorithm:
https://eprint.iacr.org/2021/1295.
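For readers who prefer code to prose, here is a small illustrative sketch in Python of the bucketing structure described above. This is not the paper's implementation: the filter construction is a simplified stand-in for LSF, the Grover step is modeled only as a query-count estimate (square root of the number of candidates) rather than as an actual quantum circuit, and all names are my own.

import math
import random
from collections import defaultdict

def random_unit_vector(d):
    v = [random.gauss(0.0, 1.0) for _ in range(d)]
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

class ToyLSFBuckets:
    """Toy stand-in for an LSF bucketing structure (illustration only)."""

    def __init__(self, d, num_filters, threshold):
        # Each filter defines one bucket; a vector belongs to every bucket
        # whose filter it correlates with strongly enough (vectors are
        # assumed normalized for this toy threshold test).
        self.filters = [random_unit_vector(d) for _ in range(num_filters)]
        self.threshold = threshold
        self.buckets = defaultdict(list)

    def relevant_buckets(self, v):
        return [i for i, f in enumerate(self.filters)
                if sum(a * b for a, b in zip(f, v)) >= self.threshold]

    def insert(self, v):
        # A vector may land in several buckets.
        for i in self.relevant_buckets(v):
            self.buckets[i].append(v)

    def classical_query_cost(self, v):
        # The classical pass (also used by the original quantum algorithm)
        # walks over all buckets containing v, producing the full list of
        # potential neighbors.
        return sum(len(self.buckets[i]) for i in self.relevant_buckets(v))

    def grover_query_cost_estimate(self, v):
        # If a circuit can sample uniformly from the union of v's buckets,
        # Grover search over those N candidates costs on the order of
        # sqrt(N) queries instead of N - this is the step to which the
        # extra square-root speedup is applied.
        n = self.classical_query_cost(v)
        return math.isqrt(n) if n else 0

The exponent in the paper (2^{0.2571d+o(d)}) comes from plugging this square-root saving into Laarhoven's LSF parameterization; the sketch only shows which part of the computation the speedup targets.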
Most lattice-based finalists have a sieving dimension of 350-400 after accounting for "dimensions for free" [8]. At these dimensions, the algorithm reduces the core-SVP security claim by a further ~3 bits. Given that all lattice-based finalists already have a core-SVP strength well below the original target of 128 bits, even a small reduction should not be ignored.
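To make the arithmetic behind the "~3 bits" figure explicit, here is a tiny back-of-the-envelope script using the conversion factors quoted above and the 350-400 dimension range (the exact sieving dimension varies per candidate, so this is only indicative):

# Core-SVP estimates bits of security as (conversion factor) * (sieving dimension).
for d in (350, 375, 400):
    old_bits = 0.265 * d       # Laarhoven's LSF-based quantum sieve [11]
    walk_bits = 0.257 * d      # quantum random walk sieve [5]
    grover_bits = 0.2571 * d   # the Grover-based variant described above
    print(f"d = {d}: 0.265d = {old_bits:.1f}, 0.257d = {walk_bits:.1f}, "
          f"0.2571d = {grover_bits:.1f}, drop = {old_bits - grover_bits:.1f} bits")

For d in this range the drop is about 2.8-3.2 bits, i.e. roughly 3 bits.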
Sub-exponential factors provide some additional security, which may allow a small margin of error. However, no exact guideline has been set for the required core-SVP strength, so we do not know whether this new drop in security should be a concern. In fact, until the beginning of round 3 it was not even clear whether the sub-exponential factors would be considered at all. Since core-SVP is a good measure of the strength of lattice-based schemes, there should be an open discussion about its application to the candidates, and NIST should define clear comparison criteria accordingly.
Most concerning is the fact that we do not know how much of a security margin the sub-exponential factors actually give us. There have been attempts to quantify their effect; for instance, [1] estimates that the sub-exponential factors increase Kyber's security by ~30 bits. However, since most recent research has focused on minimizing the exponential costs, there have been few attempts to reduce these sub-exponential factors.
It is very likely that further improvements will be found, whether by using new techniques to reduce the sub-exponential factors, by further formalizing [3] or by exploring other ideas to utilize the algebraic structure. Moreover, combining the presented sieving algorithm with other sieving methods such as [2, 5, 10] could yield even more interesting results. Lattice-based cryptography research is far from exhausted, and I expect we will see many more exciting discoveries, possibly affecting the security claims of the candidates as we understand them today.
To counter these claims, one might argue that we can publish a PQC standard right now and advise increasing key sizes later if necessary.
That would be a poor decision – adapting schemes whose security level has only recently been reduced is a bad policy, for reasons we are all familiar with. Such an ad-hoc solution may well lead us to choose a scheme that at best has poor performance and at worst is unsafe.
At the third PQC Standardization Conference, NIST said that they are happy with the existing options, but I don't think they should be just yet.
Especially when it comes to signatures, I get the sense that we are not choosing the best of several good options, but merely the survivors. The AES process succeeded because it had many good options, which made it clear that the one selected was the best.
Don't get me wrong – the team is doing a terrific job, facing a tough and important challenge. It is tempting to complete it according to the committed timetable, but in doing so they would fail to meet their mandate.
NIST is correct to take an active approach, but it is not enough. They should demand more from the cryptographic community, focusing us on the current gaps and unknowns and requesting assistance where needed. The success of this process is in the mutual interest of us all (except CNRS, I guess...), regardless of which scheme is chosen. This will require prolonging the process, but the delay is necessary.
Such a delay, after five years of hard work and anticipation, should not be taken lightly – I acknowledge the weariness of both NIST's PQC team and the participants. Unfortunately, it is the only valid option moving forward. The current proposals, especially in the signature category, are simply not good enough. A decision to standardize one of the lattice-based schemes (or even a compromise such as SPHINCS+) might end up having irreversible consequences that will affect us all.
Sincerely,
Max Heiser
[1] Martin R. Albrecht, Vlad Gheorghiu, Eamonn W. Postlethwaite, John M. Schanck: Quantum speedups for lattice sieves are tenuous at best. https://eprint.iacr.org/2019/1161
[6] Ronald Cramer, Léo Ducas, Chris Peikert, Oded Regev: Recovering Short Generators of Principal Ideals in Cyclotomic Rings. https://eprint.iacr.org/2015/313
[10] Elena Kirshanova, Erik Martensson, Eamonn W. Postlethwaite, Subhayan Roy Moulik: Quantum Algorithms for the Approximate k-List Problem and their Application to Lattice Sieving. https://eprint.iacr.org/2019/1016