Lattice Sieving Improvement and Concerns


Max Heiser

Oct 8, 2021, 3:43:16 AM10/8/21
to pqc-...@list.nist.gov
Dear PQC forum,


Following NIST's request for comments, I would like to raise serious concerns regarding the PQC process. I am deliberately writing under a pseudonym because I want to speak about the process openly, with biases set aside.

NIST's PQC team has an important task. The team is working on a tight schedule to complete it, and we all agree that they are doing their best. Unfortunately, although many researchers are working hard to scrutinize the proposed candidates, the circumstances are such that post-quantum cryptography is not yet stable. Despite that, the deadlines of the process force us towards standardization soon.
We need to have an urgent discussion, not about a specific flaw or misconduct, but about the state of the whole process and how it should continue. The finish line is close, and crossing it in the current state might have disastrous consequences. We must put security first, even if that requires prolonging the process.

As the process is coming to its end, our possibilities have narrowed in two ways:
1. We are left with few signature schemes – Dilithium and Falcon.
2. We have effectively become reliant on lattice schemes.
Each one is problematic on its own, but the latter adversely affects the former, making it much worse.


Over the last few months, we have seen many strong attacks published against signature schemes in the process [4, 7, 9, 14], as well as general concerns such as [12]. Seeing such a trend so late in the process is very worrying, and I see no reason for it to stop just because the process itself is ending. On the contrary, I fear that the worst is still ahead of us, and that strong attacks, specifically relevant to lattices, are to be expected in the near future.

NIST recognized the problems caused by the lack of diversity in the signature category and suggested two countermeasures – standardizing SPHINCS+ in the third round and opening the fourth round with another call for proposals. That is a healthy and welcome discussion, but sadly not a sufficient one – neither Dilithium nor Falcon should be announced as the winner at the current deadline of January 2022.

First, I am concerned that Dilithium may be more vulnerable to future attacks than is currently recognized. Due to the especially large gap in its lattice, it may be open to an additional set of attacks which exploit such gaps in algebraic lattices. While the effect of gaps on BKZ is considered in the core-SVP analysis, results such as [6] demonstrate that there are classes of attacks that can be more effective in this case. These attacks are still being improved upon. For instance, Bernstein et al. recently generalized these methods in [3], and the exact implications of these results are still far from clear. Until we have a more stable understanding of this topic, I believe we should choose to err on the side of caution.

Second, I believe that as far as signatures are concerned, we can take (a bit) more time. Post-quantum KEMs answer a fairly urgent need – protecting against "record now, exploit later" attacks – while signatures do not face the same urgency. Indeed, some PKI clients are hard to update, but that is exactly what makes deprecating bad crypto from them twice as hard. Look at SHA1, for example – it took Microsoft almost 10 years to complete its deprecation [13]. It's good that we have started preparing for migration early, but making changes to the PKI impulsively could lead to severe problems with enormous recovery costs.


An even more concerning underlying problem is the fixation on lattices. Not only are lattices the building block of most finalists, but NIST has also declared its intention to choose lattice-based proposals for standardization in both categories. I don't think we should feel comfortable enough with lattices to support that choice.

Improvements to lattice reduction techniques are still being discovered. This April, a new quantum lattice sieving algorithm based on quantum random walks was published [5]. It reduces the core-SVP dimension-to-bits conversion factor from 0.265 to 0.257, and as such it should prompt a reevaluation of the security of all lattice-based cryptographic schemes.

The responses to this paper have been underwhelming so far. Not one candidate has updated its security claims accordingly. To advance and renew the discussion, I have published a simpler algorithm which achieves a similar (if slightly less impressive) improvement. The advantage of this approach is that it uses only the tools already present in Laarhoven's original 2^{0.265d} LSF-based algorithm [11], and so it can be more easily verified.

The main idea of the algorithm is as follows. In LSF-based nearest-neighbor search algorithms, one separates the list of vectors into buckets, where each vector may be contained in more than one bucket. One then searches for close pairs of vectors that lie in the same bucket. When searching for neighbors of a vector v, the original quantum algorithm iterates classically over the buckets containing v, producing a list of potential neighbors. It is possible to skip this list-generation phase by using Grover search with a circuit that samples a vector from the potential neighbors. This essentially applies the square-root speedup to another part of the process, solving an exact SVP instance of dimension d in time 2^{0.2571d+o(d)} (only slightly worse than the 2^{0.257d+o(d)} claimed by [5]). For more details, see https://eprint.iacr.org/2021/1295.
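To make the bucketing step concrete, here is a heavily simplified classical sketch in Python (toy parameters and helper names of my own invention; the real algorithm in [11] uses structured filters rather than independent random centers, and the quantum variant replaces the candidate iteration below with a Grover search over a sampling circuit):

import numpy as np

rng = np.random.default_rng(1)

d = 24            # toy dimension; real sieves run at d ~ 350-400
n_vectors = 2000  # size of the sieve list
n_filters = 64    # number of bucket centers
alpha = 0.4       # filtering threshold on the inner product

def random_unit_vectors(k, d):
    # Toy stand-ins for sieve vectors: random points on the unit sphere.
    v = rng.standard_normal((k, d))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

vectors = random_unit_vectors(n_vectors, d)
filters = random_unit_vectors(n_filters, d)

# Bucketing: vector i lands in every bucket whose center is alpha-close,
# so a vector may appear in several buckets, as described above.
buckets = [[] for _ in range(n_filters)]
for i, v in enumerate(vectors):
    for j in np.flatnonzero(filters @ v >= alpha):
        buckets[j].append(i)

def neighbors_of(i, threshold=0.5):
    # Classically iterate over the buckets containing vectors[i] and
    # collect candidate neighbors; the quantum algorithm sketched above
    # instead Grover-searches over a circuit sampling from these candidates.
    v = vectors[i]
    candidates = set()
    for j in np.flatnonzero(filters @ v >= alpha):
        candidates.update(buckets[j])
    candidates.discard(i)
    return [c for c in candidates if vectors[c] @ v >= threshold]

print(neighbors_of(0))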

Most lattice-based finalists have a sieving dimension of roughly 350-400 after accounting for "dimensions for free" [8]. At those dimensions, this algorithm reduces the core-SVP security claim by a further ~3 bits. Given that all lattice-based finalists already have a core-SVP strength well below the original target of 128 bits, even a small reduction should not be ignored.
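For concreteness, the ~3-bit figure is just the difference between the exponent constants times the sieving dimension (the dimensions below are my round numbers, not official per-scheme figures):

# Bit-security loss from lowering the core-SVP exponent constant
# from 0.265 (quantum LSF sieve [11]) to 0.2571 (the algorithm above).
old_c, new_c = 0.265, 0.2571
for beta in (350, 375, 400):        # representative sieving dimensions
    delta = (old_c - new_c) * beta  # reduction in log2(attack cost)
    print(f"dim {beta}: ~{delta:.2f} bits")
# Prints ~2.77, ~2.96, ~3.16 -- i.e. roughly 3 bits across the range.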

Sub-exponential factors provide us with some additional security, which may allow a small margin of error. No exact guideline has been set for the required core-SVP strength, so we do not know whether this new drop in security should be a concern or not. In fact, until the beginning of round 3 it was not clear whether the sub-exponential factors would be considered at all. As core-SVP is a good measure of the strength of lattice-based schemes, there should be an open discussion about its application to the candidates, and NIST should define clear comparison criteria accordingly.

Most concerning is the fact that we don't know what security margin we can expect to gain from the sub-exponential factors. There have been attempts to quantify their effect; for instance, [1] estimates that the sub-exponential factors increase Kyber's security by ~30 bits. However, since most of the recent research has focused on minimizing the exponential costs, there have been few attempts to reduce these sub-exponential factors.

It is very likely that further improvements will be found, whether by new techniques to reduce the sub-exponential factors, by further formalizing [3], or by exploring other ideas that exploit the algebraic structure. Moreover, combining the presented sieving algorithm with other sieving methods such as [2, 5, 10] could yield even more interesting results. Lattice-based cryptography research is far from exhausted, and I expect we will see many more exciting discoveries, possibly affecting the security claims of the candidates as we understand them today.

To counter these claims, one might say that we can publish a PQC standard right now and advise increasing key sizes if necessary.
That would be a poor decision – adopting schemes whose security level was recently reduced is a bad policy, for reasons we are all familiar with. Such an ad-hoc solution may well lead us to choose a scheme which at best has poor performance, and at worst will be unsafe.
 

NIST said in the third NIST PQC conference that they are happy with the existing options, but I don't think they should be just yet.

Especially when it comes to signatures, I get the sense that we are not choosing the best out of several good options, but merely survivors. The AES process succeeded due to having many good options, making it clear that the one selected was the best.

Don't get me wrong – the team is doing a terrific job, facing a tough and important challenge. It is tempting to complete it according to the committed timetable, but in doing so they would fail to meet their mandate.

NIST is correct to take an active approach, but it is not enough. NIST should demand more from the cryptographic community, focusing us on the current gaps and unknowns and requesting assistance where needed. The success of this process is a mutual interest of us all (except CNRS, I guess...), regardless of which scheme is chosen. This requires prolonging the process, but the delay is necessary.

Such a delay, after 5 years of hard work and anticipation, should not be taken lightly – I acknowledge the weariness of both NIST's PQC team and the participants. Unfortunately, it is the only valid option moving forward. The current proposals, especially in the signature category, are simply not good enough. A decision to standardize one of the lattice-based schemes (or even a compromise, such as SPHINCS+) might end up having irreversible consequences that will affect us all.


Sincerely,
Max Heiser


[1] Martin R. Albrecht, Vlad Gheorghiu, Eamonn W. Postlethwaite, John M. Schanck: Estimating quantum speedups for lattice sieves. https://eprint.iacr.org/2019/1161
[2] Anja Becker, Nicolas Gama, Antoine Joux: Solving shortest and closest vector problems: The decomposition approach. https://eprint.iacr.org/2013/685
[3] Daniel J. Bernstein: S-unit attacks. https://groups.google.com/a/list.nist.gov/g/pqc-forum/c/3mVeyEfYnUY
[4] Ward Beullens: Improved Cryptanalysis of UOV and Rainbow. https://eprint.iacr.org/2020/1343
[5] André Chailloux, Johanna Loyer: Lattice sieving via quantum random walks. https://eprint.iacr.org/2021/570
[6] Ronald Cramer, Léo Ducas, Chris Peikert, Oded Regev: Recovering Short Generators of Principal Ideals in Cyclotomic Rings. https://eprint.iacr.org/2015/313
[7] Itai Dinur: Cryptanalytic Applications of the Polynomial Method for Solving Multivariate Equation Systems over GF(2). https://eprint.iacr.org/2021/578
[8] Léo Ducas: Shortest Vector from Lattice Sieving: a Few Dimensions for Free. https://eprint.iacr.org/2017/999
[9] Emre Karabulut, Aydin Aysu: Falcon Down: Breaking Falcon Post-Quantum Signature Scheme through Side-Channel Attacks. https://eprint.iacr.org/2021/772
[10] Elena Kirshanova, Erik Martensson, Eamonn W. Postlethwaite, Subhayan Roy Moulik: Quantum Algorithms for the Approximate k-List Problem and their Application to Lattice Sieving. https://eprint.iacr.org/2019/1016
[11] Thijs Laarhoven: Search problems in cryptography: From fingerprinting to lattice sieving. https://pure.tue.nl/ws/files/14673128/20160216_Laarhoven.pdf
[12] Joost Renes: Side-channel-assisted key recovery attacks on IND-CCA transformations. https://groups.google.com/a/list.nist.gov/g/pqc-forum/c/iVbJkCytoog
[13] Ryan Sleevi: A History of Hard Choices. https://medium.com/@sleevi_/a-history-of-hard-choices-c1e1cc9bb089
[14] Chengdong Tao, Albrecht Petzoldt, Jintai Ding: Improved Key Recovery of the HFEv- Signature Scheme. https://eprint.iacr.org/2020/1424

Vadim Lyubashevsky

Oct 8, 2021, 5:09:17 AM10/8/21
to Max Heiser, pqc-...@list.nist.gov
Dear all,

> To counter these claims, one might say that we can publish a PQC standard right now and advise increasing key sizes if necessary.
> That would be a poor decision –

I share the sense of uncertainty in "Max's" email about PQ
algorithms, but I am not as optimistic that the state of the art will
ever settle down enough that one will be able to say "now we know the
exact security of the scheme". With constants like .2571 in the
exponent, it seems natural that adjusting the second or third digit
will happen once in a while. And this especially holds true for
quantum algorithms which haven't really been extensively studied for
most PQ assumptions -- lattices have arguably been the most studied
one here and so it is not surprising that there are slight
improvements on this front. (The presumed quantum security of lattice
schemes is currently higher than it needs to be for the NIST security
levels, due to the fact that it's pretty close to the classical one,
so that could be the reason that a slightly better quantum attack did
not get as much attention).

I also completely agree that setting parameters low and adjusting up
if necessary is a bad decision. We should do it the other way -- set the
parameters high, and adjust down if at some point it's necessary and
believed to be safe. Rather than NIST delaying until things have
completely settled down in cryptanalysis (which might be never as
attacks could be approaching some unknown asymptote), it should make
level 3 the minimum security level. In the submissions that I was
involved in, we were targeting somewhere between what is currently
Level 1 and 3 for our recommended parameters in the design stage
before really looking at what NIST wanted as their security levels,
because this was what we were comfortable recommending given the
unknowns. Speaking for myself, I think that the level 1 parameter
requirement is a mistake that has led to a race to the bottom in
trying to get shorter schemes at the expense of reducing the margin of
safety against improved algorithmic attacks.

Best,
Vadim

D. J. Bernstein

Oct 8, 2021, 11:37:19 PM10/8/21
to pqc-...@list.nist.gov
'Max Heiser' via pqc-forum writes:
> Despite that, the deadlines of the process force us towards
> standardization soon.

No, they don't. If you look back before the past year of advances in
lattice attacks then you'll find Dustin Moody's pqc-forum message dated
28 Sep 2020 21:15:08 +0000 already spelling out one of the possible
paths for structured-lattice decisions moving to a longer timeline.
("Longer" is his word.) NIST subsequently committed to having a round 4
in any case. The official rules don't require NIST to select _any_
standard, never mind requiring it to happen at a particular moment.

The real question is why NIST is rushing to standardize something where
there are so many recent attack advances; see, e.g., the 12 papers
between 2018 and 2020 cited in https://cr.yp.to/talks.html#2021.06.08
slides 4--6. It's troubling to see that none of the official NIST
reports even bothered to include a _list_ of the known lattice attacks,
never mind an analysis, never mind a "thorough analysis" carried out "in
a manner that is open and transparent to the public". The NIST reports
in the AES competition were much more careful.

---Dan

P.S. As the target of neverending ad-hominem attacks evidently designed
to replace rational discussions with emotional reactions, I fully
understand at least one motivation for using pseudonyms. However, the
transparency rules in the Dual EC post-mortem

https://www.nist.gov/system/files/documents/2017/05/09/VCAT-Report-on-NIST-Cryptographic-Standards-and-Guidelines-Process.pdf

include tracking "by whom" issues are raised; allowing pseudonymous
input makes this impossible.

Vadim Lyubashevsky

Oct 19, 2021, 9:09:21 AM10/19/21
to Max Heiser, pqc-...@list.nist.gov
Dear all,

On Fri, Oct 8, 2021 at 9:43 AM 'Max Heiser' via pqc-forum
<pqc-...@list.nist.gov> wrote:
>
> As the process is coming to its end, our possibilities have narrowed in two ways:
> 1. We are left with few signature schemes – Dilithium and Falcon.
> 2. We have effectively become reliant on lattice schemes.
> Each one is problematic on its own, but the latter adversely affects the former, making it much worse.
>
> First, I am concerned that Dilithium may be more vulnerable to future attacks than is currently recognized. Due to the especially large gap in its lattice, it may be open to an additional set of attacks which exploit such gaps in algebraic lattices. While the effect of gaps on BKZ is considered in the core-SVP analysis, results such as [6] demonstrate that there are classes of attacks that can be more effective in this case.

The attack of [6] does not utilize a "gap" -- it is an attack on SVP
in ideal lattices (i.e. dimension 1 modules over polynomial rings)
which have no gap by virtue of them being ideal lattices.
Perhaps you're referring to attacks against overstretched NTRU
(https://eprint.iacr.org/2021/999 being the latest in the series),
where a "gap" between the nth and (n+1)st shortest vector in the NTRU
lattice creates a special geometry which is exploited by lattice
reduction algorithms. But this attack does not affect Falcon or
Dilithium.

Even though Falcon is based on NTRU, the gap in Falcon is not large
enough for the attack to work. Put another way, if one believes that
the existence of this attack weakens Falcon, then one should have even
stronger objections to NTRU-HRSS and NTRUPrime KEMs, where this gap is
larger. And there are no known attacks *at all* utilizing the "gap"
against Ring/Module-LWE, which is what would affect Dilithium (whose
lattices are dimension 5+ modules, and so have a very different
geometry). A better algorithm for any version of Ring/Module-LWE
(where the noise is properly chosen) vs plain LWE is probably the
second biggest open problem in lattice cryptography (after just
trying to attack regular lattices), and there has not been any
progress on this front. There are many other lattice primitives with
*much* larger gaps (e.g. FHE, in which the modulus gets as high as
2^1000, while it's 2^23 in Dilithium); so this is certainly an
instance of a problem that has been on people's minds for a while.

> NIST said in the third NIST PQC conference that they are happy with the existing options, but I don't think they should be just yet.
>
> Especially when it comes to signatures, I get the sense that we are not choosing the best out of several good options, but merely survivors. The AES process succeeded due to having many good options, making it clear that the one selected was the best.
> ...
> A decision to standardize one of the lattice-based schemes (or even a compromise, such as SPHINCS+) might end up having irreversible
> consequences that will affect us all.

I don't agree at all -- or at the very least, I don't see the
uniqueness of the situation with respect to signatures. The situation
with the signature finalists seems to be the same as the one with
KEMs. On the KEM side, we'll have some algebraic lattice scheme(s)
standardized as the primary option and probably McEliece as a backup
because the latter is based on a somewhat different assumption. On
the signature side, we'll also have some algebraic lattice scheme(s)
as primary options, and SPHINCS+ as very good insurance. And again,
just like for KEMs, there have been no attacks (beyond the few bits
shaved off due to general lattice attacks) against Falcon or
Dilithium. As I described during a talk at the NIST Round 3
conference, a Dilithium & Falcon & SPHINCS+
standards portfolio is a very attractive one, as each scheme carries
its own benefits.

One can't compare the standardization effort for public-key primitives
with the one for AES. There are a whole lot of different ways to
construct block ciphers which will have 256-bit outputs and have
256-bit security. In public key cryptography, on the other hand,
there are very few ways to construct primitives, and they end up
having rather different characteristics. We should feel lucky that
public key cryptography (beyond Merkle-tree signatures) exists at all.

As I wrote before, I agree (does anyone disagree, actually?) that
lattice attacks can get better and we are still in the stage where we
are not sure how everything after the Core-SVP numbers affects actual
security. For Kyber, we approximated this uncertainty to within about
+/- 15 bits (Dilithium is similar), and this could be important for
level 1 security where the margin of safety is not very large to begin
with. Thus if NIST wants to play it safer, it could consider only
standardizing levels 3 and above for now.

Best,
Vadim




