https://eprint.iacr.org/2021/578 claims, regarding "recently proposed
instances of the Picnic signature scheme", that "2 out of 3 new
instances do not achieve their claimed security level".
The paper's comparison at the three security levels is to 2^128,
2^192, and 2^256 bit operations. Comparing instead to 2^143, 2^207,
and 2^272, the NIST gate counts for AES-128, AES-192, and AES-256 key
search that define security categories 1, 3, and 5, would strengthen
the claim to "3 out of 3". (The confusion here would have been avoided
by the recommendation in
https://blog.cr.yp.to/20161030-pqnist.html to
avoid discretizing security levels in the first place.)
However, I would be opposed to NISTPQC taking this paper's claim as a
reason to penalize Picnic. For example, at the middle level, the attack
using 2^188 bit operations is bottlenecked by continual random access to
2^164 bits of memory. The cost of randomly accessing a bit within N bits
of memory is approximately sqrt(N)/2^5 bit operations (see Section 6.6
of https://ntruprime.cr.yp.to/nist/ntruprime-20201007.pdf), in this case
2^77 bit operations per access, multiplied by at least 2^164 accesses;
overall at least 2^241 bit operations, i.e., at least 2^34 times more
than the 2^207 target. This gap is too big to be closed by a
switch from a two-dimensional memory structure to an imaginary cubical
memory structure, never mind the question of whether the expected
lifetime of NISTPQC is long enough to consider such memory volumes.
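For concreteness, here's a small Python sketch of this arithmetic. The
sqrt(N)/2^5 per-access cost is the two-dimensional model from Section
6.6 of the ntruprime documentation cited above; the cube-root variant
is my own rough analogue for the imaginary cubical memory, not a figure
from that document:

    from math import log2

    mem_bits = 2.0**164   # memory randomly accessed by the attack
    accesses = 2.0**164   # lower bound on the number of random accesses
    target = 207          # log2 of the middle-level security target

    # Two-dimensional model: one random access to N bits of memory
    # costs approximately sqrt(N)/2^5 bit operations.
    per_access_2d = mem_bits**0.5 / 2**5
    total_2d = per_access_2d * accesses
    print('2D: 2^%.1f per access, 2^%.1f total, 2^%.1f over target'
          % (log2(per_access_2d), log2(total_2d), log2(total_2d) - target))

    # Assumed cube-root analogue for a cubical memory structure,
    # keeping the same 2^5 scaling factor.
    per_access_3d = mem_bits**(1/3) / 2**5
    total_3d = per_access_3d * accesses
    print('3D: 2^%.1f per access, 2^%.1f total, 2^%.1f over target'
          % (log2(per_access_3d), log2(total_3d), log2(total_3d) - target))

This prints the 2^34 gap for the two-dimensional model; even under the
cubical model the total is around 2^214, still above the 2^207 target.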
To be clear, I'm concerned about the number of new pieces being pulled
together in Picnic, including the possibility that Picnic is vulnerable
to further improvements in this line of papers. But we've seen larger
recent improvements in lattice attacks, including improvements that
(based on the best estimates available, with many question marks, which
is also worrisome) reduced the security levels of all lattice
submissions, and still NIST hasn't penalized those submissions.
---Dan