High‑level comments on additional signature schemes


John Mattsson

Feb 19, 2026, 11:34:41 AM
to pqc-...@list.nist.gov

Hi,


I analyzed the Round 2 portfolio in the context of the current and planned set of stateless PQC signature schemes, namely SLH‑DSA, additional SLH‑DSA parameter sets, ML‑DSA, and FN‑DSA. The main issue with the current set of algorithms is their large sizes.




I would group the Round 2 candidate algorithms into four broad categories, reflecting their expected role relative to the existing and planned portfolio:

  • Lattice: Given that both ML‑DSA and FN‑DSA are lattice‑based, I do not expect NIST to standardize HAWK. I believe NIST should take the time necessary to ensure that FN‑DSA is as robust and well‑optimized as possible across a wide range of use cases. This includes specifying fixed‑point signing.

  • Code, Symmetric, MPC-in-the-head: From a performance and trust perspective, these schemes generally fall between SLH‑DSA and ML‑DSA. Their trust level is high but lower than that of SLH‑DSA. Concerns regarding the use of standalone lattice-based schemes prior to sufficiently hardened implementations can be mitigated by SLH-DSA or hybrids, but would not be addressed by less mature code-based, symmetric, or MPC-in-the-head schemes. Although the standardization of HQC‑KEM was important, it is difficult to identify a really compelling use case for code‑based, symmetric, or MPC‑in‑the‑head signature schemes, particularly given NIST’s plan to standardize additional SLH‑DSA parameter sets. I therefore suggest that NIST explicitly solicit input from the community to identify concrete and practical use cases for these schemes.

  • Multivariate: Trust in multivariate schemes is lower than in lattice-based schemes. For protocols that transmit both public keys and signatures, the combined size is comparable to, or larger than, that of FN-DSA. In contrast, for protocols where public keys are referenced rather than transmitted by value, multivariate schemes offer significant size advantages. If NIST, after the final round, has sufficient confidence in the security of multivariate schemes, I believe they should be standardized.

  • Isogeny: Trust is lower than in lattice-based schemes. However, contrary to a common perception, the attack on SIKE did not invalidate isogeny-based cryptography as a whole, but rather exploited a specific weakness in SIKE, in a manner analogous to the attack on Rainbow. SQISign offers very attractive public-key and signature sizes for constrained IoT radio environments. While its performance is relatively slow, this may be acceptable for many use cases, as computation is typically not the most constrained resource. If there is sufficient confidence in the security of isogeny-based cryptography, I believe that standardization would be welcome, both for NIKE and for signature schemes. That said, isogeny-based cryptography remains a highly active research area, with significant improvements emerging on a regular basis. This raises the question of whether SQISign, in its current form, is the appropriate algorithm to standardize in 2030.

An overview of the algorithms, sizes, and performance is given by:


https://blog.cloudflare.com/another-look-at-pq-signatures/

https://pqshield.github.io/nist-sigs-zoo/

 

Cheers,
John Preuß Mattsson


Blumenthal, Uri - 0553 - MITLL

Feb 19, 2026, 12:48:13 PM
to John Mattsson, pqc-...@list.nist.gov
> Lattice: Given that both ML‑DSA and FN‑DSA are lattice‑based, I do not expect NIST to standardize HAWK. I believe NIST should take the time necessary to ensure that FN‑DSA is as robust and well‑optimized as possible across a wide range of use cases. This includes specifying fixed-point signing.

My preference here would be for NIST to standardize HAWK, maybe to replace FN-DSA for all practical purposes.

Besides the fact that FN-DSA has already been announced — do you see any disadvantages to HAWK compared to FN-DSA?

John Mattsson

Feb 20, 2026, 4:00:27 AM
to Blumenthal, Uri - 0553 - MITLL, pqc-...@list.nist.gov
Hi Uri,

This was written with the assumption that NIST will publish FN-DSA in FIPS 206 as announced, which I think is the right approach.

The disadvantages of Hawk are the lower maturity of the algorithm and its hardness assumption, LIP (the Lattice Isomorphism Problem). It is likely that NIST will not publish final standards for any of the ramp-on algorithms until around 2030, which is too late to phase out quantum-vulnerable crypto in PKI and long-lived devices before 2035.

Although Hawk is currently the only lattice-based algorithm in the NIST ramp-on track, there are several other recently proposed lattice-based signature schemes, such as NTWE, Mitaka, HAETAE, and SOLMEA, that also offer smaller signature sizes than ML-DSA.

Cheers,
John

Bo Lin

Feb 21, 2026, 7:45:40 AM
to John Mattsson, Blumenthal, Uri - 0553 - MITLL, pqc-...@list.nist.gov
A scheme with small public-key and signature/ciphertext sizes, comparable to RSA or ECC, is desirable. I see applications that use an SMS channel to establish TLS connections or to perform identification for access. Some smart-card and passport readers are also constrained: ISO 7816 can be as slow as 9.6 kbps, and ISO 14443 (contactless) runs at 106 kbps. Their frames are small, 256 bytes maximum.
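To make these link speeds concrete, here is a back-of-the-envelope sketch (my own illustration, not from the thread) estimating raw transmission time and frame count over the links mentioned above. The signature sizes are the commonly cited values for Ed25519, FN-DSA-512 (Falcon-512), ML-DSA-44, and SLH-DSA-SHA2-128s; treat them as illustrative assumptions.

```python
# Back-of-the-envelope: time to serialize a signature over constrained links.
# Sizes (bytes) are commonly cited signature sizes -- illustrative, not normative.

SIG_BYTES = {
    "Ed25519": 64,
    "FN-DSA-512 (Falcon-512)": 666,
    "ML-DSA-44": 2420,
    "SLH-DSA-SHA2-128s": 7856,
}

LINK_BPS = {
    "ISO 7816 (worst case)": 9_600,
    "ISO 14443 (contactless)": 106_000,
}

FRAME_BYTES = 256  # maximum frame size mentioned above

for link, bps in LINK_BPS.items():
    print(f"\n{link} ({bps} bit/s):")
    for scheme, size in SIG_BYTES.items():
        seconds = size * 8 / bps          # raw serialization time, no overhead
        frames = -(-size // FRAME_BYTES)  # ceiling division
        print(f"  {scheme:<24} {size:>5} B  ~{seconds:6.2f} s  {frames:>3} frames")
```

At 9.6 kbps, an ML-DSA-44 signature alone takes roughly two seconds of raw airtime and about ten 256-byte frames, before any protocol or framing overhead, which illustrates why small signatures matter on these links.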


John Mattsson

Feb 21, 2026, 9:02:39 AM
to Bo Lin, Blumenthal, Uri - 0553 - MITLL, pqc-...@list.nist.gov
>I see applications that use SMS channel to establish TLS connection or identification for access. Some smart card / passport readers are also constrained - ISO 7816 can be as slow as 9.6kbps and  ISO 14443 (contactless) 106 kbps. Their frames are small, 256 bytes maximum.

Yes, and there are many even more constrained radio networks used in critical infrastructure.

As described in the IETF LPWAN charter, these networks often operate with “frame sizes in the order of tens of bytes transmitted a few times per day at ultra-low speeds.”

Importantly, these are not legacy systems. They are relatively recent technologies, and their adoption is rapidly increasing across sectors such as utilities, smart metering, environmental monitoring, and industrial IoT.

Cheers,
John

Bas Westerbaan

Feb 24, 2026, 6:45:58 AM
to John Mattsson, pqc-...@list.nist.gov
Hi all,

On most points we have thoughts similar to John's, but there are some things we should talk about.
 
  • Code, Symmetric, MPC-in-the-head: From a performance and trust perspective, these schemes generally fall between SLH‑DSA and ML‑DSA. Their trust level is high but lower than that of SLH‑DSA. Concerns regarding the use of standalone lattice-based schemes prior to sufficiently hardened implementations can be mitigated by SLH-DSA or hybrids, but would not be addressed by less mature code-based, symmetric, or MPC-in-the-head schemes. Although the standardization of HQC‑KEM was important, it is difficult to identify a really compelling use case for code‑based, symmetric, or MPC‑in‑the‑head signature schemes, particularly given NIST’s plan to standardize additional SLH‑DSA parameter sets. I therefore suggest that NIST explicitly solicit input from the community to identify concrete and practical use cases for these schemes.

The VOLEitH paradigm is nothing short of a breakthrough in post-quantum zero-knowledge proofs. It's particularly appealing for its relative simplicity compared to other approaches (such as STARKs and lattice-based ZKPs) and its conservative assumptions. It's also clear the schemes haven't been fully optimized yet. At the moment, VOLEitH-based schemes don't outperform ML-DSA-44, but the gap is much smaller than it was only a year ago, and the improvements to get there were pretty simple. I would like to see where they bottom out. They also currently seem to be the most promising starting point for post-quantum anonymous credentials and blind signatures. I think it would be good to reduce the number of candidates at some point.
 
  • Multivariate: The trust in multivariate is lower than in lattice. For protocols that transmit both public keys and signatures, the combined size is comparable to, or larger than, that of FN-DSA. In contrast, for protocols where public keys are referenced rather than transmitted by value, multivariate schemes offer significant size advantages. If NIST after the last round has sufficient confidence in the security of multivariate schemes, I believe they should be standardized.
I would like to make a distinction between "unstructured" classical UOV and "structured" schemes that reduce public-key size by admitting more structure, such as MAYO, SNOVA, QR-UOV, GeMSS, and Rainbow. Until recently, all new attacks affected the latter, not the former. Now, Lars' wedges attack is a genuine surprise in that it also affects UOV, but I'd say it would be too crude to equate UOV's security with that of the structured proposals. Thinking ahead, if NIST were to standardize both UOV and a structured scheme, aligning these standards closely would make sense, even if UOV were standardized earlier. For example, MAYO with k=1 is in essence UOV.

  • Isogeny: Trust is lower than in lattice-based schemes. However, contrary to a common perception, the attack on SIKE did not invalidate isogeny-based cryptography as a whole, but rather exploited a specific weakness in SIKE, in a manner analogous to the attack on Rainbow. SQISign offers very attractive public-key and signature sizes for constrained IoT radio environments. While its performance is relatively slow, this may be acceptable for many use cases, as computation is typically not the most constrained resource. If there is sufficient confidence in the security of isogeny-based cryptography, I believe that standardization would be welcome, both for NIKE and for signature schemes. That said, isogeny-based cryptography remains a highly active research area, with significant improvements emerging on a regular basis. This raises the question of whether SQISign, in its current form, is the appropriate algorithm to standardize in 2030.

Of all the categories, isogenies have seen the most progress. Verification times are close to practical, and not just in niche applications. Complexity is still a concern, and constant-time signing is an active research area. I'd like to keep them in the race, but of course, if they're not ready in time, then we don't have to standardize them.

Best,

 Bas

Thomas Pornin

Feb 24, 2026, 7:08:36 PM
to pqc-forum, Blumenthal, Uri - 0553 - MITLL, John Mattsson
HAWK's verification uses a bit more RAM and CPU than FN-DSA's. Certainly not a lot; it's still quite fast and small by usual standards. But you will prefer FN-DSA if you are on the verification side and your hardware is a very small microcontroller, say an 8 MHz 16-bit chip with 4 kB of RAM: you can do an FN-DSA-512 verification on that, but not a HAWK verification. Of course, on the signature-generation side, the comparison goes fully in the other direction: HAWK signing is much smaller, much faster, and much simpler (and probably a lot easier to properly shield against power analysis, if you are thinking of smartcards or similar devices). The point about verification is just about the only one I see, from a practical point of view, in which HAWK is not strictly better than FN-DSA.

Of course, there is also the question of how much we believe HAWK to be secure. It's not the same security assumption as FN-DSA. Possibly, HAWK could be weak while FN-DSA remains strong. Possibly HAWK is the strong one and FN-DSA is weak! It is hard to reason about these things; the only reliable information is that FN-DSA relies on the security of NTRU lattices, and NTRU lattices have been around for more than two decades now, so if FN-DSA is weak, then the weakness is certainly not obvious.

My understanding of the world in which NIST operates is that not going through with FN-DSA standardization is unthinkable; it will happen, come what may.

Thomas
