Constant time digital signature algorithm


Simon Hoerder

Feb 20, 2025, 9:04:48 AM
to pqc-forum
Hi,

I was wondering whether it makes sense for the signature on-ramp to focus on constant-time algorithms. (At least constant time for signing and verification.)

ML-DSA really is not constant time for signature generation due to rejection sampling. SLH-DSA at least makes it relatively easy to provide an upper bound on worst-case runtime. A constant-time implementation of FN-DSA is possible according to Section 4.5 (“Performances”), but I would argue that FN-DSA is not general-purpose due to its use of floating-point arithmetic. A lot of microcontrollers and devices that need SCA security will struggle to implement FN-DSA.

So, in that sense, there’s no real constant-time general-purpose PQC signature algorithm among the NIST standards. But there’s value in having constant time algorithms: it makes testing for timing leakages a lot easier, it helps in functional verification and it helps massively in safety assessments. (Needless to say: all algorithms need to meet the cryptanalytic security requirements first.)
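(As a tiny, concrete illustration of the property being asked for, a Python sketch: `naive_equal` is a hypothetical early-exit comparison written for this example, while `hmac.compare_digest` is the standard library's constant-time one.)

```python
import hmac

def naive_equal(a: bytes, b: bytes) -> bool:
    # Early exit: runtime depends on where the first mismatch occurs,
    # i.e. on the data being compared -- a classic timing leak.
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

secret_tag = bytes.fromhex("aa" * 32)
guess = bytes.fromhex("aa" * 31 + "bb")

# hmac.compare_digest touches every byte regardless of mismatch position,
# so its runtime does not depend on the secret data -- and that
# independence is also what makes leakage testing tractable.
assert naive_equal(secret_tag, secret_tag)
assert not hmac.compare_digest(secret_tag, guess)
```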

Best,
Simon

John Mattsson

Feb 20, 2025, 9:41:57 AM
to Simon Hoerder, pqc-forum

Hi Simon,

Thanks for bringing this up. Truly constant-time algorithms would be appreciated. I wish the importance of constant time had been discussed more during the previous years of the PQC competition. FN-DSA seems useful for CA certs but problematic for most other use cases.

Is there an overview showing whether the algorithms in Round 2 can be implemented in constant time? (If not, that would be a good paper for someone to write.)

Cheers,

John Preuß Mattsson

Scott Fluhrer (sfluhrer)

Feb 20, 2025, 9:54:00 AM
to Simon Hoerder, pqc-forum
Actually, providing a constant time SLH-DSA implementation would not be difficult. The reference code is pretty close to constant time for signature generation, and would just need a few tweaks to make it fully constant time. Making the verification code constant time would approximately double the time, but is straightforward to implement.


Derek Atkins

Feb 20, 2025, 9:57:40 AM
to pqc-...@list.nist.gov, sflu...@cisco.com, si...@hoerder.net
Hi,

I understand why we want constant-time signature generation.  There is secret data involved, and timing could potentially leak that out.

However, could someone please explain the rationale for the requirement of constant-time verification?  There is no (or SHOULD be no) secret data involved in verification, so why does constant time verification matter?  What private/secret data are you potentially leaking?

Thanks,

-derek
-- 
Derek Atkins
Chief Technology Officer
Veridify Security - Securing the Internet of Things®

Office: 203.227.3151  x1343
Direct: 617.623.3745
Mobile: 617.290.5355
Email: DAt...@Veridify.com


Bas Westerbaan

Feb 20, 2025, 10:00:00 AM
to Derek Atkins, pqc-...@list.nist.gov, sflu...@cisco.com, si...@hoerder.net
I don't understand the need for full constant-time verification either, but in many real-time applications it is clearly important that verification time can be bounded. That's the case for SLH-DSA, FN-DSA and ML-DSA verification.

Derek Atkins

Feb 20, 2025, 10:16:52 AM
to b...@cloudflare.com, pqc-...@list.nist.gov, sflu...@cisco.com, si...@hoerder.net
Oh, I understand the need for a bounded validation time; it's the constant time I'm questioning.  The two are not the same requirement, although constant time does give you bounded time, at a potentially extreme cost in the "normal" case.

-derek

Stephan Mueller

Feb 20, 2025, 10:21:36 AM
to Simon Hoerder, pqc-forum, Scott Fluhrer (sfluhrer)
On Thursday, 20 February 2025 at 15:53:48 CET, 'Scott Fluhrer (sfluhrer)' via pqc-forum wrote:

Hi,

> Actually, providing a constant time SLH-DSA implementation would not be
> difficult. The reference code is pretty close to constant time for
> signature generation, and would just need a few tweaks to make it fully
> constant time. Making the verification code constant time would
> approximately double the time, but is straightforward to implement.

The code in [1] contains a few tweaks, applied with [3], for SLH-DSA, and the analysis now shows that there is no time-variant code depending on secret data.

The same code base has similar instrumentation for ML-DSA and ML-KEM, to verify that there is no time-variant code depending on secret data there either.

The description for the testing is given in [2].

[1] https://github.com/smuellerDD/leancrypto

[2] https://leancrypto.org/leancrypto/debugging_support/index.html#side-channel-analysis

[3] https://github.com/smuellerDD/leancrypto/commit/fa9c9d8ec931e44ba99deda165f098af59b9aa01

Ciao
Stephan



Scott Fluhrer (sfluhrer)

Feb 20, 2025, 10:26:49 AM
to Derek Atkins, b...@cloudflare.com, pqc-...@list.nist.gov, si...@hoerder.net

Actually, it may come up in two scenarios:

  • If the message, public key and/or the signature is secret. This isn’t a standard cryptographic requirement on signature verification, but it may be placed on it by the application (for example, because of a need for anonymity).
  • During “hard” real-time programming, that is, somewhere where the requirement may say “this absolutely must be done in x milliseconds, or really bad things will happen”, there is a strong desire to have constant time everywhere; that makes testing/validation a lot easier.

I believe that either scenario is fairly uncommon, but I can see that they can happen.  Whether they should be a concern of this forum is another question entirely.

Simon Hoerder

Feb 20, 2025, 10:52:57 AM
to pqc-forum
Hi Derek,

From a leakage perspective you're correct. There's no secret that could leak during verification.

For verification, my thinking was more about boot times. If I have to verify a firmware image during boot, I would like that time to be fairly well known. In a safety context, it can be quite crucial to know that time exactly, but even in normal contexts many people won't be happy if there's a 25% chance that verification takes twice as long as average. (I've made those numbers up just now; I'm not sure whether there's any proper research into what time variation would be acceptable. Also, I'm assuming that the time needed to hash the image is already being handled.)

Best,
Simon

Dustin Ray

Feb 21, 2025, 8:12:14 AM
to John Mattsson, Simon Hoerder, pqc-forum
For my own curiosity and understanding, I'm wondering if someone here who has a little more knowledge and background than me might be able to enumerate the risks of a variable-time signature implementation?

It's clear to me that an observer with access to the local system might be able to glean information about secret materials based on timing; what is less clear to me is what a remote observer might be able to accomplish.

Brent Kimberley

Feb 21, 2025, 8:27:13 AM
to Dustin Ray, John Mattsson, Simon Hoerder, pqc-forum
To the best of my knowledge, variable-time implementations are more vulnerable to supervision (as in SCADA/OT monitoring), i.e. to side-channel and fault-injection attacks.



Brent Kimberley

Feb 21, 2025, 8:32:11 AM
to Dustin Ray, John Mattsson, Brent Kimberley, Simon Hoerder, pqc-forum
In essence, this is why environmental monitoring is required: how quickly are we leaking state? What is the effective in-situ key rotation period?

Thomas Pornin

Feb 21, 2025, 8:47:41 AM
to pqc-forum, Simon Hoerder
Note that there are two competing meanings for "constant-time": one is what the expression says, i.e. "executes in a constant amount of time"; the other, more common meaning in cryptography, is "variations in timing measurements related to the scheme are computationally unlinkable to secret data".

In signature verification, we usually say that there is only public data, hence only the first meaning may apply, not the second one. You could imagine a contrived situation where the signature and the data are not public, and the data is a low-entropy string (a human-chosen password, for instance), in which case a timing side-channel in verification could be used to power a dictionary attack on the string (the attacker tries potential passwords until one is found that matches the measured timing signal). Even that would not work for Falcon, where the input-dependent variability comes from the hash-to-polynomial step, in which the input is hashed along with a 40-byte random nonce -- if the signature is secret then the attacker does not know the nonce, which is enough to deter that dictionary attack. If you have a really great imagination, you could try something where the data is secret and low-entropy, the signature is public, but the public key is not public. Under these conditions, it becomes conceivable that a timing signal in the hash-to-polynomial step could be leveraged in a timing attack. It is really a big stretch, though.

Even though this is rather implausible, I did implement, a few years ago, a constant-time hash-to-polynomial process, and it is there: https://falcon-sign.info/impl/common.c.html#l_70
This is a freak, theoretical only concern, and we should not try to do that in practice. HOWEVER, the analysis which is in there is relevant to the other meaning of "constant-time", i.e. "executes in a constant amount of time". Or, more accurately, what kind of upper bound you can get for the execution time. As explained in the comments in the code linked above, the hash-to-polynomial process in Falcon is a case of rejection sampling: for degree n, you want n coefficients in [0,12288]; for each coefficient, you get 2 bytes from the RNG (SHAKE256) and that yields the coefficient if that 2-byte value falls in [0,61444]; but if the value is in [61445,65535] then you have to drop the 2 bytes and get 2 more, and so on. Thus, there is a bit of "oversampling". How much oversampling you may need is computable. As the table shows, for n = 512, you need at least 512*2 bytes, but the risk that you need more than (512+205)*2 bytes is lower than 2^(-256), i.e. completely negligible.
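The rejection sampling just described can be sketched in a few lines of Python (a simplified illustration, not the real FN-DSA hash-to-point: the exact input encoding and domain separation are omitted, and `hash_to_polynomial` is a name invented for this sketch):

```python
import hashlib

Q = 12289              # Falcon modulus; coefficients land in [0, Q-1] = [0, 12288]
ACCEPT_BOUND = 5 * Q   # 61445: a 2-byte value is accepted iff it is in [0, 61444]

def hash_to_polynomial(message: bytes, nonce: bytes, n: int = 512) -> list[int]:
    """Rejection-sample n coefficients from a SHAKE256 stream.

    Simplified sketch of the process described above; the real FN-DSA
    hash-to-point uses a specific input encoding that is omitted here.
    """
    shake = hashlib.shake_256(nonce + message)
    # Squeeze a generous buffer up front: for n = 512, (n + 205) * 2 bytes
    # suffice except with probability below 2^-256 (per the table cited
    # above); the extra slack makes running out a non-issue in this sketch.
    stream = shake.digest((n + 205) * 2 + 1024)
    coeffs, pos = [], 0
    while len(coeffs) < n:
        v = int.from_bytes(stream[pos:pos + 2], "big")
        pos += 2
        if v < ACCEPT_BOUND:       # accept, then reduce into [0, Q-1]
            coeffs.append(v % Q)
        # otherwise: drop the two bytes and draw two more (the rejection)
    return coeffs

poly = hash_to_polynomial(b"message digest", b"\x00" * 40)
```

The acceptance bound 5*q = 61445 is what makes the reduction `v % Q` uniform over [0, Q-1]; accepting all 2-byte values would bias the low residues.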

(2^(-256) is negligible because the probability that the entire mankind dies out because of a huge asteroid strike right during the millisecond or so that your microcontroller is verifying the signature is a lot higher, around 2^(-62), and that is a much more dire occurrence, if only because if that asteroid vaporizes mankind then it will also destroy your microcontroller and prevent it from completing the signature verification. We do not care about random events with probability lower than 2^(-62). In an adversarial context in which an attacker tries to find exactly the most annoying case, we may care about lower probabilities, because attackers may try to find a "bad case" signature that forces the verifier to spend more time; but even then, 2^(-256) is completely unreachable for dedicated attackers, and if an attacker can find a "bad case" with that low a probability then the attacker can also simply break the key and forge signatures. Hence, 2^(-256) risk is truly negligible.)

SHAKE256 outputs bytes in chunks of 136; the cost of SHAKE256 is running the Keccak-f permutation, once per 136 output bytes. With some figures (see https://eprint.iacr.org/2025/123): on an ARM Cortex-M4, I get a signature verification cost of about 255300 cycles (for the "original Falcon"; if the hash of the public key is included in the signed data, then there's an extra 103000-cycle overhead, which is the cost of that hash computation). The hash-to-polynomial step involves 8.03 invocations of Keccak-f on average (i.e. 8 or 9 in practice, with 9 being slightly more probable). Within the 2^(-256) risk, we're good as long as we get (512+205)*2 bytes, which translates to 12 Keccak-f invocations, i.e. 3 or 4 more than the normal, average case. Each Keccak-f costs 14348 cycles on that M4, so we are talking about an extra overhead of less than 60k cycles, which we will essentially never hit in practice, on top of a base cost of 255k. The overall conclusion: for Falcon/FN-DSA verification, this really is NOT a case of "sometimes verification time is doubled". It's more a case of "theoretically we could get some slight overhead, but it will be at most +22% and it won't happen in practice, even if attackers try real hard to make it happen".
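As a sanity check on those numbers (the cycle counts are the ones quoted above; the arithmetic merely re-derives the stated bound):

```python
# Figures quoted above, for FN-DSA-512 verification on an ARM Cortex-M4
verify_cycles = 255_300      # average verification cost
keccak_cycles = 14_348       # one Keccak-f permutation
avg_keccak_calls = 8.03      # average hash-to-polynomial permutations
worst_keccak_calls = 12      # bound at the 2^-256 risk level

extra_cycles = (worst_keccak_calls - avg_keccak_calls) * keccak_cycles
overhead = extra_cycles / verify_cycles
print(f"extra cycles: {extra_cycles:.0f}")   # under 60k, as stated
print(f"relative overhead: {overhead:.0%}")  # matches the +22% bound
```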

TL;DR: with Falcon/FN-DSA and signature verification time, you get nice real-time bounds. Even accounting for freak, theoretical-only variations, you still get a total time which is low, as these things go (e.g. on this kind of hardware, this would still be twice as fast as, say, a very optimized Ed25519 verification).


If we are talking about signature generation, not verification, then things are a bit different. You do get the hash-to-polynomial process, but it is only a small part of the overall signing cost. The use of floating-point operations certainly increases the cost on hardware which does not have an appropriate FPU, but (see the paper above) it's not catastrophic; on the Cortex-M4, signing time is about 5.4x the average cost of signing with ML-DSA, so it is certainly slower, but not egregiously so. In fact, if we want upper bounds, then that ratio is much lower. At n = 512, Falcon/FN-DSA needs to restart the signing process with probability about 1/230000 (this is mostly due to the need for a low enough L2 norm of the result; it's not an effect of the bounded/padded size limit for the signature). If you want "9 nines" chances of completing the signature (i.e. risk of failure lower than 10^(-9)), then you only need to account for two iterations. On the other hand, Dilithium-II (aka ML-DSA-44) will restart with probability about 0.826 and will need on average about 5.9 iterations, but for a "9 nines" bound the number of iterations must be assumed to be up to 109, i.e. about 18 times the average case.
Again, some figures (in million cycles), on an ARM Cortex M4:
- Falcon/FN-DSA (n = 512): average = 22m, worst case (at risk 10^(-9)) = 45m
- Dilithium-II (ML-DSA-44): average = 4m, worst case (at risk 10^(-9)) = 73m
We arrive here at the somewhat counterintuitive result that on the ARM Cortex-M4, lacking hardware support for the IEEE 754 "binary64" floating-point values that Falcon signature generation needs to use, Falcon/FN-DSA is actually... faster than Dilithium/ML-DSA? It's a bit artificial, because it comes from the enforcement of a real-time upper bound (within some given failure probability, here the marketing-friendly "nine nines" value).
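The "nine nines" iteration counts follow directly from the per-attempt restart probabilities; a quick re-derivation using the probabilities quoted above:

```python
import math

target = 1e-9  # allowed failure probability ("nine nines" completion)

# Per-attempt restart probabilities quoted above
p_restart_falcon = 1 / 230_000   # Falcon/FN-DSA, n = 512
p_restart_mldsa = 0.826          # Dilithium-II (ML-DSA-44)

def attempts_needed(p_restart: float, target: float) -> int:
    """Smallest k with p_restart**k <= target, i.e. the number of
    iterations to budget so that k consecutive restarts is negligible."""
    return math.ceil(math.log(target) / math.log(p_restart))

print(attempts_needed(p_restart_falcon, target))  # 2
print(attempts_needed(p_restart_mldsa, target))   # 109
```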


To sum up:
- For verification, you can get really nice real-time bounds with a relatively low overhead (22%) compared with the average case (and, for performance and code simplicity, you'd prefer FN-DSA over ML-DSA, and even over pre-quantum EdDSA).
- For signature generation, if you want real-time bounds, you may end up preferring FN-DSA over ML-DSA again, because its probability of restart is much lower. On the other hand, you might still prefer ML-DSA on the grounds that it is probably easier to mask/blind against other side channels such as differential power analysis. Both algorithms are slower than you might want, in any case (e.g. Ed25519 signing will be less than 0.5m cycles on that hardware). But of course, if you sign stuff on a microcontroller, then you have a private key and you need safe storage for that, and you get a whole lot of new issues.

Personally I would expect the signature verification to be much more commonly used than signature generation on small microcontrollers. A typical case would be verifying the signature on the firmware at boot time. Or checking a server X.509 certificate in a TLS connection to a much beefier server (i.e. you'll want to look at ML-KEM cost and variability, more than signature generation cost).

Thomas

Brent Kimberley

Feb 21, 2025, 9:28:39 AM
to Dustin Ray, John Mattsson, Simon Hoerder, pqc-forum

To elaborate:

  • classical environmental monitoring employs the first 3+1 dimensions.
  • quantum environmental monitoring employs additional dimensions.

 

With the migration to post-quantum, the support processes will also need to be “upskilled”.

Simo Sorce

Feb 24, 2025, 10:57:58 AM
to Dustin Ray, pqc-forum
On Thu, 2025-02-20 at 06:48 -0800, Dustin Ray wrote:
> For my own curiosity and understanding I'm wondering if someone here who
> has a little more knowledge and background than me might be able to
> enumerate the risks of a variable time signature implementation?
>
> It's clear to me that an observer with access to the local system might be
> able to glean information about secret materials based on timing, what is
> less clear to me is what a remote observer might be able to accomplish.

We have been able to observe timing differences with nanosecond
resolution over the network. Statistical sampling with enough signature
attempts is sufficient to reveal the timing differences even over noisy
networks. A TLS server can be attacked this way.
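A toy simulation of that point (invented numbers, not measurements: the 500 ns "leak" and 20 µs jitter are made up for the sketch; real attacks use far more careful statistics):

```python
import random
import statistics

random.seed(1)  # deterministic for the illustration

def sample_rtt(leak_ns: float, n: int) -> list[float]:
    """Simulated round-trip times (ns): heavy network jitter plus a tiny
    constant, secret-dependent processing offset (leak_ns)."""
    return [random.gauss(1_000_000, 20_000) + leak_ns for _ in range(n)]

# A 500 ns secret-dependent difference buried under 20 us of jitter is
# invisible in any single measurement, but the sample means separate
# cleanly once enough measurements are collected.
n = 250_000
baseline = sample_rtt(0.0, n)
leaky = sample_rtt(500.0, n)
gap = statistics.fmean(leaky) - statistics.fmean(baseline)
print(f"estimated gap: {gap:.0f} ns")  # close to the true 500 ns
```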



--
Simo Sorce
Distinguished Engineer
RHEL Crypto Team
Red Hat, Inc
