How is HQC substantially different from RLWE?


Bobby McGee

unread,
Oct 1, 2025, 12:43:32 PM
to pqc-forum
How is HQC much different from RLWE?

Say we have 1-dimensional Kyber with public a, as+e, and HQC with public h, hx+y.  Here a, h are uniform, while s, e, x, y are "small" (CBD or fixed low weight respectively), and arithmetic is in Z[x]/(q, x^n+1), Z[x]/(2, x^n-1) respectively.  The distributions of the public keys are supposed to be computationally uniform and it should be difficult to find appropriate s, x satisfying the relations.
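For concreteness, here is a toy sketch of the two public-key relations above — as+e in Z_q[x]/(x^n+1) and hx+y in F_2[x]/(x^n-1) — with made-up miniature parameters (n = 8 etc. are illustrative, not the real Kyber/HQC values):

```python
# Toy sketch only, NOT the real schemes; all parameters are illustrative.
import random

def polymul(a, b, n, q, plus):
    # Multiply in Z_q[x]/(x^n + 1) if plus else in Z_q[x]/(x^n - 1).
    c = [0] * n
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            k = i + j
            if k < n:
                c[k] = (c[k] + ai * bj) % q
            else:
                # wraparound: x^n = -1 (negacyclic) or x^n = +1 (cyclic)
                sign = -1 if plus else 1
                c[k - n] = (c[k - n] + sign * ai * bj) % q
    return c

n, q = 8, 3329                                      # toy sizes
a = [random.randrange(q) for _ in range(n)]         # uniform a
s = [random.randrange(-2, 3) for _ in range(n)]     # small (CBD-like)
e = [random.randrange(-2, 3) for _ in range(n)]
pk_rlwe = [(u + v) % q for u, v in zip(polymul(a, s, n, q, True), e)]

h = [random.randrange(2) for _ in range(n)]         # uniform h over F_2
x = [0] * n; y = [0] * n
for idx in random.sample(range(n), 2): x[idx] = 1   # fixed low weight
for idx in random.sample(range(n), 2): y[idx] = 1
pk_hqc = [(u + v) % 2 for u, v in zip(polymul(h, x, n, 2, False), y)]
```

The only structural difference in this toy view is the sign on the wraparound and the modulus.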

x^n+1 is chosen irreducible over Q (2-power cyclotomic) and q s.t. x^n+1 (mostly) factors over F_q for NTT optimization.  On the other hand, (x^n-1)/(x-1) is irreducible over F_2 to prevent algebraic simplification, and the binary, cyclic structure is for convenience and optimization.  Despite the differences in the ring structure, security seems to scale roughly with the size of the ring.
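The splitting condition can be checked directly in a toy case (n = 4, q = 17 are illustrative values, not Kyber's): x^n + 1 has n roots in F_q, and hence splits into linear factors, exactly when F_q contains a primitive 2n-th root of unity, i.e. q ≡ 1 (mod 2n):

```python
# Toy check: x^4 + 1 splits completely over F_17 since 17 ≡ 1 (mod 8).
n, q = 4, 17
# r is a root of x^n + 1 iff r^n ≡ -1 (mod q)
roots = [r for r in range(q) if pow(r, n, q) == q - 1]
print(roots)            # [2, 8, 9, 15] -- four roots, so x^4 + 1 splits
assert len(roots) == n
```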

Encryption/decryption for both schemes follow the same "noisy Elgamal" setup, with slightly different error correction (modulation in the "q" direction and concatenated RS-RM codes respectively).
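A minimal scalar sketch of that "noisy Elgamal" setup with modulation in the "q" direction (one coefficient, invented toy parameters, no polynomial ring — so this is neither real scheme):

```python
# Toy noisy-Elgamal over plain integers mod q; illustrative only.
import random

q = 3329
def small(): return random.randrange(-2, 3)   # small noise sample

a = random.randrange(q)
s, e = small(), small()
pk = (a * s + e) % q                          # public key: a, as+e

def encrypt(bit):
    r, e1, e2 = small(), small(), small()
    u = (a * r + e1) % q                      # ephemeral share
    v = (pk * r + e2 + bit * (q // 2)) % q    # message modulated into high bit
    return u, v

def decrypt(u, v):
    # v - u*s = e*r + e2 - e1*s + bit*(q//2): small noise plus the modulation
    w = (v - u * s) % q
    return 1 if q // 4 < w < 3 * q // 4 else 0

assert all(decrypt(*encrypt(b)) == b for b in (0, 1))
```

Here decoding is rounding in the q direction; HQC instead decodes with its concatenated RS-RM code over F_2.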

Obviously there are many analogies between code and lattice problems.  To someone not familiar with the security reductions from "hard" problems (like myself), how would you justify standardizing both Kyber and HQC when they look so superficially similar?  [Perhaps begging the question, why not BIKE?]

Maybe the answer is "go read [X], [Y], [Z] and learn what underlies the security and stop pestering pqc-forum," but is there a digestible or obvious reason HQC is substantially different from a RLWE scheme?

Joshua Holden

unread,
Oct 10, 2025, 2:58:46 PM
to pqc-forum, Bobby McGee
Since no one seems to be answering this, I'll take a first stab at it:  the currently known best attacks on HQC and RLWE appear to be substantially different.  The known best attacks on RLWE (AFAIK, someone correct me if I'm wrong!) are based on lattice reduction algorithms such as LLL, BKZ, etc.  The known best attacks for HQC are based on information set decoding (again, correct me if I'm wrong!).  According to the NIST third-round report, "This approach [ISD] ignores the structure of the binary code and seeks to recover the error vector based on its low Hamming weight."   Code-based cryptography is not my area of strength, but it seems like although there is a lot of algebraic similarity between HQC and RLWE systems, the attacks are targeting aspects which the systems do not have in common.

BIKE uses a slightly different approach to creating the ciphertexts than HQC does, but the known best attacks are actually the same.  According to NIST's fourth-round report, the decision between HQC and BIKE was very close, and ultimately came down to technical issues in the BIKE security analysis that left open the potential for more attacks to show up in the near term.

Please, anyone, feel free to correct and/or expand on this!

----Josh

Watson Ladd

unread,
Oct 10, 2025, 3:51:17 PM
to Joshua Holden, pqc-forum, Bobby McGee
If I might be permitted to indulge:

There is a very large similarity, as pointed out by the author of this
thread: all lattice-based and code-based schemes are built around a
"multiplication" and addition of a "noisy" or small vector. The
difference is in the geometry, or lack thereof. Lattice-based schemes
can use rounding to decode, and have the secret structure be a very
mildly nice basis. Code-based schemes use heavyweight machinery like
algebraic geometry codes to recover the error. This permits many more
errors, making the "small" vectors quite a bit bigger and thus
increasing the difficulty of attacks.

HQC introduces structure very similar to the ring or module structure
used in structured lattices, and with the same motivation: to reduce
the size of the keys while preserving security. A multiplication in an
extension field of degree n is an n×n linear transformation, but
requires only n numbers to express. Similarly, a quasi-cyclic code
requires only n numbers to express an n×n linear transformation. But
attacks that try to exploit this structure to recover the secret
information will look different, because the geometries of Z/qZ and
Z/2Z are very different.
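That compression can be made concrete: a length-n vector h determines the full n×n circulant matrix of multiplication by h in F_2[x]/(x^n - 1). A small sketch with invented values:

```python
# n bits describe an n×n linear map over F_2; illustrative values only.
n = 5
h = [1, 0, 1, 1, 0]

# Circulant matrix of multiplication by h: entry (i, j) is h[(i - j) mod n].
circ = [[h[(i - j) % n] for j in range(n)] for i in range(n)]

def mul_matrix(x):
    # matrix-vector product over F_2
    return [sum(circ[i][j] & x[j] for j in range(n)) % 2 for i in range(n)]

def mul_poly(x):
    # the same map as cyclic polynomial multiplication mod x^n - 1
    c = [0] * n
    for i, hi in enumerate(h):
        for j, xj in enumerate(x):
            c[(i + j) % n] ^= hi & xj
    return c

x = [0, 1, 1, 0, 1]
assert mul_matrix(x) == mul_poly(x)   # n numbers vs. n*n numbers, same map
```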

At some level the Hamming metric is the most extreme case of small q,
and as q increases the algorithms eventually behave more like BKZ,
LLL, etc.

Sincerely,
Watson
--
Astra mortemque praestare gradatim

Edoardo Persichetti

unread,
Oct 10, 2025, 3:56:24 PM
to Watson Ladd, Joshua Holden, pqc-forum, Bobby McGee
Dear Josh, Watson, and all,

Thank you - your analysis is substantially correct.

Note that there are interesting connections between the different attack avenues, for example [1,2]; however, ISD and related techniques (e.g. DOOM) are still considered the de facto standard for code-based crypto.

Regarding BIKE: it appears this ultimately came down to NIST’s decision to standardise the smallest possible number of KEMs, and to those KEMs having a “general purpose” nature - unlike what was decided for signatures. BIKE’s technical issues, in fact, only affect the IND-CCA analysis, and would not apply, for instance, to ephemeral key usage. I believe it would be of interest if the community could clarify whether this specific use case (or other use cases as well) is sufficiently relevant to justify NIST considering a dedicated candidate with a tailored performance profile (such as BIKE).

I would also point out that very recent analysis on BIKE’s DFR is available [3], but unfortunately appeared after NIST had already made their decision; nevertheless, in my humble opinion, this does not change the fact that a candidate such as BIKE is at its best when used as an ephemeral KEM (due for instance to the very short data sizes), without the whole Fujisaki-Okamoto machinery “getting in the way”.

Best,
Edoardo

[1] Thomas Debris-Alazard, Léo Ducas, Wessel P. J. van Woerden: An Algorithmic Reduction Theory for Binary Codes: LLL and More. IEEE Trans. Inf. Theory (2022)

[2] Léo Ducas, Andre Esser, Simona Etinski, Elena Kirshanova: Asymptotics and Improvements of Sieving for Codes. EUROCRYPT 2024

[3] Sarah Arpin, Jun Bo Lau, Antoine Mesnard, Ray Perlner, Angela Robinson, Jean-Pierre Tillich & Valentin Vasseur: Error Floor Prediction with Markov Models for QC-MDPC Codes. CRYPTO 2025




D. J. Bernstein

unread,
Oct 10, 2025, 4:57:28 PM
to pqc-...@list.nist.gov
Watson Ladd writes:
> Code based schemes use heavyweight machinery
> like algebraic geometry codes to recover the error . This enables many
> more errors, making the "small" vectors quite a bit bigger, and thus
> increasing the difficulty of attacks.

BIKE and HQC are like NTRU and Kyber in having distances that are
polynomially smaller than random distances. What you're saying is
correct for McEliece, which uses a more powerful decoder and ends up
with "small" distances only polylogarithmically below random distances.
See https://cr.yp.to/talks.html#2025.03.24 for more details.

A different comparison axis asks whether there's a tight IND-CCA2 proof
from the underlying search problem. McEliece and NTRU have this; it
wouldn't be hard to tweak BIKE to have this if decryption failures could
be brought under control; HQC and Kyber are both noisy-DH systems that
don't have this, so they rely on loose proofs or decisional assumptions.

[ regarding polynomial structure: ]
> But attacks based on trying to exploit this structure to recover the
> secret information will look different because the geometry of Z/qZ
> and Z/2Z are very different.

I don't think this is a safe extrapolation from the literature on
attacks known today.

For attacks ignoring the polynomial structure, certainly there are many
attack speedups and complications for the q>2 case because of size
variations mod q, but combinatorial searches and linear algebra play the
same attack role for q>2 and for q=2. Look, for example, at the HGJ
subset-sum algorithm and the MMT ISD algorithm.

For attacks using the polynomial structure, one sees further overlaps:
e.g., the BIKE decryption failures in https://eprint.iacr.org/2023/659
basically boil down to the pileup of coefficients in (1+x+x^2+...)^2;
this pileup isn't specific to q=2. The line of attacks that I find most
worrisome, such as the break of Gentry's original STOC 2009 FHE system
in quantum polynomial time for cyclotomics (assuming small h^-), has q
not even showing up in most of the attack steps.
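The pileup in (1+x+x^2+...)^2 is easy to see over the integers, before any
reduction: squaring the all-ones polynomial of length k gives the triangular
coefficient profile 1, 2, ..., k, ..., 2, 1, with the middle coefficients
piling up regardless of the modulus. A tiny illustrative computation:

```python
# Coefficients of (1 + x + ... + x^(k-1))^2 over Z; illustrative only.
def square_all_ones(k):
    coeffs = [0] * (2 * k - 1)
    for i in range(k):
        for j in range(k):
            coeffs[i + j] += 1   # one contribution per pair (i, j), i+j fixed
    return coeffs

print(square_all_ones(4))   # [1, 2, 3, 4, 3, 2, 1]
```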

---D. J. Bernstein

Guilin Wang

unread,
Oct 13, 2025, 12:23:57 AM
to pqc-...@list.nist.gov, p...@ietf.org, wang.gu...@gmail.com

On Oct 9, the Institute of Commercial Cryptography Standards (ICCS), China, released its formal call for proposals for the Next-generation Public-Key Cryptographic Algorithms (NGCC). https://www.niccs.org.cn/niccs/


----------------------------------

Submission Deadline: April 30, 2026

Updating Deadline: June 30, 2026

("From May 1 to June 30, 2026, submitters could update the previously submitted proposals but could not submit new proposals.")

----------------------------------

Call for Proposals for the Next-generation Public-Key Cryptographic Algorithms

https://www.niccs.org.cn/niccs/Notice/pc/content/content_1975896137741635584.html

- Submission Requirements for Public-Key Cryptographic Algorithms

- Evaluation Criteria for Public-Key Cryptographic Algorithms 

- Feedback on the Comments Received on the Draft Submission Requirements and Evaluation Criteria for Public-Key Cryptographic Algorithms

- Contact email: pkcco...@niccs.org.cn.

----------------------------------

Call for Proposals for the Next-generation Cryptographic Hash Algorithms

https://www.niccs.org.cn/niccs/Notice/pc/content/content_1975892908773478400.html

- Submission Requirements for Cryptographic Hash Algorithms

- Evaluation Criteria for Cryptographic Hash Algorithms

- Feedback on the Comments Received on the Draft Submission Requirements and Evaluation Criteria for Cryptographic Hash Algorithms

Contact email: crypthas...@niccs.org.cn

----------------------------------

Cheers, 

Guilin

Markku-Juhani O. Saarinen

unread,
Oct 13, 2025, 5:58:54 AM
to Guilin Wang, pqc-...@list.nist.gov, p...@ietf.org
Hi,

Does anyone know what kind of concrete security definitions they are using for "quantum-resistant security strength"? The language in the ICCS call does not really align with any mainstream metrics for post-quantum security I know.

Quote: "The submitted algorithms shall be capable of supporting three classical security strength levels: 128, 256, and 512 bits. The corresponding quantum-resistant security strength shall be no less than 80, 128, and 256 bits. The algorithms can optionally support 384-bit classical security strength level with no less than 192-bit quantum-resistant security strength. The n-bit classical (quantum-resistant) security strength means that the complexity of the known classical (quantum) attacks on the algorithm is at least 2^n. The algorithms shall not incorporate components that are vulnerable to quantum attacks. Moreover, the algorithms shall provide sufficient security redundancy to reduce potential risks from new quantum algorithms and cryptanalysis techniques."


Best Regards,
-markku

Dr. Markku-Juhani O. Saarinen <mj...@iki.fi>

