NIST workshop discussion draft on accordion mode requirements


Dworkin, Morris J. (Fed)

Apr 10, 2024, 3:41:01 PM
to 'Morris Dworkin' via ciphermodes-forum

NIST’s Proposal of Requirements for an Accordion Mode has been posted on the NIST website in the form of a “Discussion Draft for the NIST Accordion Mode Workshop 2024” at 

https://csrc.nist.gov/pubs/other/2024/04/10/proposal-of-requirements-for-an-accordion-mode-dis/iprd.

 

Morris Dworkin

On behalf of the NIST Cipher Modes Team.

 

Arne Padmos

Apr 30, 2024, 7:55:21 PM
to ciphermodes-forum, Dworkin, Morris J. (Fed)
Dear NIST accordion cipher mode team,

Although I won't be joining the workshop, I would like to provide some input.

In 'Proposal of Requirements for an Accordion Mode', the security goal is defined as behaving as a tweakable VIL-SPRP, with reference to https://eprint.iacr.org/2018/720.pdf and https://eprint.iacr.org/2003/148.pdf. However, the definition of a VIL-SPRP doesn't require commitment to the tweak, key, or plaintext, as also highlighted in section '5.2.3 Key and Context Commitment' of the workshop discussion draft.

Why might such properties be useful? In their paper 'Security protocols and evidence: where many payment systems fail' (https://discovery.ucl.ac.uk/id/eprint/10074689/1/fc14evidence.pdf), Murdoch and Anderson illustrate the utility of cryptographic audit logs for structurally addressing the issue of tampering with logs during dispute resolution, highlighting how they can help produce robust evidence in the context of payment systems. A context-committing accordion mode might help provide further guarantees. For example, the zero padding used for committing security could be reappropriated/extended to support 'padding chaining' between messages, whereby (a subset of) the final encrypted message of a session or protocol run can serve as a transcript hash even if the keys used for encryption are later disclosed.

Instead of the game described on page 5 of the workshop discussion draft, the game would be one where the adversary controls all inputs to the mode. Having been given a reference tuple of tweak, key, input, and output as well as a target set of bits, the adversary has the goal of finding a collision in the selected bits of the tweak, key, input, and/or output (faster than brute force, taking the birthday bound into account). The bit selection can either be at a fixed offset, or relative to the end of the tweak/key/input/output.
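To make the win condition above concrete, here is a hypothetical Python sketch of how it could be checked. The field names, the selection format, and the use of negative offsets for end-relative bit positions are my own illustration, not taken from the discussion draft:

```python
# Illustrative win condition for the commitment game sketched above.
# A bit offset is absolute if non-negative, or relative to the end
# of the field if negative (e.g. -1 is the last bit).

def bits(data: bytes, offsets) -> tuple:
    """Extract the selected bit positions from 'data' as a tuple."""
    n = len(data) * 8
    out = []
    for off in offsets:
        i = off if off >= 0 else n + off
        out.append((data[i // 8] >> (7 - i % 8)) & 1)
    return tuple(out)

def adversary_wins(ref: dict, cand: dict, selection: dict) -> bool:
    """ref/cand: dicts with 'tweak', 'key', 'input', 'output' bytes.
    selection: maps a subset of those fields to lists of bit offsets.
    The adversary wins if the two tuples differ somewhere but all
    selected bits collide."""
    if all(ref[f] == cand[f] for f in ref):
        return False  # an identical tuple is a trivial non-win
    return all(bits(ref[f], offs) == bits(cand[f], offs)
               for f, offs in selection.items())
```

For example, a candidate tuple whose key differs from the reference only in its last bit still wins when the selected bits are the first four bits of the key, but loses as soon as the selection includes that last bit.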

An additional area that I think deserves attention, especially given that nonce-misuse resistance is explicitly called out in the workshop discussion draft, is how to integrate usability evaluation into the evaluation process. Usability results aren't black and white, and they are hard to independently verify. As such, in addition to reproducible third-party usability studies, it might be good to encourage mode designers to carry out lightweight usability studies to catch problems early on. There is a wide variety of user study formats, let alone ethnography-inspired approaches and techniques such as co-design. However, as highlighted by Goodman et al. in 'Observing the User Experience', even something like a 'nano-size version of a guerrilla usability test' of a couple of minutes with one participant can give useful insights (note that explicit task selection and think-aloud are useful complements to such passive undirected observation). Also, heuristic evaluation by a (group of) usable security expert(s) could be valuable, either of the chosen interface(s) and/or the specification(s) of the mode. It might be wise to involve one or more people from the Human-Centered Cybersecurity team at NIST in the accordion cipher mode process.

Relatedly, at a broader level, the proposed accordion mode interface and its derived modes and applications (or better, 'categories of applications') can be evaluated in a black-box manner to determine pitfalls in expected use. For example, if you ask a representative study population to implement a secure channel using something like SCP03 for managing a secure element with multiple domains, my guess is that many participants will end up reusing the same key across these different domains. Even if you provide engineers with documentation like the NIST SP 800-57 series and tell them to do proper key management, in practice they will likely still make mistakes like this. As such, guidance on (and possibly an additional interface for) domain separation may be useful.
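As a sketch of what such guidance might recommend, one common domain-separation pattern is to derive independent per-domain keys from a master key with labelled HKDF-Expand (RFC 5869). The labels and the master key below are purely illustrative:

```python
# Minimal HKDF-Expand (RFC 5869) used for domain separation:
# distinct 'info' labels yield independent-looking subkeys.
import hashlib
import hmac

def hkdf_expand(prk: bytes, info: bytes, length: int = 16) -> bytes:
    out, t, i = b"", b"", 1
    while len(out) < length:
        t = hmac.new(prk, t + info + bytes([i]), hashlib.sha256).digest()
        out += t
        i += 1
    return out[:length]

master = b"\x00" * 32  # placeholder master key, for illustration only
k_domain_a = hkdf_expand(master, b"example: domain A")
k_domain_b = hkdf_expand(master, b"example: domain B")
assert k_domain_a != k_domain_b  # different labels, different keys
```

Baking a labelled derivation step like this into an interface removes the temptation to reuse one raw key across domains, since callers never handle the master key directly.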

As a case study, let's look at a simplified version of a scheme similar to CMC. While the scheme isn't very performant, the main goal is to serve as an illustration of how the design of a mode of operation could try to design out implementation errors observed in the wild, in this case with XTS (see the public comments and decision proposal comments on the review of SP 800-38E for more details on failure cases). This mode, or a different mode designed to take implementation errors into account, could serve as a comparison case against more performant ones in a usability study. We assume a key derived from a tweak as in e.g. Flex AEAD by Menda et al., and for simplicity assume that we are dealing with an input that is a multiple of the block length (ciphertext stealing could be used to handle variable-sized messages). Two subkeys are created based on the derived key, one being the bitwise inversion of the other. Encryption takes the form of two passes of CBC encryption with a zero IV: the first uses the forward direction of the block cipher with the 'even key' in a left-to-right direction, while the second pass works in a right-to-left direction and uses the backward direction of the block cipher with the 'odd key' (using whatever definition of odd and even for the two subkeys). Decryption is the same, except for reversing the order of the two subkeys.
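The decryption side of the description above is somewhat underspecified, so the following Python sketch takes one consistent reading in which decryption explicitly inverts the two passes. The block cipher is a stand-in four-round Feistel built from SHA-256, and the tweak-based key derivation is my own illustration; none of this is secure or from any specification, it only makes the two-pass structure runnable:

```python
# Toy sketch of the CMC-like two-pass mode described above. NOT secure.
import hashlib

BLOCK = 16  # block length in bytes

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def _f(key: bytes, rnd: int, half: bytes) -> bytes:
    # Feistel round function: hash of key, round number, and half-block.
    return hashlib.sha256(key + bytes([rnd]) + half).digest()[:BLOCK // 2]

def enc_block(key: bytes, block: bytes) -> bytes:
    l, r = block[:BLOCK // 2], block[BLOCK // 2:]
    for rnd in range(4):
        l, r = r, xor(l, _f(key, rnd, r))
    return l + r

def dec_block(key: bytes, block: bytes) -> bytes:
    l, r = block[:BLOCK // 2], block[BLOCK // 2:]
    for rnd in reversed(range(4)):
        l, r = xor(r, _f(key, rnd, l)), l
    return l + r

def derive_subkeys(key: bytes, tweak: bytes):
    # Tweak-derived key; the two subkeys are bitwise inversions.
    dk = hashlib.sha256(key + tweak).digest()[:BLOCK]
    return dk, bytes(b ^ 0xFF for b in dk)

def encrypt(key: bytes, tweak: bytes, pt: bytes) -> bytes:
    assert len(pt) % BLOCK == 0
    k_even, k_odd = derive_subkeys(key, tweak)
    blocks = [pt[i:i + BLOCK] for i in range(0, len(pt), BLOCK)]
    # Pass 1: left-to-right CBC encryption with the 'even' key, zero IV.
    prev, mid = bytes(BLOCK), []
    for b in blocks:
        prev = enc_block(k_even, xor(b, prev))
        mid.append(prev)
    # Pass 2: right-to-left chaining with the backward direction
    # and the 'odd' key, zero IV at the right end.
    prev, out = bytes(BLOCK), [b""] * len(mid)
    for i in reversed(range(len(mid))):
        prev = dec_block(k_odd, xor(mid[i], prev))
        out[i] = prev
    return b"".join(out)

def decrypt(key: bytes, tweak: bytes, ct: bytes) -> bytes:
    k_even, k_odd = derive_subkeys(key, tweak)
    blocks = [ct[i:i + BLOCK] for i in range(0, len(ct), BLOCK)]
    # Undo pass 2: mid[i] = E_odd(out[i]) xor out[i+1].
    mid = []
    for i in range(len(blocks)):
        nxt = blocks[i + 1] if i + 1 < len(blocks) else bytes(BLOCK)
        mid.append(xor(enc_block(k_odd, blocks[i]), nxt))
    # Undo pass 1: ordinary CBC decryption with the 'even' key.
    prev, pt = bytes(BLOCK), []
    for c in mid:
        pt.append(xor(dec_block(k_even, c), prev))
        prev = c
    return b"".join(pt)
```

Note that in this reading decryption is not literally the encryption routine with subkeys swapped; each pass is inverted explicitly, which is one reason a usability comparison against XTS would be interesting.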

It would be interesting to see whether fewer or different implementation errors are made when implementing the mode described above versus XTS (e.g. in a user study with a counterbalanced design and/or a think-aloud study), as well as whether usability problems similar to those observed in the wild show up. However, note that while the mode described above may meet the requirements of being a VIL-SPRP, similar to CBC-MAC, an adversary that has access to the key can easily create multiple messages where part of the output is the same.

A separate point of attention relates to the development and standardization process. In the workshop discussion draft, it is noted that 'NIST is likely to establish a collaborative process for developing a new accordion mode'. I strongly believe that some form of 'adversariality' should be built into this process in order to provide sufficient assurance that a situation like OCB2 is unlikely to be repeated. Note that I am not talking about adversarial discussions like those on the pqc-forum, but about making sure that there is healthy competition driving appropriately incentivised peer review by those with 'skin in the game'. Additionally, my understanding is that until NIST IR 7977 is reviewed by the Crypto Publication Review Board, the guidance in the document remains valid, including that 'NIST will consider the use of open competitions to establish cryptographic standards particularly when no consensus exists yet around the best algorithmic approach' and that 'cryptographic algorithm competitions are an especially powerful vehicle for working with cryptographers from the broad research community to fill particular standards-related needs'.

Regards,
Arne

On Wednesday, April 10, 2024 at 9:41:01 PM UTC+2, Dworkin, Morris J. (Fed) wrote:

John Preuß Mattsson

May 17, 2024, 4:05:06 AM
to ciphermodes-forum, Arne Padmos, Dworkin, Morris J. (Fed)

Hi,

Thanks to the NIST Cipher Modes Team for preparing the discussion draft. The draft was very helpful in getting me to start thinking about desired properties and applications of the accordion mode.

I agree with Arne that stronger properties than the security goal in Section 2.2 would be welcome. I had kind of assumed that the derived AEAD would provide commitment, but as Arne points out that does not follow from the definitions. It would be very nice if the derived modes provided commitment when used for encode-then-encipher authenticated encryption, preferably fully committing with a security strength of half the tag length. People will not start using 256-bit tags in their proprietary non-standardized protocols, but having 64-bit commitment security would in practice be a lot better than 0 bits of commitment security.

Otherwise I have somewhat mixed feelings about commitment. On the one hand, people building systems that assume AEADs provide commitment seems quite common. One example I recently thought of is COSE (RFC 9052) [1], which states that “Applications MUST NOT assume that "kid" values are unique. There may be more than one key with the same "kid" value, so all of the keys associated with this "kid" may need to be checked.” I.e., COSE assumes that a ciphertext can only be decrypted under a single key. On the other hand, it is unclear to me whether committing AEADs are the solution. Many of the commitment attacks are on systems using passwords for encryption. Passwords are inherently insecure and should only be used as part of MFA. In many other cases, a committing AEAD with 256-bit tags does not seem like the best solution. Take age [2] as an example. It mitigates the risk that two recipients decrypt to different plaintexts by adding a MAC in the header. Relying on the AEAD for commitment would have caused quite a lot of expansion, as age breaks up the file into chunks and encrypts them separately. I think the main thing that is needed is more standardized protocols and more formal verification of protocols. I think that there should also be derived modes without tags (for file system encryption in existing file systems) as well as with short tags. I do not think all parameters should provide strong commitment, as that requires very large tags (512-bit tags for 256-bit commitment security).

[1] https://www.rfc-editor.org/rfc/rfc9052.html

[2] https://github.com/FiloSottile/age

I agree with Arne that usability is very important. In addition to security properties, I think it is very important to also consider APIs and documentation. One could e.g. argue that users/applications should never even have to care about the details of nonces and replay protection. In addition to documenting the properties an algorithm has, I think it is equally important to clearly describe all the properties an algorithm does not have, preferably with examples. Security properties that you get for free are always welcome; security properties that come at a cost in performance or additional bytes are more problematic. If people don’t use the new algorithm, any nice security properties are in vain.

Regarding the standardization process, my understanding of “collaborative process” is that there will not be a competition like AES, LWC, SHA-3, and PQC. I think NIST could still have a call for proposals, and then “collaboratively” work on building the best accordion based on that. At the end of the collaborative effort I expect NIST to have a longer review period than for FIPS 203-205. NIST publishing a draft for an accordion mode and inviting cryptanalysis would likely attract a lot of researchers. NIST could even have a competition with prizes (money or honour) for the best attacks.

Cheers,
John Preuß Mattsson
Expert Cryptographic Algorithms and Security Protocols, Ericsson Research
