
interesting - NSA Dual Counter Mode (DCTR)


David Crick

Aug 2, 2001, 3:46:05 PM
http://csrc.nist.gov/encryption/modes/proposedmodes/

http://csrc.nist.gov/encryption/modes/proposedmodes/dctr/dctr-spec.pdf

--
+-------------------------------------------------------------------+
| David A Crick <dac...@cwcom.net> PGP: (AUG-2001 KEY) 0x552FA90F |
| Damon Hill Tribute Site: http://www.geocities.com/MotorCity/4236/ |
| M. Brundle Quotes: http://members.tripod.com/~vidcad/martin_b.htm |
+-------------------------------------------------------------------+

Roger Schlafly

Aug 2, 2001, 4:04:19 PM
"David Crick" <dac...@cwcom.net> wrote in message
news:3B69ADFD...@cwcom.net...
> http://csrc.nist.gov/encryption/modes/proposedmodes/
> http://csrc.nist.gov/encryption/modes/proposedmodes/dctr/dctr-spec.pdf

NIST had picked 6 modes of operation for AES. The 4 DES ones
plus CTR (counter) and MAC. Apparently the NSA proposal came
after the deadline but NIST is considering it anyway.

CTR has no integrity check, and the MAC mode cannot be used for
encryption. The new proposal might be viewed as a replacement for both,
as it seems to have the advantages of each.

John Savard

Aug 2, 2001, 6:06:30 PM
On Thu, 02 Aug 2001 20:04:19 GMT, "Roger Schlafly"
<roger...@my-dejanews.com> wrote, in part:

>CTR has no integrity check, and the MAC mode cannot be used for
>encryption. The new proposal might be viewed as a replacement for both,
>as it seems to have the advantages of each.

I like DCTR because it has advantages over DESX. However, one ought to
iterate the shift register 128 times, not once, between blocks - I'm
not sure if I understood this right. And it shouldn't necessarily be
restricted to 128 bits in length (for that matter, why just have one).

The idea of the initial counter value being secret is good, but in
some applications, one doesn't have a session key that is different
with each message. (i.e., where Rijndael is used alone, and not in
conjunction with a public-key cipher)

In that case, there should be a specification for an IV which is used
in combination with a secret value to produce an initial counter value
for each message sent with the same Rijndael cipher key.

John Savard
http://home.ecn.ab.ca/~jsavard/index.html
http://plaza.powersurfr.com/jsavard/other/ch02.htm

SCOTT19U.ZIP_GUY

Aug 2, 2001, 6:07:32 PM
roger...@my-dejanews.com (Roger Schlafly) wrote in
<7nia7.807$sB2.14...@twister2.starband.net>:

It does on the surface look better than CTR. I wonder
if they will ever consider bijective padding in modes that
require padding.

David A. Scott
--
SCOTT19U.ZIP NOW AVAILABLE WORLD WIDE "OLD VERSION"
http://www.jim.com/jamesd/Kong/scott19u.zip
My website http://members.nbci.com/ecil/index.htm
My crypto code http://radiusnet.net/crypto/archive/scott/
MY Compression Page http://members.nbci.com/ecil/compress.htm
**TO EMAIL ME drop the roman "five" **
Disclaimer:I am in no way responsible for any of the statements
made in the above text. For all I know I might be drugged.
As a famous person once said "any cryptographic
system is only as strong as its weakest link"

Douglas A. Gwyn

Aug 2, 2001, 7:13:37 PM
John Savard wrote:
> ... However, one ought to iterate the shift register 128 times,
> not once, between blocks ...

Why?

John Savard

Aug 3, 2001, 8:30:03 PM
On Thu, 2 Aug 2001 23:13:37 GMT, "Douglas A. Gwyn" <DAG...@null.net>
wrote, in part:
>John Savard wrote:

>Why?

Then no longer is there the simple relationship "this bit is the same
as that bit" between the quantities used in successive blocks. Of
course, I can't really claim that sort of security is actually needed,
and after all, it is meant to be an AES enciphering mode, not a new
cipher.

Volker Hetzer

Aug 6, 2001, 10:40:07 AM
David Crick wrote:
>
> http://csrc.nist.gov/encryption/modes/proposedmodes/
>
> http://csrc.nist.gov/encryption/modes/proposedmodes/dctr/dctr-spec.pdf
What is the reason that x_0 has to be kept secret?
If I didn't need the 64-bit checksum, could I make x_0 public?
Furthermore, the choice of the polynomial seems to be solely for
the checksum. Does that imply that I can just use a counter if
I am not interested in the checksum?

Greetings!
Volker

Volker Hetzer

Aug 6, 2001, 10:41:30 AM
If I didn't need the 128-bit checksum, could I make x_0 public?

John Myre

Aug 6, 2001, 4:03:06 PM
Volker Hetzer wrote:
<snip>

> What is the reason that x_0 has to be kept secret?
<snip>

Wouldn't the whole thing devolve into ECB otherwise?

JM

Volker Hetzer

Aug 7, 2001, 6:35:24 AM
No, repeated blocks would still encrypt differently.
You see, with CBC and counter mode the IV need not
be hidden. Why in DCTR?

Greetings!
Volker

John Myre

Aug 7, 2001, 10:43:47 AM
Volker Hetzer wrote:
>
> John Myre wrote:
> >
> > Volker Hetzer wrote:
> > <snip>
> > > What is the reason that x_0 has to be kept secret?
> > <snip>
> >
> > Wouldn't the whole thing devolve into ECB otherwise?
> No, repeated Blocks would still encrypt differently.

I see it. Oops.

> You see, with CBC and counter mode the IV need not
> be hidden. Why in DCTR?

I think it is not just repeated blocks that could be a problem.

In general, we worry about things like chosen ciphertext,
cut-and-paste, and so forth. Also, DCTR explicitly addresses
how to do authentication (without using an expensive primitive
like a MAC). I think of x_0 as the MAC key, although obviously
it isn't quite that simple.

JM

Volker Hetzer

Aug 7, 2001, 12:56:25 PM
I agree that x_0 could be important when trying to forge a checksum.
However, it doesn't look as if x_0 needs to be secret for the
encryption part.

Greetings!
Volker

postm...@127.1

Aug 8, 2001, 7:00:12 AM

According to John Myre <jm...@sandia.gov>:

> Wouldn't the whole thing devolve into ECB otherwise?

One way it *will* devolve is if the LFSR fill is zero.
There must be a mechanism to ensure x_0 is not zero (if it is used alone)
and x_0 + (SEQ | PRI | padding) is not zero. The latter can be assured
if ones' complement addition is used (it is known that SEQ is not zero
because one half of it is the inverse of the other). Contrary to the
footnote, 128-bit addition is not expensive if it is implemented as four
32-bit adds. The additional latency is incurred only once per packet
and can be hidden by other header processing.
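The four-32-bit-adds idea can be sketched concretely. The following Python sketch (the function name and limb layout are my own illustration, not from the DCM spec or Kevin's post) builds 128-bit ones'-complement addition from 32-bit pieces; the end-around carry is exactly what guarantees the sum can be all-zero only when both addends are zero:

```python
MASK32 = (1 << 32) - 1
MASK128 = (1 << 128) - 1

def add128_ones_complement(a, b):
    """128-bit ones'-complement (end-around-carry) addition of two
    128-bit integers, composed of four 32-bit adds."""
    out, carry = 0, 0
    for i in range(4):  # least significant 32-bit limb first
        s = ((a >> (32 * i)) & MASK32) + ((b >> (32 * i)) & MASK32) + carry
        out |= (s & MASK32) << (32 * i)
        carry = s >> 32
    # End-around carry: the carry out of the top limb wraps back into
    # the bottom.  A second wrap cannot occur, since out + 1 <= MASK128.
    if carry:
        out = (out + carry) & MASK128
    return out
```

Under this rule x_0 + (SEQ | PRI | padding) is all-zero only if both addends are zero, so a nonzero SEQ rules out the degenerate all-zero LFSR fill.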

According to John Savard:


> one ought to iterate the shift register 128 times, not once,

In software, an order 127 trinomial could be implemented which would
advance the state by 128 bits in about the same time as it would take
the proposed polynomial to advance by just 1 bit.


--
Don't_send_me_bulk/junk_email.__________...@SRC.Honeywell.com
Kevin R. Driscoll, Senior Staff Scientist PHONE: (612) 951-7263 FAX: -7438
POST: Honeywell M/S MN65-2200; 3660 Technology Drive; Mpls, MN 55418-1006; USA

David Wagner

Aug 8, 2001, 2:53:49 PM
>[...] The additional latency is incurred only once per packet

>and can be hidden by other header processing.

It seems that a major goal of DCTR mode is to minimize per-packet
latency for links requiring key-agility, so they might not have been
happy with such additional latency, and one would expect them to have
looked at this (since it was their main goal).

>In software, an order 127 trinomial could be implemented which would
>advance the state by 128 bits in about the same time as it would take
>the proposed polynomial to advance by just 1 bit.

Interesting! How does that work?

John Savard

Aug 8, 2001, 9:03:47 PM
On Wed, 8 Aug 2001 18:53:49 +0000 (UTC), d...@mozart.cs.berkeley.edu
(David Wagner) wrote, in part:
quoting "postmaster@127.1"

>>In software, an order 127 trinomial could be implemented which would
>>advance the state by 128 bits in about the same time as it would take
>>the proposed polynomial to advance by just 1 bit.

>Interesting! How does that work?

Hmm. Let's take a polynomial like x^127 + x + 1 and design a shift
register around it.

Since we're talking software, let's run it in Galois mode.

I think the idea is that advancing the state by 128 cycles is done by
an operation similar to x = x xor ( x >> 1 ). There are probably some
little details to clean up, so that isn't the exact equation.
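To make the sketch concrete, here is a small Python illustration (my own working, not from the thread) of a Galois-mode register for the trinomial x^127 + x + 1. Since x^127 ≡ x + 1 (mod p), we get x^128 ≡ x^2 + x, so 128 single steps collapse into two shifts, an XOR, and a short reduction, close in spirit to the shift-and-xor guess above:

```python
POLY_LOW = 0b11  # the "x + 1" part of p(x) = x^127 + x + 1

def mul_x(state):
    """One Galois step: multiply the 127-bit state by x, reducing mod p."""
    state <<= 1
    if state >> 127:                  # degree hit 127: apply x^127 = x + 1
        state ^= (1 << 127) | POLY_LOW
    return state

def advance128(state):
    """Advance 128 steps at once: x^128 = x^2 + x (mod p), so multiply
    by x^2 + x and fold back the (at most two) high terms."""
    t = (state << 2) ^ (state << 1)
    for d in (128, 127):              # fold x^d down using x^127 = x + 1
        if (t >> d) & 1:
            t ^= (1 << d) | (POLY_LOW << (d - 127))
    return t
```

For example, 128 applications of mul_x starting from the state 1 yield x^2 + x = 0b110, which advance128 produces in a single operation.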

postm...@127.1

Aug 9, 2001, 6:05:05 AM
jsa...@ecn.ab.SBLOK.ca.nowhere (John Savard) writes:

> On Wed, 8 Aug 2001 18:53:49 +0000 (UTC), d...@mozart.cs.berkeley.edu
> (David Wagner) wrote, in part:
> quoting "postmaster@127.1"
>
> >>In software, an order 127 trinomial could be implemented which would
> >>advance the state by 128 bits in about the same time as it would take
> >>the proposed polynomial to advance by just 1 bit.
>
> >Interesting! How does that work?
>
> Hmm. Let's take a polynomial like x^127 + x + 1 and design a shift
> register around it.
>
> Since we're talking software, let's run it in Galois mode.

No, throw out the conventional wisdom and use the Fibonacci form. The
trick is that the primitive x^127 + x^63 + 1 has its feedbacks spaced 64
terms apart. This allows a single XOR instruction to produce a full
word of new bits for up to 64-bit-wide machines. Pentium MMX assembly can
produce 128 new bits in 12 instructions (6 clocks for 2 interleaved LFSRs).

> I think the idea is that advancing the state by 128 cycles is done by
> an operation similar to x = x xor ( x >> 1 ). There are probably some
> little details to clean up, so that isn't the exact equation.

This won't work. The shift has to be larger than the word width in order
to prevent overlaps which require a lot more instructions "to clean up".
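The word-at-a-time trick can be sketched as follows (Python, with a bit layout of my own choosing - bit 0 oldest - rather than anything from the MMX code described above). Because the taps of x^127 + x^63 + 1 sit 64 positions apart, the next 64 output bits are one XOR of two 64-bit windows of the state, with no overlap to clean up:

```python
MASK64 = (1 << 64) - 1

def step_bit(state):
    """One Fibonacci step of the LFSR for x^127 + x^63 + 1.
    Bit i of `state` holds b_{n-127+i}; returns (new_state, output_bit)."""
    new = (state & 1) ^ ((state >> 63) & 1)   # b_n = b_{n-127} ^ b_{n-64}
    return (state >> 1) | (new << 126), new

def step_word(state):
    """64 steps at once: the next 64 bits are a single XOR of the low
    64-bit window with the 64-bit window starting 63 bits up."""
    new64 = (state & MASK64) ^ ((state >> 63) & MASK64)
    return (state >> 64) | (new64 << 63), new64
```

A quick consistency check is that 64 bitwise steps and one word step agree on both the 64 produced bits and the resulting state.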

--
Don't_send_me_bulk/junk_email.__________...@HTC.Honeywell.com

Benjamin Goldberg

Aug 9, 2001, 5:00:02 PM

David Crick wrote:
> http://csrc.nist.gov/encryption/modes/proposedmodes/

On the above page, it says
"** The submission of the Dual Counter Mode has been withdrawn."

> http://csrc.nist.gov/encryption/modes/proposedmodes/dctr/dctr-spec.pdf

--
I need more taglines. This one is getting old.

Roger Schlafly

Aug 9, 2001, 10:20:28 PM
"Benjamin Goldberg" <gol...@earthlink.net> wrote in message
news:3B72F9D2...@earthlink.net...

> David Crick wrote:
> > http://csrc.nist.gov/encryption/modes/proposedmodes/
> On the above page, it says
> "** The submission of the Dual Counter Mode has been withdrawn."
> > http://csrc.nist.gov/encryption/modes/proposedmodes/dctr/dctr-spec.pdf

Hmmm. Sounds like somebody cracked it. Or the NSA decided that
it was too good for civilian use.<g>

Tom St Denis

Aug 9, 2001, 11:46:25 PM

Somehow I doubt the latter...

Tom

Volker Hetzer

Aug 10, 2001, 5:48:29 AM
OTOH I find it unlikely that they spent 18 months designing a mode
only to have it broken within days of publication.

Greetings!
Volker

Mok-Kong Shen

Aug 10, 2001, 4:53:08 AM

Volker Hetzer wrote:
>
> OTOH I find it unlikely that they spent 18 months designing a mode
> only to have it broken within days of publication.

Assuming that these time data are right, I would say
it is a rare but not very unlikely event. Scientific
progress is often realized when some people suddenly
happen to have the right inspiration (together with
very good knowledge and experience in the fields
concerned, of course). The issue seems to support the
view that, on the one hand, the academics have been
advancing quite well in a domain once almost absolutely
dominated/reserved by the agencies, and that, on the
other hand, one should always be very prudent/conservative
in one's security evaluation and be prepared for
surprises, in my humble view.

M. K. Shen

Tom St Denis

Aug 10, 2001, 7:49:19 AM

Not entirely impossible. Eli Biham IIRC has broken ciphers during the
presentation before [dunno if that's true but makes one heck of an ego
trip story!]

Tom

Tom St Denis

Aug 10, 2001, 7:51:00 AM

To put it bluntly, and to paraphrase George Carlin a bit...

"Scientific progress occurs when somebody figures out how to nail two
things together that have never been nailed together before."

Also what is the big hype over DCTR anyways? What was it supposed to
prevent, as opposed to just normal CTR?

Tom

Volker Hetzer

Aug 10, 2001, 9:52:59 AM
Tom St Denis wrote:
> Also what is the big hype over DCTR anyways? What was it supposed to
> prevent, as opposed to just normal CTR?
CTR is a stream cipher. DCTR prevents that bit-flipping thing without
performance penalty.
Additionally, it offers a checksum, also for free.

Greetings!
Volker

Volker Hetzer

Aug 10, 2001, 9:57:13 AM
Tom St Denis wrote:

>
> Volker Hetzer wrote:
> > OTOH I find it unlikely that they spent 18 months designing a mode
> > only to have it broken within days of publication.
>
> Not entirely impossible. Eli Biham IIRC has broken ciphers during the
> presentation before
That's true, but even if the NSA is not all powerful, they are not idiots
either and I had hoped that for so simple a mode they did a decent proof.
- Maybe they didn't want it standardized because it's too specialised?
- Maybe after all their work with SHA-256/384/512 they considered a 128-bit
checksum unsafe?

Greetings!
Volker

Tom St Denis

Aug 10, 2001, 8:40:45 AM

No, I remember something about him breaking a cipher at a conference once.
Tom

Tom St Denis

Aug 10, 2001, 8:41:40 AM

How can it be for free when DCTR has higher complexity?

Also why is "bit flipping" a bad thing?

I have to agree with Wagner on this one. The role of a block cipher is
not to maintain integrity. That's a HASH function.

Tom

John Savard

Aug 10, 2001, 9:13:13 AM
On Fri, 10 Aug 2001 11:48:29 +0200, Volker Hetzer
<volker...@fujitsu-siemens.com> wrote, in part:

>OTOH I find it unlikely that they spent 18 months designing a mode
>only to have it broken within days of publication.

Then one has to figure out a third reason...

perhaps it was felt that it offered nothing some other suggested mode
could do as well?

Volker Hetzer

Aug 10, 2001, 10:54:13 AM
Tom St Denis wrote:
>
> Volker Hetzer wrote:
> > Tom St Denis wrote:
> >
> >>Also what is the big hype over DCTR anyways? What was it supposed to
> >>prevent, as opposed to just normal CTR?
> >>
> > CTR is a stream cipher. DCTR prevents that bit-flipping thing without
> > performance penalty.
> > Additionally, it offers a checksum, also for free.
>
> How can it be for free when DCTR has higher complexity?
Ok, at the cost of an additional xor. However, that is just a pipeline stage.

> Also why is "bit flipping" a bad thing?

I meant that attack on stream ciphers where you get the sender to send
his data twice, with one bit inserted. Sorry.

Without the checksum it is as fast as CTR (you can even leave out the
second xor x_i) while changing a bit in the ciphertext causes the whole
block to be garbled unpredictably.

Greetings!
Volker

Tom St Denis

Aug 10, 2001, 9:37:40 AM
Volker Hetzer wrote:
> Tom St Denis wrote:
>
>>Volker Hetzer wrote:
>>
>>>Tom St Denis wrote:
>>>
>>>
>>>>Also what is the big hype over DCTR anyways? What was it supposed to
>>>>prevent, as opposed to just normal CTR?
>>>>
>>>>
>>>CTR is a stream cipher. DCTR prevents that bit-flipping thing without
>>>performance penalty.
>>>Additionally, it offers a checksum, also for free.
>>>
>>How can it be for free when DCTR has higher complexity?
>>
> Ok, at the cost of an additional xor. However, that is just a pipeline stage.
>
>
>>Also why is "bit flipping" a bad thing?
>>
> I meant that attack on stream ciphers where you get the sender to send
> his data twice, with one bit inserted. Sorry.

Oh, well it is assumed in CTR mode that the initial counter is random
anyways.

> Without the checksum it is as fast as CTR (you can even leave out the
> second xor x_i) while changing a bit in the ciphertext causes the whole
> block to be garbled unpredictably.

Similar to CTR with a proper counter. If you guess the key wrong you
get garbage.

Tom

M. S. Bob

Aug 10, 2001, 10:10:51 AM
Benjamin Goldberg <gol...@earthlink.net> wrote in message news:<3B72F9D2...@earthlink.net>...
> David Crick wrote:
> >
> > http://csrc.nist.gov/encryption/modes/proposedmodes/
>
> On the above page, it says
> "** The submission of the Dual Counter Mode has been withdrawn."

I've removed the email addresses so as not to expose the people
mentioned to spam, or annoy sci.crypt readers. :-) The paper, "A Note
on NSA's Dual Counter Mode of Encryption", was written by Pompiliu
Donescu, Virgil Gligor, Phillip Rogaway, and David Wagner. I'm not
aware of the paper on any web sites yet, but you could check the
authors' sites.

This reminds me of Skipjack, and how it was "broken" in the open
community, most likely a sloppy job done for various "real-world"
internal reasons, perhaps because it is declassified work, or not
considered their primary job function by some insiders.


>From @radium.ncsc.mil Thu Aug 9 07:22:21 2001
Date: Thu, 09 Aug 2001 07:25:34 -0400
From: Robert Shea <rshea@>
To: @nist.gov, @nist.gov, @nist.gov,
@nist.gov
CC: @radium.ncsc.mil, @eng.umd.edu, @eng.umd.edu,
@cs.ucdavis.edu, @cs.berkeley.edu
Subject: Dual Counter Mode (DCM)

On behalf of Brian Snow, Technical Director, Information Assurance, NSA,
the following message is forwarded to the AES Team at NIST:

NSA believes that a license-free high-speed integrity-preserving mode
of operation is needed for the Advanced Encryption Standard, and was
pleased to submit the "Dual Counter Mode" (DCM) as a participant in the
series of AES Modes Workshops sponsored by NIST.

Recently Virgil Gligor and Pompiliu Donescu of the University of
Maryland, Phillip Rogaway of UC Davis and Chiang Mai University,
David Wagner of Berkeley, and possibly others, have produced results
concerning the secrecy and integrity claims made for DCM. We commend
them for their work.

We withdraw the Dual Counter Mode for consideration as a mode of
operation for AES at this time, while we consider the observations and
their ramifications. We believe a license-free high-speed
integrity-preserving mode of operation is still needed for AES, and
will continue to work on this problem as well as encourage others to
do so.

Brian D. Snow
Technical Director
Information Assurance Directorate
National Security Agency

Mark Wooding

Aug 10, 2001, 10:45:49 AM
Volker Hetzer <volker...@fujitsu-siemens.com> wrote:

> - Maybe they didn't want it standardized because it's too specialised?

I think it's more likely that it didn't provide any noticeable advantage
over OCB or similar modes.

> - Maybe after all their work with SHA-256/384/512 they considered a 128Bit
> checksum unsafe?

No. SHA-256 has to be able to resist offline collision-finding, which
takes square-root time. However, an integrity-aware encryption mode
only needs to be able to resist /online/ collision-finding, and moreover
there's a birthday limit on the encryption mode security anyway.

-- [mdw]

Roger Schlafly

Aug 10, 2001, 2:33:39 PM
"M. S. Bob" <ms...@hotmail.com> wrote

> This reminds me of Skipjack, and how it was "broken" in the open
> community, most likely a sloppy job done for various "real-world"
> internal reasons, perhaps because it is declassified work, or not
> considered their primary job function by some insiders.

AFAIK, Skipjack was never broken. Biham has a (rather farfetched)
attack on 31-rounds of Skipjack here:
http://www.cs.technion.ac.il/~biham/Reports/SkipJack/

But Skipjack is 32 rounds. This suggests that Skipjack was well-
engineered, not that it was sloppy.

> We withdraw the Dual Counter Mode for consideration as a mode of
> operation for AES at this time, while we consider the observations and
> their ramifications. We believe a license-free high-speed
> integrity-preserving mode of operation is still needed for AES, and
> will continue to work on this problem as well as encourage others to do so.
> Brian D. Snow
> Technical Director
> Information Assurance Directorate
> National Security Agency

Hmmm. Hope to see something soon.

M.S. Bob

Aug 11, 2001, 11:32:11 AM
Roger Schlafly wrote:
>
> "M. S. Bob" <ms...@hotmail.com> wrote
> > This reminds me of Skipjack, and how it was "broken" in the open
> > community, most likely a sloppy job done for various "real-world"
>
> AFAIK, Skipjack was never broken. Biham has a (rather farfetched)
> attack on 31-rounds of Skipjack here:
> http://www.cs.technion.ac.il/~biham/Reports/SkipJack/
>
> But Skipjack is 32 rounds. This suggests that Skipjack was well-
> engineered, not that it was sloppy.

An academic attack that works up to 31-rounds is a fairly significant
certification weakness. The margin of safety is tiny, a mere single
round, which makes many people uncomfortable using it.



> > We withdraw the Dual Counter Mode for consideration as a mode of
> > operation for AES at this time, while we consider the observations and
> > their ramifications. We believe a license-free high-speed integrity-preserving
> > mode of operation is still needed for AES, and will continue to work
> > on this problem as well as encourage others to do so.
> > Brian D. Snow

> > National Security Agency
>
> Hmmm. Hope to see something soon.

Agreed.

Tom St Denis

Aug 11, 2001, 11:39:55 AM
M.S. Bob wrote:
> Roger Schlafly wrote:
>
>>"M. S. Bob" <ms...@hotmail.com> wrote
>>
>>>This reminds me of Skipjack, and how it was "broken" in the open
>>>community, most likely a sloppy job done for various "real-world"
>>>
>>AFAIK, Skipjack was never broken. Biham has a (rather farfetched)
>>attack on 31-rounds of Skipjack here:
>>http://www.cs.technion.ac.il/~biham/Reports/SkipJack/
>>
>>But Skipjack is 32 rounds. This suggests that Skipjack was well-
>>engineered, not that it was sloppy.
>>
>
> An academic attack that works up to 31-rounds is a fairly significant
> certification weakness. The margin of safety is tiny, a mere single
> round, which makes many people uncomfortable using it.

Of course you forgot to mention the 31-round attack takes about as much
work as simply guessing the key anyway.

You might as well argue that Skipjack is a bad cipher not because it has
a 31-round attack but because the key is too small.

At any rate, Skipjack was made for 8-bit MCUs where they would only
encrypt at most 500 blocks per second [hehehe TC17 gets over 600
128-bit blocks per second on my 8051]. Even if the attack broke all 32
rounds with say 2^40 plaintexts it would not be feasible to use in practice.

Tom

Roger Schlafly

Aug 11, 2001, 11:55:59 AM
"M.S. Bob"

> > AFAIK, Skipjack was never broken. Biham has a (rather farfetched)
> > attack on 31-rounds of Skipjack here:
> > http://www.cs.technion.ac.il/~biham/Reports/SkipJack/
> > But Skipjack is 32 rounds. This suggests that Skipjack was well-
> > engineered, not that it was sloppy.
> An academic attack that works up to 31-rounds is a fairly significant
> certification weakness. The margin of safety is tiny, a mere single
> round, which makes many people uncomfortable using it.

No, it's not a weakness. Skipjack does what it was designed to do.

The only thing that makes anyone uncomfortable is the 80-bit keysize,
giving a 2^80 step exhaustive search attack. That is acceptable for
most applications, but not for some.

David Wagner

Aug 11, 2001, 1:41:41 PM
Roger Schlafly wrote:
>The only thing that makes anyone uncomfortable is the 80-bit keysize,
>giving a 2^80 step exhaustive search attack.

I can't agree with this part. The low security margin makes me somewhat
uncomfortable, and I don't think I'm alone in this respect. (But then,
I'm conservative about these things, so who knows.)

Tom St Denis

Aug 11, 2001, 1:43:00 PM

I think there are several issues. What are you protecting? Against who?

Recall that Skipjack is for low-bandwidth devices... an attack that
requires say 2^40 texts is not feasible...

Tom

Roger Schlafly

Aug 11, 2001, 12:44:42 PM
"David Wagner" <d...@mozart.cs.berkeley.edu> wrote

Are you similarly uncomfortable about the various DES attacks?
There are academic attacks on DES with 2^47 steps (and I think that
this number has even been improved a little bit). In theory, this is better
than the 2^56 exhaustive search attack. But there are a large number
of chosen plaintexts or other constraints on the academic attack. The
only attack I hear people worrying about in practice is the 2^56 brute
force attack. (Or 2^55, as there is a trick that saves a factor of 2.)
If 2^55 security is good enough for an application, then people are
happy with DES. IMO, DES essentially delivers the security it
promises.

And the same is true for Skipjack. In the case of Skipjack, there isn't
even any academic attack on the 32-round algorithm. You might be
a little concerned that it has been studied less than DES, but I think
Skipjack was published about 5 years ago and a number of people
would have liked to embarrass the NSA. IMO, Biham did not.

Maybe someday someone will find an attack on Skipjack with 2^79
chosen plaintexts and 2^79 operations, but I wouldn't lose any
sleep over it.

David Wagner

Aug 11, 2001, 2:34:33 PM
Roger Schlafly wrote:
>Are you similarly uncomfortable about the various DES attacks?

The question doesn't even come up with DES, because DES is clearly
unsuitable for deployment in new systems. But note that DES has had
hundreds of person-years of public cryptanalysis, whereas Skipjack has
had much less, so we have much greater assurance that we understand the
properties of DES than we do for Skipjack.

The worry is not that the known shortcut attacks are practical --
clearly they aren't. The fear is that the known attacks may represent
only the tip of the iceberg. "Attacks only get better", and if BBS
(Biham, Biryukov, and Shamir) can find an attack this devastating in
only a few months, what will we find in a decade? Or, so the concern
goes, anyway.

If this sounds too conservative, remember that there are alternatives
out there that are much better understood than Skipjack and that are
sufficient for most applications. Everything else being equal, why
should we use Skipjack when we have triple-DES and AES?

Douglas A. Gwyn

Aug 11, 2001, 8:57:11 PM
"M.S. Bob" wrote:
> Roger Schlafly wrote:
> > AFAIK, Skipjack was never broken. Biham has a (rather farfetched)
> > attack on 31-rounds ...

> > But Skipjack is 32 rounds. This suggests that Skipjack was well-
> > engineered, not that it was sloppy.
> An academic attack that works up to 31-rounds is a fairly significant
> certification weakness. The margin of safety is tiny, ...

That so-called "margin of safety" is a bogus notion.

Roger Schlafly

Aug 11, 2001, 8:13:23 PM
"David Wagner" <d...@mozart.cs.berkeley.edu> wrote

> Roger Schlafly wrote:
> >Are you similarly uncomfortable about the various DES attacks?
> The question doesn't even come up with DES, because DES is clearly
> unsuitable for deployment in new systems. But note that DES has had
> hundreds of person-years of public cryptanalysis, whereas Skipjack has
> had much less, so we have much greater assurance that we understand the
> properties of DES than we do for Skipjack.
> The worry is not that the known shortcut attacks are practical --
> clearly they aren't. The fear is that the known attacks may represent
> only the tip of the iceberg. "Attacks only get better", and if BBS can
> find an attack this devastating in only a few months, what will we find
> in a decade? Or, so the concern goes, anyway.

If you don't trust Skipjack, or want to speculate that better attacks
exist or will be found, that's fine with me. It's your opinion. But my
objection is to MS Bob's statement that Skipjack was broken, and
to your statement that there is a devastating attack.

Skipjack is not broken. There is no known attack that shows any
weakness in Skipjack. Period.

You don't say that a proof is wrong because someone almost found
a flaw, and you don't say that a block cipher is broken because
someone almost found an attack.

> If this sounds too conservative, remember that there are alternatives
> out there that are much better understood than Skipjack and that are
> sufficient for most applications. Everything else being equal, why
> should we use Skipjack when we have triple-DES and AES?

Skipjack can be implemented very compactly, so there are probably
scenarios where it might be preferable. But this is getting off the
point. Sure TDES & AES are the ciphers of choice. But that doesn't
justify calling everything else "broken".

Tom St Denis

Aug 11, 2001, 9:16:05 PM
Roger Schlafly wrote:
> "David Wagner" <d...@mozart.cs.berkeley.edu> wrote
>
>>Roger Schlafly wrote:
>>
>>>Are you similarly uncomfortable about the various DES attacks?
>>>
>>The question doesn't even come up with DES, because DES is clearly
>>unsuitable for deployment in new systems. But note that DES has had
>>hundreds of person-years of public cryptanalysis, whereas Skipjack has
>>had much less, so we have much greater assurance that we understand the
>>properties of DES than we do for Skipjack.
>>The worry is not that the known shortcut attacks are practical --
>>clearly they aren't. The fear is that the known attacks may represent
>>only the tip of the iceberg. "Attacks only get better", and if BBS can
>>find an attack this devastating in only a few months, what will we find
>>in a decade? Or, so the concern goes, anyway.
>>
>
> If you don't trust Skipjack, or want to speculate that better attacks
> exist or will be found, that's fine with me. Its your opinion. But my
> objection is to MS Bob's statement that Skipjack was broken, and
> to your statement that there is a devastating attack.
>
> Skipjack is not broken. There is no known attack that shows any
> weakness in Skipjack. Period.
>
> You don't say that a proof is wrong because someone almost found
> a flaw, and you don't say that a block cipher is broken because
> someone almost found an attack.

While I agree with this view it is important to realize certain warm
fuzzy things. For instance, I'm a dorky kid and I broke a very slight
variant of Noekeon. I would never ever use Noekeon in a real
application for the sole reason that if I could find an attack, most
likely some dedicated people could find a real attack.

Likewise, if in only a few months 31/32 rounds are broken I imagine in a
few years 32/32 rounds will be.

Of course there is a flipside. In a few years if the best attack on SJ
breaks 32 rounds and requires 2^79 work, I will laugh it up and ignore
the attack for all practical purposes.

>>If this sounds too conservative, remember that there are alternatives
>>out there that are much better understood than Skipjack and that are
>>sufficient for most applications. Everything else being equal, why
>>should we use Skipjack when we have triple-DES and AES?
>>
>
> Skipjack can be implemented very compactly, so there are probably
> scenarios where it might be preferable. But this is getting off the
> point. Sure TDES & AES are the ciphers of choice. But that doesn't
> justify calling everything else "broken".

Seriously, though, for some applications another non-AES cipher could
be more suitable [hmm TC17? :-)]

Tom

Douglas A. Gwyn

Aug 11, 2001, 9:30:20 PM
Tom St Denis wrote:
> Likewise, if in only a few months 31/32 rounds are broken I imagine
> in a few years 32/32 rounds will be.

There is absolutely no reason for such a conclusion.
What if, as has been suggested, Skipjack *had* been
designed with *exactly* enough rounds to guarantee
its security (against resources substantially less
than required or brute-force key search)? Then in
time T an effective 31-round attack is found, and
in time Infinity an effective 32-round attack is
found. Whatever form of extrapolation you're using
clearly breaks down in such a situation.

David Hopwood

unread,
Aug 11, 2001, 9:35:16 PM8/11/01
to
-----BEGIN PGP SIGNED MESSAGE-----

Roger Schlafly wrote:
> "M.S. Bob" wrote:
> > Roger Schlafly wrote:

> > > AFAIK, Skipjack was never broken. Biham has a (rather farfetched)
> > > attack on 31-rounds of Skipjack here:
> > > http://www.cs.technion.ac.il/~biham/Reports/SkipJack/
> > > But Skipjack is 32 rounds. This suggests that Skipjack was well-
> > > engineered, not that it was sloppy.
> >
> > An academic attack that works up to 31-rounds is a fairly significant
> > certification weakness. The margin of safety is tiny, a mere single
> > round, which makes many people uncomfortable using it.
>
> No, it's not a weakness. Skipjack does what it was designed to do.

If a publically designed cipher had an attack better than brute force when
reduced by one round, most people wouldn't touch it with a barge pole, at
least for new designs. I fail to see why NSA ciphers should be treated any
differently.


I'm also not surprised that DCTR was broken; the security analysis in
the paper describing it was basically non-existent - just a lot of vague
handwaving. Also, if I'm not mistaken the result described in section 6 of
Jutla's paper [1] applies: the minimum number of additional cryptographic
operations required to achieve the standard notions of message secrecy and
integrity in a mode of this type is O(log n), where n is the plaintext
length. Stepping an LFSR cannot be considered a cryptographic operation,
so DCTR doesn't meet this condition. If the DCTR authors didn't know about
this result, they just weren't paying attention.

[1] Charanjit Jutla,
"Encryption Modes with Almost Free Message Integrity,"
http://eprint.iacr.org/2000/039/
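The "stepping an LFSR cannot be considered a cryptographic operation" point above comes down to linearity over GF(2). A minimal sketch, using a hypothetical 128-bit Galois LFSR with an illustrative feedback constant (not the polynomial from the DCTR spec):

```python
FEEDBACK = 0x87               # illustrative feedback constant, not from the spec
MASK = (1 << 128) - 1

def lfsr_step(state):
    """One Galois-LFSR step: multiply by x modulo the feedback polynomial."""
    out = (state << 1) & MASK
    if state >> 127:          # feedback XOR when the high bit is set
        out ^= FEEDBACK
    return out

# Linearity over GF(2): step(a ^ b) == step(a) ^ step(b), and step(0) == 0.
# Differences between states therefore propagate predictably through any
# number of steps, which is why stepping alone adds no cryptographic strength.
a = 0x0123456789abcdef0123456789abcdef
b = 0xfedcba9876543210fedcba9876543210
assert lfsr_step(a ^ b) == lfsr_step(a) ^ lfsr_step(b)
```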

- --
David Hopwood <david....@zetnet.co.uk>

Home page & PGP public key: http://www.users.zetnet.co.uk/hopwood/
RSA 2048-bit; fingerprint 71 8E A6 23 0E D3 4C E5 0F 69 8C D4 FA 66 15 01
Nothing in this message is intended to be legally binding. If I revoke a
public key but refuse to specify why, it is because the private key has been
seized under the Regulation of Investigatory Powers Act; see www.fipr.org/rip


-----BEGIN PGP SIGNATURE-----
Version: 2.6.3i
Charset: noconv

iQEVAwUBO3XdMTkCAxeYt5gVAQE2eQf+NfvtBhR7wOeiTNTbzc3HlOZep893j2AK
Tu6Ln9CVq3V4nAI0qe+xFJ7RZjGEPqGwwnO7//2uOO3MwVZSC2bsTMIwOEzzyvhc
myl8E53rzXVhpxYHEI66GlAYd0vOhI37VkbtOCLoDX7lLo3Ta5CGx5B1Fp8CsQra
yHIlgEz/sKVrjZiVfc7RaYWC8DQzNxnDK7osN+Ohe8IZ0l19e2OmqZLknPoosg8+
yW0O2w5CodqkUgcnr0YYFNebmWaVt1yjLjUKcClpnYQ/l8KogG2rgQ8IMlQ0Xxmp
l152XJc9/TAC/eAmbKG/vV/jq2kd/D0Yd0Y2o68jAgC8XDG6tXtDNQ==
=dcWB
-----END PGP SIGNATURE-----

Tom St Denis

unread,
Aug 11, 2001, 10:41:53 PM8/11/01
to

Clearly you've forgotten all modern crypto literature. FEAL-4 was the
strongest and fastest cipher once. It fell, then FEAL-8, then ....

Just because SJ is strong now doesn't mean tomorrow I won't find an
attack that takes 10 plaintexts and a minute...

As David re-iterated, "attacks only get better".

Tom

Roger Schlafly

unread,
Aug 11, 2001, 10:23:36 PM8/11/01
to
"Douglas A. Gwyn" <DAG...@null.net> wrote in message
news:3B75D485...@null.net...

Yes. Those who chose Rijndael for AES apparently agree, as it does
not have much of a "margin of safety".

Tom St Denis

unread,
Aug 11, 2001, 11:25:27 PM8/11/01
to

Choosing AES was more about theoretical security. Rijndael was the most
efficient of the bunch.

Tom

David Wagner

unread,
Aug 12, 2001, 12:59:23 AM8/12/01
to
Roger Schlafly wrote:
>objection is [...], and
>to your statement that there is a devastating attack.

For the record, I never stated any such thing. Nor did I call
Skipjack broken. Re-check your attributions, please...

David Wagner

unread,
Aug 12, 2001, 1:01:15 AM8/12/01
to
Douglas A. Gwyn wrote:
>What if, as has been suggested, Skipjack *had* been
>designed with *exactly* enough rounds to guarantee
>its security (against resources substantially less
>than required or brute-force key search)?

That's possible, but beyond the resources of the public
community to verify. Without further evidence, the safe
conclusion to draw is that we can't tell whether Skipjack
is safe or not. We can't tell whether Triple-DES is safe
or not, either, but if I had to place my bets, you can be
sure that I'll place them on Triple-DES, and rightly so,
I believe, given the evidence available to us today. And,
if you think about it, using a cipher in a deployed system
*is* placing a bet...

David Wagner

unread,
Aug 12, 2001, 1:03:07 AM8/12/01
to
David Hopwood wrote:
>I'm also not surprised that DCTR was broken; the security analysis in
>the paper describing it was basically non-existant - just a lot of vague
>handwaving.

I agree.

>Also, if I'm not mistaken the result described in section 6 of
>Jutla's paper [1] applies: the minimum number of additional cryptographic
>operations required to achieve the standard notions of message secrecy and
>integrity in a mode of this type is O(log n), where n is the plaintext
>length.

This is not so clear. Jutla's lower bound assumes the only operations
available are block-wise XORs and block cipher operations. His bound
doesn't apply to the second variant of DCM, because the second variant
uses 32-bit additions. I'm not convinced that Jutla's bound applies to
the first variant, either, because I think Jutla's bound only considers
blockwise XORs and not, e.g., rotations or LFSR-steppings.
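The reason 32-bit addition escapes an XOR-only lower bound is that addition with carries is not linear over GF(2). A small illustration of the distinction:

```python
# Sketch: modular 32-bit addition is not GF(2)-linear, so a bound proved
# only for blockwise-XOR constructions need not cover modes that use it.
MASK32 = 0xFFFFFFFF

def add32(x, k):
    """Addition modulo 2^32, as used in the second DCM variant."""
    return (x + k) & MASK32

# XOR-linearity would require add32(a ^ m, k) == add32(a, k) ^ m.
# Carries break this even for tiny values:
a, m, k = 1, 1, 1
left = add32(a ^ m, k)      # (1 ^ 1) + 1 = 1
right = add32(a, k) ^ m     # (1 + 1) ^ 1 = 3
assert left != right        # differences mix in a carry-dependent way
```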

David Wagner

unread,
Aug 12, 2001, 1:06:34 AM8/12/01
to
Douglas A. Gwyn wrote:
>That so-called "margin of safety" is a bogus notion.

Why? In light of our own ignorance, it's the best approach
we know of to minimize the chances of having a cipher of ours
get broken. If we had confidence that we knew all the best
attacks on a cipher, then we could pick the exact minimum number
of rounds. In absence of such knowledge, we take a more conservative
stance. It's only rational, in absence of complete information.

Roger Schlafly

unread,
Aug 12, 2001, 12:56:08 AM8/12/01
to
"David Wagner" <d...@mozart.cs.berkeley.edu> wrote

My message attributed "broken" to MS Bob, not you. It also quoted
you from the previous message saying:

The fear is that the known attacks may represent
only the tip of the iceberg. "Attacks only get better", and if BBS can
find an attack this devastating in only a few months, what will we find
in a decade? Or, so the concern goes, anyway.

Perhaps this reflects some concern that is different from your true
beliefs, but I quoted your full paragraph when I made my comment.

Regardless of your intent in the above paragraph, my point is that
there is no break, no devastating attack, and no demonstrated
weakness of Skipjack. The Biham-Shamir attacks are on simplified
versions of Skipjack, and not on Skipjack.


David Wagner

unread,
Aug 12, 2001, 2:55:45 AM8/12/01
to
Roger Schlafly wrote:
>My message attributed "broken" to MS Bob, not you. It also quoted
>you from the previous message saying:
>
> The fear is that the known attacks may represent
> only the tip of the iceberg. "Attacks only get better", and if BBS can
> find an attack this devastating in only a few months, what will we find
> in a decade? Or, so the concern goes, anyway.
>
>Perhaps this reflects some concern that is different from your true
>beliefs, but I quoted your full paragraph when I made my comment.

You're absolutely right. The mistake was all mine.
I posted too quickly, and I apologize.

>Regardless of your intent in the above paragraph, my point is that
>there is no break, no devastating attack, and no demonstrated
>weakness of Skipjack. The Biham-Shamir attacks are on simplified
>versions of Skipjack, and not on Skipjack.

Yes, that's absolutely correct.

Douglas A. Gwyn

unread,
Aug 12, 2001, 3:58:51 AM8/12/01
to
Tom St Denis wrote:
> Clearly you've forgotten all modern crypto literature.

Hardly.

Douglas A. Gwyn

unread,
Aug 12, 2001, 4:00:26 AM8/12/01
to
David Hopwood wrote:
> If a publically designed cipher had an attack better than brute
> force when reduced by one round, most people wouldn't touch it
> with a barge pole, at least for new designs.

But that's due to a prejudice, not justified by logic.

Douglas A. Gwyn

unread,
Aug 12, 2001, 4:18:59 AM8/12/01
to

If it is rational, then somebody ought to be able to exhibit the
logical chain of reasoning. The argumentation you just gave,
which is typical of what I have seen from others who maintain
the same position, assumes some sort of continuity that has not
been demonstrated. There are any number of counterexamples to
this notion available in analogous situations, e.g. percolation.

David Wagner

unread,
Aug 12, 2001, 4:57:52 AM8/12/01
to
Douglas A. Gwyn wrote:

>David Wagner wrote:
>> If we had confidence that we knew all the best
>> attacks on a cipher, then we could pick the exact minimum number
>> of rounds. In absence of such knowledge, we take a more conservative
>> stance. It's only rational, in absence of complete information.
>
>If it is rational, then somebody ought to be able to exhibit the
>logical chain of reasoning. [...]

I thought that is what I was sketching. The cost of having too few
rounds is potentially huge (if AES gets broken); the cost of having a
few extra rounds is small (a few apps run 25% slower); so in the presence
of uncertainty, I would expect the right decision to be pretty clear.

Of course there is no proof that adding rounds will always increase
our assurance in the security of a cipher; this is just an attempt to
reduce the probability of error. And it is probably unrealistic to
expect proofs of correctness. Most of these issues seem to be at least
as hard as proving P!=NP, or some non-asymptotic equivalent (and there
are reasons to expect that this may be fundamental).

Nonetheless, even without a proof of soundness, as a rule of thumb,
it appears to be an extraordinarily good one. Historically, security
almost never decreases, and often increases, when you add more rounds to
real ciphers. If your cipher is secure with N rounds, it is probably
secure with N+1 rounds. There are exceptions, of course, usually due
to bad key scheduling, and there are no guarantees that the next cipher
will follow the same pattern, but as a heuristic, it's served us well,
and as a working hypothesis, it has stood the test of time fairly well.

Of course this reasoning cannot be proven to be sound. But it's the
best we've got. If you've got any suggestions for how to do better,
I'd certainly be interested to hear. If there aren't any concrete ideas
for improvement to be had, what teeth can any criticism have?

Douglas A. Gwyn

unread,
Aug 12, 2001, 5:38:00 AM8/12/01
to
David Wagner wrote:
> I'd certainly be interested to hear. If there aren't any concrete ideas
> for improvement to be had, what teeth can any criticism have?

Well, somebody should point out that you're merely fumbling in the dark:

> Of course there is no proof that adding rounds will always increase
> our assurance in the security of a cipher; this is just an attempt
> to reduce the probability of error.

What do you think a reputable automotive engineer ought to think about:
"Maybe one seat belt and one shoulder harness isn't enough in our
cars; maybe we should add a couple more belts, since I have no clue
how the number of seat belts relates to safety. Surely it is better
to be conservative than to guess wrong."
There is way too much of this kind of sloppy reasoning in today's world,
especially with regard to environmental issues. We don't need to
promulgate it in a sci.* newsgroup.

Sorry to pick on you, but you chose to pursue the argument. Many
others have adopted the same notion of security-as-a-smooth-function-
of-number-of-rounds. Perhaps there is *some* simple encryption
scheme where that property can be proven, but so far as I am aware
it is usually just assumed without even an inkling of proof.

M.S. Bob

unread,
Aug 12, 2001, 6:40:14 AM8/12/01
to
Roger Schlafly wrote:
>
> If you don't trust Skipjack, or want to speculate that better attacks
> exist or will be found, that's fine with me. Its your opinion. But my
> objection is to MS Bob's statement that Skipjack was broken, and
> to your statement that there is a devastating attack.
>
> Skipjack is not broken. There is no known attack that shows any
> weakness in Skipjack. Period.

I apologize for my sloppiness and overzealous statement, Skipjack is not
broken.

Still, I think saying that Biham and Shamir's impossible differential
attack on a 31-round variant of Skipjack does not show any weakness is
an overstatement as well.

> You don't say that a proof is wrong because someone almost found
> a flaw, and you don't say that a block cipher is broken because
> someone almost found an attack.

While I should not have said a cipher is broken because someone almost
found an attack, I think it is still reasonable to claim it does not
appear to be as strong as advertised, and thus not as good as initially
claimed.

M.S. Bob

unread,
Aug 12, 2001, 6:42:17 AM8/12/01
to

Advice on evaluating ciphers, and hashes, would be gratefully received.

M.S. Bob

unread,
Aug 12, 2001, 10:09:21 AM8/12/01
to
"Douglas A. Gwyn" wrote:
>
> David Wagner wrote:
> > I'd certainly be interested to hear. If there aren't any concrete ideas
> > for improvement to be had, what teeth can any criticism have?
>
> Well, somebody should point out that you're merely fumbling in the dark:

Yes, agreed.



> > Of course there is no proof that adding rounds will always increase
> > our assurance in the security of a cipher; this is just an attempt
> > to reduce the probability of error.
>
> What do you think a reputable automotive engineer ought to think about:
> "Maybe one seat belt and one shoulder harness isn't enough in our
> cars; maybe we should add a couple more belts, since I have no clue
> how the number of seat belts relates to safety. Surely it is better
> to be conservative than to guess wrong."

The automotive engineer can design tests within the realms of physics,
which does not change its characteristics over time. That is, if a
crash test performed gave a 'pass' result on Monday you expect the same
crash test two years later to produce the same 'pass' result.

When designing a cipher to resist any known attack at the time of the
attempted attack, you have to deal with the fact that the body of
knowledge of your opponent increases as a function of time.

So how can we predict what future knowledge the opponent has, in the
realms of understanding fundamental questions of mathematics, and
computer science? If this can be estimated I'd like to know more.

Otherwise we are limited to evaluating based on what we know, not on
what the opponent knows, at the present moment, which even with a
provably secure design still leaves a lot to be desired.

I take it you think there should be more actual experimentation, more
calculation of the resources needed to mount an attack, and more basic
analysis and design based upon fundamentals (provably secure design, well
understood basic building blocks, etc.), which are good intentions, and
any serious proposal should follow such basic recommendations. I think
this is not complete, otherwise we might be faced with using unwieldy
provably secure systems, or ancient but well understood algorithms (does
DES qualify yet?).

Roger Schlafly

unread,
Aug 12, 2001, 10:02:23 AM8/12/01
to
"M.S. Bob" <ms...@hotmail.com-edy> wrote

> > You don't say that a proof is wrong because someone almost found
> > a flaw, and you don't say that a block cipher is broken because
> > someone almost found an attack.
> While I should not of said a cipher is broken because someone almost
> found an attack, I think it is still reasonable to claim it does not
> appear to be as strong as advertised, and thus not as good as initially
> claimed.

How was Skipjack advertised? I don't remember exactly, except that
it was supposed to be more secure than DES. And I think it is, AFAIK.
If I had to decide between DES and Skipjack today, I would choose
Skipjack as less likely to get broken. Skipjack is stronger than DES
in any way you want to measure it (based on current knowledge).

If Skipjack were released with an analysis saying that, "we determined
that 16 rounds withstand the best attacks, so we doubled it to 32 to
add an extra margin of safety", then I'd agree that it is not as good as
claimed. But the NSA did not say that, and may not even believe in
the "margin of safety" concept. It said that 32 rounds was secure,
and didn't say anything about 31 rounds.

Tom St Denis

unread,
Aug 12, 2001, 11:09:08 AM8/12/01
to

Whoa whoa whoa. You're saying that the NSA does cryptanalysis too?

Tom

Kurt Shoens

unread,
Aug 12, 2001, 12:16:48 PM8/12/01
to
In article <9l5geg$159t$1...@agate.berkeley.edu>,
David Wagner <d...@mozart.cs.berkeley.edu> wrote (about security margin):

>Nonetheless, even without a proof of soundness, as a rule of thumb,
>it appears to be an extraordinarily good one. Historically, security
>almost never decreases, and often increase, when you add more rounds to
>real ciphers.

Doug Gwyn can question this from the perspective of a single cipher, but
what seems shakier is comparing two completely different ciphers on the
basis of security margin to see which is more secure.

Is Serpent more secure than Rijndael? The popular opinion in sci.crypt
says "Definitely yes," but I don't understand how that's justified. Maybe
there's an attack that's effective on Serpent but not on Rijndael.

Tom St Denis

unread,
Aug 12, 2001, 12:20:44 PM8/12/01
to

That's always a possibility. But remember that Serpent has more rounds.
The likelihood of a 32-round attack is much smaller than the
likelihood of a 10-round attack.

Of course that is not definite. Just look at the design of Serpent.
You will see that it is very hard to get it to do anything controllable.

Similarly with Rijndael.

Tom

Paul Crowley

unread,
Aug 13, 2001, 1:36:32 AM8/13/01
to
d...@mozart.cs.berkeley.edu (David Wagner) writes:
> >Also, if I'm not mistaken the result described in section 6 of
> >Jutla's paper [1] applies: the minimum number of additional cryptographic
> >operations required to achieve the standard notions of message secrecy and
> >integrity in a mode of this type is O(log n), where n is the plaintext
> >length.
>
> This is not so clear. Jutla's lower bound assumes the only operations
> available are block-wise XORs and block cipher operations.

This is also, I think, why OCB mode escapes Jutla's bound.
--
__ Paul Crowley
\/ o\ s...@paul.cluefactory.org.uk
/\__/ http://www.cluefactory.org.uk/paul/
"Conservation of angular momentum makes the world go around" - John Clark

Volker Hetzer

unread,
Aug 13, 2001, 10:13:42 AM8/13/01
to
Tom St Denis wrote:
> > Without the checksum it is as fast as CTR (you can even leave out the
> > second xor x_i) while changing a bit in the ciphertext causes the whole
> > block to be garbled unpredictably.
>
> Similar to CTR with a proper counter. If you guess the key wrong you
> get garbage..
Actually no. In CTR you can toggle single bits, making a checksum
much more important.

Greetings!
Volker
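The bit-toggling point above is easy to demonstrate: because CTR ciphertext is just plaintext XOR keystream, flipping a ciphertext bit flips exactly that plaintext bit. A toy sketch with a stand-in keystream (not real block-cipher CTR output), and a hypothetical message:

```python
# Toy demonstration of CTR-mode malleability. The keystream here is a
# fixed stand-in for E_K(counter) blocks; the attack works the same way
# with a real keystream, since the attacker never needs to know it bit-exactly,
# only which plaintext bits to flip.

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

keystream = bytes(range(16))            # stand-in for E_K(counter) output
plaintext = b"PAY ALICE $00100"

ciphertext = xor_bytes(plaintext, keystream)

# Attacker flips bits in the ciphertext without knowing the key:
tampered = bytearray(ciphertext)
tampered[11] ^= ord('0') ^ ord('9')     # change the amount's first digit

decrypted = xor_bytes(bytes(tampered), keystream)
# decrypted == b"PAY ALICE $90100" -- a predictable single-character change,
# which is why CTR without an integrity check (a MAC or checksum) is risky.
```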

Stanley Chow

unread,
Aug 13, 2001, 1:14:13 PM8/13/01
to
David Wagner wrote:
>
> I thought that is what I was sketching. The cost of having too few
> rounds is potentially huge (if AES gets broken); the cost of having a
> few extra rounds is small (a few apps run 25% slower); so in the presence
> of uncertainty, I would expect the right decision to be pretty clear.

I like to point out a practical problem - what is the correct margin?

In many engineering disciplines, there are accepted rules-of-thumb
but they are mostly based on accurate measurements of strength and
good understandings of failure modes (or long history of analysis
of broken systems).

In crypto, it is inherently not possible to have an accurate
measurement of strength - it is only possible to show resistance
to known attacks. For any given cipher, it is always possible for
the next new attack to completely break it. In many ways, this is
due to our lack of understanding of failure modes - we just don't
know all the attacks!

> Of course this reasoning cannot be proven to be sound. But it's the
> best we've got. If you've got any suggestions for how to do better,
> I'd certainly be interested to hear. If there aren't any concrete ideas
> for improvement to be had, what teeth can any criticism have?

Given this lack of knowledge, how does one pick the margin? Whatever
number of rounds you pick, it is always arguable that you should have
more. When do you stop? It seems to me there is only one sensible
approach:
a) meet good general design principles
b) resist all known attacks
c) project likely growth of known but incomplete attacks, and
resist them.
(I assume that David does not disagree with this, but that he would
add the extra criterion of a safety margin).

Note that it takes a certain amount of "confidence" that we have
sufficient knowledge about design principles and attacks; but if we
don't have that confidence, then all bets are off anyway. No amount
of extra rounds can make up for lack of knowledge. As new attacks
are found, formerly good systems will be considered broken; and new
systems have to resist the new attacks. That is how science grows.

I argue that the open crypto community has been following this
general approach. I further argue that the "open knowledge" is now
good enough for this approach to pick good designs (which is not to
say that the picked designs are guaranteed secure). I fully expect
new attacks to happen and for the knowledge to grow, but at this
point of the knowledge curve, there is enough science.


--
Stanley Chow C.T.O. stanle...@cloakware.com
Cloakware Corp (613) 271-9446 x 223

Mok-Kong Shen

unread,
Aug 13, 2001, 2:19:12 PM8/13/01
to

Stanley Chow wrote:
>
> I like to point out a practical problem - what is the correct margin?
>
> In many engineering disciplines, there are accepted rules-of-thumb
> but they are mostly based on accurate measurements of strength and
> good understandings of failure modes (or long history of analysis
> of broken systems).

But in all engineering disciplines there are constraints
of economy and technology (including time of construction).
One doesn't build e.g. a bridge that could withstand
earthquakes of the highest value on the Richter scale
but accepts reasonable (certainly arguable) 'compromises'.

>
> In crypto, it is inherently not possible to have an accurate
> measurement of strength - it is only possible to show resistance
> to known attacks. For any given cipher, it is always possible for
> the next new attack to completely break it. In many ways, this is
> due to our lack of understanding of failure modes - we just don't
> know all the attacks!

That's why I believe that one needs to be comparatively
more conservative in crypto than in engineering. Of
course, that's in the end a 'subjective' issue, not
amenable to rigorously objective assessment.

> Given this lack of knowledge, how does one pick the margin? Whatever
> number of rounds you pick, it is always arguable that you should have
> more. When do you stop? It seems to me there is only one sensible
> approach:
> a) meet good general design principles
> b) resist all known attacks
> c) project likely growth of known but incomplete attacks, and
> resist them.
> (I assume that David does not disagree with this, but that he would
> add the extra criterion of a safety margin).
>
> Note that it takes a certain amount of "confidence" that we have
> sufficient knowledge about design principles and attacks; but if we
> don't have that confidence, then all bets are off anyway. No amount
> of extra rounds can make up for lack of knowledge. As new attacks
> are found, formerly good systems will be considered broken; and new
> systems have to resist the new attacks. That is how science grows.
>
> I argue that the open crypto community has been following this
> general approach. I further argue that the "open knowledge" is now
> good enough for this approach to pick good designs (which is not to
> say that the picked designs are guaranteed secure). I fully expect
> new attacks to happen and for the knowledge to grow, but at this
> point of the knowledge curve, there is enough science.

It is generally the case in engineering practice that
the factors of safety used depend to some degree on
the type of constructions (intended lifespan, consequence
of failure, cost of repair, etc.). I suppose that this
is also principally true in crypto. Thus messages
of low value may need less security protection than
those of high value that are threatened by a mighty opponent.

M. K. Shen

Stanley Chow

unread,
Aug 13, 2001, 9:18:42 PM8/13/01
to
Mok-Kong Shen wrote:
>
> Stanley Chow wrote:
> >
> > I like to point out a practical problem - what is the correct margin?
> >
> > In many engineering disciplines, there are accepted rules-of-thumb
> > but they are mostly based on accurate measurements of strength and
> > good understandings of failure modes (or long history of analysis
> > of broken systems).
>
> But in all engineering disciplines there are constraints
> of economy and technology (including time of construction).
> One doesn't build e.g. a bridge that could withstand
> earthquakes of the highest value on the Richter scale
> but accept reasonable (certainly arguable) 'compromises'.

The same constraints apply in crypto systems as well. I think
the two issues, constraints and margin, are orthogonal.


> > In crypto, it is inherently not possible to have an accurate
> > measurement of strength - it is only possible to show resistance
> > to known attacks. For any given cipher, it is always possible for
> > the next new attack to completely break it. In many ways, this is
> > due to our lack of understanding of failure modes - we just don't
> > know all the attacks!
>
> That's way I believe that one needs to be comparatively
> more conservative in crypto than in engineering. Of
> course, that's in the end an 'subjective' issue, not
> amenable to rigorously objective assessment.

If you know how to accurately measure strength, you don't need to
allow big margins. Some would say this is how NSA designs.

If you don't know how to measure, then it is doubtful you know how
to compare strength, or to decide what is enough strength. Which
means you don't know how much margin you need, and you don't know
how to translate that required margin into extra rounds needed.

Basically, if you don't know how to measure strength, you shouldn't
be in the game. (I think this is a paraphrase of Douglas Gwyn's
"bogus notion" that started this sub-thread).

John A. Malley

unread,
Aug 14, 2001, 1:21:57 AM8/14/01
to
Stanley Chow wrote:

[...]

> If you know how to accurately measure strength, you don't need to
> allow big margins. Some would say this is how NSA designs.

Suppose cipher "strength" is quantified and we validate a given cipher
design as implemented in hardware/software has this measure of
strength. The measured strength might drop (below the accepted minimum
strength) in the presence of one or more hardware failures probable in
the field life of the equipment. If we engineer some margin of strength
above the minimum into the cipher (by its design) we could ensure the
fielded system remains at or above some minimum strength even in the
face of probable hardware failures. The probability of hardware failure
is quantifiable so we can gauge the risk (over the life of the fleet of
deployed cipher system implementations) of any single cipher unit
malfunctioning below the minimum acceptable standard of strength.

No margin of strength with respect to the impact of hardware failures
puts the deployed system at greater risk of compromise due to faults,
doesn't it?

Does anyone know if the developers of field crypto devices for our
world's military forces design for robustness like this?

John A. Malley
10266...@compuserve.com

Douglas A. Gwyn

unread,
Aug 14, 2001, 10:39:19 AM8/14/01
to
Stanley Chow wrote:
> c) project likely growth of known but incomplete attacks, and
> resist them.
> (I assume that David does not disagree with this, but that he would
> add the extra criterion of a safety margin).

My point is that there is no justification for the notion that
increasing the number of rounds adds any appreciable amount of
protection against unknown methods of attack.

I bet there is a fixed-point theorem that says that past a
certain point it would even be harmful.

> ... No amount of extra rounds can make up for lack of knowledge. ...
> ... I fully expect new attacks to happen and for the knowledge to
> grow, but at this point of the knowledge curve, there is enough
> science.

"Enough science" for what? Not for increasing the number of rounds
beyond necessity.

Douglas A. Gwyn

unread,
Aug 14, 2001, 10:43:49 AM8/14/01
to
" John A. Malley" wrote:
> No margin of strength with respect to the impact of hardware failures
> puts the deployed system at greater risk of compromise due to faults,
> doesn't it?

The kind of hardware failures that most commonly occur would not be
ameliorated by increasing the number of rounds. Calling that a
"margin of strength" is badly misleading.

> Does anyone know if the developers of field crypto devices for our
> world's military forces design for robustness like this?

There are various safeguards.

Douglas A. Gwyn

unread,
Aug 14, 2001, 10:41:25 AM8/14/01
to
Stanley Chow wrote:
> Basically, if you don't know how to measure strength, you shouldn't
> be in the game. (I think this is a paraphrase of Douglas Gwyn's
> "bogus notion" that started this sub-thread).

It's somewhat more extreme than I intended, but the idea is right.
Basically, I was warning against getting a warm, fuzzy feeling of
security based on a decision that is not justified by good logic.

Stanley Chow

unread,
Aug 14, 2001, 11:52:45 AM8/14/01
to
" John A. Malley" wrote:
>
> Stanley Chow wrote:
>
> [...]
>
> > If you know how to accurately measure strength, you don't need to
> > allow big margins. Some would say this is how NSA designs.
>
> Suppose cipher "strength" is quantified and we validate a given cipher
> design as implemented in hardware/software has this measure of
> strength. The measured strength might drop (below the accepted minimum
> strength) in the presence of one or more hardware failures probable in
> the field life of the equipment. If we engineer some margin of strength
> above the minimum into the cipher (by its design) we could ensure the
> fielded system remains at or above some minimum strength even in the
> face of probable hardware failures. The probability of hardware failure
> is quantifiable so we can gauge the risk (over the life of the fleet of
> deployed cipher system implementations) of any single cipher unit
> malfunctioning below the minimum acceptable standard of strength.

Just to clarify - there are two kinds of margins. John is talking
about "margins for known faults". I am echoing Douglas in opposition
to the "margins for *UN*known faults".

I am totally for what John is proposing, and I would expect a
complete fault-tree analysis (or equivalent) combined with product
and environment to decided the correct margins. I don't think anyone
disagrees with this part.


> No margin of strength with respect to the impact of hardware failures
> puts the deployed system at greater risk of compromise due to faults,
> doesn't it?

The question is really on "margins for unknown faults". There
seems to be no real justification for those.

Tom St Denis

unread,
Aug 14, 2001, 12:04:55 PM8/14/01
to

Why you pack extra socks when going on vacation?

Tom

Tom St Denis

unread,
Aug 14, 2001, 12:19:16 PM8/14/01
to
Tom St Denis wrote:
>> The question is really on "margins for unknown faults". There
>> seems to be no real justification for those.
>
>
> Why you pack extra socks when going on vacation?

Oops... missing a word there..

"Why *would* you pack extra socks when going on vacation?"

Tom

Stanley Chow

unread,
Aug 14, 2001, 1:58:16 PM8/14/01
to
"Douglas A. Gwyn" wrote:
>
> Stanley Chow wrote:
> > c) project likely growth of known but incomplete attacks, and
> > resist them.
>
> My point is that there is no justification for the notion that
> increasing the number of rounds adds any appreciable amount of
> protection against unknown methods of attack.

Man, my writing must be really poor - I am trying to support your
point by showing that it does not make sense to just blindly add
more rounds.

> > ... No amount of extra rounds can make up for lack of knowledge. ...
> > ... I fully expect new attacks to happen and for the knowledge to
> > grow, but at this point of the knowledge curve, there is enough
> > science.
>
> "Enough science" for what? Not for increasing the number of rounds
> beyond necessity.

"Enough science" in the sense that the (collective) knowledge in the
open crypto community is now enough to do a good accessment of the
security of a crypto-system. Until there is enough science, people
have to rely on black magic incantation like more rounds are better;
when there is enough science, you do better by relying on analysis.
(As Brian Gladman points out in another post, the open community may
well have caught up to the NSA, at least in some areas.)

Tom St Denis

unread,
Aug 14, 2001, 2:06:49 PM8/14/01
to

I don't think the negative connotation is warranted. We don't know the
exact effects of aspirin on the nervous system, but we still prescribe it.

The idea "more rounds are better" is typically a good idea when the
rounds are implemented correctly. It's about as much science as we
have, aside from the hordes of analysis possible.

Tom

Douglas A. Gwyn

unread,
Aug 14, 2001, 4:34:59 PM8/14/01
to
Tom St Denis wrote:
> The idea "more rounds are better" is typically a good idea
> when the rounds are implemented correctly.

Why?

> It's about as much science as we have, aside from the hordes
> of analysis possible.

I am waiting for a "scientific" answer to the question "why?".
(Of course we're talking about rounds beyond the number known
to be needed to thwart specific known attacks.)

Does cyanoacrylate adhesive bind better when you pour more
and more onto the surfaces being joined? More is not always
better.

Tom St Denis

unread,
Aug 14, 2001, 5:04:29 PM8/14/01
to
Douglas A. Gwyn wrote:
> Tom St Denis wrote:
>
>>The idea "more rounds are better" is typically a good idea
>>when the rounds are implemented correctly.
>>
>
> Why?

If the cipher is a markov chain ...

>>It's about as much science as we have, aside from the hordes
>>of analysis possible.
>>
>
> I am waiting for a "scientific" answer to the question "why?".
> (Of course we're talking about rounds beyond the number known
> to be needed to thwart specific known attacks.)

Take a course in finite mathematics. Then see the effect of the
probability of success on longer Markov chains. It's not exact, but a
good place to start.

For instance, if you find a 1R attack that has a prob of 1/100 of
working, then obviously 30 rounds are better than 10 rounds at making
the attack inefficient.

That means nothing on the whole for "total" security, since a new
attack may have a prob of 1/5 for 1 round... but in the absence of a
"best attack prob = " equation for all ciphers... :-)


> Does cyanoacrylate adhesive bind better when you pour more
> and more onto the surfaces being joined? More is not always
> better.

I never said more is ALWAYS better. I said in this case it could be.

Tom

Tom St Denis

unread,
Aug 14, 2001, 5:07:16 PM8/14/01
to
Tom St Denis wrote:
> Douglas A. Gwyn wrote:
> Take a course in finite mathematics. Then see the effect of the
> probability of success on longer Markov chains. It's not exact, but
> a good place to start.

Eeegad that's rude. Sorry Douglas. I didn't mean any disrespect there.

Tom

wtshaw

unread,
Aug 14, 2001, 8:58:28 PM8/14/01
to
In article <3B7968E6...@yahoo.com>, Tom St Denis
<tomst...@yahoo.com> wrote:


> ... We don't know the
> exact effects of aspirin on the nervous system, but we still prescribe it.
>
> The idea "more rounds are better" is typically a good idea when the
> rounds are implemented correctly. It's about as much science as we
> have, aside from the hordes of analysis possible.
>
> Tom

Two wrong statements, both based on incomplete, bad, or almost no
information on your part. With apologies requested for my being honest,
it might surprise you that those who have worked intensively in both of
these areas have more than a simple clue about them, which you don't.
These two areas I understand personally rather well.
--
I need a clone.

Tom St Denis

unread,
Aug 14, 2001, 8:52:14 PM8/14/01
to

Never trust science fully. A good rule to live by. Otherwise we would
all be eating beef galore, etc... [insert funny things].

If you think medicine knows all there is to know about aspirin, then you
are more ignorant than I am.

Tom

John Savard

unread,
Aug 14, 2001, 10:51:09 PM8/14/01
to
On Sun, 12 Aug 2001 09:38:00 GMT, "Douglas A. Gwyn" <DAG...@null.net>
wrote, in part:

>Many
>others have adopted the same notion of security-as-a-smooth-function-
>of-number-of-rounds. Perhaps there is *some* simple encryption
>scheme where that property can be proven, but so far as I am aware
>it is usually just assumed without even an inkling of proof.

Well, I'm willing to fumble in the dark.

My advice would be that if the encryption scheme _is_ simple -
specifically, if there is any risk that the round function might be a
group (!) or close to one - adding more rounds might not help.

But if the round function is of a complex and heterogeneous nature,
adding more rounds is more likely to be helpful.
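The "group" danger above has a well-known toy illustration (mine, and deliberately degenerate; no real cipher is this simple): if the round function is just XOR with a subkey, the rounds compose into a single round, so adding more buys nothing.

```python
def round_fn(x, k):
    # Degenerate round: XOR with the subkey.  The set of such rounds
    # is closed under composition (it forms a group), which is exactly
    # the failure mode being warned against.
    return x ^ k

def encrypt(x, subkeys):
    for k in subkeys:
        x = round_fn(x, k)
    return x

subkeys = [0x3A, 0x5C, 0x91, 0x07]
combined = 0x3A ^ 0x5C ^ 0x91 ^ 0x07

# Four "rounds" collapse to one -- the extra rounds added nothing.
assert encrypt(0xB2, subkeys) == round_fn(0xB2, combined)
```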

John Savard
http://home.ecn.ab.ca/~jsavard/index.html

John Savard

unread,
Aug 14, 2001, 10:54:35 PM8/14/01
to
On Tue, 14 Aug 2001 14:43:49 GMT, "Douglas A. Gwyn" <DAG...@null.net>
wrote, in part:
>" John A. Malley" wrote:

>> No margin of strength with respect to the impact of hardware failures
>> puts the deployed system at greater risk of compromise due to faults,
>> doesn't it?

>The kind of hardware failures that most commonly occur would not be
>ameliorated by increasing the number of rounds. Calling that a
>"margin of strength" is badly misleading.

You have made an important point here; a hardware failure can
compromise the key, and increasing the number of rounds doesn't help
(it gives more chances for the failure).

Of course, the failure rates of microchips are usually considered to
be so low as to be not worth being concerned about, but if one is
concerned enough to use long keys, one has to be balanced.

John Savard
http://home.ecn.ab.ca/~jsavard/index.html

John Savard

unread,
Aug 14, 2001, 10:56:30 PM8/14/01
to
On Tue, 14 Aug 2001 14:39:19 GMT, "Douglas A. Gwyn" <DAG...@null.net>
wrote, in part:

>My point is that there is no justification for the notion that
>increasing the number of rounds adds any appreciable amount of
>protection against unknown methods of attack.

>I bet there is a fixed-point theorem that says that past a
>certain point it would even be harmful.

Assuming the key is fixed in length, and the key schedule is merely
prolonged without changes, yes, that may well be true. But it might
happen after, say, 2^28 rounds for DES, for example.

John Savard
http://home.ecn.ab.ca/~jsavard/index.html

John A. Malley

unread,
Aug 15, 2001, 1:29:36 AM8/15/01
to
"Douglas A. Gwyn" wrote:
>
> " John A. Malley" wrote:
> > No margin of strength with respect to the impact of hardware failures
> > puts the deployed system at greater risk of compromise due to faults,
> > doesn't it?
>
> The kind of hardware failures that most commonly occur would not be
> ameliorated by increasing the number of rounds. Calling that a
> "margin of strength" is badly misleading.

Agreed, but there's probably some margin of strength with respect to the
impact of hardware failures through *some* means. As you pointed out,
there's "no justification for the notion that increasing the number of
rounds adds any appreciable amount of protection against unknown methods
of attack," so we can't assume blindly jacking up the number of rounds in
a Feistel cipher gives us a security cushion against the unknown - like
unanticipated hardware failure modes, EMI, or new cryptanalytic attacks.

I expect crypto-professionals account for the effects of the known
probable failure modes of underlying hardware on their cipher's strength
and choose hardware components in the design accordingly - my notions of
what/how they do it are vague. I "guess" it's something like redundant,
dissimilar (different FPGA/ASIC layouts, different physical chips, but
logically equivalent) hardware "functional blocks" with output
comparison/voting to generate ciphertext and key(stream), Built-In-Test
to isolate/identify faulty or suspect hardware RNGs, maybe even
tamper-resistant chips that self-destruct when removed for unauthorized
study. Dissimilar redundant hardware approaches offer unquantifiable
protection against unanticipated failure modes like design errors,
manufacturing errors, radiation/EMI susceptibility, and aging effects in
a particular chip type.
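The comparison/voting idea can be sketched minimally (my illustration of the guess above, not a description of any real device): compare the outputs of the redundant blocks and report disagreement so Built-In-Test can isolate the faulty unit.

```python
from collections import Counter

def voted_output(outputs):
    # Majority vote across redundant, dissimilar hardware blocks.
    # Returns the agreed output block, or None so Built-In-Test can
    # flag the disagreement and isolate the (possibly faulty) unit.
    value, count = Counter(outputs).most_common(1)[0]
    return value if count > len(outputs) // 2 else None

# Three logically equivalent units; one has failed.
assert voted_output([b"\x8f", b"\x8f", b"\x00"]) == b"\x8f"
# No majority at all: signal a fault rather than emit ciphertext.
assert voted_output([b"\x01", b"\x02", b"\x03"]) is None
```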

>
> > Does anyone know if the developers of field crypto devices for our
> > world's military forces design for robustness like this?
>
> There are various safeguards.

Are you at liberty to elaborate? :-)


John A. Malley
10266...@compuserve.com

wtshaw

unread,
Aug 15, 2001, 2:31:30 AM8/15/01
to
In article <3B79C7ED...@yahoo.com>, Tom St Denis
<tomst...@yahoo.com> wrote:

> Never trust science fully. A good rule to live by. Otherwise we would
> be all eating Beef galore and etc... [insert funny things].

Science is not a monolith. It does not speak to advertising except to
explore the bad logic that is used to hype financially attractive ideas.
I don't trust people whose goal is to hide the truth rather than reveal
it. And, you...


>
> If you think medicine knows all there is to know about aspirin, then you
> are more ignorant than I am.

Aspirin and I are enemies. It cannot help me, but it can kill me. One
doctor that I have doesn't understand, but another does and pulls expert
rank. Learning about aspirin is, for me, a question of survival. It is
also the drug that killed my mother. Touted as a safe therapeutic, it is
not.

Aspirin is, fittingly for this discussion, an example where beyond
specified limits even for most people, too many rounds will cause it to
turn against you.

Mok-Kong Shen

unread,
Aug 15, 2001, 8:08:26 AM8/15/01
to

John Savard wrote:

> On Tue, 14 Aug 2001 14:39:19 GMT, "Douglas A. Gwyn" wrote:
>
> >My point is that there is no justification for the notion that
> >increasing the number of rounds adds any appreciable amount of
> >protection against unknown methods of attack.
>
> >I bet there is a fixed-point theorem that says that past a
> >certain point it would even be harmful.
>
> Assuming the key is fixed in length, and the key schedule is merely
> prolonged without changes, yes, that may well be true. But it might
> happen after, say, 2^28 rounds for DES, for example.

Sooner or later, though, the law of diminishing returns sets in, and it
would be more profitable to consider other ways of increasing strength,
including re-design, multiple encryption, etc. (though multiple rounds
is just a 'special' case of multiple encryption).

M. K. Shen

Douglas A. Gwyn

unread,
Aug 15, 2001, 4:15:21 PM8/15/01
to
Tom St Denis wrote:
> Take a course in finite mathematics. Then see the effect of the
> probability of success on longer Markov chains. It's not exact, but a
> good place to start.

That's not relevant.

> I never said more is ALWAYS better. I said in this case it could be.

I'm still waiting to hear a valid reason.
Frankly, I think it's mere superstition.

Douglas A. Gwyn

unread,
Aug 15, 2001, 4:08:25 PM8/15/01
to
John Savard wrote:
> Of course, the failure rates of microchips are usually considered to
> be so low as to be not worth being concerned about, ...

However, in some cases an adversary can *induce* failures.
There was even a paper about this (re. smart cards) not too
long ago.

David Wagner

unread,
Aug 16, 2001, 5:41:35 PM8/16/01
to
Douglas A. Gwyn wrote:

>David Wagner wrote:
>> I'd certainly be interested to hear. If there aren't any concrete ideas
>> for improvement to be had, what teeth can any criticism have?
>
>Well, somebody should point out that you're merely fumbling in the dark:

Yes, of course. I know that. Do you have any alternative? I don't.

David Wagner

unread,
Aug 16, 2001, 5:42:55 PM8/16/01
to
Kurt Shoens wrote:
>Is Serpent more secure than Rijndael?

When people say something like "Serpent is more secure than Rijndael",
that's really just short-hand for a belief that Serpent is less likely
to be insecure (given our current knowledge) than Rijndael is. This is
a subjective judgement of the probability that some unanticipated shortcut
attack is discovered.

David Wagner

unread,
Aug 16, 2001, 5:45:06 PM8/16/01
to
Stanley Chow wrote:
>If you know how to accurately measure strength, you don't need to
>allow big margins. Some would say this is how NSA designs.
[...]

>Basically, if you don't know how to measure strength, you shouldn't
>be in the game.

Sure, we'd all love to know how to measure strength.
But, we don't have any such measure. So, what are we supposed
to do? Give up and go home? That's not an option: people have
real systems that they really need to secure as best as possible.
Given this, we have to "fumble in the dark" as best as possible,
and the real question is how to minimize the likelihood that our
systems get broken.

If you have any suggestions for how to measure strength, I'd
certainly love to hear them, but in absence of any suggestions,
I don't see how we have any choice.

David Wagner

unread,
Aug 16, 2001, 5:47:29 PM8/16/01
to
Douglas A. Gwyn wrote:
>My point is that there is no justification for the notion that
>increasing the number of rounds adds any appreciable amount of
>protection against unknown methods of attack.

There is no proven justification that anything we have is secure.
If we waited for proof before deploying crypto, everything would
be insecure. In absence of proof, we have to go with our best
engineering judgement. And there *are* good reasons to believe
that increasing the number of rounds decreases the likelihood of
unanticipated methods of mathematical shortcut attacks. (Note
that it only stops mathematical attacks; I don't expect increasing
the number of rounds to stop side-channel attacks, for instance.)

David Wagner

unread,
Aug 16, 2001, 5:48:36 PM8/16/01
to
Douglas A. Gwyn wrote:
>I'm still waiting to hear a valid reason.
>Frankly, I think it's mere superstition.

It is based on historical experience. I'd rather have a proof
than a vague argument based on past experience, but past experience
is better than nothing (and better than superstition).

David Wagner

unread,
Aug 16, 2001, 5:50:04 PM8/16/01
to
Douglas A. Gwyn wrote:
>I bet there is a fixed-point theorem that says that past a
>certain point it would even be harmful.

I don't believe it. If your key schedule can be viewed as
producing independently distributed round subkeys, you can
show that adding more rounds can't make your system substantially
weaker. (If it did make the system weaker, an attacker could
add extra rounds himself and simulate the extended-round system
on his own, due to the independence property of the round subkeys.)
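Wagner's reduction can be made concrete with a toy sketch (the round function here is mine and arbitrary; only the assumption of independent subkeys matters to the argument).

```python
def toy_round(x, k):
    # Stand-in round function -- its details are irrelevant to the
    # argument; what matters is that the attacker can compute it.
    return ((x * 0x9E3779B1) ^ k) & 0xFFFFFFFF

def cipher(x, subkeys):
    for k in subkeys:
        x = toy_round(x, k)
    return x

# An attacker given only an n-round oracle can simulate the
# (n+m)-round cipher by appending m rounds with subkeys of his own
# choosing -- legitimate precisely because subkeys are assumed to be
# independently distributed.
n_keys = [0x1111, 0x2222, 0x3333]   # the n-round cipher's subkeys
extra = [0x4444, 0x5555]            # attacker-chosen extra subkeys

x = 0xDEADBEEF
simulated = cipher(cipher(x, n_keys), extra)
assert simulated == cipher(x, n_keys + extra)
```

So any attack on the (n+m)-round cipher translates into an attack on the n-round cipher, which is why adding rounds can't make the system substantially weaker under that independence assumption.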

Douglas A. Gwyn

unread,
Aug 16, 2001, 6:06:47 PM8/16/01
to
David Wagner wrote:
> Douglas A. Gwyn wrote:
> >My point is that there is no justification for the notion that
> >increasing the number of rounds adds any appreciable amount of
> >protection against unknown methods of attack.
> There is no proven justification that anything we have is secure.
> If we waited for proof before deploying crypto, everything would
> be insecure. In absence of proof, we have to go with our best
> engineering judgement. And there *are* good reasons to believe
> that increasing the number of rounds decreases the likelihood of
> unanticipated methods of mathematical shortcut attacks.

I'm still waiting to hear those reasons.

The problem, as I see it, is that in the absence of knowledge,
people are deluding themselves (and their customers) with this
talk about a "margin of safety". Somebody could trust that it
is really analogous to a genuine engineering margin of safety,
and thus depend too heavily on it. That's why I felt obliged to
object to the assumption; an illusion of security can
in many cases be much worse than a true understanding that
there is an insecurity. (There are numerous examples in
cryptologic history, Enigma being perhaps the most famous.)
