
Live from the First AES Conference


dian...@tecapro.com

Aug 21, 1998

This morning the First AES Conference started with a presentation
by NIST's Miles Smid and Jim Foti. Here are the main points:

The 15 candidate algorithms were officially announced. The mystery
one is MAGENTA, submitted by Deutsche Telekom. Only 5 are from the
U.S.; the other 10 are international, including entries from Canada,
France, Belgium, Germany, Japan, Israel, Korea and Costa Rica
(me).

The analysis period for the 15 algorithms starts August 20, 1998 and
ends February 1, 1999. Approximately five finalists will then be
chosen by NIST, and in late March 1999 the Second AES Conference
will take place. After another nine months or so of public review
the AES will be selected and the Third AES Conference will take
place. By then we will be in the year 2000. After that the formal
FIPS process will start. The NIST people made it very clear that
they would be the ones doing all the selecting.

The public review process NIST has designed is very interesting: it
has one informal, free-format thread implemented through newsgroups
that NIST will create for each individual candidate (see you at
FROG's). Formal comments are to be sent to NIST and are supposed to
discuss the algorithms themselves, the evaluation criteria,
objective comparisons, etc.

NIST has already set up a new web site (www.nist.gov/aes) with all
these goodies. There you can find forms for ordering two CDs: CD-1
has the complete package of the 15 submissions minus the source code
and test vectors (but these will be available on NIST's site
anyway). CD-2 has the source code and will be available by next
September.

After that came the coffee break. Leaving the hall, there was a
table with piles of papers, and everybody seemed to want a copy.
One of these was Schneier's cryptanalysis of FROG. I didn't get
one because I already had it. So, for better or worse, FROG will
be noticed at the Conference.

Before lunch two algorithms were presented. The format is 30-35
minutes of presentation followed by 10 more of questions. Carlisle
Adams from Canada presented CAST-256, a "classical" cipher that has
passed through several evolutionary stages and is well polished and
analysed. Then came DFC from France, presented by Serge Vaudenay.
What is interesting about this cipher is that it is based on proofs
of its strength against differential and linear attacks - but not
against higher-order attacks.

At lunch I sat at the same table as two guys from IBM. I spoke
quite a bit with one of them about MARS. I asked how much effort
went into the cipher - they mentioned (if I understood correctly)
an estimate of 1,000 meetings - which is a lot. He told me that a
disadvantage of the AES process is that design teams from
different competitors could not consult freely with each other
because they were afraid that the other team might steal a good
idea.

It turned out that the IBMer I talked with at lunch was Shai
Halevi, who presented MARS immediately after that. The most
interesting aspect of MARS is that it wraps a diffusion layer
around a cryptographic core. He mentioned that this variability of
logic is a possible defense against *unknown* attacks, a theme
that is normally taboo in a field where almost all design work is
concentrated on defending against known attacks. He said that
they specifically excluded from MARS anything that they could not
cryptanalyze, for example multiplication between data. Overall he
gave a very clear, lucid presentation. Everybody knows the story
of DES, so he got questions like: does the MARS design include any
unpublished criteria? (answer: No), did anybody from the outside
help them design it? (answer: No), how can he show that there are
no trap-doors present? (answer: most of the design follows clear
criteria, but there is always a necessary element of trust too.)

After that came MAGENTA, presented by a young PhD student who was
not very experienced. MAGENTA is a strange cipher in many ways: it
is quite complex, does not use S-boxes, and has only two rounds of
Feistel (if I understood correctly). The algorithm appeared to be
an order of magnitude slower than everybody else's - he mentioned a
hardware card capable of encrypting 1 Mbit per second. After he
finished he got so many hard questions - you wouldn't believe. I
mean they really tore into him, sometimes setting up traps for him
to fall into. It got so bad that a few of the participants started
doing real-time cryptanalysis and suggesting attacks that would
break the algorithm right there and then. I marvelled that the
German guy managed to keep his composure. The whole spectacle was
rather shameful - after all, NIST had just announced eight months
for the analysis period, and surely everybody will have enough time
to criticise to their heart's content.

Then came the unpronounceable Rijndael, presented by a very
unflappable Joan Daemen. The algorithm, based on Square, is not of
the Feistel kind - quite elegant and fast. It also uses only XORs
and byte substitutions, exactly like FROG.

Other points of interest: There are almost 200 participants (I
have the list) including about 20 from NSA. NSA, by the way, is
never pronounced by name, it's always "they". Actually it is weird
to think about what they may be thinking - maybe they consider the
ciphers presented little more than toys. Who knows?

By the way, this is one informally dressed crowd: many were in
T-shirts, some in jeans, slippers, etc.

Of course, I didn't recognize anybody. Almost. Yesterday evening
while checking in at the hotel I saw the famous Bruce Schneier (I
recognized him from a picture on his web site) but was too shy to
introduce myself. He is small, blond, has a pony-tail and dresses
very informally. He gives the impression of unbounded energy and
enthusiasm - usually he is surrounded by people. I did recognize
some famous names in the list of participants, including Biham,
Zimmermann, Rivest, Shamir - unfortunately I could not find
familiar names from this newsgroup.

Tomorrow will be a long day with seven presentations, including
myself at number six: LOKI97, DEAL, RC6, E2, SERPENT, FROG, and
HPC. FROG and Hasty Pudding were put back-to-back, most probably by
chance. I am apprehensive about my presentation: I knew I had an
unconventional cipher but I wasn't aware of how unconventional - I
hadn't really looked into the other algorithms before coming here,
and I found the ones presented today very close to the beaten
path.

--
http://www.tecapro.com
email: dian...@tecapro.com




John Savard

Aug 21, 1998
dian...@tecapro.com wrote, in part:

> Actually it is weird
> to think about what they may be thinking - maybe they consider the
> ciphers presented little more than toys. Who knows?

That may be true.

However, I have my own suspicions. They may well consider a 128-bit
keysize to be something worthy of a "toy" cipher...

but they may also consider the complexity of at least some of the
designs, MARS for example, to be far beyond any realistic threat of
cryptanalysis. (As I've noted, I wouldn't be at all surprised if
'they' find FROG to be closer to the sort of thing they use than are
many of the other candidates.)

John Savard
http://members.xoom.com/quadibloc/index.html

dian...@tecapro.com

Aug 22, 1998

This was a long day.

It started with the announcement of the Second AES Conference.
After building some expectation, Miles Smid announced it will be
held in Rome, Italy, March 22-23, 1999, at the Hotel Quirinale,
near the Coliseum - according to Miles an apt metaphor, given that
10 of the 15 candidates will be eaten alive.

After that came LOKI97, presented by Jennifer Seberry of Australia.
This is another Feistel cipher, based on the original LOKI89
design. An attack has been found on this cipher, which Jennifer
plainly explained, along with the fact that it is easily
correctable.

Then came DEAL, presented by Richard Outerbridge. This is a curious
candidate in that the principal submitter is not the cipher's
author (Lars Knudsen). DEAL is unusual in that it uses DES as a
primitive - between 6 and 8 DES encryptions for each DEAL
encryption. Apparently an attack against DEAL has been found as
well - this brings the number of hit candidates to 4: LOKI97,
MAGENTA, DEAL and (for weak keys) FROG.

Then came the very even and fast-spoken presentation of Ron
Rivest's RC6 - based, of course, on RC5. This is a smooth cipher -
conceptually simple and extremely fast to implement in software:
as fast as 2 CPU cycles per bit encrypted. It uses 32-bit
multiplication, as does MARS. On the whole everybody felt this is a
very strong contender.

After lunch came E2, the Japanese candidate, presented by a very
young-looking Shiho Moriai. This one is a Feistel too. To me it
looks like a very carefully prepared candidate. She started by
pointing out a whole series of known attacks against which their
algorithm is strong (differential, higher-order differential,
linear, interpolation and partitioning). A high point came when she
presented the design of a 127K-gate die able to do 0.5 Gbps.

Then came Eli Biham's Serpent. It is a very conservative design:
security is based on the DES design philosophy, and speed (this is
an interesting twist) is based on bit-slicing, the most efficient
way to implement DES in software. Here, though, bit-slicing is not
an afterthought but an integral part of the design. After he
finished he was asked why he chose 32 rounds when his own analysis
showed that 16 rounds would have been sufficient. His answer
impressed me a lot: he simply wanted the added security - 16
rounds beyond what appears to be sufficient! I wish I had thought
the same way while designing FROG.

Then it was my turn. I was painfully aware that my cipher was quite
off, off, off the beaten path - that it was not based on any
previous cryptographic experience - that I had done no
cryptanalysis of FROG - that it did not even pretend to be based
on math - in other words, that I was breaking about every rule
there is. Even so I found the crowd receptive, even laughing at my
jokes (prepared beforehand). Midway through my presentation I did
get defensive about Schneier's attack (I really think an attack
should work against more than a small fraction of the keys), but
this was not well received and it was a tactical mistake on my
part. I also felt that the crowd did not like my more unorthodox
ideas, such as that speed is not really important (by the way, FROG
as submitted is going to be one of the faster candidates once I
optimize it for the Pentium in assembler), that a cipher should be
user-modifiable, and that a cipher should resist known attacks
without actually trying. As usual, I spoke too much, and at the end
there was time for only a very few questions. These were informal -
about why the name FROG and such. At the very least my ideas are
out in the open - most important cryptographers were present, and
if there is merit in any of my stuff it should catch fire in
their minds.

By the way, before starting my presentation Schneier came to shake
my hand - this was quite decent. I returned the compliment
mentioning in public that everything I knew about cryptography
came from Schneier's book and then making clear that any errors in
FROG are therefore indirectly attributable to him.

Lastly came Tasty Pudding, presented by an elder-statesman type of
guy: Rich Schroeppel. I really don't know what to make of this
cipher. At least, like FROG, he didn't try to emulate anybody. But
it is complicated and openly inelegant. He based his presentation
more on tricks the cipher can do than on anything else. The most
amazing part was the claim that it can encrypt fractional bits.
Here is the example he gave: Say you want to encrypt a digit
(0..9), which you cannot represent in a whole number of bits; you
put it in 4 bits and encrypt them; if the result is not in 0..9
then you encrypt it again; and so on until you get a digit result.
Let's see an example: you want to encrypt 5 -> 13 (this is not a
digit, so let's re-encrypt) -> 3. To decrypt: 3 -> 13 (this cannot
be a digit, so let's re-decrypt) -> 5. Using this mechanism you can
now encrypt any data type into the same data type: say dates into
dates, printable characters into printable characters, etc. Tasty
Pudding also uses a "spice", an optional secondary key that does
not require setup and can be changed on the fly. - On the whole it
is a weird beast and I wish my fellow dark horse the best of luck.
I also like the name.

After the conclusion of the day's work, interesting things
happened. First a guy from Switzerland approached me and actually
debated with me about FROG; I met a fellow Greek, Yiannis
Tsiounis; also a frequent poster in this newsgroup, Bryan Olson,
came to greet me. In general, I felt people were sympathetic or
even admired my courage for participating in the competition - but I
would rather have them admire my algorithm.

Finally, IBM's Shai Halevi came by who, as luck would have it, knew
Yiannis, and the three of us went out for dinner. We talked a lot
about cryptography and I was delighted to bounce my crazy ideas
and preoccupations off these good people. There were some
surprising or merely interesting results and I intend to write
about this on another occasion.

Helger Lipmaa

Aug 22, 1998
dian...@tecapro.com wrote:

: By the way, this is one informally dressed crowd: many were in
: T-shirts, some in jeans, slippers, etc.

Don't forget MY t-shirt. The Intellectual Property of Certicom was pretty
insulted because of that.

: Of course, I didn't recognize anybody. Almost. Yesterday evening
: while checking in at the hotel I saw the famous Bruce Schneier (I
: recognized him from a picture on his web site) but was too shy to
: introduce myself. He is small, blond, has a pony-tail and dresses
: very informally. He gives the impression of unbounded energy and
: enthusiasm - usually he is surrounded by people.

Bruce is _very_ blond and does hop around the room.

: I did recognize
: some famous names in the list of participants, including Biham,
: Zimmermann, Rivest, Shamir - unfortunately I could not find
: familiar names from this newsgroup.

For some reason, Shamir doesn't post here. Not sure why, but maybe
this place is not interesting to him.

Helger Lipmaa
http://home.cyber.ee/helger; Phone +372-6542422

dian...@tecapro.com

Aug 23, 1998

Today was the last day of the conference - it ended at noon. It was
revealed that the participants in the conference came from 26
countries - so this is a very international process.

First, Chae Hoon Lim from Korea presented Crypton. It is an
evolution of Square and has 12 rounds. It has been cryptanalyzed
against several attacks, including truncated and higher-order
differentials, multiple linear approximations, and algebraic
attacks such as interpolation attacks. He explained that all
S-boxes have differential and linear-approximation probabilities of
at most 2^(-5) and 2^(-4) respectively, and that therefore the
8-round characteristic probability of Crypton is at most 2^(-160)
(presumably (2^(-5))^32, i.e. at least 32 active S-boxes over 8
rounds) and the best linear approximation probability is 2^(-128).
More about this later. He mentioned several times that this is a
work in progress, that he intended to improve this, that he didn't
much like that, etc.

Then came Bruce Schneier with Twofish. His presentation was
excellent, and you could see that he is used to writing books and
explaining things to people. Twofish is a 16-round Feistel based on
Blowfish. It creates four key-dependent S-boxes by modifying two
fixed S-boxes. This is done very carefully, and in the case of
128-bit keys all created S-boxes (4 x 2^(32)) were exhaustively
checked for quality - for larger key sizes extensive Monte Carlo
tests were applied. Bruce plainly stated that he stole ideas from
Square (I believe this was the 4x4 matrix multiply over GF(256))
and from SAFER (the pseudo-Hadamard transform used for diffusion).
In the same vein he mentioned that Twofish is amenable to operating
"in non-interoperable variants using a family key", mentioning that
this is something that FROG also does. This I liked very much.
Later on he mentioned that a very efficient version of Twofish
"compiles" code - using the same words I used when describing what
generalized FROG does. So, ideas present in FROG are starting to
find their way into the mainstream of cryptography (I don't
necessarily claim that they were introduced with FROG, because I am
not sure that is true.)

Back to Bruce's presentation, which was full of interesting and
educational points. He mentioned that very often in the design
process of Twofish they had to choose between more complex rounds
and a larger number of rounds. For example, by taking out the
rotations included in Twofish they could add one more round to the
cipher with no loss of efficiency. What is very interesting is
that, in general, they found that more rounds of a simpler
structure are stronger than a few complex rounds. Another
interesting point he mentioned is that the internal cipher design
and the key schedule are interdependent - you cannot design the
two independently and then stick them together. Yet another
interesting point is that "in cipher design it pays to put many
eggs in few baskets" (he was explaining the fact that his key
setup re-uses cryptographic primitives already there), because you
make sure that each of these few baskets is strong enough.

Twofish is fast, practically as fast as RC6 on the NIST standard
platform, achieving about 2.2 CPU cycles per bit. It uses (or can
use) only byte-wide, "RISC-like" instructions, as do several other
candidates, and is therefore efficient on 8-bit processors too.
What he pushed is the idea that Twofish offers the best balance
between security, speed *and* implementation flexibility. He
showed that there are many trade-offs possible when you implement
Twofish, depending on the memory available and the speed needed.
For example, there are several implementation models, starting with
almost complete "on the fly" key-schedule computation with minimum
memory and ending with the fastest versions that do a lot of
pre-computation. This flexibility can be used depending not only on
available memory but also on whether you are going to encrypt many
or few blocks with the same key. On the whole it is a convincing
argument, and I think it will be an important point because the AES
*is* going to be used in operating environments that are vastly
different.

The last cipher presented was SAFER+ by James Massey, one of the
original authors of IDEA. It is based on the older SAFER designs,
with an improved key schedule. A unique feature of SAFER+ is that
its number of rounds depends on the length of the user key - this
was done for reasons related to the key-setup procedure but, if you
think about it, also makes sense in general. This clearly is a good
design, but on the whole one got the impression that not as much
effort was invested in this candidate as in some of the others.
For example, much was left vague about the cipher's speed. SAFER+,
by the way, is Cylink's candidate.

After all presentations were over, a quite substantial general
discussion started, in which several points were touched upon:

The AES will be available world-wide and royalty-free. Not so the
AES candidates. One point raised concerned good AES candidates that
an application developer might want to integrate into a product
*before* the AES is declared. The questioner specifically mentioned
RC6 as an example, and the fact that RSA has in the past pushed its
patents very aggressively. Ron Rivest sat through the whole
discussion, quiet and unperturbed as a Buddha, and didn't take the
bait. NIST of course cannot do anything in this matter - all that
the candidates had to agree to beforehand is not to claim any
rights on their cipher *if* it is declared the winner. By the way,
several candidates are already in the public domain, including
Twofish, SAFER+ and, yes, FROG. MARS and RC6 are definitely not, at
least not for commercial use.

Another sticky point is the matter of patent claims on parts of
the candidate algorithms. Somebody proposed that NIST insist that
at least the AES candidates renounce any patent rights they may
have on parts of other candidates' algorithms. The suggestion was
made that NIST could actually demand this. Miles Smid was not
enthusiastic about the idea, clearly feeling that rules should not
be introduced on the fly. My personal opinion is that the
technical problem of choosing the AES is already difficult enough
and that, de facto, NIST can and should impose anything that will
serve to bring the whole thing to a desirable end. Nobody wishes
to see a technical competition become entangled in legal
squabbling. (Imagine some candidate thinking: if the other guy
wins then I shall take them to court.)

A suggestion was made to try to devise a method for comparing
candidate ciphers in a more meaningful way. The point is that
there is a fundamental trade-off between security and speed.
Several candidates included a few additional "unnecessary" rounds
in their ciphers for security's sake - Serpent went as far as to
double them. So the suggestion was made to compare ciphers'
security using the same number of rounds. I think this is a very
thoughtful suggestion, because otherwise we will not be comparing
apples with apples. This was not really discussed, but I think
it will be an important point in the evaluation process.
Maybe a better idea is not to fix the number of rounds (different
ciphers use rounds of different complexity) but rather to fix the
speed on the standard platform. The fastest speed right now is
about 2 CPU cycles per bit. Let's ask candidates to produce their
fastest possible code in assembler and compare the security of
their ciphers when they are cut down (or extended up) to 1 cycle
per bit, 1.5 cycles, 2.0 cycles, 3.0 cycles, 5.0 cycles, 10.0
cycles, etc. This is somewhat messy - a candidate may claim that,
Mozart-like, you cannot take anything out of his design. Also,
cipher "security" is not really something that can be measured -
but then procedures for measuring "security" can be defined - in
fact, roughly, most cryptographers do agree on how this should be
done.

An extremely interesting point came when it was announced that at
the Crypto conference (to be held a few days later) a paper would
be presented showing that the method traditionally used to estimate
a cipher's strength against differential attacks is not always
appropriate. I understood this as a critique of some of the
estimates included in the AES submissions. Amazingly, they claimed
to have found a differential attack against 31 (out of the 32)
rounds of Skipjack that is more efficient than exhaustive search.
(By the way, and possibly with no relation to the previous point, a
recurrent theme throughout the conference was that there are many
ways to define "difference".)

On the whole everybody seemed to be quite content with the AES
competition, its openness and level playing field, and with the
balance NIST had struck in the requirements and the organization.
Significantly, the suggestion was made that NIST open a new
competition for a standard hash function. Miles responded that this
is a possibility, and noted that there is, in fact, a process under
way between NIST and IBM for defining both a PRNG and a true RNG -
and that in principle people could make suggestions. The AES
process is a competition, and maybe this will serve as a better
model for defining other wide-ranging standards in the future -
instead of the traditional way, which is either bureaucratic
committees or imposition by the winner of the market game, which is
not always a level contest.

So the First AES Conference is past, and for me personally it was a
very interesting experience, even though I now have an injured
cipher. I was particularly pleased when several people told me
today that they had enjoyed my presentation, that they agreed with
this idea or did not agree with that one, etc.

This will not be my last report about the Conference. Many of the
most interesting things I learned here came from private
discussions with top participants - I intend to include
these in one last post.

By the way, Sandy Harris noted an error in my last report. I
called the HPC, "Tasty Pudding"; actually it is "Hasty Pudding".
Sorry about that, but the presentation used many suggestive
culinary concepts such as "spice".



W T Shaw

Aug 25, 1998
In article <35e18bb5...@news.prosurfr.com>,
jsa...@tenMAPSONeerf.edmonton.ab.ca (John Savard) wrote:

>(As I've noted, I wouldn't be at all surprised if
> 'they' find FROG to be closer to the sort of thing they use than are
> many of the other candidates.)
>

Frog, a miracle product of 4 and a half days of work, at least met the
entrance requirements. It certainly presents a novel way of doing a
cipher, a program in the key. This would seem to be worth study in the
stream cipher direction.

I will admit that my notes on Frog consist largely of a sketch of one of
the creatures ready to jump from the page. But, given the utility made of
the time consumed, I wonder what could come from a more extensive effort.
I would encourage him to continue work even as he is apt to strike out in
the present game.
--
----
Arm yourself. It's code wheels at twenty paces!
Decrypt with ROT13 to get correct email address.

Peter Gutmann

Aug 25, 1998

dian...@tecapro.com writes:

>The most amazing part was the claim that it can encrypt fractional bits. Here
>is the example he gave: Say you want to encrypt a digit (0..9), which you
>cannot represent in a whole number of bits; you put it in 4 bits and encrypt
>them; if the result is not in 0..9 then you encrypt it again; and so on until
>you get a digit result. Let's see an example: you want to encrypt 5 -> 13
>(this is not a digit, so let's re-encrypt) -> 3. To decrypt: 3 -> 13 (this
>cannot be a digit, so let's re-decrypt) -> 5. Using this mechanism you can
>now encrypt any data type into the same data type: say dates into dates,
>printable characters into printable characters, etc.

You can do that with any cipher; I posted a technique for doing this here
about 1 1/2 years ago, and there were a number of followups and comments on it
(the thread was "Encrypting data with a restricted range of values", starting
on 23rd January 1997; Dejanews should have it). This is currently being used
with 3DES to protect data going over comms channels with very weird
properties (they have certain groups of characters which they'll pass, but
others which are either blocked or modified).

Peter.


dian...@tecapro.com

Aug 25, 1998

This is my last report about the First AES Conference. Here I
describe various ideas that came out of informal discussions with
other participants. What follows is my personal understanding of
what I heard - and I may be mixing my own ideas into the brew -
therefore I will not mention any names.

1. On the relative security of public-key cryptography versus
symmetric-key cryptography. Three very knowledgeable people, two
of whom mainly work with public-key cryptography, clearly stated
that they would be less surprised to see a cryptanalytic
breakthrough against public key than against symmetric ciphers.
This goes against the more prevalent view that public-key
cryptography is slower but more secure "because it is based on
pure mathematics".

2. On the security of symmetric ciphers. Here I got mixed
impressions. On the one hand there is a great deal of confidence
that the better symmetric ciphers are very strong indeed and that
the probability that an even remotely workable attack (not a
theoretical attack) against them will be found is almost nil. On
the other hand the body language of the AES candidates' presenters
did (I think) reveal worry in this sense. Most added "unnecessary"
rounds to their ciphers, up to the extreme of Serpent, which
doubled them - and this in a competition where speed seems to be
of great importance. Then again, maybe they are only afraid of
theoretical attacks (certificational weaknesses) that might derail
their chances in the AES process.

3. On the matter of unknown attacks. This factor is coming out of
the closet - a memorable phrase was that designing a cipher is
"like fighting against ghosts". Several presenters (AES candidates
MARS, E2, TWOFISH) specifically discussed why their cipher may
resist unknown attacks, SERPENT implicitly tries to defend against
them, and, of course, FROG makes unknown attacks a central issue
of its design methodology.

4. On the cryptanalysis of ciphers. I got the impression that modern
cryptanalysis tries to show the strength of a cipher by testing
specific points (narrowly defined attacks) and generalizes from
that. Even so, heuristics are often used to move the process
forward, and everybody is careful to state what an *estimate* of
the cipher's security against this or that type of attack is.
Differential attacks, for example, work by analyzing how
differences in plaintexts propagate through the cipher - but it is
not quite clear what the optimal definition of "difference" is,
and this can vary between ciphers: with RC6 the difference with
the most penetrating capacity is not the traditional XOR but
rather subtraction. E2 presents a proof of its strength against
differential attacks but not against higher-order differentials.
The situation can become complex because one can imagine the
existence of other complex functions (very different from XOR or
subtraction) that remain relatively invariant within the cipher
operation and can therefore be used in a "differential"
chosen-text attack. (A toy illustration of this choice-of-difference
question appears after point 10 below.)

5. On exotic computers that may exist in the future. It seems that
quantum computers, as far as they are understood today, can at
most reduce a linear search to the square root of the size of the
space to be searched. In other words, if we use a 256-bit key
then, even theoretically, a quantum computer would need
sqrt(2^256) = 2^128 operations to implement a search. When and if
quantum computers become realizable, a new set of mathematics will
be developed as a framework for their computational abilities.
Exactly as happened with classical computing, this new math will
probably be used in cipher design too. Nor should molecular
computers be a big worry: at most they will subtract 64 bits of
security from a cipher. After all, 2^64 molecules require a lot of
space. So, it is algorithmic breakthroughs rather than exotic
hardware technologies that we should worry about.

6. On the importance of speed. This I kept asking everybody about,
and I am still not very convinced. Against the "old" idea that DES
is slow in software, bit-slicing implementations are in fact quite
fast. 3DES today achieves speeds of approximately 120 CPU cycles
per byte on a 32-bit computer, or about 3 Mbytes per second on a
400 MHz Pentium. (This kind of DES speed, by the way, is the
standard against which the AES candidates will be measured; the
arithmetic is restated after point 10 below.) This kind of speed
is sufficient for most personal-computer applications, up to
compressed real-time video. With Moore's law still looking strong,
in a few years' time we should have the capacity for over 10
Mbytes per second of 3DES on a medium-size PC. At the other
extreme of the spectrum, smart cards need only encrypt a few bytes
at a time - speed here is no big concern. So why the hurry? One
response I got is that throughputs will also increase; if you have
a channel of several Gbps (gigabits per second) and you want to
encrypt it, what do you do? This seems to me a very uncommon
application - for datastreams to arrive at this throughput means
they were concentrated from several sources, and they should have
been encrypted at the source; if they arrive unencrypted at the
concentrator then it is already too late for security. There may
be applications where speed is really necessary - I am thinking
about pay-per-view cable TV, where one may want to encrypt each
channel with a different password before sending it out - but
again these are very specific and isolated cases. In fact I
believe throughputs will *not* increase indefinitely in the
future: even crystal-clear phone (or video-phone) conversation
does not require that much throughput, and the number of people
and the number of hours per day is limited too. What I am saying
is that speed based on advances in hardware technology will
probably grow faster than broadband requirements.

7. On FROG. When I asked why I should worry about attacks that
require 2^50 plaintexts and only rarely work, I got the plain
answer that when used with a 256-bit key this means that FROG
sometimes loses some 200 bits of security. Cryptography abhors a
flaw. Another argument I got about the long-range prospects of
FROG was that even though the idea is interesting it is too new,
and that ciphers based on older principles and designs represent a
more conservative choice for a new standard. By the way, FROG is
relatively fast and, my arguments against the importance of speed
notwithstanding, speed does represent a relative advantage for my
cipher.

8. On the importance of memory. It is important that the AES use
little memory, in order to work well in smart cards, where RAM is
sometimes limited to 256 bytes. Here Moore's law does not apply
very well, for the following reason: most smart-card readers
operate at relatively high voltages (3 V - 5 V). A more densely
populated chip must work at lower voltages, and decreasing the
voltage in the card becomes difficult after some point because of
heat-dissipation problems.

9. On who the favourites are. Of course it is far too early to
discuss this, but it is being discussed nonetheless. The two
"standard" favourites are IBM's MARS and RSA's RC6. I heard
TWOFISH mentioned often. Personally I was impressed with E2 also,
but this is a new cipher.

10. On whether civilization would come to an end if somebody
managed to break the AES in the year 2030. All thought definitely
not - I am still worried. At the very least such a scenario would
make the cost of fixing the Y2K bug look cheap.
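
Two small Python sketches to back up points 4 and 6 above - my own
illustrations, not anything presented at the conference.

First, the choice-of-difference question from point 4: a toy "round"
that just adds a secret key modulo 256 passes a subtractive
difference through with probability 1, while the same pairs measured
with XOR scatter over many output differences:

    from collections import Counter

    def round_fn(x, k):
        return (x + k) % 256        # toy round: modular addition of the key

    k, delta = 0xA7, 0x35           # arbitrary secret key and input difference
    xor_out, sub_out = Counter(), Counter()
    for x in range(256):
        a = round_fn(x, k)
        b = round_fn((x + delta) % 256, k)
        xor_out[a ^ b] += 1
        sub_out[(b - a) % 256] += 1

    print(len(sub_out))  # 1: the subtractive difference always survives
    print(len(xor_out))  # many distinct XOR output differences

A real differential attack exploits exactly this kind of regularity,
with the notion of "difference" chosen to match the cipher's
operations.

Second, the speed arithmetic from point 6:

    def mbytes_per_sec(clock_hz, cycles_per_byte):
        # throughput = clock rate divided by the cost of one byte
        return clock_hz / cycles_per_byte / 1e6

    print(mbytes_per_sec(400e6, 120))      # 3DES: ~3.3 Mbytes/s at 400 MHz
    print(mbytes_per_sec(400e6, 2.2 * 8))  # 2.2 cycles/bit: ~22.7 Mbytes/s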


That's all. Hope to see you in NIST's FROG newsgroup. If not, see
you in Rome next year.

W T Shaw

Aug 25, 1998
In article <6rudtq$pc5$1...@scream.auckland.ac.nz>, pgu...@cs.auckland.ac.nz
(Peter Gutmann) wrote:

> dian...@tecapro.com writes:
>
> >The most amazing part was the claim that it can encrypt fractional bits.
> >Here is the example he gave: Say you want to encrypt a digit (0..9), which
> >you cannot represent in a whole number of bits; you put it in 4 bits and
> >encrypt them; if the result is not in 0..9 then you encrypt it again; and
> >so on until you get a digit result. Let's see an example: you want to
> >encrypt 5 -> 13 (this is not a digit, so let's re-encrypt) -> 3. To
> >decrypt: 3 -> 13 (this cannot be a digit, so let's re-decrypt) -> 5. Using
> >this mechanism you can now encrypt any data type into the same data type:
> >say dates into dates, printable characters into printable characters, etc.
>

> You can do that with any cipher; I posted a technique for doing this here
> about 1 1/2 years ago, and there were a number of followups and comments on
> it (the thread was "Encrypting data with a restricted range of values",
> starting on 23rd January 1997; Dejanews should have it). This is currently
> being used with 3DES to protect data going over comms channels with very
> weird properties (they have certain groups of characters which they'll
> pass, but others which are either blocked or modified).

Any length of character set, or base, can be used for encryption.
The more characters in the set, the more information in each
information unit. Rather than fractional bits, implying something
smaller than a bit, there is rather a lack of congruence between
the sizes of the parts manipulated: one trit, in base 3, equals
1.5849625... bits in base 2.
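
In Python terms, for those who want to check the arithmetic (my own
footnote):

    import math

    for base in (2, 3, 10, 26):
        # bits carried by one symbol of the given base
        print(base, math.log2(base))   # base 3 gives 1.5849625... bits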

Kemal Bicakci

Aug 25, 1998
Maybe this is not the right place to ask, but since I saw it in this
newsgroup I'll ask you:

How can you change the "From" address? What are the advantages of it?
If you use ROT13 only, how can you avoid getting unwanted emails?!
Do we need a special kind of mail program, or can we do it in, for
example, "pine"?

Thanks.
Kemal

Karen E Cooper

Aug 25, 1998
jgf...@EnqvbSerrGrknf.pbz (W T Shaw) writes:

>Frog, a miracle product of 4 and a half days of work, at least met the
>entrance requirements. It certainly presents a novel way of doing a
>cipher, a program in the key. This would seem to be worth study in the
>stream cipher direction.

>I will admit that my notes on Frog consist largely of a sketch of one of
>the creatures ready to jump from the page. But, given the utility made of
>the time consumed, I wonder what could come from a more extensive effort.
>I would encourage him to continue work even as he is apt to strike out in
>the present game.


Yes, this is the remark I have heard many times in the last few days.
Take heart, Frog Man.

Karen. [Live from Crypto '98]

Helger Lipmaa

Aug 25, 1998
to pgu...@cs.auckland.ac.nz
On Wed, 26 Aug 1998 pgu...@cs.auckland.ac.nz wrote:

> So what was your t-shirt? Mine are usually insulting to a certain large
> computer manufacturer, I can't think of one which would insult Certicom.

Inside joke. If you are from Certicom, don't feel offended :-). Mail
me and I'll explain. Otherwise just forget it.

Helger Lipmaa
http://home.cyber.ee/helger

George Barwood

Aug 26, 1998
On Tue, 25 Aug 1998 20:58:18 GMT, dian...@tecapro.com wrote:


> 2. On the security of symmetric ciphers. Here I got mixed
> impressions. On the one hand there is a great deal of confidence
> that the better symmetric ciphers are very strong indeed and that
> the probability that an even remotely workable attack (not a
> theoretical attack) against them will be found is almost nil. On
> the other hand the body language of the AES candidates' presenters
> did (I think) reveal worry in this sense....

I think there is a sort of psychological paradox at work here. The
more confident one is about something, the more one tends to worry
about it.

For example, no reasonable person could be confident that there are no
bugs in Windows 98. So one does not worry too much about it.

On the other hand, a reasonable person will (hopefully) be able to be
reasonably confident that the AES winner has no significant
cryptographic weakness. Therefore it is the subject of much 'worry'.

In other words, it is the things which we feel most confident about
that we appear *not* to be confident about at all.

This can be unfortunate, in that a non-expert (say a journalist) may
get the wrong impression if he doesn't take this effect into account.

George (sorry if this sounds like nonsense - I've got insomnia)

Terry Ritter

Aug 26, 1998

On Wed, 26 Aug 1998 02:34:10 GMT, in
<35e36f7f...@news.dial.pipex.com>, in sci.crypt
george....@dial.pipex.com (George Barwood) wrote:

>[...]

>For example, no reasonable person could be confident that there are no
>bugs in Windows 98. So one does not worry too much about it.
>
>On the other hand, a reasonable person will (hopefully) be able to be
>reasonably confident that the AES winner has no significant
>cryptographic weakness. Therefore it is the subject of much 'worry'.

I think a "reasonable" person *must* be *constantly* afraid that *any*
of our ciphers may have significant unfound weakness.

One can only be confident of cipher "strength" with respect to
specific *known* attacks. If *unknown* attacks exist, there is no
reason to be confident at all. Can we really say that no such attacks
can possibly exist?


>In other words, it is the things which we feel most confident about
>that we appear *not* to be confident about at all.
>
>This can be unfortunate, in that a non-expert (say a journalist) may
>get the wrong impression if he doesn't take this effect into account.

The journalist is right. There is a problem in cryptography, and it
is fundamental: We cannot prove the very property which our customers
rely upon, which is "overall strength." All we can describe is cipher
strength with respect to specific attacks. We cannot state that there
are no other attacks.

There is, of course, no absolute security, and any security system can
only be guaranteed to defend against *known* attacks. But the contest
between offense and defense in cryptography is far worse than in
physical conflict: At least in a military action one knows when a
base has been defeated, and also probably why. This means one can
take action to solve that problem.

But in cryptography, if our methods are penetrated, we probably will
not know. This means that we have no reason to change things, and
also gain no information about what to avoid in the next design.

This is not a problem of a "wrong impression," it is instead a real
problem which underlies all of cryptographic design, and, as far as I
can see, AES has not improved it much at all.

---
Terry Ritter rit...@io.com http://www.io.com/~ritter/
Crypto Glossary 1998-08-23: http://www.io.com/~ritter/GLOSSARY.HTM

Jerry Leichter

Aug 26, 1998
to dian...@tecapro.com
| 4. On the cryptanalysis of ciphers. I got the impression that modern
| cryptanalysis tries to show the strength of a cipher by testing
| specific points (narrowly defined attacks) and generalizes from
| that. Even so, heuristics are often used to move the process
| forward, and everybody is careful to state what an *estimate* of
| the cipher's security against this or that type of attack is.
| Differential attacks, for example, work by analyzing how
| differences in plaintexts propagate through the cipher - but it
| is not quite clear what the optimal definition of "difference" is,
| and this can vary between ciphers: with RC6 the difference with
| the most penetrating capacity is not the traditional XOR but
| rather subtraction....

The vagueness of definition for "difference" is fundamental: There is,
I would guess, always *some* notion of "difference" under which a given
system is "weak". To find it, construct a characteristic by subdividing
the outputs in some useful way, then find pairs of inputs that produce
the appropriate subdivisions. Such pairs probably always exist - but
they can be completely arbitrary. Finding them is almost certainly at
least as hard - in general - as brute-forcing the cipher.

Actually, it's probably possible to come up with a generalized notion of
security against differential attack this way: For some notion of
"reasonably computable" (RC) characteristics and "reasonably computable
difference functions", a cipher is secure no RC characteristic can be
induced using an RC difference function. The formalization of this
sounds fairly complex, and proving any given cipher has the property
sounds even more difficult. Decorrelation techniques - I forget which
of the AES candidates is based on them - are perhaps a first step in
this direction.
-- Jerry

Mark Wooding

Aug 27, 1998
Jerry Leichter <leic...@smarts.com> wrote:

> Decorrelation techniques - I forget which of the AES candidates is
> based on them - are perhaps a first step in this direction.

That would be DFC, by Serge Vaudenay et al.

-- [mdw]

William Hugh Murray

Aug 27, 1998
to Jerry Leichter
I would like to read your response to what I took to be the point of the
citation, i.e., the objective of modern cryptanalysis. In my lectures, I
have always used the definition that cryptanalysis is the art of finding the
message without benefit of the key. However, the work of Biham and Shamir
does not seem to have this as a purpose. Your comments and observations
please.

Jerry Leichter wrote:

> | 4. On the cryptanalysis of ciphers. I got the impression that modern
> | cryptanalysis tries to show the strength of a cipher by testing
> | specific points (narrowly defined attacks) and generalizes from
> | that. Even so, heuristics are often used to move the process
> | forward, and everybody is careful to state what an *estimate* of
> | the cipher's security against this or that type of attack is.
> | Differential attacks, for example, work by analyzing how
> | differences in plaintexts propagate through the cipher - but it
> | is not quite clear what the optimal definition of "difference" is,
> | and this can vary between ciphers: with RC6 the difference with
> | the most penetrating capacity is not the traditional XOR but
> | rather subtraction....
>
> The vagueness of definition for "difference" is fundamental: There is,
> I would guess, always *some* notion of "difference" under which a given
> system is "weak". To find it, construct a characteristic by subdividing
> the outputs in some useful way, then find pairs of inputs that produce
> the appropriate subdivisions. Such pairs probably always exist - but
> they can be completely arbitrary. Finding them is almost certainly at
> least as hard - in general - as brute-forcing the cipher.
>
> Actually, it's probably possible to come up with a generalized notion of
> security against differential attack this way: For some notion of
> "reasonably computable" (RC) characteristics and "reasonably computable
> difference functions", a cipher is secure if no RC characteristic can be
> induced using an RC difference function. The formalization of this
> sounds fairly complex, and proving any given cipher has the property
> sounds even more difficult. Decorrelation techniques - I forget which
> of the AES candidates is based on them - are perhaps a first step in
> this direction.
>
> -- Jerry


Terry Ritter

Aug 27, 1998

On Thu, 27 Aug 1998 10:46:54 -0400, in
<35E5715D...@sprynet.com>, in sci.crypt William Hugh Murray
<whmu...@sprynet.com> wrote:

>[...]

>In my lectures, I
>have always used the definition that cryptanalysis is the art of finding the
>message without benefit of the key.

I agree that revealing the message is the ultimate goal. Intermediate
to that, however, one might work on finding the *key* -- which will
then reveal the message, of course.

>However, the work of Biham and Shamir
>does not seem to have this as a purpose. Your comments and observations
>please.

As far as I know, revealing message data *is* the purpose of
Differential Cryptanalysis.

One approach is to find some difference between two blocks which will
imply a particular key bit value with some probability. Then, over
the course of many tests (often assuming defined plaintext), that key
bit value can be detected statistically from under the masking noise
of the random transformation. This is essentially the detection of a
slight imbalance in the effect of a keying bit by comparing large
numbers of blocks which have a particular useful difference.

But I suspect that *any* way one could use the comparison between two
related plaintexts (or even ciphertexts) would *also* be a form of
Differential Cryptanalysis.
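
A toy Monte Carlo sketch of that statistical detection, in Python
(my own illustration of the principle, not an attack on any real
cipher): the per-pair indicator is mostly noise, but it matches the
key bit slightly more often than chance, and a majority vote over
many pairs recovers the bit.

    import random

    rng = random.Random(1)
    key_bit = 1                 # the secret bit the attacker tries to learn
    BIAS = 0.55                 # probability one pair's indicator matches it

    def indicator():
        # Stand-in for one chosen-plaintext pair with a useful difference:
        # usually random noise, slightly correlated with the key bit.
        return key_bit if rng.random() < BIAS else rng.randrange(2)

    votes = sum(indicator() for _ in range(10000))
    guess = 1 if votes > 5000 else 0
    print(guess)                # almost surely equals key_bit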

Mark Wooding

Aug 27, 1998
William Hugh Murray <whmu...@sprynet.com> wrote:

> I would like to read your response to what I took to be the point of
> the citation, i.e., the objective of modern cryptanalysis. In my
> lectures, I have always used the definition that cryptanalysis is the
> art of finding the message without benefit of the key. However, the
> work of Biham and Shamir does not seem to have this as a purpose.
> Your comments and observations please.

You could modify your definition slightly: change `... without benefit
of the key' to `without prior knowledge of any keys used'.
Determination of keys is, I think, a legitimate cryptanalytic step, and
has obvious usefulness to an attacker attempting to subvert a
cryptosystem.

I'd go further myself, and say that the objective of a cryptanalyst is
to do anything that a given cryptosystem shouldn't let him do, whether
that be reading messages, impersonating someone, or determining that
Alice is older than Bob.

-- [mdw]

Douglas A. Gwyn

Aug 27, 1998
Mark Wooding wrote:
> I'd go further myself, and say that the objective of a cryptanalyst is
> to do anything that a given cryptosystem shouldn't let him do, whether
> that be reading messages, impersonating someone, or determining that
> Alice is older than Bob.

That's a fairly useful way of characterizing it.
For example, cryptanalysis of a steganographic system
would be considered successful if it resulted in a
reliable test for the presence of a hidden message,
even if it didn't recover the message, because the
main purpose of steganography is to hide a message.

Alec Kosky

Aug 27, 1998
In article <jgfunj-2508...@dialup83.itexas.net>,
jgf...@EnqvbSerrGrknf.pbz (W T Shaw) writes:
> Frog, a miracle product of 4 and a half days of work, at least met the
> entrance requirements. It certainly presents a novel way of doing a
> cipher, a program in the key. This would seem to be worth study in the
> stream cipher direction.

I've been playing with the idea of using a key in the implementation
of a stream cipher. A different key should lead to a different
keystream from the generator. I have a design that I'm playing with,
but I haven't had the time required to analyze it thoroughly...
Right now it is in the design stage - school leaves little time for
much else... Since it is my senior project, I will have it analyzed
fairly well in the not-too-distant future (fingers crossed).

--Alec--

Lenz

Aug 27, 1998
In article , William says...

>
>I would like to read your response to what I took to be the point of the
>citation, i.e., the objective of modern cryptanalysis. In my lectures, I
>have always used the definition that cryptanalysis is the art of finding the
>message without benefit of the key.

Your definition impresses me as slightly narrow, since it seems to
exclude known-plaintext or chosen-plaintext attacks.

On the other hand, showing that someone could retrieve a key in some
system with 2**56 chosen plaintexts is substantially less than
finding real messages or keys. It might be open to debate whether
there is any practical use for this kind of analysis. I think there
is: it is likely to influence the result of the AES competition.

Lenz :-)

Jerry Leichter

Aug 28, 1998
to whmu...@sprynet.com
| I would like to read your response to what I took to be the point of
| the citation, i.e., the objective of modern cryptanalysis. In my
| lectures, I have always used the definition that cryptanalysis is the
| art of finding the message without benefit of the key. However, the
| work of Biham and Shamir does not seem to have this as a purpose.

Cryptography has expanded beyond its traditional definition as "the art
of making messages inaccessible to someone lacking the right key".
Thus, authentication, signatures, cryptographically-secure checksums,
and so on, have all come to be considered part of cryptography.

If cryptanalysis is defined as some kind of inverse to cryptography -
the art of doing what the cryptographer is trying to render impossible -
then it has naturally expanded its domain to match.

Attacks on cryptographically-secure checksums come under the modern
rubric of cryptanalysis. Since no key at all is involved, your
definition obviously can't apply!

Word meanings shift. In this case, the shift in meaning is natural,
since as we've learned more about cryptography, we've learned about
interconnections among things that previously seemed unrelated. Thirty
years ago, there was (at least in public knowledge) no apparent
connections among cryptography, digital signatures (if they were even
conceived of), hash functions (known only for use in hash tables), or
protocol design (known only as an issue for error correction). Today,
we know that these are all intimately intertwined. Hence, the domains
of cryptography and "anti-cryptography" (aka cryptanalysis) have
expanded.
-- Jerry

John Savard

Aug 28, 1998
al...@dakotacom.net (Alec Kosky) wrote, in part:

>In article <jgfunj-2508...@dialup83.itexas.net>,
> jgf...@EnqvbSerrGrknf.pbz (W T Shaw) writes:

yes, he did write,

and he did quote me

>> In article <35e18bb5...@news.prosurfr.com>,
>> jsa...@tenMAPSONeerf.edmonton.ab.ca (John Savard) wrote:

but nothing he quoted from me was reproduced; this is something he (W
T Shaw) said...


>> Frog, a miracle product of 4 and a half days of work, at least met the
>> entrance requirements. It certainly presents a novel way of doing a
>> cipher, a program in the key. This would seem to be worth study in the
>> stream cipher direction.

Some people who are inexperienced in reading headers might get
confused by this lack of trimming.

John Savard
http://members.xoom.com/quadibloc/index.html

Lincoln Yeoh

Aug 28, 1998
On Thu, 27 Aug 1998 10:46:54 -0400, William Hugh Murray
<whmu...@sprynet.com> wrote:

>I would like to read your response to what I took to be the point of the
>citation, i.e., the objective of modern cryptanalysis. In my lectures, I
>have always used the definition that cryptanalysis is the art of finding the
>message without benefit of the key. However, the work of Biham and Shamir
>does not seem to have this as a purpose. Your comments and observations
>please.

Well, I'm a real newcomer. So far, cryptanalysis seems to me to be
the analysis of cryptographic systems and methods: finding flaws and
inefficiencies, gaining better understanding, etc.

I believe there doesn't have to be a single objective of
cryptanalysis. It's just like other scientific analysis.

I propose that an objective of modern cryptanalysis is to improve
cryptography.

Link.
****************************
Reply to: @Spam to
lyeoh at @peo...@uu.net
pop.jaring.my @
*******************************

Owen Lewis

Aug 28, 1998
In article <35E5715D...@sprynet.com>
whmu...@sprynet.com "William Hugh Murray" writes:

>I would like to read your response to what I took to be the point of the
>citation, i.e., the objective of modern cryptanalysis. In my lectures, I
>have always used the definition that cryptanalysis is the art of finding the
>message without benefit of the key. However, the work of Biham and Shamir
>does not seem to have this as a purpose. Your comments and observations
>please.

Your view sounds fair enough to me, as it should since I share it.

However, manipulating the meaning of words is a means of tilting the
balance of a debate, changing the ground rules, subtly or otherwise.
ISTR that this was first formulated in Hegel's dialectic method and
was developed in the Marxian dialectic that we now all imbibe daily
as though it were mother's milk.

As Lewis Carroll put it, in the mouth of Humpty Dumpty, "Words mean what
I choose them to mean. The question is, who shall be the master".

Owen


W T Shaw

Aug 29, 1998
In article <35e6c34...@news.prosurfr.com>,
jsa...@tenMAPSONeerf.edmonton.ab.ca (John Savard) wrote:

> al...@dakotacom.net (Alec Kosky) wrote, in part:
>
> >In article <jgfunj-2508...@dialup83.itexas.net>,
> > jgf...@EnqvbSerrGrknf.pbz (W T Shaw) writes:
>
> yes, he did write,
>
> and he did quote me
> >> In article <35e18bb5...@news.prosurfr.com>,
> >> jsa...@tenMAPSONeerf.edmonton.ab.ca (John Savard) wrote:
>
> but nothing he quoted from me was reproduced, this is something he (W
> T Shaw) said...
> >> Frog, a miracle product of 4 and a half days of work, at least met the
> >> entrance requirements. It certainly presents a novel way of doing a
> >> cipher, a program in the key. This would seem to be worth study in the
> >> stream cipher direction.
>
> Some people who are inexperienced in reading headers might get
> confused by this lack of trimming.
>
> John Savard
> http://members.xoom.com/quadibloc/index.html

An accident I suppose; I have little chance of seeing old postings.

I keep a separate place setting for Murphy, who comes and goes without
advance notice, rather rude of him, of course. Lots of what John Savard
says tends to be in a quotable direction.

W T Shaw

unread,
Aug 29, 1998, 3:00:00 AM8/29/98
to
In article <6rnq25$k9n$4...@news.sas.ab.ca>, jsa...@freenet.edmonton.ab.ca
() wrote:

> dian...@tecapro.com wrote:
> : By the way, before starting my presentation Schneier came to shake
> : my hand - this was quite decent. I returned the compliment
> : mentioning in public that everything I knew about cryptography
> : came from Schneier's book and then making clear that any errors in
> : FROG are therefore indirectly attributable to him.
>
> LOL! (Laughing out loud.)

It would have been a boring event without some comic relief. I have
accused the schedulers of planning some diversions for the second day.
>
> However, you are too modest. As FROG is "off the beaten path", it involves
> some ideas you wouldn't have seen in Applied Cryptography.

Getting off the beaten path is a good way to see something different.
When Frog took one giant leap away from that path, such was expected. New
ideas should always be welcomed with joyous laughter.
>
> If you had read a book which described the SIGABA - which improves on an
> ordinary rotor machine by using another set of rotors to control rotating
> the main rotors - then you would have had a source of inspiration for
> FROG. (Incidentally, now that the AES candidates are in the open, finally,
> after examining them, my juices flowed, and my muse inspired me to design
> a 128-bit block block cipher...with ideas snatched from several of the AES
> candidates. I lifted, with changes, your method of generating a 256-byte
> S-box, but I couldn't find a place to use the main idea of FROG.)
>
As Shiho Moriai would say in a whisper, as a credit to such help, "Thank
you very much."

One should be carefully reminded, as attendees were, that the presentation
of any aspect of an algorithm does not automatically mean that any part of
it is in the public domain.

Bruce Schneier

unread,
Aug 30, 1998, 3:00:00 AM8/30/98
to
On Sun, 23 Aug 1998 08:25:19 GMT, dian...@tecapro.com wrote:
> Then came Bruce Schneier with Twofish. His presentation was
> excellent and you could see that he is used to writing books and
> explaining things to people. Twofish is a 16 round Feistel based on
> Blowfish. It creates four key-dependent S-boxes by modifying two
> fixed S-boxes. This is done very carefully and in the case of 128
> bit keys all created S-boxes (4 x 2^(32)) were exhaustively
> checked for quality - for larger key sizes extensive Monte Carlo
> tests were applied. Bruce plainly stated that he stole ideas from
> Square (I believe this was the 4x4 matrix multiply over GF(256))
> and from SAFER (the pseudo-Hadamard Transform used for diffusion).

Yes. I forgot to mention that the S-box construction was stolen from
CS-Cipher by Serge Vaudenay, although it may have been used elsewhere.
The Twofish paper gives credit.

> In the same vein he mentioned that Twofish is amenable to operate
> "in non-interoperable variants using a family key", mentioning that
> this is something that FROG also does. This I liked very much.
> Later on he mentioned that a very efficient version of Twofish
> "compiles" code - using the same words I used when describing what
> generalized Frog does. So, ideas present in FROG are starting to
> find their way into the mainstream of cryptography (I don't
> necessarily claim that they were introduced with FROG because I am
> not sure it is true.)

There were a few more similarities between FROG and Twofish that I did
not mention. We both generated 8x8-bit S-boxes out of 4-bit
permutations and fixed construction rules. We did that so the S-boxes
could be stored in applications where there is not enough memory for
two 256-byte tables.
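
A minimal sketch of that style of construction follows; the two 4-bit
permutations and the combining rule here are made up for illustration and
are not the actual Twofish or FROG rules:

    # Build a 256-entry 8x8 S-box from two 4-bit permutations plus a
    # fixed combining rule, so only 2 x 16 nibbles need storing instead
    # of a full 256-byte table. (q0 and q1 are arbitrary permutations
    # chosen purely for illustration.)
    q0 = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
          0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]
    q1 = [0x7, 0xD, 0xE, 0x3, 0x0, 0x6, 0x9, 0xA,
          0x1, 0x2, 0x8, 0x5, 0xB, 0xC, 0x4, 0xF]

    def sbox8(x):
        hi, lo = x >> 4, x & 0xF
        # hi' depends only on hi, and lo' on (lo XOR hi), so the byte
        # map stays bijective.
        return (q0[hi] << 4) | q1[lo ^ hi]

    table = [sbox8(x) for x in range(256)]
    assert len(set(table)) == 256  # the construction is a permutation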

> Back to Bruce's presentation, which was full of interesting and
> educational points: He mentioned that very often in the design
> process of Twofish they had to choose between more complex
> rounds and a larger number of rounds. For example, by taking
> out the rotations included in Twofish they could add one more
> round to the cipher with no loss of efficiency. What is very
> interesting is that, in general, they found that more rounds of a
> simpler structure are stronger than a few complex rounds.

This was, in some ways, a surprise. We started out with a much more
complicated round function, and took things out until we could not
take any more out.

> Another
> interesting point he mentioned is that the internal cipher design
> and the key-schedule are interdependent - you cannot design the
> two independently and then stick them together. Yet another
> interesting point is that "in cipher design it pays to put many
> eggs in few baskets" (he was explaining the fact that his key
> setup re-uses cryptographic primitives already there), because you
> make sure that each of these few baskets is strong enough.
>
> Twofish is fast, practically as fast as RC6 on the NIST standard
> platform, achieving about 2.2 CPU cycles per bit. It uses (or can
> use) only byte-wide, "RISC-like" instructions, as do several other
> candidates, and is therefore efficient on 8-bit processors too.

Yes. RC6 is a few clock cycles faster than us on the Pentium Pro and
Pentium II, and we are about three times faster than them on a
Pentium. The data-dependent rotations and large multiplications kill
RC6 performance on all but the highest end platforms. And they will
be slower on the Alpha; it doesn't have a rotate instruction.

> What he pushed is the idea that Twofish offers the best balance
> between security, speed *and* implementation flexibility. He
> showed that there are many trade-offs possible when you implement
> Twofish depending on the memory available and speed needed. For
> example there are several implementation models starting with
> almost complete "on the fly" key schedule computation with minimum
> memory and ending with the fastest versions that do a lot of
> pre-computation. This flexibility can be used not only depending
> on available memory but also depending on whether you are going to
> encrypt many or few blocks with the same key. On the whole it is a
> convincing argument, and I thought it would be an important point
> because the AES *is* going to be used in operational environments
> that are vastly different.

We tried to stress that Twofish gives both speed and implementation
flexibility, along with security and a conservative design. Other
algorithms are also secure, but the design elements are newer and less
well understood. We chose, in the beginning, not to design a cipher
that would make for interesting science or a good academic paper. We
chose to design a conservative cipher for operational use.

> This will not be my last report about the Conference. Many of the
> most interesting things I learned here came from private
> discussions with top participants - I intend to include
> these in one last post.

Would it be possible for you to send me your earlier posts? I missed
them on the newsgroup.

Bruce
**********************************************************************
Bruce Schneier, President, Counterpane Systems Phone: 612-823-1098
101 E Minnehaha Parkway, Minneapolis, MN 55419 Fax: 612-823-1590
Free crypto newsletter. See: http://www.counterpane.com

Bruce Schneier

unread,
Aug 30, 1998, 3:00:00 AM8/30/98
to
On Tue, 25 Aug 1998 20:58:18 GMT, dian...@tecapro.com wrote:

>
>
> This is my last report about the First AES Conference. Here I
> describe various ideas that came out of informal discussions with
> other participants. What follows is my personal understanding of
> what I heard - also I may be mixing my own ideas in the brew -
> therefore I will not mention any names.
>
> 1. On the relative security of public key cryptography against
> symmetric key cryptography. Three very knowledgeable people, two
> of whom mainly work with public key cryptography, clearly stated
> that they would be less surprised to see a cryptanalytic
> breakthrough against public key than against symmetric ciphers.
> This goes against the more prevalent view that public key
> cryptography is slower but more secure "because it is based on
> pure mathematics".

I agree. Is that the prevalent view? I always thought that most
people thought public-key cryptosystems vulnerable because they are
based on pure mathematics.

> 2. On the security of symmetric ciphers. Here I got mixed
> impressions. On the one hand there is a great deal of confidence
> that the better symmetric ciphers are very strong indeed and that
> the probability that an even remotely workable attack (not a
> theoretical attack) against them will be found is almost nil. On
> the other hand the body language of the AES candidates' presenters
> did (I think) reveal worry in this sense. Most added "unnecessary"
> rounds to their ciphers, up to the extreme of Serpent that doubled
> them - and this in a competition where speed seems to be of great
> importance. Then again, maybe, they are only afraid of theoretical
> attacks (certificational weaknesses) that may derail their chances
> in the AES process.

Theoretical attacks are real, because they show flaws in the
underlying design model. I don't think that serious attacks will be
discovered against the better submissions, but there will be minor
things. It is VERY unlikely that the ciphers will be broken to a
point where an attacker can read traffic in most settings.

> 3. On the matter of the unknown attacks. This factor is coming out
> of the closet - a memorable phrase was that designing a cipher is
> "like fighting against ghosts". Several presenters (AES candidates
> MARS, E2, TWOFISH) specifically discussed why their cipher may
> resist unknown attacks, SERPENT implicitly tries to defend against
> them, and, of course, FROG makes unknown attacks a central issue
> in design methodology.

It will be interesting to see how all the ciphers fare against Biham
and Shamir's new impossible cryptanalysis. We can see how well the
ciphers defended themselves against an "unknown attack."

> 6. On the importance of speed. This I kept asking everybody about and
> I am still not very convinced. Against the "old" idea that DES is
> slow in software, in fact bit-slicing implementations are quite
> fast. 3DES today achieves speeds of approximately 120 CPU cycles
> per byte on a 32 bit computer, or 3 Mbytes per second on a 400 MHz
> Pentium. (This kind of DES speed, by the way, is the standard
> against which the AES candidates will be measured). This kind of
> speed is sufficient for most personal computer applications up to
> compressed real-time video.

Don't think 32-bit CPUs. To a first approximation, all encryption is
done on small 8-bit CPUs: smart cards, pay-TV meters, etc. Most of
the rest of encryption is done in hardware, so think hardware speed
and gate count. Agreed that the high-end microprocessors will always
get better, but so what.

And don't think that you have the entire processor to work with. Yes,
current DES implementations can encrypt video, but assume the computer
is going to have to do a lot of other things at the same time.

> With Moore's law still looking strong,
> in a few years time we should have the capacity of over 10 Mbytes
> per second of 3DES on a medium size PC. At the other extreme of
> the spectrum, smart cards need only encrypt a few bytes at a time
> - speed here is no big concern. So why the hurry? One response I
> got is that throughputs will also increase; if you have a channel
> of several Gbps (Gigabits per second) and you want to encrypt it,
> what do you do? This seems to me a very uncommon application - for
> datastreams to arrive at this throughput it means they got
> concentrated from several sources and they should have been
> encrypted at the source; if they arrive un-encrypted at the
> concentrator then it is already too late for security.

Nonsense. Think routers. Think firewalls. Think proxy servers. Think
the Internet. Companies have these requirements today. Some are
trying to put IPSec into hardware. Others are trying to encrypt audio
and video over the net.

> There may
> be applications where speed is really necessary - I am thinking
> about pay-per-view cable TV where one may want to encrypt each
> channel with a different password before sending it out - but
> again these are very specific and isolated cases. In fact I
> believe throughputs will *not* increase indefinitely in the
> future: even crystal-clear phone (or video-phone) conversation does
> not require that much throughput, and the number of people and the
> number of hours per day is limited too. What I am saying is that
> speed based on advances in hardware technology will probably grow
> faster than broad-band requirements.

Yes. And so will all applications. The world doesn't wait for
crypto; crypto has to fit into the world.

> 8. On the importance of memory. It is important that the AES use
> little memory in order to work well in smart cards, where RAM is
> sometimes limited to 256 bytes. Here Moore's law does not apply
> very well for the following reason: most smart card readers
> operate at relatively high voltages (3 V - 5 V). A more densely
> populated chip must work at lower voltages, and decreasing the
> voltage in the card becomes difficult beyond some point because of
> heat dissipation problems.

Yes.

Cheers,

Bruce Schneier

unread,
Aug 30, 1998, 3:00:00 AM8/30/98
to
On 23 Aug 1998 14:00:33 GMT, jsa...@freenet.edmonton.ab.ca () wrote:

>dian...@tecapro.com wrote:
>: Apparently an attack against
>: DEAL has been found also - this brings the number of hit
>: candidates to 4: LOKI97, MAGENTA, DEAL and (for weak keys) FROG.
>
>This is big news.
>
>DEAL, like MISTY and SKIPJACK, uses a Feistel round cipher as an
>f-function. So, an attack against DEAL implies a possible attack against
>these ciphers.
>
>On top of that, there was, I think, a proof in Volume 8 of Journal of
>Cryptology - the second author was Knudsen - (but I may be
>misunderstanding the reference I saw in the MAGENTA document) that ciphers
>with that type of structure were _provably_ immune to both differential
>and linear cryptanalysis.
>
>Since DEAL with a 256-bit key is clearly slower than Triple-DES, I thought
>_that_ would be sufficient to blow it out of the water, anyhow.

There is supposed to be a new attack by Stephan Lucks against DEAL; I
have not seen details.

Bruce Schneier

unread,
Aug 30, 1998, 3:00:00 AM8/30/98
to
On 25 Aug 1998 13:24:10 GMT, pgu...@cs.auckland.ac.nz (Peter Gutmann)
wrote:

>
>
>dian...@tecapro.com writes:
>
>>The most amazing part was the claim that it can encrypt fractional bits. Here
>>is the example he gave: Say you want to encrypt a digit (0..9) which you
>>cannot represent in a whole number of bits; you put it in 4 bits and encrypt
>>them; if the result is not in 0..9 then you encrypt it again; and so on until
>>you get a digit result. Let's see an example: you want to encrypt 5 -> 13
>>(this is not good so let's re-encrypt) -> 3. To decrypt: 3 -> 13 (this cannot
>>be so let's re-decrypt) -> 5. Using this mechanism you can now encrypt any
>>data type into the same data-type: say dates into dates, printable characters
>>into printable characters, etc. Tasty
>
>You can do that with any cipher, I posted a technique for doing this here
>about 1 1/2 years ago and there were a number of followups and comments on it
>(the thread was "Encrypting data with a restricted range of values", starting
>on 23rd January 1997, Dejanews should have it). This is currently being used
>with 3DES to protect data going over comms channels with very weird
>properties (they have certain groups of characters which they'll pass, but
>others which are either blocked or modified).

Agreed. This is not something that only HPC can do; it is a general mode
that can be used with any algorithm. One of my complaints about HPC
is that it takes many things such as this, which are best defined
outside the block cipher, and defines them within the cipher.
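
A minimal sketch of the mode, with a made-up 4-bit permutation standing in
for a real block cipher (any cipher whose block is just wider than the
value range would do):

    # Toy 4-bit "cipher": a fixed permutation of 0..15. Re-encrypting
    # until the output falls back into 0..9 gives a bijection on the
    # digits ("cycle walking"); it terminates because the permutation's
    # cycle containing d must re-enter 0..9 eventually.
    PERM = [13, 5, 0, 9, 14, 2, 11, 7, 4, 12, 1, 15, 8, 3, 6, 10]
    INV = [PERM.index(i) for i in range(16)]

    def encrypt_digit(d):
        x = PERM[d]
        while x > 9:        # landed outside 0..9: encrypt again
            x = PERM[x]
        return x

    def decrypt_digit(c):
        x = INV[c]
        while x > 9:        # walk the same cycle backwards
            x = INV[x]
        return x

    assert all(decrypt_digit(encrypt_digit(d)) == d for d in range(10))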

Paul L. Allen

unread,
Aug 30, 1998, 3:00:00 AM8/30/98
to
In article <35e99fd...@news.visi.com>
schn...@counterpane.com (Bruce Schneier) writes:

> On Tue, 25 Aug 1998 20:58:18 GMT, dian...@tecapro.com wrote:
> > 6. On the importance of speed. This I kept asking everybody about and
> > I am still not very convinced. Against the "old" idea that DES is
> > slow in software, in fact bit-slicing implementations are quite
> > fast. 3DES today achieves speeds of approximately 120 CPU cycles
> > per byte on a 32 bit computer, or 3 Mbytes per second on a 400 MHz
> > Pentium. (This kind of DES speed, by the way, is the standard
> > against which the AES candidates will be measured). This kind of
> > speed is sufficient for most personal computer applications up to
> > compressed real-time video.
>
> Don't think 32-bit CPUs. To a first approximation, all encryption is
> done on small 8-bit CPUs: smart cards, pay-TV meters, etc. Most of
> the rest of encryption is done in hardware, so think hardware speed
> and gate count. Agreed that the high-end microprocessors will always
> get better, but so what.

Umm, *do* think 32-bit CPUs. The ARM family of 32-bit CPUs is rapidly
becoming the most popular CPU family because of its small gate count, low
power dissipation and high performance. It is available from multiple
(around 20 last time I looked) semiconductor houses. Intel's (formerly
Digital's before they shot themselves in the foot repeatedly and then
gave away rights to the design along with their semi fabs when Intel
hadn't asked for it) StrongARM has integer performance comparable with
the Pentium at the same clock speed for 1/25th the die size and 1/25th
the power dissipation.

A lot of smart cards these days use one of the ARM family CPUs...

--Paul

dian...@tecapro.com

unread,
Aug 30, 1998, 3:00:00 AM8/30/98
to
In article <jgfunj-2508...@dialup83.itexas.net>,
jgf...@EnqvbSerrGrknf.pbz (W T Shaw) wrote:
> [...]

> Frog, a miracle product of 4 and a half days of work, at least met the
> entrance requirements. It certainly presents a novel way of doing a
> cipher, a program in the key. [...]

Five and a half. To be clear about this, we started working on the
submission package five and a half days before the latest possible
moment we would have to send it by courier to NIST in order to
make the deadline. We just managed to send it on time, and I
remember we were disappointed when we found out afterwards that
NIST had extended the closing date from Monday June 15 to Friday
June 19; this sounded like an awful lot of additional time we could
have used. - The FROG idea, as well as some code, had been around
for weeks or months.

Your expression of FROG being a "miracle" product made me think.

FROG is certainly a very simple cipher and the fact that the
submission package was prepared so quickly shows that NIST's
formal requirements were not really that complex. NIST, by the way,
is quite happy about the number of candidates it got: a much
higher number or a much lower number would either complicate or
invalidate the process.

What may seem miraculous about FROG is not that it was accepted as
a candidate after so little work, but rather how much security it
seems to offer in such a simple structure. Schneier's paper (which
I still haven't read in detail) shows the existence of a small set
of weak keys - this is a true weakness but not a general type of
attack. Also FROG achieves its strength after 8 simple rounds
whereas many other strong AES candidates contain between 12
and 48 rounds. Schneier's paper estimates that FROG with 16 rounds
would not have any weak keys. Let us compare FROG with DES: DES
with 8 rounds can be broken much more easily than FROG, and for all
keys, and much more effort and experience have gone into the
design of DES than into FROG. Now, of course, to compare FROG with
DES is not quite acceptable because a *LOT* more cryptanalysis
has gone into DES and because DES was designed under a
different set of criteria than the AES candidates. It would be
more valid to compare all AES candidates after the same number of
rounds or, even better, after the same quantity of CPU time
invested.

What I would like to call "miraculous" is what FROG's design
methodology tries to achieve: a cipher that is strong against
known attacks without actually trying. This is the basic argument
why FROG-like ciphers may be strong against unknown attacks also.
Other AES candidate ciphers were specifically designed to resist
known attacks. Therefore, even if FROG is shown to be
theoretically weaker than other candidates against known attacks,
I think it is valid to argue that we gain more confidence in the
ability of FROG to resist yet unknown attacks that will be discovered
in the next century.

I am quite aware that for now my arguments are rather vague - I
had better sit down and do some serious analysis on FROG myself
(Bruce Schneier and his associates have already shown the way). I
have little experience in this - if the reader is interested in
collaborating with me, please let me know by email.

--
http://www.tecapro.com
email: dian...@tecapro.com

-----== Posted via Deja News, The Leader in Internet Discussion ==-----
http://www.dejanews.com/rg_mkgrp.xp Create Your Own Free Member Forum

STL137

unread,
Aug 31, 1998, 3:00:00 AM8/31/98
to
*ahem* Biham and Shamir's "impossible cryptanalysis"? What's that? I haven't
heard of that at all.... *confused*
------
STL...@aol.com
"I have sworn upon the altar of God eternal hostility against every form of
tyranny over the mind of man" - Thomas Jefferson
Website: http://members.aol.com/stl137/ PGP keys: ~~~pgp.html Quotes:
~~~quotes.html

John Savard

unread,
Aug 31, 1998, 3:00:00 AM8/31/98
to
jgf...@EnqvbSerrGrknf.pbz (W T Shaw) wrote, in part:

>One should be carefully reminded, as attendees were, that the presentation
>of any aspect of an algorithm does not automatically mean that any part of
>it is in the public domain.

Yes, I'm well aware of that. LOKI 97 is one of the public ones - and
it is the only one I've borrowed anything substantial from. Neither
shift registers nor block ciphers with a layered structure are newly
invented by MARS, so I think I'm safe there; I haven't yet found the
patent, however, to check its claims.

John Savard
http://members.xoom.com/quadibloc/index.html

Paul Crowley

unread,
Aug 31, 1998, 3:00:00 AM8/31/98
to
schn...@counterpane.com (Bruce Schneier) writes:
> On Tue, 25 Aug 1998 20:58:18 GMT, dian...@tecapro.com wrote:
> > This goes against the more prevalent view that public key
> > cryptography is slower but more secure "because it is based on
> > pure mathematics".
>
> I agree. Is that the prevalent view? I always thought that most
> people thought public-key cryptosystems vulnerable because they are
> based on pure mathematics.

The "more prevalent" view is certainly one voiced by newcomers to
sci.crypt from time to time...
--
__
\/ o\ pa...@hedonism.demon.co.uk Edinburgh fetish club Permission \ /
/\__/ Paul Crowley Sept 13 http://www.hedonism.demon.co.uk/permission /~\

W T Shaw

unread,
Aug 31, 1998, 3:00:00 AM8/31/98
to
In article <35eb03e2...@news.prosurfr.com>,
jsa...@tenMAPSONeerf.edmonton.ab.ca (John Savard) wrote:

With the international array of participants, and since laws on patents
and processes vary widely, it is best not to assume anything. A lawyer
might suggest that it is best to ask directly of a source what the
situation is with their work. Several announced public domain status, but
you should have it in more formal form. But, just because someone used a
method and said it was public domain does not mean that someone else had
not protected an important part of it before.

In an opposing spirit - I keep an open debate with myself, not being one
to take fear as the premise for starting one's day, or one's design - fair
notice and open questioning might be a more realistic way to see if
something is free and clear. After all, not that many people are actually
involved in crypto, and they do, or should, read or receive second-hand
input from sci.crypt regularly.

ma5...@my-dejanews.com

unread,
Sep 1, 1998, 3:00:00 AM9/1/98
to
In article <35e99fd...@news.visi.com>,
schn...@counterpane.com (Bruce Schneier) wrote:

<snip>

> Nonsense. Think routers. Think firewalls. Think proxy servers. Think
> the Internet. Companies have these requirements today. Some are
> trying to put IPSec into hardware. Others are trying to encrypt audio
> and video over the net.
>

I am just beginning a project and anticipate having to do this. What would be
good for encrypting a RealAudio stream?

<snip>


ma5ter

Douglas A. Gwyn

unread,
Sep 1, 1998, 3:00:00 AM9/1/98
to
Paul Crowley wrote:

> schn...@counterpane.com (Bruce Schneier) writes:
> > On Tue, 25 Aug 1998 20:58:18 GMT, dian...@tecapro.com wrote:
> > > This goes against the more prevalent view that public key
> > > cryptography is slower but more secure "because it is based on
> > > pure mathematics".
> > I agree. Is that the prevalent view? I always thought that most
> > people thought public-key cryptosystems vulnerable because they are
> > based on pure mathematics.
> The "more prevalent" view is certainly one voiced by newcomers to
> sci.crypt from time to time...

Okay, guys. What difference does it make what a public opinion poll
says?

Experience shows that all cryptosystems have *some* vulnerabilities
unless you can logically demonstrate that they have not.
The practical question is, who is the cryptosystem trying to hide
secrets from and how likely is it that they will be able to find
and exploit vulnerabilities? Factor that in with the consequences
of the "enemy" learning the secrets, operational costs, etc. to make
a rational decision about whether the system is suitable (taking into
account similar analyses for the alternatives).
It does appear that, apart from a few cryptologic organizations,
there is not much known about how to perform these vulnerability
analyses. That means that, unless a devastating attack is publicly
known, public selection of cryptosystems is largely guesswork.
(I.e., just because you can show that a handful of known methods
of cryptanalytic attack won't succeed, doesn't mean that nobody
knows of, or could discover, yet another attack that will succeed.)

Mathematical methods are used in attacks against various kinds of
cryptosystem. The advantage of such a system as RSA, however, is
that it is unlikely that anybody knows *much* more about attacking
it than is known to the public researchers. (If it is true that
any successful general attack on RSA will also be equivalent to
an effective method of factoring arbitrarily large products of
primes - which, by the way, I don't know to be really true -
then many of the people who might discover a solution are
working in the public sector and *have been for centuries*.)
That to some degree justifies the notion that such systems are
fairly secure, while the general lack of knowledge about most
stream and block encipherment methods tends to make one worry
about their security.

Thus, a successful attack against RSA might constitute a
mathematical breakthrough; a successful attack against, say,
IDEA might already have been accomplished for all we know.

jsa...@freenet.edmonton.ab.ca

unread,
Sep 2, 1998, 3:00:00 AM9/2/98
to
dian...@tecapro.com wrote:
: I am quite aware that for now my arguments are rather vague - I
: had better sit down and do some serious analysis on FROG myself

Of course, one of the reasons that FROG is likely to be secure against an
unknown attack may be because a cipher like it is very hard to analyze.

It's specifically in order to have something they can understand well
enough to analyze, and say something meaningful about, that many designs
are oriented towards a specific set of known attacks. This is a sensible
academic approach: for the rough-and-tumble "real world", I would
recommend mixing that approach with the use of the most powerful
constructs possible to stymie any apparent hope of cryptanalysis - there
are ways of ensuring that a hard to analyze addition to a cipher is very
unlikely to weaken it (independent, or nearly-independent keys).

John Savard

Terry Ritter

unread,
Sep 2, 1998, 3:00:00 AM9/2/98
to

On 2 Sep 1998 03:39:44 GMT, in <6siem0$gd4$1...@news.sas.ab.ca>, in
sci.crypt jsa...@freenet.edmonton.ab.ca () wrote:

>dian...@tecapro.com wrote:
>: I am quite aware that for now my arguments are rather vague - I
>: had better sit down and do some serious analysis on FROG myself
>
>Of course, one of the reasons that FROG is likely to be secure against an
>unknown attack may be because a cipher like it is very hard to analyze.

In contrast, I have always thought that a cipher should be designed to
be *as* *easily* *analyzed* *as* *possible*.

The analysis we can support and share in the open literature is likely
to be far *less* than can be done by those with greater resources who
work in secret. So a design which makes analysis difficult tends to lock
*us* out of deep understanding, yet may have little effect on our
Opponents. This means the attackers may well understand the defense
better than the defenders, which is not a good situation.

This is a classic problem of defense: The attackers may see weaknesses
which the defenders do not realize, and then exploit them. But in
crypto the effects are greatly worsened because failure of the defense
can occur without external signs. The defenders will not respond
because they will not know they are being repeatedly defeated.

This is why I think it is important to have a cipher which we can
understand deeply. I suggest that any such cipher *must* scale down
to tiny size so that we at least have a chance of finding abstract
signs of weakness experimentally.


>It's specifically in order to have something they can understand well
>enough to analyze, and say something meaningful about, that many designs
>are oriented towards a specific set of known attacks.

Many new cipher designs do seem almost ad hoc, or specifically built
to defeat the old attacks. But in any fixed design the designer is
obviously unable to respond to *future* attacks, so it is not clear
that this design approach is sufficient. The ideal would be some
sort of "strong simplicity" which inherently handles the old attacks,
thus giving us some confidence that it may handle new attacks as well.


>This is a sensible
>academic approach: for the rough-and-tumble "real world", I would
>recommend mixing that approach with the use of the most powerful
>constructs possible to stymie any apparent hope of cryptanalysis - there
>are ways of ensuring that a hard to analyze addition to a cipher is very
>unlikely to weaken it (independent, or nearly-independent keys).

Perhaps ideally we would like to see multiple distinct ciphering
constructs inside the cipher. These would function sequentially as a
sort of internal multiple ciphering in "layers," as opposed to simply
increasing the number of "rounds" of a single structure.
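
As a toy sketch of the distinction (both layers below are made up purely
for illustration; neither is a serious design):

    # Two structurally distinct, individually invertible layers applied
    # in sequence, rather than one round structure repeated.
    MASK = 0xFFFF
    SBOX = [0x6, 0x4, 0xC, 0x5, 0x0, 0x7, 0x2, 0xE,
            0x1, 0xF, 0x3, 0xD, 0x8, 0xA, 0x9, 0xB]

    def sub_layer(x):
        # Substitution layer: each nibble of a 16-bit word goes through
        # a 4-bit S-box.
        return sum(SBOX[(x >> s) & 0xF] << s for s in (0, 4, 8, 12))

    def arx_layer(x, k):
        # A different construct entirely: add a subkey, then rotate.
        x = (x + k) & MASK
        return ((x << 5) | (x >> 11)) & MASK

    def encrypt(x, subkeys):
        # "Internal multiple ciphering": alternate the two layers.
        for k in subkeys:
            x = arx_layer(sub_layer(x), k)
        return x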

Crypto Glossary 1998-08-27: http://www.io.com/~ritter/GLOSSARY.HTM

Terry Ritter

unread,
Sep 2, 1998, 3:00:00 AM9/2/98
to

On Tue, 01 Sep 1998 03:35:40 GMT, in <35EB6B83...@null.net>, in
sci.crypt "Douglas A. Gwyn" <DAG...@null.net> wrote:

>[...]


>Experience shows that all cryptosystems have *some* vulnerabilities
>unless you can logically demonstrate that they have not.

>[...]


>That means that, unless a devastating attack is publicly
>known, public selection of cryptosystems is largely guesswork.
>(I.e., just because you can show that a handful of known methods
>of cryptanalytic attack won't succeed, doesn't mean that nobody
>knows of, or could discover, yet another attack that will succeed.)
>

>The advantage of such a system as RSA, however, is
>that it is unlikely that anybody knows *much* more about attacking
>it than is known to the public researchers.

>[...]


>
>Thus, a successful attack against RSA might constitute a
>mathematical breakthrough; a successful attack against, say,
>IDEA might already have been accomplished for all we know.

I think this was a *great* summary of the strength situation.

But in predicting the future, I think we could just as easily argue
that there might be an essential *weakness* in RSA-like systems.

The main thing that disturbs me about number-theoretic crypto is that
so many great mathematicians have individually had the intuition that
there *ought* to be a way to factor quickly. Disregarding the
intuition of a great mathematician is not something to take lightly.

It does not particularly encourage me that no such way has been found
by these very same men. One might well hypothesize the existence of
factoring methods which are conceptually complex, yet computationally
easy. The ultimate solution process may even be more complex than a
person *can* understand, at least at any one time. But that does not
necessarily make it a hard computation.

For example, perhaps in an "easy" approach there are a vast number of
particular cases, but the desired case can be found relatively
quickly, and then solved directly. This may not be the sort of thing
that would reduce well to mathematical expressions or reasoning, but
the computer is not so constrained. (This handwave is not intended to
be taken literally -- if a process really is "conceptually difficult"
we are not going to be describing it in a paragraph. But we *can*
describe the idea that there might be such a thing.)

Even though there is a long history of factoring in mathematics, there
is not a long history of applying computers to mathematical reasoning
about anything, let alone factoring. This means that we have to worry
about the possibility of essentially new forms of mathematical thought
which become practical based on new tools. And if significant
computer-aided reasoning ever develops, our comforting history of past
factoring failure could be irrelevant.

bo...@rsa.com

unread,
Sep 2, 1998, 3:00:00 AM9/2/98
to
In article <35ed82e2...@news.io.com>,

rit...@io.com (Terry Ritter) wrote:
>
> On Tue, 01 Sep 1998 03:35:40 GMT, in <35EB6B83...@null.net>, in
> sci.crypt "Douglas A. Gwyn" <DAG...@null.net> wrote:

> The main thing that disturbs me about number-theoretic crypto is that
> so many great mathematicians have individually had the intuition that
> there *ought* to be a way to factor quickly.

A rather bold assertion. Please name the mathematicians or cite references.
Exact quotes might be useful.

I guarantee that I know most of the mathematicians working in this
area. I don't know any who have said what you claim.

Jerry Leichter

unread,
Sep 2, 1998, 3:00:00 AM9/2/98
to Terry Ritter
| The main thing that disturbs me about number-theoretic crypto is that
| so many great mathematicians have individually had the intuition that
| there *ought* to be a way to factor quickly....

|
| It does not particularly encourage me that no such way has been found
| by these very same men. One might well hypothesize the existence of
| factoring methods which are conceptually complex, yet computationally
| easy. The ultimate solution process may even be more complex than a
| person *can* understand, at least at any one time. But that does not
| necessarily make it a hard computation.

A great quote due to Alan Perlis: All the truths of mathematics are
simple. If there are any non-simple truths, mathematicians will never
find them.

| Even though there is a long history of factoring in mathematics, there
| is not a long history of applying computers to mathematical reasoning
| about anything, let alone factoring. This means that we have to worry
| about the possibility of essentially new forms of mathematical thought
| which become practical based on new tools. And if significant
| computer-aided reasoning ever develops, our comforting history of past
| factoring failure could be irrelevant.

I agree absolutely.

Consider that the best currently known methods for factoring use
randomization: Construct enough cases at random, and eventually you can
paste them together into a set of factors. This notion of an algorithm
is very new - at best, 40-50 years old, though in terms of actual
practice, perhaps no more than 25 years old. Mathematics in the past
has dealt with proofs, which may be constructive or non-constructive.
Constructive proofs have historically been given as deterministic
algorithms.
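
For concreteness, a minimal sketch of one such randomized method, Pollard's
rho (mid-1970s); the iteration constant and starting point are drawn at
random, and n is assumed composite:

    import math
    import random

    def pollard_rho(n):
        # Randomized factor search: iterate x -> x^2 + c (mod n) from a
        # random start; a collision modulo an unknown prime factor of n
        # shows up as a nontrivial gcd.
        if n % 2 == 0:
            return 2
        while True:
            c = random.randrange(1, n)
            x = y = random.randrange(0, n)
            d = 1
            while d == 1:
                x = (x * x + c) % n        # tortoise: one step
                y = (y * y + c) % n
                y = (y * y + c) % n        # hare: two steps
                d = math.gcd(abs(x - y), n)
            if d != n:                     # nontrivial factor found
                return d

    # e.g. pollard_rho(8051) returns 83 or 97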

The complexity of the algorithms was seldom at issue. Even early work
on computability looked at Turing machines plus coin tosses, showed that
anything computable with coin tosses was also computable without them,
and stopped there.

It takes a fairly well developed theory of computational complexity
before you can really start looking at randomized algorithms in any
meaningful way. There simply is no "long history of great
mathematicians" attacking factoring using the kinds of tools that seem, at
present, to be the most effective - the tools simply hadn't been
conceived, or if conceived, could not be fit into contemporary notions
of "reasonable computation".

It would not shock me to learn of a factoring algorithm efficient enough
to work with, say, 10,000 bit numbers today. It wouldn't even shock me
to learn of a poly-time factoring algorithm. I've mentioned this to
mathematicians familiar with the field, and they've generally agreed.

-- Jerry

W T Shaw

unread,
Sep 2, 1998, 3:00:00 AM9/2/98
to
In article <35ed8261...@news.io.com>, rit...@io.com (Terry Ritter) wrote:
>
> Perhaps ideally we would like to see multiple distinct ciphering
> constructs inside the cipher. These would function sequentially as a
> sort of internal multiple ciphering in "layers," as opposed to simply
> increasing the number of "rounds" of a single structure.
>
Rounds mean reuse, which can be risky...how many are needed, how few are
necessary, how many are suggested and why.

Here are some round numbers:

48--Cast-256
8--DFC
32--Mars
6--Magenta
10/12/14--Rijndael
16--Loki97
8--Deal
20--RC6
12,2--E2
32--Serpent
8--Frog
0--HPC
12--Crypton
16--Twofish
8/12/16--Safer+

Douglas A. Gwyn

unread,
Sep 3, 1998, 3:00:00 AM9/3/98
to
Terry Ritter wrote:
> It does not particularly encourage me that no such way has been found
> by these very same men. One might well hypothesize the existence of
> factoring methods which are conceptually complex, yet computationally
> easy. The ultimate solution process may even be more complex than a
> person *can* understand, at least at any one time. But that does not
> necessarily make it a hard computation.

Well, yes and no. There is certainly precedent for your idea,
for example the "resolution principle" used in logic programming
systems is difficult for people to comprehend and virtually
impossible for them to accomplish mentally, yet it is readily
implemented on a computer and provides efficient solutions.
On the other hand, before *any* algorithm is programmed, *some*
human(s) must have figured it out. The current best factoring
approaches seem to require computers, but of course they were
devised by mathematicians.

There *are* some very interesting mathematical "breakthroughs"
that are essentially computational; for example, Petkovsek,
Wilf, and Zeilberger's amazing book "A=B" shows how to *easily*
find and prove various useful identities/theorems, using certain
computer-based algorithms, that might never be found "by hand".
(See
http://www.sigmaxi.org/amsci/issues/Sciobs97/sciobs97-11gorilla.html
for an easily understandable capsule summary of this work.)

> For example, perhaps in an "easy" approach there are a vast number of
> particular cases, but the desired case can be found relatively
> quickly, and then solved directly. This may not be the sort of thing
> that would reduce well to mathematical expressions or reasoning, but
> the computer is not so constrained.

Certainly, if you can program a computer to search through a
manageable set of cases to find one meeting some specified
condition, it can at times be useful. But I wouldn't call
it necessarily non-mathematical. It's just a different way
of using mathematics.

When I was first studying mathematics (perhaps when you were too),
emphasis was on "closed form" solutions and "nice" behavior.
In a way I feel gypped, because it turned out that *most*
functions are not "nice" and *most* problems don't have
closed-form solutions. There's a lot of power in the classical
methods *when they work*, but it's not always clear when they
are capable of solving specific problems.

> Even though there is a long history of factoring in mathematics, there
> is not a long history of applying computers to mathematical reasoning
> about anything, let alone factoring.

Well, not *as* long a history, but (for example) I was just
one of many who applied computers to solve problems in
combinatorial logic as long ago as the early 1960s, and
much of the Bletchley Park "prehistoric" computing was
targeted at specific mathematical problems that happened
to have relevance in cryptanalysis.

> This means that we have to worry
> about the possibility of essentially new forms of mathematical thought
> which become practical based on new tools. And if significant
> computer-aided reasoning ever develops, our comforting history of past
> factoring failure could be irrelevant.

I think we now have enough evidence of what forms
"computer-aided reasoning" about mathematics might take,
to judge the general shape of the future in that regard.
There doesn't seem to be any danger of computers "thinking"
in ways that people didn't first devise, but they are tools
that can *apply* unorthodox "thinking" methods much faster
and more accurately than people can. (Examples given above.)
In particular, they can try out large sets of possibilities
that people cannot or would not want to tackle. (Witness
the proof of the 4-color theorem.)

This means that researchers who make use of computing can
indeed head off in directions that otherwise would not have
been pursued, but that isn't any different really from the
invention of calculus, complex analysis, group theory,
category theory, algebraic K-theory, or whatever.
Mathematicians keep devising new tools that allow us to
do things we could not do before. They're even making
*some* progress on factoring large numbers!

Douglas A. Gwyn

unread,
Sep 3, 1998, 3:00:00 AM9/3/98
to
Please be more careful when including attributions in message text to
which you're replying. In the following, you make it appear that *I* had
said the "rather bold assertion", but of course I didn't. Apparently
(from the >s) it was said by Ritter. There is no reason for my name to
have appeared in BobS's post at all.

bo...@rsa.com wrote:
>
> In article <35ed82e2...@news.io.com>,


> rit...@io.com (Terry Ritter) wrote:
> >
> > On Tue, 01 Sep 1998 03:35:40 GMT, in <35EB6B83...@null.net>, in
> > sci.crypt "Douglas A. Gwyn" <DAG...@null.net> wrote:
>

> > The main thing that disturbs me about number-theoretic crypto is that
> > so many great mathematicians have individually had the intuition that

Jason Stratos Papadopoulos

unread,
Sep 4, 1998, 3:00:00 AM9/4/98
to
bo...@rsa.com wrote:
: In article <35ed82e2...@news.io.com>,
: rit...@io.com (Terry Ritter) wrote:
: >
: > On Tue, 01 Sep 1998 03:35:40 GMT, in <35EB6B83...@null.net>, in
: > sci.crypt "Douglas A. Gwyn" <DAG...@null.net> wrote:

: > The main thing that disturbs me about number-theoretic crypto is that
: > so many great mathematicians have individually had the intuition that
: > there *ought* to be a way to factor quickly.

: A rather bold assertion. Please name the mathematicians or cite references.
: Exact quotes might be useful.

: I guarantee that I know most of the mathematicians working in this
: area. I don't know any who have said what you claim.


Hans Riesel seems to think so, and devotes several pages of "Primality
Testing and Computer Methods of Factorization" (?) to some possible ideas.

jasonp

bo...@rsa.com

unread,
Sep 4, 1998, 3:00:00 AM9/4/98
to
In article <6solp5$n6f$1...@hecate.umd.edu>,

Hans is a competent mathematician and number theorist. But he is not among
those trying to invent new algorithms. (at least in public announcements)

The idea he presents - that one might be able to insert exponentially
many primes into a product mod p while doing only polynomially many
pseudo-random evaluations mod N - is unconvincing.

I have discussed those ideas with a number of people.

The really great people working in this area (both Lenstras, Coppersmith,
Pomerance, Adleman, Pollard, H. Cohen, some of the arithmetic geometers
etc.) generally say that we don't know enough to hazard a guess one way
or the other.

Bruce Schneier

unread,
Sep 5, 1998, 3:00:00 AM9/5/98
to

> I am quite aware that for now my arguments are rather vague - I
> had better sit down and do some serious analysis on FROG myself
> (Bruce Schneier and his associates have already shown the way). I
> have little experience in this - if the reader is interested in
> collaborating with me, please let me know by email.

Remember that the cryptanalysis of Frog only took a few hours. It
does not make sense to generalize and say something like "Frog only
took a few days to create, and the cryptanalysis wasn't that
impressive." It is more reasonable to say that "Fron only took a few
days to create, and a few hours to analyze." Certainly more time
spent on analysis will find more severe weaknesses, just as more time
at design will produce a better design.

It's not hard to sit down for a few hours and design a cipher that
can't be broken in a few days. That's not terribly interesting or
useful. It's more interesting to design a cipher that won't break
even after tens of thousands of hours of analysis.

W T Shaw

unread,
Sep 5, 1998, 3:00:00 AM9/5/98
to
In article <35f13e5...@news.visi.com>, schn...@counterpane.com

(Bruce Schneier) wrote:
>
> It's not hard to sit down for a few hours and design a cipher that
> can't be broken in a few days. That's not terribly interesting or
> useful. It's more interesting to design a cipher that won't break
> even after tens of thousands of hours of analysis.
>
In essence, time may have nothing to do with results, as many work hard
and never actually accomplish much, but preliminary evidence of productive
thinking in a short period of time should be taken as a promise of
possible things to come, given suitable effort.

Nice, neat, subtle systems are easily dismissed without attracting
sufficient analysis to see their true merits either as an entity or as a
means of demonstrating useful techniques. It seems easier to sell
something complicated that no one knows how to *break* yet, than something
that many can imagine breaking without actually doing so.

The trick is to get those tens of thousands of hours of analysis to
satisfy others to your own belief that you have something, or at least to
find the good stuff within it so that Savard can use it somehow. ;)
--
----
Boy is it hot! The State Bird of Texas should be the Broiler.

dian...@tecapro.com

unread,
Sep 7, 1998, 3:00:00 AM9/7/98
to
In article <35f13e5...@news.visi.com>,
schn...@counterpane.com (Bruce Schneier) wrote:
[...]

>
> Remember that the cryptanalysis of Frog only took a few hours. It
> does not make sense to generalize and say something like "Frog only
> took a few days to create, and the cryptanalysis wasn't that
> impressive." It is more reasonable to say that "Fron only took a few
> days to create, and a few hours to analyze." Certainly more time
> spent on analysis will find more severe weaknesses, just as more time
> at design will produce a better design.
>
> It's not hard to sit down for a few hours and design a cipher that
> can't be broken in a few days. That's not terribly interesting or
> useful. It's more interesting to design a cipher that won't break
> even after tens of thousands of hours of analysis.

Time is relative and depends on the current state of the art. After
all, it took years to find out that DES is strong against
differential cryptanalysis, and years more to find out that it is
strong against linear cryptanalysis too.

What I was referring to is the *apparent* strength of 8-round FROG
when compared to 8-round DES:

                                          8-round DES     FROG

Chosen plaintexts (differential attack)          2^14     2^58

Known plaintexts (linear attack)                 2^21     2^56

Probability of success                              1     2^(-32)

Also observe that FROG is faster than 8-round DES when doing
one encryption at a time (let us not discuss FROG's gargantuan
key-setup). I understand this comparison is very rough but still
I think it is significant, because FROG is conceptually much
simpler than DES and because FROG was not specifically designed
to resist differential or linear attacks (and even so, attacks were
found only against a small set of the keys). The more I think
about it, the clearer it appears to me that a cipher that resists
known attacks without having been designed to do so is more likely
to resist unknown attacks too.

FROG is one instance of a novel design principle. I shall try to
investigate how the attacks in your paper work against the more general
versions of FROG which were announced before your paper was
published. If it turns out that by more consistently following its
design principles FROG becomes stronger against known attacks with
the same number of rounds and at the same speed then, I believe,
the soundness of FROG's design principles becomes very convincing
indeed and much confidence is gained in the generalized FROG's
ability to resist unknown attacks.

NIST, reasonably enough, does not allow changes in the current
phase of the competition. To make it to the second round of the
AES competition it appears I have only two chances:

a) Find some error in your paper and show that the current version
of FROG does not have any certificational weaknesses.

b) Show the strength of FROG's design principles using its more
general version.

Now, point a) is unlikely; point b) I believe is likely.

I shall have to quickly analyze these points if I want to have any
possibility to make it to the second round. NIST clearly wants the
AES to be a world-wide standard. They know that it will take years
for the AES competition to be concluded and the implementation
phase to start. If sufficient evidence can be presented that FROG
does indeed represent an instance of a powerful and sound design
principle they may just decide to let it slip to the second round
and gain some time for further analysis, rather than even remotely
risk that a cipher not anymore considered for the AES gains a lot of
credibility and makes the acceptance of the AES questionable.

Changing the subject, Bruce could you please post some information about
"Biham and Shamir’s new impossible cryptanalysis" you mentioned the
other day? I am sure a lot of us are very curious about that.

dian...@tecapro.com

unread,
Sep 8, 1998, 3:00:00 AM9/8/98
to
In article <35ed8261...@news.io.com>,

rit...@io.com (Terry Ritter) wrote:
>
> On 2 Sep 1998 03:39:44 GMT, in <6siem0$gd4$1...@news.sas.ab.ca>, in
> sci.crypt jsa...@freenet.edmonton.ab.ca () wrote:
> >Of course, one of the reasons that FROG is likely to be secure against an
> >unknown attack may be because a cipher like it is very hard to analyze.
>
> In contrast, I have always thought that a cipher should be designed to
> be *as* *easily* *analyzed* *as* *possible*.
>
> The analysis we can support and share in the open literature is likely
> to be far *less* than can be done by those with greater resources who
> work in secret. So a design which makes analysis difficult tends lock
> *us* out of deep understanding, yet may have little effect on our
> Opponents. This means the attackers may well understand the defense
> better than the defenders, which is not a good situation.

Many would say that at least there is no reason to design
unnecessarily complicated ciphers just in order to make analysis
complicated too. Let us play devil's advocate and investigate
this: Security is about the cost of breaking a cipher. This cost
has two components: the cost of cryptanalysis that discovers a
feasible attack and the cost of carrying out this attack. Now if a
cipher is tremendously complicated to analyze then the cost of the
first component can become unacceptably high and therefore this
cipher will provide good security. You see, even a very powerful
cryptanalyst, who already has at his disposal a menu of all good attacks
and needs only to probe a new cipher with each one to detect a
weakness, must presumably invest some effort in this probing that
varies from cipher to cipher depending on its complexity. (Let us
assume or hope that there do not exist automated cryptographers
yet :) ).

Probably, such a complicated cipher will not be fast and could not
compete for the AES. Actually several years ago I wrote such a
cipher, "GodSave", which is designed to overwhelm an attacker's
analytic ability. GodSave uses Luby-Rackoff in a remarkably
complicated manner including large sections of machine generated
"unreadable" code; no wonder it is about 100 times slower than the
fastest AES candidates.

FROG is a different kind of beast. When John says that "one of the
reasons that FROG is likely to be secure against an unknown attack
may be because a cipher like it is very hard to analyze", I think
he does not imply that FROG is too complicated but rather that it
is too simple. When a cryptographer probes a cipher to see how
well it withstands attack X, he is looking for a specific
feature in the cipher. If the cipher is almost featureless, then
there is a good probability that the desired feature is not
present. (Here it is immaterial whether the cipher's designer knew
something about attack X or not.) To use an analogy: if you want
to defend a target, one possibility is to armor it; another is to
make the target very small.

> This is a classic problem of defense: The attackers may see weaknesses
> which the defenders do not realize, and then exploit them. But in
> crypto the effects are greatly worsened because failure of the defense
> can occur without external signs. The defenders will not respond
> because they will not know they are being repeatedly defeated.
>
> This is why I think it is important to have a cipher which we can
> understand deeply. I suggest that any such cipher *must* scale down
> to tiny size so that we at least have a chance of finding abstract
> signs of weakness experimentally.

I agree. To scale a cipher down to insecure sizes presents another
advantage: one can measure the "dynamics" of its security, i.e. how fast
its security against a particular attack grows. For example, DES at 8
rounds is easier to break with a differential attack than with a linear
attack, but its strength against differential attacks grows faster, and
therefore full 16-round DES is weaker against a linear attack.
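
A minimal sketch of the kind of measurement a scaled-down design permits -
exhaustively tabulating the difference distribution of a toy 4-bit S-box
(the S-box here is arbitrary, chosen only for illustration):

    # Exhaustive difference-distribution table (DDT) of a 4-bit S-box:
    # the sort of experiment that is only feasible at tiny scale.
    SBOX = [0xA, 0x5, 0xF, 0x0, 0xC, 0x3, 0x9, 0x6,
            0xE, 0xB, 0x1, 0x4, 0x8, 0xD, 0x7, 0x2]

    def ddt(sbox):
        n = len(sbox)
        table = [[0] * n for _ in range(n)]
        for dx in range(n):
            for x in range(n):
                table[dx][sbox[x] ^ sbox[x ^ dx]] += 1
        return table

    # The largest entry with dx != 0 bounds the best one-round
    # differential; the best 4-bit S-boxes achieve 4 out of 16.
    worst = max(max(row) for row in ddt(SBOX)[1:])
    print("worst differential count:", worst, "out of 16")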

> >It's specifically in order to have something they can understand well
> >enough to analyze, and say something meaningful about, that many designs
> >are oriented towards a specific set of known attacks.
>
> Many new cipher designs do seem almost ad-hoc, or specifically built
> to defeat the old attacks. But in any fixed design the designer is
> obviously unable to respond to *future* attacks, so it is not clear
> that this design approach is sufficient. This ideal would be some
> sort of "strong simplicity" which inherently handles the old attacks,
> thus giving us some confidence that it may handle new attacks as well.

I wholeheartedly agree.

> >This is a sensible
> >academic approach: for the rough-and-tumble "real world", I would
> >recommend mixing that approach with the use of the most powerful
> >constructs possible to stymie any apparent hope of cryptanalysis - there
> >are ways of ensuring that a hard to analyze addition to a cipher is very
> >unlikely to weaken it (independent, or nearly-independent keys).
>

> Perhaps ideally we would like to see multiple distinct ciphering
> constructs inside the cipher. These would function sequentially as a
> sort of internal multiple ciphering in "layers," as opposed to simply
> increasing the number of "rounds" of a single structure.

This is what MARS does.

Bruce Schneier

unread,
Sep 8, 1998, 3:00:00 AM9/8/98
to
On Tue, 08 Sep 1998 09:34:32 GMT, dian...@tecapro.com wrote:
> Many would say that at least there is no reason to design
> unnecessarily complicated ciphers just in order to make analysis
> complicated too.

I certainly agree with this.

> Let us play devil's advocate and investigate
> this: Security is about the cost of breaking a cipher. This cost
> has two components: the cost of cryptanalysis that discovers a
> feasible attack and the cost of carrying out this attack.

The rules are a little bit different in academic cryptanalysis. The
cost to attack a cipher is the complexity (space or time) of an
attack. The "cost" of discovering an attack doesn't really count.
The security of a cipher is weakened once the attack is discovered;
the cost of discovery no longer enters the equation.

Our goal is to make the easiest attack costly, even after it is
discovered. Let's face it, everything is secure before you know how
to attack it.

> Now if a
> cipher is tremendously complicated to analyze then the cost of the
> first component can become unacceptably high and therefore this
> cipher will provide good security.

No. Not really. If a cipher is incredibly complicated then there
may or may not be a good attack, but it will take a lot of
analysis to find it. Note that this does not have to be sequential
analysis; you can assume that lots of people will be looking at AES in
parallel. Also note that other people can do your work for you; if
you are the NSA and want to break AES, you are free to use whatever
techniques Biham and Shamir discover.

> You see, even a very powerful
> cryptanalyst who already disposes of a menu with all good attacks
> and needs only to probe a new cipher with each one to detect a
> weakness, must presumably invest some effort in this probing that
> varies from cipher to cipher depending on its complexity. (Let us
> assume or hope that there do not exist automated cryptographers
> yet :) ).

No. I once thought this, too. Attacking a cipher is NOT anything
like taking a menu of standard techniques and, in a standard way,
applying those techniques. Study the different differential attacks
against ciphers. Only the first few were taking existing techniques
and applying them in a straightforward manner. The real work in an
attack, at least an attack against a well-designed cipher, is
modifying the attack technique so that it works. Knudsen's papers are
an excellent example of this; he is a master at making an attack work
where others have failed. Differentials work where characteristics
don't. Truncated differentials work where normal differentials don't.
Even this year's exciting find, impossible differentials, are simply
another way of looking at a differential attack. A cryptanalyst with
a "menu" would have never found any of those attacks, and would have
broken far fewer ciphers.

Do not confuse complexity with security. It is a grave mistake.

> Probably, such a complicated cipher will not be fast and could not
> compete for the AES. Actually several years ago I wrote such a
> cipher, "GodSave", which is designed to overwhelm an attacker's
> analytic ability. GodSave uses Luby-Rackoff in a remarkably
> complicated manner including large sections of machine generated
> "unreadable" code; no wonder it is about 100 times slower than the
> fastest AES candidates.

Indeed. The problem with most ad hoc ciphers is that they are very
slow. Most of the AES candidates encrypt in less than 500 clock
cycles per block and set up a key in less than 5000. (Serpent is an
exception, but they went for a more conservative design.) Given
enough clock cycles, it's easy to make something secure.
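
As a sketch of how such numbers are measured (assuming an x86 CPU
with the RDTSC instruction and GCC-style inline assembly;
toy_encrypt() below is a placeholder invented for the sketch, not
any candidate's code):

    #include <stdio.h>
    #include <stdint.h>

    static uint64_t rdtsc(void) {
        uint32_t lo, hi;
        __asm__ __volatile__("rdtsc" : "=a"(lo), "=d"(hi));
        return ((uint64_t)hi << 32) | lo;
    }

    static void toy_encrypt(uint8_t b[16]) {   /* stand-in, not a cipher */
        for (int i = 0; i < 16; i++)
            b[i] ^= 0x5a;
    }

    int main(void) {
        uint8_t block[16] = {0};
        const long n = 1000000;
        uint64_t t0 = rdtsc();
        for (long i = 0; i < n; i++)
            toy_encrypt(block);
        printf("~%.1f clocks per block\n", (double)(rdtsc() - t0) / n);
        return 0;
    }

Substituting a real block routine for toy_encrypt() gives the kind
of per-block figures quoted above.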

> FROG is a different kind of beast. When John says that "one of the
> reasons that FROG is likely to be secure against an unknown attack
> may be because a cipher like it is very hard to analyze", I think
> he does not imply that FROG is too complicated but rather that it
> is too simple. When a cryptographer probes a cipher to see how
> well it withstands attack X then he is looking for a specific
> feature in the cipher. If the cipher is almost featureless then
> there is a good probability that the desired feature is not
> present. (Here it is immaterial whether the cipher's designer knew
> something about attack X or not.) To use an analogy: if you want
> to defend a target one possibility is to armor it, another is to
> make the target very small.

To use another analogy, a chain is as secure as the weakest link.

Bruce Schneier

Sep 8, 1998, 3:00:00 AM
On Mon, 07 Sep 1998 15:07:31 GMT, dian...@tecapro.com wrote:
>In article <35f13e5...@news.visi.com>,
> schn...@counterpane.com (Bruce Schneier) wrote:
> Time is relative and depends on the current state of art. After
> all it took years to find out that DES is strong against
> differential cryptanalysis and years more to find out that it is
> strong against linear cryptanalysis too.
>
> What I was referring to is the *apparent* strength of 8-round FROG
> when compared to 8-round DES:
>
>                                           8-round DES   FROG
>
> Chosen plaintexts (differential attack)      2^14        2^58
>
> Known plaintexts (linear attack)             2^21        2^56
>
> Probability of success                         1         2^(-32)

Yes, you can argue that, after looking at FROG for a few hours, the
apparent strength is stronger than that of DES. But imagine this
analogy. You go to your doctor, who has just conducted a half-day
study that appears to show that injecting you with a ground up taco
will work better than a shot of penicillin. What would you do?

(A tip of the hat to Matt Blaze for the analogy, albeit in another
context.)

> Also observe that FROG is faster than 8-round DES when doing
> one encryption at a time (let us not discuss FROG's gargantuan
> key-setup). I understand this comparison is very rough but still
> I think it is significant because FROG is conceptually much
> simpler than DES

I consider DES to be conceptually much simpler than FROG. You are not
a good judge, being a designer of both. I am probably not a great
judge either, since DES has been in my brain for so many years.

> and because FROG was not specifically designed
> to resist differential or linear attacks (and even so, attacks were
> found only against a small set of the keys). The more I think
> about it the clearer it appears to me that a cipher that resists
> known attacks, though not by design, is more likely to resist
> unknown attacks too.

There is some truth to this.

> FROG is one instance of a novel design principle. I shall try to
> investigate how the attacks in your paper work against more general
> versions of FROG which were announced before your paper was
> published. If it turns out that by more consistently following its
> design principles FROG becomes stronger against known attacks with
> the same number of rounds and at the same speed then, I believe,
> the soundness of FROG's design principles becomes very convincing
> indeed and much confidence is gained in the generalized FROG's
> ability to resist unknown attacks.

Good. Keep working.

> NIST, reasonably enough, does not allow changes in the current
> phase of the competition. To make it to the second round of the
> AES competition it appears I have only two chances:
>
> a) Find some error in your paper and show that the current version
> of FROG does not have any certification weaknesses.

Actually, there are some errors in our paper. We will try to deal
with them when we get the chance.

> b) Show the strength of FROG's design principles using its more
> general version.
>
> Now, point a) is unlikely; point b) I believe is likely.

I don't think that NIST will accept anything like (b), but that
doesn't mean you shouldn't try to improve FROG. Remember to fix the
asymmetry between encryption and decryption; your algorithm is much
easier to attack in the decryption direction. Look at CAST-256 or
Skipjack.

> I shall have to quickly analyze these points if I want to have any
> possibility to make it to the second round. NIST clearly wants the
> AES to be a world-wide standard. They know that it will take years
> for the AES competition to be concluded and the implementation
> phase to start. If sufficient evidence can be presented that FROG
> does indeed represent an instance of a powerful and sound design
> principle they may just decide to let it slip to the second round
> and gain some time for further analysis, rather than even remotely
> risk that a cipher no longer considered for the AES gains a lot of
> credibility and makes the acceptance of the AES questionable.

Good.

>Changing the subject, Bruce could you please post some information about
>"Biham and Shamir’s new impossible cryptanalysis" you mentioned the
>other day? I am sure a lot of us are very curious about that.

Check Biham's website. It's his analysis. Basically, he looks at
truncated differentials with low or zero probability, and attacks
ciphers that way. The technique seems to work well against Skipjack,
and less well against CAST-256.

David Wagner

Sep 8, 1998, 3:00:00 AM
In article <6t0srj$1b1$1...@nnrp1.dejanews.com>, <dian...@tecapro.com> wrote:
> What I was referring to is the *apparent* strength of 8-round FROG
> when compared to 8-round DES:

Well, I'm not sure that this is a good comparison point.
Each FROG round modifies the entire block; each DES round affects
only half of the block.

Personally, I think it might make more sense to use the notion of
a "cycle" (first introduced by Schneier & Kelsey in FSE'96). A
cycle is the number of rounds needed before every bit in the block
is potentially changed. Thus, in DES, two rounds make a cycle;
but in FROG, one round makes a cycle.

So both the full 16-round DES and the full 8-round FROG contain
exactly 8 cycles. This suggests (to me) that it might make more
sense to compare 8-round FROG to 16-round DES.
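
As a minimal sketch of that count, with rounds-per-cycle as defined
above (two Feistel rounds per cycle for DES, one round per cycle for
FROG):

    #include <stdio.h>

    struct cipher { const char *name; int rounds; int rounds_per_cycle; };

    int main(void) {
        struct cipher c[] = {
            { "DES",  16, 2 },
            { "FROG",  8, 1 },
        };
        for (int i = 0; i < 2; i++)   /* both lines print 8 cycles */
            printf("%-4s: %2d rounds = %d cycles\n", c[i].name,
                   c[i].rounds, c[i].rounds / c[i].rounds_per_cycle);
        return 0;
    }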

With this comparison point, FROG still compares favorably to DES
against known attacks: with FROG, one expects to recover the first
key after 2^{56.7} chosen ciphertexts (*if* there are sufficiently
many cryptanalytic targets), whereas with DES, the best known
attack recovers the key after 2^{43} known plaintexts.

Alternatively, instead of making a comparison based on cycles,
you might make a comparison based on performance. However, the
numbers I have access to suggest that single-DES is somewhat faster
than FROG, so a performance-based comparison won't make FROG look
any better than a cycle-based comparison.

Most importantly, you should keep in mind that DES has seen much
more analysis than FROG has, so the complexity of known attacks
is not a very good foundation to make a comparison on. It seems
entirely plausible that if someone spent the time to study FROG
in more depth, they might be able to significantly improve our
attacks. (Or maybe not; who knows.)

> Also observe that FROG is faster than 8-round DES when doing
> one encryption at a time (let us not discuss FROG's gargantuan
> key-setup).

Odd. The performance guru I spoke to said that full 16-round
DES runs in 43 clocks/byte in handtuned assembly on a Pentium,
whereas FROG requires at least 48 clocks/byte as a theoretical
minimum (and perhaps closer to 70 clocks/byte due to instruction
pairing problems).

How does this compare to your performance numbers? Have you
done an implementation?

dian...@tecapro.com

Sep 8, 1998, 3:00:00 AM
In article <6t3l3n$5bg$1...@joseph.cs.berkeley.edu>,
d...@joseph.cs.berkeley.edu (David Wagner) wrote:
> In article <6t0srj$1b1$1...@nnrp1.dejanews.com>, <dian...@tecapro.com> wrote:
> > What I was referring to is the *apparent* strength of 8-round FROG
> > when compared to 8-round DES:
>
> Well, I'm not sure that this is a good comparison point.
> Each FROG round modifies the entire block; each DES round affects
> only half of the block.

Yes, but each DES round does process (read or write) the whole
block too.

> Personally, I think it might make more sense to use the notion of
> a "cycle" (first introduced by Schneier & Kelsey in FSE'96). A
> cycle is the number of rounds needed before every bit in the block
> is potentially changed. Thus, in DES, two rounds make a cycle;
> but in FROG, one round makes a cycle.
>
> So both the full 16-round DES and the full 8-round FROG contain
> exactly 8 cycles. This suggests (to me) that it might make more
> sense to compare 8-round FROG to 16-round DES.

I am not familiar with this notion of "cycle" - but in the context
of our discussion I don't quite agree. It seems to me that a
fairer measurement would be to count how many bits are read and
processed, or processed and written per block-bit per round.
Example: FROG requires just seven 8-bit machine instructions per
byte per round, i.e. 7 "bit-operations" per bit per round. By my
estimate DES requires at least 11 "bit-operations" per bit per
round (here I assume that full DES requires 43 32-bit instructions
to encrypt one byte which is probably an understatement).
Therefore I think it is reasonable to compare one round of FROG
to one round of DES because their computational workload per
byte is similar.
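
As a back-of-the-envelope check of that DES estimate (a sketch using
only the figures above, not a benchmark):

    #include <stdio.h>

    int main(void) {
        double instr_per_byte = 43.0;  /* 43 32-bit instructions/byte */
        double bits_per_instr = 32.0;
        double block_bits     = 64.0;  /* DES block size */
        double rounds         = 16.0;  /* full DES */

        double instr_per_block = instr_per_byte * 8.0;   /* 344 */
        double bitops = instr_per_block * bits_per_instr
                        / block_bits / rounds;           /* 10.75 */
        printf("DES: %.2f bit-operations per bit per round\n", bitops);
        return 0;
    }

which indeed comes out just under the 11 claimed.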

It is not very relevant here but interesting to note that you can
write an assembly routine of just 22 machine instructions that does
full FROG encryption or decryption. This is a very simple cipher.

> With this comparison point, FROG still compares favorably to DES
> against known attacks: with FROG, one expects to recover the first
> key after 2^{56.7} chosen ciphertexts (*if* there are sufficiently
> many cryptanalytic targets), whereas with DES, the best known
> attack recovers the key after 2^{43} known plaintexts.
>
> Alternatively, instead of making a comparison based on cycles,
> you might make a comparison based on performance. However, the
> numbers I have access to suggest that single-DES is somewhat faster
> than FROG, so a performance-based comparison won't make FROG look
> any better than a cycle-based comparison.

Not quite - see below.

> Most importantly, you should keep in mind that DES has seen much
> more analysis than FROG has, so the complexity of known attacks
> is not a very good foundation to make a comparison on. It seems
> entirely plausible that if someone spent the time to study FROG
> in more depth, they might be able to significantly improve our
> attacks. (Or maybe not; who knows.)

I agree. That is why I stressed *apparent* strength.

> > Also observe that FROG is faster than 8-round DES when doing
> > one encryption at a time (let us not discuss FROG's gargantuan
> > key-setup).
>
> Odd. The performance guru I spoke to said that full 16-round
> DES runs in 43 clocks/byte in handtuned assembly on a Pentium,
> whereas FROG requires at least 48 clocks/byte as a theoretical
> minimum (and perhaps closer to 70 clocks/byte due to instruction
> pairing problems).
>
> How does this compare to your performance numbers? Have you
> done an implementation?

The 43 Pentium clocks/byte you mention for DES I believe refers to
bit-slicing implementations where 32 encryptions run in parallel.
I have an assembler version of FROG that uses less than 90
clocks/byte, but this is 8086 assembler, not Pentium assembler.
FROG is amenable to "byte-slicing" where up to 8 byte-wide
FROG encryptions run quasi-parallel using both Pentium integer
pipelines. I haven't coded this yet but my guess is I can push
FROG down to about 20 clocks/byte. (This speed analysis does
of course overlook FROG's complex and slow key setup.)

Comparing ciphers seems to be an art form in its own right, so it
is very reasonable that NIST has opened a discussion forum on their
server just for this matter.

W T Shaw

Sep 8, 1998, 3:00:00 AM
In article <6t2tn8$ooe$1...@nnrp1.dejanews.com>, dian...@tecapro.com wrote:

> In article <35ed8261...@news.io.com>,
> rit...@io.com (Terry Ritter) wrote:

>
> FROG is a different kind of beast. When John says that "one of the
> reasons that FROG is likely to be secure against an unknown attack
> may be because a cipher like it is very hard to analyze", I think
> he does not imply that FROG is too complicated but rather that it
> is too simple. When a cryptographer probes a cipher to see how
> well it withstands attack X then he is looking for a specific
> feature in the cipher. If the cipher is almost featureless then
> there is a good probability that the desired feature is not
> present. (Here it is immaterial whether the cipher's designer knew
> something about attack X or not.) To use an analogy: if you want
> to defend a target one possibility is to armor it, another is to
> make the target very small.

These are good thoughts you have. Something need not be complicated to be
effective, and elaboration builds dark corners that might be hard to
explore.

Bruce Schneier

Sep 9, 1998, 3:00:00 AM
On Tue, 08 Sep 1998 23:12:26 GMT, dian...@tecapro.com wrote:

>In article <6t3l3n$5bg$1...@joseph.cs.berkeley.edu>,
> d...@joseph.cs.berkeley.edu (David Wagner) wrote:
>> In article <6t0srj$1b1$1...@nnrp1.dejanews.com>, <dian...@tecapro.com> wrote:
>> > What I was referring to is the *apparent* strength of 8-round FROG
>> > when compared to 8-round DES:
>>
>> Well, I'm not sure that this is a good comparison point.
>> Each FROG round modifies the entire block; each DES round affects
>> only half of the block.
>
> Yes, but each DES round does process (read or write) the whole
> block too.

You're missing the point. To compare ciphers of different
structures--Feistel networks, SP networks, incomplete constructions
like Skipjack, etc--it makes no sense to talk about "rounds." A round
is simply what the designer decided to call a round, not anything
special. (Note that Rivest went against all tradition by calling what
everyone else would call two rounds a single round in RC5. He
corrected this in RC6.)

A good atomic structure for comparison is encrypting each bit of the
text block once. That is two Feistel rounds, one IDEA or FROG or
SAFER+ round, or four Skipjack or CAST-256 rounds. If you think
about it, only half the DES block is processed (i.e., encrypted) in
each round. This metric isn't about performance; it's about
encryption.

>> Personally, I think it might make more sense to use the notion of
>> a "cycle" (first introduced by Schneier & Kelsey in FSE'96). A
>> cycle is the number of rounds needed before every bit in the block
>> is potentially changed. Thus, in DES, two rounds make a cycle;
>> but in FROG, one round makes a cycle.
>>
>> So both the full 16-round DES and the full 8-round FROG contain
>> exactly 8 cycles. This suggests (to me) that it might make more
>> sense to compare 8-round FROG to 16-round DES.
>
> I am not familiar with this notion of "cycle" - but in the context
> of our discussion I don't quite agree. It seems to me that a
> fairer measurement would be to count how many bits are read and
> processed, or processed and written per block-bit per round.

We're not doing optimization studies, so how many bits are "read" is
not really relevant. You can look at cycles of input (the above was
cycles of output). In each round of DES, half the bits are input into
the F function. In each round of Skipjack, one quarter of the bits
are input into the F function. In each round of FROG, all the bits
are input into the (for lack of a better term) F function.

> Example: FROG requires just seven 8-bit machine instructions per
> byte per round, i.e. 7 "bit-operations" per bit per round. By my
> estimate DES requires at least 11 "bit-operations" per bit per
> round (here I assume that full DES requires 43 32-bit instructions
> to encrypt one byte which is probably an understatement).
> Therefore I think it is reasonable to compare one round of FROG
> to one round of DES because their computational workload per
> byte is similar.

You can compare computational workload, i.e., clock cycles. This
makes sense. I don't think your numbers are accurate, but that's
fine. DES is hard to optimize. And remember, workload is very
dependent on platform. What is comparable on smart cards may not be
comparable on 32-bit machines. And what is comparable in hardware may
not be comparable in software.

> It is not very relevant here but interesting to note that you can
> write an assembly routine of just 22 machine instructions that does
> full FROG encryption or decryption. This is a very simple cipher.

Indeed. It is interesting.

>> > Also observe that FROG is faster than 8-round DES when doing
>> > one encryption at a time (let us not discuss FROG's gargantuan
>> > key-setup).
>>
>> Odd. The performance guru I spoke to said that full 16-round
>> DES runs in 43 clocks/byte in handtuned assembly on a Pentium,
>> whereas FROG requires at least 48 clocks/byte as a theoretical
>> minimum (and perhaps closer to 70 clocks/byte due to instruction
>> pairing problems).
>>
>> How does this compare to your performance numbers? Have you
>> done an implementation?
>
> The 43 Pentium clocks/byte you mention for DES I believe refers to
> bit-slicing implementations where 32 encryptions run in parallel.

The 43 Pentium clocks/byte is NOT a bit-sliced implementation. It is
a conventional implementation. Bit slicing is faster, but I consider
that "cheating" in benchmark comparisons, and don't use those numbers.

> Comparing ciphers seems to be an art form in its own right, so it
> is very reasonable that NIST has opened a discussion forum on their
> server just for this matter.

Indeed it is. I learned a lot about it during the Twofish design.

Bruce Schneier

Sep 9, 1998, 3:00:00 AM
On Tue, 08 Sep 1998 09:34:32 GMT, dian...@tecapro.com wrote:

> FROG is a different kind of beast. When John says that "one of the
> reasons that FROG is likely to be secure against an unknown attack
> may be because a cipher like it is very hard to analyze", I think
> he does not imply that FROG is too complicated but rather that it
> is too simple. When a cryptographer probes a cipher to see how
> well it withstands attack X then he is looking for a specific
> feature in the cipher. If the cipher is almost featureless then
> there is a good probability that the desired feature is not
> present. (Here it is immaterial whether the cipher's designer knew
> something about attack X or not.) To use an analogy: if you want
> to defend a target one possibility is to armor it, another is to
> make the target very small.

This is not true in practice. When a cryptanalyst looks to see if a
cipher is vulnerable to attack X, he is not always looking for a
specific feature in the cipher. To take an example, what is the
feature in Skipjack that makes it vulnerable to impossible
cryptanalysis? There is nothing that you could take out of that
cipher that makes it stronger. It's the same thing with FEAL and
differential cryptanalysis, or KN-cipher and any of the higher-order
attacks.

When an attacker looks to see if a cipher is vulnerable to attack X,
he looks to see if he can adapt the attack to the particulars of the
cipher. He looks to see if the cipher has whatever
artifact--differentials, impossible differentials, linear relations,
byte symmetry, parity relations, whatever--that he needs to make his
attack work.

By your logic, weak ciphers could be made stronger by removing parts
of them. This is not true most of the time.

Certainly adding complications for no reason is not a good idea, and
can introduce security vulnerabilities. But deleting complications
for no reason is just as bad an idea, maybe a worse one.
