Why is IDEA only 128 bits


Syed Mujtaba

Jul 15, 1997

I have been told that the IDEA is one of the most secure encryption
algorithms available today. So why is its key size limited to 128
bits? Wouldn't a 256 bit key be even more secure?


Chuck Willis

Jul 15, 1997

> I have been told that the IDEA is one of the most secure encryption
> algorithms available today. So why is its key size limited to 128
> bits? Wouldn't a 256 bit key be even more secure?

Yeah, but a 512 bit key would be even MORE secure. Or what about a 1024 bit
key? The point is that you have to stop somewhere and in the foreseeable
future a 128 bit key is as secure as a 2^128 bit key. There are other
algorithms like Blowfish and SPEED that allow longer keys, but I don't know
if they scale in the same manner. That is, I am not sure if a 128 bit
Blowfish key is as secure as a 128 bit IDEA key. I wouldn't imagine that it is
much different though.
Chuck

Chuck Willis - cfwi...@uiuc.edu - http://www.uiuc.edu/ph/www/cfwillis/
Finger me for my PGP Public key keyID = DBBBF455
Fingerprint = 4E 7A FE A7 D7 80 F1 F6 4D FF 98 1E 59 CC 77 A6


Will Ware

Jul 15, 1997

Chuck Willis (cfwi...@students.uiuc.edu) wrote:
: I am not sure if a 128 bit
: Blowfish key is as secure as a 128 bit IDEA key.

The most important question about that is, are there attacks on either
cipher that are more efficient than brute force key search? As far as I'm
aware, no such attacks are publicly known for either cipher.

If you're stuck doing a brute force key search, and the keyspaces are of
equal sizes, the next question is how long each key takes to verify.
Blowfish was specifically designed with a complicated set-up procedure
for each key, to make brute force attacks inconveniently slow.
--
-------------------------------------------------------------
Will Ware email: wware[at]world[dot]std[dot]com
PGP fingerprint 45A8 722C D149 10CC F0CF 48FB 93BF 7289

David A. Scott

Jul 15, 1997

In a previous article, cfwi...@students.uiuc.edu (Chuck Willis) says:

>> I have been told that the IDEA is one of the most secure encryption
>> algorithms available today. So why is its key size limited to 128
>> bits? Wouldn't a 256 bit key be even more secure?
>
>Yeah, but a 512 bit key would be even MORE secure. Or what about a 1024 bit
>key? The point is that you have to stop somewhere and in the foreseeable
>future a 128 bit key is as secure as a 2^128 bit key. There are other
>algorithms like Blowfish and SPEED that allow longer keys, but I don't know
>if they scale in the same manner. That is, I am not sure if a 128 bit
>Blowfish key is as secure as a 128 bit IDEA key. I wouldn't imagine that it is

In modern encryption there is more than one school of thought. One
is to use a size that is the bare minimum of what is projected to be
safe for a few years. Once DES at 56 bits was thought to be safe. Now the experts
say 128 bits is such a number. Another approach is to design an encryption
system so that the max key size is whatever is easy to do on a computer.
If you like that approach, try SCOTT16U.ZIP; it has over 900,000 bits of
real key space. But be careful: the old crypto people think it is weak,
but they are afraid to look at it. Yes, the original was weak if
the non-padded encryption was used with a chosen plaintext attack, but not the
U version. None of the contests have been solved, so good luck.

--
David A. Scott for serious file encryption and $ contest scott*.zip
contest files ftp ftp.ridgecrest.ca.us in pub/users/d/dscott/
code files in US http://www.sni.net/~mpj/crypto.htm
in UK http://www.users.zetnet.co.uk/hopwood/crypto/scott16/

Steve Peterson

Jul 15, 1997

On 15 Jul 1997 17:41:00 GMT, "Syed Mujtaba" <muj...@worldnet.att.net>
wrote:

>I have been told that the IDEA is one of the most secure encryption
>algorithms available today. So why is its key size limited to 128
>bits? Wouldn't a 256 bit key be even more secure?

Well, look at it this way. Assume that you could build a computer that
could test 1 trillion IDEA keys in a second (mind you, that's not 1
trillion instructions per second, since each IDEA encryption would take
thousands of instructions), and you could build a trillion of these
mythical computers (certainly more expensive than all the wealth in the
world could afford, since we're talking about 200 computers per person),
how long, on average, would it take to crack 1 IDEA encryption?

Oh, about 5 1/2 million years. This is with technology that we aren't even
*close* to having, nor could we afford to build.
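As a rough check of that figure, here is a small C sketch using the same assumptions (10^12 machines, each testing 10^12 keys per second, and an average search of half the 2^128 keyspace); the numbers are the ones above, the code is only an illustration.

#include <stdio.h>
#include <math.h>

int main(void)
{
    double keys = ldexp(1.0, 127);          /* average search: half of 2^128 keys */
    double rate = 1e12 * 1e12;              /* 10^12 machines x 10^12 keys/second */
    double seconds = keys / rate;
    double years = seconds / (365.25 * 24 * 3600);

    printf("about %.1f million years\n", years / 1e6);   /* prints roughly 5.4 */
    return 0;
}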

128 bit key length (assuming that the algorithm itself is secure, which
IDEA seems to be) is definitely long enough for the near future.

Would IDEA be stronger if it used a longer key? Maybe not. Some
algorithms might actually be less secure with a longer key, due to the
subtle changes that accommodating a longer key might impose on the algorithm.

This is not to say that an algorithm with a longer key is less secure, just
that changing the length of a key can introduce variables into the cipher.
The best bet is to choose a key length that is acceptable for your needs,
and then design the algorithm around that key length.

Steve Peterson

Markus Kuhn

Jul 15, 1997

Syed Mujtaba wrote:
> I have been told that the IDEA is one of the most secure encryption
> algorithms available today. So why is its key size limited to 128
> bits? Wouldn't a 256 bit key be even more secure?

IDEA is probably the most interesting algorithm available.
Sure, you can blow up the key size to 900,000 bits like someone
else in this group has tried, but normally you want to have the
shortest possible key size that provides you the security that
you need. From a good algorithm, you expect that years of intensive
effort have not found a chosen plaintext attack that is in any
way better than a full search of the entire key space. Only after
this requirement has been fulfilled does it make any sense to look
at the key length, since otherwise the key length is fairly
meaningless. There are very few algorithms so far that have
been analysed with reasonable scrutiny in the scientific literature.
DES is one of them, IDEA is on the way to becoming another one, and
there are not many more candidates today.

If a brute force algorithm is really the most efficient attack, then
any key length over 90 bits is absolutely secure from even the richest
governments. With 128 bits, you are far over this limit. If you do not
understand why, then you probably have not really understood what
exponential growth means.

On the other hand, if a brute force attack is not the most efficient
attack, then the key length is mostly irrelevant to the actual
security of an algorithm and not the aspect you should worry about.

Markus

--
Markus Kuhn, Computer Science grad student, Purdue
University, Indiana, US, email: ku...@cs.purdue.edu

Michael Sierchio

Jul 15, 1997

David A. Scott wrote:
>... Once DES at 56 bits was thought to be safe. Now the experts
> say 128 bits is such a number.

128-bit DES keys make a brute force key search harder, to
be sure. The DES variants with larger keyspaces do not
invariably appear to be stronger than DES in resisting
differential cryptanalytic attacks, however.

Key size is only one consideration...

Patrick Juola

Jul 16, 1997

In article <5qgrs6$c...@news.ysu.edu> an...@yfn.ysu.edu (David A. Scott) writes:
>
>In a previous article, cfwi...@students.uiuc.edu (Chuck Willis) says:
>
>>> I have been told that the IDEA is one of the most secure encryption
>>> algorithms available today. So why is its key size limited to 128
>>> bits? Wouldn't a 256 bit key be even more secure?
>>
>>Yeah, but a 512 bit key would be even MORE secure. Or what about a 1024 bit
>>key? The point is that you have to stop somewhere and in the forseeable
>>future a 128 bit key is as secure as a 2^128 bit key. There are other
>>algorithms like Blowfish and SPEED that allow longer keys, but I don't know
>>if they scale in the same manner. That is, I am not sure if a 128 bit
>>Blowfish key is as secure as a 128 bit IDEA key. I wouldn't imagine that it is
>In modern encryption there is more than one school of thought. One
>is to use a size that is the bare minimum of what is projected to be
>safe for a few years. Once DES at 56 bits was thought to be safe.

Nope. Even the experts who designed Lucifer (the proto-DES) wanted 64 bits.
They knew how hard it would be to brute-force 56 bits.


>Now the experts
>say 128 bits is such a number.

Well, do the math yourself. Assume you get a billion trial encryptions
per second....

>Another approach is to design an encryption
>system so that the max key size is whatever is easy to do on a computer.
>If you like that approach, try SCOTT16U.ZIP; it has over 900,000 bits of
>real key space. But be careful: the old crypto people think it is weak

Only because Paul Onions cracked the first version in a week.

-kitten


Marc Thibault

Jul 16, 1997

"Syed Mujtaba" <muj...@worldnet.att.net> writes:

> I have been told that the IDEA is one of the most secure encryption
> algorithms available today. So why is its key size limited to 128
> bits? Wouldn't a 256 bit key be even more secure?
>

Just as there comes a point where adding yet another lock to a
door is not worth the cost or the effort, there comes a point
where adding bits to a key falls into the same class.

256-bit keys are like eight locks on an apartment door. They add
to the work of using them without adding much practical security.

The thing to keep in mind is that each added bit doubles the work:
129 bits is twice as hard to brute-force as 128. By the time you get
to 256, you have gone from a "life of the universe" time to break, to
a "life of every conceivable universe all one after the other" time
to break. Both are significantly more than the useful life of any secret.
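To put rough numbers on that, a short C sketch; the 10^12 keys-per-second rate is an arbitrary assumption, chosen only to show the scale.

#include <stdio.h>
#include <math.h>

int main(void)
{
    double rate = 1e12;                     /* assumed keys tested per second */
    double year = 365.25 * 24 * 3600;
    double universe = 1.4e10;               /* rough age of the universe, in years */
    double t128 = ldexp(1.0, 128) / rate / year;
    double t256 = ldexp(1.0, 256) / rate / year;

    printf("128 bits: %.1e years (%.1e ages of the universe)\n", t128, t128 / universe);
    printf("256 bits: %.1e years (%.1e ages of the universe)\n", t256, t256 / universe);
    return 0;
}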

David A. Scott

Jul 16, 1997

In a previous article, pat...@gryphon.psych.ox.ac.uk (Patrick Juola) says:

>
>>Another approach is to design an encryption
>>system so that the max key size is whatever is easy to do on a computer.
>>If you like that approach, try SCOTT16U.ZIP; it has over 900,000 bits of
>>real key space. But be careful: the old crypto people think it is weak
>
>Only because Paul Onions cracked the first version in a week.

True, the first version was cracked because Paul Onions had found
a plaintext attack and I had never considered that before. However, it was
not a week; the contest was older than that, and people were saying the key was
too long. By the way, the contest has never been solved yet. When I release the
next contest for scott16u.zip I will allow plaintext attacks. I just have to find
a way to test it without answering hundreds of plaintext requests. As far
as I know, no one has ever offered a contest where plaintext attacks were allowed.
Anyone have bright ideas how to do it so I don't have to run my program
over and over?

Mark Atwood

Jul 16, 1997

Markus Kuhn <ku...@cs.purdue.edu> writes:
>
> If a brute force algorithm is really the most efficient attack, then
> any key length over 90 bits is absolutely secure from even the richest
> governments. With 128 bits, you are far over this limit. If you do not
> understand why, then you probably have not really understood what
> exponential growth means.
>

In fact, not even considering the time involved, doesn't it take more
energy than is contained in the sun to *just* step the most efficient
possible counter through 128 bits?

--
Mark Atwood | Thank you gentlemen, you are everything we have come to
z...@ampersand.com | expect from years of government training. -- MIB Zed


gene m. stover

Jul 16, 1997

David A. Scott (an...@yfn.ysu.edu) wrote:

: When I release the next contest
: for scott16u.zip I will allow plaintext attacks. I just have to find
: a way to test it without answering hundreds of plaintext requests. As far
: as I know, no one has ever offered a contest where plaintext attacks were allowed.
: Anyone have bright ideas how to do it so I don't have to run my program
: over and over?

David,

If you wanted to simulate a known (or guessed) plaintext
attack, you might make sure that all the words in your
plaintext message are from some standard dictionary,
preferably one that's online (like the spell-check
dictionary on Unix). Or maybe you could tell the contestants
that it's the source code from an example program in some
well known book such as the ARM (Ellis & Stroustrup) or K&R.
In the real world, guessed plaintext attacks (using current
issues, dates, or names of recipients & senders) are common.
Personally, I suspect that guessed plaintext attacks will
also find a use if cryptanalysis on computer files sent over
the Internet is ever common; computer files have well-known,
understandable syntaxes. For example, the header portions
of almost any type of document. My point is that a
known/guessed plaintext contest might be interesting, & it's
relevant & feasible.

For a chosen plaintext attack, you might have to provide an
encryption server that has a fixed key (which you keep
secret until the contest is solved) & which allows clients
to submit plaintext to be encrypted by the server using the
fixed, unknown key. The goal of the contest would be either
to recover the key or to recover the plaintext of a message
you have enciphered using that key. I'm not satisfied with
this method of offering a chosen plaintext contest; it
doesn't solve your requirement of not running your encryptor
many times. Maybe there are better ways.

gene
--
---
gene m. stover <ge...@CyberTiggyr.com>
Making the world safe from democracy.

Kaz Kylheku

Jul 16, 1997

In article <01bc9146$7a911d00$bb84410c@zain>,

Syed Mujtaba <muj...@worldnet.att.net> wrote:
>
>I have been told that the IDEA is one of the most secure encryption
>algorithms available today. So why is its key size limited to 128
>bits? Wouldn't a 256 bit key be even more secure?

It would be, but 128 bits is ``good enough'' for most purposes. If brute
force is the only way to crack an IDEA key, then you are looking at a
very long search.

Also, if keys are derived from pass phrases (which is not always the case; keys
are often randomly chosen), the longer the key, the longer your phrase has to
be to take advantage of the extra bits. Block ciphers with very long keys are
only practical in applications where the key is randomly generated rather than
hashed from a password phrase. To take advantage of 256 bits of entropy, an
English-text password phrase has to be over 100 letters long!
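As a rough illustration in C, assuming the usual estimates of about 1.3 to 2.3 bits of entropy per letter of English text (the exact figure is debatable):

#include <stdio.h>

int main(void)
{
    double bits = 256.0;                    /* key entropy we want to reach */
    double low = 1.3, high = 2.3;           /* assumed bits of entropy per letter */

    printf("letters needed at %.1f bits/letter: %.0f\n", high, bits / high);
    printf("letters needed at %.1f bits/letter: %.0f\n", low, bits / low);
    return 0;
}

Either estimate lands well over 100 letters.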

There are probably also considerations of hardware implementation; it's easy to
dream up ciphers that turn out to be costly to implement in hardware.

\begin{shamelessplug}
If you are interested in 256 bit key encryption, download
http://www.cafe.net/kaz/hunt/phantom-1.0.tar.gz
\end{shamelessplug}

Markus Kuhn

Jul 16, 1997

Mark Atwood wrote:
> In fact, not even considering the time involved, doesn't it take more
> energy than is contained in the sun to *just* step the most efficient
> possible counter through 128 bits?

This is normally part of the first homework in any cryptography lecture,
but here we go:

Let's assume we can build a circuit that can search through the
key space at 10 GHz = 10^10 keys/s.

Let us assume we can build 1 trillion (10^12) of those circuits
and use them in parallel (absolutely utopian). Just think about it:
we can do only <10^7 transistors per chip today, and you will
certainly need >10^4 transistors per circuit, i.e. 10^12 circuits
would mean >10^9 chips, much more than Intel will ever sell in this
century.

With this incredible machine, the construction of which would
make the Manhattan and Apollo projects look like trivial exercises,
we can test 10^22 keys per second, i.e. around 2^73 keys
per second.

Now let us say that we want to bring the probability that this
absolutely utopian machine finds the key within 100 years down to less
than one in 1000. This means that a full search has to take at least
100,000 years = 10^5 y = 3.2*10^12 s
< 2^42 s.

This requirement would be fulfilled by a key space of 2^(73+42) = 2^115
elements, i.e. with 115 key bits.

With 128 key bits, you are way beyond this requirement, and
even if you have only 112 bits, you are more than close enough.

With somewhat more realistic assumptions about the technology and
money available to your worst imaginable attacker (say the NSA),
keys longer than 90 bits should already be more than sufficiently
secure, too, but we usually do not care much about those four
additional bytes.
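A few lines of C to check the powers of two above (same assumptions: 10^10 keys per second per circuit, 10^12 circuits, and a full search that must take 100,000 years):

#include <stdio.h>
#include <math.h>

int main(void)
{
    double keys_per_sec = 1e10 * 1e12;              /* 10 GHz circuit x 10^12 circuits */
    double seconds = 1e5 * 365.25 * 24 * 3600;      /* 100,000 years of searching */
    double ln2 = log(2.0);

    printf("keys per second  ~ 2^%.1f\n", log(keys_per_sec) / ln2);                /* ~2^73  */
    printf("search time      ~ 2^%.1f seconds\n", log(seconds) / ln2);             /* ~2^42  */
    printf("key space needed ~ 2^%.1f keys\n", log(keys_per_sec * seconds) / ln2); /* ~2^115 */
    return 0;
}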

All this assumes of course again that a brute force search is really
the most efficient attack, which is the really interesting part of
the security evaluation of an algorithm.

The Messiah

Jul 16, 1997

David A. Scott wrote:
> True, the first version was cracked because Paul Onions had found a plaintext attack and I had never considered that before.

Doesn't matter how the algorithm is broken, if it's broken, it's broken.
If a lock breaks when hit really hard with a rock, it still breaks, and
the intruder has access to it all.

> However, it was not a week; the contest was older than that, and people were saying the key was too long.

It was 2 weeks, and they kept on telling you not to pimp your algorithm
until it had been proved. It takes about 5 years before an algorithm can
really be considered secure.

--
+--------------------------------+
| The Messiah |
+--------------------------------+
| Remove the spam blockers (x) |
+--------------------------------+
| Ou' sont les neiges d' antan |
| Villon |
+________________________________+
| There is a man... |
| playing a violin... |
| and the strings... |
| are the nerves in his own arm. |
| A twisted soul- the mortar... |
| despair- the bricks... |
| to build a temple of sadness. |
| The Crow, J. O'Barr |
+--------------------------------+

Michel Arboi

Jul 16, 1997

> In modern encryption there is more than one school of thought. One
> is to use a size that is the bare minimum of what is projected to be
> safe for a few years. Once DES at 56 bits was thought to be safe. Now the experts
> say 128 bits is such a number.

If I remember correctly, there is a "proof" in "Applied Cryptography" that
256-bit keys are immune to brute force attack (the argument is that
there is not enough energy in the sun to even count from 0 to 2^256-1,
by the thermodynamics of computation...)
But of course, other attacks may succeed.

Are you sure that SCOTT16U is much safer with, e.g., 900,000-bit keys
than with 1024- or 256-bit keys?


David Hopwood

Jul 17, 1997

In message <5qiing$f...@news.ysu.edu>

an...@yfn.ysu.edu (David A. Scott) writes:

> In a previous article, pat...@gryphon.psych.ox.ac.uk (Patrick Juola) says:
> > David Scott wrote:
> > >But be careful: the old crypto people think [scott16u] is weak
> >
> >Only because Paul Onions cracked the first version in a week.

> True, the first version was cracked because Paul Onions had found
> a plaintext attack and I had never considered that before. However, it was
> not a week; the contest was older than that

It was about two weeks - and the fact that you hadn't considered plaintext
attacks doesn't inspire confidence.

> and people were saying the key was too long. By the
> way, the contest has never been solved yet. When I release the next contest
> for scott16u.zip I will allow plaintext attacks. I just have to find
> a way to test it without answering hundreds of plaintext requests. As far
> as I know, no one has ever offered a contest where plaintext attacks were allowed.

There have been several contests that offered money for any attack at all
(Dr. Dobb's Journal offered this for Blowfish, IIRC, although there was some
controversy about the payout for someone who found a set of partially weak
keys).

> Anyone have bright ideas how to do it so I don't have to run my program
> over and over?

That's no problem; when someone is testing an attack they can use their own
plaintext, so you will only have to do this in order to confirm attacks that
are already known to work. Just make sure that you provide source-code for an
implementation in _portable_ C (i.e. no reliance on byte order, unaligned
memory accesses, etc.), or Java.

David Hopwood
david....@lmh.ox.ac.uk, hop...@zetnet.co.uk

David A. Scott

Jul 17, 1997

In a previous article, xt...@xsxixnxnxexrxzx.xcxoxmx (The Messiah) says:

>David A. Scott wrote:
>> True, the first version was cracked because Paul Onions had found a plaintext attack and I had never considered that before.
>

>Doesn't matter how the algorithm is broken, if it's broken, it's broken.
>If a lock breaks when hit really hard with a rock, it still breaks, and
>the intruder has access to it all.

I don't think this is true. My code was not broken for the way I intended
it to be used. Besides, you might as well throw away IDEA as used in PGP,
since it uses CFB feedback mode. If one uses the same key and IV that
was used for a secret message, you can get 8 bytes of the secret message
for each plaintext attack used. Why IDEA chose this form of chaining
is beyond me. The only thing worse would be to use OFB; then only one
chosen plaintext attack is needed. PGP is somewhat safe even with
the inferior chaining since it uses a random IV. My code also allowed
for use of a random IV. In the comparison with a lock, the thief only gets to
break it if you let him put a flashlight in the safe. It is not the same
as seeing coded messages of the enemy and then trying to break them. Like I
said, it did what it was designed for. But people convinced me that
a modern cipher should also be bulletproof against the plaintext attack.
That is why I came out with the SCOTT16P version and then the U version, which
is final until machines get faster and memory gets larger, when a 24U version
might come out. I may also add an easier-to-use key interface for people stuck
in the short key range of 128 bits. I do plan to make it easier to use and
may add a Diffie-Hellman type public key exchange. Anyway, the U version was
added because, when the simplest possible cycle was selected as a key and the
data was all constants, the output encrypted data did not look completely random
to the DIEHARD tests. I did not want any weak keys like those known in IDEA,
so in the U version I have not been able to make an example that fails the
DIEHARD tests.

Bryan Olson

Jul 17, 1997

Mark Atwood wrote:

> In fact, not even considering the time involved, doesn't it take more
> energy than is contained in the sun to *just* step the most efficient
> possible counter through 128 bits?

Hmmmm...where's an envelope...

We're about 1.5*10^11 meters from the Sun, and under direct
Sunlight a square meter gets a little over a kilowatt of solar
radiation, and that's post-atmosphere. The surface area of a
sphere is 4*Pi*r^2, so at our distance that's about 2.7 *10^23
square meters, so the total solar radiation is about 3*10^26
Watts.

2^128 is about 3.4*10^38, so how long would it take one Watt
to power as many counters as it could through a total of a
little over 10^12 steps? Not long. I don't know how much power
an ordinary counter takes, but I'm sure it's less than a
milliwatt, and I wouldn't be surprised if we could run one on a
microwatt. So even using counters we could produce today, the
Sun might produce that much power in a second, and almost
certainly in under an hour. Using theoretically optimal
counters, I'd guess the Sun blows that much energy out between
cycles of my Pentium.
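The same envelope in a few lines of C; the 1.5*10^11 m distance and roughly 1.1 kW/m^2 flux are the values used above, so this only re-does the arithmetic.

#include <stdio.h>
#include <math.h>

int main(void)
{
    double r = 1.5e11;                          /* Earth-Sun distance in meters */
    double flux = 1.1e3;                        /* ~1 kW of sunlight per m^2 */
    double pi = 3.14159265358979;
    double area = 4.0 * pi * r * r;             /* sphere at our distance */
    double p_sun = area * flux;                 /* total solar output in watts */
    double keyspace = ldexp(1.0, 128);

    printf("sphere area            ~ %.1e m^2\n", area);          /* ~2.8e23 */
    printf("solar output           ~ %.1e W\n", p_sun);           /* ~3e26   */
    printf("counter steps per watt ~ %.1e\n", keyspace / p_sun);  /* ~1e12   */
    return 0;
}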


We will not live to see a system for which exhaustive search
of a 2^128 keyspace is close to being near to the vicinity
of the easiest attack. Still, it's not inconceivable, or even
very unlikely, that such a search will someday be possible. I
think 256 bits is a fine size for a conventional key, and I
don't really see anything wrong with 512 bits. To give people
the benefit of the doubt, I won't even call anyone clueless for
proposing 1024 bit symmetric keys.

--Bryan

Sandy Harris

Jul 17, 1997

In article <5qk649$k...@news.ysu.edu>, an...@yfn.ysu.edu (David A. Scott) wrote:

: (The Messiah) says:
:
:>David A. Scott wrote:
:>> True, the first version was cracked because Paul Onions had found a
:>> plaintext attack and I had never considered that before.
:>
:>Doesn't matter how the algorithm is broken, if it's broken, it's broken.
:>If a lock breaks when hit really hard with a rock, it still breaks, and
:>the intruder has access to it all.
:
: I don't think this is true. My code was not broken for the way I intended
:it to be used.

There is no requirement for an attacker to behave as you intend or hope.
If there were, we wouldn't need encryption. We'd just assume attackers
polite enough not to read our messages.

:Besides, you might as well throw away IDEA as used in PGP,
:since it uses CFB feedback mode.

Nonsense.

:If one uses the same key and IV that was used for a secret message
:you can get 8 bytes of the secret message for each plaintext attack used.

If I have the key, I can get the whole message.

David A. Scott

Jul 17, 1997

In a previous article, hop...@zetnet.co.uk (David Hopwood) says:

>
>That's no problem; when someone is testing an attack they can use their own
>plaintext, so you will only have to do this in order to confirm attacks that
>are already known to work. Just make sure that you provide source-code for an
>implementation in _portable_ C (i.e. no reliance on byte order, unaligned
>memory accesses, etc.), or Java.

Since machines are so different, is a truly portable version on my part
a necessity? If I supply a working GNU C version for the PC and an
executable for the PC, which is what is in SCOTT16U.ZIP, is that enough?
I don't know Java, but is it more portable? I fear that if I truly made a
portable one without a bunch of conditional include statements it would
run several orders of magnitude slower, and I don't want it to run slow.

Markus Kuhn

Jul 17, 1997

David A. Scott wrote:
> I don't think this is true. My code was not broken for the way I intended
> it to be used. Besides, you might as well throw away IDEA as used in PGP,
> since it uses CFB feedback mode. If one uses the same key and IV that
> was used for a secret message, you can get 8 bytes of the secret message
> for each plaintext attack used. Why IDEA chose this form of chaining
> is beyond me.

The more postings I read from you, the more I get the impression that
you have completely misunderstood how CFB works and that all your
complaints about CFB are based on this misunderstanding. Read carefully:
CBC, CFB, and OFB *REQUIRE* that you use a different IV each time you start
a new transmission. The way PGP uses the IV is the way it was always
supposed to be used. If you transmit the same file twice with CFB, the
attacker has no way of finding out whether you have modified any bit of
this file (except the length of course), because a new IV is used each
time, and this causes ~1/2 of all ciphertext bits to change each time.

> PGP is some what safe even with
> the inferior chaining since it uses a random IV.

Any fixed-IV implementation of CFB deserves the description "seriously
broken" and it will certainly fail any reasonable certification.
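A small sketch of the failure mode being described. This is not PGP's code; it uses AES in CFB mode via OpenSSL's EVP interface as a stand-in for IDEA-CFB, since the point is the mode and the IV, not the block cipher.

/* build: gcc cfb_iv_demo.c -lcrypto */
#include <stdio.h>
#include <string.h>
#include <openssl/evp.h>

static void cfb_encrypt(const unsigned char key[16], const unsigned char iv[16],
                        const unsigned char *in, int len, unsigned char *out)
{
    EVP_CIPHER_CTX *ctx = EVP_CIPHER_CTX_new();
    int outl = 0;

    EVP_EncryptInit_ex(ctx, EVP_aes_128_cfb128(), NULL, key, iv);
    EVP_EncryptUpdate(ctx, out, &outl, in, len);
    EVP_EncryptFinal_ex(ctx, out + outl, &outl);
    EVP_CIPHER_CTX_free(ctx);
}

int main(void)
{
    unsigned char key[16] = {0}, iv[16] = {0};      /* fixed key AND reused IV */
    unsigned char p1[16], p2[16], c1[16], c2[16];
    int i;

    memcpy(p1, "attack at dawn..", 16);
    memcpy(p2, "retreat at once.", 16);
    cfb_encrypt(key, iv, p1, 16, c1);
    cfb_encrypt(key, iv, p2, 16, c2);               /* same key, same IV: wrong */

    /* In CFB the first ciphertext block is P1 XOR E_k(IV), so with a reused
     * IV the XOR of the two ciphertexts equals the XOR of the two plaintexts:
     * knowing one message reveals the other. A fresh random IV prevents this. */
    for (i = 0; i < 16; i++)
        putchar(c1[i] ^ c2[i] ^ p2[i]);             /* prints p1 again */
    putchar('\n');
    return 0;
}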

mgra...@spice.mhv.net

Jul 17, 1997

Bryan Olson (myN...@removethis.uptronics.thistoo.com) wrote:
: We will not live to see a system for which exhaustive search
: of a 2^128 keyspace is close to being near to the vicinity
: of the easiest attack. Still, it's not inconceivable, or even
: very unlikely, that such a search will someday be possible. I
: think 256 bits is a fine size for a conventional key, and I
: don't really see anything wrong with 512 bits. To give people
: the benefit of the doubt, I won't even call anyone clueless for
: proposing 1024 bit symmetric keys.
:
: --Bryan

Nah.. I'd say that 2048 is the upper limit.. I mean, there are
some things that you don't want God to know.

--
Michael Graffam (mgra...@mhv.net)
http://www.mhv.net/~mgraffam - Religion, Philosophy, Computers, etc
"Enlightenment is man's emergence from his self-incurred immaturity.
Immaturity is the inability to use one's own understanding without the
guidance of another. . .Sapere aude! Have the courage to use your own
understanding!" - Immanuel Kant "What is Enlightenment?"

David A. Scott

Jul 17, 1997

In a previous article, car...@eye.cis.ohio-state.edu (mark carroll) says:

>In article <5qloir$4...@news.ysu.edu>, David A. Scott <an...@yfn.ysu.edu> wrote:
>>
>>In a previous article, hop...@zetnet.co.uk (David Hopwood) says:
>>
>>>
>>>That's no problem; when someone is testing an attack they can use their own
>>>plaintext, so you will only have to do this in order to confirm attacks that
>>>are already known to work. Just make sure that you provide source-code for an
>>>implementation in _portable_ C (i.e. no reliance on byte order, unaligned
>>>memory accesses, etc.), or Java.
>>
>> Since machines are so different, is a truly portable version on my part
>>a necessity? If I supply a working GNU C version for the PC and an
>>executable for the PC, which is what is in SCOTT16U.ZIP, is that enough?
>>I don't know Java, but is it more portable? I fear that if I truly made a
>>portable one without a bunch of conditional include statements it would
>>run several orders of magnitude slower, and I don't want it to run slow.
>

>Er... I haven't had much trouble writing crypto stuff in portable C.
>Why should conditional #includes slow things down? They vanish after
>the preprocessor. You could probably get some help from comp.lang.c
>
What I meant was that my code was based on a PC. The includes would be
necessary to make it work on a non-PC. It would most likely run slower
on non-PC machines, and since there are so many machines I would not really be able
to test it anyway. Take a look at SCOTT16U.ZIP and tell me how easy it is to
change to a non-PC machine. Forget the I/O portion, which is very poor; look
at the heart doing the encryption and decryption and tell me how easy it is to
make it smoothly transportable.

David A. Scott

Jul 17, 1997

In a previous article, car...@eye.cis.ohio-state.edu (mark carroll) says:

>
>Portable C crypto stuff: also, you could check out part five of
>Schneier's "Applied Cryptography" for a good example.
>
I will when the book appears at a library near me.

mark carroll

Jul 17, 1997

In article <5qloir$4...@news.ysu.edu>, David A. Scott <an...@yfn.ysu.edu> wrote:
>
>In a previous article, hop...@zetnet.co.uk (David Hopwood) says:
>
>>
>>That's no problem; when someone is testing an attack they can use their own
>>plaintext, so you will only have to do this in order to confirm attacks that
>>are already known to work. Just make sure that you provide source-code for an
>>implementation in _portable_ C (i.e. no reliance on byte order, unaligned
>>memory accesses, etc.), or Java.
>
> Since machines are so different, is a truly portable version on my part
>a necessity? If I supply a working GNU C version for the PC and an
>executable for the PC, which is what is in SCOTT16U.ZIP, is that enough?
>I don't know Java, but is it more portable? I fear that if I truly made a
>portable one without a bunch of conditional include statements it would
>run several orders of magnitude slower, and I don't want it to run slow.

Er... I haven't had much trouble writing crypto stuff in portable C.
Why should conditional #includes slow things down? They vanish after
the preprocessor. You could probably get some help from comp.lang.c

-- Mark

mark carroll

Jul 18, 1997

In article <5qm7t4$b...@news.ysu.edu>, David A. Scott <an...@yfn.ysu.edu> wrote:
>
>In a previous article, car...@eye.cis.ohio-state.edu (mark carroll) says:
(snip)

>>Er... I haven't had much trouble writing crypto stuff in portable C.
>>Why should conditional #includes slow things down? They vanish after
>>the preprocessor. You could probably get some help from comp.lang.c
>>
>> What I meant was that my code was based on a PC. The includes would be
>>necessary to make it work on a non-PC. It would most likely run slower
>>on non-PC machines, and since there are so many machines I would not really be able

Why? I don't notice any special optimisations for PCs.

>to test it anyway. Take a look at SCOTT16U.ZIP and tell me how easy it is to
>change to a non-PC machine. Forget the I/O portion, which is very poor; look
>at the heart doing the encryption and decryption and tell me how easy it is to
>make it smoothly transportable.

It didn't look too bad last time I looked. Have a look at how Schneier
does the common operations in "Applied Cryptography" - it's really
not so hard. If I had the time, I'd do it myself.

-- Mark

mark carroll

Jul 18, 1997

In article <5qm7vp$b...@news.ysu.edu>, David A. Scott <an...@yfn.ysu.edu> wrote:
>
>In a previous article, car...@eye.cis.ohio-state.edu (mark carroll) says:
>
>>
>>Portable C crypto stuff: also, you could check out part five of
>>Schneier's "Applied Cryptography" for a good example.
>>
> I will when the book appears at a library near me.

Ah - have you asked them to get it in, yet? Libraries are wonderful
things. (-:

You can download the source code from the book from a few FTP sites -
I think one of them's in the UK (Oxford)? Can anyone give the URL?
It should make a great example.

Good luck,
Mark

Robert and Frances Egan

Jul 18, 1997

Robert and Frances Egan wrote [in part]:
>
> PS: Don't buy into JAVA portability. My company supports more
> platforms than anyone else I know, and JAVA byte code doesn't
> run on at least half of them!
>
> Robert Egan

This argument was poorly framed because of fatigue. I want to clarify
that JAVA doesn't work correctly on half of our supported OS's, NOT half
of the machines out there. It does in fact work on the most popular
platforms (re: market share), and as such is more _portable_ than most
other applications.

Nonetheless, there are many platforms out there which still don't (and
may never) support it. And as long as our customer base includes those
systems, we cannot standardize around it. This is regrettable, because
we have about 12 developers working on our equivalent of JAVA and JAVA
workshop, and we'd much rather have them writing JAVA enhancements.

Robert Egan

Byron Foster

Jul 18, 1997

The Messiah wrote:
>
> David A. Scott wrote:
> > True, the first version was cracked because Paul Onions had found a plaintext attack and I had never considered that before.
>
> Doesn't matter how the algorithm is broken, if it's broken, it's broken.
> If a lock breaks when hit really hard with a rock, it still breaks, and
> the intruder has access to it all.
>

It's important when you go back and redesign the lock.

Byron

Steve Peterson

Jul 18, 1997

On Fri, 18 Jul 1997 14:21:42 GMT, rwo...@xmission.com (Steve Peterson)
wrote:

>On 15 Jul 1997 17:41:00 GMT, "Syed Mujtaba" <muj...@worldnet.att.net>
>wrote:
>
>>I have been told that the IDEA is one of the most secure encryption
>>algorithms available today. So why is its key size limited to 128
>>bits? Wouldn't a 256 bit key be even more secure?
>
>Let's take a look at the useful life of an IDEA encryption, using 128 bits.
>All of the data below assumes that brute force is the easiest means of
>attacking IDEA, and, of course, assumes that the encryption is
>well-implemented, with a well-chosen key, and all the other precautions are
>taken.
>
>For our purposes, I will calculate how long it would take to build 500
>state-of-the-art brute-force IDEA crackers that could crack an IDEA
>encryption in 10 years. I will assume that "Moore's Law," which states
>that for a given cost, the speed of computers doubles every 18 months, is
>true. Evidence shows that reality is actually slower than that, and may in
>fact be slowing down, but it's a starting point.
>
>So, first, on average it takes (2^128)/2, or 1.7 * 10^38 encryptions to
>break an IDEA by brute force. Assume there are 500 computers working on it
>(state-of-the-art computers are *very* expensive), and you have 3.4 * 10^35
>encryptions each computer will have to try.
>
>Now, let's assume that state-of-the-art right now is a computer that can do
>1 billion encryptions per second. I really don't know what it is, since
>NSA won't tell me (for some strange reason), but it's a reasonable guess.
>(If you disagree with this number, let me know)
>
>So, assuming that the state of the art increases twofold every 1.5 years,
>we are 132 years from being able to build these computers, which means
>that the useful life of your encryption is about 142 years (on average).
>What does this mean for users of encryption? It means that, assuming that
>IDEA is secure, our encryptions will outlive us by a long shot.
>
>What this does mean, however, is that at some point (maybe 50 years in the
>future) we will have to consider a new encryption key length to ensure
>greater security, but for the time being, 128 bits is sufficient.
>
>BTW, using these calculations, 256 bits is secure for 191 years, and 512
>bits is secure for about 447 years.

Oops. Made a slight mistake in my calculations. 256 would actually
be secure for 286 years and 512 would be secure for 670 years.

Sorry if there was confusion there.

Steve Peterson


Michael C Taylor

Jul 18, 1997

David A. Scott (an...@yfn.ysu.edu) wrote:

: In a previous article, car...@eye.cis.ohio-state.edu (mark carroll) says:

: >Portable C crypto stuff: also, you could check out part five of
: >Schneier's "Applied Cryptography" for a good example.

I disagree. The example C code in AC does not work "out of the box" for
a 64-bit little-endian processor. Not all programming is done on Intel
x86.

--
Michael C. Taylor <mcta...@mta.ca> <http://www.mta.ca/~mctaylor/>
Programmer, Computing Services, Mount Allison University, Canada

W T Shaw

Jul 18, 1997

> David A. Scott (an...@yfn.ysu.edu) wrote:
>
> : When I release the next contest
> : for scott16u.zip I will allow plaintext attacks. I just have to find
> : a way to test it without answering hundreds of plaintext requests. As far
> : as I know, no one has ever offered a contest where plaintext attacks were allowed.
> : Anyone have bright ideas how to do it so I don't have to run my program
> : over and over?
>
How many is unreasonable? It should not matter technically if the same
program is run many times; the number of ciphertexts for even a given
standard plaintext should be huge for a really good algorithm.

If you get tired of rerunning your program after so many times, just say
so. You could limit the number of people accepted into the offer, first
come first served, not their requests for more ciphertexts. Consider 5 or
10, if even that many are interested.
--
WTShaw, Macintosh Crypto Programmer
wts...@htcomp.net wts...@itexas.net

Lubarsky's Law of Cybernetic Entomology:
There's always one more bug. --Murphy

David A. Scott

Jul 18, 1997

In a previous article, car...@brain.cis.ohio-state.edu (mark carroll) says:

>In article <5qm7t4$b...@news.ysu.edu>, David A. Scott <an...@yfn.ysu.edu> wrote:
>>
>>In a previous article, car...@eye.cis.ohio-state.edu (mark carroll) says:

>(snip)
>>>Er... I haven't had much trouble writing crypto stuff in portable C.
>>>Why should conditional #includes slow things down? They vanish after
>>>the preprocessor. You could probably get some help from comp.lang.c
>>>
>> What I meant was that my code was based on a PC. The includes would be
>>necessary to make it work on a non-PC. It would most likely run slower
>>on non-PC machines, and since there are so many machines I would not really be able
>
>Why? I don't notice any special optimisations for PCs.
>

Well, there are a couple of features that make it messy. For
example, I have two 16-bit-wide arrays on top of each other that
are offset from each other by 8 bits. Is this structure even valid
on many machines?
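For what it's worth, here is a small C sketch of that kind of overlapping view (not David Scott's actual code) and a byte-by-byte way of reading it that does not depend on alignment or byte order; the buffer contents are made up.

#include <stdio.h>

int main(void)
{
    unsigned char buf[8] = {0x11, 0x22, 0x33, 0x44, 0x55, 0x66, 0x77, 0x88};
    unsigned int i;

    /* Two 16-bit views of the same bytes, offset from each other by 8 bits.
     * Casting a misaligned pointer to (unsigned short *) works on a PC but
     * can trap or give byte-swapped values elsewhere; assembling the values
     * from individual bytes behaves the same everywhere. */
    for (i = 0; i + 2 < sizeof buf; i += 2) {
        unsigned int a = buf[i]     | (buf[i + 1] << 8);   /* first view  */
        unsigned int b = buf[i + 1] | (buf[i + 2] << 8);   /* second view */
        printf("a = %04x   b = %04x\n", a, b);
    }
    return 0;
}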

Markus Kuhn

Jul 18, 1997

David A. Scott wrote:
> car...@eye.cis.ohio-state.edu (mark carroll) says:
> >Portable C crypto stuff: also, you could check out part five of
> >Schneier's "Applied Cryptography" for a good example.
> >
> I will when the book appears at a library near me.

Well, if you are really interested in cryptography, it might be
a very good idea to actually own a few of the standard texts.

I'd suggest some of the following for a start:

Alfred J. Menezes, Paul C. van Oorschot, Scott A. Vanstone:
Handbook of Applied Cryptography, CRC Press, October 1996, 816
pages, ISBN 0-8493-8523-7.

Bruce Schneier: Applied Cryptography - Protocols, Algorithms, and
Source Code in C. Second edition, John Wiley & Sons, Inc., 1996.

Douglas R. Stinson: Cryptography - Theory and Practice, CRC
Press, 1995, 448 pages, ISBN 0-8493-8521-0.

Dorothy Denning, Cryptography and Data Security,
Addison-Wesley, 1983.

William Hugh Murray

Jul 18, 1997, to Syed Mujtaba

Syed Mujtaba wrote:
>
> I have been told that the IDEA is one of the most secure encryption
> algorithms available today. So why is its key size limited to 128
> bits? Wouldn't a 256 bit key be even more secure?

It is true that, all other things equal, the longer the key length the
better. However, all other things never are equal. So the answer to
your question is that there should be a balance between the key length
of a block cipher and its complexity.

Note that IDEA uses a longer key length than DES, in part, to compensate
for complexity reduced to improve performance in software. (DES was
optimized for hardware.)

So, simply lengthening the key, while holding everything else in the
algorithm constant, would not have necessarily increased its security.
Biham and Shamir demonstrated that the security of DES would not have
been much improved by increasing its key length to 64 bits. If the
requirement for cost of attack exceeds 56 bits, one is better off using
triple DES, which is exactly what the IBM engineers did for protecting
the key table in the IBM 3848.

mark carroll

Jul 18, 1997

In article <5qnthq$5...@news.ysu.edu>, David A. Scott <an...@yfn.ysu.edu> wrote:
(snip)

> Well, there are a couple of features that make it messy. For
>example, I have two 16-bit-wide arrays on top of each other that
>are offset from each other by 8 bits. Is this structure even valid
>on many machines?

Ah - fair enough - I didn't notice that. Yes, you're right - that
would have to be changed, I think.

-- Mark

Steve Peterson

Jul 18, 1997

On 15 Jul 1997 17:41:00 GMT, "Syed Mujtaba" <muj...@worldnet.att.net>
wrote:

>I have been told that the IDEA is one of the most secure encryption
>algorithms available today. So why is its key size limited to 128
>bits? Wouldn't a 256 bit key be even more secure?

Steve Peterson

gene m. stover

Jul 18, 1997

David A. Scott (an...@yfn.ysu.edu) wrote:

: Take a look at SCOTT16U.ZIP and tell me how easy it is to
: change to a non-PC machine. Forget the I/O portion, which is very poor; look
: at the heart doing the encryption and decryption and tell me how easy it is to
: make it smoothly transportable.

David,

I took a look at scott16u.zip with the intention of giving the contest
a fair go, but I don't have an MS-DOG machine, & porting your program
to Unix was too involved. (That's right, I'll waste my time on
non-productive cryptanalysis, but I just can't bring myself to port
code. ;-)

Three facets of the cryptosystem must be portable:

1. You must define a data format. It can't be a bunch of instances
of ``u16'' written with fwrite(). It needs to be defined in
machine-independent terms, in a natural language, not a
programming language. It can use whatever byte-order & word size
you see fit, but it needs to be defined & documented outside of
the code that implements it. The simplest, possibly most general
format is a string of 8-bit bytes to make it easier to write an
endian-independent implementation.

2. The cryptosystem, itself, must be defined in a natural language
so we can understand the big picture. Graphs, flow diagrams,
pseudocode, or anything else to let a human understand the
system. It's difficult to extract an algorithm from code that
implements it.

3. The implementation needs to be separated from the
file-handling, key management, & interactive stuff. The
encryption, decryption, & support functions (like initialization)
should be in one library that has nothing to do with files & user
interface. You could supply an encrypt() and a decrypt() function,
both in a .c file by themselves. They should operate on blocks of
memory, not on files (a small sketch of this separation follows). As
it's written, the cryptosystem is
tightly coupled with the file-handling, user-interface, & error
control of a single program that encrypts & decrypts files.
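A minimal sketch of that separation (the xcipher names and the placeholder XOR transformation are invented purely for illustration; only the layering matters):

#include <stdio.h>
#include <string.h>

typedef struct {
    unsigned char key[32];
    unsigned long keylen;
} xcipher_ctx;

/* ---- cipher core: works on memory blocks, knows nothing about files ---- */
static void xcipher_init(xcipher_ctx *ctx, const unsigned char *key,
                         unsigned long keylen)
{
    ctx->keylen = keylen > sizeof ctx->key ? sizeof ctx->key : keylen;
    memcpy(ctx->key, key, ctx->keylen);
}

static void xcipher_crypt(xcipher_ctx *ctx, unsigned char *buf, unsigned long len)
{
    unsigned long i;
    for (i = 0; i < len; i++)              /* placeholder, NOT a real cipher */
        buf[i] ^= ctx->key[i % ctx->keylen];
}

/* ---- everything below is I/O and user interface, kept separate --------- */
int main(void)
{
    xcipher_ctx ctx;
    unsigned char buf[] = "example message";

    xcipher_init(&ctx, (const unsigned char *)"example key", 11);
    xcipher_crypt(&ctx, buf, sizeof buf - 1);   /* "encrypt" in memory */
    xcipher_crypt(&ctx, buf, sizeof buf - 1);   /* XOR twice restores it */
    printf("%s\n", buf);
    return 0;
}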

I spent a few hours reading the code & trying to extract the algorithm
from it. From what I could understand of it, I thought the system was
interesting, & I would have liked to spend more time analysing it, but
it was just too difficult to get at the concepts through the
implementation.

gene m. stover (ge...@CyberTiggyr.com)
--
---
gene m. stover <ge...@CyberTiggyr.com>
Making the world safe from democracy.

David A. Scott

Jul 18, 1997

In a previous article, ge...@netcom.com (gene m. stover) says:

>David A. Scott (an...@yfn.ysu.edu) wrote:
>
>: Take a look at SCOTT16U.ZIP and tell me how easy it is to
>: change to a non-PC machine. Forget the I/O portion, which is very poor; look
>: at the heart doing the encryption and decryption and tell me how easy it is to
>: make it smoothly transportable.
>
>David,
>
>I took a look at scott16u.zip with the intention of giving the contest
>a fair go, but I don't have an MS-DOG machine, & porting your program
>to Unix was too involved. (That's right, I'll waste my time on
>non-productive cryptanalysis, but I just can't bring myself to port
>code. ;-)

I do have Sun versions for a lot of my stuff, so I know I can port
it to the Sun using GNU C. The Sun is a Unix system. It will be a little
messy, but it will be standalone on the Sun. It may take a few days, but
I guess I really should do it. If you want a copy, I can see if mpj would
put it on his server and then maybe it will magically get stolen to the
U.K.

mark carroll

Jul 18, 1997

In article <5qnppb$ek1$1...@nimble.mta.ca>,

Michael C Taylor <mcta...@fractal.mta.ca> wrote:
>David A. Scott (an...@yfn.ysu.edu) wrote:
>
>: In a previous article, car...@eye.cis.ohio-state.edu (mark carroll) says:
>
>: >Portable C crypto stuff: also, you could check out part five of
>: >Schneier's "Applied Cryptography" for a good example.
>
>I disagree. The example C code in AC does not work "out of the box" for
>a 64-bit little-endian processor. Not all programming is done on Intel
>x86.

It doesn't? There are '#ifdef LITTLE_ENDIAN' lines and things (at
least in the second edition), which suggest some concerns about
endian-ness... at least it would be much easier to port than the
current source of David Scott's code - the encryption isn't even
separated from the file I/O very well.

Actually, even if it was still stuck on Intel x86, that would be a big
improvement - I haven't tried out his system yet because I use Linux,
not Windows.

-- Mark

Steve Peterson

Jul 18, 1997

On Fri, 18 Jul 1997 16:04:58 GMT, rwo...@xmission.com (Steve Peterson)
wrote:

>On Fri, 18 Jul 1997 14:21:42 GMT, rwo...@xmission.com (Steve Peterson)
>wrote:
>

>Oops. Made a slight mistake in my calculations. 256 would actually
>be secure for 286 years and 512 would be secure for 670 years.
>
>Sorry if there was confusion there.

Damn, Damn, Damn!

Curse me and my sloppiness.

OK, let me explain what's going on here, and then maybe I'll get it right
this time. Maybe hit the right buttons on my calculator, too.

I want to find the number of years (Y) that a cryptosystem with a key
length of (K) bits will remain secure from brute force attack, given that an
adversary will build (C) of these cracking computers, which will crack the
code in an average of (T) years. If a state-of-the-art computer can now
try (N) keys per year, and we assume that the speed of computers for a
given cost doubles every (D) years (1.5 if we use Moore's Law), then the
number of years that a cipher can be considered secure is given by:


Y = (log[2^(K-1) / (N*C*T)] * D) / log(2)

Sorry, it's a bit hard to read, but parse it out and it should make sense;
the 2^(K-1) is there because, on average, a brute force search only has to
cover half the keyspace. E-mail me and I'll give you a detailed explanation
of how I derived this formula.

THIS does work, and actually gives me the correct estimate. So, here is
the lifespan of K-bit ciphers, assuming that brute force is the best
method. I used the following values for the variables:

N = 3.15576E16 (1 billion encryptions per second, extended to yearly)
C = 500 computers
T = 10 years
D = 1.5 years to double the speed (Moore's Law)


K =
128 bits -- 89.86 (90) years
256 bits -- 281.86 (282) years
512 bits -- 665.86 (666) years

I really think this is correct. I have double-checked and triple checked
both the derivation of the formula and my calculations this time.
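The same formula in a few lines of C, with the values above plugged in (this is only an arithmetic check, not new data):

#include <stdio.h>
#include <math.h>

/* Y = (log[2^(K-1) / (N*C*T)] * D) / log(2), rewritten in log form */
static double lifespan(double K, double N, double C, double T, double D)
{
    return ((K - 1.0) - log(N * C * T) / log(2.0)) * D;
}

int main(void)
{
    double N = 3.15576e16;   /* 1 billion keys per second, expressed per year */
    double C = 500.0;        /* cracking computers */
    double T = 10.0;         /* years the attacker will spend */
    double D = 1.5;          /* years per doubling (Moore's Law) */
    int k;

    for (k = 128; k <= 512; k *= 2)
        printf("%3d bits -- %.2f years\n", k, lifespan(k, N, C, T, D));
    return 0;
}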

Steve

David A. Scott

Jul 19, 1997

In a previous article, car...@brain.cis.ohio-state.edu (mark carroll) says:

>In article <5qnppb$ek1$1...@nimble.mta.ca>,
>Michael C Taylor <mcta...@fractal.mta.ca> wrote:
>>David A. Scott (an...@yfn.ysu.edu) wrote:
>>
>>: In a previous article, car...@eye.cis.ohio-state.edu (mark carroll) says:

>> ...


>It doesn't? There are '#ifdef LITTLE_ENDIAN' lines and things (at
>least in the second edition), which suggest some concerns about
>endian-ness... at least it would be much more easy to port than the
>current source of David Scott's code - the encryption isn't even
>separated from the file I/O very well.

The actual encryption and decryption is done in 3 routines:
doEnce(un16 *a, un32 x), doDece(un16 *a, un32 x), and Paul(un16 *a, un32 x).
The single-cycle lookup table, which can be any possible single-cycle
table, is in ft[]; bt[] is the same cycle in the reverse direction.
*a is a pointer to a 16-bit-wide array of data that is to be encrypted
or decrypted, and x is the number of characters in the array. NOTE: if
x is odd, there is an element that contains only 8 bits. As each pass
occurs, the data in array a is written over, so the previous data is gone on
each pass. The routine Paul is used to defeat any sort of plaintext
attack. It does this by using the single-cycle data in the reverse
direction, against the grain if you will, with a cycle of XORs. To
create the data in this cycle is the same as solving for the key, so it is not practical.

>
>Actually, even it was still stuck on Intel x86, that would be a big
>improvement - I haven't tried out his system yet because I use Linux,
>not Windows.
>

I have used Linux before; it uses GNU C, so it should be easy to get it
to run on Linux. I wrote for DOS, not Windows, though it can run in Windows.

David A. Scott

Jul 22, 1997

In a previous article, ar...@dorotech.fr (Michel Arboi) says:

>> In modern encryption there is more than one school of thought. One
>> is to use a size that is the bare minimum of what is projected to be
>> safe for a few years. Once DES at 56 bits was thought to be safe. Now the experts
>> say 128 bits is such a number.
>
>If I remember correctly, there is a "proof" in "Applied Cryptography" that
>256-bit keys are immune to brute force attack (the argument is that
>there is not enough energy in the sun to even count from 0 to 2^256-1,
>by the thermodynamics of computation...)
>But of course, other attacks may succeed.
>
>Are you sure that SCOTT16U is much safer with, e.g., 900,000-bit keys
>than with 1024- or 256-bit keys?

I am sure that the security of SCOTT16U is not totally due
to the key length. It can use very short keys. At the end of my
contest for scott8.zip I will show just how short a simple key
can be, and yet no one has broken the contest. The key that is used
for SCOTT16U.ZIP can be long, since I have not really found any bad
keys and I made it so any single-cycle table can be used. In fact,
IDEA could be made more secure if each subkey were chosen independently,
but I think they chose a shorter key in the hope that no weak keys
would be used. However, some weak keys have been found, so maybe
the expansion of the 128-bit key to the subkeys can be improved.
But no matter how you expand the main key to the subkeys, one can
always analyse IDEA as if each subkey were independent. If one does,
and if it leads to certain subkey bits being found, then that knowledge
can be used to find other subkey bits in traditional IDEA. The key
expansion should have been done in such a way that knowledge of a few
subkey bits does not lead to knowledge of other subkey bits or the
master key itself. That should have been done in IDEA but was not.
Maybe the subkey scheduling was done in such a trivial way to make the
work of the NSA easier. Now, my system is weak in this expanding of
a small key to a large key too, that is, if one uses a single small key.
But this is not how mine should be run.

Osmo Ronkanen

Jul 23, 1997

In article <33cf7bd6...@news.alt.net>,

Steve Peterson <rwo...@xmission.com> wrote:
>
>Now, let's assume that state-of-the-art right now is a computer that can do
>1 billion encryptions per second. I really don't know what it is, since
>NSA won't tell me (for some strange reason), but it's a reasonable guess.
>(If you disagree with this number, let me know)
>
>So, assuming that the state of the art increases twofold every 1.5 years,
>we are 132 years from being able to build these computers, which means
>that the useful life of your encryption is about 142 years (on average).
>What does this mean for users of encryption? It means that, assuming that
>IDEA is secure, our encryptions will outlive us by a long shot.

That is a bad assumption, as in real life there cannot be unlimited
exponential growth. Even Moore's law has to break at some point. In
one billionth of a second light travels 30 centimeters. This can be seen
as the maximum size of the processor. Now how big is the machine after
132 years? 0.3 m / 2^(132/1.5) = 0.3 m / 2^88, or about 10^-27 meters.
That's much, much smaller than an atom (about 10^-10 m).

>
>What this does mean, however, is that at some point (maybe 50 years in the
>future) we will have to consider a new encryption key length to ensure
>greater security, but for the time being, 128 bits is sufficient.
>
>BTW, using these calculations, 256 bits is secure for 191 years, and 512
>bits is secure for about 447 years.

There are about 10^90 atoms in the universe. Let's assume each of them is
a supercomputer testing 3E8 / 1E-10 = 3E18 keys per second. That
would make 3E108 keys per second. Now let's assume we have 10 billion
years, or 3E17 seconds; that would mean we can handle 9E125 keys. 9E125 <
1E126 = 1000^42 < 1024^42 = 2^420. So a 420-bit key cannot be brute
forced even "in theory" in the time there is available. Of course there
were many generous assumptions, so one could probably say the same for 256
bits.
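The bound, checked in a few lines of C with the same numbers as above (10^90 atoms, 3E18 keys per second each, 3E17 seconds):

#include <stdio.h>
#include <math.h>

int main(void)
{
    double atoms = 1e90;
    double keys_per_atom = 3e8 / 1e-10;     /* light speed / atom size, per second */
    double seconds = 3e17;                  /* roughly 10 billion years */
    double total = atoms * keys_per_atom * seconds;

    printf("keys testable ~ %.1e\n", total);                     /* ~9e125 */
    printf("that is about 2^%.0f, well short of 2^420\n",
           log(total) / log(2.0));                               /* ~2^418 */
    return 0;
}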

Note that with your assumption of exponential growth many NP-complete problems
would become trivial. Just wait for computers that are fast enough.

Osmo


Louis K. Scheffer

Jul 23, 1997

ronk...@cc.helsinki.fi (Osmo Ronkanen) writes:

>There are about 10^90 atoms in the universe. Let's assume each of them is
>a supercomputer testing 3E8 / 1E-10 = 3E18 keys per second. That
>would make 3E108 keys per second. Now let's assume we have 10 billion
>years, or 3E17 seconds; that would mean we can handle 9E125 keys. 9E125 <
>1E126 = 1000^42 < 1024^42 = 2^420. So a 420-bit key cannot be brute
>forced even "in theory" in the time there is available. Of course there
>were many generous assumptions, so one could probably say the same for 256
>bits.

This is not true for quantum computers, which is part of the reason that
people are interested in them. A quantum computer can, in theory, explore
all 2^420 paths in parallel. (To see this, imagine an interferometer
with 420 beam splitters. To compute the output according to quantum
mechanics, you need to sum over all 2^420 paths, since the result is the
vector sum of all the possible paths. No-one has any doubt that this
experiment would work.) The hard part is computing anything complex
without making a measurement that collapses the wave function, and in
such a way that only the correct answer interferes constructively.
This too is possible in theory, but in practice so far it has been
extremely difficult. If someone gets this to work, though, almost no
key would be too big to brute force.
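
A toy numerical illustration of the path-sum picture, with a
Hadamard-like 2x2 unitary standing in for each 50/50 beam splitter (an
assumption of this sketch, not the exact optics above, and nothing like
a real quantum computer):

import itertools
import numpy as np

# One 50/50 beam splitter as a 2x2 unitary (a Hadamard-like convention)
B = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

n = 12        # beam splitters in the chain; 420 would mean 2^420 paths
start = 0     # the photon enters port 0

# Quantum-mechanical answer: just apply the unitary n times
state = np.zeros(2)
state[start] = 1.0
for _ in range(n):
    state = B @ state

# Feynman-style path sum: enumerate every choice of port after each splitter.
# This loop runs 2^n times -- exactly the exponential count referred to above.
amps = np.zeros(2)
for path in itertools.product((0, 1), repeat=n):
    amp = B[path[0], start]
    for k in range(1, n):
        amp *= B[path[k], path[k - 1]]
    amps[path[-1]] += amp

print(np.allclose(state, amps))   # True: the 2^n path sum equals the direct result
print(2 ** n, "paths summed")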

-Lou Scheffer

Osmo Ronkanen

unread,
Jul 25, 1997, 3:00:00 AM7/25/97
to

In article <5r44l6$3...@gap.cco.caltech.edu>,

Louis K. Scheffer <l...@alumnae.caltech.edu> wrote:
>ronk...@cc.helsinki.fi (Osmo Ronkanen) writes:
>
>>There are about 10^90 atoms in the universe. Lets assume each of them is
>>a super computer testing 3E8 / 1E-10 = 3E18 keys per second. That
>>would make 3E108 keys per second. Now lets assume we have 10 billion
>>years or 3E17 seconds that would mean we can handle 9E125 keys. 9E125 <
>>1E126 = 1000^42 < 1024^42 = 2^420. So a 420-bit key cannot be brute
>>forced even "in theory" in the time there is available. Of course there
>>were many excessive assumptions so one could probably say same for 256
>>bits.
>
>This is not true for quantum computers, which is part of the reason that
>people are interested in them. A quantum computer can, in theory, explore
>all 2^420 paths in parallel. (To see this, imagine an interferometer
>with 420 beam splitters. To compute the output according to quantum
>mechanics, you need to sum over all 2^420 paths, since the result is the
>vector sum of all the possible paths. No-one has any doubt that this
>experiment would work.)

Are there 2^420 photons in the universe?

>The hard part is computing anything complex
>without making a measurement that collapses the wave function, and in
>such a way that only the correct answer interferes constructively.
>This too is possible in theory, but in practice so far it has been
>extremely difficult. If someone gets this to work, though, almost no
>key would be too big to brute force.

Osmo


>
> -Lou Scheffer

John Savard

unread,
Jul 25, 1997, 3:00:00 AM7/25/97
to

ronk...@cc.helsinki.fi (Osmo Ronkanen) wrote:

>Are there 2^420 photons in the universe?

No, but supposedly there don't need to be for a quantum computer to
work, since the wave function of a single particle can be very
complex.

My news server wasn't working at all yesterday morning, and so I
checked sci.crypt through DejaNews and saw your earlier post there -
it never showed up on my server, but when it at last started working
again, I posted a reply anyways:

it may well be that computers will not keep increasing in speed
according to Moore's Law, and it may well be that quantum computers
will never get off the ground.

So, if you want to make a cautious assumption about how quickly a
problem will be solved, it is incorrect to base a forecast on Moore's
Law continuing indefinitely.

But if you want to make a cautious assumption about how long a problem
will remain unsolved - that is, about how secure a cipher is - then what
you need is not a lower bound, or even a best estimate, but an _upper_
bound on computer speeds. Making an allowance even for blue-sky
possibilities for such a purpose is, therefore, _cautious_, not reckless.

Welcome to sci.crypt.

John Savard

Louis K. Scheffer

unread,
Jul 25, 1997, 3:00:00 AM7/25/97
to

ronk...@cc.helsinki.fi (Osmo Ronkanen) writes:

>In article <5r44l6$3...@gap.cco.caltech.edu>,
>Louis K. Scheffer <l...@alumnae.caltech.edu> wrote:
>>ronk...@cc.helsinki.fi (Osmo Ronkanen) writes:
>>
>>This is not true for quantum computers, which is part of the reason that
>>people are interested in them. A quantum computer can, in theory, explore
>>all 2^420 paths in parallel. (To see this, imagine an interferometer
>>with 420 beam splitters. To compute the output according to quantum
>>mechanics, you need to sum over all 2^420 paths, since the result is the
>>vector sum of all the possible paths. No-one has any doubt that this
>>experiment would work.)

>Are there 2^420 photons in the universe?

No - that's where the quantum computer takes advantage of the weirdness of
quantum mechanics. Such a beamsplitter will give the predicted result even if
you fire only one photon at a time through it, even though you need to sum
all 2^420 paths to figure out what the output should be!
This is easily verified (for 2^4 or so) with small interferometers and
dim lasers. It is very counter-intuitive, but experiment after
experiment bears it out - the result is as if the photon, or particle,
explores all possible paths before choosing one (statistical) outcome.

-Lou Scheffer

Mikko J Särelä

unread,
Jul 28, 1997, 3:00:00 AM7/28/97
to

Louis K. Scheffer (l...@alumnae.caltech.edu) wrote:
: >Are there 2^420 photons in the universe?
:
: No - that's where the quantum computer takes advantage of the weirdness of
: quantum mechanics. Such a beamsplitter will give the predicted result even if
: you fire only one photon at a time through it, even though you need to sum
: all 2^420 paths to figure out what the output should be!
: This is easily verified (for 2^4 or so) with small interferometers and
: dim lasers. It is very counter-intuitive, but experiment after
: experiment bears it out - the result is as if the photon, or particle,
: explores all possible paths before choosing one (statistical) outcome.

Correct me if I'm wrong, but I believe you will need 2^420 detectors to
detect which way the photon acted. Do you have that?

Mikko

Stanley Chow

unread,
Jul 28, 1997, 3:00:00 AM7/28/97
to

In article <5r9k9k$7...@gap.cco.caltech.edu>,

Louis K. Scheffer <l...@alumnae.caltech.edu> wrote:
>ronk...@cc.helsinki.fi (Osmo Ronkanen) writes:
>
>>In article <5r44l6$3...@gap.cco.caltech.edu>,
>>Louis K. Scheffer <l...@alumnae.caltech.edu> wrote:
>>Are there 2^420 photons in the universe?
>
>No - that's where the quantum computer takes advantage of the weirdness of
>quantum mechanics. Such a beamsplitter will give the predicted result even if
>you fire only one photon at a time through it, even though you need to sum
>all 2^420 paths to figure out what the output should be!

The problem then is how you set up all of those paths.

>This is easily verified (for 2^4 or so) with small interferometers and
>dim lasers. It is very counter-intuitive, but experiment after
>experiment bears it out - the result is as if the photon, or particle,
>explores all possible paths before choosing one (statistical) outcome.

That is not the problem. The problem is how to make it compute. The
current push is to factor 15 = 3*5, a number that is somewhat smaller
than 2^420.
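
For scale, the number theory behind that target is tiny; only the
order-finding step needs a quantum computer to be fast for large N. A
classical Python sketch of the reduction for N = 15, with brute-force
order finding standing in for the quantum part:

from math import gcd

def factor_via_order(N: int, a: int):
    # Shor-style reduction: find the order r of a mod N, then read the
    # factors from gcd(a^(r/2) +/- 1, N). The order finding below is brute
    # force -- that is the part a quantum computer would do fast for large N.
    if gcd(a, N) != 1:
        return gcd(a, N), N // gcd(a, N)   # the guess already shares a factor
    r = 1
    while pow(a, r, N) != 1:
        r += 1
    if r % 2:
        return None                        # need an even order; try another a
    x = pow(a, r // 2, N)
    p = gcd(x - 1, N)
    if 1 < p < N:
        return p, N // p
    return None

print(factor_via_order(15, 7))   # (3, 5): the order of 7 mod 15 is 4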


--
Stanley Chow; sc...@nortel.ca, (613) 763-2831
Nortel/BNR, PO Box 3511 Station C, Ottawa, Ontario
Me? Represent other people? Don't make them laugh so hard.

John Krueger

unread,
Jul 28, 1997, 3:00:00 AM7/28/97
to

Osmo Ronkanen wrote:
>
> In article <5r44l6$3...@gap.cco.caltech.edu>,
> Louis K. Scheffer <l...@alumnae.caltech.edu> wrote:
[snip]

> >This is not true for quantum computers, which is part of the reason that
> >people are interested in them. A quantum computer can, in theory, explore
> >all 2^420 paths in parallel. (To see this, imagine an interferometer
> >with 420 beam splitters. To compute the output according to quantum
> >mechanics, you need to sum over all 2^420 paths, since the result is the
> >vector sum of all the possible paths. No-one has any doubt that this
> >experiment would work.)
>
> Are there 2^420 photons in the universe?

This question is irrelevant for two reasons:

1. It is not necessary to have one photon for each path,
because each photon, in a sense, follows all paths
according to quantum mechanics. A better explanation
might get more technical, but this one won't, because...

2. The total number of photons in the universe is not
conserved. You can make more any time you want, just
by turning on a light.

--
-JJK

Louis K. Scheffer

unread,
Jul 29, 1997, 3:00:00 AM7/29/97
to

msa...@cc.hut.fi (Mikko J Särelä) writes:

>Louis K. Scheffer (l...@alumnae.caltech.edu) wrote:

>: >Are there 2^420 photons in the universe?
>:
>: No - that's where the quantum computer takes advantage of the weirdness of
>: quantum mechanics. Such a beamsplitter will give the predicted result even if
>: you fire only one photon at a time through it, even though you need to sum
>: all 2^420 paths to figure out what the output should be!

>Correct me if I'm wrong, but I believe you will need 2^420 detectors to
>detect which way the photon acted. Do you have that?

No, the paths can re-converge, with a net result (say) of one normal
sin(x)/x interference pattern. If you fire photons one at a time through
the experiment, and record where each one hits, they will slowly build up
the same sin(x)/x pattern.
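
A toy sketch of that build-up, treating the quoted sin(x)/x as the
amplitude so that single-photon detections are drawn from a (sin(x)/x)^2
density (an assumption of the sketch, not a claim about the experiment):

import numpy as np

rng = np.random.default_rng(0)

def intensity(x):
    # Amplitude sin(x)/x, so detection probability goes as (sin(x)/x)^2.
    # numpy's sinc(x) is sin(pi*x)/(pi*x), hence the rescaling.
    return np.sinc(x / np.pi) ** 2

def one_photon():
    # Rejection-sample a single detection position from the pattern
    while True:
        x = rng.uniform(-10.0, 10.0)
        if rng.uniform(0.0, 1.0) < intensity(x):   # intensity is at most 1
            return x

hits = np.array([one_photon() for _ in range(20000)])   # one photon at a time

counts, edges = np.histogram(hits, bins=60, range=(-10, 10), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
expected = intensity(centers)
expected /= expected.sum() * (edges[1] - edges[0])       # normalize for comparison
print(np.max(np.abs(counts - expected)))                 # small, and shrinks with more photons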
-Lou Scheffer

>Mikko

John Savard

unread,
Jul 29, 1997, 3:00:00 AM7/29/97
to

John Krueger <kru...@cs.hope.edu> writes:

> 2. The total number of photons in the universe is not
> conserved. You can make more any time you want, just
> by turning on a light.

That hardly makes it irrelevant: the total mass-energy of the Universe
does not allow for anything like 2^420 photons at any usable frequency.

John Savard
