I am just in the middle of convincing my employers that they should
use PGP to protect our confidential information when I noticed this
article in the 6th March 1997 issue of "Computing" (uk). If I've
noticed it, some of my "power" users will also have. I quote
**begin quotation
The software encryption program Pretty Good Privacy (PGP) has been
attacked as unsafe and commercially dangerous by industry experts,
who are advising companies to stop using the product.
Two computer security specialists (sic), each addressing different
conferences on the same day last month, made the claim that it was
bad systems management to encrypt messages using PGP.
One was against the scrambling system because it had been cracked,
and the other disliked it because systems managers would be unable
to crack it and know what messages employees were sending outside
the company.
Addressing the Netlaw Legal Internet Symposium in London, Neil
Barrett, senior consultant with Bull Information Systems' security
division, said: "Don't use it. We're hearing it's been broken and
broken fairly easily."
"If it's true, it means you can break the US version of PGP inside
40 minutes, regardless of key length"
John Austen, director of Computer Crime Consultants and founder of
New Scotland Yard's told the WebSec 97 conference: "I think it's
dreadful.
It is individual key encryption - you wouldn't know what your
employee was sending out."
Since 1994, there have been 900 daily downloads of PGP from its Web
site. It boasts two million users, 30% of the Fortune 100 companies
as customers.
**** end quote
Taking each bit of the article in turn.
1. Obviously Neil Barrett does not understand how public key
encryption really works. To come out with a statement that the
cracking time is in no way dependent on key size almost beggars
belief.
2. Has someone finally discovered a flaw in IDEA that renders it
liable to being broken within the lifetime of our universe? Personally
I find that hard to believe, but I am always open to demonstration.
3. Notice how it is third party information. "We are hearing it has.."
followed by "if it's true...".
4. And why the distinction about the US version? AFAIK the only
difference between the US and International versions is the library
used to implement the RSA routines.
5. Give us proof, Neil. As a "senior consultant" (sic) he should
know better than merely to pass on third-party rumour. I hope he
doesn't advise paying customers in the same slap-happy manner, or
at least that they have some people working for them who know
something about the subject.
6. I agree with the comments of John Austen to some extent. What we
are looking at is a "company" secret/private key pair. It would be
nice to be able to buy Viacrypt's version in the UK; I believe it
offers the option to force each encryption to also use a predefined
key, in addition to the key specified by the user.
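On point 1, a back-of-the-envelope sketch (mine, not from the article;
the keys-per-second figure is an assumed, generously fast attacker)
shows why any real break that is independent of key length cannot be
brute force: every added key bit doubles the search.

```python
# Rough sketch: expected brute-force search time for a symmetric key.
# The keys-per-second rate is an assumed, generously fast attacker.

def brute_force_years(key_bits, keys_per_second=10**12):
    """Expected years to search half the keyspace."""
    seconds = 2 ** (key_bits - 1) / keys_per_second
    return seconds / (365 * 24 * 3600)

# Each extra bit doubles the work; at 128 bits (IDEA's fixed key size)
# the expected time dwarfs the age of the universe.
for bits in (40, 56, 128):
    print(bits, "bits:", brute_force_years(bits), "years")
```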
Observations anyone ?
regards
Gordon
-------
pgp keyid: D8CBC4B9
key on servers or my web site www.rugeley.demon.co.uk
-----BEGIN PGP SIGNATURE-----
Version: 2.6.3i
Charset: cp850
Comment: How do you *know* Big Brother isn't watching you...
iQENAwUBMyKp+syDPobYy8S5AQGJngfAuALlpZaddnrnVFelJx9qqaytMsmz21e0
CMD32tgSKCZvO/4E1oooFVX3kYTZ40IpkAEafnPUVik9zsuEMVM0Uqs1QjDLRtHY
sEo68EIUgxxMzITtBqe98hHY29tUAaZTsnSjtR5r912fav1smGs8iD4mTRd9RM1w
LmWm36PtmxWsQabaXL/9UM2xqJyiK3+Uc87XgxT27AfKxcwcYET3q86vHqpT3hpp
aLuQclq8bUMHDppgP9nVZUfj/VGhO4FrvBXn+/CUDz+O+dfQApWDb05AY1f9AwPN
EYSgp7vNl5TCVRPn0mX4UQ99Tms6uBgaTZNuBm3HGkY=
=n+cb
-----END PGP SIGNATURE-----
>-----BEGIN PGP SIGNED MESSAGE-----
>I am just in the middle of convincing my employers that they should
>use PGP to protect our confidential information when I noticed this
>article in the 6th March 1997 issue of "Computing" (uk). If I've
>noticed it, some of my "power" users will also have. I quote
>**begin quotation
>The software encryption program Pretty Good Privacy (PGP) has been
>attacked as unsafe and commercially dangerous by industry experts,
>who are advising companies to stop using the product.
I wonder what they have to sell themselves...
I guess the license fee for IDEA (needed for commercial use of PGP
outside the USA) is much lower than the commission they get on
the software they sell.
>Two computer security specialists (sic), each addressing different
>conferences on the same day last month, made the claim that it was
>bad systems management to encrypt messages using PGP.
>One was against the scrambling system because it had been cracked,
>and the other disliked it because systems managers would be unable
>to crack it and know what messages employees were sending outside
>the company.
At least one of those "experts" must be wrong then....
>Addressing the Netlaw Legal Internet Symposium in London, Neil
>Barrett, senior consultant with Bull Information Systems' security
>division, said: "Don't use it. We're hearing it's been broken and
>broken fairly easily."
So advice from Bull (nomen est omen..) Information Systems' consultants
is rather worthless.
Put up or shut up, Neil.
>"If it's true, it means you can break the US version of PGP inside
>40 minutes, regardless of key length"
>John Austen, director of Computer Crime Consultants and founder of
>New Scotland Yard's told the WebSec 97 conference: "I think it's
>dreadful.
>It is individual key encryption - you wouldn't know what your
>employee was sending out."
>Since 1994, there have been 900 daily downloads of PGP from its Web
>site. It boasts two million users, 30% of the Fortune 100 companies
>as customers.
>**** end quote
>Taking each bit of the article at a time.
>1. Obviously Neil Barrett does not understand how public key
>encryption really works. To come out with a statement that the
>cracking time is in no way dependent on key sizes almost beggars
>disbelief.
-- it might be, if IDEA were broken. IDEA has a fixed key size.
>2. Has someone finally discovered a flaw in IDEA that renders it
>liable to being broken in the lifetime of our universe? Personally
>I find that hard to believe but I am always open to demonstration.
No.
>3. Notice how it is third party information. "We are hearing it has.."
>followed by "if it's true...".
I wonder how much the shee^H^H customers have to pay for those rumours...
>4. And why the distinction about the US version. AFAIK the only
>difference between the US and International version is the library
>used to implement the RSA routines.
Correct.
>5. Give us proof Neil. As a "senior consultant" (sic) he should
>know better than merely to pass on third party rumour. I hope he
>doesn't advise paying customers in the same slap happy manner, or
>at least that they have some people who work for them, that at least
>know something about the subject.
Don't hold your breath...
>6. I agree with the comments of John Austen to some extent. What we
>are looking at is a "company" secret/private key pair. It would be
>nice to be able to buy Viacrypt's version in the UK, I believe it
>offers the option to force each encryption to also use a predefined
>key, in addition to the key specified by the user.
Perhaps some shells also offer such an option, always including
encrypt-to-self with "self" being some kind of company key.
If you use real encryption, there is no hope for forgotten passwords,
so make sure you have a good backup strategy.
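The mechanics behind such a forced company key are simple, because
PGP-style hybrid encryption wraps one random session key separately for
each recipient. A toy sketch of the idea (the hash-based "wrap" here is
a stand-in for real public-key wrapping, and all names are illustrative):

```python
# Toy sketch of an enforced "company" recipient key. A hybrid scheme
# encrypts the message once under a random session key, then wraps that
# session key separately for each recipient. The keyed hash stream is a
# stand-in only; real PGP wraps the session key with RSA.
import hashlib
import os

def keystream(key: bytes, n: int) -> bytes:
    """Derive n bytes of keystream from a key (illustrative, not PGP)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def encrypt(message: bytes, recipient_keys: dict) -> dict:
    session = os.urandom(16)
    return {
        "body": xor(message, keystream(session, len(message))),
        # One wrapped copy of the session key per recipient --
        # the company key is simply always present in this table.
        "wraps": {name: xor(session, keystream(k, 16))
                  for name, k in recipient_keys.items()},
    }

def decrypt(packet: dict, name: str, key: bytes) -> bytes:
    session = xor(packet["wraps"][name], keystream(key, 16))
    return xor(packet["body"], keystream(session, len(packet["body"])))
```

Making the company key a mandatory entry in the recipient table is all
an "encrypt-to-self" option needs to do; the message body itself is
encrypted only once.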
>Observations anyone ?
Your comments are basically correct. Oh well, at least you now know what
the advice from those "experts" is worth.
Boudewijn
--
+-------------------------------------------------------------------+
|Boudewijn Visser |E-mail:vis...@ph.tn.tudelft.nl |finger for |
|Dep. of Applied Physics,Delft University of Technology |PGP-key |
+-- my own opinions etc --------------------------------------------+
>**begin quotation
>The software encryption program Pretty Good Privacy (PGP) has been
>attacked as unsafe and commercially dangerous by industry experts,
>who are advising companies to stop using the product.
>
>Two computer security specialists (sic), each addressing different
>conferences on the same day last month, made the claim that it was
>bad systems management to encrypt messages using PGP.
>
>One was against the scrambling system because it had been cracked,
>and the other disliked it because systems managers would be unable
>to crack it and know what messages employees were sending outside
>the company.
>
These so-called experts are incorrect. I don't think they are truly
addressing PGP! Any expert worth their salt knows this!
The issue they might be addressing is the complexity new users face in
using PGP, which systems management will find a lot of work in the
beginning. However, a number of software enhancement packages are now
available that reduce this problem quite a bit.
As far as cracking RSA-encrypted PGP mail goes: if the key size is 1024
bits or higher, passphrase generation has been adequately analyzed, and
some basic security precautions have been taken at the workstation in
terms of one's secret keys (and other precautions that apply regardless
of which encryption engine one chooses), then, in modern-day language,
PGP CANNOT be cracked in real time until 'All Hell freezes over.' :-)
PGP has been scrutinized repeatedly by the international community and,
so far, it has passed the test. The same goes for "Speak Freely", an
international effort to create good 1-1 real-time audio on Windows 3.1,
3.11, Windows 95, Unix and Linux. One of its encryption engines, IDEA,
is employed as real encryption, rather than just relying on
compression/decompression algorithms as "encryption", which is NOT
encryption in the real sense. "Speak Freely" is the only software I have
found that repeatedly works beautifully in its IDEA encryption mode. I
had tried PGPfone, but it was too buggy and I gave up on it.
Dan Frezza
PGP Key Id: Dan Frezza <d...@frezza.org>
PGP User Id: 0x0B6C9381
PGP Fingerprint: 8C E2 78 50 24 80 D7 0C 64 29 D2 3B FE 4B C5 4E
On Sun, 09 Mar 1997 12:16:21 +0100, Gordon spence
<gor...@rugeley.demon.co.uk> wrote:
>
>I am just in the middle of convincing my employers that they should
>use PGP to protect our confidential information when I noticed this
>article in the 6th March 1997 issue of "Computing" (uk). If I've
>noticed it, some of my "power" users will also have. I quote
>
>**begin quotation
>The software encryption program Pretty Good Privacy (PGP) has been
>attacked as unsafe and commercially dangerous by industry experts,
>who are advising companies to stop using the product.
>
>Two computer security specialists (sic), each addressing different
>conferences on the same day last month, made the claim that it was
>bad systems management to encrypt messages using PGP.
>
>One was against the scrambling system because it had been cracked,
>and the other disliked it because systems managers would be unable
>to crack it and know what messages employees were sending outside
>the company.
>
>Addressing the Netlaw Legal Internet Symposium in London, Neil
>Barrett, senior consultant with Bull Information Systems' security
>division, said: "Don't use it. We're hearing it's been broken and
>broken fairly easily."
>
>"If it's true, it means you can break the US version of PGP inside
>40 minutes, regardless of key length"
>
What you tell your employers is that encryption is about trust and
mistrust. They can trust what some experts say some unidentified
parties have told them, or they can download the code, examine it, and
compile their own copy of PGP.
When it comes to trusting a system with things I place a lot of value
in, the choice becomes a no-brainer.
Tony
-----BEGIN PGP SIGNATURE-----
Version: 2.6.2
Comment: Anthony E. Greene <agr...@nemaine.com> 0x78CD4329
iQCdAwUBMyLywkRUP9V4zUMpAQHQGQQ5Afg54+suGg3HYm16FwrRMDdikkMIVXQf
2CDXDhSsUd340qWxBLw7j+L7wirBHM+1L6dP6+lGjwwlpLv0174Dmnn41rQrPBa/
0L/anmmPHZeRYmHd1p327CuMl/npBK2BeVg4ATp1dxdDqkJNcfc8DivY5dWlb/px
2fITi/CMMzDW+9Mfw4uh2Q==
=onsi
-----END PGP SIGNATURE-----
--------------------------------------------------
Anthony E. Greene <agr...@nemaine.com>
PGP Key Id: pub 1083 0x78CD4329
---------------------------------------------------
PGP Key: Send me email with Subject: send pgp key
PGP Info: Send me email with Subject: send pgp info
or visit PGP Inc at <http://www.pgp.com/>
---------------------------------------------------
On Sun, 09 Mar 1997 12:16:21 +0100, Gordon spence wrote:
:I am just in the middle of convincing my employers that they should
:use PGP to protect our confidential information when I noticed this
:article in the 6th March 1997 issue of "Computing" (uk). If I've
:noticed it, some of my "power" users will also have. I quote
:
:**begin quotation
:The software encryption program Pretty Good Privacy (PGP) has been
:attacked as unsafe and commercially dangerous by industry experts,
:who are advising companies to stop using the product.
<<snip>>
:John Austen, director of Computer Crime Consultants and founder of
:New Scotland Yard's told the WebSec 97 conference: "I think it's
:dreadful.
:It is individual key encryption - you wouldn't know what your
:employee was sending out."
<<snip>>
:**** end quote
:
:Taking each bit of the article at a time.
:
:6. I agree with the comments of John Austen to some extent. What we
:are looking at is a "company" secret/private key pair. It would be
:nice to be able to buy Viacrypt's version in the UK, I believe it
:offers the option to force each encryption to also use a predefined
:key, in addition to the key specified by the user.
There is a program available to split a file securely such that
n of m parts would be required to put it back together. It is
called SecSplit. It is available from a number of places. I
found it using Archie.
ftp://ftp.dsi.unimi.it/pub/security/crypt/code/secsplit.zip
ftp://isdec.vc.cvut.cz/ppub/security/unimi/crypt/secsplit.zip
ftp://nic.funet.fi/pub/crypt/ftp.dsi.unimi.it/code/secsplit.zip
This has a DOS-executable. It comes with source code if you want
to port it to another platform.
If you are using Windows, there is a front-end available for it.
ftp://ftp.eskimo.com/u/j/joelm/secshare.zip
Here is a description from Joel McNamara's Privacy tools page
http://www.eskimo.com/~joelm/tools.html
:Secret Sharer for Windows
:
:Worried about losing your PGP key's passphrase, or maybe
:getting hit by a bus and leaving a hard disk full of
:encrypted business data, or perhaps that the government's
:key escrow plan may not really be all that secure?
:Then use Secret Sharer.
:
:Secret Sharer splits up a passphrase or file into a
:user-specified number of encrypted pieces. You give these
:pieces to different trusted people, then specify a
:minimum number of pieces that can be used to restore the
:passphrase/file.
:
:For example, you split your passphrase into 8 different
:pieces and give one piece to each of 8 trusted
:acquaintances. Before you split up the passphrase, you
:indicate a minimum of 3 pieces must be used to restore
:the passphrase. This means 3 individuals out of the 8
:must get together and use their pieces to successfully
:restore your passphrase. This is called "secret-sharing."
:
:Secret Sharer is a Windows front-end to a DOS application
:called SECSPLIT.EXE, written by Hal Finney.
:Because of the hassles with the ITAR regulations
:(restrictions on export of cryptography outside the US),
:SECSPLIT is not included with Secret Sharer. Download
:Secret Sharer, then read the SECSHARE.TXT file for
:some FTP locations. Once you have a copy of SECSPLIT, put
:it in the same directory as Secret Sharer.
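For the curious, the n-of-m scheme behind SECSPLIT is Shamir's secret
sharing: the secret is the constant term of a random polynomial over a
prime field, each share is one point on that polynomial, and any n
points recover the constant term by interpolation. A minimal sketch (my
own illustration, not SECSPLIT's actual code):

```python
# Minimal sketch of n-of-m "secret-sharing" (Shamir's scheme over a
# prime field). Parameters here are illustrative only.
import random

PRIME = 2**127 - 1  # a Mersenne prime comfortably larger than the secret

def split(secret: int, m: int, n: int):
    """Split `secret` into m shares; any n of them reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(n - 1)]
    def poly(x):
        acc = 0
        for c in reversed(coeffs):  # Horner's rule
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, poly(x)) for x in range(1, m + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        # pow(den, PRIME - 2, PRIME) is the modular inverse of den
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret
```

With fewer than n shares, every candidate secret remains equally likely,
which is what makes the scheme information-theoretically secure.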
HTH
- --
Pat McCotter
Finger for public key pa...@connix.com or get it from the servers
Type Bits/KeyID Date User ID
pub 2047/D437B2D9 1996/05/10 Pat McCotter <pa...@connix.com>
Key fingerprint = D0 E7 C6 5A 9E EF 0D CF C7 10 88 2A 73 41 11
24
-----BEGIN PGP SIGNATURE-----
Version: 2.6.3a
Charset: cp850
iQEVAwUBMyMrRwhAaMnUN7LZAQFmjQf/TbfXzjm6ZxHwK3PKHUeh35LxHFs9geUa
J6jzt95qZA1E5Zs+nav3Jqqpt7lRZ+gOnbDozpnS5Zma3xSqD4FokYLmsQxy8CCE
Yzm+ncPezCwRvY/gz5azgGkJIskhrcSN3CtIaGoSXQpjMHLWxONttYTSgNRx9qqf
lYIqHTOVO3zzGCd2gSKjXKHascs6Zw4kZkr/c9pj4Ife5idqESRYiJxApkuOqpf1
Yz33SO65WLGbjRsy0XmuE7zDXkWkrBg7Moh0w7zr2L/BzTktdl3+/MgBrhNeDeeX
86ioPfTcnqC2v8Zloqitzqq3JJqsx1DYMzHKzr8ROSCEJ+cTc4WAPA==
=PcVI
-----END PGP SIGNATURE-----
>**begin quotation
>The software encryption program Pretty Good Privacy (PGP) has been
>attacked as unsafe and commercially dangerous by industry experts,
>who are advising companies to stop using the product.
>Two computer security specialists (sic), each addressing different
>conferences on the same day last month, made the claim that it was
>bad systems management to encrypt messages using PGP.
>One was against the scrambling system because it had been cracked,
>and the other disliked it because systems managers would be unable
>to crack it and know what messages employees were sending outside
>the company.
One doesn't like it because it's insecure, and the other doesn't like it
because it's *too* secure.
>Addressing the Netlaw Legal Internet Symposium in London, Neil
>Barrett, senior consultant with Bull Information Systems' security
>division, said: "Don't use it. We're hearing it's been broken and
>broken fairly easily."
Hmm. Lawyers, not techies. Sounds like a variation of the PGPCrack
hooha that we see here periodically. Yes, it will break PGP, but how
long will it take?
>"If it's true, it means you can break the US version of PGP inside
>40 minutes, regardless of key length"
Since it's only the US version, if true, it would be a bug in the RSA
libraries or the interfaces to them. Not likely -- they've been
checked by experts. There was, of course, no demonstration.
>John Austen, director of Computer Crime Consultants and founder of
>New Scotland Yard's told the WebSec 97 conference: "I think it's
>dreadful.
>It is individual key encryption - you wouldn't know what your
>employee was sending out."
Or receiving. Why does this never seem to be a problem?
>Since 1994, there have been 900 daily downloads of PGP from its Web
>site. It boasts two million users, 30% of the Fortune 100 companies
>as customers.
>**** end quote
Fear, Uncertainty, and Doubt. No data. No facts.
For years now, the mainstream software vendors have worked very hard to
convince everyone that all shareware or freeware programs are buggy,
virus-infected Trojan horses. They have had a great deal of success --
most sysadmins that I have worked with treat shareware or freeware as if
it were plutonium.
This is just more of the same. It would be interesting to know what Mr.
Barrett and Mr. Austen are selling.
As to the problem of not knowing what your employees are sending out,
there is the problem that management wants crypto that is secure, but
that also has an easy to use back door. This is a contradiction in
terms. Like a virgin prostitute, there are people willing to sell it to
you, but it may not be exactly what you expect (:-).
BTW, the people who complain about not being able to read their
employee's e-mail seem to be far less worried about corporate liability
and the possible leakage of confidential information than they are about
not being able to snoop at will.
--
Steve Smith s...@access.digex.net
Agincourt Computing +1 (301) 681 7395
"Everything should be made as simple as possible, but no simpler."
quoting the article:
>Addressing the Netlaw Legal Internet Symposium in London, Neil
>Barrett, senior consultant with Bull Information Systems' security
>division, said: "Don't use it. We're hearing it's been broken and
>broken fairly easily."
>"If it's true, it means you can break the US version of PGP inside
>40 minutes, regardless of key length"
No doubt, somebody has made a mistake...but where could they have
gotten such an idea from?
John Savard
In article <5fvg7b$g...@tor-nn1-hb0.netcom.ca>, sew...@netcom.ca says...
Better question: Why would they want to misinform the audience?
Several answers come to mind. We use PGP, we understand it at (in my case) the
basic operator's level. The average corporate type sees this as a foreign
language and can easily be deceived as to the ease of cracking the code.
Back to question one: why would anyone, and who, want to deceive?
- --
Good company and good discourse are the very sinews of virtue.
Izaak Walton, "The Compleat Angler" (1653)
voice: 718.421.4598 + http://www.webcom.com/sasohn + fax: 718.421.4098
-----BEGIN PGP SIGNATURE-----
Version: 2.6.2
Comment: Classification: OWC Only - Burn Before Reading
iQCVAwUBMyN8kJsE3EZsdmbFAQG70QP/ekSKG5czTIVtoIgp8wZ5G2oIYohdEbxz
xYhJJQMtOOseeoNw+uZgFxodPgXG3/Zbyq5Sg7RuADSPghHUaLP+G8u6KuXC1FRI
TjzvkG2blZ4bwh/hzWcm/VQPFJWBM5z6xSY1ZARdPpYFWEEaZJNsHtsGQhWcN0m1
c/FP2pD9p7c=
=RFfd
-----END PGP SIGNATURE-----
'Que sais-je?' <Montaigne>
Did you try Nautilus? See http://www.lila.com/nautilus/index.html
for more info.
>No doubt, somebody has made a mistake...but where could they have
>gotten such an idea from?
As someone's dot.sig says:
PGP *must* be a good idea; look who objects to it.
For a laugh, I asked a friend who is a crypto guru. You'd know his
name, he's a published author in the computer security area.
His comment was ~~"sounds like FUD"......
Why not call GCHQ and ask them ;-?
--
A host is a host from coast to coast.................wb8foz@nrk.com
& no one will talk to a host that's close........[v].(301) 56-LINUX
Unless the host (that isn't close).........................pob 1433
is busy, hung or dead....................................20915-1433
The obvious answer to these security concerns is to do away with PGP, and
have mandatory strip-searches of employees before they can leave each
day. Then, there is the chance that someone might remember something
useful and actually pass that information while at home. Putting
employees in corporate-controlled concentration camps should do the trick,
eh, Adolph?
--
WTShaw, Macintosh Crypto Programmer
wts...@htcomp.net wts...@itexas.net
Waste not, Run out of Storage Space
>sew...@netcom.ca (John Savard) writes:
>
>
>>No doubt, somebody has made a mistake...but where could they have
>>gotten such an idea from?
>
>As someone's dot.sig says:
> PGP *must* be a good idea; look who objects to it.
>
-----BEGIN PGP SIGNED MESSAGE-----
I can't take seriously anything that is not cited. There was nothing
in that article that was concrete... on one side it was saying that
PGP does exactly what it is made for (when it mentions that employers
may not be able to read employees' encrypted mail), and on the other it
only said that so-and-so said yada yada yada.
If there were a crack for PGP we would have seen how it was broken all
over the 'net by now... PGP.com might be a lot better off than they were
when Phil first came up with PGP, but I don't think they are silencing
everyone!
Scott Brower
Electronic Frontiers Florida
http://www.efflorida.org
-----BEGIN PGP SIGNATURE-----
Version: 4.5
iQEVAgUBMyOt4vW6Owy092c1AQGPugf+JBg3nAUR5PRX3+08UCNT7trtO3kntk/r
zYMVmISlSxSS3HkfMm1+1H+M6FqLePiJOWs7tRGAP+c5AAWI80FkQ+SrzngpTYzO
wT4nN+h+sXiEtUPuHC7joKpv6Rr5LtuRhwwDz7flEEJiCL4iTR+E3R1WX8DLpB3m
xpf21AdMplhOvCEcgMuuLp9E01zPgsLe5eiImI/BAUp4LMv0OJ37HvZR8Tz9pH0C
9N5UVN2atMcZBgbrdVOMDpRPXvagYkQFxiZGWrWWfajmhdT9YP2vYkBe/vo0zccB
JWbS2IEG6mdq0mFiXH4xncXp/N5aF4qFlqTMlOUB8MLIIu4ClQF3Pg==
=U/Li
-----END PGP SIGNATURE-----
>quoting the article:
>
>>Addressing the Netlaw Legal Internet Symposium in London, Neil
>>Barrett, senior consultant with Bull Information Systems' security
>>division, said: "Don't use it. We're hearing it's been broken and
>>broken fairly easily."
>
>>"If it's true, it means you can break the US version of PGP inside
>>40 minutes, regardless of key length"
>
>No doubt, somebody has made a mistake...but where could they have
>gotten such an idea from?
The word is not that the RSA and IDEA ciphers are easily cracked but that
the PGP security model is flawed. To give one specific, many would argue that
the 'web of trust' which underpins PGP (as opposed to RSA & IDEA) is
inherently unsafe. It is of course possible to use the ciphers in the PGP
binary without implementing the web of trust, but that is not what the
author intended and advocates.
The security of a cryptosystem is at least as much about issues of
implementation as it is about the strength of the algorithms. Arguably,
security that fully matches the strength of the RSA/IDEA algorithms is
unlikely to be implemented by all but a very small percentage of PGP users.
That said, it is also worth noting that RSA with 512-bit keys has been an
ITAR-approved export into approved non-US markets for ten years or more.
For those with ears to hear, that speaks volumes. One also notes that only
now is the first open source attack being made on 512-bit keys.
Owen
Yes. Nautilus worked fine for non-Internet communication. However, I was
referring to "Internet" 1-1 real-time audio. Also, Windows 95 users have
to be careful to use pure DOS mode, not the shell. Speak Freely is hard
to beat for Internet real-time 1-1 audio and real-time 1-1 true
encryption using IDEA with 128 bits, and it also has the option of using
your PGP public key as a seed for IDEA. It works great!
: >quoting the article:
: >
: >>Addressing the Netlaw Legal Internet Symposium in London, Neil
: >>Barrett, senior consultant with Bull Information Systems' security
: >>division, said: "Don't use it. We're hearing it's been broken and
: >>broken fairly easily."
: >
: >>"If it's true, it means you can break the US version of PGP inside
: >>40 minutes, regardless of key length"
: >
: >No doubt, somebody has made a mistake...but where could they have
: >gotten such an idea from?
: The word is not that the RSA and IDEA ciphers are easily cracked but that
: the PGP security model is flawed. To give one specific, many would argue that
: the 'web of trust' which underpins PGP (as opposed to RSA & IDEA) is
: inherently unsafe. It is of course possible to use the ciphers in their PGP
: bindery without implementing the web of trust but that is not what the author
: intended and advocates.
So? Web of trust is a superset of hierarchical key certification. In
any case, to say that "we're hearing it's been broken and broken
fairly easily" is misleading and only likely to drive users who need
security into the arms of the purveyors of snake oil.
I wonder what system Mr. Barrett recommends.
: The security of a cryptosystem is at least as much about issues of
: implementation as it is about the strength of the algorithms. Arguably,
: security that fully matches the strength of the RSA/IDEA algorithms is
: unlikely to be implemented by all but a very small percentage of PGP users.
None of them, I wouldn't think. You'd need secure computers and
operating systems amongst (many) other things.
: That said, it is also worth noting that RSA with 512 bit keys has been an ITAR
: approved export into approved non-US markets for ten years or more. For those
: with ears to hear, that speaks volumes. One also notes that only now is the
: first open source attack to be made on 512 bit keys.
Maybe rather than insinuation you ought to say what you mean. What do
you mean by "ITAR approved"? Approved for export to whom? Do you
believe that someone somewhere can cheaply factorize 512 bit RSA keys,
or that perhaps someone has a working attack on 512 bit RSA which does
not require factorization?
Andrew.
I always have questions about this. I am highly "agnostic" about the
degree of trust that I can have in this "web of trust."
Supposing (as seems to be the case) that the RSA/IDEA combination is
technically strong, there is the question of just what it means to
have something signed by someone.
Does a signature mean:
a) I believe that this person is completely trustworthy in all ways?
or (at the other extreme)
b) I believe that someone with control over this PGP key used it at
some point in time to sign this message.
You can't prove that I haven't given out my PGP key to ten people, all
of whom thus can both read and sign messages in my name. I can
certainly claim that I'm the only one with my key; I indeed *do*
assert this to be the case.
I have a very hard time believing that the "web of trust" goes any
further than the set of people that:
a) I indeed do personally trust, or
b) That have *highly* publicized their keys, and would be in
significant jeopardy if they were known to have acted in an
untrustworthy fashion.
I don't see any great value to holding "key-signing parties;" I also
don't see what value there is to getting signatures directly (e.g. in
person) from Phil Zimmermann or other "PGP luminaries." It may be nice
and all, but it's rather difficult to evaluate what degree of reliance
ought to be put on those signatures. I (for instance) knew Colin
Plumb in university, and his participation in PGP work is certainly no
surprise. But I really know little about how much I ought to trust
him about crypto keys.
If the perspective is of verifying that someone is really the one that
posted news or email (in general), the vague "web of trust" may be
quite reasonable. Not nearly so likely if there's something *really
important* in the message.
The $64,000 question: What does "trust" mean, anyways?
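One way to make the question concrete: in PGP's model a key becomes
*valid* only if a chain of signatures leads back to you, and every
intermediate signer must be someone you trust as an introducer, not
merely someone whose key happens to be valid. A toy model (all names
and the graph shape are hypothetical):

```python
# Toy model of PGP key validity: a key is valid to me only if a chain
# of signatures leads back to a key I trust as an introducer -- and
# every hop must be a trusted introducer, not merely a valid key.
from collections import deque

def is_valid(target, my_key, signed_by, introducers):
    """signed_by maps key -> set of keys that signed it.
    introducers: keys I trust to verify identities before signing."""
    seen, queue = {my_key}, deque([my_key])
    while queue:
        k = queue.popleft()
        # Keys signed by k become valid, but we only follow k's
        # signatures if k is a trusted introducer (or my own key).
        if k != my_key and k not in introducers:
            continue
        for key, signers in signed_by.items():
            if k in signers:
                if key == target:
                    return True
                if key not in seen:
                    seen.add(key)
                    queue.append(key)
    return False
```

In this model Carol's key can be perfectly valid without Carol being
trusted to introduce anyone else -- which is exactly the distinction
between key validity and trust that the post above is circling.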
--
Christopher B. Browne, cbbr...@unicomp.net, chris_...@sdt.com
PGP Fingerprint: 10 5A 20 3C 39 5A D3 12 D9 54 26 22 FF 1F E9 16
URL: <http://www.conline.com/~cbbrowne/>
Linux: When one country worth of OS developers just isn't enough...
These are two independent steps in the "web of trust"; whom do I
trust to own keys, and whom do I trust to sign them.
>I have a very hard time believing that the "web of trust" goes any
>further than the set of people that:
>
>a) I indeed do personally trust, or
>
>b) That have *highly* publicized their keys, and would have
>significant jeopardy if they were to be known to have acted in an
>untrustworthy fashion.
>
>I don't see any great value to holding "key-signing parties;" I also
>don't see what value there is to getting signatures directly (e.g. in
>person) from Phil Zimmermann or other "PGP luminaries." It may be nice
>and all, but it's rather difficult to evaluate what degree of reliance
>ought to be put on those signatures. I (for instance) knew Colin
>Plumb in university, and his participation in PGP work is certainly no
>surprise. But I really know little about how much I ought to trust
>him about crypto keys.
Well, that's an evaluation that you need to make; as it happens, the
only PGP luminaries whose signatures I "trust" are PRZ and Chris Hall
-- mainly because I know what lengths they go to when verifying keys
before they sign them.
If I have a key signed by God herself, that doesn't mean that I'm any
more trustworthy to sign keys. It does, however, mean that God trusts
that the key presented to her is one that I have control of. If you
trust God's word on that, then you can accept that I didn't give a dozen
copies to all of my friends.
Patrick
>The obvious answer to these security concerns is to do away with PGP, and
>have mandatory strip-searches of employees before they can leave each
>day. Then, there is the chance that someone might remember something
>useful and actually pass that information while at home. Putting
>employees in corporate controlled concentration camps should do the trick,
>eh, Adolph?
Don't say this too loud around some of the management types that I know
(1/2 :-).
In article <5fvg7b$g...@tor-nn1-hb0.netcom.ca>,
sew...@netcom.ca (John Savard) wrote:
>Gordon spence <gor...@rugeley.demon.co.uk> noted that a recent
>magazine article advised strongly against corporate use of PGP,
>quoting one expert as saying that unescrowed individual-key encryption
>could cause chaos, and another...
>quoting the article:
>>Addressing the Netlaw Legal Internet Symposium in London, Neil
>>Barrett, senior consultant with Bull Information Systems' security
>>division, said: "Don't use it. We're hearing it's been broken and
>>broken fairly easily."
>>"If it's true, it means you can break the US version of PGP inside
>>40 minutes, regardless of key length"
>
>No doubt, somebody has made a mistake...but where could they have
>gotten such an idea from?
It's not impossible that somebody injected David Scott's postings
into the rumour mill, and that Neil Barrett picked up on the results.
The D.S. postings I've seen lately have been sparse and sober. It's
also not impossible that he really is who he said he was, and that he's
taken himself off to learn cryptology.
Regards. Mel.
-----BEGIN PGP SIGNATURE-----
Version: 2.6.2
iQCVAwUBMyRC9EoezilfrZRVAQGlXwP/Xen7wrC+ygrfcA0hl3fhP3BmTA3El70S
DkFgJXdS6rXzfl2Qpmb/0PLH7hQJPZKvXmXIwVE/2iuqj9f5rnd9BAHrd+UECNO2
I4nFS39DPNOHItDcLbe8nS+hJ15c8+FCufwWgX8oV3rvn1ByER1l4P0bSOGlBMvS
76oAyc9Qb8s=
=Q2AJ
-----END PGP SIGNATURE-----
>[snip]
>Does a signature mean:
>a) I believe that this person is completely trustworthy in all ways?
>or (at the other extreme)
>b) I believe that someone with control over this PGP key used it at
>some point in time to encrypt this message.
Signing someone's key is like introducing them to the world and
confirming that to your knowledge they are who they claim to be (i.e.
are the person generally known by the name they use for the key).
It has no other significance that I can see. This introduction is of
value only to those who already know the introducer, of course.
>
>You can't prove that I haven't given out my PGP key to ten people, all
>of whom thus can both read and sign messages in my name. I can
>certainly claim that I'm the only one with my key; I indeed *do*
>assert this to be the case.
Giving your secret key away is like giving away blank signed cheques.
It may be difficult later to deny responsibility for how other
people filled them in.
--
Nicholas Bohm
[Please disregard numerals in my email address;
they are to frustrate junk email.]
In article <MPG.d8d46b7e...@netnews.worldnet.att.net>,
sas...@worldnet.att.net (Steven Sohn) wrote:
> Back to question one: Why should anyone, or who, would want to deceive?
Now that PGP is commercial, spreading FUD about its strength could
help competitors in the crypto marketplace. If you can convince a
potential customer that PGP is weak but your program is strong, then
he will buy your program.
- --
To find out more about PGP, send mail with HELP PGP in the SUBJECT line to me.
NEW ADDRESS: --> gala...@stack.nl <-- Please PGP encrypt your mail.
Finger gala...@turtle.stack.nl for public key (key ID 0x416A1A35).
Anonymity and privacy site: http://www.stack.nl/~galactus/remailers/
-----BEGIN PGP SIGNATURE-----
Version: 2.6.3i
Charset: cp850
iQCVAgUBMyRZaTyeOyxBaho1AQG+3wP/bhBat/AuH/GCD3/1ozmOBZQhoAbSW8nV
hQ5og2VeAsHNGvmir5VKTn9NFUMcyWTE3qD6A2g/xRX/Jk6Y1YDRifi3gL1kVlSr
xrYwGKH4Ivwcg26cdHG4MU9gd610wSVtyQNQGr1Ushw6d1oT5GVQQF5AaxEbIuMv
A9swRIr7UcA=
=tJ/n
-----END PGP SIGNATURE-----
*>: That said, it is also worth noting that RSA with 512 bit keys has been an ITAR
*>: approved export into approved non-US markets for ten years or more. For those
*>: with ears to hear, that speaks volumes. One also notes that only now is the
*>: first open source attack to be made on 512 bit keys.
*>Maybe rather than insinuation you ought to say what you mean. What do
*>you mean by "ITAR approved"? Approved for export to whom? Do you
*>believe that someone somewhere can cheaply factorize 512 bit RSA keys,
*>or that perhaps someone has a working attack on 512 bit RSA which does
*>not require factorization?
He is of course insinuating that 512 bits is breakable by the gov't.
No crypto system is "approved". Some may be licensed, but the gov't has
insisted that all such licenses are on a case by case basis with no
public guidelines. So, the question is, do you know of someone who has
received a license for 512 bit export into a relatively open market (ie
not to a US office on foreign soil)?
--
Bill Unruh
un...@physics.ubc.ca
*>a) I indeed do personally trust, or
*>b) That have *highly* publicized their keys, and would have
*>significant jeopardy if they were to be known to have acted in an
*>untrustworthy fashion.
......
On the other hand the alternative model, namely key signing authorities,
is equally dubious in the trust I should place in it. They will
almost certainly not know the person whose key they sign personally. All
they will demand is some paper trail, a paper trail which could easily
be forged in most cases, I suspect. Of course such forgeries might be
illegal (under what law, I wonder) but so what. I.e., all models of public
key verification (other than personal verification) have the same kinds
of difficulties with them. The only "advantage" of the central signing
model is that then you have someone to sue, but they probably disclaim
all liability anyway if someone forges the "paper" trail.
--
Bill Unruh
un...@physics.ubc.ca
--
-Colin
> >1. Obviously Neil Barrett does not understand how public key
> >encryption really works. To come out with a statement that the
> >cracking time is in no way dependent on key sizes almost beggars
> >disbelief.
>
> -- might be if IDEA was broken. IDEA has a fixed key size.
Non sequitur. *Because* IDEA has a constant key size, the key can be
discovered in constant time. That the time is inconveniently long
doesn't alter the argument.
RSA, on the other hand, has a variable key size and no-one knows how to
break RSA itself in constant time. However, interoperable
implementations of PGP have a fixed maximum keysize. Those keys can be
broken in constant time. Again, that time is inconveniently long.
Apart from these theoretical considerations, I can't really determine
what the speaker was trying to say except, perhaps, the following:
the cryptography is so good in PGP that by far the easiest and cheapest
way of breaking the system is through unconventional cryptanalysis.
The cost of blackbag, rubberhose and chequebook cryptanalysis is
independent of keysize, while TEMPEST is only linearly dependent.
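As an illustrative aside (my own back-of-envelope figures, not from the
article or from Paul's post), the "constant but inconveniently long" time
for exhaustive search over IDEA's fixed keyspace can be estimated as
follows, assuming a hypothetical machine testing a trillion keys a second:

```python
# Back-of-envelope sketch: a fixed key size means exhaustive search takes
# "constant" time, but the constant for IDEA's 128-bit keys is astronomical.
# The trial rate below is a hypothetical assumption, not a real benchmark.

IDEA_KEY_BITS = 128
keys_total = 2 ** IDEA_KEY_BITS            # size of the fixed keyspace
trials_per_second = 10 ** 12               # assumed: 10^12 trial decryptions/sec
seconds_per_year = 365 * 24 * 3600

# On average the key is found after searching half the keyspace.
expected_years = keys_total / 2 / trials_per_second / seconds_per_year
print(f"Expected search time: {expected_years:.2e} years")
```

The point stands regardless of the assumed trial rate: speeding the
machine up a billionfold still leaves a search time vastly exceeding the
age of the universe.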
Paul
>Better question: Why would they want to misinform the audience?
>Back to question one: Why should anyone, or who, would want to deceive?
Maybe the answer to "who" can be found in the word "unescrowed" in
the above quote.
Now, who do we know who is in favor of key escrow for strong crypto?
And if these people wanted to discourage the use of strong, non-escrowed
crypto, what would be the best way to do it? Tell them they should
trust their government, that it's okay for big brother to read their mail?
Hardly. They tried that already.
The best way to discourage the use of PGP and other secure systems is
to convince users that it isn't secure after all. Then the whole world
will line up for government-approved, key-escrowed crypto systems,
because THEY are secure. Right?
The Clipper chip line forms to the right...:-)
*>William Unruh wrote:
*>> of difficulties with them. The only "advantage" of the central signing
*>> model is that then you have someone to sue, but they probably disclaim
*>> all liability anyway if someone forges the "paper" trail.
*>No, they just purchase insurance against such an event and pass the cost
*>on to you either by charging for certification or by charging for
Well, if the insurance company has any sense it refuses to insure. The
potential costs are millions or even billions (eg loss of trade secrets
because info was sent to a competitor rather than the intended
recipient.) If I were the insurance company I would demand such a waiver
of liability in the contract (except of course the injured party might
well not be a party to the original contract). Looks like this is going
to be a lawyer's dream.
--
Bill Unruh
un...@physics.ubc.ca
In article: <857991...@eloka.demon.co.uk> o...@eloka.demon.co.uk
(Owen Lewis) writes:
>
[snip]
> The word is not that the RSA and IDEA ciphers are easily cracked
> but that the PGP security model is flawed. To give one specific,
> many would argue that the 'web of trust' which underpins PGP (as
> opposed to RSA & IDEA) is inherently unsafe. It is of course
> possible to use the ciphers in their PGP bindery without
> implementing the web of trust but that is not what the author
> intended and advocates.
That was certainly not the thrust of the article as reported. It was
made quite clear that the Bull security "expert" was referring to
PGP being broken as in "cracked".
I don't need a web of trust for my company to exchange information
directly with another company. We already trust each other.
Gordon
- --
PGP KeyId: D8CBC4B9 key on servers or at
www.rugeley.demon.co.uk
-----BEGIN PGP SIGNATURE-----
Version: 2.6.3i
Charset: cp850
iQENAwUBMyWfBsyDPobYy8S5AQE6XQfA7zvJacvDK/4llyat8i/h3CJEj08E5zNf
NyTEvHjyaEvy6fSd8udZcTbFlYcoINWy45YTQLH4JVJlqeJPoDt8zcVLNRXP2Cd2
X4uXb1FgROt5D9rsZ+wueK3NqSeJtXDAPpBHVf1IwMOISbc5bnLUg+lbZwK8rGJt
23BtamzoMi3JuLC77aAjjJ6D0z+mMN6GCtH608if4ZJuqEfONPUZhkhYemxl3x4W
hqU6jqL5sxpzBnQri5UUzUETV9MXt3btcbJOtF1AnxWKrN6/yz+jaJfMOcyYYZOs
jgcQdI2ZlFzr0dDojOvSnkID2upyG3QfwivSBciebJ8=
=5RWN
-----END PGP SIGNATURE-----
I place even less "value" on the meaning of 'trust' in signing
somebody's key. To me, signing somebody's key is just your trust that
the key belongs to the person. It has nothing to do with your 'trust'
of the person (i.e. his personality). In other words, by putting your
signature on someone else's key, you are just claiming that the
key indeed belongs to the person. If later this key is used to sign
a donation to a charity organisation, you know it is done by that
person who holds the key. But if later this key is used in a computer
fraud ( like stealing some ATM money or creating a computer trojan horse
or computer virus ), you also arrive at the same conclusion, you know
that ***IT IS DONE BY THE PERSON WHO HOLDS THE KEY***. That's all. It is
just a trust of the identity of the key !!! By signing somebody's key,
you should not have to carry the social or political responsibility
or obligation of believing the person is going to do good things
and always be "trustworthy".
In fact this is how a CA signs a key. Suppose VeriSign signs SexyGirls'
key, which is in turn used to sign SexyGirls' trojan horse program, and
you downloaded SexyGirls' trojan horse program and it caused damage
to you. You can only draw this conclusion: the trojan horse
indeed came from SexyGirls and nobody else. So next time you
download something which is verified to have SexyGirls' signature,
you had better be careful. You "trust" that the key belongs to SexyGirls
and, as a result, you "don't trust" that the program is going to do
any good things for you. You should not have to associate signing with
believing the person to be trustworthy. And certainly you can't claim
damages from VeriSign !!!
>
> The $64,000 question: What does "trust" mean, anyways?
> --
This "trust" ( that the key belongs to the person ) is not that
"trust" ( that the person always has good will ).
Ming-Ching
The problem of allocating liability based upon third party reliance is
indeed a thorny one. Several states have Digital Signature statutes
which allocate liability in controversial manners. Some shield the
CA from certain liabilities so long as certain procedures are followed to
verify identity.
It boils down to who do you trust? And how much? Some CAs will issue certificates
to just about anyone, others will be more certain of identity before
issuing a certificate. Some CAs will have a range of certificates,
based upon more or less stringent ID guidelines, and will caution third parties
to do their own evaluation of whether it is wise to rely upon the lower
level certificates. Some of the state statutes require insurance to be
carried by the CA, and license them for different levels of reliance
based upon the amount of insurance carried. In this kind of scenario,
it seems likely that the insurance company will call the shots as to the
procedures to be followed by its insured, just as property insurers now
can inspect premises and require different procedures be followed (such as
installation of sprinkler systems).
The liability questions will be a BIG topic, and will have to shake out before
there is general reliance upon digital signatures.
--
=========================================================
--------...@shore.net------------
=========================================================
>vis...@ph.tn.tudelft.nl (Boudewijn W. Ch. Visser) writes:
>> >1. Obviously Neil Barrett does not understand how public key
>> >encryption really works. To come out with a statement that the
>> >cracking time is in no way dependent on key sizes almost beggars
>> >disbelief.
>>
>> -- might be if IDEA was broken. IDEA has a fixed key size.
>Non sequitur. *Because* IDEA has a constant key size, the key can be
>discovered in constant time. That the time is inconveniently long
>doesn't alter the argument.
To pick a minor nit, my statement was not a non sequitur. I commented on the
piece quoted and attributed to Neil Barrett :
Addressing the Netlaw Legal Internet Symposium in London, Neil
Barrett, senior consultant with Bull Information Systems' security
division, said: "Don't use it. We're hearing it's been broken and
broken fairly easily."
"If it's true, it means you can break the US version of PGP inside
40 minutes, regardless of key length"
This part does not say that the attack is on RSA.
[snip further good comments from Paul]
Boudewijn
--
+-------------------------------------------------------------------+
|Boudewijn Visser |E-mail:vis...@ph.tn.tudelft.nl |finger for |
|Dep. of Applied Physics,Delft University of Technology |PGP-key |
+-- my own opinions etc --------------------------------------------+
>If I have a key signed by God herself, that doesn't mean that I'm any
>more trustworthy to sign keys. It does, however, mean that God trusts
>that the key presented to her is one that I have control of. If you
>trust God's word on that, then you can accept that I didn't give a dozen
>copies to all of my friends.
Mmmm.... whose God, yours or mine?
Owen
>Owen Lewis (o...@eloka.demon.co.uk) wrote:
>.... Web of trust is a superset of hierarchical key certification.
Too facile for me. Web of Trust is at the root of the PGP security model
and hence of 'strong cryptography for the masses'. It has weaknesses that
are not typical of hierarchical key management.
>: The security of a cryptosystem is at least as much about issues of
>: implementation as it is about the strength of the algorithms. Arguably,
>: security that fully matches the strength of the RSA/IDEA algorithms is
>: unlikely to be implemented by all but a very small percentage of PGP users.
>
>None of them, I wouldn't think. You'd need secure computers and
>operating systems amongst (many) other things.
Quite so.
>: That said, it is also worth noting that RSA with 512 bit keys has been an
>: ITAR approved export into approved non-US markets for ten years or more.
>: For those with ears to hear, that speaks volumes. One also notes that only
>: now is the first open source attack to be made on 512 bit keys.
>
>Maybe rather than insinuation you ought to say what you mean.
I insinuate nothing but reflect on the known facts.
What do you mean by "ITAR approved"? Approved for export to whom?
Approved by USG for export from the US for limited purposes by non-US
end-users in countries not on the COCOM list (we are talking ten years back).
>Do you
>believe that someone somewhere can cheaply factorize 512 bit RSA keys,
>or that perhaps someone has a working attack on 512 bit RSA which does
>not require factorization?
I believe that no national security agency authorises for general use a cypher
it cannot break, should push come to shove. Indeed, I would think it very
remiss were it to do so. The fact that software implementations facilitate
driving a horse and cart through the restrictions by those minded to do so is
an entirely separate argument.
I merely note that for over ten years USG has approved the export of RSA
implementations restricted to a key length of not more than 512 bits.
You are as welcome to draw your own deductions from the facts as I am.
Owen
>The best way to discourage the use of PGP and other secure systems is
>to convince users that it isn't secure after all. Then the whole world
>will line up for government-approved, key-escrowed crypto systems,
>because THEY are secure. Right?
>
>The Clipper chip line forms to the right...:-)
The one has a poor security model and the other has a better security model
but with known and major caveats. Those who will not curl up and die if
Uncle Sam etc. reads their traffic may be well served by Clipper. Those
who swim in the sea of 'strong cryptography for the masses' may actually be
no safer but the security risks are less clearly defined. People will choose
as they will (where allowed the luxury of any crypto, let alone a choice).
Owen
Yes. RSA implementations with key lengths not to exceed 512 bits were
available in the EU financial sector some ten years ago.
Owen
Perhaps you should reread the certification standards associated with
the PKCS system. It is a straw man to set up a requirement of absolute
perfection for any system. In comparing systems, differential factors
are usually persuasive and you have not addressed that question. What is
more, your use of "equally" is a piece of unsupportable propaganda.
>The only "advantage" of the central signing
> model is that then you have someone to sue, but they probably >disclaim
> all libility anyway if someone forges the "paper" trail.
1. Your use of "only" is unsupportable propaganda. Again, read the PKCS
certification standards.
2. You are making up that they will disclaim all liability in the case
you cite. Is it excluded by the agreement? (Ex post answers not
acceptable. This is not just a factual issue but one of credibility.)
3. Your comment is an example of special pleading. You have omitted many
material facts including the liability they would have for malfeasance
of their own employees. Such liability is a considerable advantage and
should be particularly noteworthy for those who are always complaining
about not trusting certifiers or (in the past) escrow agents.
David
So what? How many web of trust signers do you know with such protection.
David
This is a common misconception, usually spread by PGP's rationalizers (I
don't say "defenders" because PGP has some solid defenses even if this
isn't one of them).
Hierarchical key certification requires strict standards and a closed
hierarchy in the sense that certifiers not meeting those standards are
excluded. In contrast, anyone may be a certifier in web of trust and
there are no agreed standards.
Putting it another way, hierarchical key certification is a closed
system in which those willing to meet particular standards for
certifying others may become certifiers, but others are excluded as
certifiers AND USERS MAY COUNT ON THAT. Web of trust breaks that system.
We have seen attempts (RIPEM, for example) to extend hierarchical
certification to accommodate web of trust. Such systems are hybrid, not
super- or sub-sets of hierarchical.
What is more, PGP contains no mechanisms for enforcing hierarchical
certification, so the notion that web of trust is a superset of
hierarchical, even if it were true, is non-operational in any current
discussion of PGP.
David
> Well, that's an evaluation that you need to make; as it happens, the
> only PGP luminaries whose signatures I "trust" are
> PRZ and Chris Hall -- mainly because I know what lengths they go to
> to verify keys before they sign them.
You are, of course, free to make your own decisions. That's the whole
point of web of trust.
But why should a general user trust a signature from Phil Zimmermann,
given what seems at least to me to be his past willingness to uh, er,
um, resist in aid of his cause of the day? Would he be willing to
subvert signature validity in aid of an anti-nuclear cause? a privacy
cause? some new cause he takes on? (Note that I make no comment on the
merits of the stated causes.)
This is not an idle question nor a personal slam. For any certification
system to be widely accepted among arms-length individuals or commercial
users it must have at least:
1. A past history of trust. Note that "trust" does not mean
"trust-in-a-cause" or "trust except for ..." to the general user;
2. Uniqueness of a certifier's certificates. Once a certificate
is issued for us...@site.dom, no other can be issued by a certifier
unless the certificate holder securely revokes his earlier certificate;
3. Near-iron-clad certification rules/protections. For example PKCS
systems use "BBN boxes": tamper-proof hardware of "nuclear missile
launch" quality, both to resist malfeasance and to ensure 2 (above).
Users given Verisign Class 1 (persona) PKCS certificates are the unique
recipients of a certificate for the claimed signed public key and e-mail
address combination. Users seeking Verisign level 2 certificates are
checked against national data bases as well to authenticate their
persona; For Class 3 they must appear personally and present accepted
forms of photo ID and confirming ID to a notary and swear under penalty
of perjury that they are who they say, etc.;
4. Legal liability for malfeasance and sufficient resources to make a
malfeasance judgement practical;
5. Credibility.
David
: >Owen Lewis (o...@eloka.demon.co.uk) wrote:
: >.... Web of trust is a superset of hierarchical key certification.
: Too facile for me.
Facile? It's true. If you're saying that a piece of software should
_enforce_ a strict key certification hierarchy, then sure, PGP doesn't
do that. But that doesn't mean that PGP cannot be used in that way.
: >: That said, it is also worth noting that RSA with 512 bit keys has been an
: >: ITAR approved export into approved non-US markets for ten years or more.
: >: For those with ears to hear, that speaks volumes. One also notes that only
: >: now is the first open source attack to be made on 512 bit keys.
: > What do you mean by "ITAR approved"? Approved for export to whom?
: Approved by USG for export from the US for limited purposes by non-US
: end-users in countries not on the COCOM list (we are talking ten years back).
Approved for export to whom? Commercial or military? With what
controls? I'm sure that the US has never granted a general export
licence for such software.
: >Do you
: >believe that someone somewhere can cheaply factorize 512 bit RSA keys,
: >or that perhaps someone has a working attack on 512 bit RSA which does
: >not require factorization?
: I believe that no national security agency authorises for general use a cypher
: it cannot break, should push come to shove. Indeed, I would think it very
: remiss were it to do so.
I think that's probably true, although it would have been very
difficult for anyone to break DES when it was published, unless DES
has a security hole that we still don't know about. That's looking
increasingly unlikely.
: I merely note that for over ten years USG has approved the export of RSA
: implementations restricted to a key length of not more than 512 bits.
: You are as welcome to draw your own deductions from the facts as I am.
Andrew.
[snip]
>...The only "advantage" of the central signing
>model is that then you have someone to sue, but they probably disclaim
>all liability anyway if someone forges the "paper" trail.
>--
>Bill Unruh
>un...@physics.ubc.ca
Now there's a whiff of the real world. Ever put a "Stop Payment" on
a check? Ever had the bank honor it anyway? Then you know what
position they take: "So we made an error. If you've been harmed, sue
us. Otherwise, get lost."
--
Howard Arons
-Colin
Level one means that nobody can spoof the persona. That is--you can be
sure that there is a continuity of conversation with the holder of that
certificate. If someone else gets there first to spoof your identity,
you can always complain to Verisign. Once you are satisfied that the
holder of the certificate is the person claimed, you can rely on it in
future. For example, if I give you my credit card and certificate, and
the card proves valid, in future I need not repeat my credit card number
as long as I sign my order with the same certified key.
In contrast, we have no way of knowing that a web of trust signer won't
sign two different keys claiming the same persona and e-mail address.
As for level 2, it's as trustworthy as a credit check with (say) TRW
Credit Data, Mastercard, or VISA which many businessmen will rely on for
significant financial commitments. In fact Verisign uses (among other
things) such a standard credit check data base. Also not chopped liver
compared with the Joe Blow-signed PGP key.
> The point is that, currently, there is no universally believed and accepted
> method of electronic verification.
Beware of the use of universals such as "universally believed" in
arguments. There will never be a "universally believed and accepted
method of electronic verification". I know people who won't take credit
cards. But there are, and will be "good enough" approaches. Web-of-trust
isn't for arms length use. Verisign is, and merchants will sell you
stuff based on a Verisign-certified PKCS key where they would never
trust a PGP-certified key.
> I am not well versed in the PKCS system and I would appreciate some
> references.
www.verisign.com; www.rsa.com.
David
>> Web of trust is a superset of hierarchical key certification.
> Hierarchical key certification requires strict standards and a closed
> hierarchy in the sense that certifiers not meeting those standards are
> excluded. In contrast, anyone may be a certifier in web of trust and
> there are no agreed standards.
>
> Putting it another way, hierarchical key certification is a closed
> system in which those willing to meet particular standards for
> certifying others may become certifiers, but others are excluded as
> certifiers AND USERS MAY COUNT ON THAT. Web of trust breaks that system.
Nonsense. A user may set his web-of-trust software to accept only
signatures from the certifiers who meet those standards, and for him
the result is precisely equivalent to a hierarchical system.
If "web of trust" breaks the hierarchical system in the sense that
the availability of the former results in insufficient demand for
the latter, that's how the cookie crumbles.
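To sketch the point (the key IDs and function below are invented for
illustration, not any real PGP interface): emulating a hierarchy on the
user's side amounts to nothing more than a membership test against a
self-chosen list of certifiers:

```python
# Hypothetical sketch: a user-side policy that accepts a key only if it
# bears a signature from a certifier on the user's own approved list.
# The key IDs are made up; real PGP uses its trust settings for this.

APPROVED_CERTIFIERS = {"0xCA000001", "0xCA000002"}  # certifiers the user chooses

def key_is_acceptable(signer_ids):
    """True iff at least one signature comes from an approved certifier."""
    return bool(set(signer_ids) & APPROVED_CERTIFIERS)

print(key_is_acceptable(["0xCA000001", "0xFRIEND01"]))  # True: approved certifier signed
print(key_is_acceptable(["0xFRIEND01", "0xFRIEND02"]))  # False: web-of-trust only
```

Anyone not on the list simply doesn't count as a certifier for that user,
which is exactly the closed-system property claimed for hierarchies.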
> We have seen attempts (RIPEM, for example) to extend hierarchical
> certification to accommodate web of trust. Such systems are hybrid, not
> super- or sub-sets of hierarchical.
The issue is whether web-of-trust can include hierarchical (which is
trivially easy, as explained above), not whether hierarchical can be
readily extended to web-of-trust (which is more difficult, except by
simply implementing hierarchical as a special case of web-of-trust).
> What is more, PGP contains no mechanisms for enforcing
. ^^^^^^^^^
What's that duck doing in here, and how did it get a $100 bill
in its beak?
> hierarchical
> certification, so the notion that web of trust is a superset of
> hierarchical, even if it were true, is non-operational in any current
> discussion of PGP.
Again, one simply sets the trusted signatures to be those you choose
to establish as one's signing authorities. This ain't nuclear physics.
--
Steve Brinich ste...@access.digex.net If the government wants us
PGP:89B992BBE67F7B2F64FDF2EA14374C3E to respect the law
http://www.access.digex.net/~steve-b it should set a better example
> 2. You are making up that they will disclaim all liability in the case
> you cite. Is it excluded by the agreement? (Ex post answers not
. ^^ ^^^^ ^^^^^^^ ^^^
> acceptable. This is not just a factual issue but one of credibility.)
. ^^^^^^^^^^
Really, Mr. Unruh, you're slipping. You should have been prepared to
answer Mr. Sternlight's question before he asked it.
Yes--this is exactly the same thing that it means to Phil Zimmerman, if
I may quote the PGP manual:
Bear in mind that your signature on a public key certificate does
not vouch for the integrity of that person, but only vouches for the
integrity (the ownership) of that person's public key. You aren't
risking your credibility by signing the public key of a sociopath,
if you were completely confident that the key really belonged to
him. Other people would accept that key as belonging to him because
you signed it (assuming they trust you), but they wouldn't trust
that key's owner. Trusting a key is not the same as trusting the
key's owner.
The web of trust doesn't have to do with which people to trust in
general (this is beyond the scope of PGP!! :) It exists only to help
verify that a public key does, indeed, belong to a certain individual.
(A daunting problem, indeed, referred to as "the Achilles' heel of
public key cryptography" by Zimmerman.)
Cheers,
-Beej
--
Disclaimer: Bill Gates is a dork.
Statement 2:
>Giving your secret key away is like giving away blank signed cheques.
Both of those statements can't be true.
One indicates "vague" reliance. The other indicates "financial
institution" reliance.
I've been observing the PGP phenomenon for several years now; I first
saw the algorithms about ten years ago, and have kept abreast (at
least at low intensity) of things.
Algorithmically, RSA is now quite mature, and it looks like the
associated block ciphers are also not bad. (I'd like to see PGP with
an interface to plug in the algorithms of the user's choice, along
with some versioning protocols; "This week I'm using GOST 4.1; next
week, I'll be using Blowfish 2.3; the week after that, a Caesar
substitution. Oops. Shouldn't have mentioned that last one...")
Key management, the supposed strength of PGP, is still something that
is clearly not well understood.
It probably makes sense to have multiple personal keys for multiple
roles; I'm not sure whether it's regrettable or not that the signing
system doesn't provide for stating rather a lot more about what a
particular signature is intended to mean.
Right now, about all that PGP digital signatures are good for *in
public* are relatively low-reliance indications that Usenet postings
are indeed from the people that claim to have posted them. I wouldn't
put much more trust in the public "networks of trust" than that.
--
Christopher B. Browne, cbbr...@unicomp.net, chris_...@sdt.com
PGP Fingerprint: 10 5A 20 3C 39 5A D3 12 D9 54 26 22 FF 1F E9 16
URL: <http://www.conline.com/~cbbrowne/>
Linux: When one country worth of OS developers just isn't enough...
*>In article <5g24pb$cfe$1...@nntp.ucs.ubc.ca>
*> un...@physics.ubc.ca "William Unruh" writes:
*>> So, the question is, do you know of someone who has
*>>received a license for 512 bit export into a relatively open market (ie
*>>not to a US office of foreign soil)?
*>Yes. RSA implementations with key lengths not to exceed 512 bits were
*>available in the EU financial sector some ten years ago.
Since RSA was open knowledge which anyone could code up, and since RSA
has no protection anywhere outside the USA, you also need to
say that it was available from a US supplier. Was it? And was it for
encryption, not verification (e.g. signing)?
--
Bill Unruh
un...@physics.ubc.ca
*>David Sternlight wrote:
*>>William Unruh wrote:
*>> 2. You are making up that they will disclaim all liability in the case
*>> you cite. Is it excluded by the agreement? (Ex post answers not
*>. ^^ ^^^^ ^^^^^^^ ^^^
*>> acceptable. This is not just a factual issue but one of credibility.)
*>. ^^^^^^^^^^
*> Really, Mr. Unruh, you're slipping. You should have been prepared to
*>answer Mr. Sternlight's question before he asked it.
I know, I know. Must have been an off day.
To answer David, no, I was not referring to any specific "case". That
was, I believe, clear to almost all who read my post. My post was an
expression of cynicism about the reliability of commercial
organisations, not a comment about any particular organisation or
particular contract.
Unfortunately in cryptography, you rarely know that you have been had,
and if you do it is long after the fact. And launching law suits to
recover damages is far and away the last line of defense against those
damages.
At the present state of affairs, I personally would not trust a
certification authority any more than I would a "Web of trust" for
things which really really mattered to me. (for those which don't I
would trust both about equally). It will only be clear how trustworthy a
CA is when there have been a few court cases which have spelled out what
their liability is, and what kind of arguments the CAs and their
insurance companies advance when faced with a multi million dollar law
suit. I would not expect an easy ride of it by the suing party.
Of course I could be wrong. And of course when I express my cynicism,
you, or any other reader, has the right to discount what I say.
So to clear up any confusion you or anyone else might have had about my
post-- No it did not refer to any particular contract between a CA and a
customer. (For future reference, note that when I use the word
"probably" it always referes to a case where I do not have any detailed
knowledge, and often to cases where I am expressing uninformed doubt or
cynicism. It is not to be taken in the technical sense of probability in
quantum mechanics.)
However, my cynicism stands.
--
Bill Unruh
un...@physics.ubc.ca
This is a virtue of PGP's web of trust: Under the complete control of
EACH user, that user's own web of trust can parallel the centralized
schemes, such as verisign, or be very loose. The beauty of the web of
trust is the very ability of a user to select the level of protection
that he wishes to impose. Centralized schemes are subject to attacks
on a single entity. That is a substantial exposure.
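The per-user control described above can be sketched roughly as follows. This is only an illustration of the idea, not PGP's actual code: the names and the "one full or two marginal introducers" thresholds are hypothetical (real PGP makes these configurable).

```python
# Sketch of a PGP-style, per-user key-validity rule. All names and
# thresholds here are illustrative; real PGP makes them configurable.

FULL, MARGINAL, NONE = 2, 1, 0

def key_is_valid(signers, trust_db, need_full=1, need_marginal=2):
    """A key counts as valid if enough trusted introducers have signed it:
    one fully trusted signer, or two marginally trusted ones."""
    fulls = sum(1 for s in signers if trust_db.get(s, NONE) == FULL)
    marginals = sum(1 for s in signers if trust_db.get(s, NONE) == MARGINAL)
    return fulls >= need_full or marginals >= need_marginal

# Each user maintains their own trust database:
trust_db = {"alice": FULL, "bob": MARGINAL, "carol": MARGINAL}
print(key_is_valid(["alice"], trust_db))         # one full signer: valid
print(key_is_valid(["bob", "carol"], trust_db))  # two marginals: valid
print(key_is_valid(["bob"], trust_db))           # one marginal: not valid
```

Because each user holds their own `trust_db`, two users can reach different validity verdicts for the same key, which is exactly the "select your own level of protection" property being claimed.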
Central certifying authorities have a different class of problem. One
example is Microsoft's authenticode that certifies ActiveX apps. That
certification does not prevent sophomores from exploiting ActiveX
weaknesses to do such mischief as formatting your hard drive.
Many recent postings related to PGP focus on 1) aspersions upon the
character of the author regarding his development and distribution of
the original freeware; 2) third-hand accounts of UK security "experts"
claiming that breaking PGP is trivial; 3) criticism of the security of
the "web of trust". PGP continues to be the leading world-wide program
for private encryption, attacking both IDEA and RSA keys remains, to
the best public knowledge, a matter of brute force attack, Phil
Zimmermann was just appointed to the CPSR Board of Directors, and the
web of trust is as secure as each user determines and configures.
--
-------------------------------
Ed Stone
estone@synernet d o t com
-------------------------------
The US government policy has been more permissive with respect to the
international banking system, so although that example responds to that
specific question stated by William Unruh, it does not do much to support
the implication through this thread that 512 bit RSA has been readily
breakable by the NSA.
Larry Kilgallen
I have a fundamental problem with the viability of a web of trust over
the long run:
How does one revoke certificates in the event that one's private key is
compromised? The flip side being, how can one know that the signed message
was not signed with a compromised private key?
In an institutional CA model, one can check a certificate revocation list
in situations where reliance is related to important matters. Is this
possible when one relies upon a web of trust? If not, is this flaw fatal to
the web of trust model?
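For comparison, the revocation-list check available in the institutional CA model amounts to something like the sketch below. The data layout is invented for illustration; a real CRL is a signed, timestamped structure published by the CA.

```python
# Illustrative sketch of a revocation-list check (invented data layout;
# a real CRL is a signed, timestamped structure published by the CA).

revocations = {"key-1234": 857_000_000}  # key id -> time revocation issued

def signature_acceptable(key_id, signed_at):
    """Reject a signature if the key was revoked at or before signing time."""
    revoked_at = revocations.get(key_id)
    return revoked_at is None or signed_at < revoked_at

print(signature_acceptable("key-1234", 856_000_000))  # before revocation: ok
print(signature_acceptable("key-1234", 858_000_000))  # after revocation: no
```

The question raised above is precisely whether a web of trust offers any authoritative place to publish the `revocations` table, or a reliable way for every relying party to consult it in time.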
--
=========================================================
--------...@shore.net------------
=========================================================
: At the present state of affairs, I personally would not trust a
: certification authority any more than I would a "Web of trust" for
: things which really really mattered to me. (for those which don't I
: would trust both about equally). It will only be clear how trustworthy a
: CA is when there have been a few court cases which have spelled out what
: their liability is, and what kind of arguments the CAs and their
: insurance companies advance when faced with a multi million dollar law
: suit. I would not expect an easy ride of it by the suing party.
The Massachusetts Information Technology Division is planning
to hold a mock trial to explore liability issues. It is likely to
be an interesting exercise. Further information is available at
http://www.state.ma.us/itd/legal.
--
=========================================================
--------...@shore.net------------
=========================================================
MCCHESNEY: Microsoft says it has taken adequate steps to solve the
problem. It's hired a company called Verisign (ph) to
provide authentication code for every site developer using Active-X
controls. This code works with Microsoft's web browser
called Internet Explorer.
Again, Microsoft's Todd Nielsen.
NIELSEN: Verisign, after you fill out the forms and the various
process happens, you're issued a signature. And then Internet
Explorer looks to say, is this control or is this executable signed,
and if so, it then gets the user to say, this was produced by
Microsoft Corporation, would you like to download it? And if there's
no signature, Internet Explorer will say, this is an
unsigned control, we will not run it.
MCCHESNEY: But Simpson Garfinkel feels this is not a strong enough
safeguard.
GARFINKEL: Verisign will basically give a certificate to anybody who
pays for one. They don't care if you're a good person
or a bad person. All they care is that you pay them their money.
MCCHESNEY: And, it should be added, they care that you filled out the
forms so you can be tracked down if your Active-X
controls are used for any malicious purpose.
Recently, one developer put an Active-X control he called Internet
Exploder on his website to demonstrate what could
happen. The control was programmed to shut down a person's computer in
ten seconds. This developer had received a
signature from Verisign, but that signature was disabled as soon as
Verisign discovered what he was doing.
David Sternlight wrote:
>
> But why should a general user trust a signature from Phil Zimmermann,
> given what seems at least to me to be his past willingness to uh, er,
> um, resist in aid of his cause of the day? Would he be willing to
> subvert signature validity in aid of an anti-nuclear cause? a privacy
> cause? some new cause he takes on? (Note that I make no comment on the
> merits of the stated causes.)
The point here is that a person may sacrifice signature integrity in
support of a cause he holds more dear. A certificate authority, being
composed ultimately of human beings, is no less subject to this
human failing than any other group or person. However, you may
know something about PZ and his causes and credentials, such as his
recent appointment to the CPSR Board and various awards he has
received. How many of us know the personal and professional causes and
credentials of Verisign authenticators? Centralization provides
further vulnerabilities.
Would Verisign, or David Sternlight, or Acme, subvert their
signature/certificate validity in aid of key recovery, pro-crypto-
control propaganda, or some new cause they may take on? (Note that I
make no comment on the merits of the stated causes.)
The foremost characteristic of a secure authentication system is its
compartmentalization. Centralization is a red flag.
>
> This is not an idle question nor a personal slam. For any certification
> system to be widely accepted among arms-length individuals or commercial
> users it must have at least:
>
> 1. A past history of trust. Note that "trust" does not mean
> "trust-in-a-cause" or "trust except for ..." to the general user;
>
> 2. Uniqueness of a certifier's certificates. Once a certificate
> is issued for us...@site.dom, no other can be issued by a certifier
> unless the certificate holder securely revokes his earlier certificate;
>
> 3. Near-iron-clad certification rules/protections. For example PKCS
> systems use "BBN boxes": tamper-proof hardware of "nuclear missile
> launch" quality, both to resist malfeasance and to insure 2 (above).
> Users given Verisign Class 1 (persona) PKCS certificates are the unique
> recipients of a certificate for the claimed signed public key and e-mail
> address combination. Users seeking Verisign level 2 certificates are
> checked against national data bases as well to authenticate their
> persona; For Class 3 they must appear personally and present accepted
> forms of photo ID and confirming ID to a notary and swear under penalty
> of perjury that they are who they say, etc.;
>
> 4. Legal liability for malfeasance and sufficient resources to make a
> malfeasance judgement practical;
>
> 5. Credibility.
>
> David
>
--
Very droll. The intelligent reader will understand that I meant, of
course, that he should have read the agreement before making
representations about what it "probably" contained.
David
No, you should have read the agreement before sounding off about what it
"probably" provided for.
>
> To answer David, no I was not refereing to any specific "case". That was
> I believe clear to almost all who read my post. My post was an
> expression of cynicism about the reliability of commercial
> organisations,
> not a comment about any particular organisation or particular contract.
There's no reasoning with you if you rely on--well, of course nobody
should be trusted to honor their contracts. The only rational conclusion
from that is that one should do nothing. Before you leap to reply with a
meaning I did not type, you might just as well argue that Phil is an NSA
plant, or that the Supreme Court is a conspiracy of Soviet agents.
You can always make up a paranoid story, but to move forward
civilization relies on contracts, agreements, and a background of shared
practices. In this case, it isn't acceptable to claim someone would not
assume liability for forgery without reading the contract or agreement
they provide or citing past instances where they dishonored contracts or
agreements. To do otherwise would drown us in b.s.--that is, an invented
reality that exists only in each person's fantasy world. Moving forward
in a joint setting (such as commerce or communications) would be
impossible under those circumstances.
> Unfortunately in cryptography, you rarely know that you have been had,
> and if you do it is long after the fact
Another vague generality. We're talking about liability for malfeasance
here, and it almost always comes out. Read history.
>. And launching law suits to
> recover damages is far and away the last line of defense against those
> damages.
But the rational man will understand that if someone has a lot to lose
from behavior you don't want, he will be restrained from such behavior.
That's how we prevented a nuclear war and did in the "evil empire",
whether you like it or not. Read Prof. Ithiel de Sola Pool (formerly of
MIT) on influence mechanisms in social process.
>
> At the present state of affairs, I personally would not trust a
> certification authority any more than I would a "Web of trust" for
> things which really really mattered to me. (for those which don't I
> would trust both about equally).
Most of us have to get commerce done and cannot afford to take your
stance. It is fine as YOUR stance, but not as general advice.
>It will only be clear how trustworthy a
> CA is when there have been a few court cases which have spelled out what
> their liability is, and what kind of arguements the CAs and their
> insurance companies advance when faced with a multi million dollar law
> suit.
I agree.
> I would not expect an easy ride of it by the suing party.
I agree.
>
> Of course I could be wrong.
I agree.
>And of course when I express my cynicism,
> you, or any other reader, has the right to discount what I say.
I agree. It's not discounting so much as distinguishing a personal rant
from a factually-based post.
>
> So to clear up any confusion you or anyone else might have had about my
> post-- No it did not refer to any particular contract between a CA and a
> customer.
Funny, I sure thought you were discussing hierarchical CAs--and Verisign is
the main exemplar these days.
> (For future reference, note that when I use the word
> "probably" it always referes to a case where I do not have any detailed
> knowledge, and often to cases where I am expressing uninformed doubt or
> cynicism. It is not to be taken in the technical sense of probability in
> quantum mechanics.)
You'd better look the word up. It means "likely" and not "I don't have
information and am winging it".
>
> However, my cynicism stands.
I know, I know. Hang in there. We need some cynics here, but it is
important to distinguish an overt expression of cynicism from an
apparent factual analysis.
David
The PGP doc says:
"A trusted centralized key server or Certifying Authority is
especially appropriate for large impersonal *centrally-controlled*
corporate or governmental institutions. Some institutional
environments use hierarchies of Certifying Authorities.
For more decentralized grassroots "guerrilla style" environments,
allowing all users to act as a trusted introducer for their friends
would probably work better than a centralized key server. PGP tends to
emphasize this organic decentralized non-institutional approach. It
better reflects the natural way humans interact on a personal social
level, and allows people to better choose who they can trust for key
management." [my asterisks]
>>> end of pgp doc
This means that anyone can start their own PGP Certifying Authority,
and you can choose to trust it, or not trust it. You can choose to
trust it on the basis of nuclear-key technology, or on the basis of
you know the certifier's brother-in-law. On the basis of an
affiliation with a governmental agency, or the profound absence
thereof. Some would rely upon, for example, an NSA or Commerce
Department certification, while others would sooner trust the tooth
fairy.
Different users would certainly be expected to trust different types
and classes of certifiers. PGP provides support for those differences,
and leaves control up to the user. If you are sending love letters to
Lucy, using the government as a key certifier is most likely pretty
safe. If you are sending your notes about Chinese political
contributions made to the Clinton Administration to the Washington
Post, I'd feel better about a PZ certification.
Who you trust is a changing and delicate matter. In the news today,
the White House has some element of distrust of the Justice Department
and the FBI. The FBI a few months ago cited an abuse of its trust by
the White House security office. More recently, a copy of a taxpayer
funded White House database seems to have found its way to a specific
political party's offices. Trust is a volatile entity...
In article <01bc2e5b$0fc67fe0$2d6b...@crc3.concentric.net>,
sal...@concentric.net says...
> Your point is well taken. As far as I can determine there are significant
> problems with the "web of trust" as well as currently available central
> certification authorities. Certainly Veri-Sign Level 1 and 2 certificates are
> essentially meaningless, and even Level 3 sounds only moderately secure.
> The point is that, currently, there is no universally believed and accepted
> method of electronic verification.
> I am not well versed in the PKCS system and I would appreciate some
> references.
> sal-
> 3. Near-iron-clad certification rules/protections. For example PKCS
> systems use "BBN boxes": tamper-proof hardware of "nuclear missile
> launch" quality, both to resist malfeasance and to insure 2 (above).
A bit of language clarification here -- PKCS is a set of protocols
layered on top of X.509 which support typical electronic commerce
needs (X.509 was aimed at X.500 directory protection).
While hierarchies which use tamper-resistant ("proof" is in the
breaking) BBN boxes, or the somewhat-less-foolproof general PCMCIA
card hardware approaches, invariably use PKCS protocols, it is not
the case that every use of PKCS protocols will use such hardware.
Requirements to use particular hardware are governed by hierarchy
agreements generally drafted by the holder of a root key and then
subscribed to by all CAs which become members of the hierarchy.
There is nothing to keep someone from using PKCS protocols for
a hierarchy which allows CAs to be software based. The market
for hierarchy-based trust tends to also be interested in CAs
using hardware, so the subscriber base might be limited.
Larry Kilgallen
We all know that all the time Mr. Zimmermann spent in court was a government
staged hoax. He's the illegitimate son of a former NSA head <joke>
2) third-hand accounts of UK security "experts" claiming that breaking PGP
: is trivial;
We all know factoring a 1024-bit RSA modulus (the key), or breaking IDEA (the
encryption) is as simple as pie. Just look at all the posts from AOL
saying so <joke>
: 3) criticism of the security of
: the "web of trust". PGP continues to be the leading world-wide program
: for private encryption, attacking both IDEA and RSA keys remains, to
: the best public knowledge, a matter of brute force attack, Phil
: Zimmermann was just appointed to the CPSR Board of Directors, and the
: web of trust is as secure as each user determines and configures.
Trust no one! Not even yourself, for you may be acting under outside
influences unknown to you. For example, notice the effect of subliminal
messages in TV shows like "Baywatch", which cause people to lapse into
a drooling hypnotic state <joke>
--
Andrew E. Mileski mailto:a...@netcom.ca
Linux Plug-and-Play Kernel Project http://www.redhat.com/linux-info/pnp/
XFree86 Matrox Team http://www.bf.rmit.edu.au/~ajv/xf86-matrox.html
> Central certifying authorities have a different class of problem. One
> example is Microsoft's authenticode that certifies ActiveX apps. That
> certification does not prevent sophomores from exploiting ActiveX
> weaknesses to do such mischief as formatting your hard drive.
From everything I have read, those (many) weaknesses have nothing to do
with the hierarchical trust model. To the extent that the weaknesses
are related to the trust model at all they are related to extensions
to the hierarchical model, such as "what I got from this URL before
was properly signed, so I will presume that what I get from it now
is also properly signed". And of course, the ultimate trust fallacy -
"I trust this code was signed by that entity." (verifiable)
"I trust that entity is honorable." (judgement call)
THEREFORE:
"I trust that entity is competent to avoid security bugs."
Beyond that fallacious extension of trust (for which hierarchy is
only involved in the first step), the ultimate undoing of ActiveX
would seem to be the assumption that average users are going to be
able to make accurate judgements regarding the third point. Many
will "trust" those entities which have spent their money on television
ads rather than on avoiding security bugs.
And those flaws are not any better handled with Web-of-Trust replacing
the hierarchical scheme.
Larry Kilgallen
It doesn't matter. A revocation certificate has to be sent out in this case,
whether it originates from the owner of the key or from the thief...
> The point here is that a person may sacrifice signature integrity in
> support of a cause he holds more dear. A certificate authority, being
> composed ultimately of human beings, is no less subject to this
> human failing than any other group or person. However, you may
> know something about PZ and his causes and credentials, such as his
> recent appointment to the CPSR Board and various awards he has
> received. How many of us know the personal and professional causes and
> credentials of Verisign authenticators? Centralization provides
> further vulnerabilities.
>
> Would Verisign, or David Sternlight, or Acme, subvert their
> signature/certificate validity in aid of key recovery, pro-crypto-
> control propaganda, or some new cause they may take on? (Note that I
> make no comment on the merits of the stated causes.)
As a corporation with diverse ownership, it should be much more complex
for VeriSign to subvert their own system for a cause than it would be
in the case of an individual such as Phil Zimmermann or David Sternlight.
The worth of shares in the corporation depends on public trust, and
the shareholders are much larger corporations with large legal staffs
able to sue for damages quite readily.
Whether VeriSign has taken adequate measures to provide internal
controls preventing some individual inside VeriSign from acting
without authority is something which must be evaluated, but the
math to prevent such maverick activities is available in the form
of algorithms for secret-splitting. The physical keys to the BBN
Safekeeper boxes use those algorithms.
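The secret-splitting idea can be illustrated with the simplest possible form, an n-of-n XOR split, where no maverick insider can act alone. Real systems such as the BBN boxes use threshold schemes like Shamir's, which allow any k of n shares to reconstruct; this sketch is only the principle, not their implementation.

```python
# A minimal n-of-n XOR secret split, to illustrate the principle. Real
# systems (e.g. the BBN boxes mentioned above) use threshold schemes
# such as Shamir's, which allow any k of n shares to reconstruct.
import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_secret(secret: bytes, n: int):
    """Split into n shares; every single share is needed to reconstruct."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    shares.append(reduce(xor_bytes, shares, secret))
    return shares

def combine(shares):
    return reduce(xor_bytes, shares)

key = b"master signing key"
shares = split_secret(key, 3)    # e.g. one share per corporate officer
print(combine(shares) == key)    # all three present: secret recovered
```

Any subset short of all n shares is statistically indistinguishable from random bytes, so no individual officer learns anything about the key on their own.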
Larry Kilgallen
*>No, you should have read the agreement before sounding off about what it
*>"probably" provided for.
Had I been referring to some specific agreement, I would have. I was,
however, making a comment about Certifying Authorities in general, in
which case it is hard to read *the* agreement. But thank you for the
pointer to one specific agreement.
*>You can always make up a paranoid story, but to move forward
*>civilization relies on contracts, agreements, and a background of shared
*>practices. In this case, it isn't acceptable to claim someone would not
*>assume liability for forgery without reading the contract or agreement
*>they provide or citing past instances where they dishonored contracts or
*>agreements. To do otherwise would drown us in b.s.--that is, an invented
*>reality that exists only in each person's fantasy world. Moving forward
*>in a joint setting (such as commerce or communications) would be
*>impossible under those circumstances.
This is a totally new area of law. One does not have past practice to go
on. That is part of my point. There is going to be a lot of shaking out
in the industry in the next few years, and the exact level of liability
of such CAs is going to be a major part of that shakedown. I suspect that a
number of such CA companies will go down in flames. They may even want
to honour their contracts, but very few small companies could pay off a
multi-million-dollar judgment even if they wanted to. When you wander into
uncharted legal territory, many areas of potential danger lie ahead. It is
precisely because of the uncharted nature of this territory that I feel, at
the present time, Web of Trust and CAs are probably about equally
trustworthy if you pay attention. For you to advise businesses to regard
CAs as a foolproof, well-established set of entities is, I think, as
foolish as the position of total lack of trust you are advocating.
I certainly feel things should move forward, but the more stake you have
in the reliability of the key the more cautious you should be at the
present time. Committing billions to either Web of Trust or to CAs would
be exceedingly foolish at the present time.
*>> Unfortunately in cryptography, you rarely know that you have been had,
*>> and if you do it is long after the fact
*>Another vague generality. We're talking about liability for malfeasance
*>here, and it almost always comes out. Read history.
*>>. And launching law suits to
*>> recover damages is far and away the last line of defense against those
*>> damages.
*>But the rational man will understand that if someone has a lot to lose
*>from behavior you don't want, he will be restrained from such behavior.
*>That's how we prevented a nuclear war and did in the "evil empire",
*>whether you like it or not. Read Prof. Ithiel de Sola Pool (formerly of
*>MIT) on influence mechanisms in social process.
That is of course why there are so few law suits in the USA, and why
commercial interactions in the USA are so trouble free.
*>>
*>> At the present state of affairs, I personally would not trust a
*>> certification authority any more than I would a "Web of trust" for
*>> things which really really mattered to me. (for those which don't I
*>> would trust both about equally).
*>Most of us have to get commerce done and cannot afford to take your
*>stance. It is fine as YOUR stance, but not as general advice.
*>>It will only be clear how trustworthy a
*>> CA is when there have been a few court cases which have spelled out what
*>> their liability is, and what kind of arguments the CAs and their
*>> insurance companies advance when faced with a multi million dollar law
*>> suit.
*>I agree.
*>> I would not expect an easy ride of it by the suing party.
*>I agree.
*>>
*>> Of course I could be wrong.
*>I agree.
*>>And of course when I express my cynicism,
*>> you, or any other reader, has the right to discount what I say.
*>I agree. It's not discounting so much as distinguishing a personal rant
*>from a factually-based post.
*>>
*>> So to clear up any confusion you or anyone else might have had about my
*>> post-- No it did not refer to any particular contract between a CA and a
*>> customer.
*>Funny, I sure thought you were discussing hierarchical--and Verisign is
*>the main exemplar these days.
*>> (For future reference, note that when I use the word
*>> "probably" it always referes to a case where I do not have any detailed
*>> knowledge, and often to cases where I am expressing uninformed doubt or
*>> cynicism. It is not to be taken in the technical sense of probability in
*>> quantum mechanics.)
*>You'd better look the word up. It means "likely" and not "I don't have
*>information and am winging it".
*>>
*>> However, my cynicism stands.
*>I know, I know. Hang in there. We need some cynics here, but it is
*>important to distinguish an overt expression of cynicism from an
*>apparent factual analysis.
Agreed. I hope it has now been cleared up.
--
Bill Unruh
un...@physics.ubc.ca
Some do, some don't. There's nothing in the PKCS that requires the use of
SafeKeypers. For some applications they're worth the expense (high-value key,
frequent use); for others it's a waste of money.
If you have a high-value key which only needs to be used extremely rarely -
for example, one only used to sign issuers at the next level - it may be
easier simply to use more conventional means of physical security; a state
treasury, or National Guard armoury. The lower the value of the key, the
less extreme the protection you need.
As you know as well as anyone here, trust is a matter of economics and risk;
if it costs $10M to introduce measures that will reduce misuse by $1M, it's
better to let the $1M slide. If it only costs $100,000 just do it.
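That rule can be stated in one line; the figures below are the ones from the paragraph above.

```python
# One-line version of the risk rule above: deploy a control only when it
# costs less than the loss it would prevent (figures are from the post).

def worth_deploying(control_cost, loss_prevented):
    return control_cost < loss_prevented

print(worth_deploying(10_000_000, 1_000_000))  # $10M to save $1M: let it slide
print(worth_deploying(100_000, 1_000_000))     # $100K to save $1M: just do it
```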
--
Now available - The Freddy Hayek Kayak | "Pass me another elf
Paddle Your Own Canoe! Be Rowed To Surfdom! | Sergeant- this one's
From The Taco Institute for Dyslexic Libertarians | split"
>On Mon, 10 Mar 1997 18:53:31 GMT, Nicholas Bohm <n7b...@ernest.net>
> posted:
>[snip]
>Statement 1:
>>Signing someone's key is like introducing them to the world and
>>confirming that to your knowledge they are who they claim to be
>
>Statement 2:
>>Giving your secret key away is like giving away blank signed cheques.
>
>Both of those statements can't be true.
>
>One indicates "vague" reliance. The other indicates "financial
>institution" reliance.
>
>[snip]
I agree that the reliance in the two cases is different, but so are
the acts relied on. The acts take their colour from their context.
Signing another's public key gives third parties another point of
reference for checking the genuineness of the keyholder's identity.
If you don't know the keyholder, but know one of the signatories, ask
the signatory about the keyholder. Apart from any specific statement
made to you in response to an enquiry, the signing itself seems to me
to be a representation that the signer has good reason to believe that
the key belongs to the person named in it.
The reason why it amounts to that representation is that that is what
all the PGP documentation and literature says it is, and that is how a
general practice and usage is growing up. At least from an English
lawyer's perspective, it will be continuity in that usage that leads
to its acceptance as a basis for attributing liability in law. In due
course, a careless signer may be held liable for loss caused to those
who have relied on the signature, although there is probably some way
to go yet.
Similarly, the use of the private key to provide a digital signature
is no more than another way for a person to place his mark on a
document to indicate his assent to it (which is all a signature has
ever been, as a matter of law, since the use of a mark for this
purpose dates from times of low literacy). Once use of digital
signatures comes to be understood as another way of making your mark
in this sense, then enabling other people to place your mark on
documents will amount to the same thing as giving them a power of
attorney. I stand by my analogy with blank signed cheques.
--
Nicholas Bohm
[Please disregard numerals in my email address;
they are to frustrate junk email.]
>The PGP doc says:
>"A trusted centralized key server or Certifying Authority is
>especially appropriate for large impersonal *centrally-controlled*
>corporate or governmental institutions. Some institutional
>environments use hierarchies of Certifying Authorities.
>For more decentralized grassroots "guerrilla style" environments,
>allowing all users to act as a trusted introducer for their friends
>would probably work better than a centralized key server. PGP tends to
>emphasize this organic decentralized non-institutional approach. It
>better reflects the natural way humans interact on a personal social
>level, and allows people to better choose who they can trust for key
>management." [my asterisks]
Unfortunately, there are at least two ways in which humans interact.
One way is social interaction among a circle of friends. Another way
is business, legal, or financial interaction between strangers. With
the Internet, we now have social (etc) interaction among strangers
as well. Using a user-centric "web of trust" to authenticate keys just
doesn't work very well for interaction among a large group of people
who don't otherwise know each other, because there may not *be* anyone
to "introduce" you.
>>>> end of pgp doc
>This means that anyone can start their own PGP Certifying Authority,
>and you can choose to trust it, or not trust it. You can choose to
>trust it on the basis of nuclear-key technology, or on the basis of
>you know the certifier's brother-in-law. On the basis of an
>affiliation with a governmental agency, or the profound absence
>thereof. Some would rely upon, for example, an NSA or Commerce
>Department certification, while others would sooner trust the tooth
>fairy.
And when you sign a key presented by "me" when you don't really know
me at all and would have no way of knowing whether or not that person
(who is not me) is who they claim--or when you simply sign "my" key
falsely, knowing it is not true--and someone is then able to impersonate
me using "my" key and presenting *your* certification, how much do you
suppose I could sue you for when some number of people chose to accept
your certification of "my" key and thus believe the impersonator? Of
course, the exact damages would depend on just what the impersonator
does and on how well it can be traced back to them and to your improper
certification, but hypothetically, a lot of damage could be done which
is exactly the sort of thing that digital signatures are supposed to
*protect* us from.
Sure, *you* may know everyone whose key you sign, but how is anyone else
that you "introduce" them to supposed to *know* that you really do know
them and that the key you present is not forged? Relying on the word
of an unaccountable third-party (another user) about someone's public
key defeats the whole point of digital signatures in the first place.
>Different users would certainly be expected to trust different types
>and classes of certifiers. PGP provides support for those differences,
>and leaves control up to the user. If you are sending love letters to
>Lucy, using the government as a key certifier is most likely pretty
>safe. If you are sending your notes about Chinese political
>contributions made to the Clinton Administration to the Washington
>Post, I'd feel better about a PZ certification.
This seems to show a lack of understanding of the concept of public-key
cryptography. With the exception of "key-recovery" systems (which would
contain the private keys, and which I do *not* support for general use),
there is no privacy risk to having some CA/keyserver sign and/or provide
your public key; they can't decrypt messages sent to you with your true
private key. The *risk* in a CA hierarchy is that an agency could present
false keys and thus could forge digital signatures or could read any
intercepted messages encrypted with the false public keys. Or a CA might
be tricked into signing a false key by an impersonator because of
insufficient checks or other failed safeguards. That is why there need to be
strict regulations on the operation of general CAs/keyservers and great
care in their operation and licensing. One possible model would be to
have a top-level certification authority (or possibly multiple ones,
with the signature of *each* (or some subset of them) being required)
which then registers and signs the public keys of CAs at the next level
below (perhaps a state CA-licensing agency) which then licenses the
CAs under its jurisdiction.
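As a toy sketch of the layered model just described, consider a relying
party that trusts only the top-level authority and walks the chain
downward. All names here are hypothetical, and an HMAC stands in for a
real public-key signature purely to keep the example self-contained:

```python
# Toy model of a multi-level CA hierarchy: a top-level authority signs a
# state licensing agency's key, which in turn signs a local CA's key.
# An HMAC stands in for a real public-key signature; the chain-walking
# logic is the point here, not the crypto primitive.
import hashlib
import hmac

def certify(signer_secret: bytes, subject_pubkey: bytes) -> bytes:
    """Issue a toy certificate over a subject's public key."""
    return hmac.new(signer_secret, subject_pubkey, hashlib.sha256).digest()

def check(signer_secret: bytes, subject_pubkey: bytes, cert: bytes) -> bool:
    """Verify a toy certificate."""
    return hmac.compare_digest(certify(signer_secret, subject_pubkey), cert)

# Top-level authority -> state licensing agency -> local CA (all hypothetical)
root_secret = b"top-level-authority-secret"
state_secret, state_pub = b"state-agency-secret", b"state-agency-pubkey"
ca_pub = b"local-ca-pubkey"

state_cert = certify(root_secret, state_pub)  # root registers the agency
ca_cert = certify(state_secret, ca_pub)       # agency licenses the CA

# A relying party walks the chain down from the root it trusts:
assert check(root_secret, state_pub, state_cert)
assert check(state_secret, ca_pub, ca_cert)

# A forged CA key fails verification at its own level:
assert not check(state_secret, b"impostor-ca-pubkey", ca_cert)
print("chain valid")
```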
Another important concept in a CA hierarchy is that keys should be
certified by agencies that are directly connected to the use of that
key, who are in the best position to *know* the true key, and who have
a vested interest in keeping the key "honest". For example, a key
associated with the use of a credit card or bank account should be
certified by the card issuer or bank. If you don't trust your bank
to be honest about keeping your account accurate it doesn't do you a
lot of good to add authentication on top of the use of the account,
but on the other hand if the bank is caught in such dishonesty or
violation they would lose their license and be shut down (or pay
massive penalties, depending on the law and the circumstances). A
company would certify the keys of its employees for internal use (and
possibly for external use), an ISP might certify the public keys of
its registered users, an upstream access provider would certify the
keys of its downstream clients, and so on. If you follow the natural
hierarchy of the given application the CA hierarchy can be quite
secure, and the government doesn't have to be involved except in
licensing and monitoring the CAs for compliance with the appropriate
regulations to ensure security. (This assumes we can get the government
to get over its desire to decrypt anything and everything and instead
do its job of protecting the privacy and financial security of the
public.)
-Rob Parker
Hi,
as far as I know, there has been a mathematical proof
that RSA and IDEA can only be cracked by a
so-called "brute-force attack", which means (if I'm
right...) that you have to test *all* possibilities
until you find a matching key...
I have never heard of a machine that is able to
do so in relevant time on 128-bit RSA or bigger...
If we suppose that a machine can test 10,000
128-bit keys in one second, how much time
would it take to crack one of these 128-bit keys?
It would take 1.079E+27 years to test all
possible keys. OK, perhaps the first point is correct,
but... isn't that enough security?
I think so... You can see proof of these times by
taking a look at the RSA challenge results:
40-bit key found in 3.5 hrs
48-bit key found in 13 days, after searching ~50% of the
possible keys...
Stefan
--
any commercial use of the transmitted data for
advertising purposes is hereby prohibited.
PGP: 26 89 52 B4 7C CD 91 45 0A 06 AD F8 1B A5 67 85
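Stefan's arithmetic checks out; a few lines of Python reproduce it (the
10,000 trials per second is his assumed rate, and a 365-day year matches
his figure):

```python
# Reproduce the brute-force estimate above: time to exhaust a 128-bit
# keyspace at an assumed 10,000 key trials per second.
keyspace = 2 ** 128
rate = 10_000                            # key trials per second (assumed)
seconds = keyspace / rate
years = seconds / (365 * 24 * 3600)      # 365-day years
print(f"{years:.3e} years")              # 1.079e+27 years
```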
That's silly. Key certification is an introductory mechanism. That
a key is certified, by itself, is meaningless. What matters is who
the certifiers are, and how relevant a particular certificate is
depends on who is asking and why. There are no universally
superior certifiers. Suggesting that the system has to enforce
hierarchical certification is akin to saying you won't do business
with anyone not introduced to you by your bank manager. The only
value in enforcement is the power it gives the certifiers. Most of
us would prefer to deny them that power.
The only thing needed to establish a hierarchical certification
regime with PGP is the decision to do it. Hierarchical systems can
exist not just as isolated subsets, but as threads running through
the web of trust. If my key is "officially" signed by my bank,
what's the harm if it's also signed by my neighbor? That satisfies
both the people who don't care who I am as long as I'm the one who
puts money into the account, and the people who care who I am. My
neighbor's certification doesn't dirty the key.
It's difficult to compare hierarchical systems with network
systems because they serve different constituencies. Hierarchical
systems are totally about limitation of liability (web of
distrust?). The interesting thing about hierarchical systems is
that it's hard to explain what they do - you really have to get
creative to construct a situation where a certificate of this
kind is the best mechanism. It's rare that people do business on a
basis that needs that kind of assurance without first getting
involved enough to exchange reliable keys. The exception is online
credit-card-like payments where one assumes the signature would be
certified by the host institution.
A lot of the disagreement on this topic stems from incoherence
about what the certificate is for. The problem is that there isn't
one answer. Every certificate needs to have the purpose expressed
or implied. A credit card says something about who you are and
your financial resources. It's useless identification if you are
applying for a job as a piano player - that needs a musician's
union card, and the club owner couldn't care less about your financial
resources or your absolute identity. Similarly, in cyberspace,
each correspondent will be looking for certification relevant to
the transaction. If we're going to avoid having dozens of keys, we
want something like PGP that supports the variety that the real
world forces on us.
Well, there are several independent questions here. As it happens, no,
I would not believe that PRZ would be willing to "subvert" a signature
in such a case -- but I cheerfully acknowledge that that's a personal
judgement based on my knowledge of his character.
More generally, however, what does it mean for him to "subvert" a
signature? By signing a key, he's stating that he believes that a
key belongs to a particular user. If he makes that statement in error,
that simply means that the key/user mapping that he validated is not
correct. Because trust isn't transferable, this isn't the sort of thing
that will "infect" the database; I shouldn't be more trustworthy because
I've got PRZ's signature on my key. Any realistic threat model is
going to have to accept the possibility of compromised keys under
any circumstances -- if PRZ mis-signs a key, it's no greater a threat
than a compromised one.
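The distinction drawn here between a signature making a key *valid* and
a signer being *trusted* can be sketched in a few lines (a hypothetical
toy data model, not PGP's actual implementation):

```python
# Toy illustration: a signature confers validity on a key only if this
# user has explicitly marked the signer as a trusted introducer. Trust
# does not chain: a valid key's owner is not thereby an introducer.
trusted_introducers = {"PRZ"}        # set locally by this user

signatures = {                       # key owner -> who signed that key
    "alice": {"PRZ"},
    "bob": {"alice"},
}

def is_valid(key_owner: str) -> bool:
    """Valid iff signed by at least one introducer this user trusts."""
    return bool(signatures.get(key_owner, set()) & trusted_introducers)

print(is_valid("alice"))  # True: signed by PRZ, whom this user trusts
print(is_valid("bob"))    # False: alice's key is valid, but alice was
                          # never designated an introducer, so her
                          # signature transfers nothing
```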
>This is not an idle question nor a personal slam. For any certification
>system to be widely accepted among arms-length individuals or commercial
>users it must have at least:
>
>1. A past history of trust. Note that "trust" does not mean
>"trust-in-a-cause" or "trust except for ..." to the general user;
Trust in what? You're not defining this.
>2. Uniqueness of a certifier's certificates. Once a certificate
>is issued for us...@site.dom, no other can be issued by a certifier
>unless the certificate holder securely revokes his earlier certificate;
This is stupid. Why can't I have multiple keys, the same as I have
multiple phone numbers, multiple cars, or multiple credit cards?
>3. Near-iron-clad certification rules/protections.
None of which address the superset problem of compromised keys.
>4. Legal liability for malfeasance and sufficient resources to make a
>malfeasance judgement practical;
This is up to the insurance agency.
>5. Credibility.
Isn't this redundant?
Patrick
> Depending on how YOU set up YOUR PGP web of trust, you can have
> This means that anyone can start their own PGP Certifying Authority,
> and you can choose to trust it, or not trust it. You can choose to
> trust it on the basis of nuclear-key technology, or on the basis of
> you know the certifier's brother-in-law. On the basis of an
> affiliation with a governmental agency, or the profound absence
> thereof. Some would rely upon, for example, an NSA or Commerce
> Department certification, while others would sooner trust the tooth
> fairy.
>
This sounds vaguely familiar, like the same criteria for secret sharing of
any key. As opposed to the government, the tooth fairy never let me
down. But it was a cash-and-carry transaction, as I remember.
We wish we could trust anything that said *US Government* on the side, but
their conduct is testimony against doing that. We'd best keep an open eye
on them.
--
WTShaw, Macintosh Crypto Programmer
wts...@htcomp.net wts...@itexas.net
To Err is Human.
To Not Correct Your Mistakes is Bad Form.
The comp.security.pgp FAQ, Appendix II, which notes:
>This part of the FAQ is (c) by Boudewijn W.Ch. Visser.
has this to say about IDEA:
> In general, it is very difficult to make definite pronouncements about
> the security of encryption algorithms. With the exception of the One
> Time Pad, no algorithm has any formal, mathematical proofs about its
> security. Some necessary (but not sufficient) conditions are known,
> and IDEA meets all those conditions. In addition to that, the longer
> an algorithm survives attacks without being broken, the more it is
> trusted. IDEA has been a challenging target for cryptographers for
> 6 years now (1996).
and it has this to say about RSA:
> Things that are not yet known : there are a few things that are not
> clear with regard to RSA (and many other public-key cryptography
> systems as well).
>
> It has not been _proven_ that breaking RSA is as hard as factoring n.
> Therefore, it just might be that there is some method for breaking RSA
> that does not require the factorization of a large number. And second,
> factoring itself has not been _proven_ to be a truly hard problem.
----------------------------------------
So while we are justified in having high confidence in the security of PGP,
we nevertheless cannot claim that actual mathematical proof exists.
-
Mike Naylor - "mike-dot-naylor-at-mail-dot-serve-dot-com"
PGP Key 0x49F92C49 fingerprint B32D0E3B3DC3684B 7306AF549BBB5BE7
Play Five by Five Poker at http://www.serve.com/games/
*>This is stupid. Why can't I have multiple keys, the same as I have
*>multiple phone numbers, multiple cars, or multiple credit cards?
The problem David is addressing here is the one where I approach a
signer with a key which I claim is yours and have them sign it as yours,
when there is already a key of yours in circulation. Of course the key I
sign as yours cannot be read by you, but my hope is that others will use
that key for top secret messages to you that I will then be able to
read.
You could get around the problem of you wanting to have multiple keys by
you using your one key signed by the CA, to sign your other keys.
Presumably you would not sign keys which are not your own. (That is of
course not clear if "you" refers to a company, but I guess that is a
problem in any system.)
Note of course that David's safeguard works as long as there is just one
signing authority in the world. Once there are 10 or 100, it becomes
much much more difficult to ensure that this kind of scam does not
occur. Can you really see all these competitors exchanging all of their
databases with each other to ensure no duplicate keys?
--
Bill Unruh
un...@physics.ubc.ca
On 10 Mar 1997 12:58:27 +0000, Paul Leyland
<p...@sable.ox.ac.uk> wrote:
>vis...@ph.tn.tudelft.nl (Boudewijn W. Ch. Visser) writes:
>
>> >1. Obviously Neil Barrett does not understand how public key
>> >encryption really works. To come out with a statement that the
>> >cracking time is in no way dependent on key sizes almost beggars
>> >disbelief.
<snip>
I wonder if this Neil Barrett also suggested that his people
use a version of MS Internet Explorer?
-----BEGIN PGP SIGNATURE-----
Version: 2.6.2
iQCVAgUBMyeMEnDae2odWWzpAQGNxAP+NRDf6zTU5L8FILDolks+2rWMD/XltJ/w
DrJEDERvhXglpXncjH/Clx0X3u+Ss212SVfqtorOXywsXO3AfrS7kyZLPzM1WLpx
Y6xbyF5GDTZZKEslP6SJJGSnl8RVH+/rAgloNt4VfSkd4JEF3rhmJG8tq5tHtU72
wBeNUTakQJg=
=WYVA
-----END PGP SIGNATURE-----
-Opus-
I am an idealist. I don't know where I am going
but I'm on my way.
-Carl Sandburg
--------------------------------------------------------
Home Page at http://www.lava.net/~opusdpen/
PGP Key found at http://www.lava.net/~opusdpen/key.html
Key ID = 0x6FD4259D (4096 bit key)
Key ID = 0x1D596CE9 (1024 bit key)
Given your "nonsense", I reply: what a silly response! In a hierarchical
system, the system enforces the standards. In a web of trust the system
does no such thing, and each individual user is forced to roll his own,
usually with much manual intervention and no guarantees that others may
count on other than assertion--and even that, case-by-case.
> If "web of trust" breaks the hierarchical system in the sense that
> the availability of the former results in insufficient demand for
> the latter, that's how the cookie crumbles.
What?
>
> > We have seen attempts (RIPEM, for example) to extend hierarchical
> > certification to accommodate web of trust. Such systems are hybrid, not
> > super- or sub-sets of hierarchical.
>
> The issue is whether web-of-trust can include hierarchical (which is
> trivially easy, as explained above), not whether hierarchical can be
> readily extended to web-of-trust (which is more difficult, except by
> simply implementing hierarchical as a special case of web-of-trust).
Sure, and if my bubbe had baytzim she'd have been my zayda. The fact is
that she didn't and wasn't.
>
> > What is more, PGP contains no mechanisms for enforcing
> . ^^^^^^^^^
>
> What's that duck doing in here, and how did it get a $100 bill
> in its beak?
What ARE you talking about?
>
> > hierarchical
> > certification, so the notion that web of trust is a superset of
> > hierarchical, even if it were true, is non-operational in any current
> > discussion of PGP.
>
> Again, one simply sets the trusted signatures to be those you choose
> to establish as one's signing authorities. This ain't nuclear physics.
No, it's simple logic. Try again.
Putting it another way, that I can calculate velocity and acceleration
with arithmetic doesn't make arithmetic differential calculus. Just so,
simply because I can twist web of trust into something quasi-hierarchical
by having some subset of users adopt voluntary, software-unenforceable
side conditions doesn't make it hierarchical in the sense that such
real-world systems have been defined, standardized, and well understood.
David
P.S. Don't plan on seeing a response to the next rationalization. The
thing speaks for itself.
PGP's web of trust can be subverted by any user who wishes to, simply by
using lax signing standards. In contrast, PKCS and other defined
hierarchical systems contain many protections, even including
nuclear-missile-launch-grade authentication boxes for key certification
and revocation.
>
> Central certifying authorities have a different class of problem. One
> example is Microsoft's authenticode that certifies ActiveX apps. That
> certification does not prevent sophomores from exploiting ActiveX
> weaknesses to do such mischief as formatting your hard drive.
This discussion is about the robustness of hierarchical systems vs. web
of trust per se. Authenticode has nothing to do with weaknesses in
ActiveX, nor do the weaknesses you mention have anything to do with
authentication. If web of trust were used to certify ActiveX applets the
same problems would exist with its ability to do mischief.
>
> Many recent postings related to PGP focus on 1) aspersions upon the
> character of the author regarding his development and distribution of
> the original freeware;
There were no aspersions. There was a recitation of historical facts and
his own words as quoted in interviews. There were certain logical
conclusions that inevitably flowed from that. For instance, a nutty
rendering of patent law is nutty per se, and not an "aspersion on the
character of an author". The act of infringing is the act of
infringing--it's physics or, if you prefer, existential, not a matter of
aspersions with respect to character.
As to conclusions about trustworthiness based on past trust-related
behavior, those are matters of simple logic rather than of "aspersions".
Aspersions (other than sprinkling with holy water) are acts of
calumniation, and calumniation is the spreading of deliberately false
statements about a person. None of my comments were false, much less
deliberately so. May I recommend a good dictionary? Try
Merriam-Webster's Collegiate Dictionary, Tenth Edition. It is readily
available in discount book stores and I have found it to be both
excellent and technically up-to-date. As the late Senator Sam Ervin once
said, "I understand the English Language. It's my mother tongue."
> 2) third-hand accounts of UK security "experts"
> claiming that breaking PGP is trivial;
I never said any such thing. Here you're talking about someone else.
> 3) criticism of the security of
> the "web of trust".
Perfectly valid.
> PGP continues to be the leading world-wide program
> for private encryption,
Mostly because of its notoriety due to its giving the finger to
established order (as in ITAR, patents, and copyright). Also false.
Vastly more encrypted messages are sent in Lotus Notes alone each day
than in PGP. That number runs in the millions.
>attacking both IDEA and RSA keys remains, to
> the best public knowledge, a matter of brute force attack,
Although attacks on IDEA are by brute force, attacks on RSA are by
factoring.
> Phil
> Zimmermann was just appointed to the CPSR Board of Directors,
I've had a lot to say about some of the nonsense CPSR has posted in the
past disguised as press releases. Although they often do good work which
I support, they seemingly equally often do shoddy work. I don't think
much of the caliber of some of their people, judging from their
work. There's been an organized campaign (described elsewhere by
another) to "protect" Phil against what I think to be perfectly valid
charges by inviting him to lecture at this and that, "honoring" him,
etc. This is about rejectionist left-wing politics thinly camouflaged in
a mantle of civil liberties, in my opinion, not about the facts of the
matters we've been discussing.
>and the
> web of trust is as secure as each user determines and configures.
^^^^
Exactly. That's why it's not a superset of hierarchical.
David
That's odd; my personal threat model indicates just the opposite.
Because most of the personnel involved in a large Verisign-style operation
will not be directly affected by factors such as share price, there's
not much incentive for them not to corrupt the database.
How many of the employees of TRW have a personal stake in the
accuracy of their database?
Patrick
Well, no, it's not. You can do the same thing by merely guessing a
person who doesn't yet have a key and getting a key in their name.
David is demanding a non-solution to a non-problem.
Patrick
You're wrong. RSA is attacked by factoring. Breakthroughs in factoring
have occurred at regular and near-predictable intervals in the past, and
there's no reason to believe that won't continue. And that's what we
know about in the open literature. The NSA has lots of money, big
machines, and very smart mathematicians. Who knows what factoring
breakthroughs they've come up with in secret?
David
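The point that RSA falls to factoring rather than key-by-key brute force
can be made concrete with textbook-sized numbers (the primes here are
tiny and purely illustrative; real moduli run to hundreds of digits):

```python
# Why factoring breaks RSA: once the attacker has the factors p and q of
# the public modulus n, the private exponent d follows immediately.
p, q = 61, 53                 # the attacker's factoring result (toy sizes)
n = p * q                     # 3233, the public modulus
e = 17                        # public exponent
phi = (p - 1) * (q - 1)       # 3120
d = pow(e, -1, phi)           # private exponent (Python 3.8+ modular inverse)

message = 65
ciphertext = pow(message, e, n)      # what an eavesdropper intercepts
print(pow(ciphertext, d, n))         # 65 -- the attacker recovers the message
```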
Let's do an abstract thought experiment here without reference to
particular persons.
Exactly what kind of "social responsibility" is it to take another's
intellectual property without permission, and then be instrumental in
its wide dissemination to the detriment of the property owner? What kind
of social responsibility is it to unilaterally trample on the rights of
others in aid of one's own hobby-horse "cause"?
And would you find it contemptible if copyright were claimed after
trampling on another's rights in this way?
I say it would be another case of those who utter shrill cries of "Power
to the People!" meaning "Power to me and my friends." and to hell with
everyone else's rights.
David
I'm trying to decide if you're maliciously lying or willfully clueless.
Given that it requires an active decision on the PGP user's part to
trust a particular introducer, I fail to see how "any user" can
corrupt the web of trust.
Patrick
> >
> >1. A past history of trust. Note that "trust" does not mean
> >"trust-in-a-cause" or "trust except for ..." to the general user;
>
> Trust in what? You're not defining this.
Trust that someone won't bend the rules for ideological or other
reasons. Someone might not like oil companies, for example, and might
seek to subvert their security via a crypto system with a back door. Not
speaking about any particular person, past willingness to infringe
another's intellectual property and worse, to deny obvious malfeasance
would be a pretty good indicator of untrustworthiness.
>
> >2. Uniqueness of a certifier's certificates. Once a certificate
> >is issued for us...@site.dom, no other can be issued by a certifier
> >unless the certificate holder securely revokes his earlier certificate;
>
> This is stupid. Why can't I have multiple keys, the same as I have
> multiple phone numbers, multiple cars, or multiple credit cards?
You can, but you must name them differently. What would be stupid would
be to have them indistinguishable. How would someone know which key you
were using?
>
> >3. Near-iron-clad certification rules/protections.
> None of which address the superset problem of compromised keys.
You need to be more specific before I can tell what you're talking
about.
>
> >4. Legal liability for malfeasance and sufficient resources to make a
> >malfeasance judgement practical;
>
> This is up to the insurance agency.
So what? It's still needed. And the cost of the premiums as well as
commonly included negligence clauses in insurance policies will still
compel caution.
>
> >5. Credibility.
> Isn't this redundant?
Perhaps you don't understand the problem. I would find (not speaking of
any particular person) a past infringer who produced a particularly good
and auditable crypto system credible but untrustworthy. I would find a
novice with his first crypto system, who had an impeccable personal
reputation, trustworthy but not credible.
David
> Note of course that David's safeguard works as long as there is just one
> signing authority in the world. Once there are 10 or 100, it becomes
> much much more difficult to ensure that this kind of scam does not
> occur. Can you really see all these competitors exchanging all of their
> databases with each other to ensure no duplicate keys?
The easy way to deal with this is for the user to use only one
certifying authority, and publicly repudiate any keys which show up
purporting to be his but signed by some other authority.
Of course with the higher classes of signatures (above the simplest
"persona" certificate), identities are checked before a certificate is
issued, so spoofing is much harder.
David
Be careful how you interpret this. It is simply wrong (like some other
things Phil has said about intellectual property law) with respect to
most organized electronic commerce, in which sellers must be able to rely
on the keys of buyers unknown to them belonging to particular
individuals. Class 2 hierarchical certification and above is indicated
here, not web of trust.
Phil's ideology--"centrally-controlled corporate or governmental
institutions" vs. "decentralized grassroots"--colors the facts and can
easily lead to, uh, er, um, misinterpretation. The need to bind a
key certificate to a particular person exists whether IBM is selling the
goods or Joe Artisan is. Most Joe Artisans are simply not going to take
the trouble to do the personal checking needed to run a web of trust and
issue their own certificates, nor does there exist, for the general PGP
user, an accepted signer who is widely trusted to use the same kind of
checking Verisign does.
David
Hardly. It's definitional. Hierarchical certification, as I've defined
it in this discussion--as an empirical rather than a theoretical concept,
with published standards and existing exemplars--requires the software
system to enforce it. Since PGP does not enforce a hierarchy when you use
web of trust AS IF it were hierarchical, it isn't hierarchical.
David
As I understand it, Verisign uses SafeKeepers for all but the lowest
level of "persona" certificates. And the RSADSI web site documents on
PKCS, well before Verisign split off, seem to require them for parties
wishing to become certificating agencies for all but persona
certificates.
>
> If you have a high value key, which only needs to be used extremely rarely-
> for example, one only used to sign issuers at the next level - it may be
> easier simply to use more conventional means of physical security; a state
> treasury, or National Guard armoury. The lower the value of the key, the
> less extreme the protection you need.
Since Verisign uses them (as I understand it) for even Class 2
certificates (which cost all of $12 per year and involve only a national
data base check), I don't see that this analysis is borne out in
practice.
>
> As you know as well as anyone here, trust is a matter of economics and risk;
> if it costs $10M to introduce measures that will reduce misuse by $1M, it's
> better to let the $1M slide. If it only costs $100,000 just do it.
This is a poor heuristic if you're trying to build public confidence in
a new certificating institution. Verisign appears not to be following
it, by and large.
David
[snip]
>> >5. Credibility.
>> Isn't this redundant?
>
>Perhaps you don't understand the problem. I would find (not speaking of
>any particular person) a past infringer who produced a particularly good
>and auditable crypto system credible but untrustworthy.
Ah. This sort of repeated malicious insinuation, as a thinly
disguised attempt at character assassination, answers my earlier
question about whether you were maliciously lying or willfully
clueless.
I don't think there's anything left to say; when the strongest
argument that can be made against the PGP "web of trust" involves this
sort of innuendo and ad hominem attack in place of actual content
or technical issues, the NSA shills have rather obviously lost.
Consider yourself plonked, Sternlight.
Patrick
: Let's do an abstract thought experiment here without reference to
: particular persons.
: Exactly what kind of "social responsibility" is it to take another's
: intellectual property without permission, and then be instrumental in
: its wide dissemination to the detriment of the property owner?
Oh dear. That again.
This "intellectual property" is the right to obtain a royalty from
those who wish to raise A to the power B mod C in order to gain
security.
We should be extremely wary of monopolies in general, and especially
of those who would seek to have a monopoly on arithmetic. It is not
appropriate for a government to grant anyone a monopoly on performing
modular exponentiation, for any purpose, and by any means.
: What kind
: of social responsibility is it to unilaterally trample on the rights of
: others in aid of one's own hobby-horse "cause"?
I don't think that you're in any position to accuse others of getting
on hobby-horses.
: And would you find it contemptible if copyright were claimed after
: trampling on another's rights in this way?
: I say it would be another case of those who utter shrill cries of "Power
: to the People!" meaning "Power to me and my friends." and to hell with
: everyone else's rights.
This is transparent flame bait.
*plonk*
Andrew.
>Ed Stone mentioned CPSR.
>
>Let's do an abstract thought experiment here without reference to
>particular persons.
>
David,
Every negative has a positive: consider the scenario where freedom
fighters in different parts of the world are fighting to advance the true
cause of human freedom: freedom from dictatorship, the right to live in
this world as decent human beings, and the right to worship Almighty God
without fear. Powerful encryption could act to protect the innocent and
advance a truly legitimate social cause.
Dan Frezza
PGP User Id: Dan Frezza <d...@frezza.org>
PGP Key Id: 0x0B6C9381
PGP Fingerprint: 8C E2 78 50 24 80 D7 0C 64 29 D2 3B FE 4B C5 4E
Please remember that at the time, RSADSI was marketing the services
which VeriSign currently sells. You had to use a SafeKeeper to be
a CA in their higher-security hierarchies, and they had a strong
financial interest in you being in their hierarchies so that would
have been what they emphasized in their publicity.
On a technical level, however, the PKCS protocols can be used
with software-only CAs, and RSADSI currently offers toolkits to
do just that.
It seems to me that anyone who decides that hierarchy-based
CAs provide a higher degree of security for their application
is also likely to decide that they need hardware
boxes for the CAs. But that is just a configuration issue
and not an absolute requirement for using the PKCS protocols.
Also, I would be wary of comparing BBN SafeKeeper security to
nuclear missile launch security. The BBN SafeKeeper is a small
box, with keys from the shareholders inserted sequentially as
I understand it. Thus it is vulnerable to a key-theft attack
by a single individual. According to Hollywood, on the other
hand, nuclear missile launch requires simultaneous operation
of two keys sufficiently far apart that no single individual
could do it. Yes, at least one screenwriter came up with an
attack on that using a mechanical device, but simple human
laxity is not sufficient.
Larry Kilgallen
My impression is that most VeriSign personnel are in Marketing :-).
The only ones we care about are those with "keys to the kingdom".
In the case of those who are so empowered, one needs a positive
incentive for them to corrupt the database. Could some political
cause provide that incentive -- yes. Would a sufficient number of
the holders of the crypto-shares subscribe to that same political
cause so strongly as to risk personal freedom -- unlikely. Such
an attack, remember, is only useful if it goes undetected. Once
detected the public loses faith in VeriSign and lawyers for the
stockholders rev up their engines.
> How many of the employees of TRW have a personal stake in the
> accuracy of their database?
In contrast to VeriSign, TRW does not gain business by the accuracy
of their database. Their success depends instead only on avoiding
"false positives" (giving good reports undeservedly). It is quite
possible for them to continue to gain customers with many "false
negatives" since those adversely affected by "false negatives" are
not directly their customers.
The business of VeriSign, GTE Cybertrust, and the others is quite
dependent on total accuracy, with customers unwilling to tolerate
either signatures which work when they should fail or signatures
which fail when they should work.
Larry Kilgallen
[...]
>Perhaps you don't understand the problem. I would find (not speaking of
>any particular person) a past infringer who produced a particularly good
>and auditable crypto system credible but untrustworthy.
This mannerism of "not speaking of any particular person", immediately
followed by a clear indication of who you are speaking of, is rapidly
growing old.
Anno
From the flow of the sentences, it seems to me that you are claiming
that this problem is unique to central certifying authorities. If this
is indeed the case, I would appreciate if you could enlighten us
about how web-of-trust mode will not have this problem.
As I mentioned in another post, signing alone has nothing to do with
whether you trust the signed object. Signing merely establishes its
digital identity. If this is acceptable, I don't see any reason why
web-of-trust mode would perform better in this respect.
The rationale behind using a digital signature to guarantee digital
security ( such as to deliver a secure ActiveX program to the users )
is built on two assumptions :-
(1) You can establish the code owner's identity.
    - This is obtained from the digital signature.
(2) The owner of the signed object wants to be ( or has to be )
    accountable, and failing to do so would jeopardize its long-term
    establishment.
    - For example, a bank would want to project the image to its
      customers that it wants to be accountable. You can obtain this
      from ( but not limited to ) your web-of-trust, word-of-mouth,
      somebody authoritative, the CA, the government or your past
      dealing/interaction with the key owner.
Somehow the law of human establishment dictates that people who want
to establish their identity will normally act in an accountable manner
( do you think a criminal would want to reveal his identity? ). But
that doesn't necessarily mean that item (1) will always be accompanied
by item (2).
( A very good example of having item(1) but not item(2) is commonly
seen in newsgroup postings. A lot of people who post want to establish
their identity but not accountability ).
In the case of a malicious ActiveX program, it lacks item (2).
So it doesn't matter whether it is central signing or web-of-trust
signing. There is nothing to stop PRZ ( who has all the necessary
web-of-trust signatures ) from one day becoming totally untrustworthy.
Ming-Ching
To see "a clear indication of who(sic) you are speaking of" when
the author has specifically disavowed any specific example is
unproductive paranoia in this discussion. You can hold Mr. Sternlight
to the wording he has chosen and displayed to all. Whether you
suspect Mr. Sternlight would trust certain people and not others
is immaterial to the discussion.
To choose an example which might fit Mr. Sternlight's example,
but not what you are reading into his example, consider the US
National Security Agency. Many would categorize them as a past
"infringer" on privacy. Most would consider them as technically
capable of producing a good and auditable cryptosystem. If they
do that, some might judge them "credible but untrustworthy". NSA
is not a "person" but other than that it is a good match.
Larry Kilgallen
Patrick Juola wrote in sci.crypt:
> I'm trying to decide if you're maliciously lying or willfully
> clueless.
Please keep David Sternlight Vs. The World, Part CXXVII, out of
sci.crypt. Thanks.
-Lewis
==>"TRW does not gain business by the accuracy
==>of their database. "
That statement is illogical and totally wrong.
sal-
David Sternlight <da...@sternlight.com> wrote in article
<3325F8...@sternlight.com>...
> [Verisign] Level one means that nobody can spoof the persona. That
> is--you can be sure that there is a continuity of conversation with
> the holder of that certificate.
> . . .
> In contrast, we have no way of knowing that a web of trust signer won't
> sign two different keys claiming the same persona and e-mail address.
If by that you mean that there could exist two keys with the same User Id,
yes, of course. But to 'spoof a persona' would mean that the two keys
could not be told apart, and that is not what happens with PGP.
If I were (hypothetically) holding a conversation with you, and someone
(hypothetically) wanted to be mistaken for you, he could not simply sign
the message with a key labeled "David Sternlight" and thereby fool me into
accepting his identity. The key, or "certificate" ID would not be the same,
and no reasonable amount of monkeying with it would make it so.
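To make the distinction concrete, here is a minimal Python sketch (a toy
model, not real PGP code or its actual fingerprint algorithm) of why two
keys carrying the same User ID can always be told apart: the identifier
is derived from the key material itself, not from the name attached to it.

```python
import hashlib

def key_id(public_key_bytes: bytes) -> str:
    """Derive a short identifier from the key material itself,
    in the spirit of a PGP key fingerprint (toy version only)."""
    return hashlib.sha1(public_key_bytes).hexdigest()[-16:]

# Two different keys, both labeled with the same persona:
key_a = {"user_id": "David Sternlight", "material": b"\x01\x02 first key"}
key_b = {"user_id": "David Sternlight", "material": b"\x42\x43 second key"}

# Same User ID string...
assert key_a["user_id"] == key_b["user_id"]
# ...but distinct key IDs, because the ID comes from the key material:
assert key_id(key_a["material"]) != key_id(key_b["material"])
```

A forger can copy the label freely, but cannot produce a second key whose
derived ID matches the original without breaking the underlying hash.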
You send the revocation certificate to a key server and to anyone with
whom you regularly correspond. Since a revocation certificate cannot
itself be revoked, this will be enough.
The one thing that you do not have to worry about is that the thief will
revoke the key himself. If I steal your private key, the last thing
that I want is to revoke it. That would be like someone who picks your
pocket calling Visa and reporting your cards stolen.
> In an institutional CA model, one can check a certificate revocation list
> in situations where reliance is related to important matters. Is this
> possible when one relies upon a web of trust? If not, is this flaw fatal to
> the web of trust model?
Since a revocation certificate is signed with the private key of the
public key being revoked, anyone holding the public key can verify for
themselves that the revocation is genuine and the key is no good.
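The self-signing property described above can be sketched in a few lines.
This is a toy illustration using textbook RSA with deliberately tiny
numbers (nothing like PGP's actual packet format or key sizes): the
revocation notice is signed with the very private key it revokes, so the
matching public key alone suffices to check it.

```python
import hashlib

# Toy RSA parameters -- far too small for any real use.
p, q = 61, 53
n = p * q                            # public modulus
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent

def toy_sign(message: bytes) -> int:
    """Sign the message hash with the private key."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def toy_verify(message: bytes, signature: int) -> bool:
    """Verify using only the public key (e, n)."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

revocation = b"KEY REVOKED: this key is no longer valid"
sig = toy_sign(revocation)

# Anyone with the public key can confirm the revocation is genuine:
assert toy_verify(revocation, sig)
# A tampered signature fails verification:
assert not toy_verify(revocation, sig + 1)
```

The design point is that no trusted third party is needed to check a
revocation: possession of the public key being revoked is enough, which
is why the revocation can be flooded to key servers and correspondents.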
--
#!/bin/perl -sp0777i<X+d*lMLa^*lN%0]dsXx++lMlN/dsM0<j]dsj
$/=unpack('H*',$_);$_=`echo 16dio\U$k"SK$/SM$n\EsN0p[lN*1
lK[d2%Sa2/d0$^Ixp"|dc`;s/\W//g;$_=pack('H*',/((..)*)$/)