The mistakes in Shannon's proof are related to certain limitations of
probability theory, limitations that Shannon himself did not recognize
when developing his information theory.
From the viewpoint of probability theory, Shannon recognized the random
uncertainty of events and expressed that uncertainty with probability,
but he ignored the random uncertainty of probability itself. Although it
is never stated directly that probability is a fixed value [11], it can
be seen from many formulas in probability theory and information theory
that probability is always treated as fixed; otherwise the formulas
would be impossible to compute. For example, the entropy formula cannot
be evaluated if the probability is a random variable. Yet the case where
probability is a random variable is the general one: a fixed value is
merely a special case of a random variable.
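The point above can be sketched numerically: with fixed probabilities
the entropy formula yields a single number, but if the probability is
itself a random variable, the entropy becomes a random variable too and
can only be summarized, e.g. by a sample mean. A minimal Python sketch;
the uniform distribution chosen for p is purely an illustrative
assumption, not part of the original argument:

```python
import math
import random

def entropy(probs):
    """Shannon entropy (in bits) of a discrete distribution
    given as a list of fixed probability values."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# With fixed probabilities, entropy is a single computable number.
h_fixed = entropy([0.5, 0.5])  # 1.0 bit

# If the probability p of a binary event is itself a random variable
# (here assumed, for illustration, to be uniform on (0, 1)), then
# H(p) = -p*log2(p) - (1-p)*log2(1-p) is a random variable as well;
# a Monte Carlo sample mean is one possible summary of it.
random.seed(0)
samples = [random.random() for _ in range(100_000)]
h_samples = [entropy([p, 1 - p]) for p in samples]
h_mean = sum(h_samples) / len(h_samples)
```

The sample mean summarizes the entropy's distribution but, as the text
notes for the expectation of a random probability, it discards the
spread and concentration of the possible values.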
For instance, a probability derived from incomplete or unreliable
conditions has more than one possible value; it is not fixed, so the
probability is a random variable and correspondingly carries random
uncertainty [10]. The expectation of such a probability is only a simple
static summary with merely local significance; its full probability
distribution and its degree of concentration are of great significance
for the compromise of probabilities and the reliability of information.
Because probability is always treated as a fixed value in probability
theory and information theory, both theories are limited: they cannot
solve problems in which the probability is a random variable. This has
also led some scholars to believe that once a probability is given it
can never change, since it is fixed. Generally speaking, with fixed
probabilities neither the analysis of the reliability of information
itself nor the fusion of unreliable and incomplete information is
possible. Most information in reality is not absolutely reliable or
complete, so different pieces of information must be compromised and
fused. Since information is expressed by probability, information
becomes unchangeable if the corresponding probability is treated as a
fixed value. Treating probability as a fixed value is one of the
fundamental reasons why information theory cannot be used to study the
reliability of information itself or information fusion, and can only be
used to study reliable communication.
We usually obtain information from the imperfect conditions we know,
and as a stopgap we treat information obtained under imperfect
conditions as if it were obtained under complete conditions. When
imperfect information obtained under imperfect conditions is taken as
information under complete conditions, it is unreliable, because it
differs from the information that complete conditions would yield; the
imperfect information can therefore be regarded as unreliable
information. When we know nothing at all about an event, we cannot even
know how many possible random values there are, let alone the
corresponding probabilities of those values. Therefore the prior
probability we obtain is based on the known conditions, and it is itself
a conditional probability; information and probability are thus relative
to our known conditions. However, this unequal substitution is never
pointed out directly in probability theory or information theory. In
practice, it is easy to overlook the unequal substitution of the
imperfect for the perfect, and confusion and absurdity may result.
confusion and absurdity may appear. When a lot of conditions that
influence the probability of an event are considered and the
conditions are parallel but not in series, there may be more than one
prior probability or posterior probability according to nowadays
probability theory. For example, some conditions can influence birth
gender of baby. Under condition A, the probability of male baby P(M︱A)
is c and under condition B, the probability of male baby P(M︱B) is d.
Now a question occurs: if both of the above conditions exist, how can
we gain the probability of male baby P(M︱A, B)? Other than problems in
nowadays probability theory, in this case the two conditions are
parallel for when A occurs, B is not considered and when B occurs, A
is not considered. and we do not know the conditional probability P(A︱
B) or P(B︱A). It is the same in the example of one-time pad, if the
final posterior probability is considered, it is the compromise of the
two probabilities under two conditions. The problem of compromise is
not settled in nowadays probability theory and is very complex. When
there are some parallel conditions, there may be more than one
probability and the probabilities may be conflicting. The uncertainty
theory and information fusion theory attempts to solve the problem,
but they are not strict and their algorithms are approximate but not
accurate algorithms, what's more, there are absurdities in some
algorithms. It is necessary to develop relevant accurate theory from
the perspective of probability theory.
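The underdetermination described above can be shown concretely: the two
joint distributions over (A, B, M) below use hypothetical, illustrative
numbers chosen only so that both agree on P(M|A) and P(M|B), yet they
give different values of P(M|A, B). The marginal conditionals alone
therefore cannot determine the combined probability:

```python
def conditional(joint, m=1, **given):
    """P(M = m | given conditions), computed from a joint
    probability table keyed by tuples (a, b, m)."""
    num = sum(p for (a, b, mm), p in joint.items()
              if mm == m and all({'a': a, 'b': b}[k] == v
                                 for k, v in given.items()))
    den = sum(p for (a, b, mm), p in joint.items()
              if all({'a': a, 'b': b}[k] == v for k, v in given.items()))
    return num / den

# Two hypothetical joint distributions; both give P(M|A) = P(M|B) = 0.6.
joint1 = {(1, 1, 1): 0.2, (1, 1, 0): 0.0, (1, 0, 1): 0.1, (1, 0, 0): 0.2,
          (0, 1, 1): 0.1, (0, 1, 0): 0.2, (0, 0, 1): 0.1, (0, 0, 0): 0.1}
joint2 = {(1, 1, 1): 0.1, (1, 1, 0): 0.1, (1, 0, 1): 0.2, (1, 0, 0): 0.1,
          (0, 1, 1): 0.2, (0, 1, 0): 0.1, (0, 0, 1): 0.1, (0, 0, 0): 0.1}

for j in (joint1, joint2):
    assert abs(conditional(j, a=1) - 0.6) < 1e-9  # same P(M | A)
    assert abs(conditional(j, b=1) - 0.6) < 1e-9  # same P(M | B)

# Yet the combined conditional probabilities differ:
pmab1 = conditional(joint1, a=1, b=1)  # 1.0
pmab2 = conditional(joint2, a=1, b=1)  # 0.5
```

Knowing P(M|A) = c and P(M|B) = d alone, without the full joint
distribution (or P(A|B), P(B|A)), leaves P(M|A, B) genuinely open, which
is exactly the compromise problem the text describes.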
In addition, probability theory does not recognize the complexity of
conditions: some conditions may be hidden and may interact. It is
therefore necessary to carefully identify and strictly distinguish each
condition, and to realize that probabilities under different conditions
are not the same. For example, the probability may be easy to calculate
when only condition x or only condition y holds, but when x and y
coexist the corresponding probability is difficult to calculate. This
makes the ready-made formulas of probability theory hard to apply when
the conditional probability is unknown. Information theory does not
consider these issues either, so it cannot be used in fields such as
information fusion and artificial intelligence. Because of this problem,
Shannon failed to discover and distinguish these different conditions in
his proof [10].
From Shannon's result PE(M) = 1/n in his example, we can see that
Shannon confused the posterior probability under all the conditions with
the imperfect probability obtained when only the ciphertext and the OTP
were considered (the prior probability, as a condition, was not
considered; otherwise the result would be impossible unless the prior
probability happened to be P(M) = 1/n). This may be attributed to the
limitation of probability theory, which neglects the discrimination and
recognition of different conditions.
In the case of the OTP, the uncertainty of the plaintext increases after
the compromise and fusion, yet the conditional entropy of information
theory does not increase, which appears to contradict information
theory. In fact, Shannon's conditional entropy is just one kind of
weighted average of a series of conditional entropies, and it has been
pointed out that Shannon's conclusion is not absolutely correct [12].
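The claim that Shannon's conditional entropy is a weighted average can
be made explicit: H(M|C) = sum over c of P(c) * H(M|C=c), so the single
averaged number can mask the fact that individual conditions leave very
different amounts of uncertainty. A minimal sketch with purely
illustrative numbers:

```python
import math

def h(probs):
    """Shannon entropy (in bits) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative setup: a binary plaintext M observed through a binary
# condition C (e.g. a ciphertext value); the numbers are assumptions.
p_c = [0.5, 0.5]                    # P(C = c)
p_m_given_c = [[0.9, 0.1],          # P(M | C = c0): low uncertainty
               [0.5, 0.5]]          # P(M | C = c1): maximal uncertainty

# Per-condition entropies H(M | C = c) differ sharply...
per_condition = [h(row) for row in p_m_given_c]

# ...but Shannon's conditional entropy averages them with weights P(c),
# collapsing the two cases into one intermediate number.
h_m_given_c = sum(pc * hc for pc, hc in zip(p_c, per_condition))
```

The averaged value hides that under condition c1 the plaintext is
maximally uncertain while under c0 it is nearly determined, which is the
sense in which a weighted average conceals the per-condition picture.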
From the above examples we can see that the reliability and completeness
of information are not considered in information theory, yet they are
very important. Information is valuable only when it is reliable to a
certain extent; otherwise it is worthless. However, the relevant
theories of information reliability and completeness (such as
uncertainty theory and information fusion) are far from complete and
systematic; they offer only approximate algorithms, and some even
contain absurdities. Probability theory still lags behind in the study
of these theories.
4. Conclusion
This paper gives a further and deeper analysis of the blemish in
Shannon's proof and points out its root cause. The debates about the
problem with several experts and scholars are also analyzed. The
conclusion is that the OTP is not, as commonly thought, perfectly secure
and unbreakable, although it still has good statistical properties and
superior security. The paper also traces the origin of Shannon's
mistakes to probability theory and information theory, in which
probability is always taken as a fixed value rather than a random
variable. This provides a direction for the development of probability
theory and information theory and helps the two theories expand so as to
solve more problems in reality.
References
[1]. Bruce Schneier, Applied Cryptography, Second Edition: Protocols,
Algorithms, and Source Code in C [M], John Wiley & Sons, Inc., 1996.
[2]. C. E. Shannon, Communication Theory of Secrecy Systems [J], Bell
System Technical Journal, v.28, n.4, 1949, 656-715.
[3]. Yong WANG, Security of One-time System and New Secure System [J],
Netinfo Security, 2004, (7):41-43.
[4]. Yong WANG, Fanglai ZHU, Reconsideration of Perfect Secrecy [J],
Computer Engineering, 2007, 33(19).
[5]. Yong WANG, Perfect Secrecy and Its Implement [J], Network &
Computer Security, 2005, (05).
[6]. Yong WANG, Fanglai ZHU, Security Analysis of One-time System and
Its Betterment [J], Journal of Sichuan University (Engineering Science
Edition), 2007, supp. 39(5):222-225.
[7]. Yong WANG, Shengyuan ZHOU, On Probability Attack [J], Information
Security and Communications Privacy, 2007, (8):39-40.
[8]. Yong WANG, Confirmation of Shannon's Mistake about Perfect Secrecy
of One-time-pad, http://arxiv.org/abs/0709.4420.
[9]. Yong WANG, Mistake Analyses on Proof about Perfect Secrecy of
One-time-pad, http://arxiv.org/abs/0709.3334.
[10]. Yong WANG, On Relativity of Probability, www.paper.edu.cn, Aug.
27, 2007.
[11]. Yong WANG, On Relativity of Information, presented at the First
National Conference on Social Information Science, Wuhan, China, 2007.
[12]. Yong WANG, Question on Conditional Entropy,
http://arxiv.org/pdf/0708.3127.
This work was supported by the Guangxi Science Foundation (0640171) and
the Modern Communication National Key Laboratory Foundation (No.
9140C1101050706).
Biography:
Yong WANG (1977-), male, born in Tianmen City, Hubei Province; Master of
Cryptography. Research fields: cryptography, information security,
generalized information theory, quantum information technology.
School of Computer and Control, Guilin University of Electronic
Technology, Guilin City, Guangxi Province, China, 541004.
E-mail: hel...@126.com, wang197...@sohu.com
Mobile: 13978357217; fax: (86)7735601330 (office).