How should this sentence be understood?


Jaze Lee

Nov 19, 2011, 8:29:35 AM
to pon...@googlegroups.com
the Shannon entropy is a measure of the average information content one is missing when one does not know the value of the random variable
"Shannon entropy is a measure of average information content... when one does not know the value of the random variable."
What does the inserted "one is missing" mean here?

Moore

Nov 19, 2011, 10:31:24 AM
to TopLanguage
Read the whole context; actually I don't fully understand it either:
Shannon's entropy represents an absolute limit on the best possible
lossless compression of any communication, under certain constraints:
treating messages to be encoded as a sequence of independent and
identically-distributed random variables, Shannon's source coding
theorem shows that, in the limit, the average length of the shortest
possible representation to encode the messages in a given alphabet is
their entropy divided by the logarithm of the number of symbols in the
target alphabet.


http://en.wikipedia.org/wiki/Shannon_entropy
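The last claim in that quote can be checked with a small numeric sketch (plain Python; the function name and the 4-symbol source are my own illustration, not from the thread):

```python
import math

def entropy_bits(probs):
    """Shannon entropy of a distribution, in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A source emitting 4 equally likely symbols has entropy log2(4) = 2 bits.
probs = [0.25, 0.25, 0.25, 0.25]
h = entropy_bits(probs)

# Source coding theorem: the shortest average codeword length, measured in
# symbols of a k-letter target alphabet, approaches H / log2(k) in the limit.
binary_bound = h / math.log2(2)   # 2 binary digits per message
ternary_bound = h / math.log2(3)  # roughly 1.26 ternary digits per message
print(binary_bound, ternary_bound)
```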


On Nov 19, 9:29 pm, Jaze Lee <jaze...@gmail.com> wrote:
> the Shannon entropy is a measure of the average information content <http://en.wikipedia.org/wiki/Information_content> one
frank chan

Nov 19, 2011, 10:11:48 PM
to pon...@googlegroups.com
Just my personal understanding, for reference: "the average information content" is the object of "miss" in this sentence. The relative pronoun between "content" and "one" has been omitted; the full sentence would read:
the Shannon entropy is a measure of the average information content which one is missing when one does not know the value of the random variable

2011/11/19 Moore <lyx...@gmail.com>:

maydayzm

Nov 19, 2011, 11:57:56 PM
to pon...@googlegroups.com
"one is missing" modifies the preceding noun; it is a relative clause.

Jaze Lee

Nov 20, 2011, 5:54:45 AM
to pon...@googlegroups.com

I know it is a relative clause. What I don't understand is what the sentence means, i.e. what "one is missing" means once it is placed in the sentence. Literally, "one is missing" is "what a person is missing",
but put back into the sentence it reads so awkwardly: the average information content a person is missing when they don't know the value of the random variable? Is measuring that even meaningful?


2011/11/20 maydayzm <mayd...@gmail.com>
"one is missing" modifies the preceding noun; it is a relative clause.

Xavier Heruacles

Nov 20, 2011, 7:45:04 AM
to pon...@googlegroups.com
Shannon entropy quantifies the expected value of the information in a message. So there must be some information content that is missing, and Shannon entropy is the measure of it; hence the word "missing" for that unknown information content. This is how I understand the sentence.

2011/11/20 Jaze Lee <jaz...@gmail.com>

maydayzm

Nov 20, 2011, 8:59:28 AM
to pon...@googlegroups.com
The "one" in "one is missing" doesn't refer to a person, does it? "one" should refer to the average information content.

songyy

Nov 20, 2011, 8:12:54 PM
to TopLanguage
I think these are two separate sentences; the author simply forgot the period.

On Nov 19, 9:29 pm, Jaze Lee <jaze...@gmail.com> wrote:
> the Shannon entropy is a measure of the average information content <http://en.wikipedia.org/wiki/Information_content> one

Jaze Lee

Nov 20, 2011, 8:27:47 PM
to pon...@googlegroups.com
Can you give me a specific example? Or can you use tossing a coin to explain Shannon entropy? If you toss a coin, what information is lost?

2011/11/20 Xavier Heruacles <xheru...@gmail.com>

Xavier Heruacles

Nov 20, 2011, 11:32:45 PM
to pon...@googlegroups.com
from wiki:
Entropy is a measure of disorder, or more precisely unpredictability. For example, a series of coin tosses with a fair coin has maximum entropy, since there is no way to predict what will come next. A string of coin tosses with a coin with two heads and no tails has zero entropy, since the coin will always come up heads. Most collections of data in the real world lie somewhere in between. It is important to realize the difference between the entropy of a set of possible outcomes, and the entropy of a particular outcome. A single toss of a fair coin has an entropy of one bit, but a particular result (e.g. "heads") has zero entropy, since it is entirely "predictable".
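The two coin cases in that quoted paragraph can be checked numerically; a minimal sketch (plain Python; the function name is my own):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)).
    Zero-probability outcomes contribute nothing and are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair_coin = [0.5, 0.5]    # heads/tails equally likely: maximally unpredictable
two_headed = [1.0, 0.0]   # always heads: fully predictable

print(shannon_entropy(fair_coin))   # 1 bit
print(shannon_entropy(two_headed))  # 0 bits
```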

2011/11/21 Jaze Lee <jaz...@gmail.com>

Jaze Lee

Nov 21, 2011, 12:57:10 AM
to pon...@googlegroups.com

I've read it. However, is there anything there that explains the "one is missing" part specifically?

2011/11/21 Xavier Heruacles <xheru...@gmail.com>

Gentle Yang

Nov 21, 2011, 4:52:47 AM
to pon...@googlegroups.com
Beyond parsing the sentence itself: based on my experience studying information theory and working with real data-analysis tools, the best way to go from unfamiliarity to a working understanding of information entropy is to find a textbook-style resource and read it. Textbooks may seem rigid, but learning is a process; some things you can crack in a few all-nighters, while others need gradual understanding and refinement, sometimes with help from other material.

I remember that when I interviewed fresh graduates, the single question of how well they understood information entropy could reveal how solid the theoretical foundation of a math (or related) graduate really was. There are also concepts such as conditional entropy and mutual information. Once you can come up with a really apt, typical example of your own and use it to explain the idea to someone else, you've basically understood it.

After that comes application: recognizing how to model a problem and use the theory and methods of information theory to solve practical problems. That is harder and needs more practice.

2011/11/21 Jaze Lee <jaz...@gmail.com>

Xavier Heruacles

Nov 21, 2011, 11:02:46 PM
to pon...@googlegroups.com
It seems that in my last reply I already made the word "missing" clear. Entropy is the measure of unpredictability. When you toss a fair coin, the entropy is 1, since you can't tell whether it will be heads or tails. And if you toss it twice, the entropy is 2, since the two tosses have four equally likely outcomes and you don't know which will occur... This is what I understand about entropy in information theory...
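The two-toss figure can be verified directly: two independent fair tosses have four equally likely outcomes, and the entropy is log2(4) = 2 bits. A sketch (the enumeration via itertools is my own illustration, not from the post):

```python
import math
from itertools import product

# All outcomes of two independent fair tosses: HH, HT, TH, TT.
outcomes = list(product("HT", repeat=2))
p = 1 / len(outcomes)  # each outcome has probability 1/4

# H = -sum(p * log2(p)) over four equally likely outcomes = log2(4) = 2 bits.
h = -sum(p * math.log2(p) for _ in outcomes)
print(len(outcomes), h)  # 4 2.0
```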

2011/11/21 Jaze Lee <jaz...@gmail.com>

Jaze Lee

Nov 22, 2011, 12:29:43 AM
to pon...@googlegroups.com
I do not understand the word "missing" yet. When I toss a fair coin, what did I miss?
Maybe I cannot say whether it is heads or tails before I toss, but what do I miss? Is there something you must know before you toss a coin? Well, I suppose you should toss it up rather than down, and you should not cheat while tossing. Is that reasonable?
In my opinion, if we say something is missed by you, then there is some chance that someone else did not miss it. For example, I missed the last bus; in that context, someone else did not miss it, namely the people on the last bus. But in the context of tossing a fair coin, who can know the result before the toss, if nobody cheats?

2011/11/22 Xavier Heruacles <xheru...@gmail.com>

Xavier Heruacles

Nov 22, 2011, 1:29:29 AM
to pon...@googlegroups.com
In information theory, entropy is a measure of the uncertainty associated with a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message, usually in units such as bits. In this context, a 'message' means a specific realization of the random variable.
Equivalently, the Shannon entropy is a measure of the average information content one is missing when one does not know the value of the random variable. The concept was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".

Shannon's entropy represents an absolute limit on the best possible lossless compression of any communication, under certain constraints: treating messages to be encoded as a sequence of independent and identically-distributed random variables, Shannon's source coding theorem shows that, in the limit, the average length of the shortest possible representation to encode the messages in a given alphabet is their entropy divided by the logarithm of the number of symbols in the target alphabet.

Let's just talk about the meaning within the context above; you cannot interpret a word's meaning outside its context. The word "missing" comes from the second paragraph, so here we are talking about information content that is missing WHEN one does not know the value of the random variable. And "the value of the random variable", per the last sentence of the first paragraph, is a "message". So Shannon entropy measures, as an expected value, how much message content is unknown before the value is revealed; hence it is a measure of unpredictability.
Since this is information theory, the main interest is the delivery of meaning (information content) from a source to its destination through some channel. So in the example of tossing a fair coin, there are two possible outcomes (two possible messages to deliver) and one can learn only one of them; hence the entropy of 1....

The above is just my personal interpretation of entropy. I don't know information theory; I only learned about it from your post, and I have read just parts of the Wikipedia entries. All I said is just for your reference. :)

Hope this helps.

2011/11/22 Jaze Lee <jaz...@gmail.com>

Jaze Lee

Nov 22, 2011, 3:23:43 AM
to pon...@googlegroups.com
I do not understand.
For example, in tossing a coin, before I toss it, I know the toss may have two results: heads or tails.
Before tossing: if I guess heads, then I may miss that it is tails;
               if I guess tails, then I may miss that it is heads.
After tossing: if it is heads, I know it; I miss nothing.
               if it is tails, I know it; I miss nothing.
Is that right?
If I want to compute the entropy in this coin-tossing example, how should I do it?
Let X be a random variable representing the result of one toss, with 0 = heads and 1 = tails:
  X    0     1
  p    1/2   1/2
How can I get the entropy?
The expectation is EX = 0*1/2 + 1*1/2 = 1/2,
but how can I get the entropy?
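For reference, the quantity asked about here comes from Shannon's formula H(X) = -sum(p(x) * log2(p(x))), not from E[X]; a minimal sketch in Python (the variable names are my own, not from the thread):

```python
import math

# X: result of one fair toss, 0 = heads, 1 = tails, each with probability 1/2.
outcomes = {0: 0.5, 1: 0.5}

# Expectation: E[X] = sum(x * p). This is the 1/2 computed in the post.
ex = sum(x * p for x, p in outcomes.items())

# Entropy: H(X) = -sum(p * log2(p)). It depends only on the probabilities,
# not on the labels 0/1, so relabeling heads/tails would not change it.
hx = -sum(p * math.log2(p) for p in outcomes.values())

print(ex)  # 0.5
print(hx)  # 1.0 (one bit)
```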


2011/11/22 Xavier Heruacles <xheru...@gmail.com>

Gentle Yang

Nov 22, 2011, 4:31:16 AM
to pon...@googlegroups.com

Xpol Wan

Nov 24, 2011, 7:41:55 AM
to pon...@googlegroups.com
Since translation questions have come up, let me ask one too:
A foreign trainer's agile-training slides quote Miyamoto Musashi:

"Do not develop an attachment to any one weapon or any one school of fighting"

Please explain!


Best Regards!

Xpol Wan
_G['China']['Human']['Male']['Software']['Programmer']['Embedded']['Gaming']['C/C++/Lua']



2011/11/22 Gentle Yang <nod...@gmail.com>

Zhangming Niu

Nov 24, 2011, 7:49:40 AM
to pon...@googlegroups.com
“Do not develop an attachment to any one weapon or any one school of fighting”
 
kanban and scrum
 
It means one's craft should have its specialty.
That attached picture is of a Japanese ninja, right? The idea is that a good ninja should know how to use a sword, a spear, and throwing stars, not treat a sword as a throwing star or a spear.

Applied to software development:
know a variety of tools and languages. For example, don't only know C and be able to solve problems only with C; sometimes you need scripting, functional languages, C#, etc.


 
2011/11/24 Xpol Wan <xpo...@gmail.com>



--
--------------------------------------------------------------------
Best Regards,

Zhangming Niu




Xpol Wan

Nov 24, 2011, 7:54:58 AM
to pon...@googlegroups.com
Thanks!
But how does it read literally, word for word?



Best Regards!

Xpol Wan
_G['China']['Human']['Male']['Software']['Programmer']['Embedded']['Gaming']['C/C++/Lua']



2011/11/24 Zhangming Niu <niuzha...@gmail.com>

Zhangming Niu

Nov 24, 2011, 7:57:02 AM
to pon...@googlegroups.com
"Do not develop a universal weapon, or a universal style of fighting."

2011/11/24 Xpol Wan <xpo...@gmail.com>

Xpol Wan

Nov 24, 2011, 8:01:43 AM
to pon...@googlegroups.com
So it's "Do not become attached to any one weapon or any one school of martial arts"?

It finally dawns on me: "attachment" here means attachment in the sense of affection or fondness, not an email attachment, and "develop" means "form", not "develop software".

What a tragedy. This is what happens when you study computing before learning English properly.



Best Regards!

Xpol Wan
_G['China']['Human']['Male']['Software']['Programmer']['Embedded']['Gaming']['C/C++/Lua']



2011/11/24 Xpol Wan <xpo...@gmail.com>

Xavier Heruacles

Nov 24, 2011, 8:05:57 AM
to pon...@googlegroups.com
The tragedy is trying to learn English through Chinese.
The tragedy is that English-Chinese dictionaries are all essentially rubbish.

2011/11/24 Xpol Wan <xpo...@gmail.com>

Zhangming Niu

Nov 24, 2011, 8:12:14 AM
to pon...@googlegroups.com
...Actually, I translated from the English back into Chinese.

2011/11/24 Xavier Heruacles <xheru...@gmail.com>

陈文龙

Nov 25, 2011, 6:17:19 AM
to pon...@googlegroups.com
"Do not become attached to any weapon or any school of martial arts."

2011/11/24 Zhangming Niu <niuzha...@gmail.com>

Liuyizhe

Nov 26, 2011, 12:51:50 AM
to pon...@googlegroups.com
That's the correct reading, haha.

Sent from my iPad