Entropy and Information


Evgenii Rudnyi

Dec 25, 2010, 7:33:14 AM
to embryo...@googlegroups.com
Entropy is often used in biology, and I guess that many biologists
associate it with information. Some time ago I had a discussion about
these matters on biotaconv, and now I have summarized my viewpoint:

http://blog.rudnyi.ru/2010/12/entropy-and-artificial-life.html

Merry Christmas and Happy New Year,

Evgenii

Dr. Richard Gordon

Dec 26, 2010, 4:45:09 AM
to Evgenii Rudnyi, embryo...@googlegroups.com, biot...@lists.ccon.org
Sunday, December 26, 2010 4:42 AM, Santee, SC
Dr. Evgenii Rudnyi
CADFEM GmbH, Germany

Dear Evgenii,
I tried to post this comment on your blog, but got an error message:

*******************
The approach of Jaynes does not work for irreversible thermodynamics:

Hill, T.L. (1966). Studies in irreversible thermodynamics, V. Statistical thermodynamics of a simple steady-state membrane model. Proc Natl Acad Sci USA 55(6), 1379-1385.

The concept of information and entropy in a communication channel was introduced by:

Shannon, C.E. (1948). A mathematical theory of communication [corrected version]. Bell System Technical Journal 27, 379–423, 623–656.

Shannon, C.E. & W. Weaver (1949). The Mathematical Theory of Communication. Urbana, Univ. Illinois Press.

The literature applying this to biology is huge, undoubtedly also with some error. But Shannon would be your starting point to explore this topic further.

Yours, -Dick Gordon gor...@cc.umanitoba.ca
********************
Richard (Dick) Gordon
Visitor, Camera Culture, Media Lab, MIT
Visitor, Department of Mechanical & Aerospace Engineering, Old Dominion University
gor...@cc.umanitoba.ca

Evgenii Rudnyi

Dec 26, 2010, 7:39:24 AM
to embryo...@googlegroups.com
Dear Dick,

Thank you for your references. I am aware of Shannon's works. I believe
he simply used the term entropy, without claiming that his entropy is
the thermodynamic entropy. A citation from his paper:

"The form of H will be recognized as that of entropy as defined in
certain formulations of statistical mechanics8 where pi is the
probability of a system being in cell i of its phase space. H is then,
for example, the H in Boltzmann�s famous H theorem."

He merely shows that the equation has the same form; he does not make a
statement about the meaning of such a similarity. After all, there are
many cases where the same equation describes completely different
phenomena. For example, the Poisson equation of electrostatics is
mathematically equivalent to the stationary heat transfer equation. So
what? Well, one creative use is for people who have a thermal FEM solver
but no electrostatic solver: they can solve an electrostatic problem
with the thermal FEM solver by means of the mathematical analogy, as
sketched below.
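
For what it is worth, here is a minimal sketch of that analogy: the same
toy finite-difference Poisson solver can be read thermally or
electrostatically. The grid, the source, and the zero boundary condition
are illustrative choices of mine, and a real FEM code is of course far
more elaborate:

import numpy as np

def solve_poisson(f, h, iterations=5000):
    # Solve -laplace(u) = f on a square grid with u = 0 on the boundary,
    # by Jacobi iteration.
    u = np.zeros_like(f)
    for _ in range(iterations):
        u[1:-1, 1:-1] = 0.25 * (u[2:, 1:-1] + u[:-2, 1:-1] +
                                u[1:-1, 2:] + u[1:-1, :-2] +
                                h * h * f[1:-1, 1:-1])
    return u

n, h = 51, 1.0 / 50
source = np.zeros((n, n))
source[n // 2, n // 2] = 1.0 / (h * h)  # a unit point source at the center

# Thermal reading: u is temperature, f is heat generation over conductivity.
# Electrostatic reading of the very same numbers: u is the potential, f is
# charge density over permittivity.
u = solve_poisson(source, h)
print("peak value of the field:", u.max())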

As for irreversible thermodynamics, I would recommend Prigogine. Note
that he never treated the thermodynamic entropy as information.

And once more: it would be very instructive for people discussing
"Entropy and Information" simply to browse the JANAF Tables. The entropy
is there, but I am not sure that it has any relationship with Shannon's
information.

Best wishes,

Evgenii

Dr. Richard Gordon

Dec 26, 2010, 11:23:12 AM
to Evgenii Rudnyi, biot...@lists.ccon.org, embryo...@googlegroups.com
Sunday, December 26, 2010 11:19 AM, Santee, SC
Dear Evgenii,
While I might agree with you on the misuse of “entropy” in biology, many people would disagree with you, as evidenced by the list of books below. If you have the time, maybe you could dig deeper into the question and give us a talk in the Embryo Physics Course? Thanks.
Yours, -Dick

Richard (Dick) Gordon
Visitor, Camera Culture, Media Lab, MIT

Visitor, Department of Mechanical & Aerospace Engineering, Old Dominion University
gor...@cc.umanitoba.ca


Adami, C. & N.J. Cerf (1999). What information theory can tell us about quantum reality. In: Quantum Computing and Quantum Communications. Eds.: C.P. Williams. 1509: 258-268.

Avery, J. (2003). Information Theory and Evolution. Singapore, World Scientific Publishing Company.

Bajic, V.B. & T.T. Wee, Eds. (2005). Information Processing and Living Systems. Singapore, World Scientific Publishing Company.

Barbieri, M. (2003). The definitions of information and meaning: two possible boundaries between physics and biology. In: IPCAT 2003, Fifth International Workshop on Information Processing in Cells and Tissues, September 8-11, 2003, Lausanne, Switzerland. Eds.

Bradley, W.L. (2004). Information, entropy, and the origin of life. In: Debating Design: From Darwin to DNA. Eds.: W.A. Dembski & M. Ruse. Cambridge, Cambridge University Press: 331-351.

Brier, S. (2008). Cybersemiotics: Why Information Is Not Enough. Toronto, University of Toronto Press.

Campbell, J. (1982). Grammatical Man: Information, Entropy, Language, and Life. New York, Simon and Schuster.

da Cunha, P.R. & A.D. de Figueiredo (2001). Information systems development as flowing wholeness. In: Realigning Research and Practice in Information Systems Development - the Social and Organizational Perspective. Eds.: N.L. Russo, B. Fitzgerald & J.I. DeGross. 66: 29-48.

Frieden, B.R. (1999). Physics from Fisher Information: A Unification. Cambridge, Cambridge University Press.

Frieden, B.R. (2010). Introduction to EPI (Extreme Physical Information). Second Life®, Embryo Physics Course.

Gatlin, L.L. (1972). Information Theory and the Living System. New York, Columbia University Press.

Goonatilake, S. (1991). Evolution of Information: Lineages in Gene, Culture and Artefact, Pinter Publishers.

Haken, H. (2006). Information and Self-Organization: A Macroscopic Approach to Complex Systems, 3rd enl. ed. Berlin/New York, Springer.

Haruna, T. & Y.P. Gunji (2009). Wholeness and Information Processing in Biological Networks: An Algebraic Study of Network Motifs. In: Natural Computing, Proceedings. Eds.: Y. Suzuki, M. Hagiya, H. Umeo & A. Adamatzky. 1: 70-80.

Kubat, L. & J. Zeman (1975). Entropy and Information in Science and Philosophy. Amsterdam, Elsevier.

Kuppers, B.O. (1990). Information and the Origin of Life. Cambridge, MIT Press.

Lineweaver, C.H. (2011). The initial low gravitational entropy of the universe as the origin of all information in the universe. In: Origin(s) of Design in Nature. Eds.: R. Gordon, L. Stillwaggon Swan & J. Seckbach. Dordrecht, Springer: in preparation.

Loewenstein, W.R. (1999). The Touchstone of Life: Molecular Information, Cell Communication, and the Foundations of Life. New York, Oxford University Press.

Meyer, S.C. (2004). The Cambrian information explosion: evidence for intelligent design. In: Debating Design: From Darwin to DNA. Eds.: W.A. Dembski & M. Ruse. Cambridge, Cambridge University Press: 371-391.

Oyama, S. (1985). The Ontogeny of Information. Developmental Systems and Evolution. New York, Cambridge University Press.

Oyama, S. (2000). The Ontogeny of Information: Developmental Systems and Evolution, 2nd ed. Durham, NC, Duke University Press.

Roederer, J.G. (2005). Information and its Role in Nature. Heidelberg, Springer Verlag.

Shallit, J. & W. Elsberry (2004). Playing games with probability: Dembski's complex specified information. In: Why Intelligent Design Fails: A Scientific Critique of the New Creationism. Eds.: M. Young & T. Edis. Piscataway, NJ, Rutgers University Press: 121-138.

Vaneechoutte, M. (2000). The scientific origin of life - Considerations on the evolution of information, leading to an alternative proposal for explaining the origin of the cell, a semantically closed system. In: Closure: Emergent Organizations and Their Dynamics. Eds.: J.L.R. Chandler & G. VandeVijver. New York, New York Acad Sciences. 901: 139-147.

Weber, B.H., D.J. Depew & J.D. Smith (1988). Entropy, Information, and Evolution: New Perspectives on Physical and Biological Evolution. Cambridge, MIT Press.

Wicken, J.S. (1987). Evolution, Thermodynamics, and Information: Extending the Darwinian Program. New York, Oxford University Press.

Yockey, H.P. (1992). Information Theory in Molecular Biology. Cambridge, Cambridge University Press.

Yockey, H.P. (2005). Information Theory, Evolution, and the Origin of Life. Cambridge, Cambridge University Press.

Yockey, H.P., R.L. Platzman & H. Quastler (1958). Symposium on Information Theory in Biology, Gatlinburg, Tennessee, October 29-31, 1956. New York, Pergamon Press.

William R. Buckley

Dec 26, 2010, 12:58:03 PM
to embryo...@googlegroups.com, Evgenii Rudnyi, biot...@lists.ccon.org
Dick and Evgenii:
 
You both should also become familiar with the work of Jeffrey S. Wicken.

wrb



Evgenii Rudnyi

Dec 26, 2010, 3:29:30 PM
to embryo...@googlegroups.com
Dear Dick,

I know that there are many books about information and entropy. When
people on biotaconv mentioned the works of Edwin T. Jaynes, I checked
his citations on Google Scholar:

http://scholar.google.de/scholar?q=Edwin+T.+Jaynes

His paper "Information theory and statistical mechanics" is cited there
more than 4000 times. That is just incredible.

However, I am not sure that I will read the books you have mentioned.
Let me explain my point, as I believe it is not ignorance on my side. I
am a chemist, and chemistry is basically a stamp-collecting science.
From this viewpoint, entropy was developed to explain heat engines, and
chemists then used it to explain chemical equilibrium. As such, there
are experiments where entropy plays a well-defined role. I believe that
people stating "Entropy is information" must go back to these original
experiments and explain where they find information there. From what I
have seen so far, they have not done it.

Below I describe a basic exercise from chemical thermodynamics related
to the synthesis of ammonia. In my view, the concept "Entropy is
information" should first be tried on such an exercise. If someone is
ready to do it, I will be glad to discuss it.

Best wishes,

Evgenii

-----------------------------------------------
Problem. Given temperature, pressure, and initial number of moles of
NH3, N2 and H2, compute the equilibrium composition.

To solve the problem one should find thermodynamic properties of NH3, N2
and H2 for example in the JANAF Tables and then compute the equilibrium
constant.

From the thermodynamic tables (all values are for the standard pressure
of 1 bar; I have omitted the symbol ° for simplicity, but it is very
important not to forget it):

Del_f_H_298(NH3), S_298(NH3), Cp(NH3), Del_f_H_298(N2), S_298(N2),
Cp(N2), Del_f_H_298(N2), S_298(N2), Cp(N2)

2NH3 = N2 + 3H2

Del_H_r_298 = Del_f_H_298(N2) + 3 Del_f_H_298(N2) - 2 Del_f_H_298(NH3)

Del_S_r_298 = S_298(N2) + 3 S_298(N2) - 2 S_298(NH3)

Del_Cp_r = Cp(N2) + 3 Cp(N2) - 2 Cp(NH3)

To make life simple, I will assume below that Del_Cp_r = 0, but it is
not a big deal to extend the equations to include heat capacities as well.

Del_G_r_T = Del_H_r_298 - T Del_S_r_298

Del_G_r_T = - R T ln Kp

When Kp, the total pressure, and the initial numbers of moles are given,
it is rather straightforward to compute the equilibrium composition. If
you need help, please just let me know.

So, the entropy is there. Where is the related information?

-----------------------------------------------



Evgenii Rudnyi

Dec 26, 2010, 3:44:44 PM
to embryo...@googlegroups.com
I am very sorry, there was a misprint in the equations in my previous
message (N2 appeared in places where H2 should have been, a typical
copy-and-paste error). Please find the corrected version of the exercise
below.

-----------------------------------------------
Problem. Given temperature, pressure, and initial number of moles of
NH3, N2 and H2, compute the equilibrium composition.

To solve the problem one should find thermodynamic properties of NH3, N2
and H2 for example in the JANAF Tables and then compute the equilibrium
constant.

From the thermodynamic tables (all values are molar values for the
standard pressure of 1 bar; I have omitted the symbol ° for simplicity,
but it is very important not to forget it):

Del_f_H_298(NH3), S_298(NH3), Cp(NH3), Del_f_H_298(N2), S_298(N2),
Cp(N2), Del_f_H_298(H2), S_298(H2), Cp(H2)

2NH3 = N2 + 3H2

Del_H_r_298 = Del_f_H_298(N2) + 3 Del_f_H_298(H2) - 2 Del_f_H_298(NH3)

Del_S_r_298 = S_298(N2) + 3 S_298(H2) - 2 S_298(NH3)

Del_Cp_r = Cp(N2) + 3 Cp(H2) - 2 Cp(NH3)

To make life simple, I will assume below that Del_Cp_r = 0, but it is
not a big deal to extend the equations to include heat capacities as well.

Del_G_r_T = Del_H_r_298 - T Del_S_r_298

Del_G_r_T = - R T ln Kp

When Kp, the total pressure, and the initial numbers of moles are given,
it is rather straightforward to compute the equilibrium composition. If
you need help, please just let me know.

So, the entropy is there. Where is the related information?

-----------------------------------------------
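
For what it is worth, below is a minimal numerical sketch of the
exercise. The standard-state values are illustrative round numbers in
the spirit of the JANAF Tables, not authoritative entries; Del_Cp_r = 0
is assumed as above, and the bisection bounds assume some NH3 is
initially present:

import math

R = 8.314  # gas constant, J/(mol K)

# Illustrative standard values at 298.15 K (check against the JANAF Tables):
Del_f_H_298 = {"NH3": -45.9e3, "N2": 0.0, "H2": 0.0}    # J/mol
S_298 = {"NH3": 192.8, "N2": 191.6, "H2": 130.7}        # J/(mol K)

# 2NH3 = N2 + 3H2, with Del_Cp_r = 0:
Del_H_r_298 = Del_f_H_298["N2"] + 3*Del_f_H_298["H2"] - 2*Del_f_H_298["NH3"]
Del_S_r_298 = S_298["N2"] + 3*S_298["H2"] - 2*S_298["NH3"]

def Kp(T):
    Del_G_r_T = Del_H_r_298 - T * Del_S_r_298   # Del_G_r_T = -R T ln Kp
    return math.exp(-Del_G_r_T / (R * T))

def equilibrium_extent(T, P, n_NH3, n_N2, n_H2):
    # Find the reaction extent x with Q(x) = Kp by bisection; partial
    # pressures are mole fractions times the total pressure P in bar.
    def lnQ(x):
        n_tot = n_NH3 + n_N2 + n_H2 + 2*x
        p = lambda n: (n / n_tot) * P
        return math.log(p(n_N2 + x) * p(n_H2 + 3*x)**3 / p(n_NH3 - 2*x)**2)
    lo = -min(n_N2, n_H2 / 3.0) + 1e-9   # keep every mole number positive
    hi = n_NH3 / 2.0 - 1e-9
    target = math.log(Kp(T))
    for _ in range(200):                 # lnQ(x) increases monotonically
        mid = 0.5 * (lo + hi)
        if lnQ(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

x = equilibrium_extent(T=500.0, P=1.0, n_NH3=1.0, n_N2=0.0, n_H2=0.0)
print("equilibrium moles: NH3 =", 1 - 2*x, " N2 =", x, " H2 =", 3*x)

Note that nothing in this computation ever asks for an information
content.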


William R. Buckley

Dec 27, 2010, 3:38:55 PM
to embryo...@googlegroups.com
Gentlemen:

I do not agree that Wicken or Weber et al. stated that entropy is
information, i.e., that entropy and information form an identity. Nor
would I make that claim.

Do understand, Evgenii, that the increase in the representation of
information in DNA that has been replicated comes on the heels of an
increase in entropy, such as is concomitant with the chemical processes
of biological metabolism. Further, this increase in information
(basically, a doubling of the representation) is a measurable quantity,
by exactly the method employed by Shannon: the distinction is between
expectation and observation. Shannon measures the transfer of
information on the basis of the receiver getting the bit value that the
receiver expects. If the bit received is exactly the bit expected, then
no transfer of information occurs. It is only on the basis of
uncertainty about the value of the next bit to be received that Shannon
confers a transfer of information.

Similarly, the transfer of information from DNA to protein comes upon
the lack of expectation of which nucleotide shall next be observed in
the sequence of the chromosome.

This is to say, predictability is the antithesis of information. This is
true even as the addition of information makes predictable that which is
otherwise unpredictable. If you can predict, you don't need information.
As a corollary, if you cannot predict, then you need information.
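
For concreteness, a toy rendering of this expectation-versus-observation
measure in terms of Shannon's surprisal, -log2 p. This is an
illustrative sketch of the general measure, not anything specific to
DNA:

import math

def surprisal_bits(p_expected):
    # Information, in bits, delivered by a symbol to which the receiver
    # had assigned probability p_expected; max() turns -0.0 into 0.0.
    return max(0.0, -math.log2(p_expected))

print(surprisal_bits(1.0))   # 0.0: fully predicted, no information moves
print(surprisal_bits(0.5))   # 1.0: a fair coin flip carries one bit
print(surprisal_bits(0.25))  # 2.0: the less expected, the more information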

wrb


Steve McGrew

Dec 27, 2010, 7:54:16 PM
to embryo...@googlegroups.com
There are a lot of different possible definitions for information, some of which might bring entropy and information closer to being an identity.
Steve


Eluemuno Blyden

Dec 28, 2010, 1:14:49 AM
to embryo...@googlegroups.com

"Similarly, the transfer of information from DNA to protein comes upon the lack of expectation 
of which nucleotide shall next be observed in the sequence of the chromosome.

This is to say, predictability is the antithesis of information.  This is true even as the addition 
of information makes predictable that which is otherwise unpredictable.  If you can predict, you 
don't need information.  In corollary, if you can't predict, then you need information."
 

The problem I think we face in bioinformatics when we try to apply these
information-theory concepts is exemplified by the first sentence above,
in which the information transfer is implicitly one-dimensional. While
it may be safe to say that the majority of the information transferred
from the chromosome (which I assume includes RNA transcripts,
messengers, etc.) to the protein is one-dimensional, it is not
necessarily all so. Structure is a key mediator of information in
biological systems, and it introduces additional complexity, or even
dimensionality, to the transfer.

Consider the challenge faced by influenza viruses, which generate vast
numbers of "erroneous chromosomes" (eC) but must still communicate all
the information required for a successful life cycle. They would achieve
an evolutionary advantage by being able to "predict" the protein
structure encoded by a given eC during the assembly and packaging of the
virus. This can be done by associating protein information with
structural information in the eC. This structural information
effectively gives information about which nucleotide to expect, though
not in a linear way. Imagine a stem-loop structure in which nucleotide i
is base-paired with nucleotide j somewhere else in the sequence (with
j > i+3). The presence of a stable stem communicates information about
what is coming down the sequence when we are only at i. In effect, at
every i we get holographic information about the entire sequence in this
way.

The set of stem-loop structures comprising the RNA (eC) can therefore
communicate information about the encoded protein even without its being
translated at all.
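
For concreteness, a toy sketch of this point in terms of Shannon's
measure. The uniform background distribution and the perfect
Watson-Crick pairing below are illustrative assumptions, not a model of
real influenza RNA:

import math

PAIR = {"A": "U", "U": "A", "G": "C", "C": "G"}  # Watson-Crick pairing

def entropy_bits(dist):
    # Shannon entropy, in bits, of {symbol: probability}; the trailing
    # "+ 0.0" only normalizes -0.0 to 0.0.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0) + 0.0

# Without structural knowledge, the base at position j is anyone's guess:
H_j = entropy_bits({b: 0.25 for b in "ACGU"})   # 2 bits of uncertainty

# With a stable stem pairing positions i and j, observing "A" at i fixes
# the base at j, so the remaining uncertainty at j collapses:
H_j_given_i = entropy_bits({PAIR["A"]: 1.0})    # 0 bits

print(H_j, "bits ->", H_j_given_i, "bits once the stem is known")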



Dr. Eluemuno R. Blyden

>>> "Replace questions with answers. Exchange anxieties for hope. Turn isolation into intimacy. Gain a mastery of love." <<< Unknown author...




Evgenii Rudnyi

Dec 28, 2010, 1:49:52 PM
to embryo...@googlegroups.com
on 28.12.2010 01:54 Steve McGrew said the following:

> There are a lot of different possible definitions for information,
> some of which might bring entropy and information closer to being an
> identity. Steve

When you bring the thermodynamic entropy and information closer to each
other, it would be good to open the JANAF Tables and ask yourself what
the meaning of information is for the entropy values listed there.

Evgenii

William R. Buckley

Dec 28, 2010, 1:56:08 PM
to embryo...@googlegroups.com
Eluemuno:

In your description of eC and stem-loops, you are discussing what is known as context sensitivity.

wrb

William R. Buckley

Dec 28, 2010, 2:00:12 PM
to embryo...@googlegroups.com
It is clearly the case that an ordered crystal of NaCl is describable by
a certain amount of "information", and it is also clearly the case that,
upon dissolving this crystal of NaCl in water, a certain amount more
information is necessary to describe the new state. Note that with this
increase of information has come an increase of entropy.

Why would I need to refer to tables in order to understand so simple a model?

wrb

Evgenii Rudnyi

Dec 28, 2010, 2:23:39 PM
to embryo...@googlegroups.com
Let us look at the thermodynamic table for MgCl2 from Wikipedia:

http://en.wikipedia.org/wiki/File:Thermodynamic_Properties_Table_for_MgCl2(c,l,g).PNG

You see a column with numerical values of the entropy. The question is
whether the entropy values in the fourth column have anything to do with
information.

That is, the entropy of MgCl2 at 298.15 K is 89.538 J/(mol K). Could you
please convert this value to some information content?

Alternatively, you could look at the CODATA tables:

http://www.codata.org/resources/databases/key1.html

You will see that different substances have different numerical values
of the entropy. Could we say that they contain different amounts of
information?
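
For concreteness, here is what the arithmetic of such a conversion would
presumably look like, if one simply identifies R ln 2 per mole with one
bit per formula unit. That identification is an assumption of the
"entropy is information" camp, not something the tables themselves
provide:

import math

R = 8.314  # J/(mol K); R = k_B * N_A, so R*ln(2) is "one bit per particle"

def molar_entropy_to_bits(S):
    # S in J/(mol K) -> "bits" per formula unit, under the stated assumption.
    return S / (R * math.log(2))

print(molar_entropy_to_bits(89.538))  # MgCl2 at 298.15 K: about 15.5 "bits"

The division produces a number, but I do not see which experiment would
measure these "bits".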

Evgenii


William R. Buckley

Dec 28, 2010, 8:28:24 PM
to embryo...@googlegroups.com
Ah, now you are asking for a measurable/computable relationship between
information and entropy. That has never been determined. There is
nevertheless a clear correlation: an increase in entropy necessarily
implies an increase of information.

Correct me if I am mistaken, but I know of no indication that one can
ever compute an absolute entropy; entropy derivations are always taken
upon a change of state of some system. So, there is no entropy of MgCl2
at any temperature. There might be a change in system entropy of, say,
89.538 J/(mol K), under some circumstance, such as solvation in water at
standard temperature and pressure.

Again, to my knowledge no person has ever claimed to define a function
which yields a measure of the net change in systemic information that
corresponds to a change in systemic entropy.

And you have not denied my earlier claim regarding the obviousness of a
requirement of increased information to describe a dissolved (and
soluble) salt, versus the information required to describe the crystal
form of that salt.

wrb


Evgenii Rudnyi

Dec 29, 2010, 2:04:32 PM
to embryo...@googlegroups.com
on 29.12.2010 02:28 William R. Buckley said the following:

> Ah, now you are asking for a measurable/computable relationship
> between information and entropy. That has never been determined.
> There is nevertheless a clear correlation: an increase in entropy
> necessarily implies an increase of information.

Does it mean that Del_S > 0 implies Del_Information > 0? If yes, then

http://www.codata.org/resources/databases/key1.html

Ag  42.55 ± 0.20 J/(mol K)
B    5.90 ± 0.08 J/(mol K)

does this mean that Ag contains more information than B?

> Correct me if I am mistaken, but I know of no indication that one can
> ever compute an absolute entropy; entropy derivations are always
> taken upon a change of state of some system. So, there is no entropy
> of MgCl2 at any temperature. There might be a change in system
> entropy of, say, 89.538 J/(mol K), under some circumstance, such as
> solvation in water at standard temperature and pressure.

According to the Third Law, the entropy at 0 K is 0. Then the integral

S_T - S_0 = Integral_from_0_to_T (Cp/T) dT

gives the absolute entropy.
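
A quick numerical sketch of this integral, with a made-up Debye-like
heat capacity (purely illustrative, not data for any real substance):

import math

def Cp(T, a=2.0e-5, Cp_inf=25.0):
    # Toy heat capacity in J/(mol K): behaves like a*T**3 at low T and
    # saturates at Cp_inf at high T. Not data for any real substance.
    return Cp_inf * (1.0 - math.exp(-a * T**3 / Cp_inf))

def absolute_entropy(T_max, n=20000):
    # S_T - S_0 by the trapezoidal rule; Cp/T ~ a*T**2 near T = 0, so the
    # integrand vanishes at the origin and the integral is well behaved.
    h = T_max / n
    total = 0.0
    for i in range(1, n + 1):
        T0, T1 = (i - 1) * h, i * h
        f0 = Cp(T0) / T0 if T0 > 0 else 0.0
        total += 0.5 * (f0 + Cp(T1) / T1) * h
    return total

print(absolute_entropy(298.15), "J/(mol K)")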

> Again, to my knowledge no person has ever claimed to define a
> function which yields a measure of the net change in systemic
> information that corresponds to a change in systemic entropy.
>
> And you have not denied my earlier claim regarding the obviousness
> of a requirement of increased information to describe a dissolved
> (and soluble) salt, versus the information required to describe the
> crystal form of that salt.

I am not an expert on information, so frankly it is not evident to me
that the information content increases. As for the entropy, which
entropy do you mean, the entropy of the solution or the partial
entropies of the components?

