http://blog.rudnyi.ru/2012/02/entropy-and-information.html
No doubt, this is my personal viewpoint. If you see that I have missed
something, please let me know.
Evgenii
In doing their calculations, thermodynamicists and chemists work
with an infinitesimally small fraction of the complete description
of a system. Physicists tend to do the same, of course.
By the way, a "chaotic" system would be one in which the behavior
cannot be accurately described (for practical purposes) without
taking a large fraction of the total amount of information into
account.
Steve McGrew
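A minimal numerical sketch of that last point; the logistic map is used
here only as a stand-in for a chaotic system, and none of the numbers
come from the discussion.

# Sensitive dependence: two descriptions of the "same" chaotic system,
# differing only far behind the decimal point in the initial condition,
# give completely different predictions after a few dozen steps, so an
# incomplete description quickly loses its predictive power.
r = 3.9                                   # logistic map x -> r*x*(1-x), chaotic

def trajectory(x, steps):
    out = []
    for _ in range(steps):
        x = r * x * (1.0 - x)
        out.append(x)
    return out

full = trajectory(0.123456789012, 60)     # the more complete description
coarse = trajectory(0.123456789, 60)      # the same, with a few digits dropped

for step in (10, 30, 60):
    diff = abs(full[step - 1] - coarse[step - 1])
    print(f"after {step:2d} steps the two predictions differ by {diff:.2e}")

The gap grows roughly exponentially until it is of order one, at which
point the coarser description says essentially nothing about the actual
trajectory.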
I would agree that we are free to give a meaning to a term (assuming
that we do have free will). Yet, in my view, clear definitions are
simply missing in the discussion about entropy and information.
Evgenii
On 26.02.2012 16:38 Steve McGrew said the following:
> Evgenii, I am neither a thermodynamicist nor an information theorist.
> However, I would like to offer the following points in response to
> your summary.
>
> * Every system has structure that can in principle be described from the
> macroscopic level all the way down to sub-atomic.
> * A complete* description of that structure could reasonably be called
> "information", and its amount can be measured in suitable units (e.g., "bits").
> * Thus, every system contains a certain amount of information.
> * The macroscopic behavior of a system can be described accurately enough for
> almost all purposes, by using a much smaller amount of information than is
> actually contained in the system.
> * In an entirely deterministic, closed system, the /amount/ of information
> never changes, and the /information contents/ may change, but only with a
> single degree of freedom which corresponds to the passage of time.
> * In a closed probabilistic system (e.g. closed quantum-mechanical system),
> again, the amount of information does not change but the information contents
> may change with a very large number of degrees of freedom corresponding to
> the "bits" in the description.
> * Calculation of the evolution of a probabilistic system over time, based on
> an initial incomplete description, becomes less and less accurate as it is
> extrapolated further and further into the future, because, in effect, the
> "known" fraction of the complete description available for the calculation
> becomes smaller, and the "unknown" fraction becomes larger.
>
> In doing their calculations, thermodynamicists and chemists work with
> an infinitesimally small fraction of the complete description of a
> system. Physicists tend to do the same, of course.
>
> By the way, a "chaotic" system would be one in which the behavior
> cannot be accurately described (for practical purposes) without
> taking a large fraction of the total amount of information into
> account.
>
> Steve McGrew
>
>
> On 2/26/2012 5:54 AM, Evgenii Rudnyi wrote:
>> I have written a summary for the discussion in the subject:
>>
>> http://blog.rudnyi.ru/2012/02/entropy-and-information.html
>>
>> No doubt, this is my personal viewpoint. If you see that I have
>> missed something, please let me know.
>>
>> Evgenii
>>
I have thought more about your suggestion to define information as what
is needed to completely describe a physical system. This could make
sense, but then the thermodynamic entropy, in my view, is not related
to it. One problem is to define what it means to describe a physical
system: I could do it at the macroscopic level, at the atomic level, at
the level of elementary particles, or nowadays at the level of
superstrings.
Evgenii
On 26.02.2012 16:38 Steve McGrew said the following:
> Evgenii,
> I am neither a thermodynamicist nor an information theorist. However, I would
> like to offer the following points in response to your summary.
>
> * Every system has structure that can in principle be described from the
> macroscopic level all the way down to sub-atomic.
> * A complete* description of that structure could reasonably be called
> "information", and its amount can be measured in suitable units (e.g., "bits").
> * Thus, every system contains a certain amount of information.
> * The macroscopic behavior of a system can be described accurately enough for
> almost all purposes, by using a much smaller amount of information than is
> actually contained in the system.
> * In an entirely deterministic, closed system, the /amount/ of information
> never changes, and the /information contents/ may change, but only with a
> single degree of freedom which corresponds to the passage of time.
> * In a closed probabilistic system (e.g. closed quantum-mechanical system),
> again, the amount of information does not change but the information
> contents may change with a very large number of degrees of freedom
> corresponding to the "bits" in the description.
> * Calculation of the evolution of a probabilistic system over time, based on
> an initial incomplete description, becomes less and less accurate as it is
> extrapolated further and further into the future, because, in effect, the
> "known" fraction of the complete description available for the calculation
> becomes smaller, and the "unknown" fraction becomes larger.
>
> In doing their calculations, thermodynamicists and chemists work with an
> infinitesimally small fraction of the complete description of a system.
> Physicists tend to do the same, of course.
>
> By the way, a "chaotic" system would be one in which the behavior cannot be
> accurately described (for practical purposes) without taking a large fraction of
> the total amount of information into account.
>
> Steve McGrew
>
>
> On 2/26/2012 5:54 AM, Evgenii Rudnyi wrote:
>> I have written a summary for the discussion in the subject:
>>
>> http://blog.rudnyi.ru/2012/02/entropy-and-information.html
>>
>> No doubt, this is my personal viewpoint. If you see that I have missed
>> something, please let me know.
>>
>> Evgenii
>>
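Steve's point that a closed deterministic system never changes its
/amount/ of information, only its contents, can be illustrated with a
toy model in which the dynamics is a fixed permutation of a finite set
of microstates; this is only a sketch, and the state count and the
permutation are arbitrary.

# A toy "closed deterministic system": the dynamics is a fixed permutation
# of a finite set of microstates. The number of distinguishable states
# (and hence the number of bits needed to specify one) never changes;
# only *which* state the system occupies changes as time passes.
import math
import random

N_STATES = 256                       # 2**8 microstates, so 8 bits label one
random.seed(0)
step = list(range(N_STATES))
random.shuffle(step)                 # a random bijection: deterministic dynamics
inverse = [0] * N_STATES
for i, j in enumerate(step):
    inverse[j] = i                   # the inverse permutation runs time backwards

def evolve(state, n, rule):
    for _ in range(n):
        state = rule[state]
    return state

initial = 42
later = evolve(initial, 1000, step)        # forward 1000 time steps
recovered = evolve(later, 1000, inverse)   # backward 1000 time steps

print("bits to specify a microstate:", math.log2(N_STATES))   # always 8.0
print(initial, "->", later, "->", recovered)
assert recovered == initial          # nothing was lost, only rearranged

Because the map is a bijection, distinct states stay distinct forever,
so the number of bits needed to pin down the state is constant; running
the inverse map recovers the initial state exactly, which is also the
sense in which, in the reversibility discussion further below, the
information needed to specify the original state is transformed but not
lost.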
Could you please try to apply your notion of information to the IT
devices that surround us (a hard disk, a flash memory, a DVD, etc.)?
Will it work to describe, for example, their information capacity?
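For scale, here is a back-of-the-envelope sketch, assuming the usual
conversion S_bits = S / (k_B ln 2); the CO2 value is the textbook
standard molar entropy (about 214 J/(mol K) at 298 K and 1 bar), and
the 64 GB device is just an arbitrary example.

# Express a thermodynamic entropy in bits via S_bits = S / (k_B * ln 2)
# and compare it with the capacity of an ordinary flash drive.
import math

k_B = 1.380649e-23            # Boltzmann constant, J/K
S_co2_molar = 213.8           # J/(mol K), standard molar entropy of CO2 gas

bits_per_mole = S_co2_molar / (k_B * math.log(2))
flash_bits = 64e9 * 8         # a 64 GB flash memory, in bits

print(f"one mole of CO2 gas: {bits_per_mole:.2e} 'bits' of thermodynamic entropy")
print(f"64 GB flash memory:  {flash_bits:.2e} bits")
print(f"ratio: {bits_per_mole / flash_bits:.1e}")

The two numbers differ by more than ten orders of magnitude, which is
one way of seeing why it is not obvious that the thermodynamic and the
IT usages of "information" measure the same thing.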
Evgenii
On 28.02.2012 00:40 Steve McGrew said the following:
> Evgenii,
>
> A crystal (in its lowest energy state) has low algorithmic complexity, and
> therefore low information content, because one sentence or a short formula can
> describe the arrangement of its atoms in complete detail. A bottle of CO2 has
> high algorithmic complexity, and therefore high information content, because the
> position and momentum, orientation, electronic state and spin vector of every
> molecule would need to be listed in order to describe it in complete detail.
>
> However, we are usually satisfied to describe the bottle of CO2 with a few
> numbers: volume, pressure, and temperature, because that is enough for us to
> calculate its behavior in situations that usually matter to us. If the crystal,
> on the other hand, has an appropriately structured arrangement of electronic
> states, it might be able to perform complex calculations or even accurately
> model a bottle of CO2 down to the molecular level. The bottle of CO2 simply is
> itself; the crystal simulating the bottle of CO2 needs its electronic states
> highly structured.
>
> According to the widely accepted notion, a system with the absolute maximum
> algorithmic complexity necessarily contains the maximum possible information.
> And, it must contain a minimum of internal correlations because every
> correlation reduces the algorithmic complexity. So, the arrangement of its parts
> needs to be maximally random according to all possible measures of randomness.
> This equivalence of maximum randomness and maximum complexity suggests to me
> that /neither/ randomness nor algorithmic complexity refer to things of
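A crude way to get a feel for this compressibility argument is to use a
general-purpose compressor as a stand-in for algorithmic complexity;
this is only a rough proxy, and the byte strings below are of course
not real microstates.

# A periodic "crystal-like" description compresses to a tiny fraction of
# its size, because a short rule generates it; a random "gas-like"
# description barely compresses at all (it may even grow slightly).
import os
import zlib

n = 100_000
crystal_like = b"CO" * (n // 2)      # perfectly ordered: one rule, repeated
gas_like = os.urandom(n)             # random bytes, essentially incompressible

for name, data in [("crystal-like", crystal_like), ("gas-like", gas_like)]:
    compressed = zlib.compress(data, 9)
    print(f"{name}: {len(data)} bytes -> {len(compressed)} bytes compressed")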
The entropy is a function of the number of moles, temperature and
pressure, S(n, T, p). If you take the same gas, I am pretty sure that
you will find many states with
S(n1, T1, p1) = S(n2, T2, p2) = S(n3, T3, p3) = ...
So it is not clear what you mean. The situation is even more complex,
as there are many different gases, say O2, N2, Ar, and so on. Again, we
can find many states in which the entropy has the same numerical value
for different gases.
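A minimal numerical check of the first point, assuming an ideal
monatomic gas and the standard per-mole expression
dS = Cp ln(T2/T1) - R ln(p2/p1):

# For an ideal gas, dS = Cp*ln(T2/T1) - R*ln(p2/p1) per mole, so for any
# chosen T2 there is a pressure p2 with S(n, T2, p2) = S(n, T1, p1).
import math

R = 8.314                    # gas constant, J/(mol K)
Cp = 2.5 * R                 # molar heat capacity of a monatomic ideal gas

def p_same_entropy(T1, p1, T2):
    """Pressure p2 such that the state (T2, p2) has the same entropy as (T1, p1)."""
    return p1 * (T2 / T1) ** (Cp / R)

T1, p1 = 298.15, 1.0e5       # start at 25 C and 1 bar
for T2 in (350.0, 400.0, 500.0):
    p2 = p_same_entropy(T1, p1, T2)
    dS = Cp * math.log(T2 / T1) - R * math.log(p2 / p1)   # should be ~0
    print(f"T2 = {T2:5.1f} K, p2 = {p2 / 1e5:6.3f} bar, dS = {dS:+.1e} J/(mol K)")

Every line of output is a different macroscopic state of the same
amount of the same gas with (numerically) the same entropy.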
Evgenii
On 28.02.2012 17:46 Steve McGrew said the following:
>>> A crystal (in its lowest energy state) has low algorithmic complexity, and
>>> therefore low information content, because one sentence or a short formula can
>>> describe the arrangement of its atoms in complete detail. A bottle of CO2 has
>>> high algorithmic complexity, and therefore high information content, because the
You cannot use a gas to store information for IT; you need a solid, and
higher temperatures are not good for IT either.
Evgenii
On 28.02.2012 20:46 Steve McGrew said the following:
Could you please say what the relationship is between information in IT
and information in the CO2?
"A bottle of CO2 has high algorithmic complexity, and therefore high
information content"
I completely agree that information has several meanings, and this is
probably the main problem when people discuss entropy and information.
But then it would be good to give definitions for the different
meanings of information. It might help to understand the problem better.
Evgenii
On 28.02.2012 22:18 Steve McGrew said the following:
> Evgenii,
> I guess I didn't understand your question the first time.
> Of course it is difficult to store information (for doing computations) in
> a gas specifically because the states are not easily accessible.
> Accessibility requires a sizable degree of "structure", but of course a gas
> has nearly negligible structure. To the extent that the relevant
> structural features are degraded by increased temperatures, heat can reduce
> the accessibility of states and thus reduce the effective *information
> storage capacity*. However, if there is no structure then the amount of
I see a big difference between a flash memory and a gas. We write
information on a flash memory in order to use it. The same concerns
books: a book is written in order to be read.
A gas, on the other hand, is just a gas. When I model it, I need to
define some variables, this is true. But this concerns the gas model and
not the gas as such. When you speak about the information needed to
define a gas model, I can understand you. When you speak about
information in the gas as such, I cannot follow you.
Evgenii
P.S. By the way, I am not sure that I understand why you say that the
content of the Library of Congress is random. I would say it is not.
Whether the content is compressed or not does not, in my view, change
the fact that the information in the books is not random.
Even if the compressed form looks random, it actually is not, as one can
decompress the archive and restore the normal books.
On 29.02.2012 22:34 Steve McGrew said the following:
> Evgenii,
>
> bits to specify the /equations/ needed to *predict* the behavior, given the
...
> Lots of experiments have been done in which seemingly irreversible processes can
> be reversed. In those cases, it is clear that the *information* needed to
> specify the original state is transformed, but not lost. Reversing the
> transformation restores the original state. Although it *looks* like entropy has
> increased when the transformation is first done, it turns out that it really hasn't.
Could you please give an example? But please a real one, not a thought
experiment.
Evgenii
Dear Stuart & Steve,
I tend to agree with Steve. His LOGO program is what I would call “the genetic program”. But the physics is the component of the genetic program that is usually taken for granted. In the case of the LOGO program, this physics has to do with the operation of the compiler and the structure and operation of the computer. In the case of the embryo, the physics is what the Embryo Physics Course is about: forces generated and responded to by cells, cytoskeletal physics, differentiation waves, etc. That there is a stochastic component is part of the physics, whether it be due to a random number generator or to Brownian motion.
Yours, -Dick
On 2012-03-01, at 10:18 PM, Steve McGrew wrote:
> Stuart,
> Physical laws are the background principles that drive the processes.
> I probably haven't communicated what I mean by "description of an organism", if you think that to believe DNA is a "description of an organism" invalidates the notion of embryo physics.
>
> Once I wrote a simple LOGO program that generated a forest of trees on a computer screen. It could easily generate a forest containing vastly more pixels than the number of bits in the program. Every time I ran the program, it generated a different forest with different trees, but the forest and trees had the same general character each time. They always looked like spruce trees or oak trees, depending on the small handful of control parameters I gave to the program.
>
> Maybe you wouldn't say that the LOGO program was a "description" of the trees or forest. What word would you use? It wasn't a blueprint, because the trees and forest were different each time. It was a set of rules, and it made the forest grow but did not fully control the growth of the forest (because it included random functions). It was a "prescription", maybe.
>
> Steve
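The LOGO source itself is not quoted above, but the idea Steve
describes (a short rule set plus random functions generating a large
structure that differs on every run yet keeps the same general
character) can be sketched in a few lines of Python; all parameters
here are arbitrary.

# A recursive branching rule plus bounded randomness: every run gives a
# different tree, but all trees share the same general character, which
# is set by a handful of control parameters (spread, shrink, depth).
import math
import random

def grow(x, y, angle, length, depth, segments, spread=25.0, shrink=0.72):
    """Recursively grow branch segments; randomness makes each tree unique."""
    if depth == 0 or length < 1.0:
        return
    x2 = x + length * math.cos(math.radians(angle))
    y2 = y + length * math.sin(math.radians(angle))
    segments.append((x, y, x2, y2))
    for sign in (-1, +1):
        jitter = random.uniform(0.6, 1.4)            # random, but bounded
        grow(x2, y2,
             angle + sign * spread * jitter,
             length * shrink * random.uniform(0.8, 1.1),
             depth - 1, segments, spread, shrink)

for seed in (1, 2, 3):                               # three different "trees"
    random.seed(seed)
    segments = []
    grow(0.0, 0.0, 90.0, 100.0, depth=9, segments=segments)
    print(f"seed {seed}: {len(segments)} branch segments")

The rule set is tiny compared with its output: each run produces
hundreds of branch segments, and changing the seed changes every detail
while the overall kind of tree stays the same.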
Dr. Richard (Dick) Gordon
Theoretical Biologist, Embryogenesis Center
Gulf Specimen Marine Laboratory (http://www.gulfspecimen.org)
Visiting Professor, Micro & Nanotechnology Institute, Old Dominion University
1-(850) 745-5011 or Skype: DickGordonCan
DickGo...@gmail.com
Steve,
I understand that you can write a program that generates tree morphologies. But you designed the program. An organism’s DNA does not contain such a program. The program, if you want to call it that, resides in the entire material composition of the organism’s zygote, and only part of that is inscribed in DNA sequence.
The forms that we see unfolding in a present-day organism are not the execution of information in the DNA, but outcomes of a complex set of physical processes, only some of which are predictable based on the physics acting on the contemporary materials (including the DNA). Some of the forms arose much earlier in evolutionary history based on the cellular materials present at that time and the physical effects relevant to those materials.
Those original forms (if they were consistent with survival) acted as structural templates for subsequent canalizing evolution, so that the present-day unfolding process can be attributed neither to present-day DNA nor to present-day DNA plus present-day physics. The explanation of the forms and the means of their generation must also take the historical dimension into account. The DNA sequence reflects this history, but only partially, and not in the form of a program.
Stuart