# Entropy and information

### Evgenii Rudnyi

Feb 26, 2012, 8:54:44 AM
I have written a summary for the discussion in the subject:

http://blog.rudnyi.ru/2012/02/entropy-and-information.html

No doubt, this is my personal viewpoint. If you see that I have missed
something, please let me know.
Evgenii

### Steve McGrew

Feb 26, 2012, 10:38:31 AM
Evgenii,
I am neither a thermodynamicist nor an information theorist.  However, I would like to offer the following points in response to your summary.

• Every system has structure that can in principle be described from the macroscopic level all the way down to sub-atomic.
• A complete description of that structure could reasonably be called "information", and its amount can be measured in suitable units (e.g., "bits").
• Thus, every system contains a certain amount of information.
• The macroscopic behavior of a system can be described, accurately enough for almost all purposes, using a much smaller amount of information than is actually contained in the system.
• In an entirely deterministic, closed system, the amount of information never changes; the information contents may change, but only with a single degree of freedom, which corresponds to the passage of time.
• In a closed probabilistic system (e.g. closed quantum-mechanical system), again, the amount of information does not change but the information contents may change with a very large number of degrees of freedom corresponding to the "bits" in the description.
• Calculation of the evolution of a probabilistic system over time, based on an initial incomplete description, becomes less and less accurate as it is extrapolated further and further into the future because, in effect, the "known" fraction of the complete description available for the calculation becomes smaller and the "unknown" fraction becomes larger.
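The conservation claim in the bullets above can be illustrated with a toy sketch (my own construction, not part of the discussion): any reversible dynamics on a finite state space is a permutation of states, and permuting outcomes leaves the Shannon entropy of an observer's description unchanged.

```python
import math
import random

def shannon_entropy(dist):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# A toy deterministic, closed system: 8 states evolving under a fixed
# permutation (every reversible dynamics on a finite state space is one).
random.seed(0)
states = list(range(8))
step = dict(zip(states, random.sample(states, len(states))))  # bijection

# Our incomplete knowledge of the system is a probability distribution.
belief = {0: 0.5, 3: 0.25, 5: 0.25}

# Evolve the belief one time step: probability mass follows the dynamics.
evolved = {step[s]: p for s, p in belief.items()}

print(shannon_entropy(belief))   # 1.5 bits
print(shannon_entropy(evolved))  # still 1.5 bits -- the amount is conserved
```

The contents of the description change (which states carry the probability mass), but the amount of missing information does not.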

In doing their calculations, thermodynamicists and chemists work with an infinitesimally small fraction of the complete description of a system.  Physicists tend to do the same, of course.

By the way, a "chaotic" system would be one in which the behavior cannot be accurately described (for practical purposes) without taking a large fraction of the total amount of information into account.

Steve McGrew

### Evgenii Rudnyi

Feb 26, 2012, 2:57:54 PM
Steve,

I would agree that we are free to give a meaning to a term (assuming
that we do have free will). Yet, in my view in the discussion about the
entropy and information, clear definitions are just missing.

Evgenii


### Evgenii Rudnyi

Feb 27, 2012, 3:14:50 PM
Steve,

I have thought more about your suggestion to define information in such
a way that it completely describes a physical system. This could make
sense, but then the thermodynamic entropy, in my view, is not related.

One problem is to define what it means to describe a physical system.
Say, I could do it at the macroscopic level, at the atomic level, at the
level of elementary particles, or now at the level of superstrings.

Evgenii


### Steve McGrew

Feb 27, 2012, 6:40:34 PM
Evgenii,

A crystal in its lowest energy state has low algorithmic complexity, and therefore low information content, because one sentence or a short formula can describe the arrangement of its atoms in complete detail.  A bottle of CO2 has high algorithmic complexity, and therefore high information content, because the position and momentum, orientation, electronic state and spin vector of every molecule would need to be listed in order to describe it in complete detail.

However, we are usually satisfied to describe the bottle of CO2 with a few numbers: volume, pressure, and temperature, because that is enough for us to calculate its behavior in situations that usually matter to us.  If the crystal, on the other hand, has an appropriately structured arrangement of electronic states, it might be able to perform complex calculations or even accurately model a bottle of CO2 down to the molecular level.  The bottle of CO2 simply is itself; the crystal simulating the bottle of CO2 needs its electronic states highly structured.

According to the widely accepted notion, a system with the absolute maximum algorithmic complexity necessarily contains the maximum possible information.  And, it must contain a minimum of internal correlations, because every correlation reduces the algorithmic complexity.  So, the arrangement of its parts needs to be maximally random according to all possible measures of randomness.  This equivalence of maximum randomness and maximum complexity suggests to me that neither randomness nor algorithmic complexity refers to things of importance to biology.  Rather, we need a definition and measure of what I think of as "structure".  Both a crystal (low algorithmic complexity for a complete description) and a gas (high algorithmic complexity for a complete description) would be low on the scale of "structure", while living organisms would rank high on the scale of "structure".
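The crystal/gas contrast can be made concrete with a rough sketch (mine, and only a proxy: true algorithmic complexity is uncomputable, and zlib exploits only a limited class of correlations):

```python
import os
import zlib

# A "crystal" as a perfectly periodic byte string (generable from a short
# formula) versus a "gas" as an equal-sized random microstate.
crystal = b"CO2 " * 25_000        # 100,000 bytes, fully periodic
gas = os.urandom(100_000)         # 100,000 bytes, no internal correlations

# Compressed size serves as a crude stand-in for description length.
print(len(zlib.compress(crystal, 9)))   # a few hundred bytes
print(len(zlib.compress(gas, 9)))       # essentially the full 100,000 bytes
```

The periodic string collapses to almost nothing, while the random string is nearly incompressible, mirroring the short-formula crystal versus the molecule-by-molecule gas listing.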

Steve

### Newman, Stuart

Feb 27, 2012, 7:46:09 PM
Attached is a brief note I wrote on these questions many years ago.

Stuart

Note on complex systems_JTB.pdf

### William R. Buckley

Feb 28, 2012, 10:39:26 AM
All:

Consider this model.  Assume we have two adiabatic vessels.  One has volume, the other does not.

Inside the volumetric adiabatic vessel we have an ideal gas at STP.  Inside the non-volumetric adiabatic vessel, there is only
heat energy.  Now, the STP gas has some entropy, and that entropy describes fully the state of the gas - it is *this much* disordered.

Now, we connect the two adiabatic vessels and open the valve between them.  Because there is no
volumetric change, V is constant.  Yet the added energy increases the entropy of the gas, and this
new entropy state corresponds to an equal increase in the information needed to describe the changed state.

Is this argument based upon some fallacy?

wrb

### Steve McGrew

Feb 28, 2012, 11:46:11 AM
What is the physical form of the heat energy?  And what is meant by a vessel without volume?

### Evgenii Rudnyi

Feb 28, 2012, 2:21:09 PM
Steve,

Could you please try applying your notion to the IT devices that
surround us (a hard disk, a flash memory, a DVD, etc.)? Will it work to
describe, for example, their information capacity?

Evgenii


### Evgenii Rudnyi

Feb 28, 2012, 2:29:11 PM
I have another question. What is the meaning of "entropy describes fully
the state of the gas"?

The entropy is a function of the number of moles, temperature, and
pressure, S(n, T, p). If you take the same gas, I am pretty sure that
you will find many states

S(n1, T1, p1) = S(n2, T2, p2) = S(n3, T3, p3) = ...

So it is not clear what you mean. The situation is even more complex, as
there are many different gases, say O2, N2, Ar, and so on. Again, we can
find many states where the entropy has the same numerical value for
different gases.
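For an ideal gas with constant heat capacity, such equal-entropy states are easy to exhibit explicitly. A sketch of my own (the standard-entropy constant below is illustrative, roughly that of argon):

```python
import math

R = 8.314          # gas constant, J/(mol K)
Cp = 2.5 * R       # molar heat capacity of a monatomic ideal gas

def S(n, T, p, S0=154.8, T0=298.15, p0=1e5):
    """S(n, T, p) for an ideal gas with constant Cp.
    S0 is a standard-state molar entropy (value here is illustrative)."""
    return n * (S0 + Cp * math.log(T / T0) - R * math.log(p / p0))

# Two different states with the same entropy: pick T2 freely, then choose
# p2 so that the two logarithmic terms cancel exactly.
n, T1, p1 = 1.0, 300.0, 1e5
T2 = 400.0
p2 = p1 * (T2 / T1) ** (Cp / R)

print(abs(S(n, T1, p1) - S(n, T2, p2)) < 1e-9)  # True: equal entropies
```

Since Cp·ln(T2/T1) = R·ln(p2/p1) by construction, the two terms cancel, and a whole one-parameter family of states shares one numerical entropy value.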

Evgenii

### Steve McGrew

Feb 28, 2012, 2:46:03 PM
Evgenii,
I think that it is reasonable to equate the information capacity of a system to the number of distinct accessible states, or perhaps to the base-2 logarithm of that number if we want to talk about bits.

In a flash memory, there are specific locations with binary states.  Each location, of course, is composed of a large number of atoms, so the state of a location is a collective state of the atoms in that location.  In a hard disk, the locations are not necessarily pre-defined, but again the states of the locations are collective states.  The number of locations is the number of bits, and the number of accessible states is 2^(number of bits).
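A minimal sketch of that bookkeeping (my own illustration, not from the thread):

```python
import math

def capacity_bits(accessible_states):
    """Information capacity in bits: base-2 logarithm of the number of
    distinct accessible states."""
    return math.log2(accessible_states)

# A memory with N binary locations has 2**N accessible collective
# states, so the capacity comes back out as exactly N bits.
print(capacity_bits(2 ** 64))   # 64.0
print(capacity_bits(1024))      # 10.0 -- ten binary locations
```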

### Evgenii Rudnyi

Feb 28, 2012, 3:12:30 PM
I agree, but then it does not relate, for example, with:

>>> A crystal in its lowest energy state has low algorithmic complexity,
>>> and therefore low information content, because one sentence or a short
>>> formula can describe the arrangement of its atoms in complete detail.
>>> A bottle of CO2 has high algorithmic complexity, and therefore high
>>> information content, because the position and momentum, orientation,
>>> electronic state and spin vector of every molecule would need to be
>>> listed in order to describe it in complete detail.

You cannot use a gas to store information for IT, you need a solid, and
higher temperatures are not good for IT either.

Evgenii

### Steve McGrew

Feb 28, 2012, 4:18:49 PM
Evgenii,
I guess I didn't understand your question the first time.
Of course it is difficult to store information (for doing computations) in a gas specifically because the states are not easily accessible.  Accessibility requires a sizable degree of "structure", but of course a gas has nearly negligible structure.  To the extent that the relevant structural features are degraded by increased temperatures, heat can reduce the accessibility of states and thus reduce the effective information storage capacity.  However, if there is no structure then the amount of information required to completely describe the state of the system is maximized even if the useful information storage capacity is consequently minimized.

So, the bottom line is that the term "information" has several meanings that are related but not equivalent.  In any debate about such things, the participants need to make sure they are using the same meanings.

I guess "useful information storage capacity" is loosely comparable to what I've termed "structure".

Steve

### William R. Buckley

Feb 29, 2012, 1:15:27 AM
The form of energy in the adiabatic vessel with no volume is infrared radiation.  And, how is this detail of importance to the question that I asked?

### Steve McGrew

Feb 29, 2012, 9:36:27 AM
Hi William,
It is important because electromagnetic radiation has entropy.  Coherent laser light has extremely low entropy.  Thermal radiation that is absorbed and re-emitted by the interior walls of a closed vessel has very high entropy.

Steve

### Evgenii Rudnyi

Feb 29, 2012, 3:14:22 PM
Steve,

Could you please say what the relationship is between information in IT
and information in the CO2?

"A bottle of CO2 has high algorithmic complexity, and therefore high
information content"

I completely agree that information has several meanings, and this is
probably the main problem when people discuss entropy and information.
But then it would be good to make definitions for the different meanings
of information. It might help to understand the problem better.

Evgenii

### William R. Buckley

Feb 29, 2012, 3:19:22 PM
Well then, choose your poison.  I'll go to the extreme and rely upon the uncertainty that is part and parcel of quantum mechanics!

While it is generally understood that such a case is nigh on impossible, it is also accepted as a possibility that a coffee cup
which is now at a temperature of 10 degrees C might suddenly obtain a large input of energy, and so be found at a temperature
of 20 degrees C: highly unlikely, but not impossible under the strictures of QM.

I don't care about the form of the energy, so you pick one with no inherent entropy.  Heck, idealised problems are the life-blood of physics.
This is how Einstein was able to understand the limits of travel, and the consequences for observation, such as by travel at the speed
of light.

So, the argument is that suddenly there is the appearance of a unit of energy that has no inherent entropy and which alters the
state of the adiabatically contained gas.

wrb

### Steve McGrew

Feb 29, 2012, 4:34:33 PM
Evgenii,

In CO2, "high algorithmic complexity" and the corresponding high information content equate to the number of bits needed to specify the position and momentum (within the uncertainties of QM) of each molecule.  Of course, a complete description would also need to include the atomic-level details of the vessel containing the CO2.  Because there are very few correlations in a gas, the number of bits is on the order of the number of molecules in the gas -- on the order of 10^24.  The number of different states is horrendously large: on the order of 2^(10^24).

In, say, a flash memory, we do not care about the atomic-level details of the device.  Rather, we care only about the accessible states of the device: those states we can control and detect via the input and output ports of the device.  The biggest currently available flash memories have a number of accessible bits on the order of 10^12.

The number of accessible bits, if we wanted to use a liter of gas at room temperature and normal atmospheric pressure as a computer data storage device, is probably limited to about ten bits (2^10 accessible states: whatever is necessary to specify the temperature to a practical degree of precision).  This is because we cannot control the motions of individual molecules; all we can do is heat or cool the gas and measure its temperature.

The number of accessible bits in a terabyte flash drive is about 8 trillion bits (2^(8 trillion) states).

The algorithmic complexity of  the contents of a flash drive depends on the contents.  If the whole drive is filled with logical "0's", then the algorithmic complexity is very low because it would take only a very small number of bits to say, "filled completely with zeroes".  This would be analogous to the algorithmic complexity of a CO2 crystal at absolute zero.  If the whole drive is filled with a maximally compressed portion of the Library of Congress, it would contain a collection of 0's and 1's with no detectable internal correlations.  That is, it would be random according to all statistical measures.  In order to describe the contents of the drive in that case, the state of each bit would need to be specified individually.  It would require at least 8 trillion bits to describe it.

In a living organism, the position and momentum of each molecule makes very little difference to the behavior and survival of the organism.  In a cell membrane, there is some degree of organization that is important.  The arrangement of the various organelles and cytoskeletal components is important but does not need to be controlled to the last detail.  Similarly, the arrangement of the various types of cells and extracellular components of a multicellular creature does not need to be controlled to the last detail.  If we fully understood a simple organism, we would be able to specify its state adequately in a relatively small number of bits, I'd guess something on the order of a thousand bits. (I could be off by a couple of orders of magnitude, but don't think so).  "Adequately" here means well enough to predict its behavior accurately.  On the other hand, it might take a substantially larger number of bits to specify the equations needed to *predict* the behavior, given the state.  The accessible information capacity of a bacterium would correspond roughly to the amount of information we could write into it and get back out without killing it.  Maybe that is on the order of ten thousand bits, maybe a couple of orders of magnitude higher.  But to specify the bacterium's structure in full detail -- to transmit it in a Star Trek type transporter -- would take something on the order of 10^18 bits.

When we describe the information contained in a flash memory, we ignore the structure.  But if we are comparing different kinds of data storage media, the atomic-level structure is very important.  In the first case, "information" is what we store in the device.  In the second case, "information" might be the full set of process steps and blueprints needed to make the device.

Regards,
Steve

### Steve McGrew

Feb 29, 2012, 4:57:45 PM
William,
If an unconstrained, collisionless blob of gas is illuminated by a fully coherent (zero entropy) laser beam, I think its entropy does not change.  The distribution of states in the gas changes, but in a highly constrained way.

Lots of experiments have been done in which seemingly irreversible processes can be reversed. In those cases, it is clear that the *information* needed to specify the original state is transformed, but not lost.  Reversing the transformation restores the original state.  Although it *looks* like entropy has increased when the transformation is first done, it turns out that it really hasn't.

If I understand your thought experiment correctly, the second vessel has zero volume, so the fact that it's a vessel is irrelevant.  You just want to abruptly introduce zero-entropy energy into the gas in the first vessel.  This of course is very different from illuminating an unconstrained blob of gas with a pulse of laser light.  In a very short time after the pulse, every molecule in the gas has hit other molecules or has hit the walls of the vessel, rebounding in unpredictable ways.

You're right, of course, that quantum mechanics complicates things.  If molecules were billiard balls, then complete knowledge of the initial state of gas and vessel and the details of the laser pulse would be sufficient to predict any later state, and therefore sufficient to describe the later state after a given time.  So, the entropy would not change.

Steve

### Evgenii Rudnyi

Mar 1, 2012, 2:46:48 PM
Steve,

I see a big difference between a flash memory and a gas. We write
information on a flash memory in order to use it. The same concerns
books: a book is written to be read.

A gas, on the other hand, is just a gas. When I model it, I need to
define some variables, this is true. But this concerns the gas model and
not the gas as such. When you speak about the information needed to
define a gas model, I can understand. When you speak about information
in the gas as such, I cannot follow you.

Evgenii

P.S. By the way, I am not sure that I understand why you say that the
content of the Library of Congress is random. I would say not. Whether
the content is compressed or not, in my view, does not change the fact
that the information in the books is not random.

Even if the compressed form looks random, it is actually not, as one can
decompress the archive and restore normal books.

### Evgenii Rudnyi

Mar 1, 2012, 2:49:57 PM
On 29.02.2012 22:57 Steve McGrew said the following:

...

> Lots of experiments have been done in which seemingly irreversible processes can
> be reversed. In those cases, it is clear that the *information* needed to
> specify the original state is transformed, but not lost. Reversing the
> transformation restores the original state. Although it *looks* like entropy has
> increased when the transformation is first done, it turns out that it really hasn't.

Could you please give an example? But please not a thought experiment,
rather a real one.

Evgenii

### Steve McGrew

Mar 1, 2012, 6:36:11 PM
Evgenii,
Of course there are enormous differences. I just wanted to note that a vessel filled with gas could, in fact, serve as a memory element if its temperature were controlled and detected by a computer.  For example, if the gas temperature is over 100 degrees C it might represent a "1", and if below 100 C it might represent a "0". Or, if the temperature could be controlled and detected to within one degree C in the range from 100 K to 400 K, it could be in any of 300 states -- about eight bits.  It certainly wouldn't be a very useful memory element!
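A quick check of that count (my arithmetic, not part of the exchange):

```python
import math

# One threshold (above/below 100 degrees C) gives 2 states -> 1 bit.
print(math.log2(2))    # 1.0
# 1-degree resolution from 100 K to 400 K gives 300 states -> ~8.2 bits.
print(math.log2(300))  # 8.22...
```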

A maximally compressed digital version of any information (e.g., the Library of Congress) is devoid of internal correlations -- because those internal correlations are what data compression reduces.  "Maximally compressed" would mean that there are no further internal correlations to exploit for compression.

What distinguishes a random sequence of bits from a nonrandom sequence is precisely the internal correlations.  The "degree of randomness" of a sequence corresponds to the length of the shortest possible algorithm that can generate the same sequence.  Any algorithm to generate the sequence efficiently will take advantage of internal correlations in the sequence. That is, it corresponds to the algorithmic complexity of the sequence.

A maximally compressed version of a sequence *IS* effectively the shortest algorithm that can generate the original sequence.  In practice, no attempt is made to maximally compress data.  Instead, a standard algorithm is used to detect and exploit a very limited subset of the possible correlations to produce a new sequence which, when run through an inverse of the algorithm, will regenerate the original sequence.  Consequently, some correlations always remain after ordinary data compression.

When you say, "Even if the compressed form looks random, it is actually not, as one can decompress the archive and restore normal books," I think you're stepping into a very messy topic.  It is not possible to decompress the archive without using the decompression algorithm, so the decompression algorithm effectively contains part of the information that's to be decompressed.  A bigger compression/decompression algorithm can allow a higher degree of compression.

Embryo development provides a very good analogy.  DNA is a highly compressed description of an organism (highly compressed, not maximally compressed).  The decompression algorithm resides in both the DNA and in the cellular machinery.  The genetic code and all the cellular machinery are themselves encoded in the DNA, but cannot do their job unless there is already some tRNA along with other key machinery waiting in the fertilized ovum to "boot up" the system.  And, it's worth noting that the principles of physics and chemistry are crucial parts of the decompression algorithm.  If we sent all the DNA in an E. coli bacterium to alien scientists on Alpha Centauri, they would not be able to "decompress" it.  If we included a set of tRNAs, they just might have a chance of success.

We can assign a meaning to a randomly generated string of bits, and as long as we don't forget what meaning we assigned to the string, the string has that meaning and is thus no longer "random" in the sense in which you say the compressed archive is non-random.  If we forget, the string has not changed, but it has lost its meaning.  If we lose the decompression algorithm, the compressed string that "contains" the Library of Congress can lose its meaning.

You asked for an example of a real experiment in which seemingly irreversible processes can be reversed.  Spin echo and photon echo experiments are real examples.  Here is a good one involving viscous liquid flow: http://www.youtube.com/watch?v=p08_KlTKP50.  In the photon & spin echo experiments, the 180 degree phase reversal pulse serves essentially the same purpose as reversing the direction of rotation in the viscous liquid experiment.

Regards,
Steve

### Newman, Stuart

Mar 1, 2012, 9:58:42 PM
In my opinion there is no sense at all in which DNA is a "description of an organism," compressed or not. To believe that it is would completely invalidate the notion of "embryo physics." Is DNA supposed to encode the laws of physics? Or are physical laws just the constant background to everything, applying equally and everywhere to every parcel of matter?

Stuart

### Steve McGrew

Mar 1, 2012, 10:18:17 PM
Stuart,
Physical laws are the background principles that drive the processes.
I probably haven't communicated what I mean by "description of an organism", if you think that to believe DNA is a "description of an organism" invalidates the notion of embryo physics.

Once I wrote a simple LOGO program that generated a forest of trees on a computer screen.  It could easily generate a forest containing vastly more pixels than the number of bits in the program.  Every time I ran the program, it generated a different forest with different trees, but the forest and trees had the same general character each time.  They always looked like spruce trees or oak trees, depending on the small handful of control parameters I gave to the program.

Maybe you wouldn't say that the LOGO program was a "description" of the trees or forest.  What word would you use?  It wasn't a blueprint, because the trees and forest were different each time.  It was a set of rules, and it made the forest grow but did not fully control the growth of the forest (because it included random functions).  It was a "prescription", maybe.
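A Python analogue of such a rule program (my sketch, not the original LOGO code): a handful of lines plus a random number generator produce a large structure that differs on every run but keeps the same general character.

```python
import random

def tree(depth, length, rng):
    """Segment lengths of a recursive binary tree, one entry per branch."""
    if depth == 0:
        return []
    # each branch spawns two shorter children with random length jitter
    left = tree(depth - 1, length * rng.uniform(0.6, 0.8), rng)
    right = tree(depth - 1, length * rng.uniform(0.6, 0.8), rng)
    return [length] + left + right

rng = random.Random()                       # a different forest every run
forest = [tree(10, 100.0, rng) for _ in range(50)]
print(sum(len(t) for t in forest))          # 51150 branches from ~10 lines of rules
```

The rules plus a seed occupy a few dozen bytes, yet each run emits 51,150 branch lengths (50 trees of 2^10 - 1 branches), no two runs alike.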

Steve

### Dr. Richard Gordon Ph.D.

Mar 1, 2012, 10:43:07 PM3/1/12
to embryo...@googlegroups.com, Stuart A. Newman, Stephen P. McGrew
Thursday, March 1, 2012 10:35 PM, Panacea, FL, USA

Dear Stuart & Steve,
I tend to agree with Steve. What I call his LOGO program is “the genetic program”. But the physics is the component of the genetic program that is usually taken for granted. In the case of the LOGO program, this physics has to do with the operation of the compiler and the structure and operation of the computer. In the case of the embryo the physics is what the Embryo Physics Course is about: forces generated and responded to by cells, cytoskeletal physics, differentiation waves, etc. That there is a stochastic component is part of the physics, whether it be due to a random number generator or Brownian motion.
Yours, -Dick


Dr. Richard (Dick) Gordon
Theoretical Biologist, Embryogenesis Center
Gulf Specimen Marine Laboratory (http://www.gulfspecimen.org)
Visiting Professor, Micro & Nanotechnology Institute, Old Dominion University
1-(850) 745-5011 or Skype: DickGordonCan
DickGo...@gmail.com

### Newman, Stuart

Mar 2, 2012, 10:48:24 AM3/2/12

Steve,

I understand that you can write a program that generates tree morphologies. But you designed the program. An organism’s DNA does not contain such a program. The program, if you want to call it that, resides in the entire material composition of the organism’s zygote, and only part of that is inscribed in DNA sequence.

The forms that we see unfolding in a present-day organism are not the execution of information in the DNA, but outcomes of a complex set of physical processes, only some of which are predictable from the physics acting on the contemporary materials (including the DNA). Some of the forms arose much earlier in evolutionary history, based on the cellular materials present at that time and the physical effects relevant to those materials.

Those original forms (if they were consistent with survival) acted as structural templates for subsequent canalizing evolution, so the present-day unfolding process can be attributed neither to present-day DNA nor to present-day DNA plus present-day physics. The explanation of the forms and the means of their generation must also take the historical dimension into account. The DNA sequence reflects this history, but only partially, and not in the form of a program.

Stuart

### William R. Buckley

Mar 2, 2012, 11:42:03 AM3/2/12
On the notion of genes as program, I tend to disagree with the trend in biology, which is to deny the correlation between the software of a computer and the genes of organisms.

I rather think the analogy is quite accurate.  The key is abandoning the notion that all software is procedural.  The genes of any organism, as a collective, act in the fashion of a highly parallel program which is functional, not procedural.
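One minimal sketch of this "parallel, functional" view is a toy Boolean gene-regulatory network (the gene names and rules below are invented for illustration): each gene's next state is a pure function of the whole current state, all genes update simultaneously, and there is no instruction pointer, only rules evaluated in parallel.

```python
RULES = {
    "a": lambda s: not s["c"],          # a is repressed by c
    "b": lambda s: s["a"],              # b is activated by a
    "c": lambda s: s["a"] and s["b"],   # c requires both a and b
}

def step(state):
    # synchronous update: every rule reads the same old state,
    # so the "program" has no sequential order of execution
    return {gene: rule(state) for gene, rule in RULES.items()}

state = {"a": True, "b": False, "c": False}
for _ in range(5):
    state = step(state)
    print(state)
# after five synchronous steps this particular network returns to its
# initial state: its behavior is a repeating cycle of expression patterns
```

Nothing here prescribes a sequence of instructions; the dynamics emerge from the rules acting in parallel, which is the sense in which the collective is functional rather than procedural.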

wrb

### William R. Buckley

Mar 2, 2012, 11:49:13 AM3/2/12