Information: a basic physical quantity or rather an emergence/supervenience phenomenon


Evgenii Rudnyi

Jan 15, 2012, 3:54:00 PM
to everyth...@googlegroups.com
On 14.01.2012 08:21 John Clark said the following:
> On Thu, Jan 12, 2012 Craig Weinberg<whats...@gmail.com> wrote:

> For heavens sake, I went into quite a lot of detail about how the
> code is executed so that protein gets made, and it could not be more
> clear that the cell factory contains digital machines.
>
>> They are not information.
>>
>
> According to you nothing is information and that is one reason it is
> becoming increasingly difficult to take anything you say seriously.

I should say that I also have difficulty with the term information. One
question, for example, is whether information belongs to physics or not.
Some physicists say that information is related to entropy and as such
is a basic physical quantity. I personally do not buy it: thermodynamics,
as it was designed, had nothing to do with information, and information
as such brings nothing to help solve thermodynamic problems (more on
this in [1]).

Let us consider, for example, a conventional thermodynamic problem:
improving the efficiency of a motor. Is the information concept helpful
in solving this problem? If we look at modern motors, we see that
nowadays they work together with controllers that allow us to drive the
efficiency toward the thermodynamic limit. The term information is
indeed helpful for developing a controller, but what about the
thermodynamic limit of the motor itself? Does information help here? In
my view, it does not.

In Gray's book on consciousness (Consciousness: Creeping up on the Hard
Problem) there is an interesting discussion of whether physics is enough
to explain biology. Gray's answer is yes, provided we add cybernetics
laws and evolution. Let me leave evolution aside and discuss only the
cybernetics laws, as this is exactly where, I think, information comes
into play. A short video from the Artificial Intelligence Class that I
recently attended would be a good introduction (an intelligent agent
sensing external information and then acting):

http://www.youtube.com/watch?v=cx3lV07w-XE

Thus, the question is about the relationship between physics and the
cybernetics laws. When we consider the Equation of Everything, are the
cybernetics laws already there, or do we still need to introduce them
separately? One possible answer would be that the cybernetics laws
emerge from, or supervene on, the laws of physics. I, however, do not
understand what this means. It probably has something to do with a
transition from quantity to quality, but I do not understand how that
happens either. To me, it remains magic.

Let me repeat a series of physical objects discussed recently
(see also [2][3]):

1) A rock;
2) A ballcock in the toilet;
3) A self-driving car;
4) A living cell.

Where along this series do we have cybernetics laws (information) and
where not? Can physics describe these objects without the cybernetics
laws? What do emergence and supervenience mean along this series? Any
ideas?

Evgenii

[1] http://blog.rudnyi.ru/2010/12/entropy-and-artificial-life.html
[2] http://blog.rudnyi.ru/2011/01/perception-feedback-and-qualia.html
[3] http://blog.rudnyi.ru/2011/02/rock-and-information.html

Craig Weinberg

Jan 15, 2012, 8:02:19 PM
to Everything List
On Jan 15, 3:54 pm, Evgenii Rudnyi <use...@rudnyi.ru> wrote:
> On 14.01.2012 08:21 John Clark said the following:
>  > On Thu, Jan 12, 2012  Craig Weinberg<whatsons...@gmail.com>  wrote:
>
> …
>
>  > For heavens sake, I went into quite a lot of detail about how the
>  > code is executed so that protein gets made, and it could not be more
>  > clear that the cell factory contains digital machines.
>  >
>  >> They are not information.
>  >>
>  >
>  > According to you nothing is information and that is one reason it is
>  > becoming increasingly difficult to take anything you say seriously.
>
> I should say that I also have difficulty with the term information. A
> question would for example if information belongs to physics or not.
> Some physicists say that information is related to the entropy and as
> such it is a basic physical quantity. I personally do not buy it, as
> thermodynamics, as it has been designed, had nothing to do with
> information and information as such brings nothing to help to solve
> thermodynamics problem (more to this end in [1]).

Yes! The word information in my opinion only applies to an agent which
can be informed. Without such an agent, information cannot exist, even
as a potential. Once you have an agent that has experience, the
character of that experience can be modified by the agent being
'informed' by an experience which changes how that agent perceives,
responds, and acts in the future. Information is a process which
begins and ends with an agent's experience.

>
> Let us consider for example a conventional thermodynamic problem:
> improving efficiency of a motor. Is the information concept is helpful
> to solve this problem? If we look at modern motors, then we see that
> nowadays they are working together with controllers that allows us to
> drive the efficiency to the thermodynamic limit. The term information is
> helpful indeed to develop a controller but what about the thermodynamic
> limit of a motor? Does information helps here? In my view, not.
>
> In the Gray's book on consciousness (Consciousness: Creeping up on the
> Hard Problem.) there is an interesting statement on if physics is enough
> to explain biology. Gray's answer is yes provided we add cybernetics
> laws and evolution. Let me leave evolution aside and discuss the
> cybernetics laws only as this is exactly where, I think, information
> comes into play. A good short video from the Artificial Intelligence
> Class that I have recently attended would be a good introduction (an
> intelligent agent sensing external information and then acting):
>
> http://www.youtube.com/watch?v=cx3lV07w-XE

Nice. His concept of a 'perception-action cycle' is what I call
sensorimotivation. The problem is that he relies on a third-person
structure called a 'control policy'. This control-policy concept, which
maps sensors to actuators, is entirely appropriate for programming
mechanisms (since they can't program themselves unless they are made of
self-programming materials like living cells), but the problem is that
in neurological agents the sensor and the actuator are the same thing,
so no control policy is needed. In the case of a human nervous system,
it functions as a whole, with afferent and efferent nerves being
organically specialized divisions that make up the sensorimotive
capacity of one human being.
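
A minimal sketch of what such a control policy looks like, using the
toilet ballcock discussed below as the mechanism (the threshold and the
function names are only illustrative):

def ballcock_policy(water_level, full_level=1.0):
    """A control policy: a fixed third-person mapping from a sensor reading
    (the float's water level) to an actuator command (the fill valve)."""
    return "close_valve" if water_level >= full_level else "open_valve"

def perception_action_cycle(sensor_readings):
    """The sense -> policy -> act loop; the 'knowledge' lives in the policy,
    not in the mechanism itself."""
    for level in sensor_readings:
        print(f"sensed level {level:.2f} -> {ballcock_policy(level)}")

perception_action_cycle([0.20, 0.60, 0.95, 1.00, 1.10])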

With a rock, you have a very low sensorimotive development. It knows
how to respond to its environment in terms of heat and pressure,
velocity, fracture, etc. With a ballcock in a toilet you have separate
parts, each with a very low, rock-like sensorimotive capacity. If the
parts were literally parts of the same thing like our afferent and
efferent nerves are part of a nervous system, then you would have
something with slightly less primitive characteristics. Maybe on par
with a bubble of oil in water. The problem is that we overlook the
fact that the assembly inside the toilet is only our human reading of
these attached separate parts. The parts don't know that they are
attached. The handle doesn't know the reason that the ballcock is
moving it up and down. Not the case with a nervous system. It knows
what the body it is a part of is doing. It all came from one single
cell, not 12 different factories. It's a completely different thing in
reality, but our perception is hard to question so fundamentally. It's
like trying to not read these words as English.

>
> Thus, the question would be about the relationship between physics and
> cybernetics laws. When we consider the Equation of Everything, are the
> cybernetics laws already there or we still need to introduce them
> separately? One of possible answers would be that the cybernetics laws
> emerge or supervene on the physics laws. I however does not understand
> what this means. It probably has something to do with a transition
> between quantity and quality, but I do not understand how it happens
> either. For myself, it remains a magic.
>
> Let me repeat a series from physical objects discussed already recently
> (see also [2][3]):
>
> 1) A rock;
> 2) A ballcock in the toilet;
> 3) A self-driving car;
> 4) A living cell.

A self-driving car is the same as the ballcock. An assembly of dumb
parts which we program to simulate what seems like (trivial)
intelligence to us. In reality it has no more intelligence than
watching a cartoon of living cars. A living cell is nothing like any
of the other examples. A dead cell would be comparable to a rock, but
a living cell has a sensorimotive capacity which cannot be reduced
beneath the cellular level. It is a biological atom.

>
> Where do we have the cybernetics laws (information) and where not? Can
> physics describe these objects without the cybernetics laws? What
> emergence and superveniece mean along this series? Any idea?

We have cybernetics laws only where we can control the behavior of
matter. They aren't very useful for understanding our experience.
Physics can only describe the components of the objects, but the
function of the assemblies of objects is subject to interpretation
rather than physical law. A self-driving car is just electronic parts
that happen to be in a car which we understand to be driving 'itself',
but which is actually just executing a program based on an abstract model of
driving.

Emergence is useful only in the context of self-evident models. A
triangle can emerge from three points, a square from four, but the
smell of cabbage cannot emerge from any quantity of points. Cybernetic
laws supervene on sensorimotive pattern recognition, not the other way
around. A toilet ballcock can only perceive the mechanical forces
being applied to its various physical parts. It has no capacity to
recognize its extended context and thus can't ever know if it's
broken or try to fix itself like a living cell can.

Craig

John Clark

Jan 18, 2012, 12:47:12 PM
to everyth...@googlegroups.com
On Sun, Jan 15, 2012 at 3:54 PM, Evgenii Rudnyi <use...@rudnyi.ru> wrote:

 " Some physicists say that information is related to the entropy"

That is incorrect: ALL physicists say that information is related to entropy. There are quite a number of definitions of entropy; one I like, although not as rigorous as some, does convey the basic idea: entropy is a measure of the number of ways the microscopic structure of something can be changed without changing its macroscopic properties. Thus, the living human body has very low entropy because there are relatively few changes that could be made in it without a drastic change in macroscopic properties, like being dead; a bucket of water has a much higher entropy because there are lots of ways you could change the microscopic positions of all those water molecules and it would still look like a bucket of water; cool the water into ice and you have less entropy, because the molecules line up into an orderly lattice, so there are fewer changes you could make. The ultimate high-entropy object is a Black Hole, because whatever is inside one, from the outside any Black Hole can be completely described with just 3 numbers: its mass, spin and electrical charge.
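
A toy illustration of that definition (my own, not from the post): take N
two-state particles, call the macrostate the number of them that are
excited, and count the microscopic arrangements W compatible with it;
then S = k_B ln W.

import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(n_particles, n_excited):
    """S = k_B ln W, with W the number of microstates (arrangements)
    compatible with the macrostate 'n_excited of n_particles excited'."""
    w = math.comb(n_particles, n_excited)
    return K_B * math.log(w)

for k in (0, 1, 10, 50):
    print(f"{k:2d} of 100 excited: S = {boltzmann_entropy(100, k):.2e} J/K")
# Few compatible rearrangements (k = 0 or 1) -> low entropy;
# the half-and-half macrostate admits the most rearrangements -> highest entropy.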

  John K Clark 


Evgenii Rudnyi

Jan 18, 2012, 2:13:07 PM
to everyth...@googlegroups.com
On 18.01.2012 18:47 John Clark said the following:

If you look around, you may still find scientists who work with
classical thermodynamics (search, for example, for CALPHAD). Whether you
refer to them as physicists or not is your choice. Anyway, in
experimental thermodynamics people determine entropies, for example from
the CODATA tables

http://www.codata.org/resources/databases/key1.html

S° (298.15 K)
J K-1 mol-1

Ag cr 42.55 ± 0.20
Al cr 28.30 ± 0.10

Do you mean that 1 mole of Ag has more information than 1 mole of Al at
298.15 K?

Also remember that at constant volume dS = (Cv/T) dT and dU = Cv dT. If
the entropy is information, then its derivative must be related to
information as well. Hence Cv must be related to information. This,
however, means that the energy is also somehow related to information.
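
Written out, the constant-volume relations being invoked are:

\[
\left(\frac{\partial U}{\partial T}\right)_V = C_V, \qquad
\left(\frac{\partial S}{\partial T}\right)_V = \frac{C_V}{T}, \qquad
S(T_2) - S(T_1) = \int_{T_1}^{T_2} \frac{C_V(T)}{T}\, dT .
\]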

Finally, the entropy is defined by the Second Law, and it would be best
to stick to this definition. Only in this case is it possible to
understand what we are talking about.

Evgenii
--
http://blog.rudnyi.ru

Russell Standish

Jan 18, 2012, 5:42:43 PM
to everyth...@googlegroups.com
> Ag cr 42.55 ± 0.20
> Al cr 28.30 ± 0.10

>
> Do you mean that 1 mole of Ag has more information than 1 mole of Al
> at 298.15 K?
>
> Also remember that at constant volume dS = (Cv/T) dT and dU = CvdT.
> If the entropy is information then its derivative must be related to
> information as well. Hence Cv must be related to information. This
> however means that the energy also somehow related to information.
>
> Finally, the entropy is defined by the Second Law and the best would
> be to stick to this definition. Only in this case, it is possible to
> understand what we are talking about.
>
> Evgenii
> --
> http://blog.rudnyi.ru
>

Evgenii, while you may be right that some physicists (mostly
experimentalists) work in thermodynamics without recourse to the
notion of information, and chemists even more so, it is also true that
the modern theoretical understanding of entropy (and indeed
thermodynamics) is information-based.

This trend really became mainstream with Landauer's work demonstrating
thermodynamic limits of information processing in the 1960s, which
turned earlier speculations by the likes of Schroedinger and Brillouin
into something that couldn't be ignored, even by experimentalists.
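
(The specific result is Landauer's bound: erasing one bit of information
dissipates at least

\[
E_{\min} = k_B T \ln 2
\]

of energy as heat.)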

This trend of an information basis to physics has only accelerated
in my professional lifetime - I've seen people like Hawking discuss
information processing of black holes, and we've seen concepts like the
Bekenstein bound linking the geometry of space to information capacity.

David Deutsch is surely backing a winning horse to point out that
algorithmic information theory must be a foundational strand of the
"fabric of reality".

Cheers

--

----------------------------------------------------------------------------
Prof Russell Standish Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics hpc...@hpcoders.com.au
University of New South Wales http://www.hpcoders.com.au
----------------------------------------------------------------------------

meekerdb

Jan 19, 2012, 12:37:13 AM
to everyth...@googlegroups.com
> Ag cr 42.55 ± 0.20
> Al cr 28.30 ± 0.10

>
> Do you mean that 1 mole of Ag has more information than 1 mole of Al at 298.15 K?

Yes, it has more internal degrees of freedom, so it takes the addition of more energy
to increase the ones we measure as temperature.

Brent

Craig Weinberg

Jan 19, 2012, 10:21:25 AM
to Everything List
This suggests to me that a molecule of DNA belonging to a kangaroo
could have no more information than the same molecule with the primary
sequence scrambled into randomness or 'blanked out' with a single
repeating A-T base pair. That would seem to make this definition of
information the exact opposite of the colloquial meaning of the term.
A blank hard drive could have more information than one full of billions
of documents if the platters were at different temperatures?

Craig

meekerdb

Jan 19, 2012, 12:36:48 PM
to everyth...@googlegroups.com

That's because the colloquial meaning of the terms takes into account the environment and
which form of information can be causally effective.

Brent

Evgenii Rudnyi

Jan 19, 2012, 2:03:41 PM
to everyth...@googlegroups.com
Russell,

I know that many physicists identify entropy with information.
Recently I had a nice discussion on biotaconv, and people pointed out
that Edwin T. Jaynes was presumably the first to make such a connection
(Information theory and statistical mechanics, 1957). Google Scholar
shows that his paper has been cited more than 5000 times; that is
impressive, and it shows that this view is indeed in a way mainstream.

I have studied Jaynes's papers but I have gotten stuck on, for example, passages like this:

“With such an interpretation the expression “irreversible process”
represents a semantic confusion; it is not the physical process that is
irreversible, but rather our ability to follow it. The second law of
thermodynamics then becomes merely the statement that although our
information as to the state of a system may be lost in a variety of
ways, the only way in which it can be gained is by carrying out further
measurements.”

“It is important to realize that the tendency of entropy to increase is
not a consequence of the laws of physics as such, … . An entropy
increase may occur unavoidably, due to our incomplete knowledge of the
forces acting on a system, or it may be entirely voluntary act on our part.”

This is beyond my understanding. As I have mentioned, I do not buy it;
I still consider the entropy as it has been defined by, for example, Gibbs.

Basically, I do not understand what the term information then adds. One
can certainly state that information is the same as the entropy (we are
free with definitions, after all). Yet I miss the meaning of that. Let
me put it this way: we have the thermodynamic entropy and then the
informational entropy as defined by Shannon. The first is used to design
a motor and the second to design a controller. Now let us suppose that
these two entropies are the same. What does this change in the design of
a motor and a controller? In my view, nothing.
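
For reference, the two quantities being set side by side here are, formally,

\[
S_{\mathrm{Gibbs}} = -k_B \sum_i p_i \ln p_i , \qquad
H_{\mathrm{Shannon}} = -\sum_i p_i \log_2 p_i ,
\]

so that S = (k_B ln 2) H when both are evaluated over the same probability
distribution; the question raised here is whether that formal identity has
any engineering content.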

By the way, have you seen the answer to my question:

>> Also remember that at constant volume dS = (Cv/T) dT and dU =
>> CvdT. If the entropy is information then its derivative must be
>> related to information as well. Hence Cv must be related to
>> information. This however means that the energy also somehow
>> related to information.

If the entropy is the same as information, then through the derivatives
all thermodynamic properties are related to information as well. I am
not sure this makes sense with respect, for example, to designing a
self-driving car.

I am aware of works that estimated the thermodynamic limit (of order kT)
for processing information. I do not see, however, how this proves the
equivalence of information and entropy.

Evgenii

P.S. For a long time, people have identified entropy with chaos. I have
recently read a nice book on this, Entropy and Art by Arnheim (1971).
One quote:

"The absurd consequences of neglecting structure but using the concept
of order just the same are evident if one examines the present
terminology of information theory. Here order is described as the
carrier of information, because information is defined as the opposite
of entropy, and entropy is a measure of disorder. To transmit
information means to induce order. This sounds reasonable enough. Next,
since entropy grows with the probability of a state of affairs,
information does the opposite: it increases with its improbability. The
less likely an event is to happen, the more information does its
occurrence represent. This again seems reasonable. Now what sort of
sequence of events will be least predictable and therefore carry a
maximum of information? Obviously a totally disordered one, since when
we are confronted with chaos we can never predict what will happen next.
The conclusion is that total disorder provides a maximum of information;
and since information is measured by order, a maximum of order is
conveyed by a maximum of disorder. Obviously, this is a Babylonian
muddle. Somebody or something has confounded our language."

--
http://blog.rudnyi.ru


On 18.01.2012 23:42 Russell Standish said the following:

Evgenii Rudnyi

Jan 19, 2012, 2:06:38 PM
to everyth...@googlegroups.com
On 19.01.2012 06:37 meekerdb said the following:

> On 1/18/2012 11:13 AM, Evgenii Rudnyi wrote:

...

>> If you look around you may still find species of scientists who
>> still are working with classical thermodynamics (search for example
>> for CALPHAD). Well, if you refer to them as physicists or not, it
>> is your choice. Anyway in experimental thermodynamics people
>> determine entropies, for example from CODATA tables
>>
>> http://www.codata.org/resources/databases/key1.html
>>
>> S° (298.15 K) J K-1 mol-1
>>
>> Ag cr 42.55 ± 0.20 Al cr 28.30 ± 0.10
>>
>> Do you mean that 1 mole of Ag has more information than 1 mole of
>> Al at 298.15 K?
>
> Yes, it has more internal degrees of freedom so that it takes
> addition of more energy in order to increase those we measure as
> temperature.

Could you please explain, then, why engineers do not use the CODATA/JANAF
tables to find the best material for storing information?

Evgenii

meekerdb

Jan 19, 2012, 2:41:01 PM
to everyth...@googlegroups.com
On 1/19/2012 11:06 AM, Evgenii Rudnyi wrote:
> On 19.01.2012 06:37 meekerdb said the following:
>> On 1/18/2012 11:13 AM, Evgenii Rudnyi wrote:
>
> ...
>
>>> If you look around you may still find species of scientists who
>>> still are working with classical thermodynamics (search for example
>>> for CALPHAD). Well, if you refer to them as physicists or not, it
>>> is your choice. Anyway in experimental thermodynamics people
>>> determine entropies, for example from CODATA tables
>>>
>>> http://www.codata.org/resources/databases/key1.html
>>>
>>> S° (298.15 K) J K-1 mol-1
>>>
>>> Ag cr 42.55 ± 0.20 Al cr 28.30 ± 0.10
>>>
>>> Do you mean that 1 mole of Ag has more information than 1 mole of
>>> Al at 298.15 K?
>>
>> Yes, it has more internal degrees of freedom so that it takes
>> addition of more energy in order to increase those we measure as
>> temperature.
>
> Could you please explain then why engineers do not use the CODATA/JANAF Tables to find
> the best material to keep information?

Because they are interested in information that they can insert and retrieve. I once
invented write-only-memory, but it didn't sell. :-)

Brent

Evgenii Rudnyi

Jan 19, 2012, 3:34:32 PM
to everyth...@googlegroups.com
On 19.01.2012 20:41 meekerdb said the following:

> On 1/19/2012 11:06 AM, Evgenii Rudnyi wrote:
>> On 19.01.2012 06:37 meekerdb said the following:
>>> On 1/18/2012 11:13 AM, Evgenii Rudnyi wrote:
>>
>> ...
>>
>>>> If you look around you may still find species of scientists
>>>> who still are working with classical thermodynamics (search for
>>>> example for CALPHAD). Well, if you refer to them as physicists
>>>> or not, it is your choice. Anyway in experimental
>>>> thermodynamics people determine entropies, for example from
>>>> CODATA tables
>>>>
>>>> http://www.codata.org/resources/databases/key1.html
>>>>
>>>> S° (298.15 K) J K-1 mol-1
>>>>
>>>> Ag cr 42.55 ± 0.20 Al cr 28.30 ± 0.10
>>>>
>>>> Do you mean that 1 mole of Ag has more information than 1 mole
>>>> of Al at 298.15 K?
>>>
>>> Yes, it has more internal degrees of freedom so that it takes
>>> addition of more energy in order to increase those we measure as
>>> temperature.
>>
>> Could you please explain then why engineers do not use the
>> CODATA/JANAF Tables to find the best material to keep information?
>
> Because they are interested in information that they can insert and
> retrieve. I once invented write-only-memory, but it didn't sell. :-)

Well, but this shows that physicists and engineers mean different things
by information. It would be good, then, to distinguish them.

Evgenii

> Brent
>

John Clark

Jan 19, 2012, 4:29:58 PM
to everyth...@googlegroups.com
On Thu, Jan 19, 2012 at 10:21 AM, Craig Weinberg <whats...@gmail.com> wrote:

"This suggests to me that a molecule of DNA belonging to a kangaroo could have no more information than the same molecule with the primary sequence scrambled into randomness

That is correct, it would have the same quantity of information, but most would be of the opinion that the quality has changed.
 
or 'blanked out' with a single repeating A-T base pair.

No, if it's repeating then it would have less information; that is to say, it would take less information to describe the result.
 
"That would seem to make this definition of information the exact opposite of the colloquial meaning of the term."

That can sometimes happen because mathematics can only deal in the quantity of information, not its quality. Quality is a value judgement that changes from person to person, and mathematics does not make value judgements; but the quantity of something is objective and universal, so mathematics can talk about that. So yes, there is much more information in a bucket of water than in our DNA, but most human beings are more interested in our genes than in the astronomical number of micro-states in a bucket of water. That is my opinion too, but a bucket of water may look at it differently, and there is no disputing matters of taste. But both the bucket and I would agree on the amount of information in the DNA and in the bucket, even if we disagree on which is more important.
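
A crude way to see the quantity/quality split (my own illustration, using
zeroth-order Shannon entropy from symbol frequencies, which is blind to
meaning and to order):

import math
import random
from collections import Counter

def bits_per_symbol(seq):
    """Zeroth-order Shannon entropy (bits/symbol) from observed symbol frequencies."""
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in Counter(seq).values())

original  = "".join(random.choice("ACGT") for _ in range(10000))  # stand-in for a real sequence
scrambled = "".join(random.sample(original, len(original)))       # same bases, order destroyed
repeating = "AT" * 5000                                           # the 'blanked out' case

for name, seq in (("original", original), ("scrambled", scrambled), ("repeating A-T", repeating)):
    print(f"{name:13s}: {bits_per_symbol(seq):.2f} bits/base")
# By this measure the scrambled sequence carries the same quantity as the original,
# while the strictly repeating one carries less -- quality never enters into it.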

  John K Clark

 


Craig Weinberg

Jan 19, 2012, 5:05:10 PM
to Everything List
On Jan 19, 12:36 pm, meekerdb <meeke...@verizon.net> wrote:

> That's because the colloquial meaning of the terms takes into account the environment and
> which form of information can be causally effective.

How is any one form of information more or less likely to be causally
effective than any other form? This is degenerating into pure fantasy
where information is a magical wildcard. If information cannot inform
- i.e., if a disk has been thoroughly and permanently erased - what has
been lost, if not information?

Craig

Craig Weinberg

Jan 19, 2012, 5:28:25 PM
to Everything List
On Jan 19, 4:29 pm, John Clark <johnkcl...@gmail.com> wrote:
> On Thu, Jan 19, 2012 at 10:21 AM, Craig Weinberg <whatsons...@gmail.com>wrote:
>
> "This suggests to me that a molecule of DNA belonging to a kangaroo could
>
> > have no more information than the same molecule with the primary sequence
> > scrambled into randomness
>
> That is correct, it would have the same quantity of information, but most
> would be of the opinion that the quality has changed.

Ah, so by information you mean 'not information at all'? I thought
that the whole point of information theory is to move beyond quality
into pure quantification. The reason that most would be of the opinion
that the quality has changed is the same reason that most would be of
the opinion that a person wearing no clothes is naked. Not to pick on
anyone, but tbh, the suggestion that information can be defined as not
having anything to do with the difference between order and the
absence of order is laughably preposterous in a way that would impress
both Orwell and Kafka at the same time. I think it is actually one of
the biggest falsehoods that I have ever heard.

>
> > or 'blanked out' with a single repeating A-T base pair.
>
> No, if its repeating then it would have less information, that is to say it
> would take less information to describe the result.

Of course, but how does that jibe with the notion that information is
molecular entropy? How does A-T A-T A-T or G-C G-C G-C guarantee fewer
internal degrees of freedom within a DNA molecule than A-T G-C A-T?
What if you warm it up or cool it down? It doesn't make any sense that
there would be a physical difference which corresponds to the degree
to which a genetic sequence is non-random or non-monotonous. If I
have red Legos and white Legos, and I build two opposite monochrome
houses and one of mixed blocks, how in the world does that affect the
entropy of the plastic bricks in any way?

>
> > "That would seem to make this definition of information the exact opposite
> > of the colloquial meaning of the term."
>
> That can sometimes happen because mathematics can only deal in the quantity
> of information not it's quality. Quality is a value judgement and changes
> from person to person and mathematics does not make value judgements, but
> the quantity of something is objective and universal so mathematics can
> talk about that. So yes, there is much more information in a bucket of
> water than in our DNA , but most human beings are more interested in our
> genes than the astronomical number of micro-states in a bucket of water.
> That is my opinion too but a bucket of water may look at it differently and
> there is no disputing matters of taste. But both the bucket and I would
> agree on the amount of information in the DNA and in the bucket even if we
> disagree on which is more important.

I see no reason to use the word information at all for this. It sounds
like you are just talking about entropy to me. The idea that a bucket
of water has more 'information' than DNA is meaningless. I'm not
drinking that Kool-Aid, sorry. If you know of any physicists who are
willing to buy my new mega information storage water buckets for only
twice the price of conventional RAID arrays though, I will gladly pay
you a commission.

Craig


meekerdb

Jan 19, 2012, 5:40:34 PM
to everyth...@googlegroups.com
On 1/19/2012 2:05 PM, Craig Weinberg wrote:
> On Jan 19, 12:36 pm, meekerdb<meeke...@verizon.net> wrote:
>
>> That's because the colloquial meaning of the terms takes into account the environment and
>> which form of information can be causally effective.
> How is one any form of information more or less likely to be causally
> effective than any other form?

Would you rather have an instruction manual in English or Urdu?

Brent

Bruno Marchal

Jan 19, 2012, 6:12:16 PM
to everyth...@googlegroups.com
On 19 Jan 2012, at 20:41, meekerdb wrote:

On 1/19/2012 11:06 AM, Evgenii Rudnyi wrote:
On 19.01.2012 06:37 meekerdb said the following:
On 1/18/2012 11:13 AM, Evgenii Rudnyi wrote:
snip

Could you please explain then why engineers do not use the CODATA/JANAF Tables to find the best material to keep information?

Because they are interested in information that they can insert and retrieve.  I once invented write-only-memory, but it didn't sell. :-)

Hmm... It might have interested the psycho-analysts, and perhaps the revisionists too. Improve your marketing strategy!

I could buy some from you for my list of boring urgent tasks!

(Like sending my obsolete list of boring urgent tasks to a black hole. Find a way to prevent evaporation! If that is possible.)

I think that physically write-only-memory might not exist. I think the core of physics might be very symmetrical, reversible. A group probably. There is no place where you can "really" hide information for long. 

Bruno




Brent


Evgenii


Brent


Also remember that at constant volume dS = (Cv/T) dT and dU = CvdT.

If the entropy is information then its derivative must be related
to information as well. Hence Cv must be related to information.
This however means that the energy also somehow related to
information.

Finally, the entropy is defined by the Second Law and the best
would be to stick to this definition. Only in this case, it is
possible to understand what we are talking about.

Evgenii






Russell Standish

Jan 19, 2012, 11:59:42 PM
to everyth...@googlegroups.com
On Thu, Jan 19, 2012 at 08:03:41PM +0100, Evgenii Rudnyi wrote:
> Russell,
>
> I know that many physicists identify the entropy with information.
> Recently I had a nice discussion on biotaconv and people pointed out
> that presumably Edwin T. Jaynes was the first to make such a
> connection (Information theory and statistical mechanics, 1957).
> Google Scholar shows that his paper has been cited more than 5000
> times, that is impressive and it shows indeed that this is in a way
> mainstream.

Because I tend to think of "negentropy", which is really another term
for information, I tend to give priority to Schroedinger who wrote
about the topic in the early 40s. But Jaynes was certainly
instrumental in establishing the information-based foundations of
statistical physics, even before information was properly defined (it
wasn't until the likes of Kolmogorov, Chaitin and Solomonoff in
the 60s that information was really understood).

But Landauer in the late 60s was probably the first to make physicists
really wake up to the concept of physical information.

But then, I'm not a science historian, so what would I know :).

>
> I have studied Jaynes papers but I have been stacked with for example
>

... snip ...

>
> Basically I do not understand what the term information then brings.
> One can certainly state that information is the same as the entropy
> (we are free with definitions after all). Yet I miss the meaning of
> that. Let me put it this way, we have the thermodynamic entropy and
> then the informational entropy as defined by Shannon. The first used
> to designe a motor and the second to design a controller. Now let us
> suppose that these two entropies are the same. What this changes in
> a design of a motor and a controller? In my view nothing.
>

I can well recommend Denbigh & Denbigh's book from the 80s - its a bit
more of a modern understanding of the topic than Jaynes :)

@book{Denbigh-Denbigh87,
author = {Denbigh, K. G. and Denbigh, J.},
publisher = { Cambridge UP},
title = { Entropy in Relation to Incomplete Knowledge},
year = { 1987},
}


> By the way, have you seen the answer to my question:
>
> >> Also remember that at constant volume dS = (Cv/T) dT and dU =
> >> CvdT. If the entropy is information then its derivative must be
> >> related to information as well. Hence Cv must be related to
> >> information. This however means that the energy also somehow
> >> related to information.
>
> If the entropy is the same as information, than through the
> derivatives all thermodynamic properties are related to information
> as well. I am not sure if this makes sense in respect for example to
> design a self-driving car.
>

The information embodied in the thermodynamic state is presumably not
relevant to the design of a self-driving car. By the same token,
thermodynamic treatment (typically) discards a lot of information
useful for engineering.

> I am aware of works that estimated the thermodynamic limit (kT) to
> process information. I do not see however, how this proves the
> equivalence of information and entropy.
>
> Evgenii
>
> P.S. For a long time, people have identified the entropy with chaos.
> I have recently read a nice book to this end, Entropy and Art by
> Arnheim, 1971, it is really nice. One quote:
>

I guess this is the original meaning of chaos, not the more modern
meaning referring to "low-dimensional dynamical systems having strange
attractors".

> "The absurd consequences of neglecting structure but using the
> concept of order just the same are evident if one examines the
> present terminology of information theory. Here order is described
> as the carrier of information, because information is defined as the
> opposite of entropy, and entropy is a measure of disorder. To
> transmit information means to induce order. This sounds reasonable
> enough. Next, since entropy grows with the probability of a state of
> affairs, information does the opposite: it increases with its
> improbability. The less likely an event is to happen, the more
> information does its occurrence represent. This again seems
> reasonable. Now what sort of sequence of events will be least
> predictable and therefore carry a maximum of information? Obviously
> a totally disordered one, since when we are confronted with chaos we
> can never predict what will happen next.

This rather depends on whether the disorder is informationally
significant. This is context dependent. I have a discussion on this
(it relates to the Kolmogorov idea that random sequences have maximum
complexity) in my paper "On Complexity and Emergence". I also touch on
the theme in my book "Theory of Nothing", which I know you've read!

> The conclusion is that
> total disorder provides a maximum of information;

Total disorder corresponds to a maximum of entropy. Maximum entropy
minimises the amount of information.

> and since
> information is measured by order, a maximum of order is conveyed by
> a maximum of disorder. Obviously, this is a Babylonian muddle.
> Somebody or something has confounded our language."
>

I would say it is many people, rather than just one. I wrote "On
Complexity and Emergence" in response to the amount of unmitigated
tripe I've seen written about these topics.

Craig Weinberg

Jan 20, 2012, 7:42:18 AM
to Everything List
On Jan 19, 5:40 pm, meekerdb <meeke...@verizon.net> wrote:
> On 1/19/2012 2:05 PM, Craig Weinberg wrote:

> > How is one any form of information more or less likely to be causally
> > effective than any other form?
>
> Would you rather have an instruction manual in English or Urdu?

Since I tend to put instruction manuals in a drawer and never look at
them, I would rather have the Urdu one as a novelty.

What difference does it make what I would rather have though? Both the
English and Urdu manuals are equally informative or non-informative
objectively (assuming they are equivalent translations), and neither
of them is causally effective without a subjective interpreter who is
causally effective.

Craig

Evgenii Rudnyi

Jan 21, 2012, 7:25:44 AM
to everyth...@googlegroups.com
On 20.01.2012 05:59 Russell Standish said the following:

> On Thu, Jan 19, 2012 at 08:03:41PM +0100, Evgenii Rudnyi wrote:

...

>>
>> Basically I do not understand what the term information then
>> brings. One can certainly state that information is the same as the
>> entropy (we are free with definitions after all). Yet I miss the
>> meaning of that. Let me put it this way, we have the thermodynamic
>> entropy and then the informational entropy as defined by Shannon.
>> The first used to designe a motor and the second to design a
>> controller. Now let us suppose that these two entropies are the
>> same. What this changes in a design of a motor and a controller? In
>> my view nothing.
>>
>

> I can well recommend Denbigh& Denbigh's book from the 80s - its a


> bit more of a modern understanding of the topic than Jaynes :)
>
> @book{Denbigh-Denbigh87, author = {Denbigh, K. G. and Denbigh, J.},
> publisher = { Cambridge UP}, title = { Entropy in Relation to
> Incomplete Knowledge}, year = { 1987}, }

Thanks. On biotaconv they have recommended John Avery's "Information
Theory and Evolution" but I think I have already satisfied my curiosity
with Jaynes's two papers. My personal feeling is as follows:

1) The concept of information is useless in conventional thermodynamic
problems. Let us take for example the Fe-C phase diagram

http://www.calphad.com/graphs/Metastable%20Fe-C%20Phase%20Diagram.gif

What does information have to do with the entropies of the phases in this
phase diagram? Do you mean that I will find an answer in Denbigh's book?

2) If physicists say that information is the entropy, they must take it
literally and then apply experimental thermodynamics to measure
information. This however seems not to happen.

3) I am working with engineers developing mechatronics products.
Thermodynamics (hence the entropy) is there as well as information.
However, I have not met a practitioner yet who makes a connection
between the entropy and information.

>
>> By the way, have you seen the answer to my question:
>>
>>>> Also remember that at constant volume dS = (Cv/T) dT and dU =
>>>> CvdT. If the entropy is information then its derivative must
>>>> be related to information as well. Hence Cv must be related to
>>>> information. This however means that the energy also somehow
>>>> related to information.
>>
>> If the entropy is the same as information, than through the
>> derivatives all thermodynamic properties are related to
>> information as well. I am not sure if this makes sense in respect
>> for example to design a self-driving car.
>>
>
> The information embodied in the thermodynamic state is presumably
> not relevant to the design of a self-driving car. By the same token,
> thermodynamic treatment (typically) discards a lot of information
> useful for engineering.

Sorry, I do not understand what this means.

>> I am aware of works that estimated the thermodynamic limit (kT) to
>> process information. I do not see however, how this proves the
>> equivalence of information and entropy.
>>
>> Evgenii

...

>> and since information is measured by order, a maximum of order is
>> conveyed by a maximum of disorder. Obviously, this is a Babylonian
>> muddle. Somebody or something has confounded our language."
>>
>
> I would say it is many people, rather than just one. I wrote "On
> Complexity and Emergence" in response to the amount of unmitigated
> tripe I've seen written about these topics.
>
>

I have found your work on arxiv.org and I will look at it. Thank you
for mentioning it.

Evgenii

meekerdb

Jan 21, 2012, 2:00:47 PM
to everyth...@googlegroups.com

It does happen. The number of states, i.e. the information, available from a black hole
is computed from its thermodynamic properties, as calculated by Hawking. At a more
conventional level, counting the states available to molecules in a gas can be used to
determine the specific heat of the gas and vice versa. The reason the thermodynamic
measures and the information measures are treated separately in engineering problems is
that the information that is important to engineering is infinitesimal compared to the
information stored in the microscopic states. So the latter is considered only in terms
of a few macroscopic averages, like temperature and pressure.

Brent

Evgenii Rudnyi

Jan 21, 2012, 2:23:18 PM
to everyth...@googlegroups.com
On 21.01.2012 20:00 meekerdb said the following:

> On 1/21/2012 4:25 AM, Evgenii Rudnyi wrote:
>>

...

>> 2) If physicists say that information is the entropy, they must
>> take it literally and then apply experimental thermodynamics to
>> measure information. This however seems not to happen.
>
> It does happen. The number of states, i.e. the information, available
> from a black hole is calculated from it's thermodynamic properties
> as calculated by Hawking. At a more conventional level, counting the
> states available to molecules in a gas can be used to determine the
> specific heat of the gas and vice-verse. The reason the thermodynamic
> measures and the information measures are treated separately in
> engineering problems is that the information that is important to
> engineering is infinitesimal compared to the information stored in
> the microscopic states. So the latter is considered only in terms of
> a few macroscopic averages, like temperature and pressure.
>
> Brent

Doesn't this mean that by information engineers mean something
different than physicists do?

Evgenii

meekerdb

Jan 21, 2012, 3:01:48 PM
to everyth...@googlegroups.com

I don't think so. A lot of the work on information theory was done by communication
engineers who were concerned with the effect of thermal noise on bandwidth. Of course
engineers specialize more narrowly than physicists, so within different fields of
engineering there are different terminologies and different measurement methods for
things that are unified in basic physics; e.g., there are engineers who specialize in
magnetism and seldom need to reflect that it is part of EM, and others who specialize
in RF and don't worry about "static" fields.
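
For reference, the thermal-noise/bandwidth connection is usually written via the
Shannon-Hartley capacity of a channel of bandwidth B,

\[
C = B \log_2\!\left(1 + \frac{S}{N}\right),
\]

and for a thermally noisy channel the noise power is N = k_B T B, so a thermodynamic
quantity sits directly inside the information measure.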

Brent

>
> Evgenii
>

Evgenii Rudnyi

Jan 21, 2012, 4:03:02 PM
to everyth...@googlegroups.com
On 21.01.2012 21:01 meekerdb said the following:

Do you mean that engineers use experimental thermodynamics to determine
information?

Evgenii

> Brent
>
>>
>> Evgenii
>>
>

Evgenii Rudnyi

Jan 22, 2012, 4:04:45 AM
to everyth...@googlegroups.com
On 21.01.2012 22:03 Evgenii Rudnyi said the following:

To be concrete, here is for example a paper from the control literature:

J.C. Willems and H.L. Trentelman
H_inf control in a behavioral context: The full information case
IEEE Transactions on Automatic Control
Volume 44, pages 521-536, 1999
http://homes.esat.kuleuven.be/~jwillems/Articles/JournalArticles/1999.4.pdf

The term information is there, but entropy is not. Could you please
explain why? Or, alternatively, could you please point to papers where
engineers use the concept of the equivalence between entropy and
information?

Evgenii

>
>> Brent
>>
>>>
>>> Evgenii
>>>
>>
>

Evgenii Rudnyi

Jan 22, 2012, 1:16:23 PM
to everyth...@googlegroups.com
On 20.01.2012 05:59 Russell Standish said the following:

> On Thu, Jan 19, 2012 at 08:03:41PM +0100, Evgenii Rudnyi wrote:

...

>> and since information is measured by order, a maximum of order is
>> conveyed by a maximum of disorder. Obviously, this is a Babylonian
>> muddle. Somebody or something has confounded our language."
>>
>
> I would say it is many people, rather than just one. I wrote "On
> Complexity and Emergence" in response to the amount of unmitigated
> tripe I've seen written about these topics.
>

Russel,

I have read your paper

http://arxiv.org/abs/nlin/0101006

It is well written. Could you please apply the principles from your
paper to the problem of how to determine the information in a book (for
example, let us take your book Theory of Nothing)?

Also, do you earnestly believe that this information is equal to the
thermodynamic entropy of the book? If yes, can one determine the
information in the book just by means of experimental thermodynamics?

Evgenii

P.S. Why is it impossible to state that a random string is generated by
some random generator?


Russell Standish

Jan 22, 2012, 7:26:26 PM
to everyth...@googlegroups.com
On Sun, Jan 22, 2012 at 07:16:23PM +0100, Evgenii Rudnyi wrote:
> On 20.01.2012 05:59 Russell Standish said the following:
> >On Thu, Jan 19, 2012 at 08:03:41PM +0100, Evgenii Rudnyi wrote:
>
> ...
>
> >>and since information is measured by order, a maximum of order is
> >>conveyed by a maximum of disorder. Obviously, this is a Babylonian
> >>muddle. Somebody or something has confounded our language."
> >>
> >
> >I would say it is many people, rather than just one. I wrote "On
> >Complexity and Emergence" in response to the amount of unmitigated
> >tripe I've seen written about these topics.
> >
>
> Russel,
>
> I have read your paper
>
> http://arxiv.org/abs/nlin/0101006
>
> It is well written. Could you please apply the principles from your
> paper to a problem on how to determine information in a book (for
> example let us take your book Theory of Nothing)?
>
> Also do you believe earnestly that this information is equal to the
> thermodynamic entropy of the book?

These are two quite different questions. To someone who reads my book,
the physical form of the book is unimportant - it could just as easily
be a PDF file or a Kindle e-book as a physical paper copy. The PDF is
a little over 30,000 bytes long. Computing the information content
would be a matter of counting the number of 30,000-byte-long strings that
generate a recognisable variant of ToN when fed into Acrobat
reader. Then subtract the logarithm (to base 256) of this figure from
30,000 to get the information content in bytes.

This is quite impractical, of course, not to speak of the expense of
paying for an army of people to go through 256^30,000 variants to
decide which ones are the true ToN's. An upper bound can be
found by compressing the file - PDFs are already compressed, so we
could estimate the information content as being between 25KB and 30KB (say).
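
The compression bound in the last sentence can be sketched in a few lines
(zlib as the compressor, file name illustrative):

import zlib

def compressed_size_bytes(path):
    """Crude upper bound on the information content of a file, in bytes:
    whatever a general-purpose compressor achieves."""
    with open(path, "rb") as f:
        data = f.read()
    return len(zlib.compress(data, 9))

# e.g. compressed_size_bytes("theory_of_nothing.pdf")
# Any compressor's output length is an upper bound; a better compressor (or the
# counting argument above) can only push the estimate down.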

To a physicist, it is the physical form that is important - the fact
that it is made of paper, with a bit of glue to hold it together. The
arrangement of ink on the pages is probably quite unimportant - a book
of the same size and shape, but with blank pages would do just as
well. Even if the arrangement of ink is important, then does
typesetting the book in a different font lead to the same book or a
different book?

To compute the thermodynamic information, one could imagine performing
a massive molecular dynamics simulation, and then count the number of
states that correspond to the physical book, take the logarithm, then
subtract that from the logarithm of the total possible number of
states the molecules could take on (if completely disassociated).

This is, of course, completely impractical. Computing the complexity
of something is generally NP-hard. But in principle doable.

Now, how does this relate to the thermodynamic entropy of the book? It
turns out that the information computed by the in-principle process
above is equal to the difference between the maximum entropy of the
molecules making up the book (if completely disassociated) and the
thermodynamic entropy, which could be measured in a calorimeter.
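
In symbols (dividing by k_B ln 2 merely converts J/K into bits), the claim is

\[
I_{\mathrm{bits}} \;=\; \frac{S_{\max} - S_{\mathrm{thermo}}}{k_B \ln 2},
\]

with S_max the entropy of the fully disassociated molecules and S_thermo the
calorimetric entropy of the book.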


> If yes, can one determine the
> information in the book just by means of experimental
> thermodynamics?
>

One can certainly determine the information of the physical book
(defined however you might like) - but that is not the same as the
information of the abstract book.

> Evgenii
>
> P.S. Why it is impossible to state that a random string is generated
> by some random generator?
>

Not sure what you mean, unless you're really asking "Why is it
impossible to state that a random string is generated by some
pseudorandom generator?"

In which case the answer is that a pseudorandom generator is an
algorithm, so by definition doesn't produce random numbers. There is a
lot of knowledge about how to decide if a particular PRNG is
sufficiently random for a particular purpose. No PRNG is sufficiently
random for all purposes - in particular they are very poor for
security purposes, as they're inherently predictable.

Cheers

Craig Weinberg

Jan 23, 2012, 8:20:28 AM
to Everything List
On Jan 22, 7:26 pm, Russell Standish <li...@hpcoders.com.au> wrote:

>
> Now, how does this relate to the thermodynamic entropy of the book? It
> turns out that the information computed by the in-principle process
> above is equal to the difference between the maximum entropy of the
> molecules making up the book (if completely disassociated) and the
> thermodynamic entropy, which could be measured in a calorimeter.
>
> > If yes, can one determine the
> > information in the book just by means of experimental
> > thermodynamics?
>
> One can certainly determine the information of the physical book
> (defined however you might like) - but that is not the same as the
> information of the abstract book.

This would only work if the information were meaningless and a-
signifying. I can write a whole book with just the words "The movie
Goodfellas". Anyone who has seen that movie has a rich text of
memories from which to inform themselves through that association.
That is what being informed actually is, associating and integrating
presented texts with a body of accumulated texts and contexts. If you
conflate information with the data that happens to be associated with
a particular text in a particular language-media context, you are
literally weighing stories by the pound (or gram).

Besides, any such quantitative measure does not take sequence into
account. A book or file which is completely scrambled down to the
level of characters or pixels has the same quantity of entropy
displacement as the intact text. To reduce information to quantity
alone means that a 240k text file can be rearranged to be 40kb of
nothing but 1s and then 200kb of nothing but 0s and have the same
amount of information and entropy. It's a gross misunderstanding of
how information works.

Craig

Russell Standish

Jan 23, 2012, 11:25:38 PM
to everyth...@googlegroups.com
On Mon, Jan 23, 2012 at 05:20:28AM -0800, Craig Weinberg wrote:
>
> Besides, any such quantitative measure does not take sequence into
> account. A book or file which is completely scrambled down to the
> level of characters or pixels has the same quantity of entropy
> displacement as the in tact text. To reduce information to quantity
> alone means that a 240k text file can be rearranged to be 40kb of
> nothing but 1s and then 200kb of nothing but 0s and have the same
> amount of information and entropy. It's a gross misunderstanding of
> how information works.
>
> Craig
>

Rearranging the text file to have 40KB of 1s and 200KB of 0s
dramatically reduces the information and increases the entropy by the
same amount, although not nearly as much as completely scrambling the
file. I'd say you have a gross misunderstanding of how these measures
work if you think otherwise.

Craig Weinberg

Jan 24, 2012, 7:49:41 AM
to Everything List
On Jan 23, 11:25 pm, Russell Standish <li...@hpcoders.com.au> wrote:
> On Mon, Jan 23, 2012 at 05:20:28AM -0800, Craig Weinberg wrote:
>
> > Besides, any such quantitative measure does not take sequence into
> > account. A book or file which is completely scrambled down to the
> > level of characters or pixels has the same quantity of entropy
> > displacement as the in tact text. To reduce information to quantity
> > alone means that a 240k text file can be rearranged to be 40kb of
> > nothing but 1s and then 200kb of nothing but 0s and have the same
> > amount of information and entropy. It's a gross misunderstanding of
> > how information works.
>
> > Craig
>
> Rearranging the text file to have 40KB of 1s and 200KB of 0s
> dramatically reduces the information and increases the entropy by the
> same amount, although not nearly as much as completely scrambling the
> file. I'd say you have a gross misunderstanding of how these measures
> work if you think otherwise.

All this time I thought that you have been saying that entropy and
information are the same thing:

>>"This suggests to me that a molecule of DNA belonging to a
kangaroo could
>> have no more information than the same molecule with the primary
sequence
>> scrambled into randomness

>That is correct, it would have the same quantity of
information, but most
>would be of the opinion that the quality has changed.

If you are instead saying that they are inversely proportional then I
would agree in general - information can be considered negentropy.
Sorry, I thought you were saying that they are directly proportional
measures (Brent and Evgenii seem to be talking about it that way). I
think that we can go further in understanding information though.
Negentropy is a good beginning but it does not address significance.
The degree to which information has the capacity to inform is even
more important than the energy cost to generate it. The significance of
information is a subjective quality which is independent of entropy
but essential to the purpose of information. In fact, information
itself could be considered the quantitative shadow of the quality of
significance. Information that does not inform something is not
information.

Craig

meekerdb

Jan 24, 2012, 4:56:55 PM
to everyth...@googlegroups.com


In thinking about how to answer this I came across an excellent paper by Roman Frigg and
Charlotte Werndl http://www.romanfrigg.org/writings/EntropyGuide.pdf which explicates the
relation more comprehensively than I could and which also gives some historical background
and extensions: specifically look at section 4.

Brent

Evgenii Rudnyi

Jan 25, 2012, 2:47:03 PM
to everyth...@googlegroups.com
On 23.01.2012 01:26 Russell Standish said the following:

Yet this is already information. Hence, if we take the equivalence
between the informational and thermodynamic entropies literally, then
even in this case the thermodynamic entropy (which it should be possible
to measure by experimental thermodynamics) must exist. What is it in
this case?

> To a physicist, it is the physical form that is important - the fact
> that it is made of paper, with a bit of glue to hold it together.
> The arrangement of ink on the pages is probably quite unimportant - a
> book of the same size and shape, but with blank pages would do just
> as well. Even if the arrangement of ink is important, then does
> typesetting the book in a different font lead to the same book or a
> different book?

It is a good question, and in my view it again shows that thermodynamic
entropy and information are different things, as for the same
object we can define the information differently (see also below).

> To compute the thermodynamic information, one could imagine
> performing a massive molecular dynamics simulation, and then count
> the number of states that correspond to the physical book, take the
> logarithm, then subtract that from the logarithm of the total
> possible number of states the molecules could take on (if completely
> disassociated).

Do not forget that molecular dynamics simulation is based on Newton's
laws (even quantum-mechanical molecular dynamics). Hence you probably
mean the Monte Carlo method here. Yet, it is much simpler to employ
experimental thermodynamics (see below).

> This is, of course, completely impractical. Computing the complexity
> of something is generally NP-hard. But in principle doable.
>
> Now, how does this relate to the thermodynamic entropy of the book?
> It turns out that the information computed by the in-principle
> process above is equal to the difference between the maximum entropy
> of the molecules making up the book (if completely disassociated) and
> the thermodynamic entropy, which could be measured in a calorimeter.
>
>
>> If yes, can one determine the information in the book just by means
>> of experimental thermodynamics?
>>
>
> One can certainly determine the information of the physical book
> (defined however you might like) - but that is not the same as the
> information of the abstract book.

Let me suggest a very simple case to understand better what you are
saying. Let us consider a string "10" for simplicity, and the following
cases. I will first cite the thermodynamic properties of Ag and Al from
the CODATA tables (we will need them):

S° (298.15 K)
J K-1 mol-1

Ag cr 42.55 ± 0.20
Al cr 28.30 ± 0.10

In J K-1 cm-3 it will be

Ag cr 42.55/107.87*10.49 = 4.14
Al cr 28.30/26.98*2.7 = 2.83

1) An abstract string "10" as the abstract book above.

2) Let us make now an aluminum plate (a page) with "10" hammered on it
(as on a coin) of the total volume 10 cm^3. The thermodynamic entropy is
then 28.3 J/K.

3) Let us make now a silver plate (a page) with "10" hammered on it (as
on a coin) of the total volume 10 cm^3. The thermodynamic entropy is
then 41.4 J/K.

4) We can easily make another aluminum plate (scaling all dimensions
from 2) to the total volume of 100 cm^3. Then the thermodynamic entropy
is 283 J/K.

Now we have four different combinations to represent a string "10" and
the thermodynamic entropy is different. If we take the statement
literally, then the information must be different in all four cases and
uniquely defined, since the thermodynamic entropy is already there. Yet in
my view this makes little sense.
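
A minimal Python sketch of the arithmetic above, using the same numbers:

# Arithmetic of the four cases: molar entropy S in J/(K*mol), molar mass M
# in g/mol and density rho in g/cm^3 give a volumetric entropy S*rho/M in
# J/(K*cm^3); a plate of volume V cm^3 then has S*rho/M*V.
data = {
    "Ag": {"S": 42.55, "M": 107.87, "rho": 10.49},
    "Al": {"S": 28.30, "M": 26.98,  "rho": 2.70},
}

def plate_entropy(metal, volume_cm3):
    d = data[metal]
    return d["S"] * d["rho"] / d["M"] * volume_cm3   # J/K

print(plate_entropy("Al", 10))    # case 2: about 28.3 J/K
print(plate_entropy("Ag", 10))    # case 3: about 41.4 J/K
print(plate_entropy("Al", 100))   # case 4: about 283 J/K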

Could you please comment on these four cases?

>> Evgenii
>>
>> P.S. Why it is impossible to state that a random string is
>> generated by some random generator?
>>
>
> Not sure what you mean, unless you're really asking "Why it is
> impossible to state that a random string is generated by some
> pseudorandom generator?"
>
> In which case the answer is that a pseudorandom generator is an
> algorithm, so by definition doesn't produce random numbers. There is
> a lot of knowledge about how to decide if a particular PRNG is
> sufficiently random for a particular purpose. No PRNG is
> sufficiently random for all purposes - in particular they are very
> poor for security purposes, as they're inherently predictable.

I understand. Yet if we take a finite random string, then presumably
there should be some random generator with some seed that produces it.
What would be wrong with this?

Evgenii


> Cheers
>

Evgenii Rudnyi

unread,
Jan 25, 2012, 2:52:00 PM1/25/12
to everyth...@googlegroups.com
On 24.01.2012 13:49 Craig Weinberg said the following:

> If you are instead saying that they are inversely proportional then
> I would agree in general - information can be considered negentropy.
> Sorry, I thought you were saying that they are directly proportional
> measures (Brent and Evgenii seem to be talking about it that way). I

I am not an expert in the informational entropy. For me it does not
matter how they define it in the information theory, whether as entropy
or negentropy. My point is that this has nothing to do with the
thermodynamic entropy (see my previous message with four cases for the
string "10").

Evgenii

Evgenii Rudnyi

unread,
Jan 25, 2012, 2:56:34 PM1/25/12
to everyth...@googlegroups.com
On 24.01.2012 22:56 meekerdb said the following:

> In thinking about how to answer this I came across an excellent paper
> by Roman Frigg and Charlotte Werndl
> http://www.romanfrigg.org/writings/EntropyGuide.pdf which explicates
> the relation more comprehensively than I could and which also gives
> some historical background and extensions: specifically look at
> section 4.
>
> Brent
>

Thanks for the link. I will try to work it out to see if they have an
answer to the four cases with the string "10" that I have described in
my reply to Russell.

Evgenii

meekerdb

unread,
Jan 25, 2012, 3:25:09 PM1/25/12
to everyth...@googlegroups.com
> Ag cr 42.55 ± 0.20
> Al cr 28.30 ± 0.10

>
> In J K-1 cm-3 it will be
>
> Ag cr 42.55/107.87*10.49 = 4.14
> Al cr 28.30/26.98*2.7 = 2.83
>
> 1) An abstract string "10" as the abstract book above.
>
> 2) Let us make now an aluminum plate (a page) with "10" hammered on it (as on a coin) of
> the total volume 10 cm^3. The thermodynamic entropy is then 28.3 J/K.
>
> 3) Let us make now a silver plate (a page) with "10" hammered on it (as on a coin) of
> the total volume 10 cm^3. The thermodynamic entropy is then 41.4 J/K.
>
> 4) We can easily make another aluminum plate (scaling all dimensions from 2) to the
> total volume of 100 cm^3. Then the thermodynamic entropy
> is 283 J/K.
>
> Now we have four different combinations to represent a string "10" and the thermodynamic
> entropy is different. If we take the statement literally then the information must be
> different in all four cases and uniquely defined, since the thermodynamic entropy is already
> there. Yet in my view this makes little sense.
>
> Could you please comment on these four cases?

The thermodynamic entropy is a measure of the information required to locate the possible
states of the plates in the phase space of atomic configurations constituting them. Note
that the thermodynamic entropy you quote is really the *change* in entropy per degree at
the given temperature. It's a measure of how much more phase space becomes available to
the atomic states when the internal energy is increased. More available phase space means
more uncertainty of the exact actual state and hence more information entropy. This
information is enormous compared to the "01" stamped on the plate, the shape of the plate
or any other aspects that we would normally use to convey information. It would only be
if we cooled the plate to near absolute zero and then tried to encode information in
its microscopic vibrational states that the thermodynamic and the encoded information
entropy would become similar.
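
To get a feel for the scale, here is a small sketch, assuming only the standard conversion of one bit to k_B ln 2 of entropy and using the 28.3 J/K aluminum plate from above:

import math

k_B = 1.380649e-23           # J/K, Boltzmann constant
S_plate = 28.3               # J/K, the 10 cm^3 aluminum plate discussed above

# Express the plate's thermodynamic entropy as an equivalent number of bits,
# using S = k_B * ln(2) * N_bits.
n_bits = S_plate / (k_B * math.log(2))
print("%.2e bits" % n_bits)  # about 3e24 bits, versus the 2 bits stamped on it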


>
>>> Evgenii
>>>
>>> P.S. Why it is impossible to state that a random string is
>>> generated by some random generator?
>>>
>>
>> Not sure what you mean, unless you're really asking "Why it is
>> impossible to state that a random string is generated by some
>> pseudorandom generator?"
>>
>> In which case the answer is that a pseudorandom generator is an
>> algorithm, so by definition doesn't produce random numbers. There is
>> a lot of knowledge about how to decide if a particular PRNG is
>> sufficiently random for a particular purpose. No PRNG is
>> sufficiently random for all purposes - in particular they are very
>> poor for security purposes, as they're inherently predictable.
>
> I understand. Yet if we take a finite random string, then presumably there should be
> some random generator with some seed that produces it. What would be wrong with this?

Yes, that points out that any finite string cannot be known to be random.
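
A small illustration of the point, as a sketch using Python's standard pseudorandom generator (the seed value is arbitrary):

import random

# A seeded PRNG deterministically reproduces a "random-looking" finite string,
# so the string alone cannot reveal whether it came from a short program
# (generator + seed) or from a genuinely random source.
def bits_from_seed(seed, n=64):
    rng = random.Random(seed)
    return "".join(str(rng.randint(0, 1)) for _ in range(n))

s1 = bits_from_seed(12345)
s2 = bits_from_seed(12345)
print(s1)
print(s1 == s2)   # True: fully determined by the seed, yet it looks random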

Brent

>
> Evgenii
>
>
>> Cheers
>>
>

Russell Standish

unread,
Jan 26, 2012, 6:00:04 AM1/26/12
to everyth...@googlegroups.com
On Wed, Jan 25, 2012 at 08:47:03PM +0100, Evgenii Rudnyi wrote:
>
> Let me suggest a very simple case to understand better what you are
> saying. Let us consider a string "10" for simplicity. Let us
> consider the next cases. I will cite first the thermodynamic
> properties of Ag and Al from CODATA tables (we will need them)
>
> S° (298.15 K)
> J K-1 mol-1
>
> Ag cr 42.55 ± 0.20
> Al cr 28.30 ± 0.10

>
> In J K-1 cm-3 it will be
>
> Ag cr 42.55/107.87*10.49 = 4.14
> Al cr 28.30/26.98*2.7 = 2.83
>
> 1) An abstract string "10" as the abstract book above.
>
> 2) Let us make now an aluminum plate (a page) with "10" hammered on
> it (as on a coin) of the total volume 10 cm^3. The thermodynamic
> entropy is then 28.3 J/K.
>
> 3) Let us make now a silver plate (a page) with "10" hammered on it
> (as on a coin) of the total volume 10 cm^3. The thermodynamic
> entropy is then 41.4 J/K.
>
> 4) We can easily make another aluminum plate (scaling all dimensions
> from 2) to the total volume of 100 cm^3. Then the thermodynamic
> entropy is 283 J/K.
>
> Now we have four different combinations to represent a string "10"
> and the thermodynamic entropy is different. If we take the statement
> literally, then the information must be different in all four cases
> and uniquely defined, since the thermodynamic entropy is already there.
> Yet in my view this makes little sense.
>
> Could you please comment on these four cases?
>

Brent commented quite aptly on these cases in another post. The fact
that you calculate the thermodynamic entropy the way you do implies
you are disregarding the information contained in the symbols embossed
on the coin.

If you included these two bits, the thermodynamic entropy is two bits
less, = 4.15 x 10^{-24} J/K less

This is so many orders of magnitude less than the entropy due to the
material, its probably not worth including, but it is there.

John Clark

unread,
Jan 26, 2012, 1:01:32 PM1/26/12
to everyth...@googlegroups.com
On Thu, Jan 19, 2012 at 5:28 PM, Craig Weinberg <whats...@gmail.com> wrote:

> I thought that the whole point of information theory is to move beyond quality into pure quantification.

 Yes.
 
> the suggestion that information can be defined as not having anything to do with the difference between order and the absence of order is laughably preposterous

Yes.

> The idea that a bucket of water has more 'information' than DNA is meaningless.

What word didn't you understand?


>>  No, if its repeating then it would have less information, that is to say it would take less information to describe the result.


> Of course, but how does that jibe with the notion that information is molecular entropy? How does A-T A-T A-T or G-T G-T G-T guarantee less internal degrees of freedom within a DNA molecule than A-T G-C A-T?

It would take little information to describe a repeating sequence like A-T-A-T-A-T.... and few ways to change it's micro-state without altering its macro orderly appearance, so it has a very low entropy,  but it would take a lot of information to describe a random sequence A-T G-C A-T... and lots of ways to alter it's micro-state with it still looking random, so it has a high entropy.
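
A rough way to see this, using compressed size as a crude stand-in for the description length (a sketch only; zlib is of course not an ideal compressor):

import random
import zlib

# Compare a highly repetitive sequence with a scrambled one of the same length.
repeating = ("AT" * 5000).encode()                       # A-T-A-T-A-T ...
random.seed(0)
scrambled = "".join(random.choice("ATGC") for _ in range(10000)).encode()

print(len(zlib.compress(repeating)))    # a few dozen bytes: highly redundant
print(len(zlib.compress(scrambled)))    # a few thousand bytes: little redundancy
                                        # beyond the 4-letter alphabet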

> I see no reason to use the word information at all for this. It sounds like you are just talking about entropy to me.

As I said, think about entropy as a measure of the number of ways you can change the micro-structure of something without changing its large scale macro appearance.
 
> If I have red legos and white legos, and I build two opposite monochrome houses and one of mixed blocks, how in the world does that effect the entropy of the plastic bricks in any way?

It does not effect the entropy of the plastic bricks but it does change the entropy of the structures built with those plastic bricks. For a single part in isolation entropy is not defined, a single water molecule has no entropy but a trillion trillion of them in a drop of water does.

  John K Clark


Craig Weinberg

unread,
Jan 26, 2012, 6:32:20 PM1/26/12
to Everything List
On Jan 26, 1:01 pm, John Clark <johnkcl...@gmail.com> wrote:
> On Thu, Jan 19, 2012 at 5:28 PM, Craig Weinberg <whatsons...@gmail.com>wrote:
>
> > I thought that the whole point of information theory is to move beyond
> > quality into pure quantification.
>
>  Yes.
>
> > > the suggestion that information can be defined as not having anything to
> > do with the difference between order and the absence of order is laughably
> > preposterous
>
> Yes.
>
> > The idea that a bucket of water has more 'information' than DNA is
> > meaningless.
>
> What word didn't you understand?

Information. If a bucket of water has more of it than DNA, then the
word information is meaningless.

>
> >>  No, if its repeating then it would have less information, that is to
> >> say it would take less information to describe the result.
>
> > > Of course, but how does that jibe with the notion that information
> > is molecular entropy? How does A-T A-T A-T or G-T G-T G-T guarantee less
> > internal degrees of freedom within a DNA molecule than A-T G-C A-T?
>
> It would take little information to describe a repeating sequence like
> A-T-A-T-A-T.... and few ways to change it's micro-state without altering
> its macro orderly appearance,

Describe it to who? Macro appearance to what? If you live alone on a
planet that is only liquid, how does one 'describe' a repeating
sequence? Besides your own mind, what would tell you that A-T-A-T-A-
T... can be expressed in any other way other than what it literally
is?

> so it has a very low entropy,  but it would
> take a lot of information to describe a random sequence A-T G-C A-T... and
> lots of ways to alter it's micro-state with it still looking random, so it
> has a high entropy.

So you are saying water has more information than DNA, but DNA that is
completely random has the same amount (or less) information than the
DNA that belonged to Beethoven. A symphony then would have less
information and more entropy than random noise. If the word
information is to have any meaning, quantity and compressibility of
data must be distinguished from quality of it's interpretation. Which
of course parallels the AI treatment of intelligence (trivial or
quantitative processing capacity) and cognitive awareness
(consciousness).

>
> > I see no reason to use the word information at all for this. It sounds
> > like you are just talking about entropy to me.
>
> As I said, think about entropy as a measure of the number of ways you can
> change the micro-structure of something without changing its large scale
> macro appearance.

I don't think it's a good definition because micro and macro are
relative to an observer, not to the universe, but I understand what
you mean. There really is no definition related to order or pattern
that isn't subjective. The degree to which something's 'large scale
macro appearance' changes is contingent entirely on our ability to
perceive and recognize the changes.

Let's say your definition were true though. What does it have to do
with information being directly proportionate to entropy? If entropy
were equal or proportionate to information, then you are saying that the
more information something contains, the less it matters. The more
information you have on the micro level, the less you can tell at the
macro. It seems obvious that they are inversely proportional. To
inform something is to reduce it's entropy (which necessarily means
increasing entropy somewhere else...entropy is all about space). I
build a sand castle and it has lower entropy than the rest of the
beach. Over time, the sand will return to the beach and we say the
entropy has returned to the higher beach level. If I encase the
sandcastle in lucite, it will slow down that process tremendously
because the form has no space to fall away from the castle.

>
> > > If I have red legos and white legos, and I build two opposite monochrome
> > houses and one of mixed blocks, how in the world does that effect the
> > entropy of the plastic bricks in any way?
>
> It does not effect the entropy of the plastic bricks but it does change the
> entropy of the structures built with those plastic bricks. For a single
> part in isolation entropy is not defined, a single water molecule has no
> entropy but a trillion trillion of them in a drop of water does.
>

Ok, so how does it effect the entropy of the structures? The red
house, the white house, and the mixed house (even if an interesting
pattern is made in the bricks), all behave in a physically identical
way, do they not? That would seem to preclude information itself from
having any objective material presence.

Craig

meekerdb

unread,
Jan 26, 2012, 6:54:41 PM1/26/12
to everyth...@googlegroups.com
On 1/26/2012 3:32 PM, Craig Weinberg wrote:
Ok, so how does it effect the entropy of the structures? The red
house, the white house, and the mixed house (even if an interesting
pattern is made in the bricks), all behave in a physically identical
way, do they not?

No they don't.  They reflect photons differently; which is why you could use the pattern to send a message.

There seems to be a lot of confusion about information as defined by Shannon.  Shannon's information is relative to the uncertainty in a message.  So it depends on how you define the possible messages.  If different patterns of red and white legos constitute the possible messages, then you can measure the information capacity of this message system by Shannon's formula.  It's *not* the measure of some particular message - it's the measure of the *capacity* of the message system.
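
A small sketch of that distinction, with the red/white brick patterns as the message system (the brick count and probabilities are just illustrative assumptions):

import math

# Shannon entropy is a property of the ensemble of possible messages, not of
# one particular brick pattern. For N positions each independently red or
# white with probability 1/2, the capacity is N bits.
def entropy_bits(probabilities):
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

n_bricks = 16
n_messages = 2 ** n_bricks                      # all red/white patterns
uniform = [1.0 / n_messages] * n_messages
print(entropy_bits(uniform))                    # 16.0 bits: the system's capacity

# If only two patterns (all-red, all-white) are ever sent, capacity drops to
# 1 bit, no matter how many bricks each pattern contains.
print(entropy_bits([0.5, 0.5]))                 # 1.0 bit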

Brent

Craig Weinberg

unread,
Jan 26, 2012, 8:03:31 PM1/26/12
to Everything List
On Jan 26, 6:54 pm, meekerdb <meeke...@verizon.net> wrote:
> On 1/26/2012 3:32 PM, Craig Weinberg wrote:
>
> > Ok, so how does it effect the entropy of the structures? The red
> > house, the white house, and the mixed house (even if an interesting
> > pattern is made in the bricks), all behave in a physically identical
> > way, do they not?
>
> No they don't.  They reflect photons differently; which is why you could use the pattern
> to send a message.

True, although it's only relevant if you have photons to reflect. If I
turn out the lights (completely) does that change the entropy of the
red house? What if I turn the lights back on, has entropy been
suddenly reduced? Would a brighter light put more information or less
entropy onto the white house than the red house, ie, does the pattern
cost something in photons?

I'm just curious, not trying to argue with you about it. On a similar
note, I was wondering about heat loss in a vacuum today. With the
second law of thermodynamics, it seems like heat could only dissipate
by heating something else up. If there was nothing in the universe
except a blob of molten nickel, would it cool off over time in an
infinite vacuum? It seems like it wouldn't. It seems like you would
need some other matter at a different temperature to seek a common
equilibrium with. Or is the heat just lost over time no matter what?

>
> There seems to be a lot of confusion about information as defined by Shannon.  Shannon's
> information is relative to the uncertainty in a message.  So it depends on how you define
> the possible messages.  If different patterns of red and white legos constitute the
> possible messages, then you can measure the information capacity of this message system by
> Shannon's formula.  It's *not* the measure of some particular message - it's the measure
> of the *capacity* of the message system.

That makes more sense. As long as the possibility of messages is
subjective, I don't have a problem with it. It's when information is
treated as an objective entity that I vote no.

Craig

meekerdb

unread,
Jan 26, 2012, 11:11:59 PM1/26/12
to everyth...@googlegroups.com
On 1/26/2012 5:03 PM, Craig Weinberg wrote:
> On Jan 26, 6:54 pm, meekerdb<meeke...@verizon.net> wrote:
>> On 1/26/2012 3:32 PM, Craig Weinberg wrote:
>>
>>> Ok, so how does it effect the entropy of the structures? The red
>>> house, the white house, and the mixed house (even if an interesting
>>> pattern is made in the bricks), all behave in a physically identical
>>> way, do they not?
>> No they don't. They reflect photons differently; which is why you could use the pattern
>> to send a message.
> True, although it's only relevant if you have photons to reflect. If I
> turn out the lights (completely) does that change the entropy of the
> red house? What if I turn the lights back on, has entropy been
> suddenly reduced? Would a brighter light put more information or less
> entropy onto the white house than the red house, ie, does the pattern
> cost something in photons?

Yes.

>
> I'm just curious, not trying to argue with you about it. On a similar
> note, I was wondering about heat loss in a vacuum today. With the
> second law of thermodynamics, it seems like heat could only dissipate
> by heating something else up. If there was nothing in the universe
> except a blob of molten nickel, would it cool off over time in an
> infinite vacuum? It seems like it wouldn't. It seems like you would
> need some other matter at a different temperature to seek a common
> equilibrium with. Or is the heat just lost over time no matter what?

The heat would be lost by infrared radiation.

Brent

Craig Weinberg

unread,
Jan 27, 2012, 6:56:52 AM1/27/12
to Everything List
On Jan 26, 11:11 pm, meekerdb <meeke...@verizon.net> wrote:
> On 1/26/2012 5:03 PM, Craig Weinberg wrote:
>
> >>> Ok, so how does it effect the entropy of the structures? The red
> >>> house, the white house, and the mixed house (even if an interesting
> >>> pattern is made in the bricks), all behave in a physically identical
> >>> way, do they not?
> >> No they don't.  They reflect photons differently; which is why you could use the pattern
> >> to send a message.
> > True, although it's only relevant if you have photons to reflect. If I
> > turn out the lights (completely) does that change the entropy of the
> > red house? What if I turn the lights back on, has entropy been
> > suddenly reduced? Would a brighter light put more information or less
> > entropy onto the white house than the red house, ie, does the pattern
> > cost something in photons?
>
> Yes.

That doesn't make sense to me. I think if two houses had two different
patterns with the same numbers of each brick, neither one could
possibly have a different cost in photons than the other. In a house
of four bricks, Red Red White White cannot have a different photon
absorption than Red White White Red.

>
>
>
> > I'm just curious, not trying to argue with you about it. On a similar
> > note, I was wondering about heat loss in a vacuum today. With the
> > second law of thermodynamics, it seems like heat could only dissipate
> > by heating something else up. If there was nothing in the universe
> > except a blob of molten nickel, would it cool off over time in an
> > infinite vacuum? It seems like it wouldn't. It seems like you would
> > need some other matter at a different temperature to seek a common
> > equilibrium with. Or is the heat just lost over time no matter what?
>
> The heat would be lost by infrared radiation.

Lost to where? Energy is neither created nor...lost.

Craig

John Clark

unread,
Jan 27, 2012, 11:42:39 AM1/27/12
to everyth...@googlegroups.com
On Thu, Jan 26, 2012  Craig Weinberg <whats...@gmail.com> wrote:

> If a bucket of water has more of it than DNA, then the word information is meaningless.

You would need to send more, far far more, dots and dashes down a wire to inform an intelligent entity what the position and velocity of every molecule in a bucket of water is than to inform it exactly what the human genome is. Now what word didn't you understand?
 
> A symphony then would have less information and more entropy than random noise.

No, a symphony would have less information but LESS entropy than random white noise. That's why lossless computer image and sound compression programs don't work with white noise, there is no redundancy to remove because white noise has no redundancy.  It would take many more dots and dashes sent down a wire to describe every pop and click in a piece of white noise than to describe a symphony of equal length. 

> If the word information is to have any meaning, quantity and compressibility of data must be distinguished from quality of it's interpretation.


If you want to clearly distinguish these things, and I agree that is a very good idea, then you need separate words for the separate ideas. Quality is subjective so mathematics can not deal with it, mathematics can work with quantity however, so if quality comes into play you can not use the word  "information" because mathematics already owns that word; but there are plenty of other words that you can use, words like "knowledge" or "wisdom".   
 

> Let's say your definition were true though. What does it have to do with information being directly proportionate to entropy?

The larger the entropy something has the more information it has. 

> If entropy were equal or proportionate to information, then you are saying that the more information something contains, the less it matters.

Whether it matters or not is subjective so you should not use the word "information" in the above. A bucket of water contains far more information than the human genome but the human genome has far more knowledge, at least I think so, although a bucket of water might disagree with me.

 John K Clark

 

John Clark

unread,
Jan 27, 2012, 1:31:05 PM1/27/12
to everyth...@googlegroups.com
On Thu, Jan 26, 2012 at 8:03 PM, Craig Weinberg <whats...@gmail.com> wrote:

> With the second law of thermodynamics, it seems like heat could only dissipate by heating something else up.

The second law says that energy will tend to get diluted in space over time, and heat conducting to other matter is one way for this to happen but it is not the only way. Photons radiating outward in all directions from a hot object is another way energy can get diluted. But among many other things, you don't think photons, or logic, exist so I doubt this answer will satisfy you. 

 John K Clark



Evgenii Rudnyi

unread,
Jan 27, 2012, 2:21:15 PM1/27/12
to everyth...@googlegroups.com
On 25.01.2012 21:25 meekerdb said the following:

> On 1/25/2012 11:47 AM, Evgenii Rudnyi wrote:
...

I would say that from your answer it follows that engineering
information has nothing to do with the thermodynamic entropy. Don't you
agree?

It would certainly be interesting to consider what happens when we decrease
the temperature (in the limit to zero Kelvin). According to the Third
Law the entropy will be zero then. What do you think, can we save less
information on a copper plate at low temperatures as compared with
higher temperatures? Or more?

Evgenii

meekerdb

unread,
Jan 27, 2012, 2:22:09 PM1/27/12
to everyth...@googlegroups.com

The reason I seldom respond to your posts is that you seem unwilling to put any effort
into understanding what is written to you.

Lost to the photons.

Brent

Evgenii Rudnyi

unread,
Jan 27, 2012, 2:27:31 PM1/27/12
to everyth...@googlegroups.com
On 26.01.2012 12:00 Russell Standish said the following:

Well, I do disregard the surface effects. However, the statement was
that the informational entropy is the same as thermodynamic entropy, so
we must consider the total entropy.

> If you included these two bits, the thermodynamic entropy is two
> bits less, = 4.15 x 10^{-24} J/K less
>
> This is so many orders of magnitude less than the entropy due to the
> material, its probably not worth including, but it is there.

I do not believe that effects below the experimental noise are important
for empirical science. You probably mean then some other science, it
would be good if you define what science you mean.

Evgenii


Evgenii Rudnyi

unread,
Jan 27, 2012, 2:33:35 PM1/27/12
to everyth...@googlegroups.com
On 26.01.2012 19:01 John Clark said the following:

> On Thu, Jan 19, 2012 at 5:28 PM, Craig
> Weinberg<whats...@gmail.com>wrote:

...

>>> If I have red legos and white legos, and I build two opposite
>>> monochrome
>> houses and one of mixed blocks, how in the world does that effect
>> the entropy of the plastic bricks in any way?
>>
>
> It does not effect the entropy of the plastic bricks but it does
> change the entropy of the structures built with those plastic bricks.

This change in the entropy is below the experimental noise. Just estimate
what difference it makes and in which digit of the total entropy the
difference appears. Hence the talk about the thermodynamic entropy as
the information source in this case is just meaningless, as you cannot
experimentally measure what you are talking about.

Evgenii

Evgenii Rudnyi

unread,
Jan 27, 2012, 2:41:31 PM1/27/12
to everyth...@googlegroups.com
On 27.01.2012 05:11 meekerdb said the following:

> On 1/26/2012 5:03 PM, Craig Weinberg wrote:

...

>>
>> I'm just curious, not trying to argue with you about it. On a
>> similar note, I was wondering about heat loss in a vacuum today.
>> With the second law of thermodynamics, it seems like heat could
>> only dissipate by heating something else up. If there was nothing
>> in the universe except a blob of molten nickel, would it cool off
>> over time in an infinite vacuum? It seems like it wouldn't. It
>> seems like you would need some other matter at a different
>> temperature to seek a common equilibrium with. Or is the heat just
>> lost over time no matter what?
>
> The heat would be lost by infrared radiation.
>

Brent,

if we consider a heated block in an infinite universe, does its
temperature then go to zero Kelvin?

Evgenii

meekerdb

unread,
Jan 27, 2012, 3:22:22 PM1/27/12
to everyth...@googlegroups.com

Obviously not since I wrote above that the thermodynamic entropy is a measure of how much
information it would take to locate the exact state within the phase space allowed by the
thermodynamic parameters.

>
> It would certainly be interesting to consider what happens when we decrease the temperature
> (in the limit to zero Kelvin). According to the Third Law the entropy will be zero then.
> What do you think, can we save less information on a copper plate at low temperatures as
> compared with higher temperatures? Or more?

Are you being deliberately obtuse? Information encoded in the shape of the plate is not
accounted for in the thermodynamic tables - they are just based on ideal bulk material
(ignoring boundaries).

Brent

Evgenii Rudnyi

unread,
Jan 27, 2012, 3:43:22 PM1/27/12
to everyth...@googlegroups.com
On 27.01.2012 21:22 meekerdb said the following:

Is this what engineers use when they develop communication devices?

>
>>
>> It would certainly interesting to consider what happens when we
>> decrease the temperature (in the limit to zero Kelvin). According
>> to the Third Law the entropy will be zero then. What do you think,
>> can we save less information on a copper plate at low temperatures
>> as compared with higher temperatures? Or more?
>
> Are you being deliberately obtuse? Information encoded in the shape
> of the plate is not accounted for in the thermodynamic tables - they
> are just based on ideal bulk material (ignoring boundaries).

I am just trying to understand the meaning of the term information that
you use. I would say that there is the thermodynamic entropy and then
the Shannon information entropy. Shannon developed a theory to help
engineers deal with communication (I believe that you have also made a
similar statement recently). Yet, in my view, when we talk about
communication devices and mechatronics, the information that engineers
are interested in has nothing to do with the thermodynamic entropy. Do
you agree or disagree with that? If you disagree, could you please give
an example from engineering where engineers do employ the thermodynamic
entropy as an estimate of information? My example would be Millipede

http://en.wikipedia.org/wiki/Millipede_memory

I am pretty sure that when IBM engineers develop it, they do not employ
the thermodynamic entropy to estimate its information capacity. Also, an
increase of temperature would destroy the information saved there.

Well, I might indeed be deliberately obtuse, yet only with the goal of
reaching a clear definition of what information is. Right now I would say
that there is information in engineering and information in physics, and
they are different. The first I roughly understand, the second I do not.

Evgenii

> Brent
>

meekerdb

unread,
Jan 27, 2012, 5:03:41 PM1/27/12
to everyth...@googlegroups.com

I already said I disagreed. You are confusing two different things. Because structural
engineers don't employ the theory of interatomic forces, it doesn't follow that
interatomic forces have nothing to do with structural properties.

Brent

Russell Standish

unread,
Jan 27, 2012, 5:46:14 PM1/27/12
to everyth...@googlegroups.com
On Fri, Jan 27, 2012 at 08:27:31PM +0100, Evgenii Rudnyi wrote:
> On 26.01.2012 12:00 Russell Standish said the following:
> >If you included these two bits, the thermodynamic entropy is two
> >bits less, = 4.15 x 10^{-24} J/K less
> >
> >This is so many orders of magnitude less than the entropy due to the
> >material, its probably not worth including, but it is there.
>
> I do not believe that effects below the experimental noise are
> important for empirical science. You probably mean then some other
> science, it would be good if you define what science you mean.
>
> Evgenii

For one thing, it indicates that storing just two bits of information on
these physical substrates is grossly inefficient!

Cheers

Craig Weinberg

unread,
Jan 27, 2012, 5:51:20 PM1/27/12
to Everything List
On Jan 27, 11:42 am, John Clark <johnkcl...@gmail.com> wrote:
> On Thu, Jan 26, 2012 Craig Weinberg <whatsons...@gmail.com> wrote:
>
> > If a bucket of water has more of it than DNA, then the word information
> > is meaningless.
>
> You would need to send more, far far more, dots and dashes down a wire to
> inform a intelligent entity what the position and velocity of every
> molecule in bucket of water is than to inform it exactly what the human
> genome is.

It depends on what kind of compression you are using. You could much more
easily write a probabilistic equation to simulate any given volume of
water than the same volume of DNA, especially when you get into
secondary and tertiary structure.

> Now what word didn't you understand.

Condescension doesn't impress me. I understand your words perfectly,
it's just that what they are saying seems to be incorrect.

>
> > > A symphony then would have less information and more entropy than random
> > noise.
>
> No, a symphony would have less information but LESS entropy than random
> white noise.

Ok, I think I see what the confusion is. We are operating not only
with different definitions of entropy but with different assumptions
about the universe which directly relate to information.

This Q&A: http://stackoverflow.com/questions/651135/shannons-entropy-formula-help-my-confusion
was the only page I could find that was written simply enough to make
sense to me. Your definition assumes that the universe is a platform
for encoding and decoding information and mine does not. You are
talking about entropy in terms of resistance to compression of
redundancy. Ok, but the relationship of Shannon entropy and
thermodynamic entropy is not what you are implying it is. The Wiki was
helpful: http://en.wikipedia.org/wiki/Entropy_(information_theory)

"At an everyday practical level the links between information entropy
and thermodynamic entropy are not evident. Physicists and chemists are
apt to be more interested in changes in entropy as a system
spontaneously evolves away from its initial conditions, in accordance
with the second law of thermodynamics, rather than an unchanging
probability distribution. And, as the minuteness of Boltzmann's
constant kB indicates, the changes in S / kB for even tiny amounts of
substances in chemical and physical processes represent amounts of
entropy which are so large as to be off the scale compared to anything
seen in data compression or signal processing. Furthermore, in
classical thermodynamics the entropy is defined in terms of
macroscopic measurements and makes no reference to any probability
distribution, which is central to the definition of information
entropy.

But, at a multidisciplinary level, connections can be made between
thermodynamic and informational entropy, although it took many years
in the development of the theories of statistical mechanics and
information theory to make the relationship fully apparent. In fact,
in the view of Jaynes (1957), thermodynamic entropy, as explained by
statistical mechanics, should be seen as an application of Shannon's
information theory: the thermodynamic entropy is interpreted as being
proportional to the amount of further Shannon information needed to
define the detailed microscopic state of the system, that remains
uncommunicated by a description solely in terms of the macroscopic
variables of classical thermodynamics, with the constant of
proportionality being just the Boltzmann constant."

The key phrase for me here is "the thermodynamic entropy is
interpreted as being proportional to the amount of further Shannon
information needed to define the detailed microscopic state of the
system". This confirms what I have been saying and is the opposite of
what you are saying. Thermodynamic entropy is proportional to the
amount of Shannon information *needed* to (encode/compress/extract
redundancy) from a given description to arrive at a maximally
compressed description. The more entropy or patternlessness you have,
ie the more equilibrium of probability and lack of redundancy you
have, the less information you have and the more Shannon information
you need to avoid lossy compression.

This means that DNA, having low entropy compared with pure water, has
high pattern content, high information, and less Shannon information
is required to describe it. Easier to compress does *not* mean less
information, it means more information is present already because in
essence the job is already partially done for you. Shannon entropy
then, is a measure of drag on compression, a figurative use of the
term entropy for the specific purpose of encoding and decoding. I am
using the literal thermodynamic sense of entropy, as well as the
figurative vernacular sense of entropy as degradation of order or
coherence, both of these are loosely inversely proportional to Shannon
entropy. The compressibility of a novel or picture does not relate to
the quality of information, not to mention qualities of significance.
Weighing art by the pound is not a serious way to approach a theory
about consciousness or qualia.


>That's why lossless computer image and sound compression
> programs don't work with white noise, there is no redundancy to remove
> because white noise has no redundancy. It would take many more dots and
> dashes sent down a wire to describe every pop and click in a piece of white
> noise than to describe a symphony of equal length.

Yes, I see what you mean. I had not heard of Shannon information. It's
an excellent tool for working with statistical data, but tells us
nothing about what information actually is or does. It is an analysis
of how to engineer data quantitatively, and as such, it
(appropriately) takes data for granted. I don't do that.

>
> > If the word information is to have any meaning, quantity and
> > compressibility of data must be distinguished from quality of it's
> > interpretation.
>
> If you want to clearly distinguish these things, and I agree that is a very
> good idea, then you need separate words for the separate ideas. Quality is
> subjective so mathematics can not deal with it, mathematics can work with
> quantity however, so if quality comes into play you can not use the word
> "information" because mathematics already owns that word; but there are
> plenty of other words that you can use, words like "knowledge" or
> "wisdom".

Yes, I agree, that's why I make such a big deal about not reaching for
that term to talk about perception. Perception is all about quality.
Knowledge and wisdom are already owned by philosophy and religion,
that's why I use sensorimotive awareness, perception, cognition,
feeling, sensing, etc.

>
> > Let's say your definition were true though. What does it have to do with
> > information being directly proportionate to entropy?
>
> The larger the entropy something has the more information it has.

Yes, this was a semantic confusion. I don't understand why you would
use the general term information to describe Shannon information, but
I at least understand what you mean now. Shannon information may be
the 'only reasonable way to measure information' but it is not
information and it does not map to information. It's like measuring a
volume of water in terms of the kWh of energy it can generate. Yes, it is a
way of measuring an effect of a quantity of water but it is not a
direct measurement of any quantity or quality of water itself. It
should be considered also that the Shannon approach is only valid
because it's the best we have come up with so far. The human mind does
not work like a computer - it does not compress and decode memories
that way at all. The psyche stores experiences as iconic associations;
semantic triggers for sensorimotive cascades which are concrete analog
presentations that re-present, *not* representations and not digital
data.

>
> > If entropy were equal or proportionate to information, then you are saying
> > that the more information something contains, the less it matters.
>
> Whether it matters or not is subjective so you should not use the word
> "information" in the above.

No. You should not use the word information when you are talking about
a special case statistical definition of the word. I don't think it's
an exaggeration to say that 99% of people who use the word information
use it the way that I've been using it, not in the way you have been
using it.

> A bucket of water contains far more information
> than the human genome but the human genome has far more knowledge, at least
> I think so, although a bucket of water might disagree with me.

No. The bucket of water has higher thermodynamic entropy which
requires more Shannon information to describe. The encoded description
of the water has more information if we were to simulate it exactly,
but that doesn't mean the original has more information, it just means
it's got more noise (less signal).

Craig

meekerdb

unread,
Jan 27, 2012, 6:11:35 PM1/27/12
to everyth...@googlegroups.com

You're switching meanings of "information". Something highly compressible, like,
"AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA" doesn't convey much information in either the
colloquial or Shannon sense. I think it's important to keep in mind that these measures
of information are relative to some context. Removed from its cellular environment, the
code for a strand of DNA would not convey much information in the colloquial sense, but
its Shannon information would be the same.


> it means more information is present already because in
> essence the job is already partially done for you. Shannon entropy
> then, is a measure of drag on compression, a figurative use of the
> term entropy for the specific purpose of encoding and decoding. I am
> using the literal thermodynamic sense of entropy,

You mean the integrating variable S in TdS=dQ?

> as well as the
> figurative vernacular sense of entropy as degradation of order or
> coherence, both of these are loosely inversely proportional to Shannon
> entropy.

No; more varied strings, with less internal correlation, more random looking, convey more
information.

> The compressibility of a novel or picture does not relate to
> the quality of information, not to mention qualities of significance.
> Weighing art by the pound is not a serious way to approach a theory
> about consciousness or qualia.
>
>
>> That's why lossless computer image and sound compression
>> programs don't work with white noise, there is no redundancy to remove
>> because white noise has no redundancy. It would take many more dots and
>> dashes sent down a wire to describe every pop and click in a piece of white
>> noise than to describe a symphony of equal length.
> Yes, I see what you mean. I had not heard of Shannon information. It's
> an excellent tool for working with statistical data, but tells us
> nothing about what information actually is or does.

It does so long as you keep the context in mind.

Brent

Craig Weinberg

unread,
Jan 27, 2012, 6:24:15 PM1/27/12
to Everything List
On Jan 27, 1:31 pm, John Clark <johnkcl...@gmail.com> wrote:
> On Thu, Jan 26, 2012 at 8:03 PM, Craig Weinberg <whatsons...@gmail.com>wrote:
>
> > With the second law of thermodynamics, it seems like heat could only
> > dissipate by heating something else up.
>
> The second law says that energy will tend to get diluted in space over
> time, and heat conducting to other matter is one way for this to happen but
> it is not the only way. Photons radiating outward in all directions from a
> hot object is another way energy can get diluted. But among many other
> things, you don't think photons, or logic, exist so I doubt this answer
> will satisfy you.

It would satisfy me if you had some examples, but I don't think that
you know the answer for sure. A vacuum is a good insulator (like a
vacuum thermos), and a perfect vacuum, as far as I have been able to
read online, is a perfect insulator. Electricity and heat pass from
object to object, not from space to space. Please point out any source
you can find to the contrary. What little I find agrees with vacuums
being insulators of heat and electricity.

Craig

Craig Weinberg

unread,
Jan 27, 2012, 6:31:55 PM1/27/12
to Everything List
I understand completely, and I apologize, but I am not here to
understand second-hand summaries of authoritative knowledge from the
past. I am only interested in first hand, common sense realities
because my hypothesis presents a radical challenge of the post-
Enlightenment and pre-Enlightenment worldviews. EVERYTHING must be
questioned anew.

It's hard to find any first hand information on experiments on the
basics of modern physics as the accounts all take the interpretation
as a foregone solution. You never see any documentation of double slit
tests which don't presume photons to begin with. If I had access to a
lab there are a lot of experiments I would want to run that might be
revealing in a completely new way.

Craig

Craig Weinberg

unread,
Jan 27, 2012, 6:33:46 PM1/27/12
to Everything List
On Jan 27, 2:33 pm, Evgenii Rudnyi <use...@rudnyi.ru> wrote:
> On 26.01.2012 19:01 John Clark said the following:
>
> > On Thu, Jan 19, 2012 at 5:28 PM, Craig
> > Weinberg<whatsons...@gmail.com>wrote:
>
> ...
>
> >>> If I have red legos and white legos, and I build two opposite
> >>> monochrome
> >> houses and one of mixed blocks, how in the world does that effect
> >> the entropy of the plastic bricks in any way?
>
> > It does not effect the entropy of the plastic bricks but it does
> > change the entropy of the structures built with those plastic bricks.
>
> This change in the entropy is below the experimental noise. Just estimate
> what difference it makes and in which digit of the total entropy the
> difference appears. Hence the talk about the thermodynamic entropy as
> the information source in this case is just meaningless, as you cannot
> experimentally measure what you are talking about.
>
> Evgenii

Thanks, that's what I thought.

Evgenii Rudnyi

unread,
Jan 28, 2012, 2:47:52 AM1/28/12
to everyth...@googlegroups.com
On 27.01.2012 23:03 meekerdb said the following:

You disagree that engineers do not use thermodynamic entropy but you
have not yet shown how information in engineering is related to the
thermodynamic entropy. From the Millipede example

>> http://en.wikipedia.org/wiki/Millipede_memory

"The earliest generation millipede devices used probes 10 nanometers in
diameter and 70 nanometers in length, producing pits about 40 nm in
diameter on fields 92 µm x 92 µm. Arranged in a 32 x 32 grid, the
resulting 3 mm x 3 mm chip stores 500 megabits of data or 62.5 MB,
resulting in an areal density, the number of bits per square inch, on
the order of 200 Gbit/in²."

It would be much easier to understand you if you said what
thermodynamic entropy corresponds to the value of 62.5 MB in Millipede.

The only example of Thermodynamic Entropy == Information from you so far
was the work on a black hole. However, as far as I know, there is no
theory yet that describes a black hole, as on one side you need
gravitation and on the other side quantum effects. The theory that unites
them does not seem to exist.

Evgenii

Evgenii Rudnyi

unread,
Jan 28, 2012, 2:58:54 AM1/28/12
to everyth...@googlegroups.com
On 27.01.2012 23:46 Russell Standish said the following:

> On Fri, Jan 27, 2012 at 08:27:31PM +0100, Evgenii Rudnyi wrote:
>> On 26.01.2012 12:00 Russell Standish said the following:
>>> If you included these two bits, the thermodynamic entropy is two
>>> bits less, = 4.15 x 10^{-24} J/K less
>>>
>>> This is so many orders of magnitude less than the entropy due to
>>> the material, its probably not worth including, but it is there.
>>
>> I do not believe that effects below the experimental noise are
>> important for empirical science. You probably mean then some other
>> science, it would be good if you define what science you mean.
>>
>> Evgenii
>
> For one thing, it indicates that storing just two bits of information
> on these physical substrates is grossly inefficient!

Well, you could contact governments then and try to convince them that
coins in use are highly inefficient. It would be a good chance to have
funding.

By the way, at what temperature there will be possible to save more
information, at higher or at lower one. Brent and John are talking about
the entropy and the higher temperature the higher the entropy. From an
engineering viewpoint it looks a bit strange.

Evgenii

> Cheers
>

Evgenii Rudnyi

unread,
Jan 28, 2012, 3:17:26 AM1/28/12
to everyth...@googlegroups.com
On 28.01.2012 00:24 Craig Weinberg said the following:

Graig,

Radiation does happen. If you need more detail, there is a nice free
book from MIT

A Heat Transfer Textbook, 4th edition
John H. Lienhard IV, Professor, University of Houston
John H. Lienhard V, Professor, Massachusetts Institute of Technology

http://web.mit.edu/lienhard/www/ahtt.html

Some disadvantage is that it is thick but you go directly to Part IV
Thermal Radiation Heat Transfer. Vacuum is a good insulator but thermal
radiation gets through.

It is pretty important for example to include radiation in the case of
free convection as it may account up to 40% of heat transfer in this case.

Evgenii

Russell Standish

unread,
Jan 28, 2012, 5:20:31 AM1/28/12
to everyth...@googlegroups.com
On Sat, Jan 28, 2012 at 08:58:54AM +0100, Evgenii Rudnyi wrote:
> On 27.01.2012 23:46 Russell Standish said the following:
> >For one thing, it indicates that storing just two bits of information
> >on these physical substrates is grossly inefficient!
>
> Well, you could contact governments then and try to convince them
> that coins in use are highly inefficient. It would be a good chance
> to have funding.

Chuckle. Maybe we can persuade them to get behind bitcoin :).

>
> By the way, at what temperature there will be possible to save more
> information, at higher or at lower one.

What does this mean?

> Brent and John are talking
> about the entropy and the higher temperature the higher the entropy.

True. But information has no such relationship with temperature, other
than that the maximum possible value for information increases with temperature.

Remember the equation S+I = S_max. S_max obviously increases with
temperature. So usually does S, but S can be decreased by organisation
of the matter - by the input of information.
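
A toy illustration of S + I = S_max for a register of N two-state elements, counting entropy in bits and ignoring the physical prefactor k_B ln 2 (a sketch with an arbitrary N):

import math

N = 8
S_max = N                       # maximum entropy: every element equally likely 0 or 1

def shannon_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fully organised register (a definite pattern written into it): S = 0, I = N.
S_ordered = 0.0
print(S_ordered, S_max - S_ordered)    # S = 0, I = 8

# Completely thermalised register, all 2^N states equally likely: S = N, I = 0.
S_random = shannon_bits([1 / 2**N] * 2**N)
print(S_random, S_max - S_random)      # S = 8, I = 0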

> From an engineering viewpoint it looks a bit strange.

How so?

>
> Evgenii
>
> >Cheers
> >
>

Evgenii Rudnyi

unread,
Jan 28, 2012, 6:05:57 AM1/28/12
to everyth...@googlegroups.com
On 28.01.2012 11:20 Russell Standish said the following:

> On Sat, Jan 28, 2012 at 08:58:54AM +0100, Evgenii Rudnyi wrote:
>> On 27.01.2012 23:46 Russell Standish said the following:
>>> For one thing, it indicates that storing just two bits of
>>> information on these physical substrates is grossly inefficient!
>>
>> Well, you could contact governments then and try to convince them
>> that coins in use are highly inefficient. It would be a good
>> chance to have funding.
>
> Chuckle. Maybe we can persuade them to get behind bitcoin :).
>
>>
>> By the way, at what temperature there will be possible to save
>> more information, at higher or at lower one.
>
> What does this mean?

Let us take a hard disk. Can I save more information on it at higher or
lower temperatures?

>
>> Brent and John are talking about the entropy and the higher
>> temperature the higher the entropy.
>
> True. But information has no such relationship with temperature,
> other than that the maximum possible value for information increases
> with temperature.
>
> Remember the equation S+I = S_max. S_max obviously increases with
> temperature. So usually does S, but S can be decreased by
> organisation of the matter - by the input of information.
>
>> From an engineering viewpoint it looks a bit strange.

> How so?
>

If engineers took the statement "the maximum possible value for
information increases with temperature" literally, they would operate a
hard disk at higher temperatures (the higher the better, according to
such a statement). Yet this does not happen. Do you know why?

In general we are surrounded by devices that store information (hard
disks, memory sticks, DVDs, etc.). The information that these devices can
store is, I believe, known to an accuracy of one bit. Can you suggest a
thermodynamic state whose entropy gives us exactly that amount of
information?

Here, again, there would be a question about temperature. If I operate my memory
stick in some reasonable range of temperatures, the information it
contains does not change. Yet, the entropy in my view changes.

So these are my doubts for which I do not see an answer.

Evgenii

John Clark

unread,
Jan 28, 2012, 1:48:28 PM1/28/12
to everyth...@googlegroups.com
On Fri, Jan 27, 2012 at 5:51 PM, Craig Weinberg <whats...@gmail.com> wrote:

> You could much more easily write a probabilistic equation to simulate any given volume of water than the same volume of DNA, especially

The motion of both can be well described by the Navier-Stokes equations, which describe fluid flow using Newton's laws, and, DNA being more viscous than water, the resulting equations would be simpler than the ones for water.
 
> when you get into secondary and tertiary structure.

You've got to play fair: if you talk about micro states for DNA I get to talk about micro states for water.


> I had not heard of Shannon information.

Somehow I'm not surprised, and it's Shannon Information Theory.
 
> The key phrase for me here is "the thermodynamic entropy is interpreted as being proportional to the amount of further Shannon information needed to define the detailed microscopic state of the system".

OK, although I don't see what purpose the word "further" serves in the above, and although I know all about Claude Shannon the term "Shannon information" is nonstandard. What would Non-Shannon information be?
 
> This confirms what I have been saying and is the opposite of what you are saying.

What on Earth are you talking about?? The more entropy a system has the more information needed to describe it.
 
> This means that DNA, having low entropy compared with pure water, has high pattern content, high information, and less Shannon information"

I see, it has high information and less information. No I take that back, I don't see, although it is consistent with your usual logical standards.

> Easier to compress does *not* mean less information

It means a message has been inflated with useless gas and a compression program can remove that gas and recover the small kernel of information undamaged. White noise has no gas in it for a compression program to deflate; that's why, if you don't know the specific compression program used, the resulting file (like a zip or gif file) would look like random white noise, and yet it's full of useful information if you know how to get it. The same thing is true of encrypted files: if the encryption is good then the file will look completely random, just white noise, to anyone who does not have the secret key.

> The compressibility of a novel or picture does not relate to the quality of information

How do you expect mathematics to deal with anything as subjective as quality? A novel that's high quality to you may be junk to me.

> Knowledge and wisdom are already owned by philosophy and religion,

I've never heard of religion saying anything wise, philosophy does contain wisdom but none of it came from professional philosophers, at least not in the last 300 years. 

> The human mind does not work like a computer

As you've said before, but saying it does not make it so.

> it does not compress and decode memories

Then the human mind works very inefficiently and needs improvement.

> are concrete analog presentations that re-present, *not* representations and not digital data.

Even asterisks do not make it so.

> I don't think it's an exaggeration to say that 99% of people who use the word information use it the way that I've been using it

I don't think it's an exaggeration to say that if you wish to understand how mind works the verbiage generated by 99% of the people on this planet will be of no help to you whatsoever; better to listen to what the experts have to say about the subject. 

> The bucket of water has higher thermodynamic entropy which requires more Shannon information to describe.

 Yes
 
> The encoded description of the water has more information if we were to simulate it exactly

Yes.

> but that doesn't mean the original has more information,

I see, it has more information but that doesn't mean it has more information.  No I take that back, I don't see, although it is consistent with your usual logical standards.

 John K Clark


meekerdb

unread,
Jan 28, 2012, 5:26:16 PM1/28/12
to everyth...@googlegroups.com


Yes. I disagreed that information "has nothing to do with thermodynamic entropy", as you
wrote above. You keep switching formulations. You write X and ask if I agree. I
disagree. Then you claim I've disagreed with Y. Please pay attention to your own
writing. There's a difference between "X is used in place of Y" and "X has nothing to do
with Y".

> but you have not shown yet how information in engineering is related to the
> thermodynamic entropy. From the Millipede example
>
> >> http://en.wikipedia.org/wiki/Millipede_memory
>
> "The earliest generation millipede devices used probes 10 nanometers in diameter and 70
> nanometers in length, producing pits about 40 nm in diameter on fields 92 µm x 92 µm.
> Arranged in a 32 x 32 grid, the resulting 3 mm x 3 mm chip stores 500 megabits of data
> or 62.5 MB, resulting in an areal density, the number of bits per square inch, on the
> order of 200 Gbit/in²."
>
> It would be much easier to understand you if you said what thermodynamic
> entropy corresponds to the value of 62.5 MB in Millipede.


The Shannon information capacity is 5e8 bits. The thermodynamic entropy depends on the
energy used to switch a memory element. I'd guess it must correspond to at least a few
tens of thousands of electrons at 9 V, so

S ~ [5e8 * 9e4 eV] / [8.6e-5 eV/degK * 300 degK] ~ 1.7e15

So the total entropy is about 1.7e15 + 5e8, and the information portion is numerically (but
not functionally) negligible compared to the thermodynamic.
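
Spelling out the same order-of-magnitude estimate as a sketch (the 1e4 electrons per bit and the 9 V are the guessed figures above, not measured values):

    # Rough thermodynamic entropy (in units of Boltzmann's constant k)
    # dissipated in writing the whole chip, versus its Shannon capacity.
    k_eV_per_K = 8.6e-5       # Boltzmann constant in eV/K
    T = 300.0                 # temperature in K
    bits = 5e8                # Shannon capacity of the chip
    electrons_per_bit = 1e4   # guessed number of electrons switched per bit
    volts = 9.0               # guessed switching voltage, so ~9 eV per electron

    energy_eV = bits * electrons_per_bit * volts   # total switching energy
    S_thermo = energy_eV / (k_eV_per_K * T)        # entropy in units of k
    print(S_thermo)   # ~1.7e15, dwarfing the 5e8 bits of stored information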

Brent

meekerdb

unread,
Jan 28, 2012, 5:30:30 PM1/28/12
to everyth...@googlegroups.com

At a higher temperature there are more microstates accessible and hence more uncertainty
as to which state is actually realized. But if you're storing information, which you want
to retrieve, this uncertainty is noise and you have to use larger increments of energy to
reliably switch states. So for storage it is more efficient (takes less energy per bit)
to be colder.

Brent


Russell Standish

unread,
Jan 28, 2012, 6:42:13 PM1/28/12
to everyth...@googlegroups.com
On Sat, Jan 28, 2012 at 12:05:57PM +0100, Evgenii Rudnyi wrote:
>
> Let us take a hard disk. Can I save more information on it at higher
> or lower temperatures?

Strictly speaking, this is an ambiguous question. If we take the usual meaning of
hard disk as including a particular apparatus (heads, controller
logic, SATA interface and so on) to read and write the data, then
there will be a limited range of temperatures over which that
apparatus will operate. Outside of that range, (both higher and lower)
the information storage will fall to zero. That is a purely
engineering question.

On the other hand, if you just gave me the metallic platter from the
hard disk, and did not restrict in any way the technology used to read
and write the data, then in principle, the higher the temperature, the
more information is capable of being encoded on the disk.

In practice, various phase transitions will make this more difficult
to achieve as temperature is increased. Passing the Curie point, for
instance, will mean we can no longer rely on magnetism, although
presumably even below the Curie point we can increase the information
storage in some other way (e.g. moving atoms around by an STM), ignoring
the ferromagnetic behaviour. By the same token, passing the
freezing and boiling points will make it even harder - but still
doable with sufficiently advanced technology.

> >
> >>From an engineering viewpoint it looks a bit strange.
>
> > How so?
> >
>
> If engineers took the statement "the maximum possible value
> for information increases with temperature" literally, they would
> operate hard disks at higher temperatures (the higher the better,
> according to such a statement). Yet this does not happen. Do you
> know why?
>
> In general we are surrounded by devices that store information (hard
> disks, memory sticks, DVDs, etc.). The information that these devices
> can store is, I believe, known to an accuracy of one bit.

Because they're engineered that way. It would be rather inconvenient if
one's information storage varied with temperature.

> Can you
> suggest a thermodynamic state whose entropy gives us exactly that
> amount of information?
>
> Here again there would be a question about temperature. If I operate my
> memory stick within some reasonable range of temperatures, the
> information it contains does not change. Yet the entropy, in my view,
> does change.

Sure - because they're engineered that way, and they operate a long
way from the theoretical maximum storage capability of that
matter. What's the problem with that?

>
> So these are my doubts for which I do not see an answer.
>
> Evgenii
>


meekerdb

unread,
Jan 29, 2012, 12:41:27 AM1/29/12
to everyth...@googlegroups.com
On 1/28/2012 3:42 PM, Russell Standish wrote:
> On Sat, Jan 28, 2012 at 12:05:57PM +0100, Evgenii Rudnyi wrote:
>> Let us take a hard disk. Can I save more information on it at higher
>> or lower temperatures?
> This is a strictly ambiguous question. If we take the usual meaning of
> hard disk as including a particular apparatus (heads, controller
> logic, SATA interface and so on) to read and write the data, then
> there will be a limited range of temperatures over which that
> apparatus will operate. Outside of that range, (both higher and lower)
> the information storage will fall to zero. That is a purely
> engineering question.
>
> On the other hand, if you just gave me the metallic platter from the
> hard disk, and did not restrict in any way the technology used to read
> and write the data, then in principle, the higher the temperature, the
> more information is capable of being encoded on the disk.

I don't think this is quite right. A higher temperature means that there are more energy
states available. But the concept of 'temperature' implies that these are occupied in a
random way (according to the micro-canonical ensemble). For us to read and write data
requires that the act of reading or writing a bit moves the distribution of states in
phase space enough that it is distinguishable from the random fluctuations due to
temperature. So if the medium is hotter, you need to use more energy to read and write a
bit. This of course runs into the problems you note below. So in practice it is often
colder systems that allow us to store more data because then we can use small energy
differences to encode bits.
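
A back-of-the-envelope illustration of that trade-off (a sketch; it assumes the Landauer bound kT·ln 2 as the theoretical minimum switching energy per bit, with an arbitrary safety factor standing in for "distinguishable from thermal fluctuations"):

    import math

    k = 1.380649e-23        # Boltzmann constant, J/K
    safety_factor = 100.0   # arbitrary margin above kT ln 2 for reliable read/write

    for T in (4.0, 77.0, 300.0, 1000.0):   # liquid He, liquid N2, room temp, hot
        e_min = k * T * math.log(2)        # Landauer bound per bit, in joules
        print(f"T = {T:6.1f} K: kT ln2 = {e_min:.2e} J, "
              f"practical bit energy ~ {safety_factor * e_min:.2e} J")

The colder the medium, the smaller the energy increment needed to make a bit stand out from the thermal background, which is the point made above.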

Brent

Craig Weinberg

unread,
Jan 29, 2012, 1:54:52 AM1/29/12
to Everything List
On Jan 28, 1:48 pm, John Clark <johnkcl...@gmail.com> wrote:
> On Fri, Jan 27, 2012 at 5:51 PM, Craig Weinberg <whatsons...@gmail.com>wrote:
>
> > You could much more easily write a probabilistic equation to simulate any
> > given volume of water than the same volume of DNA, especially
>
> The motion of both can be well described by the Navier-Stokes equations which
> describe fluid flow using Newton's laws, and DNA being more viscous than
> water, the resulting equations would be simpler than the ones for water.

I'm not talking about fluid flow, I'm talking about simulating
everything - potential and actual chemical reactions, etc. Water can
be described by multiplying the known interactions of H2O, DNA would
need many more variables.

>
> > > when you get into secondary and tertiary structure.
>
> You've got to play fair: if you talk about microstates for DNA I get to
> talk about microstates for water.
>
> > I had not heard of Shannon information.
>
> Somehow I'm not surprised, and it's Shannon Information Theory.

No, I've heard of Shannon Information Theory. I didn't realize that it
was such an instrumental special case theory though.

>
> > The key phrase for me here is "the thermodynamic entropy is interpreted as
> > being proportional to the amount of further Shannon information needed to
> > define the detailed microscopic state of the
> > system".
>
> OK, although I don't see what purpose the word "further" serves in the
> above, and although I know all about Claude Shannon the term "Shannon
> information" is nonstandard. What would Non-Shannon information be?

Non-Shannon information would be anything that is not directly
involved in the compression of a digitally sampled description into
another digital description. Further means that if you add x calories
of heat, you need x more units of Shannon information to define the
effect of the added heat/motion.

>
> > > This confirms what I have been saying and is the opposite of what you
> > are saying.
>
> What on Earth are you talking about?? The more entropy a system has the
> more information needed to describe it.

Yes. It is information that lets you describe patterns more easily.
The more pattern there is, the more you can say 'yes, I get it, add
500 0s and then another 1'. When there is less information, less
pattern, more energy, it takes more information to describe it. There
are no patterns to give your compression a shortcut.

>
> > > This means that DNA, having low entropy compared with pure water, has
> > high pattern content, high information, and less Shannon information"
>
> I see, it has high information and less information. No I take that back, I
> don't see, although it is consistent with your usual logical standards.

Shannon information is not information in general, it is a specific
kind of information about information which is really inversely
proportional to information in any other sense. Its uninformability
is what it is. Drag. Entropy. Resistance to the process (not
thermodynamic resistance).

>
> > Easier to compress does *not* mean less information
>
> It means a message has been inflated with useless gas and a compression
> program can remove that gas and recover the small kernel of information
> undamaged.

Hahaha. The useless gas is what separates coherence and sanity from
garbage. It's useless to a computer, sure, but without the gas it's
useless to us. Next time you want to look at a picture, try viewing it
in its compressed form in a hex editor. Get rid of all that useless
gas.

> White noise has no gas in it for a compression program to
> deflate; that's why, if you don't know the specific compression program used,
> the resulting file (like a zip or gif file) would look like random white
> noise, and yet it's full of useful information if you know how to get it.
> The same thing is true of encrypted files: if the encryption is good then
> the file will look completely random, just white noise, to anyone who does
> not have the secret key.

I understand what you mean completely, and that is indeed how
computers treat data, but it is the opposite of what it means to
inform in general terms. Compression and encryption are deformations.
Decryption is how we get any information out of it. White noise is
called noise for a reason. The opposite of noise is signal. Signals
are signifying and informing, thus information.

>
> > The compressibility of a novel or picture does not relate to the quality
> > of information
>
> How do you expect mathematics to deal with anything as subjective as
> quality? A novel that's high quality to you may be junk to me.

I don't expect mathematics to deal with it. I expect a theory of
everything to deal with it.

>
> > Knowledge and wisdom are already owned by philosophy and religion,
>
> I've never heard of religion saying anything wise, philosophy does contain
> wisdom but none of it came from professional philosophers, at least not in
> the last 300 years.

I'm not a big philosophy or religion fan myself but Wittgenstein,
Heidegger, Sartre, Foucault, Kierkegaard were recent and had some
impressive things to say. But my point was that those terms are
associated too much with those traditions to be used meaningfully in a
new universal synthesis.

>
> > The human mind does not work like a computer
>
> As you've said before, but saying it does not make it so.

No, but thinking about what it means does make it so. Here are some
sample articles on the subject:


http://scienceblogs.com/developingintelligence/2007/03/why_the_brain_is_not_like_a_co.php

http://neuroanthropology.net/2008/03/17/how-is-your-brain-not-like-a-computer/

http://dangerousintersection.org/2006/05/18/the-brain-is-not-a-computer/

http://www.forbes.com/forbes/2009/0511/046-artificial-intelligence-neuroscience-digital-tools.html

>
> > it does not compress and decode memories
>
> Then the human mind works very inefficiently and needs improvement.

Why, so we can fill it up with petabytes of tooth-brushing
highlights?

>
> > are concrete analog presentations that re-present, *not* representations
> > and not digital data.
>
> Even asterisks do not make it so.

It already is made so, I'm just pointing it out. It's not a matter of
opinion. Blue is a presentation, not some generic quantitative
representation of non-blueness.

>
> > I don't think it's an exaggeration to say that 99% of people who use the
> > word information use it the way that I've been using it
>
> I don't think it's an exaggeration to say that if you wish to understand
> how mind works the verbiage generated by 99% of the people on this planet
> will be of no help to you whatsoever; better to listen to what the experts
> have to say about the subject.

"Science begins when you distrust experts.” - Richard Feynman.

You're right, I'll trust Feynman.

>
> > The bucket of water has higher thermodynamic entropy which requires more
> > Shannon information to describe.
>
>  Yes
>
> > > The encoded description of the water has more information if we were to
> > simulate it exactly
>
> Yes.
>
> > but that doesn't mean the original has more information,
>
> I see, it has more information but that doesn't mean it has more
> information.  No I take that back, I don't see, although it is consistent
> with your usual logical standards.

Repeating yourself there. I hope you realize now that information is
not at all the same thing as Shannon information, and that there are
many uses of the word entropy. It's a fundamental concept like
'degeneration' so it gets used in a lot of different ways. If you want
to talk about how a physical object degenerates you might talk about
thermodynamic entropy. If you want to talk about how a crowd disperses
after a football game you might talk about entropy in a geographic
statistical sense. The two are not related. You can't make the crowd
disperse earlier by warming the stadium up.

Craig

Evgenii Rudnyi

unread,
Jan 29, 2012, 10:23:12 AM1/29/12
to everyth...@googlegroups.com
On 28.01.2012 23:26 meekerdb said the following:

> On 1/27/2012 11:47 PM, Evgenii Rudnyi wrote:

...

>> You disagree that engineers do not use thermodynamic entropy
>
>
> Yes. I disagreed that information "has nothing to do with
> thermodynamic entropy", as you wrote above. You keep switching
> formulations. You write X and ask if I agree. I disagree. Then you
> claim I've disagreed with Y. Please pay attention to your own
> writing. There's a difference between "X is used in place of Y" and
> "X has nothing to do with Y".

A good suggestion. It may well be that I express my thoughts unclearly,
sorry for that. Yet, I think that my examples show that

1) There is information that engineers employ.

2) There is the thermodynamic entropy.

3) Numerical values in 1) and 2) are not related to each other.

Otherwise I would appreciate it if you expressed the relationship between
the information that engineers use and the thermodynamic entropy in your
own words, as this is the question that I would like to understand.

I understand you when you talk about the number of microstates. I do not
understand, though, how they are related to the information employed by
engineers. I would be glad to hear your comment on that.

Evgenii

Evgenii Rudnyi

unread,
Jan 29, 2012, 10:30:38 AM1/29/12
to everyth...@googlegroups.com
On 29.01.2012 00:42 Russell Standish said the following:

> On Sat, Jan 28, 2012 at 12:05:57PM +0100, Evgenii Rudnyi wrote:

...

>> In general we are surrounded by devices that store information (hard
>> disks, memory sticks, DVDs, etc.). The information that these
>> devices can store is, I believe, known to an accuracy of one bit.
>
> Because they're engineered that way. It would be rather inconvenient
> if one's information storage varied with temperature.
>
>> Can you suggest a thermodynamic state whose entropy gives us
>> exactly that amount of information?
>>
>> Here again there would be a question about temperature. If I operate my
>> memory stick within some reasonable range of temperatures, the
>> information it contains does not change. Yet the entropy, in my
>> view, does change.
>
> Sure - because they're engineered that way, and they operate a long
> way from the theoretical maximum storage capability of that matter.
> What's the problem with that?

The problem that I see is that the entropy changes when the temperature
changes. Or do you claim that the entropy of the memory stick/DVD/hard
disc remains the same when its temperature changes for example from 15
to 25 degrees?

Anyway, I do not see how one can obtain the information capacity of the
storage devices from the thermodynamic entropy and this is my point.

Do you claim that the information capacity of a memory stick/DVD/hard
disk, for which we pay money, is equivalent to the thermodynamic entropy
of the device?

Evgenii

Russell Standish

unread,
Jan 29, 2012, 4:49:59 PM1/29/12
to everyth...@googlegroups.com
On Sun, Jan 29, 2012 at 04:23:12PM +0100, Evgenii Rudnyi wrote:
> On 28.01.2012 23:26 meekerdb said the following:
> >On 1/27/2012 11:47 PM, Evgenii Rudnyi wrote:
>
> A good suggestion. It well might be that I express my thoughts
> unclear, sorry for that. Yet, I think that my examples show that
>
> 1) There is information
and entropy

> that engineers employ.
>
> 2) There is the thermodynamic entropy.

+ thermodynamic information

>
> 3) Numerical values in 1) and 2) are not related to each other.
>

Fixed that for you. Why should you expect the different types of
information that come from different contexts to have the same
numerical value? The whole point of "On complexity and emergence" is
that notions of information and entropy are completely context sensitive
(that is not to say they're subjective as such - people agreeing on the
context will agree on the numerical values).

Russell Standish

unread,
Jan 29, 2012, 5:00:13 PM1/29/12
to everyth...@googlegroups.com
On Sun, Jan 29, 2012 at 04:30:38PM +0100, Evgenii Rudnyi wrote:
>
> The problem that I see is that the entropy changes when the
> temperature changes. Or do you claim that the entropy of the memory
> stick/DVD/hard disc remains the same when its temperature changes
> for example from 15 to 25 degrees?

The entropy changes.

>
> Anyway, I do not see how one can obtain the information capacity of
> the storage devices from the thermodynamic entropy and this is my
> point.
>

Who was ever claiming that? The theoretical maximum possible
information storage is related, though.

> Do you claim that the information capacity of a memory stick/DVD/hard
> disk, for which we pay money, is equivalent to the thermodynamic
> entropy of the device?
>

Never. The best you have is I=S_max-S, where I is the theoretical
maximum possible information storage. The value C (capacity of the
storage device) must satisfy

C <= I.

Usually C << I, for technological reasons. Also, it is undesirable to
have C vary with temperature, whereas I does vary in general
(particularly across phase transitions).

The information content of a drive is another number D <= C, usually
much less, and very dependent on the user of that drive. If the drive
is encrypted, and the user has lost the key, the information content
is close to zero.

The quantities I, C and D are all numerical quantities having the name
information.
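
To put toy numbers on those inequalities (a sketch with made-up values, only to show the ordering; none of them are measured properties of a real drive):

    # All quantities in bits.
    S_max = 1e24          # hypothetical maximum entropy of the platter's matter
    S = S_max - 1e22      # actual entropy, held below S_max by the material's structure

    I = S_max - S         # theoretical maximum information storage: ~1e22 bits
    C = 8e12              # engineered capacity, e.g. a 1 TB drive (8e12 bits)
    D = 3e12              # information actually meaningful to this particular user

    assert D <= C <= I    # the ordering described above
    print(I, C, D)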

Cheers

Russell Standish

unread,
Jan 29, 2012, 5:06:02 PM1/29/12
to everyth...@googlegroups.com
On Sat, Jan 28, 2012 at 09:41:27PM -0800, meekerdb wrote:
> On 1/28/2012 3:42 PM, Russell Standish wrote:
> >On the other hand, if you just gave me the metallic platter from the
> >hard disk, and did not restrict in any way the technology used to read
> >and write the data, then in principle, the higher the temperature, the
> >more information is capable of being encoded on the disk.
>
> I don't think this is quite right. A higher temperature means that
> there are more energy states available. But the concept of
> 'temperature' implies that these are occupied in a random way
> (according to the micro-canonical ensemble). For us to read and
> write data requires that the act of reading or writing a bit moves
> the distribution of states in phase space enough that it is
> distinguishable from the random fluctuations due to temperature.
> So
> if the medium is hotter, you need to use more energy to read and
> write a bit. This of course runs into the problems you note below.

Hence the requirement that technology not be fixed. It is a
theoretician's answer :).

> So in practice it is often colder systems that allow us to store
> more data because then we can use small energy differences to encode
> bits.

Absolutely! But at zero kelvin, the information storage capacity of the
device is precisely zero, so cooling only works to a certain point.

> --
> You received this message because you are subscribed to the Google Groups "Everything List" group.
> To post to this group, send email to everyth...@googlegroups.com.
> To unsubscribe from this group, send email to everything-li...@googlegroups.com.
> For more options, visit this group at http://groups.google.com/group/everything-list?hl=en.

--

John Clark

unread,
Jan 30, 2012, 12:03:29 AM1/30/12
to everyth...@googlegroups.com
On Sun, Jan 29, 2012 Craig Weinberg <whats...@gmail.com> wrote:

> I'm not talking about fluid flow,

OK


> I'm talking about simulating everything - potential and actual chemical reactions, etc.

OK


> Water can be described by multiplying the known interactions of H2O,

But many, probably most, of water's interactions are unknown to this
day. Virtually all of organic chemistry (including DNA reactions!)
involves water somewhere in the chain of reaction, but organic chemistry
is very far from a closed subject; there is still much to learn. Another
example: up to now nobody has derived the temperature that water freezes
at from first principles, because the resulting quantum mechanical
equations are so mathematically complicated that nobody has yet figured
out how to solve them.


> DNA would need many more variables.

BULLSHIT!


> Non-Shannon information would be anything that is not directly involved in the compression of a digitally sampled description into another digital description.

In other words non-Shannon information is gaseous philosophical flatulence.

       > Shannon information is not information in general, it is [...]

Shannon published his work in 1948 but you never even heard about it
until 3 days ago, and now you're a great world authority on the subject
telling us all exactly what it does and does not mean. I don't mind
ignorance, I'm ignorant about a lot of stuff myself, but there is a
certain kind of arrogant aggressive ignorance that I find very distasteful.

In contrast Richard Feynman displayed humble ignorance, he did as much
as anyone to develop Quantum Mechanics but he said "I think it's safe to
say that nobody understands Quantum Mechanics", in describing the work
that won him the Nobel Prize he said he found a way to "sweep
mathematical difficulties under the rug". He also said "I know how hard
it is to really know something; how careful you have to be about
checking the experiments; how easy it is to make mistakes and fool
yourself. I know what it means to know something."


       > Compression and encryption are deformations.

If you can get the exact same file out after compression or encryption
then obviously nothing has been lost and all deformations or shrinkage
are reversible.


       > I understand what you mean completely

Apparently not


       > White noise is called noise for a reason.

And it's called white for a reason, an evil occidental mindset
conspiracy created by round-eyed white devils.



>> How do you expect mathematics to deal with anything as subjective as
>> quality? A novel that's high quality to you may be junk to me.

> I don't expect mathematics to deal with it. I expect a theory of
> everything to deal with it.

And your way of dealing with it is to say it (bits, electrons, information,
logic, etc.) does not exist. I would never have guessed that coming up
with a theory of everything could be so easy.


> I'm not a big philosophy or religion fan myself but Wittgenstein,
> Heidegger, Sartre, Foucault, Kierkegaard were recent and had some
> impressive things to say.

As I've said before nearly everything they and all other recent
philosophers say can be put into one of four categories:

1) False.
2) True but obvious, a truism disguised in pretentious language.
3) True and deep but discovered first and explained better by a
mathematician or scientist or someone else who didn't write
"philosopher" in the box labeled "occupation" on his tax form.
4) So bad it's not even wrong.


> Here's some sample articles on the subject:

I know how to look up things on Google too, and I wonder how many of the
authors of those articles graduated from high school.


> "Science begins when you distrust experts." - Richard Feynman. You're
right, I'll trust Feynman.

If you think Feynman would treat your ideas with anything other than
contempt you're nuts. And you should look at the short one minute video
by Feynman called "You don't like it? Go somewhere else!":

http://www.youtube.com/watch?v=iMDTcMD6pOw


 John K Clark



Craig Weinberg

unread,
Jan 30, 2012, 6:13:04 PM1/30/12
to Everything List
On Jan 30, 12:03 am, John Clark <johnkcl...@gmail.com> wrote:
> On Sun, Jan 29, 2012 Craig Weinberg <whatsons...@gmail.com> wrote:
>
> > I'm not talking about fluid flow,
>
> OK
>
> > I'm talking about simulating everything - potential and actual chemical
> > reactions, etc.
>
> OK
>
> > Water can be described by multiplying the known interactions of H2O,
>
> But many, probably most, of water's interactions are unknown to this
> day. Virtually all of organic chemistry (including DNA reactions!)
> involves water somewhere in the chain of reaction, but organic chemistry
> is very far from a closed subject, there is still much to learn.

Cool, I didn't know that. What about DNA though? Why would it be any
less mysterious?

> Another
> example, up to now nobody has derived the temperature that water freezes
> at from first principles because the resulting quantum mechanical
> equations are so mathematically complicated that nobody has yet figured
> out how to solve them.

Water is strange stuff. Its blue color comes from inside of it too.
Intramolecular collisions rather than reflection.

>
> > DNA would need many more variables.
>
> BULLSHIT!
>

Why?

> > Non-Shannon information would be anything that is not directly involved
> > in the compression of a digitally sampled description into another digital
> > description.
>
> In other words non-Shannon information is gaseous philosophical flatulence.

Uhh, what? I just explained that Shannon information has nothing to do
with anything except data compression. It's like I just explained what
a catalytic converter is and you said 'in other words non-catalytic
converters are gaseous philosophical flatulence.'

>
>        > Shannon information is not information in general, it is [...]
>
>
>
> Shannon published his work in 1948 but you never even heard about it
> until 3 days ago, and now you're a great world authority on the subject
> telling us all exactly what it does and does not mean.

I'm only the expert compared to you, since your explanation, which you
argued with all the authority of a seasoned expert, was dead wrong.
Precisely wrong.

> I don't mind
> ignorance, I'm ignorant about a lot of stuff myself, but there is a
> certain kind of arrogant aggressive ignorance that I find very distasteful.

That sentence embodies it perfectly.

>
> In contrast Richard Feynman displayed humble ignorance, he did as much
> as anyone to develop Quantum Mechanics but he said "I think it's safe to
> say that nobody understands Quantum Mechanics", in describing the work
> that won him the Nobel Prize he said he found a way to "sweep
> mathematical difficulties under the rug". He also said "I know how hard
> it is to really know something; how careful you have to be about
> checking the experiments; how easy it is to make mistakes and fool
> yourself. I know what it means to know something."

Yes, I'm familiar with Feynman.

>
>        > Compression and encryption are deformations.
>
>
>
> If you can get the exact same file out after compression or encryption
> then obviously nothing has been lost and all deformations or shrinkage
> are reversible.

Nothing can become a 'file' without irreversible loss. Once it's a
file, sure you can do all kinds of transformations to it, but you'll
never get the original live band playing a song off of an mp3.

>
>        > I understand what you mean completely
>
>
>
> Apparently not

No, I have understood you from the start. I knew you were wrong about
information and entropy and you were. You don't understand my position
though, so you assume it's senseless and throw things in my general
direction.

>
>        > White noise is called noise for a reason.
>
>
>
> And its called white for a reason, a evil occidental mindset
> conspiracy created by round eyed white devils.

I would imagine it's called white because it is additive interference.
My point still stands. The terms signal and noise refer to information
(signal) and entropy (noise). Get it straight. Or don't.

>
> >> How do you expect mathematics to deal with anything as subjective as
> >> quality? A novel that's high quality to you may be junk to me.
>
> > I don't expect mathematics to deal with it. I expect a theory of
> > everything to deal with it.
>
> And your way of dealing with it is to say it (bits electrons information
> logic etc) does not exist. I would never have guessed that coming up
> with a theory of everything could be so easy.

If you understand my hypothesis then you will see there is no reason
to think they exist. Just as you think free will has no reason to
exist.

>
> > I'm not a big philosophy or religion fan myself but Wittgenstein,
> > Heidegger, Sarte, Foucault, Kierkegaard were recent and had some
> > impressive things to say.
>
> As I've said before nearly everything they and all other recent
> philosophers say can be put into one of four categories:
>
> 1) False.
> 2) True but obvious, a truism disguised in pretentious language.
> 3) True and deep but discovered first and explained better by a
> mathematician or scientist or someone else who didn't write
> "philosopher" in the box labeled "occupation" on his tax form.
> 4) So bad its not even wrong.

Have you read anything of theirs? I thought Foucault's Discipline and
Punish was one of the most interesting books I've ever read.

>
> > Here's some sample articles on the subject:
>
> I know how to look up things on Google too, and I wonder how many of the
> authors of those articles graduated from high school.
>
> > "Science begins when you distrust experts." - Richard Feynman. You're
> > right, I'll trust Feynman.
>
> If you think Feynman would treat your ideas with anything other than
> contempt you're nuts.

Feynman's ideas in his time were as crazy as mine are in our time.
Feynman I think would have been intrigued by my ideas, especially if
they came from someone that he knew personally.

> And you should look at the short one minute video
> by Feynman called "You don't like it? Go somewhere else!":
>
> http://www.youtube.com/watch?v=iMDTcMD6pOw

QED was wild stuff then, now it's the orthodoxy. That happens.

Craig

John Clark

unread,
Jan 31, 2012, 12:56:23 PM1/31/12
to everyth...@googlegroups.com
On Mon, Jan 30, 2012  Craig Weinberg <whats...@gmail.com> wrote:

> I just explained

3 days after learning that the subject even existed here we sit at your feet while you explain all about it to us.
 
> that Shannon information has nothing to do with anything except data compression.

Except for data compression? Except for identifying the core, must-have part of any message. Except for telling us exactly what's important and what is not. Except for showing how to build things like the internet.

Except for that Mrs. Lincoln how did you like the play?

Shannon can tell you how many books can be sent over a noisy wire in a given amount of time without error, and if you're willing to tolerate a few errors Shannon can tell you how to send even more. If the contents of books is not information what do you call the contents of books?
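
For instance, the Shannon-Hartley theorem gives the error-free capacity of a noisy analog channel; a small sketch (the bandwidth, signal-to-noise ratio, and book size are made-up figures, purely for illustration):

    import math

    bandwidth_hz = 3000.0   # e.g. a telephone-grade line
    snr_linear = 1000.0     # signal-to-noise ratio (30 dB)

    capacity_bps = bandwidth_hz * math.log2(1 + snr_linear)   # Shannon-Hartley limit
    book_bits = 8 * 500000                                    # a ~500 kB book

    print(f"capacity ~ {capacity_bps:.0f} bit/s")             # ~29,900 bit/s
    print(f"books per hour ~ {capacity_bps * 3600 / book_bits:.1f}")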

> Nothing can become a 'file' without irreversible loss.

Ah, well, that explains why I can't make heads or tails out of your ideas: all I've seen is your mail files. Now if I'd seen your original glorious Email just as it was as you typed it on your computer screen, with no irreversible loss, I would have long ago become convinced you were right and were in fact the second coming of Isaac Newton. So when you respond to this post please don't send me a file full of irreversible loss, send me your ORIGINAL, send me the real deal.
 
> The terms signal and noise refer to information (signal) and entropy (noise). Get it straight.

One man's signal is another man's noise; to a fan of hisses and clicks and pops, the music is the noise. First you decide what you want to call the signal and then Shannon can tell you what the signal to noise ratio is and he can show you ways to improve it.


>> And your way of dealing with it is to say it (bits electrons information logic etc) does not exist. I would never have guessed that coming up with a theory of everything could be so easy.

> If you understand my hypothesis then you will see there is no reason to think they exist.

Then I dearly hope my mind never goes so soft that I understand your hypothesis.

> Just as you think free will has no reason to exist.

No, no, a thousand times no! Free will would have to improve dramatically before it could have the lofty property of "nonexistence"; free will is an idea so bad it's not even wrong.
 
> I thought Foucault's Discipline and Punish was one of the most interesting books I've ever read.

I don't consider social criticism a part of philosophy even if I agree with it because it always includes matters of taste. Professional philosophers might write interesting books about history or about what society should or should not do, but none of them have contributed to our understanding of the nature of reality in centuries. That's not to say philosophy hasn't made progress, it just wasn't made by philosophers.


> Feynman I think would have been intrigued by my ideas

Delusions of grandeur.

 John K Clark 




Evgenii Rudnyi

unread,
Feb 1, 2012, 3:10:21 PM2/1/12
to everyth...@googlegroups.com
On 29.01.2012 22:49 Russell Standish said the following:

> On Sun, Jan 29, 2012 at 04:23:12PM +0100, Evgenii Rudnyi wrote:
>> On 28.01.2012 23:26 meekerdb said the following:
>>> On 1/27/2012 11:47 PM, Evgenii Rudnyi wrote:
>>
>> A good suggestion. It well might be that I express my thoughts
>> unclear, sorry for that. Yet, I think that my examples show that
>>
>> 1) There is information
> and entropy
>
>> that engineers employ.

Some engineers employ information, some the thermodynamic entropy. I
have not, though, seen an engineering paper where information and the
thermodynamic entropy are used as synonyms.

>> 2) There is the thermodynamic entropy.
>
> + thermodynamic information
>
>>
>> 3) Numerical values in 1) and 2) are not related to each other.
>>
>
> Fixed that for you. Why should you expect the different types of
> information that come from different contexts to have the same
> numerical value? The whole point of "On complexity and emergence" is
> that notions of information and entropy are completely context
> sensitive (that is not to say they're subjective as such - people
> agreeing on the context will agree on the numerical values).


First, the thermodynamic entropy is not context dependent. This must mean
that if it is the same as information, then the latter must not be
context dependent either. Could you please give me an example of a
physical property that is context dependent?

Second, when I have different numerical values, this could mean that the
units are different. Yet, if this is not the case, then in my view we
are talking about two different entities.

Could you please explain then what is common between 1) and 2)?

Evgenii

>

Evgenii Rudnyi

unread,
Feb 1, 2012, 3:13:34 PM2/1/12
to everyth...@googlegroups.com
On 29.01.2012 23:00 Russell Standish said the following:

> On Sun, Jan 29, 2012 at 04:30:38PM +0100, Evgenii Rudnyi wrote:
>>
>> The problem that I see is that the entropy changes when the
>> temperature changes. Or do you claim that the entropy of the
>> memory stick/DVD/hard disc remains the same when its temperature
>> changes for example from 15 to 25 degrees?
>
> The entropy changes.
>
>>
>> Anyway, I do not see how one can obtain the information capacity
>> of the storage devices from the thermodynamic entropy and this is
>> my point.
>>
>
> Who was ever claiming that? The theoretically maximum possible
> information storage is related, though.
>
>> Do you claim, that the information capacity for which we pay money
>> of a memory stick/DVD/hard disk is equivalent to the thermodynamic
>> entropy of the device?
>>
>
> Never. The best you have is I=S_max-S, where I is the theoretical

What are S_max and S in this equation?

Evgenii

Evgenii Rudnyi

unread,
Feb 1, 2012, 3:17:41 PM2/1/12
to everyth...@googlegroups.com
On 29.01.2012 23:06 Russell Standish said the following:

I believe that you have once mentioned that information is negentropy.
If yes, could you please comment on that? What would negentropy mean?

In general, I do not understand what it means that information at
zero Kelvin is zero. Let us take a coin and cool it down. Do you mean
that the text on the coin will disappear? Or do you mean that no device
can read this text at zero Kelvin?

Evgenii

Stephen P. King

unread,
Feb 1, 2012, 3:51:37 PM2/1/12
to everyth...@googlegroups.com

    Temperature is context dependent. If we consider physics at the
level of atoms, there is no such quantity as temperature. Additionally,
thermodynamic entropy requires Boltzmann's constant to be defined,
which is a form of context dependency, as it specifies the level at which
we are to take micro-states as macroscopically indistinguishable.

Onward!

Stephen

John Mikes

unread,
Feb 1, 2012, 4:51:51 PM2/1/12
to everyth...@googlegroups.com
Evgenii, I am not sure if it is your text, or Russell's:
 
   "In general, I do not understand what it means that information at zero Kelvin is zero. Let us take a coin and cool it down. Do you mean that the text on the coin will disappear? Or do you mean that no device can read this text at zero Kelvin?"
 
I doubt that the "text" embossed on a coin is "its" information. It is part of the "physical" structure as e.g. the roundness. size, or material(?) characteristics - all, what nobody can imagine how to change for  the condition of 0-Kelvin. The abs. zero temp. conditions are extrapolated the best way we could muster. A matter of (sci.) faith. Maybe the so called 'interstitial' spaces also collapse? I am not for a 'physicalistic' worldview - rather an agnostic about 'explanations' of diverse epochs based on then recent 'findings' (mostly mathematically justified??? -
realizing that we may be up to lots of novelties we have no idea about today, not even of the directions they may shove our views into. I say that in comparison to our 'conventional scientific' - even everyday's - views of the world in the past, before and after fundamental knowledge-domains were added to our inventory.
I do not condone evidences "that must be, because THERE IS NO OTHER WAY" - in our existing ignorance of course. Atoms? well, if there is 'matter'? (MASS??) even my (macro)molecules I invented are suspect.
So 'entropy' is a nice term in (classical?) thermodynamics what I coined in 1942 as "the science that tells us how things would proceed wouldn't they proceed as they do indeed" thinking of Carnot and the isotherm/reversible equilibria, etc. - way before the irreversible kind was taught in college courses. Information is another rather difficult term, I like to use 'relation' and leave it open what so far unknown relations may affect our processes we assign to 'causes' known within the model of the world we think we are in. The rest (including our misunderstood model - domain) is what I may call an 'infinite complexity' of which we are part - mostly ignorant about the 'beyond model' everything.
 
We 'fabricate' our context, try to explain by the portion we know of - as if it was the totality - and live in our happy conventional scientific terms.
Human ingenuity constructed a miraculous science and technology that is ALMOST good (some mistakes notwithstanding occurring), then comes M. Curie, Watson-Crick, Fleming, Copernicus, Volta, etc. and we re-write the schoolbooks.
 
John M
 

 

John Clark

unread,
Feb 2, 2012, 1:12:40 PM2/2/12
to everyth...@googlegroups.com
On Wed, Feb 1, 2012 at 3:10 PM, Evgenii Rudnyi <use...@rudnyi.ru> wrote:

> Could you please give me an example of a physical property that is context dependent?

Off the top of my head, mass, velocity, duration and length.

 John K Clark


Evgenii Rudnyi

unread,
Feb 2, 2012, 1:32:41 PM2/2/12
to everyth...@googlegroups.com
On 02.02.2012 19:12 John Clark said the following:

Could you please expand and explain what you mean by context dependent.
Often people employ the same words, but the meaning is completely
different (as happens, in my view, with the entropy in thermodynamics
and in information theory).

When Russell says that information is context dependent, we are talking
about, for example, a DVD. The information capacity as defined by the
company and the number of physical states are then completely different.
Hence Russell's note that information is context dependent.

Do you mean that mass is context dependent in the same sense as above?
If yes, could you please explain it a bit more?

Evgenii

Evgenii Rudnyi

unread,
Feb 2, 2012, 1:35:41 PM2/2/12
to everyth...@googlegroups.com
On 01.02.2012 22:51 John Mikes said the following:

> Evgenii, I am not sure if it is your text, or Russell's:
>
> "In general, I do not understand what it means that
> information at zero Kelvin is zero. Let us take a coin and cool it
> down. Do you mean that the text on the coin will disappear? Or do you
> mean that no device can read this text at zero Kelvin?"

This was my question to Russell.

> I doubt that the "text" embossed on a coin is "its" information. It
> is part of the "physical" structure, as e.g. the roundness, size, or
> material(?) characteristics - all of which nobody can imagine how to
> change for the condition of 0-Kelvin.

Yes, but when we speak about an information carrier (a book, a hard drive,
a DVD, flash memory) it is exactly the same. And it has nothing to do with
the total number of physical states in the device, as this example with
zero temperature nicely shows.

Evgenii


Evgenii Rudnyi

unread,
Feb 2, 2012, 1:45:53 PM2/2/12
to everyth...@googlegroups.com
On 01.02.2012 21:51 Stephen P. King said the following:

Boltzmann's constant, as far as I understand, is defined uniquely.
If you talk about some other universe (or Platonia) where one could
imagine something else, then that could be. Yet, in the world that we know
from empirical scientific studies, Boltzmann's constant is a
fundamental constant. Hence I do not understand you in this respect.

Indeed, temperature is not available directly at the level of particles
obeying classical or quantum laws. However, that could for example be not a
problem with temperature but rather with the description at the
particle level.

Anyway, I would suggest sticking to the empirical scientific knowledge that
we have. Then I do not understand what you mean by temperature being
context dependent either.

We can imagine very different worlds indeed. Yet, right now we are
discussing the following question (I will repeat it from the email to John):

When Russell says that information is context dependent, we are talking
about, for example, a DVD. The information capacity as defined by the
company and the number of physical states are then completely different.
Hence Russell's note that information is context dependent.

If you mean that the temperature and the Boltzmann constant are context
dependent in the same way, could you please give practical examples?

Evgenii

meekerdb

unread,
Feb 2, 2012, 2:00:03 PM2/2/12
to everyth...@googlegroups.com
On 2/2/2012 10:35 AM, Evgenii Rudnyi wrote:
> Yes, but when we speak about information carrier (book, a hard drive, DVD, flash memory)
> it is exactly the same. And it has nothing to do with the total number of physical
> states in the device, as this example with zero temperature nicely shows.

That's not true. The arrangement of ink on the page, the embossed face of the coin, do
contribute to the physical states. It's just that the information encoded by them is
infinitesimal compared to the information required to define the microscopic states, e.g.
the vibrational mode of every atom. So when we're concerned with heat energy that changes
the vibrational modes we neglect the pattern of ink and the embossing. And when we're
reading we are only interested in the information conveyed by a well defined channel, and
we ignore what information might be encoded in the microscopic states. But the two are
both present.

Brent.

Evgenii Rudnyi

unread,
Feb 2, 2012, 2:23:20 PM2/2/12
to everyth...@googlegroups.com
On 02.02.2012 20:00 meekerdb said the following:

Yes, I agree with this, but I think it changes nothing about the term
information. We have a number of physical states in a carrier (which is
indeed influenced by, for example, the arrangement of ink on the page) and
we have the information capacity as defined by the company that sells
the carrier.

By the way, the example with zero temperature (or, strictly speaking,
with temperature going to zero Kelvin) seems to show that the
information capacity could be even more than the number of physical
states.

Evgenii

Russell Standish

unread,
Feb 2, 2012, 4:18:01 PM2/2/12
to everyth...@googlegroups.com
On Thu, Feb 02, 2012 at 07:45:53PM +0100, Evgenii Rudnyi wrote:
> On 01.02.2012 21:51 Stephen P. King said the following:
> >On 2/1/2012 3:10 PM, Evgenii Rudnyi wrote:
> >>First, the thermodynamic entropy is not context dependent. This must
> >> mean that if it is the same as information, then the latter must
> >>not be context dependent either. Could you please give me an
> >>example of a physical property that is context dependent?
> >>
> >
> >Temperature is context dependent. If we consider physics at the level
> >of atoms, there is no such quantity as temperature. Additionally,
> >thermodynamic entropy requires Boltzmann's constant to be defined,
> >which is a form of context dependency, as it specifies the level at
> >which we are to take micro-states as macroscopically
> >indistinguishable.
>
> Boltzmann's constant, as far as I understand, is defined
> uniquely. If you talk about some other universe (or Platonia) where
> one could imagine something else, then that could be. Yet, in the
> world that we know from empirical scientific studies,
> Boltzmann's constant is a fundamental constant. Hence I do not
> understand you in this respect.

Boltzmann's constant is a unit conversion constant like c and Planck's
constant, nothing more. It has no fundamental significance.

>
> Indeed, temperature is not available directly at the level of
> particles obeying classical or quantum laws. However for example it
> could be not a problem with the temperature but rather with the
> description at the particle level.
>
> Anyway, I would suggest sticking to the empirical scientific knowledge
> that we have. Then I do not understand what you mean by temperature
> being context dependent either.
>

Temperature is an averaged quantity, so whilst technically an example
of emergence, it is the weakest form of emergence.

Evgenii is stating an oft-repeated meme that entropy is not
context-dependent.

It is context dependent because it (possibly implicitly) depends on
what we mean by a thermodynamic state. In thermodynamics, we usually
mean a state defined by temperature, pressure, volume, number of
particles, and so on. The "and so on" is the context dependent
part. There are actually an enormous number of possible independent
thermodynamic variables that may be relevant in different
situations. In an electrical device, the arrangement of charges might
be another such thermodynamic variable.

Also, even in classic "schoolbook" thermodynamics, not all of
temperature, pressure, volume and particle number are
relevant. Dropping various of these terms leads to different ensembles
(microcanonical, canonical and grand canonical).

Of course, context dependence does not mean subjective. If two
observers agree on the context, the entropy is quite objective. But it
is a little more complex than something like mass or length.

This is explained very well in Denbigh & Denbigh.

Russell Standish

unread,
Feb 2, 2012, 4:35:06 PM2/2/12
to everyth...@googlegroups.com
On Wed, Feb 01, 2012 at 09:17:41PM +0100, Evgenii Rudnyi wrote:
> On 29.01.2012 23:06 Russell Standish said the following:
> >
> >Absolutely! But at zero kelvin, the information storage capacity of
> >the device is precisely zero, so cooling only works to a certain
> >point.
> >
>
> I believe that you have once mentioned that information is
> negentropy. If yes, could you please comment on that? What would
> negentropy mean?

Schrödinger first pointed out that living systems must export entropy,
and coined the term "negative entropy" to refer to this. Brillouin
shortened this to negentropy.

The basic formula is S_max = S + I.

S_max is the maximum possible value for entropy to take - the value of
entropy at thermodynamic equilibrium for a microcanonical ensemble. S
is the usual entropy, which for non-equilibrium systems will be
typically lower than S_max, and even for equilibrium systems can be
held lower by physical constraints. I is the difference, and this is what
Brillouin called negentropy. It is information - the information
encoded in that state.

Try looking up http://en.wikipedia.org/wiki/Negentropy

>
> In general, I do not understand what it means that information
> at zero Kelvin is zero. Let us take a coin and cool it down. Do you
> mean that the text on the coin will disappear? Or do you mean that no
> device can read this text at zero Kelvin?
>

I vaguely remembered that S_max = 0 at absolute zero. If it were, then
both S and I would have to be zero, because these are all nonnegative
quantities. But http://en.wikipedia.org/wiki/Absolute_zero states only
that entropy is at a minimum, not strictly zero. In which case, I
withdraw that comment.

Cheers

Jason Resch

unread,
Feb 2, 2012, 6:14:06 PM2/2/12
to everyth...@googlegroups.com


On Sun, Jan 22, 2012 at 3:04 AM, Evgenii Rudnyi <use...@rudnyi.ru> wrote:
On 21.01.2012 22:03 Evgenii Rudnyi said the following:

On 21.01.2012 21:01 meekerdb said the following:
On 1/21/2012 11:23 AM, Evgenii Rudnyi wrote:
On 21.01.2012 20:00 meekerdb said the following:
On 1/21/2012 4:25 AM, Evgenii Rudnyi wrote:


...

2) If physicists say that information is the entropy, they
must take it literally and then apply experimental
thermodynamics to measure information. This however seems
not to happen.

It does happen. The number of states, i.e. the information,
available from a black hole is calculated from it's
thermodynamic properties as calculated by Hawking. At a more
conventional level, counting the states available to molecules
in a gas can be used to determine the specific heat of the gas
and vice-verse. The reason the thermodynamic measures and the
information measures are treated separately in engineering
problems is that the information that is important to
engineering is infinitesimal compared to the information stored
in the microscopic states. So the latter is considered only in
terms of a few macroscopic averages, like temperature and
pressure.

Brent

Doesn't this mean that by information engineers mean something
different than physicists do?

I don't think so. A lot of the work on information theory was done
by communication engineers who were concerned with the effect of
thermal noise on bandwidth. Of course engineers specialize more
narrowly than physicists do, so within different fields of engineering
there are different terminologies and different measurement
methods for things that are unified in basic physics, e.g. there
are engineers who specialize in magnetism and who seldom need to
reflect that it is part of EM, there are others who specialize in
RF and don't worry about "static" fields.

Do you mean that engineers use experimental thermodynamics to
determine information?
>
> Evgenii

To be concrete. This is for example a paper from control

J.C. Willems and H.L. Trentelman
H_inf control in a behavioral context: The full information case
IEEE Transactions on Automatic Control
Volume 44, pages 521-536, 1999
http://homes.esat.kuleuven.be/~jwillems/Articles/JournalArticles/1999.4.pdf

The term information is there but entropy is not. Could you please explain why? Or, alternatively, could you please point to papers where engineers use the concept of the equivalence between entropy and information?



Evgenii,

Sure, I could give a few examples as this somewhat intersects with my line of work.

 The NIST 800-90 recommendation ( http://csrc.nist.gov/publications/nistpubs/800-90A/SP800-90A.pdf ) for random number generators is a document for engineers implementing secure pseudo-random number generators.  An example of where it is important is when considering entropy sources for seeding a random number generator.  If you use something completely random, like a fair coin toss, each toss provides 1 bit of entropy.  The formula is -log2(predictability).  With a coin flip, you have at best a .5 chance of correctly guessing it, and -log2(.5) = 1.  If you used a die roll, then each roll would provide -log2(1/6) = 2.58 bits of entropy.  The ability to measure unpredictability is necessary to ensure, for example, that predicting the random inputs that went into generating a cryptographic key is at least as difficult as brute forcing the key itself.
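
To make the arithmetic concrete, here is a tiny sketch in Python (the
128-bit key figure at the end is just my illustrative assumption, not
something taken from the NIST document):

import math

def entropy_bits(p_guess):
    # Bits of unpredictability from one sample: -log2 of the best guessing probability
    return -math.log(p_guess, 2)

print(entropy_bits(1/2))   # fair coin toss -> 1.0 bit
print(entropy_bits(1/6))   # fair die roll  -> about 2.585 bits

# To make guessing the seed at least as hard as brute forcing a 128-bit key,
# collect at least 128 bits of such entropy, e.g. about 50 fair die rolls:
print(math.ceil(128 / entropy_bits(1/6)))   # -> 50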

In addition to security, entropy is also an important concept in the field of data compression.  The amount of entropy in a given bit string represents the theoretical minimum number of bits it takes to represent the information.  If 100 bits contain 100 bits of entropy, then there is no compression algorithm that can represent those 100 bits with fewer than 100 bits.  However, if a 100 bit string contains only 50 bits of entropy, you could compress it to 50 bits.  For example, let's say you had 100 coin flips from an unfair coin that comes up heads 90% of the time.  Each flip carries -(0.9 log2 0.9 + 0.1 log2 0.1) = 0.469 bits of entropy.  Thus, a sequence of 100 flips of this biased coin could in principle be represented with about 47 bits; there are only about 47 bits of information / entropy contained in that 100 bit long sequence.
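
A short sketch of that calculation, using only the standard Shannon
formula (nothing specific to any particular compressor):

import math

def shannon_entropy_bits(p_heads):
    # Expected bits per flip of a biased coin: H = -p log2 p - (1-p) log2 (1-p)
    h = 0.0
    for p in (p_heads, 1.0 - p_heads):
        if p > 0.0:
            h -= p * math.log(p, 2)
    return h

per_flip = shannon_entropy_bits(0.9)
print(per_flip)          # about 0.469 bits per flip
print(100 * per_flip)    # about 47 bits: the theoretical floor for 100 such flips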

Jason


Evgenii Rudnyi

unread,
Feb 3, 2012, 2:26:48 PM2/3/12
to everyth...@googlegroups.com
On 03.02.2012 00:14 Jason Resch said the following:
>> http://homes.esat.kuleuven.be/~jwillems/Articles/JournalArticles/1999.4.pdf

Jason,

Sorry for being unclear. In my statement I meant the thermodynamic
entropy. No doubt, in information theory engineers, starting with
Shannon, use the informational entropy. Yet I wanted to point out that I
have not seen engineering work in which engineers employ the equivalence
between the thermodynamic entropy and the informational entropy.

Evgenii

Evgenii Rudnyi

unread,
Feb 3, 2012, 2:50:40 PM2/3/12
to everyth...@googlegroups.com
On 02.02.2012 22:18 Russell Standish said the following:

I guess that you have never done a lab in experimental thermodynamics.
There are classical experiments in which people measure the heat of combustion,
heat capacity, equilibrium pressure, or equilibrium constants and then
determine the entropy. If you do it, you see that you can measure the
entropy the same way as other properties; there is no difference. A good
example to this end is the JANAF Thermochemical Tables (Joint Army-Navy-Air
Force Thermochemical Tables). You will find a pdf here

http://www.nist.gov/data/PDFfiles/jpcrdM9.pdf

It is about 230 Mb, but I guess it is doable to download it. Please open
it and explain what the difference is between the tabulated entropy and
the other properties there. How would your personal viewpoint on a thermodynamic
system influence the numerical values of the entropy tabulated in
JANAF? What is the difference with mass or length? I do not see it.

You see, the JANAF Tables were started by the military. They needed them to
compute, for example, the combustion process in rockets, and they have been
successful. What part of a rocket, then, is context dependent?

This is the main problem with the books on entropy and information: they
do not consider thermodynamic tables and they do not work out simple
thermodynamic examples. For example, let us consider the following problem:

-----------------------------------------------
Problem. Given temperature, pressure, and initial number of moles of
NH3, N2 and H2, compute the equilibrium composition.

To solve the problem one should find thermodynamic properties of NH3, N2
and H2 for example in the JANAF Tables and then compute the equilibrium
constant.

From thermodynamics tables (all values are molar values for the
standard pressure 1 bar, I have omitted the symbol o for simplicity but
it is very important not to forget it):

Del_f_H_298(NH3), S_298(NH3), Cp(NH3), Del_f_H_298(N2), S_298(N2),
Cp(N2), Del_f_H_298(H2), S_298(H2), Cp(H2)

2NH3 = N2 + 3H2

Del_H_r_298 = Del_f_H_298(N2) + 3 Del_f_H_298(H2) - 2 Del_f_H_298(NH3)

Del_S_r_298 = S_298(N2) + 3 S_298(H2) - 2 S_298(NH3)

Del_Cp_r = Cp(N2) + 3 Cp(H2) - 2 Cp(NH3)

To make life simple, I will assume below that Del_Cp_r = 0, but it is
not a big deal to extend the equations to include heat capacities as well.

Del_G_r_T = Del_H_r_298 - T Del_S_r_298

Del_G_r_T = - R T ln Kp

When Kp, the total pressure, and the initial number of moles are given, it is
rather straightforward to compute the equilibrium composition (a rough
numerical sketch follows right after this problem). If you need help,
please just let me know.
-----------------------------------------------
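
And here is that rough numerical sketch in Python, for a starting mixture
of pure NH3 at 1 bar and 298.15 K. The thermochemical numbers below are
approximate textbook values that I am using only as stand-ins for the
JANAF entries; please look the real ones up in the tables.

-----------------------------------------------
import math

R = 8.314  # J/(mol K)

# Stand-ins for the JANAF entries at 298.15 K (approximate values, check the tables):
# Del_f_H_298 in J/mol, S_298 in J/(mol K), standard pressure 1 bar.
Del_f_H_298 = {"NH3": -45900.0, "N2": 0.0, "H2": 0.0}
S_298       = {"NH3": 192.8, "N2": 191.6, "H2": 130.7}

# 2NH3 = N2 + 3H2, with Del_Cp_r = 0 as assumed above
Del_H_r_298 = Del_f_H_298["N2"] + 3*Del_f_H_298["H2"] - 2*Del_f_H_298["NH3"]
Del_S_r_298 = S_298["N2"] + 3*S_298["H2"] - 2*S_298["NH3"]

def Kp(T):
    Del_G_r_T = Del_H_r_298 - T * Del_S_r_298
    return math.exp(-Del_G_r_T / (R * T))

def equilibrium_extent(T, P, n_NH3=1.0, n_N2=0.0, n_H2=0.0):
    # Find the extent of reaction x by bisection; P is the total pressure in bar.
    K = Kp(T)

    def f(x):
        nh3, n2, h2 = n_NH3 - 2*x, n_N2 + x, n_H2 + 3*x
        n_tot = nh3 + n2 + h2
        # Kp = (p_N2 * p_H2^3) / p_NH3^2 with p_i = (n_i / n_tot) * P
        return (n2 * h2**3) / (nh3**2) * (P / n_tot)**2 - K

    lo, hi = 1e-12, n_NH3 / 2.0 - 1e-12   # keep the amount of NH3 positive
    for _ in range(200):                  # f is monotone in x on this interval
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

x = equilibrium_extent(T=298.15, P=1.0)
print("Kp(298.15 K) =", Kp(298.15))                   # on the order of 1e-6
print("n(NH3) =", 1 - 2*x, "n(N2) =", x, "n(H2) =", 3*x)
# With these stand-in numbers only a couple of percent of the NH3 decomposes.
-----------------------------------------------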

So, the entropy is there. What is context dependent here? Where is the
difference with mass and length?

Evgenii

Evgenii Rudnyi

unread,
Feb 3, 2012, 2:56:10 PM2/3/12
to everyth...@googlegroups.com
On 02.02.2012 22:35 Russell Standish said the following:

> On Wed, Feb 01, 2012 at 09:17:41PM +0100, Evgenii Rudnyi wrote:
>> On 29.01.2012 23:06 Russell Standish said the following:
>>>
>>> Absolutely! But at zero kelvin, the information storage capacity
>>> of the device is precisely zero, so cooling only works to a
>>> certain point.
>>>
>>
>> I believe that you have mentioned once that information is
>> negentropy. If yes, could you please comment on that? What
>> negentropy would mean?
>
> Scheodinger first pointed out that living systems must export
> entropy, and coined the term "negative entropy" to refer to this.
> Brillouin shortened this to negentropy.
>
> The basic formula is S_max = S + I.
>
> S_max is the maximum possible value for entropy to take - the value
> of entropy at thermodynamic equilibrium for a microcanonical
> ensemble. S is the usual entropy, which for non-equilibrium systems
> will be typically lower than S_max, and even for equilibrium systems
> can be held lower by physical constraints. I is the difference, and
> this is what Brillouin called negentropy. It is an information - the
> information encoded in that state.
>
> Try looking up http://en.wikipedia.org/wiki/Negentropy

Could you please explain how the negentropy is related to experimental
thermodynamics? You will find in the previous message the link to the
JANAF tables and a basic thermodynamic problem. Could you please
demonstrate how the negentropy will help there?

>
>>
>> In general, I do not understand what does it mean that information
>> at zero Kelvin is zero. Let us take a coin and cool it down. Do
>> you mean that the text on the coin will disappear? Or you mean that
>> no one device can read this text at zero Kelvin?
>>
>
> I vaguely remembered that S_max=0 at absolute zero. If it were, then
> both S and I must be zero, because these are all nonnegative
> quantities. But http://en.wikipedia.org/wiki/Absolute_zero states
> only that entropy is at a minimum, not stricly zero. In which case,
> I withdraw that comment.
>
> Cheers

First, we must not forget the Third Law, which states that the change
in entropy in any reaction, as well as its derivatives, goes to zero as the
temperature goes to zero Kelvin.

In this respect your question is actually nice, as now, I believe, we
see that it is possible to have a case in which the information capacity
is larger than the number of physical states.

Evgenii

Russell Standish

unread,
Feb 5, 2012, 4:46:30 PM2/5/12
to everyth...@googlegroups.com
On Fri, Feb 03, 2012 at 08:56:10PM +0100, Evgenii Rudnyi wrote:
>
> First, we have not to forget the Third Law that states that the
> change in entropy in any reaction, as well its derivatives, goes to
> zero as the temperatures goes to zero Kelvin.
>
> In this respect your question is actually nice, as now, I believe,
> we see that it is possible to have a case when the information
> capacity will be more than the number of physical states.
>
> Evgenii

How so?

Russell Standish

unread,
Feb 5, 2012, 5:05:55 PM2/5/12
to everyth...@googlegroups.com

The context is there - you will just have to look for it. I rather
suspect that the use of these tables refers to homogeneous bulk samples of
the material, in thermal equilibrium with a heat bath at some given
temperature.

If we were to take you at face value, we would have to conclude that
entropy is ill-defined in nonequilibrium systems.

More to the point - consider milling whatever material you have chosen
into small particles. Then consider what happens to a container of the
stuff in the Earth's gravity well, compared with the microgravity
situation on the ISS. In the former, the stuff forms a pile on the
bottom of the container - in the latter, the stuff will be more or
less uniformly distributed throughout the container's volume. In the
former case, shaking the container will flatten the pile - but at all
stages the material is in thermal equilibrium.

In your "thermodynamic context", the entropy is the same
throughout. It only depends on bulk material properties, and
temperature. But most physicists would say that the milled material is
in a higher entropy state in microgravity, and that shaking the pile
in Earth's gravity raises the entropy.

Furthermore, let's assume that the particles are milled in the form of
tiny "Penrose replicators" (named after Lionel Penrose, Roger's
dad). When shaken, these particles stick together, forming quite
specific structures that replicate, entraining all the replicators in
the material. (http://docs.huihoo.com/reprap/Revolutionary.pdf).

Most physicists would say that shaking a container of Penrose
replicators actually reduces the system's entropy. Yet, the
thermodynamic entropy of the JANAF context does not change, as that
only depends on bulk material properties.

We can follow your line of thinking and have a word, entropy, that is
only useful in certain contexts; then we'll need to make up a
different word for other contexts. Alternatively, we can have a word
that applies over all macroscopic contexts and explicitly qualify
what that context is. The underlying concept is the same in all cases,
though. It appears to me that standard scientific usage has settled on
using the same word for that concept, rather than coining different words
to describe the same concept in all the possible different contexts
that there are.

Jason Resch

unread,
Feb 6, 2012, 11:55:21 AM2/6/12
to everyth...@googlegroups.com
Informational laws and physical laws are, in my mind, closely
related. Laws related to information seem to supersede physical laws.
For example, the impossibility of encoding information in fewer
symbols than allowed, or of sending more over a channel in a given time
period than allowed. There is also a "conservation" of information: it is
apparently indestructible. There is a minimum physical energy
expenditure associated with irreversible computation, e.g. setting a
memory register from 1 to 0. Other "informational laws" prevent any
compression algorithm from having a net decrease in size when
considered over the set of all possible inputs. You can also do
really cool things with information, such as forward error correction:
a file of size 1 MB can be encoded to 1.5 MB. Then this encoded file
can be split into 15 equally sized pieces. The cool part is that any
10 of these pieces (corresponding to 1 MB of information) may be used
to recover the entire original file. Any less than 1 MB worth of
pieces is insufficient.
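
For what it's worth, here is a toy sketch of that kind of k-of-n recovery
in Python, using polynomial interpolation over a small prime field. It is
only an illustration of the idea (a miniature, systematic Reed-Solomon-style
erasure code), not production code, and all names in it are my own:

P = 257  # a prime just above the byte range; each symbol is an integer 0..255

def _eval_lagrange(points, x):
    # Value at x of the unique polynomial through the given (xi, yi) points, mod P.
    total = 0
    for xi, yi in points:
        num, den = 1, 1
        for xj, _ in points:
            if xj != xi:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P  # Fermat inverse of den
    return total

def encode(data, n):
    # Spread k data symbols over n shares; any k shares recover the data.
    k = len(data)
    base = list(enumerate(data))                 # shares 0..k-1 carry the data itself
    return [(x, _eval_lagrange(base, x)) for x in range(n)]

def decode(shares, k):
    # Rebuild the k original symbols from any k (index, value) shares.
    pts = shares[:k]
    return [_eval_lagrange(pts, x) for x in range(k)]

# 10 data symbols expanded to 15 shares (the 1 MB -> 1.5 MB ratio above, in miniature):
data = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]
shares = encode(data, 15)
surviving = shares[5:]                  # lose any 5 of the 15 pieces
print(decode(surviving, 10) == data)    # -> True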

Jason

On Feb 5, 2012, at 3:46 PM, Russell Standish <li...@hpcoders.com.au>
wrote:

1Z

unread,
Feb 6, 2012, 12:03:13 PM2/6/12
to Everything List


On Feb 6, 4:55 pm, Jason Resch <jasonre...@gmail.com> wrote:
> Informational laws and physical laws are, in my mind, closely
> related.  Laws related to information seem to supercede physical law.
> For example,  the impossibility of encoding information in fewer
> symbols or trying to send more over a channel in a given time period,
> than allowed.

Those transcend physics inasmuch as they are mathematical.

> There is also a "conservation" of information.  It is
> apparently industrictable.

Is there? If there is, it is a physical law, and AFAIK it is hotly
debated.

> There is a minimum physical energy
> expenditure associate with irreversible computation.  E.g. Setting a
> memory register from 1 to 0.  Other "informational laws", prevent any
> compression algorithm from having any net decrease in size when
> considered over the set of all possible inputs.  You can also do
> really cool things with information, such as forward error correction:
> a file of size 1 mb can be encoded to 1.5 mb.  Then this encoded file
> can be split into 15 equally sized pieces.  The cool part is that any
> 10 of these pieces (corresponding to 1 mb of information) may be used
> to recover the entire original file.  Any less than 1 mb worth of
> pieces is insufficient.
>
> Jason

You "information laws" seem to have mixed origins.
>
> On Feb 5, 2012, at 3:46 PM, Russell Standish <li...@hpcoders.com.au>
> wrote:
>
> > On Fri, Feb 03, 2012 at 08:56:10PM +0100, Evgenii Rudnyi wrote:
>
> >> First, we have not to forget the Third Law that states that the
> >> change in entropy in any reaction, as well its derivatives, goes to
> >> zero as the temperatures goes to zero Kelvin.
>
> >> In this respect your question is actually nice, as now, I believe,
> >> we see that it is possible to have a case when the information
> >> capacity will be more than the number of physical states.
>
> >> Evgenii
>
> > How so?

Evgenii Rudnyi

unread,
Feb 6, 2012, 2:20:53 PM2/6/12
to everyth...@googlegroups.com
On 05.02.2012 22:46 Russell Standish said the following:

> On Fri, Feb 03, 2012 at 08:56:10PM +0100, Evgenii Rudnyi wrote:
>>
>> First, we have not to forget the Third Law that states that the
>> change in entropy in any reaction, as well its derivatives, goes
>> to zero as the temperatures goes to zero Kelvin.
>>
>> In this respect your question is actually nice, as now, I believe,
>> we see that it is possible to have a case when the information
>> capacity will be more than the number of physical states.
>>
>> Evgenii
>
> How so?
>

Take a coin and cool it to zero Kelvin. This was my question, which you
have not answered yet: do you assume that the text on the coin will be
destroyed during cooling?

Evgenii

Evgenii Rudnyi

unread,
Feb 6, 2012, 2:36:44 PM2/6/12
to everyth...@googlegroups.com
On 05.02.2012 23:05 Russell Standish said the following:

I do not get your point. Do you mean that sometimes surface effects
can be important? Every thermodynamicist knows this. However, I do not
understand your problem. The thermodynamics of surface phenomena is well
established, and to work with it you need to extend the JANAF Tables with
other tables. What is the problem?

It would be good if you defined more precisely what you mean by context
dependent. As far as I remember, you used this term with respect to the
informational capacity of some modern information carrier and its number
of physical states. I would suggest staying with this example as the
definition of context dependence. Otherwise, it does not make much sense.

> If we were to take you at face value, we would have to conclude that
> entropy is ill-defined in nonequlibrium systems.

The entropy is well-defined for a nonequilibrium system as soon as one
can use a local temperature. There are some rare occasions where local
temperature is ambiguous, for example in a plasma, where one defines
different temperatures for electrons and molecules. Yet, once the two
temperatures are defined, the entropy becomes well-defined again.

> More to the point - consider milling whatever material you have
> chosen into small particles. Then consider what happens to a
> container of the stuff in the Earth's gravity well, compared with the
> microgravity situation on the ISS. In the former, the stuff forms a
> pile on the bottom of the container - in the latter, the stuff will
> be more or less uniformly distributed throughout the containers
> volume. In the former case, shaking the container will flatten the
> pile - but at all stages the material is in thermal equilibrium.
>
> In your "thermodynamic context", the entropy is the same throughout.

No, it is not. As I have mentioned, in this case one just must consider
surface effects.

> It only depends on bulk material properties, and temperature. But
> most physicists would say that the milled material is in a higher
> entropy state in microgravity, and that shaking the pile in Earth's
> gravity raises the entropy.

> Furthermore, lets assume that the particles are milled in the form
> of tiny "Penrose replicators" (named after Lionel Penrose, Roger's
> dad). When shaken, these particles stick together, forming quite
> specific structures that replicate, entraining all the replicators
> in the material. (http://docs.huihoo.com/reprap/Revolutionary.pdf).
>
> Most physicists would say that shaking a container of Penrose
> replicators actually reduces the system's entropy. Yet, the
> thermodynamic entropy of the JNAF context does not change, as that
> only depends on bulk material properties.

We are again at the definition of context dependent. What you are saying
now is that when you have new physical effects, it is necessary to take them
into account. What does that have to do with your example, in which information
on an information carrier was context dependent?

Evgenii
