
Information: a basic physical quantity or rather emergence/supervenience phenomenon


Evgenii Rudnyi

Jan 15, 2012, 3:54:00 PM
to everyth...@googlegroups.com
On 14.01.2012 08:21 John Clark said the following:
> On Thu, Jan 12, 2012 Craig Weinberg<whats...@gmail.com> wrote:

> For heavens sake, I went into quite a lot of detail about how the
> code is executed so that protein gets made, and it could not be more
> clear that the cell factory contains digital machines.
>
>> They are not information.
>>
>
> According to you nothing is information and that is one reason it is
> becoming increasingly difficult to take anything you say seriously.

I should say that I also have difficulty with the term information. A
question, for example, would be whether information belongs to physics or
not. Some physicists say that information is related to the entropy and as
such is a basic physical quantity. I personally do not buy it, as
thermodynamics, as it was designed, had nothing to do with information,
and information as such brings nothing to help solve thermodynamic
problems (more on this in [1]).

Let us consider, for example, a conventional thermodynamic problem:
improving the efficiency of a motor. Is the information concept helpful
for solving this problem? If we look at modern motors, we see that
nowadays they work together with controllers that allow us to drive the
efficiency toward the thermodynamic limit. The term information is indeed
helpful for developing the controller, but what about the thermodynamic
limit of the motor? Does information help here? In my view, it does not.

In Gray's book on consciousness (Consciousness: Creeping up on the Hard
Problem) there is an interesting discussion of whether physics is enough
to explain biology. Gray's answer is yes, provided we add the cybernetics
laws and evolution. Let me leave evolution aside and discuss the
cybernetics laws only, as this is exactly where, I think, information
comes into play. A good short video from the Artificial Intelligence
class that I have recently attended would be a good introduction (an
intelligent agent sensing external information and then acting):

http://www.youtube.com/watch?v=cx3lV07w-XE

Thus, the question would be about the relationship between physics and
the cybernetics laws. When we consider the Equation of Everything, are the
cybernetics laws already there, or do we still need to introduce them
separately? One possible answer would be that the cybernetics laws
emerge from or supervene on the laws of physics. I, however, do not
understand what this means. It probably has something to do with a
transition from quantity to quality, but I do not understand how that
happens either. For me, it remains magic.

Let me repeat a series of physical objects discussed recently
(see also [2][3]):

1) A rock;
2) A ballcock in the toilet;
3) A self-driving car;
4) A living cell.

Where do we have the cybernetics laws (information) and where not? Can
physics describe these objects without the cybernetics laws? What do
emergence and supervenience mean along this series? Any ideas?

Evgenii

[1] http://blog.rudnyi.ru/2010/12/entropy-and-artificial-life.html
[2] http://blog.rudnyi.ru/2011/01/perception-feedback-and-qualia.html
[3] http://blog.rudnyi.ru/2011/02/rock-and-information.html

Craig Weinberg

Jan 15, 2012, 8:02:19 PM
to Everything List
On Jan 15, 3:54 pm, Evgenii Rudnyi <use...@rudnyi.ru> wrote:
> On 14.01.2012 08:21 John Clark said the following:
>  > On Thu, Jan 12, 2012  Craig Weinberg<whatsons...@gmail.com>  wrote:
>
> …
>
>  > For heavens sake, I went into quite a lot of detail about how the
>  > code is executed so that protein gets made, and it could not be more
>  > clear that the cell factory contains digital machines.
>  >
>  >> They are not information.
>  >>
>  >
>  > According to you nothing is information and that is one reason it is
>  > becoming increasingly difficult to take anything you say seriously.
>
> I should say that I also have difficulty with the term information. A
> question would for example if information belongs to physics or not.
> Some physicists say that information is related to the entropy and as
> such it is a basic physical quantity. I personally do not buy it, as
> thermodynamics, as it has been designed, had nothing to do with
> information and information as such brings nothing to help to solve
> thermodynamics problem (more to this end in [1]).

Yes! The word information in my opinion only applies to an agent which
can be informed. Without such an agent, information cannot exist, even
as a potential. Once you have an agent that has experience, the
character of that experience can be modified by the agent being
'informed' by an experience which changes how that agent perceives,
responds, and acts in the future. Information is a process which
begins and ends with an agent's experience.

>
> Let us consider for example a conventional thermodynamic problem:
> improving efficiency of a motor. Is the information concept is helpful
> to solve this problem? If we look at modern motors, then we see that
> nowadays they are working together with controllers that allows us to
> drive the efficiency to the thermodynamic limit. The term information is
> helpful indeed to develop a controller but what about the thermodynamic
> limit of a motor? Does information helps here? In my view, not.
>
> In the Gray's book on consciousness (Consciousness: Creeping up on the
> Hard Problem.) there is an interesting statement on if physics is enough
> to explain biology. Gray's answer is yes provided we add cybernetics
> laws and evolution. Let me leave evolution aside and discuss the
> cybernetics laws only as this is exactly where, I think, information
> comes into play. A good short video from the Artificial Intelligence
> Class that I have recently attended would be a good introduction (an
> intelligent agent sensing external information and then acting):
>
> http://www.youtube.com/watch?v=cx3lV07w-XE

Nice. His concept of a 'perception-action cycle' is what I call
sensorimotivation. The problem is that he relies on a third-person
structure called a 'control policy'. This control-policy concept that
maps sensors to actuators is entirely appropriate for programming
mechanisms (since they can't program themselves unless they are made
of self-programming materials like living cells). The problem, as I
understand it, is that in neurological agents the sensor and the
actuator are the same thing, so that no control policy is needed. In
the case of a human nervous system, it functions as a whole system
with afferent and efferent nerves being organically specialized
divisions which make up the sensorimotive capacity of one human being.

With a rock, you have a very low sensorimotive development. It knows
how to respond to its environment in terms of heat and pressure,
velocity, fracture, etc. With a ballcock in a toilet you have separate
parts, each with a very low, rock-like sensorimotive capacity. If the
parts were literally parts of the same thing like our afferent and
efferent nerves are part of a nervous system, then you would have
something with slightly less primitive characteristics. Maybe on par
with a bubble of oil in water. The problem is that we overlook the
fact that the assembly inside the toilet is only our human reading of
these attached separate parts. The parts don't know that they are
attached. The handle doesn't know the reason that the ballcock is
moving it up and down. Not the case with a nervous system. It knows
what the body it is a part of is doing. It all came from one single
cell, not 12 different factories. It's a completely different thing in
reality, but our perception is hard to question so fundamentally. It's
like trying to not read these words as English.

>
> Thus, the question would be about the relationship between physics and
> cybernetics laws. When we consider the Equation of Everything, are the
> cybernetics laws already there or we still need to introduce them
> separately? One of possible answers would be that the cybernetics laws
> emerge or supervene on the physics laws. I however does not understand
> what this means. It probably has something to do with a transition
> between quantity and quality, but I do not understand how it happens
> either. For myself, it remains a magic.
>
> Let me repeat a series from physical objects discussed already recently
> (see also [2][3]):
>
> 1) A rock;
> 2) A ballcock in the toilet;
> 3) A self-driving car;
> 4) A living cell.

A self-driving car is the same as the ballcock. An assembly of dumb
parts which we program to simulate what seems like (trivial)
intelligence to us. In reality it has no more intelligence than
watching a cartoon of living cars. A living cell is nothing like any
of the other examples. A dead cell would be comparable to a rock, but
a living cell has a sensorimotive capacity which cannot be reduced
beneath the cellular level. It is a biological atom.

>
> Where do we have the cybernetics laws (information) and where not? Can
> physics describe these objects without the cybernetics laws? What
> emergence and superveniece mean along this series? Any idea?

We have cybernetics laws only where we can control the behavior of
matter. They aren't very useful for understanding our experience.
Physics can only describe the components of the objects, but the
function of the assemblies of objects is subject to interpretation
rather than physical law. A self-driving car is just electronic parts
that happen to be in a car which we understand to be driving 'itself',
but is actually just executing a program based on an abstract model of
driving.

Emergence is useful only in the context of self-evident models. A
triangle can emerge from three points, a square from four, but the
smell of cabbage cannot emerge from any quantity of points. Cybernetic
laws supervene on sensorimotive pattern recognition, not the other way
around. A toilet ballcock can only perceive the mechanical forces
being applied to its various physical parts. It has no capacity to
recognize its extended context and thus can't ever know if it's
broken or try to fix itself like a living cell can.

Craig

John Clark

Jan 18, 2012, 12:47:12 PM
to everyth...@googlegroups.com
On Sun, Jan 15, 2012 at 3:54 PM, Evgenii Rudnyi <use...@rudnyi.ru> wrote:

 " Some physicists say that information is related to the entropy"

That is incorrect: ALL physicists say that information is related to entropy. There are quite a number of definitions of entropy; one I like, although not as rigorous as some, does convey the basic idea: entropy is a measure of the number of ways the microscopic structure of something can be changed without changing the macroscopic properties. Thus, the living human body has very low entropy because there are relatively few changes that could be made in it without a drastic change in macroscopic properties, like being dead; a bucket of water has a much higher entropy because there are lots of ways you could change the microscopic position of all those water molecules and it would still look like a bucket of water; cool the water and form ice and you have less entropy because the molecules line up into an orderly lattice so there are fewer changes you could make. The ultimate in high entropy objects is a Black Hole because, whatever is inside one, from the outside any Black Hole can be completely described with just 3 numbers: its mass, spin and electrical charge.
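
As a toy sketch of that microstate-counting picture (an illustration only,
not part of the original post), Boltzmann's S = k_B ln W can be computed
for a system of N coins whose macrostate is the number of heads:

import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(N, heads):
    # W = number of coin arrangements (microstates) compatible with the
    # macrostate "this many heads out of N coins"
    W = math.comb(N, heads)
    return k_B * math.log(W)

print(boltzmann_entropy(100, 100))  # all heads: W = 1, so S = 0
print(boltzmann_entropy(100, 50))   # half heads: W ~ 1e29, S ~ 9.2e-22 J/K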

  John K Clark 


Evgenii Rudnyi

Jan 18, 2012, 2:13:07 PM
to everyth...@googlegroups.com
On 18.01.2012 18:47 John Clark said the following:

If you look around you may still find species of scientists who are
still working with classical thermodynamics (search, for example, for
CALPHAD). Whether you refer to them as physicists or not is your
choice. Anyway, in experimental thermodynamics people determine
entropies, for example from the CODATA tables

http://www.codata.org/resources/databases/key1.html

S° (298.15 K)
J K-1 mol-1

Ag cr 42.55 ± 0.20
Al cr 28.30 ± 0.10

Do you mean that 1 mole of Ag has more information than 1 mole of Al at
298.15 K?
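
If one did take the equivalence literally, it would amount to a unit
conversion, I = S / (k_B ln 2) bits; a rough sketch, for illustration only,
using the CODATA values quoted above:

import math

k_B = 1.380649e-23  # J/K

def molar_entropy_to_bits(S):
    # literal reading of "entropy = information":
    # S in J K^-1 mol^-1  ->  bits per mole
    return S / (k_B * math.log(2))

print(f"Ag: {molar_entropy_to_bits(42.55):.2e} bits/mol")  # ~4.4e24
print(f"Al: {molar_entropy_to_bits(28.30):.2e} bits/mol")  # ~3.0e24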

Also remember that at constant volume dS = (Cv/T) dT and dU = Cv dT. If
the entropy is information then its derivative must be related to
information as well. Hence Cv must be related to information. This
however means that the energy is also somehow related to information.

Finally, the entropy is defined by the Second Law, and it would be best
to stick to this definition. Only in that case is it possible to
understand what we are talking about.

Evgenii
--
http://blog.rudnyi.ru

Russell Standish

Jan 18, 2012, 5:42:43 PM
to everyth...@googlegroups.com
> Ag cr 42.55 ± 0.20
> Al cr 28.30 ± 0.10

>
> Do you mean that 1 mole of Ag has more information than 1 mole of Al
> at 298.15 K?
>
> Also remember that at constant volume dS = (Cv/T) dT and dU = CvdT.
> If the entropy is information then its derivative must be related to
> information as well. Hence Cv must be related to information. This
> however means that the energy also somehow related to information.
>
> Finally, the entropy is defined by the Second Law and the best would
> be to stick to this definition. Only in this case, it is possible to
> understand what we are talking about.
>
> Evgenii
> --
> http://blog.rudnyi.ru
>

Evgenii, while you may be right that some physicists (mostly
experimentalists) work in thermodynamics without recourse to the
notion of information, and chemists even more so, it is also true that
the modern theoretical understanding of entropy (and indeed
thermodynamics) is information-based.

This trend really became mainstream with Landauer's work demonstrating
thermodynamic limits of information processing in the 1960s, which
turned earlier speculations by the likes of Schroedinger and Brillouin
into something that couldn't be ignored, even by experimentalists.

This trend of an information basis to physics has only accelerated
in my professional lifetime - I've seen people like Hawking discuss
information processing of black holes, and we've seen concepts like the
Bekenstein bound linking the geometry of space to information capacity.

David Deutsch is surely backing a winning horse in pointing out that
algorithmic information theory must be a foundational strand of the
"fabric of reality".

Cheers

--

----------------------------------------------------------------------------
Prof Russell Standish Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics hpc...@hpcoders.com.au
University of New South Wales http://www.hpcoders.com.au
----------------------------------------------------------------------------

meekerdb

Jan 19, 2012, 12:37:13 AM
to everyth...@googlegroups.com
> Ag cr 42.55 ± 0.20
> Al cr 28.30 ± 0.10

>
> Do you mean that 1 mole of Ag has more information than 1 mole of Al at 298.15 K?

Yes, it has more internal degrees of freedom, so it takes the addition of
more energy to increase the ones we measure as temperature.

Brent

Craig Weinberg

Jan 19, 2012, 10:21:25 AM
to Everything List
This suggests to me that a molecule of DNA belonging to a kangaroo
could have no more information than the same molecule with the primary
sequence scrambled into randomness or 'blanked out' with a single
repeating A-T base pair. That would seem to make this definition of
information the exact opposite of the colloquial meaning of the term.
A blank hard drive could have more information than one full of billions
of documents if the platters were at different temperatures?

Craig

meekerdb

Jan 19, 2012, 12:36:48 PM
to everyth...@googlegroups.com

That's because the colloquial meaning of the term takes into account the
environment and which form of information can be causally effective.

Brent

Evgenii Rudnyi

Jan 19, 2012, 2:03:41 PM
to everyth...@googlegroups.com
Russell,

I know that many physicists identify the entropy with information.
Recently I had a nice discussion on biotaconv and people pointed out
that presumably Edwin T. Jaynes was the first to make such a connection
(Information theory and statistical mechanics, 1957). Google Scholar
shows that his paper has been cited more than 5000 times, which is
impressive and indeed shows that this is, in a way, mainstream.

I have studied Jaynes's papers but I have been stuck with, for example,

“With such an interpretation the expression “irreversible process”
represents a semantic confusion; it is not the physical process that is
irreversible, but rather our ability to follow it. The second law of
thermodynamics then becomes merely the statement that although our
information as to the state of a system may be lost in a variety of
ways, the only way in which it can be gained is by carrying out further
measurements.”

“It is important to realize that the tendency of entropy to increase is
not a consequence of the laws of physics as such, … . An entropy
increase may occur unavoidably, due to our incomplete knowledge of the
forces acting on a system, or it may be entirely voluntary act on our part.”

This is above my understanding. As I have mentioned, I do not buy it;
I still consider the entropy as it has been defined by, for example, Gibbs.

Basically I do not understand what the term information then brings. One
can certainly state that information is the same as the entropy (we are
free with definitions after all). Yet I miss the meaning of that. Let me
put it this way: we have the thermodynamic entropy and then the
informational entropy as defined by Shannon. The first is used to design a
motor and the second to design a controller. Now let us suppose that
these two entropies are the same. What does this change in the design of a
motor and a controller? In my view, nothing.

By the way, have you seen the answer to my question:

>> Also remember that at constant volume dS = (Cv/T) dT and dU =
>> CvdT. If the entropy is information then its derivative must be
>> related to information as well. Hence Cv must be related to
>> information. This however means that the energy also somehow
>> related to information.

If the entropy is the same as information, then through the derivatives
all thermodynamic properties are related to information as well. I am
not sure this makes sense with respect, for example, to designing a
self-driving car.

I am aware of works that estimated the thermodynamic limit (kT) to
process information. I do not see, however, how this proves the
equivalence of information and entropy.

Evgenii

P.S. For a long time, people have identified the entropy with chaos. I
have recently read a nice book on this, Entropy and Art by Arnheim
(1971); it is really nice. One quote:

"The absurd consequences of neglecting structure but using the concept
of order just the same are evident if one examines the present
terminology of information theory. Here order is described as the
carrier of information, because information is defined as the opposite
of entropy, and entropy is a measure of disorder. To transmit
information means to induce order. This sounds reasonable enough. Next,
since entropy grows with the probability of a state of affairs,
information does the opposite: it increases with its improbability. The
less likely an event is to happen, the more information does its
occurrence represent. This again seems reasonable. Now what sort of
sequence of events will be least predictable and therefore carry a
maximum of information? Obviously a totally disordered one, since when
we are confronted with chaos we can never predict what will happen next.
The conclusion is that total disorder provides a maximum of information;
and since information is measured by order, a maximum of order is
conveyed by a maximum of disorder. Obviously, this is a Babylonian
muddle. Somebody or something has confounded our language."

--
http://blog.rudnyi.ru


On 18.01.2012 23:42 Russell Standish said the following:

Evgenii Rudnyi

Jan 19, 2012, 2:06:38 PM
to everyth...@googlegroups.com
On 19.01.2012 06:37 meekerdb said the following:

> On 1/18/2012 11:13 AM, Evgenii Rudnyi wrote:

...

>> If you look around you may still find species of scientists who
>> still are working with classical thermodynamics (search for example
>> for CALPHAD). Well, if you refer to them as physicists or not, it
>> is your choice. Anyway in experimental thermodynamics people
>> determine entropies, for example from CODATA tables
>>
>> http://www.codata.org/resources/databases/key1.html
>>
>> S ° (298.15 K) J K-1 mol-1
>>
>> Ag cr 42.55 ± 0.20 Al cr 28.30 ± 0.10
>>
>> Do you mean that 1 mole of Ag has more information than 1 mole of
>> Al at 298.15 K?
>
> Yes, it has more internal degrees of freedom so that it takes
> addition of more energy in order to increase those we measure as
> temperature.

Could you please explain then why engineers do not use the CODATA/JANAF
Tables to find the best material to keep information?

Evgenii

meekerdb

Jan 19, 2012, 2:41:01 PM
to everyth...@googlegroups.com
On 1/19/2012 11:06 AM, Evgenii Rudnyi wrote:
> On 19.01.2012 06:37 meekerdb said the following:
>> On 1/18/2012 11:13 AM, Evgenii Rudnyi wrote:
>
> ...
>
>>> If you look around you may still find species of scientists who
>>> still are working with classical thermodynamics (search for example
>>> for CALPHAD). Well, if you refer to them as physicists or not, it
>>> is your choice. Anyway in experimental thermodynamics people
>>> determine entropies, for example from CODATA tables
>>>
>>> http://www.codata.org/resources/databases/key1.html
>>>
>>> S ° (298.15 K) J K-1 mol-1
>>>
>>> Ag cr 42.55 ± 0.20 Al cr 28.30 ± 0.10
>>>
>>> Do you mean that 1 mole of Ag has more information than 1 mole of
>>> Al at 298.15 K?
>>
>> Yes, it has more internal degrees of freedom so that it takes
>> addition of more energy in order to increase those we measure as
>> temperature.
>
> Could you please explain then why engineers do not use the CODATA/JANAF Tables to find
> the best material to keep information?

Because they are interested in information that they can insert and retrieve. I once
invented write-only-memory, but it didn't sell. :-)

Brent

Evgenii Rudnyi

Jan 19, 2012, 3:34:32 PM
to everyth...@googlegroups.com
On 19.01.2012 20:41 meekerdb said the following:

> On 1/19/2012 11:06 AM, Evgenii Rudnyi wrote:
>> On 19.01.2012 06:37 meekerdb said the following:
>>> On 1/18/2012 11:13 AM, Evgenii Rudnyi wrote:
>>
>> ...
>>
>>>> If you look around you may still find species of scientists
>>>> who still are working with classical thermodynamics (search for
>>>> example for CALPHAD). Well, if you refer to them as physicists
>>>> or not, it is your choice. Anyway in experimental
>>>> thermodynamics people determine entropies, for example from
>>>> CODATA tables
>>>>
>>>> http://www.codata.org/resources/databases/key1.html
>>>>
>>>> S ° (298.15 K) J K-1 mol-1
>>>>
>>>> Ag cr 42.55 ± 0.20 Al cr 28.30 ± 0.10
>>>>
>>>> Do you mean that 1 mole of Ag has more information than 1 mole
>>>> of Al at 298.15 K?
>>>
>>> Yes, it has more internal degrees of freedom so that it takes
>>> addition of more energy in order to increase those we measure as
>>> temperature.
>>
>> Could you please explain then why engineers do not use the
>> CODATA/JANAF Tables to find the best material to keep information?
>
> Because they are interested in information that they can insert and
> retrieve. I once invented write-only-memory, but it didn't sell. :-)

Well, but this shows that physicists and engineers mean different things
by information. It would be good, then, to distinguish them.

Evgenii

> Brent
>

John Clark

Jan 19, 2012, 4:29:58 PM
to everyth...@googlegroups.com
On Thu, Jan 19, 2012 at 10:21 AM, Craig Weinberg <whats...@gmail.com> wrote:

"This suggests to me that a molecule of DNA belonging to a kangaroo could have no more information than the same molecule with the primary sequence scrambled into randomness

That is correct, it would have the same quantity of information, but most would be of the opinion that the quality has changed.
 
or 'blanked out' with a single repeating A-T base pair.

No, if it's repeating then it would have less information, that is to say it would take less information to describe the result.
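
A rough way to see "takes less information to describe" (an illustration
only; compressed size is used here as a crude stand-in for description length):

import random, zlib

random.seed(0)
n = 10_000
scrambled = ''.join(random.choice('ACGT') for _ in range(n))  # random sequence
repeating = 'AT' * (n // 2)                                   # repeating A-T

# the compressed size approximates how much it takes to describe each sequence
print(len(zlib.compress(scrambled.encode())))  # on the order of 2,500 bytes
print(len(zlib.compress(repeating.encode())))  # a few dozen bytes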
 
"That would seem to make this definition of information the exact opposite of the colloquial meaning of the term."

That can sometimes happen because mathematics can only deal in the quantity of information, not its quality. Quality is a value judgement and changes from person to person, and mathematics does not make value judgements, but the quantity of something is objective and universal, so mathematics can talk about that. So yes, there is much more information in a bucket of water than in our DNA, but most human beings are more interested in our genes than in the astronomical number of micro-states in a bucket of water. That is my opinion too, but a bucket of water may look at it differently, and there is no disputing matters of taste. But both the bucket and I would agree on the amount of information in the DNA and in the bucket even if we disagree on which is more important.

  John K Clark

 


Craig Weinberg

Jan 19, 2012, 5:05:10 PM
to Everything List
On Jan 19, 12:36 pm, meekerdb <meeke...@verizon.net> wrote:

> That's because the colloquial meaning of the terms takes into account the environment and
> which form of information can be causally effective.

How is any one form of information more or less likely to be causally
effective than any other form? This is degenerating into pure fantasy
where information is a magical wildcard. If information cannot inform -
i.e., if a disk has been thoroughly and permanently erased, what has been
lost, if not information?

Craig

Craig Weinberg

Jan 19, 2012, 5:28:25 PM
to Everything List
On Jan 19, 4:29 pm, John Clark <johnkcl...@gmail.com> wrote:
> On Thu, Jan 19, 2012 at 10:21 AM, Craig Weinberg <whatsons...@gmail.com>wrote:
>
> "This suggests to me that a molecule of DNA belonging to a kangaroo could
>
> > have no more information than the same molecule with the primary sequence
> > scrambled into randomness
>
> That is correct, it would have the same quantity of information, but most
> would be of the opinion that the quality has changed.

Ah, so by information you mean 'not information at all'? I thought
that the whole point of information theory is to move beyond quality
into pure quantification. The reason that most would be of the opinion
that the quality has changed is the same reason that most would be of
the opinion that a person wearing no clothes is naked. Not to pick on
anyone, but tbh, the suggestion that information can be defined as not
having anything to do with the difference between order and the
absence of order is laughably preposterous in a way that would impress
both Orwell and Kafka at the same time. I think it is actually one of
the biggest falsehoods that I have ever heard.

>
> > or 'blanked out' with a single repeating A-T base pair.
>
> No, if its repeating then it would have less information, that is to say it
> would take less information to describe the result.

Of course, but how does that jibe with the notion that information is
molecular entropy? How does A-T A-T A-T or G-T G-T G-T guarantee fewer
internal degrees of freedom within a DNA molecule than A-T G-C A-T?
What if you warm it up or cool it down? It doesn't make any sense that
there would be a physical difference which corresponds to the degree
to which a genetic sequence is non-random or non-monotonous. If I
have red legos and white legos, and I build two opposite monochrome
houses and one of mixed blocks, how in the world does that affect the
entropy of the plastic bricks in any way?

>
> > "That would seem to make this definition of information the exact opposite
> > of the colloquial meaning of the term."
>
> That can sometimes happen because mathematics can only deal in the quantity
> of information not it's quality. Quality is a value judgement and changes
> from person to person and mathematics does not make value judgements, but
> the quantity of something is objective and universal so mathematics can
> talk about that. So yes, there is much more information in a bucket of
> water than in our DNA , but most human beings are more interested in our
> genes than the astronomical number of micro-states in a bucket of water.
> That is my opinion too but a bucket of water may look at it differently and
> there is no disputing matters of taste. But both the bucket and I would
> agree on the amount of information in the DNA and in the bucket even if we
> disagree on which is more important.

I see no reason to use the word information at all for this. It sounds
like you are just talking about entropy to me. The idea that a bucket
of water has more 'information' than DNA is meaningless. I'm not
drinking that Kool-Aid, sorry. If you know of any physicists who are
willing to buy my new mega information storage water buckets for only
twice the price of conventional RAID arrays though, I will gladly pay
you a commission.

Craig


meekerdb

Jan 19, 2012, 5:40:34 PM
to everyth...@googlegroups.com
On 1/19/2012 2:05 PM, Craig Weinberg wrote:
> On Jan 19, 12:36 pm, meekerdb<meeke...@verizon.net> wrote:
>
>> That's because the colloquial meaning of the terms takes into account the environment and
>> which form of information can be causally effective.
> How is one any form of information more or less likely to be causally
> effective than any other form?

Would you rather have an instruction manual in English or Urdu?

Brent

Bruno Marchal

Jan 19, 2012, 6:12:16 PM
to everyth...@googlegroups.com
On 19 Jan 2012, at 20:41, meekerdb wrote:

On 1/19/2012 11:06 AM, Evgenii Rudnyi wrote:
On 19.01.2012 06:37 meekerdb said the following:
On 1/18/2012 11:13 AM, Evgenii Rudnyi wrote:
snip

Could you please explain then why engineers do not use the CODATA/JANAF Tables to find the best material to keep information?

Because they are interested in information that they can insert and retrieve.  I once invented write-only-memory, but it didn't sell. :-)

Hmm... It might have interested the psycho-analysts, and perhaps the revisionists too. Improve your marketing strategy!

I could buy you some for my list of boring urgent tasks!

(like sending my obsolete list of boring urgent tasks to a black hole. Find a way to prevent evaporation! If that is possible).

I think that physically write-only-memory might not exist. I think the core of physics might be very symmetrical, reversible. A group probably. There is no place where you can "really" hide information for long. 

Bruno




Brent


Evgenii


Brent


Also remember that at constant volume dS = (Cv/T) dT and dU = CvdT.

If the entropy is information then its derivative must be related
to information as well. Hence Cv must be related to information.
This however means that the energy also somehow related to
information.

Finally, the entropy is defined by the Second Law and the best
would be to stick to this definition. Only in this case, it is
possible to understand what we are talking about.

Evgenii






Russell Standish

Jan 19, 2012, 11:59:42 PM
to everyth...@googlegroups.com
On Thu, Jan 19, 2012 at 08:03:41PM +0100, Evgenii Rudnyi wrote:
> Russell,
>
> I know that many physicists identify the entropy with information.
> Recently I had a nice discussion on biotaconv and people pointed out
> that presumably Edwin T. Jaynes was the first to make such a
> connection (Information theory and statistical mechanics, 1957).
> Google Scholar shows that his paper has been cited more than 5000
> times, that is impressive and it shows indeed that this is in a way
> mainstream.

Because I tend to think of "negentropy", which is really another term
for information, I tend to give priority to Schroedinger who wrote
about the topic in the early 40s. But Jaynes was certainly
instrumental in establishing the information-based foundations of
statistical physics, even before information was properly defined (it
wasn't really until the likes of Kolmogorov, Chaitin and Solomonoff in
the 60s that information was really understood).

But Landauer in the late 60s was probably the first to make physicists
really wake up to the concept of physical information.

But then, I'm not a science historian, so what would I know :).

>
> I have studied Jaynes papers but I have been stacked with for example
>

... snip ...

>
> Basically I do not understand what the term information then brings.
> One can certainly state that information is the same as the entropy
> (we are free with definitions after all). Yet I miss the meaning of
> that. Let me put it this way, we have the thermodynamic entropy and
> then the informational entropy as defined by Shannon. The first used
> to designe a motor and the second to design a controller. Now let us
> suppose that these two entropies are the same. What this changes in
> a design of a motor and a controller? In my view nothing.
>

I can well recommend Denbigh & Denbigh's book from the 80s - it's a bit
more of a modern understanding of the topic than Jaynes :)

@book{Denbigh-Denbigh87,
author = {Denbigh, K. G. and Denbigh, J.},
publisher = { Cambridge UP},
title = { Entropy in Relation to Incomplete Knowledge},
year = { 1987},
}


> By the way, have you seen the answer to my question:
>
> >> Also remember that at constant volume dS = (Cv/T) dT and dU =
> >> CvdT. If the entropy is information then its derivative must be
> >> related to information as well. Hence Cv must be related to
> >> information. This however means that the energy also somehow
> >> related to information.
>
> If the entropy is the same as information, than through the
> derivatives all thermodynamic properties are related to information
> as well. I am not sure if this makes sense in respect for example to
> design a self-driving car.
>

The information embodied in the thermodynamic state is presumably not
relevant to the design of a self-driving car. By the same token,
thermodynamic treatment (typically) discards a lot of information
useful for engineering.

> I am aware of works that estimated the thermodynamic limit (kT) to
> process information. I do not see however, how this proves the
> equivalence of information and entropy.
>
> Evgenii
>
> P.S. For a long time, people have identified the entropy with chaos.
> I have recently read a nice book to this end, Entropy and Art by
> Arnheim, 1971, it is really nice. One quote:
>

I guess this is the original meaning of chaos, not the more modern
meaning referring to "low dimension dynamical systems having strange
attractors".

> "The absurd consequences of neglecting structure but using the
> concept of order just the same are evident if one examines the
> present terminology of information theory. Here order is described
> as the carrier of information, because information is defined as the
> opposite of entropy, and entropy is a measure of disorder. To
> transmit information means to induce order. This sounds reasonable
> enough. Next, since entropy grows with the probability of a state of
> affairs, information does the opposite: it increases with its
> improbability. The less likely an event is to happen, the more
> information does its occurrence represent. This again seems
> reasonable. Now what sort of sequence of events will be least
> predictable and therefore carry a maximum of information? Obviously
> a totally disordered one, since when we are confronted with chaos we
> can never predict what will happen next.

This rather depends on whether the disorder is informationally
significant. This is context dependent. I have a discussion on this
(it relates to the Kolmogorov idea that random sequences have maximum
complexity) in my paper "On Complexity and Emergence". I also touch on
the theme in my book "Theory of Nothing", which I know you've read!

> The conclusion is that
> total disorder provides a maximum of information;

Total disorder corresponds to a maximum of entropy. Maximum entropy
minimises the amount of information.

> and since
> information is measured by order, a maximum of order is conveyed by
> a maximum of disorder. Obviously, this is a Babylonian muddle.
> Somebody or something has confounded our language."
>

I would say it is many people, rather than just one. I wrote "On
Complexity and Emergence" in response to the amount of unmitigated
tripe I've seen written about these topics.

Craig Weinberg

Jan 20, 2012, 7:42:18 AM
to Everything List
On Jan 19, 5:40 pm, meekerdb <meeke...@verizon.net> wrote:
> On 1/19/2012 2:05 PM, Craig Weinberg wrote:

> > How is one any form of information more or less likely to be causally
> > effective than any other form?
>
> Would you rather have an instruction manual in English or Urdu?

Since I tend to put instruction manuals in a drawer and never look at
them, I would rather have the Urdu one as a novelty.

What difference does it make what I would rather have though? Both the
English and Urdu manuals are equally informative or non-informative
objectively (assuming they are equivalent translations), and neither
of them is causally effective without a subjective interpreter who is
causally effective.

Craig

Evgenii Rudnyi

Jan 21, 2012, 7:25:44 AM
to everyth...@googlegroups.com
On 20.01.2012 05:59 Russell Standish said the following:

> On Thu, Jan 19, 2012 at 08:03:41PM +0100, Evgenii Rudnyi wrote:

...

>>
>> Basically I do not understand what the term information then
>> brings. One can certainly state that information is the same as the
>> entropy (we are free with definitions after all). Yet I miss the
>> meaning of that. Let me put it this way, we have the thermodynamic
>> entropy and then the informational entropy as defined by Shannon.
>> The first used to designe a motor and the second to design a
>> controller. Now let us suppose that these two entropies are the
>> same. What this changes in a design of a motor and a controller? In
>> my view nothing.
>>
>

> I can well recommend Denbigh & Denbigh's book from the 80s - it's a
> bit more of a modern understanding of the topic than Jaynes :)
>
> @book{Denbigh-Denbigh87, author = {Denbigh, K. G. and Denbigh, J.},
> publisher = { Cambridge UP}, title = { Entropy in Relation to
> Incomplete Knowledge}, year = { 1987}, }

Thanks. On biotaconv they have recommended John Avery's "Information
Theory and Evolution" but I think I have already satisfied my curiosity
with Jaynes's two papers. My personal feeling is as follows:

1) The concept of information is useless in conventional thermodynamic
problems. Let us take for example the Fe-C phase diagram

http://www.calphad.com/graphs/Metastable%20Fe-C%20Phase%20Diagram.gif

What does information have to do with the entropies of the phases in this
phase diagram? Do you mean that I will find an answer in Denbigh's book?

2) If physicists say that information is the entropy, they must take it
literally and then apply experimental thermodynamics to measure
information. This however seems not to happen.

3) I am working with engineers developing mechatronics products.
Thermodynamics (hence the entropy) is there, as well as information.
However, I have not yet met a practitioner who makes a connection
between the entropy and information.

>
>> By the way, have you seen the answer to my question:
>>
>>>> Also remember that at constant volume dS = (Cv/T) dT and dU =
>>>> CvdT. If the entropy is information then its derivative must
>>>> be related to information as well. Hence Cv must be related to
>>>> information. This however means that the energy also somehow
>>>> related to information.
>>
>> If the entropy is the same as information, than through the
>> derivatives all thermodynamic properties are related to
>> information as well. I am not sure if this makes sense in respect
>> for example to design a self-driving car.
>>
>
> The information embodied in the thermodynamic state is presumably
> not relevant to the design of a self-driving car. By the same token,
> thermodynamic treatment (typically) discards a lot of information
> useful for engineering.

Sorry, I do not understand what this means.

>> I am aware of works that estimated the thermodynamic limit (kT) to
>> process information. I do not see however, how this proves the
>> equivalence of information and entropy.
>>
>> Evgenii

...

>> and since information is measured by order, a maximum of order is
>> conveyed by a maximum of disorder. Obviously, this is a Babylonian
>> muddle. Somebody or something has confounded our language."
>>
>
> I would say it is many people, rather than just one. I wrote "On
> Complexity and Emergence" in response to the amount of unmitigated
> tripe I've seen written about these topics.
>
>

I have found your work on arxiv.org and I will look at it. Thank you
for mentioning it.

Evgenii

meekerdb

Jan 21, 2012, 2:00:47 PM
to everyth...@googlegroups.com

It does happen. The number of states, i.e. the information, available from a black hole
is calculated from its thermodynamic properties as calculated by Hawking. At a more
conventional level, counting the states available to molecules in a gas can be used to
determine the specific heat of the gas and vice versa. The reason the thermodynamic
measures and the information measures are treated separately in engineering problems is
that the information that is important to engineering is infinitesimal compared to the
information stored in the microscopic states. So the latter is considered only in terms
of a few macroscopic averages, like temperature and pressure.
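
To put rough numbers on "infinitesimal" (a back-of-envelope illustration
only, reusing the CODATA entropy for Ag quoted earlier in the thread):

import math

k_B = 1.380649e-23   # J/K
S_Ag = 42.55         # J K^-1 mol^-1, standard entropy of silver (quoted above)

bits_per_mole = S_Ag / (k_B * math.log(2))   # ~4.4e24 bits per mole of Ag
one_terabyte = 8e12                          # bits on a 1 TB drive

print(f"ratio: {bits_per_mole / one_terabyte:.0e}")  # ~6e11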

Brent

Evgenii Rudnyi

Jan 21, 2012, 2:23:18 PM
to everyth...@googlegroups.com
On 21.01.2012 20:00 meekerdb said the following:

> On 1/21/2012 4:25 AM, Evgenii Rudnyi wrote:
>>

...

>> 2) If physicists say that information is the entropy, they must
>> take it literally and then apply experimental thermodynamics to
>> measure information. This however seems not to happen.
>
> It does happen. The number of states, i.e. the information, available
> from a black hole is calculated from it's thermodynamic properties
> as calculated by Hawking. At a more conventional level, counting the
> states available to molecules in a gas can be used to determine the
> specific heat of the gas and vice-verse. The reason the thermodynamic
> measures and the information measures are treated separately in
> engineering problems is that the information that is important to
> engineering is infinitesimal compared to the information stored in
> the microscopic states. So the latter is considered only in terms of
> a few macroscopic averages, like temperature and pressure.
>
> Brent

Doesn't this mean that by information engineers mean something
different than physicists do?

Evgenii

meekerdb

Jan 21, 2012, 3:01:48 PM
to everyth...@googlegroups.com

I don't think so. A lot of the work on information theory was done by communication
engineers who were concerned with the effect of thermal noise on bandwidth. Of course
engineers specialize more narrowly than physicists, so within different fields of engineering
there are different terminologies and different measurement methods for things that are
unified in basic physics, e.g. there are engineers who specialize in magnetism and who
seldom need to reflect that it is part of EM, there are others who specialize in RF and
don't worry about "static" fields.
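
The classic calculation of that kind is the Shannon-Hartley capacity of a
channel limited by thermal noise; a small sketch (the bandwidth and signal
power below are made-up values, for illustration only):

import math

k_B = 1.380649e-23   # J/K
T = 298.15           # K, room temperature
B = 1e6              # Hz, assumed bandwidth
P_signal = 1e-12     # W, assumed received signal power

P_noise = k_B * T * B                       # thermal noise power in bandwidth B
C = B * math.log2(1 + P_signal / P_noise)   # channel capacity, bits per second
print(f"{C / 1e6:.1f} Mbit/s")              # ~7.9 Mbit/s for these assumed values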

Brent

>
> Evgenii
>

Evgenii Rudnyi

Jan 21, 2012, 4:03:02 PM
to everyth...@googlegroups.com
On 21.01.2012 21:01 meekerdb said the following:

Do you mean that engineers use experimental thermodynamics to determine
information?

Evgenii

> Brent
>
>>
>> Evgenii
>>
>

Evgenii Rudnyi

Jan 22, 2012, 4:04:45 AM
to everyth...@googlegroups.com
On 21.01.2012 22:03 Evgenii Rudnyi said the following:

To be concrete, here is for example a paper from control theory:

J.C. Willems and H.L. Trentelman
H_inf control in a behavioral context: The full information case
IEEE Transactions on Automatic Control
Volume 44, pages 521-536, 1999
http://homes.esat.kuleuven.be/~jwillems/Articles/JournalArticles/1999.4.pdf

The term information is there but entropy is not. Could you please
explain why? Or, alternatively, could you please point to papers where
engineers use the concept of the equivalence between the entropy and
information?

Evgenii

>
>> Brent
>>
>>>
>>> Evgenii
>>>
>>
>

Evgenii Rudnyi

Jan 22, 2012, 1:16:23 PM
to everyth...@googlegroups.com
On 20.01.2012 05:59 Russell Standish said the following:

> On Thu, Jan 19, 2012 at 08:03:41PM +0100, Evgenii Rudnyi wrote:

...

>> and since information is measured by order, a maximum of order is
>> conveyed by a maximum of disorder. Obviously, this is a Babylonian
>> muddle. Somebody or something has confounded our language."
>>
>
> I would say it is many people, rather than just one. I wrote "On
> Complexity and Emergence" in response to the amount of unmitigated
> tripe I've seen written about these topics.
>

Russell,

I have read your paper

http://arxiv.org/abs/nlin/0101006

It is well written. Could you please apply the principles from your
paper to the problem of how to determine the information in a book (for
example, let us take your book Theory of Nothing)?

Also do you believe earnestly that this information is equal to the
thermodynamic entropy of the book? If yes, can one determine the
information in the book just by means of experimental thermodynamics?

Evgenii

P.S. Why is it impossible to state that a random string is generated by
some random generator?


Russell Standish

Jan 22, 2012, 7:26:26 PM
to everyth...@googlegroups.com
On Sun, Jan 22, 2012 at 07:16:23PM +0100, Evgenii Rudnyi wrote:
> On 20.01.2012 05:59 Russell Standish said the following:
> >On Thu, Jan 19, 2012 at 08:03:41PM +0100, Evgenii Rudnyi wrote:
>
> ...
>
> >>and since information is measured by order, a maximum of order is
> >>conveyed by a maximum of disorder. Obviously, this is a Babylonian
> >>muddle. Somebody or something has confounded our language."
> >>
> >
> >I would say it is many people, rather than just one. I wrote "On
> >Complexity and Emergence" in response to the amount of unmitigated
> >tripe I've seen written about these topics.
> >
>
> Russel,
>
> I have read your paper
>
> http://arxiv.org/abs/nlin/0101006
>
> It is well written. Could you please apply the principles from your
> paper to a problem on how to determine information in a book (for
> example let us take your book Theory of Nothing)?
>
> Also do you believe earnestly that this information is equal to the
> thermodynamic entropy of the book?

These are two quite different questions. To someone who reads my book,
the physical form of the book is unimportant - it could just as easily
be a PDF file or a Kindle e-book as a physical paper copy. The PDF is
a little over 30,000 bytes long. Computing the information content
would be a matter of counting the number of 30,000-byte-long strings that
generate a recognisable variant of ToN when fed into Acrobat
reader. Then subtract the logarithm (to base 256) of this figure from
30,000 to get the information content in bytes.

This is quite impractical, of course, not to speak of the expense of
paying an army of people to go through 256^30,000 variants to
decide which ones are the true ToN's. An upper bound can be
found by compressing the file - PDFs are already compressed, so we
could estimate the information content as being between 25KB and 30KB (say).
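
In practice that compression bound is easy to compute; a sketch (the
filename is hypothetical):

import bz2

def info_upper_bound_bytes(path):
    # the compressed size is an upper bound on the information content of the file
    with open(path, 'rb') as f:
        data = f.read()
    return len(bz2.compress(data, 9))

# e.g. info_upper_bound_bytes('theory_of_nothing.pdf')  # hypothetical filename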

To a physicist, it is the physical form that is important - the fact
that it is made of paper, with a bit of glue to hold it together. The
arrangement of ink on the pages is probably quite unimportant - a book
of the same size and shape, but with blank pages would do just as
well. Even if the arrangement of ink is important, then does
typesetting the book in a different font lead to the same book or a
different book?

To compute the thermodynamic information, one could imagine performing
a massive molecular dynamics simulation, and then count the number of
states that correspond to the physical book, take the logarithm, then
subtract that from the logarithm of the total possible number of
states the molecules could take on (if completely disassociated).

This is, of course, completely impractical. Computing the complexity
of something is generally NP-hard. But in principle doable.

Now, how does this relate to the thermodynamic entropy of the book? It
turns out that the information computed by the in-principle process
above is equal to the difference between the maximum entropy of the
molecules making up the book (if completely disassociated) and the
thermodynamic entropy, which could be measured in a calorimeter.
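
In symbols (my rendering of that relation, with the conversion to bits made
explicit):

  I_{\text{book}} = \frac{S_{\max} - S_{\text{thermo}}}{k_B \ln 2} \quad \text{bits}

where S_max is the entropy of the completely disassociated molecules and
S_thermo is the calorimetric entropy of the assembled book.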


> If yes, can one determine the
> information in the book just by means of experimental
> thermodynamics?
>

One can certainly determine the information of the physical book
(defined however you might like) - but that is not the same as the
information of the abstract book.

> Evgenii
>
> P.S. Why it is impossible to state that a random string is generated
> by some random generator?
>

Not sure what you mean, unless you're really asking "Why is it
impossible to state that a random string is generated by some
pseudorandom generator?"

In which case the answer is that a pseudorandom generator is an
algorithm, so by definition doesn't produce random numbers. There is a
lot of knowledge about how to decide if a particular PRNG is
sufficiently random for a particular purpose. No PRNG is sufficiently
random for all purposes - in particular they are very poor for
security purposes, as they're inherently predictable.

Cheers

Craig Weinberg

Jan 23, 2012, 8:20:28 AM
to Everything List
On Jan 22, 7:26 pm, Russell Standish <li...@hpcoders.com.au> wrote:

>
> Now, how does this relate to the thermodynamic entropy of the book? It
> turns out that the information computed by the in-principle process
> above is equal to the difference between the maximum entropy of the
> molecules making up the book (if completely disassociated) and the
> thermodynamic entropy, which could be measured in a calorimeter.
>
> > If yes, can one determine the
> > information in the book just by means of experimental
> > thermodynamics?
>
> One can certainly determine the information of the physical book
> (defined however you might like) - but that is not the same as the
> information of the abstract book.

This would only work if the information were meaningless and a-
signifying. I can write a whole book with just the words "The movie
Goodfellas". Anyone who has seen that movie has a rich text of
memories from which to inform themselves through that association.
That is what being informed actually is, associating and integrating
presented texts with a body of accumulated texts and contexts. If you
conflate information with the data that happens to be associated with
a particular text in a particular language-media context, you are
literally weighing stories by the pound (or gram).

Besides, any such quantitative measure does not take sequence into
account. A book or file which is completely scrambled down to the
level of characters or pixels has the same quantity of entropy
displacement as the intact text. To reduce information to quantity
alone means that a 240k text file can be rearranged to be 40kb of
nothing but 1s and then 200kb of nothing but 0s and have the same
amount of information and entropy. It's a gross misunderstanding of
how information works.

Craig

Russell Standish

Jan 23, 2012, 11:25:38 PM
to everyth...@googlegroups.com
On Mon, Jan 23, 2012 at 05:20:28AM -0800, Craig Weinberg wrote:
>
> Besides, any such quantitative measure does not take sequence into
> account. A book or file which is completely scrambled down to the
> level of characters or pixels has the same quantity of entropy
> displacement as the in tact text. To reduce information to quantity
> alone means that a 240k text file can be rearranged to be 40kb of
> nothing but 1s and then 200kb of nothing but 0s and have the same
> amount of information and entropy. It's a gross misunderstanding of
> how information works.
>
> Craig
>

Rearranging the text file to have 40KB of 1s and 200KB of 0s
dramatically reduces the information and increases the entropy by the
same amount, although not nearly as much as completely scrambling the
file. I'd say you have a gross misunderstanding of how these measures
work if you think otherwise.

Craig Weinberg

Jan 24, 2012, 7:49:41 AM
to Everything List
On Jan 23, 11:25 pm, Russell Standish <li...@hpcoders.com.au> wrote:
> On Mon, Jan 23, 2012 at 05:20:28AM -0800, Craig Weinberg wrote:
>
> > Besides, any such quantitative measure does not take sequence into
> > account. A book or file which is completely scrambled down to the
> > level of characters or pixels has the same quantity of entropy
> > displacement as the in tact text. To reduce information to quantity
> > alone means that a 240k text file can be rearranged to be 40kb of
> > nothing but 1s and then 200kb of nothing but 0s and have the same
> > amount of information and entropy. It's a gross misunderstanding of
> > how information works.
>
> > Craig
>
> Rearranging the text file to have 40KB of 1s and 200KB of 0s
> dramatically reduces the information and increases the entropy by the
> same amount, although not nearly as much as completely scrambling the
> file. I'd say you have a gross misunderstanding of how these measures
> work if you think otherwise.

All this time I thought that you have been saying that entropy and
information are the same thing:

>>"This suggests to me that a molecule of DNA belonging to a
kangaroo could
>> have no more information than the same molecule with the primary
sequence
>> scrambled into randomness

>That is correct, it would have the same quantity of
information, but most
>would be of the opinion that the quality has changed.

If you are instead saying that they are inversely proportional then I
would agree in general - information can be considered negentropy.
Sorry, I thought you were saying that they are directly proportional
measures (Brent and Evgenii seem to be talking about it that way). I
think that we can go further in understanding information though.
Negentropy is a good beginning but it does not address significance.
The degree to which information has the capacity to inform is even
more important than the energy cost to generate it. The significance of
information is a subjective quality which is independent of entropy
but essential to the purpose of information. In fact, information
itself could be considered the quantitative shadow of the quality of
significance. Information that does not inform something is not
information.

Craig

meekerdb

Jan 24, 2012, 4:56:55 PM
to everyth...@googlegroups.com


In thinking about how to answer this I came across an excellent paper by Roman Frigg and
Charlotte Werndl http://www.romanfrigg.org/writings/EntropyGuide.pdf which explicates the
relation more comprehensively than I could and which also gives some historical background
and extensions: specifically look at section 4.

Brent

Evgenii Rudnyi

Jan 25, 2012, 2:47:03 PM
to everyth...@googlegroups.com
On 23.01.2012 01:26 Russell Standish said the following:

Yet, this is already information. Hence if we take the equivalence between
the informational and thermodynamic entropies literally, then even in
this case the thermodynamic entropy (which it should be possible to measure
by experimental thermodynamics) must exist. What is it in this case?

> To a physicist, it is the physical form that is important - the fact
> that it is made of paper, with a bit of glue to hold it together.
> The arrangement of ink on the pages is probably quite unimportant - a
> book of the same size and shape, but with blank pages would do just
> as well. Even if the arrangement of ink is important, then does
> typesetting the book in a different font lead to the same book or a
> different book?

It is a good question, and in my view it again shows that thermodynamic
entropy and information are different things, as for the same
object we can define the information differently (see also below).

> To compute the thermodynamic information, one could imagine
> performing a massive molecular dynamics simulation, and then count
> the number of states that correspond to the physical book, take the
> logarithm, then subtract that from the logarithm of the total
> possible number of states the molecules could take on (if completely
> disassociated).

Do not forget that molecular dynamics simulation is based on Newton's
laws (even quantum-mechanical molecular dynamics). Hence you probably
mean the Monte Carlo method here. Yet it is much simpler to employ
experimental thermodynamics (see below).

> This is, of course, completely impractical. Computing the complexity
> of something is generally NP-hard. But in principle doable.
>
> Now, how does this relate to the thermodynamic entropy of the book?
> It turns out that the information computed by the in-principle
> process above is equal to the difference between the maximum entropy
> of the molecules making up the book (if completely disassociated) and
> the thermodynamic entropy, which could be measured in a calorimeter.
>
>
>> If yes, can one determine the information in the book just by means
>> of experimental thermodynamics?
>>
>
> One can certainly determine the information of the physical book
> (defined however you might like) - but that is not the same as the
> information of the abstract book.

Let me suggest a very simple case to better understand what you are
saying. Let us consider the string "10" for simplicity, in the cases
below. First I will cite the thermodynamic properties of Ag and
Al from the CODATA tables (we will need them):

S° (298.15 K)
J K-1 mol-1

Ag cr 42.55 ± 0.20
Al cr 28.30 ± 0.10

In J K-1 cm-3 it will be

Ag cr 42.55/107.87*10.49 = 4.14
Al cr 28.30/26.98*2.7 = 2.83

1) An abstract string "10" as the abstract book above.

2) Let us make now an aluminum plate (a page) with "10" hammered on it
(as on a coin) of the total volume 10 cm^3. The thermodynamic entropy is
then 28.3 J/K.

3) Let us make now a silver plate (a page) with "10" hammered on it (as
on a coin) of the total volume 10 cm^3. The thermodynamic entropy is
then 41.4 J/K.

4) We can easily make another aluminum plate (scaling all dimensions
from 2) to the total volume of 100 cm^3. Then the thermodynamic entropy
is 283 J/K.

Now we have four different combinations that represent the string "10",
and the thermodynamic entropy differs among them. If we take the statement
literally, then the information must be different in all four cases and
uniquely defined, since the thermodynamic entropy is already there. Yet in
my view this makes little sense.
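
A quick check of the arithmetic in cases 2) to 4) (a sketch only, using the
molar masses and densities from the conversion above):

S_molar = {'Ag': 42.55, 'Al': 28.30}    # J K^-1 mol^-1 (CODATA)
M_molar = {'Ag': 107.87, 'Al': 26.98}   # g/mol
density = {'Ag': 10.49, 'Al': 2.70}     # g/cm^3

def S_per_cm3(element):
    # molar entropy -> entropy per unit volume
    return S_molar[element] / M_molar[element] * density[element]

print(S_per_cm3('Al') * 10)    # case 2: 10 cm^3 Al plate  -> ~28.3 J/K
print(S_per_cm3('Ag') * 10)    # case 3: 10 cm^3 Ag plate  -> ~41.4 J/K
print(S_per_cm3('Al') * 100)   # case 4: 100 cm^3 Al plate -> ~283 J/K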

Could you please comment on these four cases?

>> Evgenii
>>
>> P.S. Why it is impossible to state that a random string is
>> generated by some random generator?
>>
>
> Not sure what you mean, unless you're really asking "Why it is
> impossible to state that a random string is generated by some
> pseudorandom generator?"
>
> In which case the answer is that a pseudorandom generator is an
> algorithm, so by definition doesn't produce random numbers. There is
> a lot of knowledge about how to decide if a particular PRNG is
> sufficiently random for a particular purpose. No PRNG is
> sufficiently random for all purposes - in particular they are very
> poor for security purposes, as they're inherently predictable.

I understand. Yet if we take a finite random string, then presumably
there should be some random generator with some seed that produces it.
What would be wrong with this?

Evgenii


> Cheers
>

Evgenii Rudnyi

Jan 25, 2012, 2:52:00 PM
to everyth...@googlegroups.com
On 24.01.2012 13:49 Craig Weinberg said the following:

> If you are instead saying that they are inversely proportional then
> I would agree in general - information can be considered negentropy.
> Sorry, I thought you were saying that they are directly proportional
> measures (Brent and Evgenii seem to be talking about it that way). I

I am not an expert in the informational entropy. For me it does not
matter how they define it in the information theory, whether as entropy
or negentropy. My point is that this has nothing to do with the
thermodynamic entropy (see my previous message with four cases for the
string "10").

Evgenii

Evgenii Rudnyi

Jan 25, 2012, 2:56:34 PM
to everyth...@googlegroups.com
On 24.01.2012 22:56 meekerdb said the following:

> In thinking about how to answer this I came across an excellent paper
> by Roman Frigg and Charlotte Werndl
> http://www.romanfrigg.org/writings/EntropyGuide.pdf which explicates
> the relation more comprehensively than I could and which also gives
> some historical background and extensions: specifically look at
> section 4.
>
> Brent
>

Thanks for the link. I will try to work through it to see if they have an
answer to the four cases with the string "10" that I have described in
my reply to Russell.