…
> For heavens sake, I went into quite a lot of detail about how the
> code is executed so that protein gets made, and it could not be more
> clear that the cell factory contains digital machines.
>
>> They are not information.
>>
>
> According to you nothing is information and that is one reason it is
> becoming increasingly difficult to take anything you say seriously.
I should say that I also have difficulty with the term information. One question, for example, is whether information belongs to physics or not. Some physicists say that information is related to entropy and is therefore a basic physical quantity. I personally do not buy it: thermodynamics, as it was originally constructed, had nothing to do with information, and the concept of information brings nothing that helps to solve thermodynamic problems (more on this in [1]).
Let us consider, for example, a conventional thermodynamic problem: improving the efficiency of a motor. Is the concept of information helpful in solving this problem? If we look at modern motors, we see that nowadays they work together with controllers, which allows us to drive the efficiency toward the thermodynamic limit. The term information is indeed helpful for developing a controller, but what about the thermodynamic limit of the motor itself? Does information help here? In my view, it does not.
In Gray's book on consciousness (Consciousness: Creeping up on the Hard Problem) there is an interesting statement on whether physics is enough to explain biology. Gray's answer is yes, provided we add the laws of cybernetics and evolution. Let me leave evolution aside and discuss only the cybernetic laws, as this is exactly where, I think, information comes into play. A short video from the Artificial Intelligence class that I recently attended makes a good introduction (an intelligent agent sensing external information and then acting):
http://www.youtube.com/watch?v=cx3lV07w-XE
Thus, the question concerns the relationship between physics and the cybernetic laws. When we consider the Equation of Everything, are the cybernetic laws already there, or do we still need to introduce them separately? One possible answer is that the cybernetic laws emerge from, or supervene on, the physical laws. I, however, do not understand what this means. It probably has something to do with a transition from quantity to quality, but I do not understand how that happens either. For me, it remains magic.
Let me repeat a series of physical objects discussed here recently (see also [2][3]):
1) A rock;
2) A ballcock in the toilet;
3) A self-driving car;
4) A living cell.
Where along this series do we have the cybernetic laws (information) and where not? Can physics describe these objects without the cybernetic laws? What do emergence and supervenience mean along this series? Any ideas?
Evgenii
[1] http://blog.rudnyi.ru/2010/12/entropy-and-artificial-life.html
[2] http://blog.rudnyi.ru/2011/01/perception-feedback-and-qualia.html
[3] http://blog.rudnyi.ru/2011/02/rock-and-information.html
" Some physicists say that information is related to the entropy"
If you look around you may still find species of scientists who still
are working with classical thermodynamics (search for example for
CALPHAD). Well, if you refer to them as physicists or not, it is your
choice. Anyway in experimental thermodynamics people determine
entropies, for example from CODATA tables
http://www.codata.org/resources/databases/key1.html
S° (298.15 K), J K-1 mol-1
Ag cr 42.55 ± 0.20
Al cr 28.30 ± 0.10
Do you mean that 1 mole of Ag has more information than 1 mole of Al at
298.15 K?
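If one did take the identification of entropy with information at face value, the CODATA numbers above translate into bits by dividing by k_B ln 2. A sketch of that arithmetic only; the conversion is standard, while the interpretation is exactly what is in question here:

# Sketch: what the identification "entropy = information" would literally imply
# for the CODATA standard entropies quoted above. The "bits" figures are a pure
# unit conversion, not a claim that a silver bar stores retrievable data.
import math

k_B = 1.380649e-23      # J/K
N_A = 6.02214076e23     # 1/mol
R = k_B * N_A           # ~8.314 J/(K mol)

S_standard = {"Ag": 42.55, "Al": 28.30}   # J K-1 mol-1 at 298.15 K (CODATA)

for element, S in S_standard.items():
    bits_per_mole = S / (k_B * math.log(2))
    bits_per_atom = S / (R * math.log(2))
    print(f"{element}: {bits_per_mole:.3e} bits/mol  ~= {bits_per_atom:.2f} bits/atom")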
Also remember that at constant volume dS = (Cv/T) dT and dU = Cv dT. If the entropy is information, then its derivative must be related to information as well. Hence Cv must be related to information. This, however, means that the energy is also somehow related to information.
Finally, the entropy is defined by the Second Law, and the best approach would be to stick to this definition. Only in that case is it possible to understand what we are talking about.
Evgenii
--
http://blog.rudnyi.ru
Evgenii, while you may be right that some physicists (mostly
experimentalists) work in thermodynamics without recourse to the
notion of information, and chemists even more so, it is also true that
the modern theoretical understanding of entropy (and indeed
thermodynamics) is information-based.
This trend really became mainstream with Landauer's work demonstrating
thermodynamic limits of information processing in the 1960s, which
turned earlier speculations by the likes of Schroedinger and Brillouin
into something that couldn't be ignored, even by experimentalists.
This trend of an information basis to physics has only accelerated
in my professional lifetime - I've seen people like Hawking discuss
information processing of black holes, and we've seen concepts like the
Bekenstein bound linking the geometry of space to information capacity.
David Deutsch is surely backing a winning horse in pointing out that
algorithmic information theory must be a foundational strand of the
"fabric of reality".
Cheers
--
----------------------------------------------------------------------------
Prof Russell Standish Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics hpc...@hpcoders.com.au
University of New South Wales http://www.hpcoders.com.au
----------------------------------------------------------------------------
Yes, it has more internal degrees of freedom, so it takes the addition of more energy to increase the ones we measure as temperature.
Brent
That's because the colloquial meaning of the terms takes into account the environment and
which form of information can be causally effective.
Brent
I know that many physicists identify the entropy with information.
Recently I had a nice discussion on biotaconv and people pointed out
that presumably Edwin T. Jaynes was the first to make such a connection
(Information theory and statistical mechanics, 1957). Google Scholar
shows that his paper has been cited more than 5000 times; that is impressive, and it indeed shows that this view is, in a way, mainstream.
I have studied Jaynes's papers, but I got stuck, for example, on passages like these:
“With such an interpretation the expression “irreversible process”
represents a semantic confusion; it is not the physical process that is
irreversible, but rather our ability to follow it. The second law of
thermodynamics then becomes merely the statement that although our
information as to the state of a system may be lost in a variety of
ways, the only way in which it can be gained is by carrying out further
measurements.”
“It is important to realize that the tendency of entropy to increase is
not a consequence of the laws of physics as such, … . An entropy
increase may occur unavoidably, due to our incomplete knowledge of the
forces acting on a system, or it may be entirely voluntary act on our part.”
This is beyond my understanding. As I have mentioned, I do not buy it; I still consider the entropy as it was defined by, for example, Gibbs.
Basically, I do not understand what the term information then adds. One can certainly state that information is the same as the entropy (we are free with definitions, after all), yet I miss the meaning of that. Let me put it this way: we have the thermodynamic entropy and then the informational entropy as defined by Shannon. The first is used to design a motor and the second to design a controller. Now let us suppose that these two entropies are the same. What does this change in the design of a motor or a controller? In my view, nothing.
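A side note on the formal side of the comparison: when applied to the same probability distribution, the Gibbs and Shannon expressions differ only by the constant factor k_B ln 2. A small sketch with an arbitrary, made-up distribution (this says nothing by itself about whether the identification is physically meaningful, which is the question at issue):

# Sketch: the Gibbs entropy S = -k_B * sum(p ln p) and the Shannon entropy
# H = -sum(p log2 p) applied to the same made-up distribution. They differ
# only by the constant k_B ln 2.
import math

k_B = 1.380649e-23                 # J/K
p = [0.7, 0.2, 0.1]                # an arbitrary distribution over states/symbols

H_shannon = -sum(pi * math.log2(pi) for pi in p)         # bits
S_gibbs   = -k_B * sum(pi * math.log(pi) for pi in p)    # J/K

print(f"Shannon H = {H_shannon:.4f} bits")
print(f"Gibbs   S = {S_gibbs:.3e} J/K")
print(f"ratio S/H = {S_gibbs / H_shannon:.3e} J/K per bit (= k_B ln 2 = {k_B * math.log(2):.3e})")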
By the way, have you seen the answer to my question:
>> Also remember that at constant volume dS = (Cv/T) dT and dU =
>> CvdT. If the entropy is information then its derivative must be
>> related to information as well. Hence Cv must be related to
>> information. This however means that the energy also somehow
>> related to information.
If the entropy is the same as information, then through the derivatives all thermodynamic properties are related to information as well. I am not sure this makes sense with respect to, for example, designing a self-driving car.
I am aware of works that estimate the thermodynamic limit (kT) for processing information. I do not see, however, how this proves the equivalence of information and entropy.
Evgenii
P.S. For a long time, people have identified entropy with chaos. I have recently read a nice book on this theme, Entropy and Art by Arnheim (1971); it is really good. One quote:
"The absurd consequences of neglecting structure but using the concept
of order just the same are evident if one examines the present
terminology of information theory. Here order is described as the
carrier of information, because information is defined as the opposite
of entropy, and entropy is a measure of disorder. To transmit
information means to induce order. This sounds reasonable enough. Next,
since entropy grows with the probability of a state of affairs,
information does the opposite: it increases with its improbability. The
less likely an event is to happen, the more information does its
occurrence represent. This again seems reasonable. Now what sort of
sequence of events will be least predictable and therefore carry a
maximum of information? Obviously a totally disordered one, since when
we are confronted with chaos we can never predict what will happen next.
The conclusion is that total disorder provides a maximum of information;
and since information is measured by order, a maximum of order is
conveyed by a maximum of disorder. Obviously, this is a Babylonian
muddle. Somebody or something has confounded our language."
On 18.01.2012 23:42 Russell Standish said the following:
...
>> If you look around you may still find species of scientists who
>> still are working with classical thermodynamics (search for example
>> for CALPHAD). Well, if you refer to them as physicists or not, it
>> is your choice. Anyway in experimental thermodynamics people
>> determine entropies, for example from CODATA tables
>>
>> http://www.codata.org/resources/databases/key1.html
>>
>> S ° (298.15 K) J K-1 mol-1
>>
>> Ag cr 42.55 ± 0.20 Al cr 28.30 ± 0.10
>>
>> Do you mean that 1 mole of Ag has more information than 1 mole of
>> Al at 298.15 K?
>
> Yes, it has more internal degrees of freedom so that it takes
> addition of more energy in order to increase those we measure as
> temperature.
Could you please explain, then, why engineers do not use the CODATA/JANAF Tables to find the best material for storing information?
Evgenii
Because they are interested in information that they can insert and retrieve. I once
invented write-only-memory, but it didn't sell. :-)
Brent
Well, but this shows that physicists and engineers mean different things by information. It would then be good to distinguish them.
Evgenii
> Brent
>
"This suggests to me that a molecule of DNA belonging to a kangaroo could have no more information than the same molecule with the primary sequence scrambled into randomness
or 'blanked out' with a single repeating A-T base pair.
"That would seem to make this definition of information the exact opposite of the colloquial meaning of the term."
Would you rather have an instruction manual in English or Urdu?
Brent
Because I tend to think of "negentropy", which is really another term
for information, I tend to give priority to Schroedinger who wrote
about the topic in the early 40s. But Jaynes was certainly
instrumental in establishing the information based foundations to
statistical physics, even before information was properly defined (it
wasn't really until the likes of Kolmogorov, Chaitin and Solomonoff in
the 60s that information was really understood.
But Landauer in the late 60s was probably the first to make physicists
really wake up to the concept of physical information.
But then, I'm not a science historian, so what would I know :).
>
> I have studied Jaynes papers but I have been stacked with for example
>
... snip ...
>
> Basically I do not understand what the term information then brings.
> One can certainly state that information is the same as the entropy
> (we are free with definitions after all). Yet I miss the meaning of
> that. Let me put it this way, we have the thermodynamic entropy and
> then the informational entropy as defined by Shannon. The first used
> to designe a motor and the second to design a controller. Now let us
> suppose that these two entropies are the same. What this changes in
> a design of a motor and a controller? In my view nothing.
>
I can well recommend Denbigh & Denbigh's book from the 80s - it's a bit more of a modern understanding of the topic than Jaynes :)
@book{Denbigh-Denbigh87,
  author    = {Denbigh, K. G. and Denbigh, J.},
  publisher = {Cambridge UP},
  title     = {Entropy in Relation to Incomplete Knowledge},
  year      = {1987},
}
> By the way, have you seen the answer to my question:
>
> >> Also remember that at constant volume dS = (Cv/T) dT and dU =
> >> CvdT. If the entropy is information then its derivative must be
> >> related to information as well. Hence Cv must be related to
> >> information. This however means that the energy also somehow
> >> related to information.
>
> If the entropy is the same as information, than through the
> derivatives all thermodynamic properties are related to information
> as well. I am not sure if this makes sense in respect for example to
> design a self-driving car.
>
The information embodied in the thermodynamic state is presumably not
relevant to the design of a self-driving car. By the same token,
thermodynamic treatment (typically) discards a lot of information
useful for engineering.
> I am aware of works that estimated the thermodynamic limit (kT) to
> process information. I do not see however, how this proves the
> equivalence of information and entropy.
>
> Evgenii
>
> P.S. For a long time, people have identified the entropy with chaos.
> I have recently read a nice book to this end, Entropy and Art by
> Arnheim, 1971, it is really nice. One quote:
>
I guess this is the original meaning of chaos, not the more modern meaning referring to "low-dimensional dynamical systems having strange attractors".
> "The absurd consequences of neglecting structure but using the
> concept of order just the same are evident if one examines the
> present terminology of information theory. Here order is described
> as the carrier of information, because information is defined as the
> opposite of entropy, and entropy is a measure of disorder. To
> transmit information means to induce order. This sounds reasonable
> enough. Next, since entropy grows with the probability of a state of
> affairs, information does the opposite: it increases with its
> improbability. The less likely an event is to happen, the more
> information does its occurrence represent. This again seems
> reasonable. Now what sort of sequence of events will be least
> predictable and therefore carry a maximum of information? Obviously
> a totally disordered one, since when we are confronted with chaos we
> can never predict what will happen next.
This rather depends on whether the disorder is informationally
significant. This is context dependent. I have a discussion on this
(it relates to the Kolmogorov idea that random sequences have maximum
complexity) in my paper "On Complexity and Emergence". I also touch on
the theme in my book "Theory of Nothing", which I know you've read!
> The conclusion is that
> total disorder provides a maximum of information;
Total disorder corresponds to a maximum of entropy. Maximum entropy
minimises the amount of information.
> and since
> information is measured by order, a maximum of order is conveyed by
> a maximum of disorder. Obviously, this is a Babylonian muddle.
> Somebody or something has confounded our language."
>
I would say it is many people, rather than just one. I wrote "On
Complexity and Emergence" in response to the amount of unmitigated
tripe I've seen written about these topics.
...
>>
>> Basically I do not understand what the term information then
>> brings. One can certainly state that information is the same as the
>> entropy (we are free with definitions after all). Yet I miss the
>> meaning of that. Let me put it this way, we have the thermodynamic
>> entropy and then the informational entropy as defined by Shannon.
>> The first used to designe a motor and the second to design a
>> controller. Now let us suppose that these two entropies are the
>> same. What this changes in a design of a motor and a controller? In
>> my view nothing.
>>
>
> I can well recommend Denbigh& Denbigh's book from the 80s - its a
> bit more of a modern understanding of the topic than Jaynes :)
>
> @book{Denbigh-Denbigh87, author = {Denbigh, K. G. and Denbigh, J.},
> publisher = { Cambridge UP}, title = { Entropy in Relation to
> Incomplete Knowledge}, year = { 1987}, }
Thanks. On biotaconv they have recommended John Avery's "Information
Theory and Evolution" but I think I have already satisfied my curiosity
with Jaynes's two papers. My personal feeling is as follows:
1) The concept of information is useless in conventional thermodynamic problems. Let us take, for example, the Fe-C phase diagram
http://www.calphad.com/graphs/Metastable%20Fe-C%20Phase%20Diagram.gif
What does information have to do with the entropies of the phases in this diagram? Do you mean that I would find an answer in Denbigh's book?
2) If physicists say that information is the entropy, they must take it literally and then apply experimental thermodynamics to measure information. This, however, does not seem to happen.
3) I work with engineers developing mechatronic products. Thermodynamics (hence the entropy) is there, and so is information. However, I have not yet met a practitioner who makes a connection between the entropy and information.
>
>> By the way, have you seen the answer to my question:
>>
>>>> Also remember that at constant volume dS = (Cv/T) dT and dU =
>>>> CvdT. If the entropy is information then its derivative must
>>>> be related to information as well. Hence Cv must be related to
>>>> information. This however means that the energy also somehow
>>>> related to information.
>>
>> If the entropy is the same as information, than through the
>> derivatives all thermodynamic properties are related to
>> information as well. I am not sure if this makes sense in respect
>> for example to design a self-driving car.
>>
>
> The information embodied in the thermodynamic state is presumably
> not relevant to the design of a self-driving car. By the same token,
> thermodynamic treatment (typically) discards a lot of information
> useful for engineering.
Sorry, I do not understand what this means.
>> I am aware of works that estimated the thermodynamic limit (kT) to
>> process information. I do not see however, how this proves the
>> equivalence of information and entropy.
>>
>> Evgenii
...
>> and since information is measured by order, a maximum of order is
>> conveyed by a maximum of disorder. Obviously, this is a Babylonian
>> muddle. Somebody or something has confounded our language."
>>
>
> I would say it is many people, rather than just one. I wrote "On
> Complexity and Emergence" in response to the amount of unmitigated
> tripe I've seen written about these topics.
>
>
I have found your paper on arXiv.org and I will look at it. Thank you for mentioning it.
Evgenii
It does happen. The number of states, i.e. the information, available from a black hole is calculated from its thermodynamic properties, as computed by Hawking. At a more conventional level, counting the states available to the molecules in a gas can be used to determine the specific heat of the gas, and vice versa. The reason the thermodynamic measures and the information measures are treated separately in engineering problems is that the information that is important to engineering is infinitesimal compared to the information stored in the microscopic states. So the latter is considered only in terms of a few macroscopic averages, like temperature and pressure.
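A rough sketch of the scale gap described here, with order-of-magnitude placeholder numbers (a hypothetical 1 GB memory chip, and a made-up 1 J/K for the entropy of a few grams of silicon):

# Rough sketch of the scale gap between "engineered" bits and the bits implied
# by a chip's thermodynamic entropy. Both numbers below are placeholders.
import math

k_B = 1.380649e-23                       # J/K

engineered_bits = 8e9                    # a hypothetical 1 GB memory chip
S_chip = 1.0                             # J/K, made-up order of magnitude for a few grams of silicon
microscopic_bits = S_chip / (k_B * math.log(2))

print(f"engineered information:  {engineered_bits:.1e} bits")
print(f"microscopic information: {microscopic_bits:.1e} bits")
print(f"ratio: {microscopic_bits / engineered_bits:.1e}")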
Brent
...
>> 2) If physicists say that information is the entropy, they must
>> take it literally and then apply experimental thermodynamics to
>> measure information. This however seems not to happen.
>
> It does happen. The number of states, i.e. the information, available
> from a black hole is calculated from it's thermodynamic properties
> as calculated by Hawking. At a more conventional level, counting the
> states available to molecules in a gas can be used to determine the
> specific heat of the gas and vice-verse. The reason the thermodynamic
> measures and the information measures are treated separately in
> engineering problems is that the information that is important to
> engineering is infinitesimal compared to the information stored in
> the microscopic states. So the latter is considered only in terms of
> a few macroscopic averages, like temperature and pressure.
>
> Brent
Doesn't this mean that engineers mean something different by information than physicists do?
Evgenii
I don't think so. A lot of the work on information theory was done by communication engineers who were concerned with the effect of thermal noise on bandwidth. Of course engineers specialize more narrowly than physicists, so within different fields of engineering there are different terminologies and different measurement methods for things that are unified in basic physics; e.g., there are engineers who specialize in magnetism and seldom need to reflect that it is part of EM, and others who specialize in RF and don't worry about "static" fields.
Brent
>
> Evgenii
>
Do you mean that engineers use experimental thermodynamics to determine
information?
Evgenii
> Brent
>
>>
>> Evgenii
>>
>
To be concrete, here is for example a paper from control theory:
J.C. Willems and H.L. Trentelman
H_inf control in a behavioral context: The full information case
IEEE Transactions on Automatic Control
Volume 44, pages 521-536, 1999
http://homes.esat.kuleuven.be/~jwillems/Articles/JournalArticles/1999.4.pdf
The term information is there, but the entropy is not. Could you please explain why? Or, alternatively, could you point me to papers where engineers use the concept of the equivalence between the entropy and information?
Evgenii
>
>> Brent
>>
>>>
>>> Evgenii
>>>
>>
>
...
>> and since information is measured by order, a maximum of order is
>> conveyed by a maximum of disorder. Obviously, this is a Babylonian
>> muddle. Somebody or something has confounded our language."
>>
>
> I would say it is many people, rather than just one. I wrote "On
> Complexity and Emergence" in response to the amount of unmitigated
> tripe I've seen written about these topics.
>
Russell,
I have read your paper
http://arxiv.org/abs/nlin/0101006
It is well written. Could you please apply the principles from your paper to the problem of how to determine the information in a book (for example, let us take your book Theory of Nothing)?
Also, do you earnestly believe that this information is equal to the thermodynamic entropy of the book? If yes, can one determine the information in the book just by means of experimental thermodynamics?
Evgenii
P.S. Why is it impossible to state that a random string is generated by some random generator?
These are two quite different questions. To someone who reads my book,
the physical form of the book is unimportant - it could just as easily
be a PDF file or a Kindle e-book as a physical paper copy. The PDF is
a little over 30,000 bytes long. Computing the information content
would be a matter of counting the number of 30,000-byte-long strings that generate a recognisable variant of ToN when fed into Acrobat Reader, then subtracting the logarithm (to base 256) of this count from 30,000 to get the information content in bytes.
This is quite impractical, of course, not to speak of the expense of paying for an army of people to go through 256^30,000 variants to
decide which ones are the true ToN's. An upper bound can be
found by compressing the file - PDFs are already compressed, so we
could estimate the information content as being between 25KB and 30KB (say).
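As an illustration of that compression upper bound, something like the following sketch could be used; the file name is a placeholder, and the size of the decompressor itself is ignored:

# Sketch of the compression upper bound mentioned above: the compressed size of
# a file bounds its algorithmic information content from above (ignoring the
# size of the decompressor). The file name is a hypothetical placeholder.
import zlib

with open("theory_of_nothing.pdf", "rb") as f:    # placeholder path
    data = f.read()

compressed = zlib.compress(data, 9)
print(f"raw size:        {len(data)} bytes")
print(f"compressed size: {len(compressed)} bytes (upper bound on information content)")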
To a physicist, it is the physical form that is important - the fact
that it is made of paper, with a bit of glue to hold it together. The
arrangement of ink on the pages is probably quite unimportant - a book
of the same size and shape, but with blank pages would do just as
well. Even if the arrangement of ink is important, then does
typesetting the book in a different font lead to the same book or a
different book?
To compute the thermodynamic information, one could imagine performing
a massive molecular dynamics simulation, and then count the number of
states that correspond to the physical book, take the logarithm, then
subtract that from the logarithm of the total possible number of
states the molecules could take on (if completely disassociated).
This is, of course, completely impractical. Computing the complexity
of something is generally NP-hard. But in principle doable.
Now, how does this relate to the thermodynamic entropy of the book? It
turns out that the information computed by the in-principle process
above is equal to the difference between the maximum entropy of the
molecules making up the book (if completely disassociated) and the
thermodynamic entropy, which could be measured in a calorimeter.
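A sketch of the bookkeeping just described, with entirely made-up values for the two entropies; only the arithmetic, information = (S_max - S_measured) / (k_B ln 2), is the point:

# Sketch: "physical information" of the book taken as the gap between the
# entropy of its fully disassociated molecules and the measured (calorimetric)
# entropy. Both entropy values are made-up placeholders.
import math

k_B = 1.380649e-23            # J/K

S_max_disassociated = 2.5e3   # J/K, hypothetical maximum entropy of the constituent molecules
S_measured          = 1.8e3   # J/K, hypothetical calorimetric entropy of the bound book

info_bits = (S_max_disassociated - S_measured) / (k_B * math.log(2))
print(f"physical information ~ {info_bits:.3e} bits")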
> If yes, can one determine the
> information in the book just by means of experimental
> thermodynamics?
>
One can certainly determine the information of the physical book
(defined however you might like) - but that is not the same as the
information of the abstract book.
> Evgenii
>
> P.S. Why it is impossible to state that a random string is generated
> by some random generator?
>
Not sure what you mean, unless you're really asking "Why it is
impossible to state that a random string is generated by some
pseudorandom generator?"
In which case the answer is that a pseudorandom generator is an
algorithm, so by definition doesn't produce random numbers. There is a
lot of knowledge about how to decide if a particular PRNG is
sufficiently random for a particular purpose. No PRNG is sufficiently
random for all purposes - in particular they are very poor for
security purposes, as they're inherently predictable.
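A small sketch of that inherent predictability: seeding Python's standard generator twice with the same value reproduces the "random" sequence exactly.

# Sketch of why a pseudorandom generator is "inherently predictable": given the
# seed, the entire output sequence is reproducible, i.e. it is an algorithm,
# not a source of randomness.
import random

rng1 = random.Random(42)
rng2 = random.Random(42)          # same seed

seq1 = [rng1.randint(0, 1) for _ in range(16)]
seq2 = [rng2.randint(0, 1) for _ in range(16)]

print(seq1)
print(seq2)
print("identical:", seq1 == seq2)   # True: knowing the seed means knowing the "random" string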
Cheers
Rearranging the text file to have 40KB of 1s and 200KB of 0s
dramatically reduces the information and increases the entropy by the
same amount, although not nearly as much as completely scrambling the
file. I'd say you have a gross misunderstanding of how these measures
work if you think otherwise.
In thinking about how to answer this I came across an excellent paper by Roman Frigg and
Charlotte Werndl http://www.romanfrigg.org/writings/EntropyGuide.pdf which explicates the
relation more comprehensively than I could and which also gives some historical background
and extensions: specifically look at section 4.
Brent
Yet this is already information. Hence, if we take the equivalence between the informational and thermodynamic entropies literally, then even in this case a thermodynamic entropy (which it should be possible to measure by experimental thermodynamics) must exist. What is it in this case?
> To a physicist, it is the physical form that is important - the fact
> that it is made of paper, with a bit of glue to hold it together.
> The arrangement of ink on the pages is probably quite unimportant - a
> book of the same size and shape, but with blank pages would do just
> as well. Even if the arrangement of ink is important, then does
> typesetting the book in a different font lead to the same book or a
> different book?
It is a good question, and in my view it again shows that thermodynamic entropy and information are different things, since for the same object we can define the information differently (see also below).
> To compute the thermodynamic information, one could imagine
> performing a massive molecular dynamics simulation, and then count
> the number of states that correspond to the physical book, take the
> logarithm, then subtract that from the logarithm of the total
> possible number of states the molecules could take on (if completely
> disassociated).
Do not forget that molecular dynamics simulation is based on Newton's laws (even quantum-mechanical molecular dynamics). Hence you probably mean the Monte Carlo method here. Yet it is much simpler to employ experimental thermodynamics (see below).
> This is, of course, completely impractical. Computing the complexity
> of something is generally NP-hard. But in principle doable.
>
> Now, how does this relate to the thermodynamic entropy of the book?
> It turns out that the information computed by the in-principle
> process above is equal to the difference between the maximum entropy
> of the molecules making up the book (if completely disassociated) and
> the thermodynamic entropy, which could be measured in a calorimeter.
>
>
>> If yes, can one determine the information in the book just by means
>> of experimental thermodynamics?
>>
>
> One can certainly determine the information of the physical book
> (defined however you might like) - but that is not the same as the
> information of the abstract book.
Let me suggest a very simple case in order to understand better what you are saying. Let us consider the string "10" for simplicity, in the following cases. First I will cite the thermodynamic properties of Ag and Al from the CODATA tables (we will need them):
S° (298.15 K), J K-1 mol-1
Ag cr 42.55 ± 0.20
Al cr 28.30 ± 0.10
In J K-1 cm-3 it will be
Ag cr 42.55/107.87*10.49 = 4.14
Al cr 28.30/26.98*2.7 = 2.83
1) An abstract string "10" as the abstract book above.
2) Let us make now an aluminum plate (a page) with "10" hammered on it
(as on a coin) of the total volume 10 cm^3. The thermodynamic entropy is
then 28.3 J/K.
3) Let us make now a silver plate (a page) with "10" hammered on it (as
on a coin) of the total volume 10 cm^3. The thermodynamic entropy is
then 41.4 J/K.
4) We can easily make another aluminum plate (scaling all dimensions
from 2) to the total volume of 100 cm^3. Then the thermodynamic entropy
is 283 J/K.
Now we have four different ways to represent the string "10", and the thermodynamic entropy is different in each. If we take the statement literally, then the information must be different in all four cases, and uniquely defined, since the thermodynamic entropy is already there. Yet in my view this makes little sense.
Could you please comment on these four cases?
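For concreteness, a sketch that puts the four cases side by side under a literal reading of "entropy = information": the J/K values are the ones computed above, and the conversion to bits is just division by k_B ln 2 (whether such a conversion means anything is, of course, the point in dispute):

# Sketch comparing the four cases above if "entropy = information" is taken
# literally. The thermodynamic entropies (J/K) come from the CODATA-based
# estimates above; the string "10" itself carries at most 2 bits.
import math

k_B = 1.380649e-23          # J/K
bits_per_JK = 1.0 / (k_B * math.log(2))

cases = {
    "1) abstract string '10'": None,     # no thermodynamic entropy defined
    "2) 10 cm^3 Al plate":     28.3,     # J/K
    "3) 10 cm^3 Ag plate":     41.4,     # J/K
    "4) 100 cm^3 Al plate":    283.0,    # J/K
}

for label, S in cases.items():
    if S is None:
        print(f"{label}: ~2 bits (Shannon), no thermodynamic entropy to convert")
    else:
        print(f"{label}: S = {S} J/K  ->  {S * bits_per_JK:.2e} 'bits'")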
>> Evgenii
>>
>> P.S. Why it is impossible to state that a random string is
>> generated by some random generator?
>>
>
> Not sure what you mean, unless you're really asking "Why it is
> impossible to state that a random string is generated by some
> pseudorandom generator?"
>
> In which case the answer is that a pseudorandom generator is an
> algorithm, so by definition doesn't produce random numbers. There is
> a lot of knowledge about how to decide if a particular PRNG is
> sufficiently random for a particular purpose. No PRNG is
> sufficiently random for all purposes - in particular they are very
> poor for security purposes, as they're inherently predictable.
I understand. Yet if we take a finite random string, then presumably there should be some random generator with some seed that produces it. What would be wrong with this?
Evgenii
> Cheers
>
> If you are instead saying that they are inversely proportional then
> I would agree in general - information can be considered negentropy.
> Sorry, I thought you were saying that they are directly proportional
> measures (Brent and Evgenii seem to be talking about it that way). I
I am not an expert in informational entropy. For me it does not matter how it is defined in information theory, whether as entropy or as negentropy. My point is that this has nothing to do with the thermodynamic entropy (see my previous message with the four cases for the string "10").
Evgenii
> In thinking about how to answer this I came across an excellent paper
> by Roman Frigg and Charlotte Werndl
> http://www.romanfrigg.org/writings/EntropyGuide.pdf which explicates
> the relation more comprehensively than I could and which also gives
> some historical background and extensions: specifically look at
> section 4.
>
> Brent
>
Thanks for the link. I will try to work through it to see whether they have an answer to the four cases with the string "10" that I described in my reply to Russell.