Entropy: A Guide for the Perplexed


Evgenii Rudnyi

Feb 5, 2012, 11:16:18 AM
to everyth...@googlegroups.com
On 24.01.2012 22:56 meekerdb said the following:

> In thinking about how to answer this I came across an excellent paper
> by Roman Frigg and Charlotte Werndl
> http://www.romanfrigg.org/writings/EntropyGuide.pdf which explicates
> the relation more comprehensively than I could and which also gives
> some historical background and extensions: specifically look at
> section 4.
>
> Brent

Brent,

I have started reading the pdf. A few comments to section 2 Entropy in
thermodynamics.

The authors seem to be sloppy.

1) p. 2 (116). "If we consider a cyclical process—a process in which the
beginning and the end state are the same — a reversible process leaves
the system and its surroundings unchanged."

This is wrong: if one runs the Carnot cycle reversibly, the heat will
be converted to work (or vice versa) and there will be changes in the
surroundings. They probably mean that if one runs the Carnot cycle
reversibly twice, first in one direction and then in the opposite
direction, the surroundings will be unchanged.

2) p. 2(116). "We can then assign an absolute entropy value to every
state of the system by choosing one particular state A (we can choose
any state we please!) as the reference point."

They misuse the conventional terminology. The absolute entropy is
defined by the Third Law; they just want to employ S instead of Del S.
This is rather dangerous: when one changes the working body in the
Carnot cycle, such a notation will lead to a catastrophe.

3) p.3(117). "If we now restrict attention to adiathermal processes
(i.e. ones in which temperature is constant),"

According to Eq. 4, which they discuss, they mean an adiabatic process
where temperature is not constant.

However, at the end of this small section they write

p. 3(117). "S_TD has no intuitive interpretation as a measure of
disorder, disorganization, or randomness (as is often claimed). In fact
such considerations have no place in TD."

I completely agree with that, so I am going to read further.

Evgenii

Evgenii Rudnyi

Feb 5, 2012, 1:28:47 PM
to everyth...@googlegroups.com
On 05.02.2012 17:16 Evgenii Rudnyi said the following:

> On 24.01.2012 22:56 meekerdb said the following:
>
>> In thinking about how to answer this I came across an excellent
>> paper by Roman Frigg and Charlotte Werndl
>> http://www.romanfrigg.org/writings/EntropyGuide.pdf which
>> explicates the relation more comprehensively than I could and which
>> also gives some historical background and extensions: specifically
>> look at section 4.
>>
>> Brent

I have browsed the paper. I should say that I am not impressed. The
logic is exactly the same as in other papers and books.

I have nothing against the Shannon entropy (Section 3 in the paper).
Shannon can use the term entropy (why not?), but then we should just
distinguish between the informational entropy and the thermodynamic
entropy, as they have been introduced for completely different problems.

The argument that both entropies are the same is in Section 4, and it
is expressed bluntly as:

p. 13 (127) "Then, if we regard the w_i as messages, S_B,m(M_i) is
equivalent to the Shannon entropy up to the multiplicative constant nk
and the additive constant C."

p. 15 (129) "The most straightforward connection is between the Gibbs
entropy and the continuous Shannon entropy, which differ only by the
multiplicative constant k."

Personally I find this clumsy. In my view, the same mathematical
structure of the equations does not imply that the phenomena are
related. For example, the Poisson equation for electrostatics is
mathematically equivalent to the stationary heat conduction equation.
So what? Well, one creative use is for people who have a thermal FEM
solver and do not have an electrostatic solver: they can solve an
electrostatic problem with the thermal FEM solver by means of the
mathematical analogy. This does happen, but I doubt that we could state
that stationary heat conduction is equivalent to electrostatics.

The funniest part is in the conclusion:

p. 28(142) "First, all notions of entropy discussed in this essay,
except the thermodynamic and the topological entropy, can be understood
as variants of some information-theoretic notion of entropy."

I understand it this way. When I am working with a gas, a liquid or a
solid at the level of experimental thermodynamics, the information,
according to the authors, is not there (at this point I am in agreement
with them). Yet, as soon as theoretical physicists start thinking about
these objects, the objects turn out to be filled with information.

Evgenii

Russell Standish

Feb 5, 2012, 4:33:00 PM
to everyth...@googlegroups.com
On Sun, Feb 05, 2012 at 07:28:47PM +0100, Evgenii Rudnyi wrote:
> The funniest part is in the conclusion:
>
> p. 28(142) "First, all notions of entropy discussed in this essay,
> except the thermodynamic and the topological entropy, can be
> understood as variants of some information-theoretic notion of
> entropy."
>
> I understand it this way. When I am working with a gas, a liquid or a
> solid at the level of experimental thermodynamics, the information,
> according to the authors, is not there (at this point I am in
> agreement with them). Yet, as soon as theoretical physicists start
> thinking about these objects, the objects turn out to be filled with
> information.
>
> Evgenii

Would you say that thermodynamic entropy is the same as the
Boltzmann-Gibbs formulation? If not, then why not? You will certainly
be arguing against orthodoxy, but you're welcome to try.

If you agree that it is the same, then surely you can see that
information and entropy are related - they are both the logarithm of a
probability - in the case of Boltzmann it is the logarithm of the
number of possible microstates multiplied by the probability of the
thermodynamic state.
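
As a minimal numerical sketch of that relation (the value of W below is
made up, purely for illustration): for a uniform distribution over W
equally likely microstates, the Shannon entropy in nats is ln W, which
is the Boltzmann entropy S = k ln W divided by the constant k.

import math

def shannon_entropy_nats(probs):
    # H = -sum_i p_i ln p_i, skipping zero-probability entries
    return -sum(p * math.log(p) for p in probs if p > 0)

W = 1000                     # number of accessible microstates (illustrative)
uniform = [1.0 / W] * W      # equal a priori probabilities
k_B = 1.380649e-23           # Boltzmann constant, J/K

H = shannon_entropy_nats(uniform)    # ln(1000), about 6.908 nats
S = k_B * math.log(W)                # Boltzmann entropy for the same count
print(H, S / k_B)                    # the two numbers coincide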

Are you aware of the result relating the Kolmogorov "program length"
complexity measure to the logarithm of the probability of that program
appearing in the universal prior?

Both are examples of information, but measured in different contexts.

I will comment on the entropy context of the JANAF tables in another
post. Essentially you are asserting that the context of those tables
is the only context under which thermodynamic entropy makes sense. All
other contexts for which there is an entropy-like quantity do not
count, and those measures should be called something else. A variety
of information, or complexity perhaps.

Alternatively, we could recognise the modern understanding that these
terms are all essentially equivalent, but that they refer to a family
of measures that vary depending on the context.

It comes down to a terminological argument, sure, but your insistence
that thermodynamic entropy is a special case strikes me as a baroque
means of hiding the thermodynamic context - one that doesn't engender
understanding of the topic.

--

----------------------------------------------------------------------------
Prof Russell Standish Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics hpc...@hpcoders.com.au
University of New South Wales http://www.hpcoders.com.au
----------------------------------------------------------------------------

Jason Resch

Feb 6, 2012, 11:44:06 AM
to everyth...@googlegroups.com
I think entropy is better intuitively understood as uncertainty. The
entropy of a gas is the uncertainty of the particle positions and
velocities. The hotter it is, the more uncertainty there is. A certain
amount of information is required to eliminate this uncertainty.

Jason

>> 1) p. 2 (116). "If we consider a cyclical process—a process in which
>> the beginning and the end state are the same — a reversible process
>> leaves the system and its surroundings unchanged."
>>
>> This is wrong: if one runs the Carnot cycle reversibly, the heat will
>> be converted to work (or vice versa) and there will be changes in the
>> surroundings. They probably mean that if one runs the Carnot cycle
>> reversibly twice, first in one direction and then in the opposite
>> direction, the surroundings will be unchanged.
>>
>> 2) p. 2(116). "We can then assign an absolute entropy value to every
>> state of the system by choosing one particular state A (we can
>> choose any state we please!) as the reference point."
>>
>> They misuse the conventional terminology. The absolute entropy is
>> defined by the Third Law; they just want to employ S instead of Del
>> S. This is rather dangerous: when one changes the working body in
>> the Carnot cycle, such a notation will lead to a catastrophe.
>>
>> 3) p.3(117). "If we now restrict attention to adiathermal processes
>> (i.e. ones in which temperature is constant),"
>>
>> According to Eq. 4, which they discuss, they mean an adiabatic
>> process where temperature is not constant.
>>
>> However, at the end of this small section they write
>>
>> p. 3(117). "S_TD has no intuitive interpretation as a measure of
>> disorder, disorganization, or randomness (as is often claimed). In
>> fact such considerations have no place in TD."
>>
>> I completely agree with that, so I am going to read further.
>>
>> Evgenii
>>
>


Evgenii Rudnyi

Feb 6, 2012, 2:18:55 PM
to everyth...@googlegroups.com
On 05.02.2012 22:33 Russell Standish said the following:

> On Sun, Feb 05, 2012 at 07:28:47PM +0100, Evgenii Rudnyi wrote:
>> The funniest part is in the conclusion:
>>
>> p. 28(142) "First, all notions of entropy discussed in this essay,
>> except the thermodynamic and the topological entropy, can be
>> understood as variants of some information-theoretic notion of
>> entropy."
>>
>> I understand it this way. When I am working with a gas, a liquid or
>> a solid at the level of experimental thermodynamics, the information,
>> according to the authors, is not there (at this point I am in
>> agreement with them). Yet, as soon as theoretical physicists start
>> thinking about these objects, the objects turn out to be filled with
>> information.
>>
>> Evgenii
>
> Would you say that thermodynamic entropy is the same as the
> Boltzmann-Gibbs formulation? If not, then why not? You will
> certainly be arguing against orthodoxy, but you're welcome to try.

There is some difference between the entropy in classical and
statistical thermodynamics. I will copy my old text to describe it.

In order to explain this, let us consider a simple experiment. We bring
a glass of hot water into the room and leave it there. Eventually the
temperature of the water will be equal to the ambient temperature. In
classical thermodynamics, this process is considered irreversible, that
is, the Second Law forbids the water in the glass from becoming hot
again spontaneously. This is in complete agreement with our experience,
so one would expect the same from statistical mechanics. However, there
the entropy has a statistical meaning, and there is a nonzero chance
that the water will become hot again. Moreover, there is a theorem
(Poincaré recurrence) that states that if we wait long enough, the
water in the glass must become hot again.

Otherwise, they are the same. This does not mean, however, that
information comes into play in the Boltzmann-Gibbs formulation. You
have missed my comment on this, hence I will repeat it.

On 05.02.2012 19:28 Evgenii Rudnyi said the following:
...

In my opinion, the similarity of mathematical equations does not mean
that the phenomena are the same. Basically, it is about definitions. If
you define information through the Shannon entropy, that is okay. You
have, however, to prove that the Shannon entropy is the same as the
thermodynamic entropy. In this respect, the similarity of the equations
is, in my view, a weak argument.

Do you have anything else to support the claim that the thermodynamic
entropy is information, other than that the two equations are similar
to each other?

Evgenii

Evgenii Rudnyi

Feb 6, 2012, 2:46:14 PM
to everyth...@googlegroups.com
On 06.02.2012 17:44 Jason Resch said the following:

> I think entropy is better intuitively understood as uncertainty. The
> entropy of a gas is the uncertainty of the particle positions and
> velocities. The hotter it is, the more uncertainty there is. A
> certain amount of information is required to eliminate this
> uncertainty.
>
> Jason

Could you please show how your definition of entropy could be employed
to build, for example, the following phase diagram:

http://www.calphad.com/graphs/Metastable%20Fe-C%20Phase%20Diagram.gif

If you find such a question too complicated, please consider the
textbook-level problem below and show how you would solve it using
uncertainties.

Evgenii

----------------------------------------
Problem. Given temperature, pressure, and initial number of moles of
NH3, N2 and H2, compute the equilibrium composition.

To solve the problem, one should find the thermodynamic properties of
NH3, N2 and H2, for example in the JANAF Tables, and then compute the
equilibrium constant.

From the thermodynamic tables (all values are molar values for the
standard pressure of 1 bar; I have omitted the standard-state symbol o
for simplicity, but it is very important not to forget it):

Del_f_H_298(NH3), S_298(NH3), Cp(NH3), Del_f_H_298(N2), S_298(N2),
Cp(N2), Del_f_H_298(H2), S_298(H2), Cp(H2)

2NH3 = N2 + 3H2

Del_H_r_298 = Del_f_H_298(N2) + 3 Del_f_H_298(H2) - 2 Del_f_H_298(NH3)

Del_S_r_298 = S_298(N2) + 3 S_298(H2) - 2 S_298(NH3)

Del_Cp_r = Cp(N2) + 3 Cp(H2) - 2 Cp(NH3)

To make life simple, I will assume below that Del_Cp_r = 0, but it is
not a big deal to extend the equations to include heat capacities as well.

Del_G_r_T = Del_H_r_298 - T Del_S_r_298

Del_G_r_T = - R T ln Kp

When Kp, the total pressure and the initial number of moles are given,
it is rather straightforward to compute the equilibrium composition. If
you need help, please just let me know.
----------------------------------------
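
For concreteness, here is a minimal Python sketch of this calculation.
The Del_f_H_298 and S_298 numbers below are rough placeholders rather
than JANAF values, Del_Cp_r is taken as zero as above, and the mixture
is assumed to start on the NH3 side of the reaction.

import math
from scipy.optimize import brentq

R = 8.314  # J/(mol K)

# Placeholder standard-state data (J/mol and J/(mol K)); real values
# should be taken from the JANAF Tables.
Del_f_H_298 = {'NH3': -45900.0, 'N2': 0.0, 'H2': 0.0}
S_298       = {'NH3': 192.8,    'N2': 191.6, 'H2': 130.7}

def equilibrium_composition(T, P, n0):
    # Reaction 2 NH3 = N2 + 3 H2; T in K, P in bar (standard pressure
    # 1 bar), n0 = dict of initial moles.
    Del_H_r = Del_f_H_298['N2'] + 3*Del_f_H_298['H2'] - 2*Del_f_H_298['NH3']
    Del_S_r = S_298['N2'] + 3*S_298['H2'] - 2*S_298['NH3']
    Del_G_r = Del_H_r - T*Del_S_r          # Del_Cp_r assumed to be zero
    Kp = math.exp(-Del_G_r / (R*T))

    def residual(x):                       # x = extent of reaction, mol
        n_NH3 = n0['NH3'] - 2*x
        n_N2  = n0['N2'] + x
        n_H2  = n0['H2'] + 3*x
        n_tot = n_NH3 + n_N2 + n_H2
        p_NH3, p_N2, p_H2 = (n/n_tot*P for n in (n_NH3, n_N2, n_H2))
        return p_N2 * p_H2**3 / p_NH3**2 - Kp

    x_max = n0['NH3'] / 2.0                # NH3 cannot become negative
    x = brentq(residual, 1e-12, x_max*(1.0 - 1e-9))
    return {'NH3': n0['NH3'] - 2*x, 'N2': n0['N2'] + x, 'H2': n0['H2'] + 3*x}

print(equilibrium_composition(T=700.0, P=1.0,
                              n0={'NH3': 1.0, 'N2': 0.0, 'H2': 0.0}))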

meekerdb

Feb 6, 2012, 3:10:37 PM
to everyth...@googlegroups.com

It is not based just on the similarity of equations. The equations are similar because
they come from the same concepts. The Shannon information measure of the amount of phase
space available to a system, given the value of some macro variables like temperature,
pressure,... is proportional to the statistical mechanical entropy of the system. There
are idealizations in the analysis, both on the thermodynamic and on the statistical
mechanics side. Among the idealizations is the neglect of bulk shapes (e.g. the text
stamped on a coin) and collective motions (e.g. acoustic waves).

Brent

Evgenii Rudnyi

Feb 7, 2012, 2:42:54 PM
to everyth...@googlegroups.com
On 06.02.2012 21:10 meekerdb said the following:

Brent,

I would suggest looking briefly at the history.

Statistical thermodynamics was developed by Boltzmann and Gibbs, and at
that time there was no information in it. This lasted for quite a
while, and many famous physicists found no information in statistical
mechanics.

The information entropy started with Shannon's work, where he writes:

"The form of H will be recognized as that of entropy as defined in
certain formulations of statistical mechanics8 where pi is the
probability of a system being in cell i of its phase space. H is then,
for example, the H in Boltzmann s famous H theorem."

Yet, he just shows that the equation is similar, but he does not make a
statement about the meaning of such a similarity; that is, he does not
identify his entropy as the thermodynamic entropy. He just uses the
term, nothing more. Now we have two similar equations describing two
different phenomena, and this state of affairs again lasted for a while.

Now let me quote from Edwin T. Jaynes's first paper:

p. 622 after eq (2-3) (this is the Shannon equation) "Since this is just
the expression for entropy as found in statistical mechanics, it will be
called the entropy of the probability distribution p_i; henceforth we
will consider the terms "entropy" and "uncertainty" as synonymous."

This is exactly the logic that I have mentioned above and that is
expressed in the paper you gave me: as the two equations are the same,
they describe the same phenomenon. In my view, this is clumsy, and I
have given an example in which the same mathematical equation describes
two different physical phenomena.

If you talk about the same concept, let me ask you the following. The
only example of entropy used by engineers in informatics has been given
by Jason, and I will quote him below. Could you please tell me the
thermodynamic entropy of what is discussed in his example?

Evgenii


On 03.02.2012 00:14 Jason Resch said the following:
...
> Evgenii,
>
> Sure, I could give a few examples as this somewhat intersects with my
> line of work.
>
> The NIST 800-90 recommendation (
> http://csrc.nist.gov/publications/nistpubs/800-90A/SP800-90A.pdf )
> for random number generators is a document for engineers implementing
> secure pseudo-random number generators. An example of where it is
> important is when considering entropy sources for seeding a random
> number generator. If you use something completely random, like a
> fair coin toss, each toss provides 1 bit of entropy. The formula is
> -log2(predictability). With a coin flip, you have at best a .5
> chance of correctly guessing it, and -log2(.5) = 1. If you used a
> die roll, then each die roll would provide -log2(1/6) = 2.58 bits of
> entropy. The ability to measure unpredictability is necessary to
> ensure, for example, that a cryptographic key is at least as
> difficult to predict the random inputs that went into generating it
> as it would be to brute force the key.
>
> In addition to security, entropy is also an important concept in the
> field of data compression. The amount of entropy in a given bit
> string represents the theoretical minimum number of bits it takes to
> represent the information. If 100 bits contain 100 bits of entropy,
> then there is no compression algorithm that can represent those 100
> bits with fewer than 100 bits. However, if a 100 bit string contains
> only 50 bits of entropy, you could compress it to 50 bits. For
> example, let's say you had 100 coin flips from an unfair coin. This
> unfair coin comes up heads 90% of the time. Each flip represents
> -log2(.9) = 0.152 bits of entropy. Thus, a sequence of 100 coin
> flips with this biased coin could be represented with 16 bits. There
> are only 15.2 bits of information / entropy contained in that 100-bit
> long sequence.
>
> Jason
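
As a quick check of the -log2 arithmetic in the quoted example (a small
Python sketch, illustrative only):

import math

def surprisal_bits(p):
    # Information content, in bits, of an outcome with probability p
    return -math.log2(p)

print(surprisal_bits(0.5))       # fair coin toss: 1.0 bit
print(surprisal_bits(1.0/6.0))   # one face of a fair die: about 2.585 bits
print(surprisal_bits(0.9))       # heads from the 90% biased coin: about 0.152 bits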
