Fwd: Entropy and Artificial Life


Evgenii Rudnyi

Mar 17, 2010, 5:04:54 PM
to thermody...@googlegroups.com
Recently I sent the message below to the biotaconv list

http://lists.ccon.org/cgi-bin/mailman/listinfo/biotaconv

and there was some discussion of the topic there. If anyone here is
interested in the topic, we could discuss it here as well.

Evgenii

-------- Original Message --------
Subject: [bC] Entropy and Artificial Life
Date: Sun, 07 Mar 2010 19:17:49 +0100
From: Evgenii Rudnyi <use...@rudnyi.ru>
Reply-To: biot...@lists.ccon.org
To: biot...@lists.ccon.org

I am reading Christoph Adami's book "Introduction to Artificial Life"
and I am a bit confused by his definition of entropy. It seems that the
notion he uses is quite different from the one defined in ordinary
thermodynamics.

I will make a few citations from the book

p. 94 "Entropy is a measure of the disorder present in a system, or
alternatively, a measure of our lack of knowledge about this system."

p. 96 "If an observer gains knowledge about the system and thus
determines that a number of states that were previously deemed probable
are in fact unlikely, the entropy of the system (which now has turned
into a conditional entropy) is lowered, simply because the number of
different possible states is now lower. (Note that such a change in
uncertainty is usually due to a measurement.)"
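This reduction of entropy by measurement is easy to illustrate numerically. The sketch below is not from the book; it just computes the Shannon entropy (in bits) of a uniform distribution before and after a hypothetical measurement rules out most of the states:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum p_i * log2(p_i)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Eight equally likely states: H = log2(8) = 3 bits.
prior = [1 / 8] * 8
h_prior = shannon_entropy(prior)

# A measurement rules out six of the states; the two that remain
# are still equally likely, so the conditional entropy is 1 bit.
posterior = [1 / 2] * 2
h_posterior = shannon_entropy(posterior)

print(h_prior, h_posterior)  # 3.0 1.0
```

The entropy drops from 3 bits to 1 bit purely because the observer's uncertainty decreased; nothing physical happened to the system itself, which is exactly the point of contention in the message.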

p. 97 "Clearly, the entropy can also depend on what we consider
"different". For example, one may count states as different that differ
by, at most, del_x in some observable x (for example, the color of a
ball drawn from an ensemble of differently shaded balls in an urn). Such
entropies are then called fine-grained (if del_x is small), or
coarse-grained (if del_x is large) entropies."
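The dependence on del_x can also be made concrete. The following sketch (my own illustration, not Adami's) bins the same set of "shades" with a fine and a coarse resolution and shows that the resulting entropy differs:

```python
import math
from collections import Counter

def binned_entropy(samples, delta_x):
    """Shannon entropy (bits) after grouping values into bins of width delta_x."""
    counts = Counter(int(x // delta_x) for x in samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Sixteen evenly spaced "shades" between 0 and 1, as in the urn example.
shades = [i / 16 for i in range(16)]

fine = binned_entropy(shades, 1 / 16)   # every shade distinct: 4 bits
coarse = binned_entropy(shades, 1 / 4)  # four groups of four shades: 2 bits
print(fine, coarse)  # 4.0 2.0
```

The same physical ensemble yields 4 bits or 2 bits depending only on what the observer counts as "different", which is the fine- versus coarse-grained distinction in the quote.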

Let us compare these statements with what is available in thermodynamic
tables. For example, one can find the entropies of many substances in
the NIST Chemistry WebBook

http://webbook.nist.gov/chemistry/

or short list from CODATA

http://www.codata.org/resources/databases/key1.html

As one can see, the entropies of many substances have been measured and
tabulated. Entropy is actually a function of temperature and pressure
(and, for a solution, also of composition). In any case, it is hard to
imagine that the values in the thermodynamic tables depend on an
observer, that is, that by some magic one could change them simply by
declaring them fine-grained or coarse-grained.

Now to disorder. I do not know how to define it, yet entropy as such is
a well-defined physical property. The values in the thermodynamic
tables have simply been measured (without any reference to disorder).
Let me give a small example. Silicon and diamond have the same crystal
structure. However, their entropies at 298.15 K and 1 bar are quite
different: 2.4 J/(K mol) for diamond and 18.8 J/(K mol) for silicon.
One may want to convert these values per gram (divide by the molar
masses, 12.01 and 28.08 g/mol respectively), but the entropy of silicon
will still be higher. Can anyone explain to me why silicon has more
disorder than diamond in this case?
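A quick check of the per-gram conversion, using only the values quoted above:

```python
# Standard molar entropies at 298.15 K, 1 bar (values quoted above).
s_diamond_molar = 2.4    # J/(K mol), carbon (diamond), M = 12.01 g/mol
s_silicon_molar = 18.8   # J/(K mol), silicon, M = 28.08 g/mol

s_diamond_per_g = s_diamond_molar / 12.01   # about 0.20 J/(K g)
s_silicon_per_g = s_silicon_molar / 28.08   # about 0.67 J/(K g)

print(round(s_diamond_per_g, 2), round(s_silicon_per_g, 2))  # 0.2 0.67
```

So on a per-gram basis silicon's entropy is still more than three times that of diamond, and the question stands either way.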

I would appreciate any comments.

Best wishes,

Evgenii

_______________________________________________
BiotaConv mailing list
Biot...@lists.ccon.org
http://lists.ccon.org/cgi-bin/mailman/listinfo/biotaconv
