
Review of papers by Zurek and Peres on fundamental physics #1


Jack Sarfatti

Feb 18, 1993, 2:47:47 AM
to sci-physic...@uunet.uu.net

[moderator's note: though the author of this posting offered a
continuation of this review, it is suggested
that further review of this fine and
interesting source be pursued individually
at your local library. A quick search on the
internet suggests widespread availability - crb]

Commentaries on "Complexity, Entropy and the Physics of Information",
edited by W.H. Zurek, Santa Fe Institute (Addison-Wesley, ISBN 0-201-51509-
1/QC39.S48 1991)

Let's review Asher Peres' (Technion, Israel) "Thermodynamic Constraints on
Quantum Axioms" and contrast it with Zurek's "Algorithmic Information
Content, Church-Turing Thesis, Physical Entropy, and Maxwell's Demon."

"The second law of thermodynamics imposes severe constraints on the formal
structure of quantum theory. In particular, that law would be violated if
it were possible to distinguish non-orthogonal states or if Schrodinger's
equation were nonlinear."

What about the Hartree-Fock self-consistent field equations for
many-electron atoms: are they "nonlinear" in the sense Peres means?

"Thermodynamics, relativity and quantum theory are the three pillars ...
not branches .... but general frameworks encompassing every aspect of
physics. Thermodynamics ... governs the convertibility of various forms of
energy; relativity theory deals with measurements of space and time; and
quantum theory is a set of rules for computing probabilities of outcomes of
tests (also called "measurements") following specified preparations. ... In
quantum theory, probabilities can be computed for the outcomes of tests
which follow specified preparations, not those which precede them."
"

This last definition of quantum mechanics is debatable because the Wheeler
"delayed choice" experiment suggests preparation from the future. Thus,
Wheeler writes: "Not until we have fixed arrangements at our telescope do
we register tonight's quantum as having passed to the left (or right) of
the (gravitational) lens or by both routes (as in a double slit
experiment)." So that the "preparation" of the light at the gravitational
lens, particle or wave is from the future choice of the telescope
arrangement.

Peres continues:"Each member of this triad involves timeordering as a
primitive concept. In thermodynamics, high-grade ordered energy can
spontaneously degrade into a disordered form of energy, called heat. The
time-reversed process never occurs. .... In relativity, information is
collected from the past light cone and propagates into the future."

Perhaps so, at the classical level, but Feynman's photon propagator has an
advanced component that propagates into the past in quantum
electrodynamics.

"... a detailed investigation of the radiating relativistic gas... shows
that thermal equilibrium can be obtained if, and only if, the spontaneous
decay rate of an excited atom (Einstein's A-coefficient) is reduced in the
exact ratio of the relativistic time dilation due to the motion of that
atom."

Peres' main idea in this paper seems to be "that the von-Neumann-Shannon
entropy is authentic entropy", which is in apparent conflict with Zurek's
paper. Zurek claims that "physical entropy" is not equal to the "Boltzmann-
Gibbs-Shannon entropy" H, the same as the "von-Neumann-Shannon entropy".
H is given by

H = - Tr[p log p]   (logarithm base 2)

where p is the quantum mechanical density matrix. The physical entropy S
is the sum of H with K which is Chaitin's "algorithmic randomness". Zurek
says that H is "remaining ignorance", while K is "randomness in the already
available data". Zurek says "Measurements can only convert (H) uncertainty
(quantified by the statistical entropy) into (K) randomness of the outcome
(given by the algorithmic information content of the data). The ability to
extract useful work is measured by physical entropy, which is equal to the
sum of these two measures of disorder. So defined, physical entropy is, on
the average, constant in the course of measurements carried out by the
observer on an equilibrium system. .... the amount of useful work DW which
can be extracted by a 'demon' (i.e., "information gathering and using
system IGUS") from a system is given by

DW = kBT(DH + DK) = kBTDS"

"D" means "finite change in", kB is kln2 where k is usual Boltzmann's
constant, "algorithmic information content" = "algorithmic entropy" =
"algorithmic complexity" = "algorithmic randomness" in the literature.
They all mean, given a binary bit string s, K(s) is the size in bits |s| of
the shortest program that computes s on a universal computer. For example,
pi and sqrt(2) appear to be random but have a small K. "... assessing a
system's algorithmic complexity is related to Godel's undecidability" Zurek
remarks. He also writes:

"The ability of living organisms to perform measurements and 'profit' by
exploiting their outcomes can be analyzed in .. algorithmic terms...
measurements decrease (H) ignorance about the specific state of the system,
but increase the size (K) of the record necessary to encode the acquired
information."
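Zurek's two ingredients can be sketched numerically. The following Python
fragment is my illustration, not anything from the paper: it computes H for
the maximally mixed qubit, and uses a lossless compressor as a computable
upper bound on K (K itself is uncomputable):

```python
# Illustrative sketch, not from Zurek's paper: the two quantities he adds.
import math
import os
import zlib

def von_neumann_entropy(eigenvalues):
    # H = -Tr[p log2 p]; once the density matrix is diagonalized,
    # this is -sum_i p_i log2 p_i over its eigenvalues.
    return -sum(p * math.log2(p) for p in eigenvalues if p > 0)

# Maximally mixed qubit, p = diag(1/2, 1/2): exactly one bit of ignorance.
H = von_neumann_entropy([0.5, 0.5])
assert H == 1.0

# K(s) is uncomputable, but any lossless compressor gives an upper bound:
# a patterned string has small K, while random bytes barely compress.
patterned = b"01" * 5000
random_bytes = os.urandom(10000)
assert len(zlib.compress(patterned)) < len(zlib.compress(random_bytes))
```

The compression trick only bounds K from above, which is exactly why Zurek
can say that hunting for truly minimal programs is unnecessary in practice.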

According to Zurek's Figures 5(a&b), plotting entropy in bits on the
vertical axis against the number of measurements on the horizontal: if we
make a time series of measurements, accumulating data d, on a quasi-closed
non-living system in thermal equilibrium, the total physical entropy S is a
constant of the irreversible motion (i.e., DS = 0 on the average), so that
the work DW that can be extracted is zero in accord with the second law of
thermodynamics. In other words, the von-Neumann-Shannon entropy H is
decreasing by the same average amount as the algorithmic complexity K
(making the record) is increasing. In contrast, if the sequence of
measurements is made on an open system far-from-thermal-equilibrium (all
living systems have this property but not all such open systems are
living), then DS is no longer zero and net useful work can be extracted by
the IGUS demon. That is, the increase of algorithmic complexity K needed
to make the record of the data is significantly smaller than the decrease
of von-Neumann-Shannon entropy H (in absolute value), so that a net useful
work can be extracted from a time series of measurements on the
nonequilibrium system. Zurek says:

"The second law is safe when formulated in terms of the physical entropy S.
Indeed, physical entropy S has the great advantage of removing the
'illusion' that entropy decreases in the course of a measurement. Of
course, the ignorance (measured by H) does indeed decrease, but only at the
expense of the increase of the minimal record size K. Hence, in
measurements performed on equilibrium ensembles S = H + K is, on the
average, constant. By contrast, measurements performed on far-from-
equilibrium systems can result in a decrease of ignorance which is much
larger than the resulting increase in record size. ... Fortunately, the
Universe that we inhabit is precisely such a nonequilibrium environment: It
pays to measure ... it is not really necessary to look for truly minimal
programs: the possible gain of useful work outweighs even substantial
inefficiencies in the record size optimization ... the ability to recognize
in the measurement outcome the opportunity for such a gain of useful work
is essential for the well-being of IGUS's."
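As a back-of-the-envelope check of DW = kBT(DH + DK), here is a sketch in
which the bit counts are invented for illustration; only the formula and
the constants come from the text above:

```python
# Hypothetical bookkeeping for Zurek's DW = kB*T*(DH + DK); the bit
# counts below are invented, not taken from the paper.
import math

k_boltzmann = 1.380649e-23       # Boltzmann's constant, J/K
kB = k_boltzmann * math.log(2)   # Zurek's kB = k ln 2 (entropy in bits)
T = 300.0                        # reservoir temperature, K

def dW(dH_bits, dK_bits):
    return kB * T * (dH_bits + dK_bits)

# Equilibrium: ignorance H drops by exactly as much as the record K grows,
# so DS = DH + DK = 0 and no net work can be extracted.
W_eq = dW(dH_bits=-10.0, dK_bits=+10.0)
assert W_eq == 0.0

# Far from equilibrium: H drops far more than K grows, so DS != 0 and
# (sign conventions aside) kB*T*|DS| of useful work is on the table.
W_neq = dW(dH_bits=-10.0, dK_bits=+3.0)
assert abs(W_neq) > 0.0
```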

We have met the IGUS's and they are us! It would be interesting to flesh
out Zurek's general framework for measurement in the specific context of the
DNA code and other life mechanisms. But now let's return to Peres.

to be continued

#9
Continuing with our examination of Asher Peres's "Thermodynamic Constraints
on Quantum Axioms" in "Complexity, Entropy and the Physics of Information",
edited by W.H. Zurek, Santa Fe Institute (Addison-Wesley, ISBN 0-201-51509-
1/QC39.S48 1991).

Peres assumes the Gibbs ensemble "namely an infinite set of conceptual
replicas of that object, all identically prepared. Only then can we give a
meaning to the notion of probability." He also assumes "that all
information about the preparation of such an ensemble can be represented by
a Hermitian matrix p, satisfying Tr[p] = 1, called the density matrix." He
further assumes the standard lore that observables A are also Hermitian
operators and that the expectation value over the Gibbs ensemble is
<A> = Tr[Ap]. He then makes an important argument: "These rules have a
remarkable consequence. Given two different preparations represented by
matrices p1 and p2, one can prescribe another preparation p by the
following recipe: Let a random process have probability x to 'succeed' and
probability (1 - x) to 'fail'. In case of success, prepare the quantum
system according to p1. In case of failure, prepare it according to p2.
This process results in a p given by

p = xp1 + (1 - x)p2

.....

<A> = x Tr[Ap1] + (1 - x) Tr[Ap2] = Tr[Ap]

What I find truly amazing in this result is that once p is given, it
contains all the available information and it is impossible to reconstruct
from it p1 and p2! For example, if we prepare a large number of polarized
photons and if we toss a coin to decide, with equal probabilities, whether
the next photon to be prepared will have a vertical or horizontal linear
polarization, or, in a different experimental setup, we likewise randomly
decide whether each photon will have right-handed or left-handed circular
polarization, we get in both cases the same

p  =  [ 1/2   0  ]
      [  0   1/2 ]

... If this were not true, EPR correlations would allow instantaneous
transfer of information to distant observers in violation of relativistic
causality."
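Peres' mixing argument is easy to verify directly. The sketch below (my
own illustration in standard quantum-mechanical conventions, not Peres'
code) builds both 50/50 ensembles and confirms they yield the same p:

```python
# Sketch of Peres' example: two physically different 50/50 preparations
# produce the same density matrix, so no measurement can tell them apart.
import numpy as np

def density(psi):
    # Rank-one density matrix |psi><psi| for a normalized state vector.
    psi = np.asarray(psi, dtype=complex).reshape(-1, 1)
    return psi @ psi.conj().T

s = 1 / np.sqrt(2)
H_lin = [1, 0]            # horizontal linear polarization
V_lin = [0, 1]            # vertical linear polarization
R_circ = [s,  1j * s]     # right-handed circular polarization
L_circ = [s, -1j * s]     # left-handed circular polarization

rho_linear = 0.5 * density(H_lin) + 0.5 * density(V_lin)
rho_circular = 0.5 * density(R_circ) + 0.5 * density(L_circ)

# Both recipes collapse to p = diag(1/2, 1/2); p1 and p2 cannot be
# reconstructed from p.
assert np.allclose(rho_linear, np.eye(2) / 2)
assert np.allclose(rho_circular, np.eye(2) / 2)
```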

Peres then goes on to show that a new theory beyond standard quantum
mechanics that permitted this kind of causality-violating quantum
connection communication would also allow us to beat the second law of
thermodynamics. That is, in such an alternative reality, Maxwell's demon
could do the job. In particular, it would be possible to extract energy
from the zero point fluctuations of the vacuum. Thus, Peres writes (e.g.,
quantum multiplexing in a quantum optical computer):

"Another example would be to prepare photons having, with equal
probabilities, linear vertical polarization or circular right-handed
polarization. An observer requested to guess what was the preparation of a
particular photon, under the best conditions allowed by quantum theory,
would be able to give the answer with certainty in only 29.3% of cases. It
will be shown below that a 'superobserver' who could always give an
unambiguous answer would also be able to extract an infinite amount of work
from an isothermal reservoir."
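The 29.3% figure is 1 - 1/sqrt(2): for unambiguous discrimination of two
equally likely pure states, the optimal success probability is
1 - |<psi1|psi2>| (the Ivanovic-Dieks-Peres bound), and vertical-linear
and right-circular polarization states overlap with magnitude 1/sqrt(2).
A quick check (my arithmetic, not Peres' text):

```python
# Check of Peres' 29.3% figure via the unambiguous-discrimination bound
# P_success = 1 - |<psi1|psi2>| for two equiprobable pure states.
import math

V = [0, 1]                                  # vertical linear polarization
R = [1 / math.sqrt(2), 1j / math.sqrt(2)]   # right circular polarization

# Inner product <V|R> and its magnitude.
overlap = abs(sum(complex(a).conjugate() * b for a, b in zip(V, R)))
p_success = 1 - overlap

assert abs(overlap - 1 / math.sqrt(2)) < 1e-12
assert abs(p_success - 0.293) < 0.001   # Peres' 29.3%
```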

Peres then fixes Einstein's argument for the entropy of a quantum ensemble
using an interaction that "allows us to distinguish different eigenvalues
of Hermitian operators, which correspond to orthogonal states of the
quantum system." He then adds: "Let us suppose now that some other type of
interaction would allow us to distinguish non-orthogonal states." What
about Glauber coherent states and squeezed states? "This would have
momentous consequences; for example, a certain type of EPR correlation
could be used to transfer information instantaneously. I shall now show
that this could also be used to convert into work an unlimited amount of
heat extracted from an isothermal reservoir."

to be continued.
