A bit of a miscellany this week....
1) Algorithms for quantum computation: discrete log and factoring,
extended abstract by Peter Shor.
There has been a bit of a stir about this paper; since I know Peter
Shor's sister I was able to get a copy and see what it was really all
about.
Quantum computers are so far just a theoretical possibility. It's
easiest to see why machines that take advantage of quantum theory might
be efficient at computation if we think in terms of path integrals. In
Feynman's path-integral approach to quantum theory, the probability of
getting from state A at time zero to state B at some later time is obtained
by integrating the exponential of the action
exp(iS/hbar)
over *all* paths from A to B, and then taking the absolute value
squared. (Here we are thinking of states A and B that correspond to
points in the classical configuration space.) We can think of the
quantum system as proceeding along all paths simultaneously; it is the
constructive or destructive interference between paths due to the phases
exp(iS/hbar) that makes certain final outcomes B more likely than
others. In many situations, there is massive destructive interference
except among paths very close to those which are critical points of the
action S; the latter are the *classical* paths. So in a sense, a
classical device functions as it does by executing all possible motions;
motions far from those satisfying Newton's laws simply cancel out by
destructive interference! (There are many other ways of thinking about
quantum theory; this one can be difficult to make mathematically
rigorous, but it's often very handy.)
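A toy numerical illustration of this stationary-phase cancellation (my own sketch in modern Python, not from any of the papers below): sum the phases exp(i*lam*S(x)) over a one-parameter family of "paths" labeled by x, and compare the contribution near the critical point of S with an equal-width window far from it.

```python
import numpy as np

# Toy stationary-phase demo: sum the phase factors exp(i*lam*S(x)) over a
# one-parameter family of "paths" labeled by x, with action S(x) = x**2.
# Near the critical point of S (x = 0, the "classical path") the phase
# varies slowly and contributions add; far away they oscillate and cancel.
lam = 200.0                                # plays the role of 1/hbar
x = np.linspace(-2.0, 2.0, 40001)
dx = x[1] - x[0]
phases = np.exp(1j * lam * x**2)

near = abs(np.sum(phases[np.abs(x) < 0.2]) * dx)        # window at the critical point
far = abs(np.sum(phases[(x > 1.0) & (x < 1.4)]) * dx)   # same-width window far away

print(near, far)
```

The "near" window dominates by more than an order of magnitude, which is the sense in which only near-classical paths survive the interference.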
This raises the idea of building a computer that would take advantage of
quantum theory by trying out all sorts of paths, but making sure that
paths that give the wrong answer cancel out! It seems that Feynman was
the first to seriously consider quantum computation:
2) Simulating physics with computers, by Richard Feynman,
International Journal of Theoretical Physics, Vol. 21, nos. 6/7,
pp. 467--488 (1982).
but by now quite a bit of work has been done on the subject, e.g.:
3) Quantum mechanical Hamiltonian models of Turing machines, by P.
Benioff, J. Stat. Phys., Vol. 29, pp. 515--546 (1982).
Quantum theory, the Church--Turing principle and the universal quantum
computer, by D. Deutsch, Proc. R. Soc. Lond., Vol. A400, pp. 96--117
(1985).
Quantum computational networks, by D. Deutsch, Proc. R. Soc. Lond.,
Vol. A425, pp. 73--90 (1989).
Rapid solution of problems by quantum computation, by D. Deutsch
and R. Jozsa, Proc. R. Soc. Lond., Vol. A439, pp. 553--558 (1992).
Quantum complexity theory, by E. Bernstein and U. Vazirani,
Proc. 25th ACM Symp. on Theory of Computing, pp. 11--20
(1993).
The quantum challenge to structural complexity theory, by A. Berthiaume
and G. Brassard, Proc. 7th IEEE Conference on Structure in Complexity
Theory (1992).
Quantum circuit complexity, by A. Yao, Proc. 34th IEEE Symp. on
Foundations of Computer Science (1993).
Thanks to this work, there are now mathematical definitions of quantum
Turing machines and the class "BQP" of problems that can be solved in
polynomial time with a bounded probability of error. This allows a
mathematical investigation of whether quantum computers can, in
principle, do things more efficiently than classical ones. Shor shows
that factoring integers is in BQP. I won't try to describe how, as it's
a bit technical and I haven't really comprehended it. Instead, I'd like
to say a couple of things about the *practicality* of building quantum
computers, since people seem quite puzzled about this issue.
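Though the quantum part is the hard part, the classical skeleton of Shor's method is standard number theory: find the multiplicative order r of a random a mod N; if r is even and a^(r/2) is not -1 mod N, then gcd(a^(r/2) - 1, N) is a nontrivial factor. Here is a sketch (mine, not Shor's actual circuit) with the order found by brute force, which is exactly the step the quantum computer is supposed to do in polynomial time:

```python
from math import gcd

def order(a, n):
    # Multiplicative order of a mod n, by brute force.  This is the step
    # Shor's algorithm performs in quantum polynomial time.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(n, a):
    # Classical wrapper: turn the order of a mod n into a factor of n.
    g = gcd(a, n)
    if g != 1:
        return g                    # lucky guess: a already shares a factor
    r = order(a, n)
    if r % 2 == 1:
        return None                 # bad a: odd order; pick another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None                 # bad a: a^(r/2) = -1 (mod n)
    return gcd(y - 1, n)            # a nontrivial factor of n

print(shor_classical_part(15, 7))   # 3
print(shor_classical_part(21, 2))   # 7
```

For a random a a constant fraction of choices avoid the two bad cases, which is why the reduction works with bounded error.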
There are, as I see it, two basic problems with building quantum
computers. First, it seems that the components must be carefully
shielded from unwanted interactions with the outside world, since such
interactions cause "decoherence": superpositions of computer states
evolve into superpositions of the larger system consisting of the
computer together with whatever it interacts with, and from the
point of view of the computer alone these are the same as mixed states.
This tends to ruin the interference effects upon which the advantages of
quantum computation are based.
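One can see this in miniature with density matrices (a toy sketch of mine, not from Shor's paper): a qubit in a superposition has off-diagonal matrix elements, which carry the interference; after the qubit entangles with an orthogonal "environment", tracing the environment out leaves a mixed state with those off-diagonals gone.

```python
import numpy as np

# A qubit in the superposition (|0> + |1>)/sqrt(2).  Its density matrix has
# off-diagonal entries of 1/2; these carry the interference.
plus = np.array([1.0, 1.0]) / np.sqrt(2.0)
rho_pure = np.outer(plus, plus)

# Now let the qubit interact with an environment that records its state,
# giving the entangled state (|0>|e0> + |1>|e1>)/sqrt(2) with <e0|e1> = 0.
joint = (np.kron([1.0, 0.0], [1.0, 0.0])
         + np.kron([0.0, 1.0], [0.0, 1.0])) / np.sqrt(2.0)
rho_joint = np.outer(joint, joint)

# From the qubit's point of view alone (trace out the environment) the
# off-diagonals vanish: a mixed state, with no interference left.
rho_reduced = rho_joint.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

print(rho_pure[0, 1], rho_reduced[0, 1])   # off-diagonal: 1/2, then 0
```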
Second, it seems difficult to incorporate error-correction mechanisms in
a quantum computer. Without such mechanisms, slight deviations of the
Hamiltonian of the computer from the design specifications will cause
the computation to drift away from what was intended to occur. Luckily,
it appears that this drift is only *linear* rather than *exponential* as
a function of time. (This impression is based on some simplifications
that might be oversimplifications, so anyone who wants to build a
quantum computer had better ponder this issue carefully.) Linear
increase of error with time sets an upper bound on how complicated a
computation one could do before the answer is junk, but if the rate of
error increase was made low enough, this might be acceptable.
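The simplest possible model of this drift (my own toy example, not from the paper): a Hamiltonian mis-calibrated by a factor (1 + eps) produces a phase error that grows linearly in t, until it eventually saturates.

```python
import numpy as np

# The intended evolution multiplies an amplitude by exp(-i*E*t); a device
# whose Hamiltonian is off by a factor (1 + eps) applies exp(-i*(1+eps)*E*t).
# The deviation |actual - intended| = 2|sin(eps*E*t/2)|, which grows
# linearly (~ eps*E*t) for times small compared with 1/(eps*E).
eps = 1e-3
E = 1.0
t = np.linspace(0.0, 50.0, 6)

intended = np.exp(-1j * E * t)
actual = np.exp(-1j * (1.0 + eps) * E * t)
drift = np.abs(actual - intended)

print(drift)           # very nearly eps * E * t at these times
print(eps * E * t)
```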
Certainly as time goes by and computer technology becomes ever more
miniaturized, hardware designers will have to pay ever more attention to
quantum effects, for good or for ill! (Vaughan Pratt estimates that
quantum effects will be a serious concern by 2020.) The question is
just whether they are only a nuisance, or whether they can possibly be
harnessed. Some designs for quantum computers have already been
proposed (sorry, I have no reference for these), and seeing whether they
are workable should be a very interesting engineering problem, even if
they are not good enough to outdo ordinary computers.
4) The Chern-Simons invariant as the natural time variable for classical
and quantum cosmology, by Lee Smolin and Chopin Soo, 16 pages in LaTeX
format, available as gr-qc/9405015.
Let me just quote the abstract on this one:
We propose that the Chern-Simons invariant of the Ashtekar-Sen
connection (or its imaginary part in the Lorentzian case) is the natural
internal time coordinate for classical and quantum cosmology. The
reasons for this are: 1) It is a function on the gauge and
diffeomorphism invariant configuration space, whose gradient is
orthogonal to the two physical degrees of freedom, in the metric defined
by the Ashtekar formulation of general relativity. 2) The imaginary
part of the Chern-Simons form reduces in the limit of small cosmological
constant, Lambda, and solutions close to DeSitter spacetime, to the York
extrinsic time coordinate. 3) Small matter-field excitations of the
Chern-Simons state satisfy, by virtue of the quantum constraints, a
functional Schroedinger equation in which the matter fields evolve on a
DeSitter background in the Chern-Simons time. We then propose this is
the natural vacuum state of the theory for nonzero Lambda. 4) This time
coordinate is periodic on the Euclidean configuration space, due to the
large gauge transformations, which means that physical expectation
values for all states in non-perturbative quantum gravity will satisfy
the KMS condition, and may then be interpreted as thermal states.
Finally, forms for the physical hamiltonian and inner product are
suggested and a new action principle for general relativity, as a
geodesic principle on the connection superspace, is found.
5) Symplectic geometry, a series of lectures by Mikhail Gromov, compiled
by Richard Brown, edited by Robert Miner (le...@math.umd.edu).
Symplectic geometry is the geometry of classical phase spaces. That is,
it's the geometry of spaces on which one can take Poisson brackets of
functions in a manner given locally by the usual formulas. Gromov has
really revolutionized the subject, and these lectures look like a good
place to begin learning what is going on. There is also an appendix on
contact geometry (another aspect of classical physics) based on a
lecture by Eliashberg.
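For readers who haven't met Poisson brackets: on the phase plane with coordinates (q, p), the "usual formula" is {f, g} = (df/dq)(dg/dp) - (df/dp)(dg/dq), and the basic identities can be checked numerically (a throwaway sketch of mine, not from Gromov's lectures):

```python
# Poisson bracket on the phase plane (q, p), by central differences:
# {f, g} = df/dq * dg/dp - df/dp * dg/dq
def poisson(f, g, q, p, h=1e-5):
    df_dq = (f(q + h, p) - f(q - h, p)) / (2 * h)
    df_dp = (f(q, p + h) - f(q, p - h)) / (2 * h)
    dg_dq = (g(q + h, p) - g(q - h, p)) / (2 * h)
    dg_dp = (g(q, p + h) - g(q, p - h)) / (2 * h)
    return df_dq * dg_dp - df_dp * dg_dq

H = lambda q, p: 0.5 * (p**2 + q**2)   # harmonic-oscillator Hamiltonian
q0, p0 = 0.7, -0.3

# The canonical bracket {q, p} = 1, and Hamilton's equations in bracket
# form: {q, H} = p (the velocity), {p, H} = -q (the force).
print(poisson(lambda q, p: q, lambda q, p: p, q0, p0))
print(poisson(lambda q, p: q, H, q0, p0))
print(poisson(lambda q, p: p, H, q0, p0))
```

A symplectic manifold is, roughly, a space where this bracket makes sense globally, not just in one coordinate patch.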
--------------------------------------------------------------------------
Previous issues of "This Week's Finds" and other expository articles on
mathematics and physics (as well as some of my research papers) can be
obtained by anonymous ftp from ucrmath.ucr.edu; they are in the
directory "baez." The README file lists the contents of all the papers.
On the World-Wide Web, you can attach to the address
http://info.desy.de/user/projects/Physics.html to access these files and
more on physics. Please do not ask me how to use hep-th or gr-qc;
instead, read the file preprint.info.
>A bit of a miscellany this week....
>Certainly as time goes by and computer technology becomes ever more
>miniaturized, hardware designers will have to pay ever more attention to
>quantum effects, for good or for ill! (Vaughan Pratt estimates that
>quantum effects will be a serious concern by 2020.) The question is
>just whether they are only a nuisance, or whether they can possibly be
>harnessed. Some designs for quantum computers have already been
>proposed (sorry, I have no reference for these), and seeing whether they
>are workable should be a very interesting engineering problem, even if
>they are not good enough to outdo ordinary computers.
Hmm... Not sure what you mean here. The field of mesoscopic physics
has been looking at quantum effects in electronic devices for over a
decade now. You can make single electron pumps and transistors, get
moderate variations in resistance (switching) due to interference effects,
put devices in superpositions, see non-local transport of electrons,
and see particle-in-a-box states in tunneling. Or do you mean something
else?
The phrase "see quantum effects in computer technology" is sort of
odd. Most computers are based on semiconductor physics, which involves
a fair bit of quantum mechanics. I don't mean to sound pedantic here,
but this is, after all, a physics discussion group...
Kieran Mullen
[Moderator's note: sorry to be vague. Yes, semiconductors rely upon
quantum theory, but the chips used in digital computers are designed to
function in such a way that one can ignore quantum effects when using
them, no? In a sense, the hardware engineers use quantum theory very
carefully to design hardware such that the software people don't need to
ever think about quantum theory. People in quantum computation like
those I cited, and people like Vaughan Pratt who are considering quantum
logic and other generalizations of Boolean logic for applications to
computer science, are looking ahead to a time when this will no longer
be the case. I don't know enough to tell whether this is really
practical, so I am simply reporting their speculations. In short, when
I said "see quantum effects in computer technology" I should have said
"design hardware in a manner such that the software people can no longer
model its operation as a deterministic, classical process." - jb]
You say that as if there were some other mathematical models available.
QM claims probabilities are irreducible but the mathematical models
we have can at best produce deterministic classical pseudorandom
sequences. The superposition that occurs in the QM wave function *is*
a deterministic classical process. The unitary evolution of the wave
function is as solid an example as one can come up with of a deterministic
classical process. Standard classical theory has no problem with
parallel processes following multiple paths. Speculative execution
in high-end RISC processors does this today using classical logic.
In effect the processor is in a `superposition' of states corresponding
to different branches the program might take based on the value
of variables that are still being computed.
Many physicists believe that there is something in QM phenomena that is
outside the ability of classical mathematics to model but
there is no evidence that this is the case nor is there any
`non classical' mathematics that might provide a theoretical basis
for such a claim. The models we use to evolve the QM wave function are
useful in part because they are classical deterministic models.
The claim that there is something outside this domain in
QM arises because the wave function only allows us to predict
probabilities. While it is possible that there are irreducible random
processes or other new unique effects outside of classical
mathematics, it is also possible, and far more likely, that this apparent
randomness is simply due to our limited understanding of an underlying
physical process that is every bit as deterministic and classical as
the wave function model.
Paul Budnik
John drew this exchange to my attention in case I had 2 cents to kick
in. KM's point is well taken, including his own assessment of its
pedantry, and more than adequately responded to by JB. On the other
hand, KM's "ten years now" rather understates things. Thirty years ago
as a third-year physics undergraduate at Sydney University my term
project was to design computer circuits based on tunnel diodes. These
are the quintessential quantum-mechanical semiconductor; the crucial
kink in their current-voltage curve is due to electrons tunnelling through a
narrow region (10 nm if I remember right, or 100 angstroms, we spoke
Scandinavian back then), permitting tunnel diodes to serve as very
fast switching elements.
At barely submicron scales, VLSI designs are still a couple of orders
of magnitude away from feeling these effects strongly, and VLSI
designers can still reason classically, as John says. But VLSI
geometries are shrinking steadily, and as they do, this classical
reasoning will become progressively less reliable and will gradually
need to be displaced by quantum reasoning.
This is not to be confused with quantum computation, whose basic
principle is parallelism via path integrals, something to be taken
advantage of if we only knew how, in contrast to the emerging obstacle
to standard design techniques referred to above. Shor's remarkable
quantum factoring algorithm is easily the most striking application of
this principle to date. It depends critically on a certain "mass"
property that arises in modular arithmetic but that has no evident
generalization to arbitrary nondeterministic computation, giving no
additional support for the prospect that NP (classically defined) might
be in quantum polynomial time.
Here are the opening paragraphs from my PhysComp'92 paper on linear
logic as quantum logic, I think KM should find nothing objectionable
here.
VLSI designers will eventually need to reckon with quantum mechanics
(QM), certainly within 50 years and possibly by the 2010's.
Computer-aided design (CAD) tools will need to adapt accordingly.
Does this entail changing only the theory or also the logic of CAD
verification systems? That is, will it be enough just to change those
nonlogical axioms defining how electronics works, or will it prove
necessary to dig deeper and tamper with logic itself?
Birkhoff and von Neumann \cite{BN36} incorporate quantum uncertainty
into Boolean logic by interpreting conjunction $A\land B$ standardly
but modifying negation $A\perp$ to mean certainly not $A$, in the sense
that $A\perp$ holds in just those states having probability 0 of being
mistaken for a state in which $A$ holds. This relationship of
unmistakability is an irreflexive symmetric binary relation on states
called {\it orthogonality} and denoted $x\bot y$. Disjunction as the
De Morgan dual of conjunction does not distribute over conjunction in
this {\it quantum logic} (QL), suggesting that the emergence of quantum
phenomena in VLSI will force changes to CAD verifiers at as fundamental
a level as their Boolean logic.
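The failure of distributivity Pratt mentions can be checked concretely in R^2 (my own toy computation, not from his paper): with B and C the coordinate axes and A the diagonal line, A-and-(B-or-C) is A itself, while (A-and-B)-or-(A-and-C) is the zero subspace.

```python
import numpy as np

# In Birkhoff-von Neumann quantum logic the propositions are subspaces:
# "and" is intersection and "or" is the span of the union.  Take three
# lines in R^2: B = x-axis, C = y-axis, A = the diagonal.  Then
#   A and (B or C) = A and R^2 = A             (a line, dimension 1)
#   (A and B) or (A and C) = {0} or {0} = {0}  (dimension 0)
# so disjunction fails to distribute over conjunction.

a = np.array([1.0, 1.0])   # spans A, the diagonal
b = np.array([1.0, 0.0])   # spans B, the x-axis
c = np.array([0.0, 1.0])   # spans C, the y-axis

def join_dim(u, v):
    # dimension of span(u) + span(v)
    return int(np.linalg.matrix_rank(np.stack([u, v])))

def meet_dim(u, v):
    # dim(U meet V) = dim U + dim V - dim(U join V), for the lines U, V
    return 2 - join_dim(u, v)

assert join_dim(b, c) == 2        # B or C is all of R^2
lhs_dim = 1                       # so A and (B or C) = A, dimension 1
rhs_dim = meet_dim(a, b) + meet_dim(a, c)   # both meets are {0}

print(lhs_dim, rhs_dim)
```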
This can be retrieved by anonymous ftp (hence by Mosaic on WWW) as
boole.stanford.edu:/pub/ql.tex, see also ABSTRACTS in that directory
for abstracts of related papers on linear logic and concurrent
behavior.
I'm currently working on a follow-up paper, submitted to PhysComp'94,
giving a particularly simple setting for Heisenberg uncertainty, namely
Chu spaces, simpler even than the Fourier transform setting (which is
simpler than the uncertainty that obtains between say position and
momentum). Chu spaces don't involve rings or even groups, just plain
sets and functions, yet they exhibit the same basic uncertainty that
one associates with determining the precise time that a pure tone
starts and stops. This uncertainty can be found in Stone duality
(the logician's version of Pontrjagin duality, on which the Fourier
transform can be understood as resting, with truth values in place of
complex numbers on the unit circle). Chu spaces create a uniform and
mathematically very attractive home for the immense "Stone zoo" of
dualities treated in Peter Johnstone's tome "Stone Spaces." For my
money this is the mathematically most honest way for Wheeler's "it from
bit" to acquire the basic characteristics of quantum mechanics.
--
Vaughan Pratt pr...@cs.stanford.edu boole.stanford.edu:/pub/ABSTRACTS
Boy, talk about "speculative," sci.physics.research is impressively
tolerant.
No relationship is presently known between the performance benefits of
speculative execution in today's high-end RISC processors and those of
Peter Shor's polynomial time quantum factoring algorithm. A lucid
account of such a relationship would win its author fame and perhaps
fortune, especially if it follows from the account that all problems in
NP are as easy as factoring.
>Many physicists believe that there is something in QM phenomena that is
>outside the ability of classical mathematics to model but
>there is no evidence that this is the case ...
It's a good thing your training didn't leave you in the sorry condition
it left these poor souls. Evidently they misunderstood Schroedinger's
cat puzzle. Good luck curing them.
: Boy, talk about "speculative," sci.physics.research is impressively
: tolerant.
There is nothing speculative about it. Nondeterministic execution is
a standard part of computer science. What is of interest about Shor's
result is the extremely speculative possibility that one might be
able to exploit QM effects for extremely efficient nondeterministic
computation.
: No relationship is presently known between the performance benefits of
: speculative execution in today's high-end RISC processors and those of
: Peter Shor's polynomial time quantum factoring algorithm. A lucid
: account of such a relationship would win its author fame and perhaps
: fortune, especially if it follows from the account that all problems in
: NP are as easy as factoring.
That is a strange comment. My point is that there is nothing intrinsically
nonclassical in the mathematics that underlies these results.
: >Many physicists believe that there is something in QM phenomena that is
: >outside the ability of classical mathematics to model but
: >there is no evidence that this is the case ...
: It's a good thing your training didn't leave you in the sorry condition
: it left these poor souls. Evidently they misunderstood Schroedinger's
: cat puzzle. Good luck curing them.
As I pointed out in the original article, the conceptual problems in QM
all stem from the projection of the deterministic evolution in configuration
space of the wave function to generate probabilistic predictions in physical
space. No existing theory has a good physical explanation for that and
so most explanations are relegated to metaphysics with the resulting
paradoxes like Schrodinger's cat. This does not mean that there is
not a physical explanation for this projection that is entirely in
the domain of classical mathematics.
The only thing we *know* is that no classical *local* model can reproduce
the standard predictions. For this reason QM is forced to use explicitly
nonlocal classical models. However these predictions (contrary to widely
held misconceptions) have never been verified experimentally. Thus we
cannot rule out the possibility that a local model within classical
mathematics can account for all experimental results.
Many physicists talk about nonclassical mathematics. However
no such thing exists. They all *use* only classical mathematics.
Paul Budnik
What do you mean by quantum reasoning? Certainly one needs to use
QM models at some point. However the models that describe the evolution
of the wave function are `classical models'. The interpretation of
the wave function as a probability density is also something that is
essentially classical mathematics. There are special mathematical
models for QM. There is no QM `reasoning'.
: This is not to be confused with quantum computation, whose basic
: principle is parallelism via path integrals, something to be taken
: advantage of if we only knew how, in contrast to the emerging obstacle
: to standard design techniques referred to above. Shor's remarkable
: quantum factoring algorithm is easily the most striking application of
: this principle to date. It depends critically on a certain "mass"
: property that arises in modular arithmetic but that has no evident
: generalization to arbitrary nondeterministic computation, giving no
: additional support for the prospect that NP (classically defined) might
: be in quantum polynomial time.
These are indeed interesting but these are also classical results
in the sense that the mathematics involved is entirely classical.
Nondeterministic and parallel algorithms are standard fare in
computer science and have nothing inherently to do with QM.
: Here are the opening paragraphs from my PhysComp'92 paper on linear
: logic as quantum logic, I think KM should find nothing objectionable
: here.
: VLSI designers will eventually need to reckon with quantum mechanics
: (QM), certainly within 50 years and possibly by the 2010's.
: Computer-aided design (CAD) tools will need to adapt accordingly.
: Does this entail changing only the theory or also the logic of CAD
: verification systems? That is, will it be enough just to change those
: nonlogical axioms defining how electronics works, or will it prove
: necessary to dig deeper and tamper with logic itself?
: Birkhoff and von Neumann \cite{BN36} incorporate quantum uncertainty
: into Boolean logic by interpreting conjunction $A\land B$ standardly
: but modifying negation $A\perp$ to mean certainly not $A$, in the sense
: that $A\perp$ holds in just those states having probability 0 of being
: mistaken for a state in which $A$ holds. This relationship of
: unmistakability is an irreflexive symmetric binary relation on states
: called {\it orthogonality} and denoted $x\bot y$. Disjunction as the
: De Morgan dual of conjunction does not distribute over conjunction in
: this {\it quantum logic} (QL), suggesting that the emergence of quantum
: phenomena in VLSI will force changes to CAD verifiers at as fundamental
: a level as their Boolean logic.
Again interesting but again not anything outside of classical mathematics
as you illustrate by giving a classical definition of the relationship.
People have been using non-Boolean logic to design specialized circuits
for some time now. Fuzzy logic is one obvious example.
: I'm currently working on a follow-up paper, submitted to PhysComp'94,
: giving a particularly simple setting for Heisenberg uncertainty, namely
: Chu spaces, simpler even than the Fourier transform setting (which is
: simpler than the uncertainty that obtains between say position and
: momentum). Chu spaces don't involve rings or even groups, just plain
: sets and functions, yet they exhibit the same basic uncertainty that
: one associates with determining the precise time that a pure tone
: starts and stops. This uncertainty can be found in Stone duality
: (the logician's version of Pontrjagin duality, on which the Fourier
: transform can be understood as resting, with truth values in place of
: complex numbers on the unit circle). Chu spaces create a uniform and
: mathematically very attractive home for the immense "Stone zoo" of
: dualities treated in Peter Johnstone's tome "Stone Spaces." For my
: money this is the mathematically most honest way for Wheeler's "it from
: bit" to acquire the basic characteristics of quantum mechanics.
This well illustrates the point I am making. The uncertainty relationships
in QM are mathematically similar to the uncertainty in time and frequency
for classical waves. A precise tone, i.e., one that is an impulse in
the frequency domain, can never start or stop. It just keeps going. There
is nothing in this mathematics that is outside of the domain of classical
mathematics. It is only a different model than classical physics and one
in which the claim is made that probabilities are irreducible.
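This classical time-frequency uncertainty is easy to exhibit numerically (my own sketch, nothing specifically quantum about it): truncate a pure tone to bursts of different durations and watch the spectrum broaden as the burst shortens.

```python
import numpy as np

# The same pure tone, truncated to bursts of different durations.  The
# shorter the burst in time, the wider the spread of its power spectrum.
n = 4096
t = np.arange(n)
tone = np.cos(2 * np.pi * 0.1 * t)

def spectral_width(duration):
    sig = np.zeros(n)
    sig[:duration] = tone[:duration]          # tone switched off after `duration`
    power = np.abs(np.fft.rfft(sig))**2
    f = np.arange(power.size)
    mean = (f * power).sum() / power.sum()    # centroid of the power spectrum
    return np.sqrt((((f - mean)**2) * power).sum() / power.sum())

wide = spectral_width(64)       # short burst -> broad spectrum
narrow = spectral_width(2048)   # long burst  -> narrow spectrum
print(wide, narrow)
```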
Paul Budnik