
This Week's Finds in Mathematical Physics (Week 34)


John Baez

May 24, 1994, 3:16:07 AM
This Week's Finds in Mathematical Physics (Week 34)
John Baez

A bit of a miscellany this week....

1) Algorithms for quantum computation: discrete log and factoring,
extended abstract by Peter Shor.

There has been a bit of a stir about this paper; since I know Peter
Shor's sister I was able to get a copy and see what it was really all
about.

Quantum computers are so far just a theoretical possibility. It's
easiest to see why machines that take advantage of quantum theory might
be efficient at computation if we think in terms of path integrals. In
Feynman's path-integral approach to quantum theory, the probability of
getting from state A at time zero to state B at some later time is obtained
by integrating the exponential of the action

exp(iS/hbar)

over *all* paths from A to B, and then taking the absolute value
squared. (Here we are thinking of states A and B that correspond to
points in the classical configuration space.) We can think of the
quantum system as proceeding along all paths simultaneously; it is the
constructive or destructive interference between paths due to the phases
exp(iS/hbar) that makes certain final outcomes B more likely than
others. In many situations, there is massive destructive interference
except among paths very close to those which are critical points of the
action S; the latter are the *classical* paths. So in a sense, a
classical device functions as it does by executing all possible motions;
motions far from those satisfying Newton's laws simply cancel out by
destructive interference! (There are many other ways of thinking about
quantum theory; this one can be difficult to make mathematically
rigorous, but it's often very handy.)
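
To see this cancellation numerically, here is a toy sketch (my own
illustration, not taken from any of the papers below; the one-parameter
family of paths is an assumption chosen for simplicity). For a free
particle, take paths deviating from the classical straight line by
a*sin(pi t/T), which gives S(a) = S_cl + m pi^2 a^2/(4T), and sum the
phases over the deviation a:

import numpy as np

m, T, hbar = 1.0, 1.0, 0.05      # arbitrary units; smaller hbar sharpens the effect
S_cl = 0.5 * m * 1.0**2 * T      # classical action for unit velocity

def amplitude(a_max, n=20001):
    # integrate exp(iS(a)/hbar) over deviations |a| <= a_max
    a = np.linspace(-a_max, a_max, n)
    S = S_cl + m * np.pi**2 * a**2 / (4 * T)
    return np.sum(np.exp(1j * S / hbar)) * (a[1] - a[0])

for a_max in [0.1, 0.3, 1.0, 3.0]:
    print(a_max, abs(amplitude(a_max)))

The magnitude saturates once a_max exceeds roughly sqrt(hbar): paths far
from the classical one oscillate in phase and contribute essentially
nothing.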

This raises the idea of building a computer that would take advantage of
quantum theory by trying out all sorts of paths, but making sure that
paths that give the wrong answer cancel out! It seems that Feynman was
the first to seriously consider quantum computation:

2) Simulating physics with computers, by Richard Feynman,
International Journal of Theoretical Physics, Vol. 21, nos. 6/7,
pp. 467--488 (1982).

but by now quite a bit of work has been done on the subject, e.g.:

3) Quantum mechanical Hamiltonian models of Turing machines, by P.
Benioff, J. Stat. Phys., Vol. 29, pp. 515--546 (1982).

Quantum theory, the Church--Turing principle and the universal quantum
computer, by D. Deutsch, Proc. R. Soc. Lond., Vol. A400, pp. 96--117
(1985).

Quantum computational networks, by D. Deutsch, Proc. R. Soc. Lond.,
Vol. A425, pp. 73--90 (1989).

Rapid solution of problems by quantum computation, by D. Deutsch
and R. Jozsa, Proc. R. Soc. Lond., Vol. A439, pp. 553--558 (1992).

Quantum complexity theory, by E. Bernstein and U. Vazirani,
Proc. 25th ACM Symp. on Theory of Computing, pp. 11--20
(1993).

The quantum challenge to structural complexity theory, by A. Berthiaume
and G. Brassard, Proc. 7th IEEE Conference on Structure in Complexity
Theory (1992).

Quantum circuit complexity, by A. Yao, Proc. 34th IEEE Symp. on
Foundations of Computer Science (1993).


Thanks to this work, there are now mathematical definitions of quantum
Turing machines and the class "BQP" of problems that can be solved in
polynomial time with a bounded probability of error. This allows a
mathematical investigation of whether quantum computers can, in
principle, do things more efficiently than classical ones. Shor shows
that factoring integers is in BQP. I won't try to describe how, as it's
a bit technical and I haven't really comprehended it. Instead, I'd like
to say a couple of things about the *practicality* of building quantum
computers, since people seem quite puzzled about this issue.
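
For what it's worth, the classical scaffolding around the quantum part
is easy to state: factoring N reduces to finding the period r of
f(x) = a^x mod N. Here is a sketch of that reduction (my own
illustration, not Shor's actual construction; the period-finding is
done by brute force, which is exponentially slow -- the whole point of
Shor's algorithm is that a quantum computer can do that one step in
polynomial time):

from math import gcd
from random import randrange

def find_period(a, N):
    # the step the quantum computer would do: smallest r > 0
    # with a^r = 1 (mod N)
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N):
    # assumes N is odd, composite, and not a prime power
    while True:
        a = randrange(2, N)
        d = gcd(a, N)
        if d > 1:
            return d              # lucky guess shares a factor with N
        r = find_period(a, N)
        if r % 2 == 0:
            y = pow(a, r // 2, N)
            if y != N - 1:        # need a^(r/2) != -1 (mod N)
                d = gcd(y - 1, N)
                if d > 1:
                    return d

print(shor_factor(15))            # prints 3 or 5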

There are, as I see it, two basic problems with building quantum
computers. First, it seems that the components must be carefully
shielded from unwanted interactions with the outside world, since such
interactions can cause "decoherence", that is, superpositions of the
computer states will evolve into superpositions of the system consisting
of the computer together with what it's interacting with, which, from the
point of view of the computer alone, are the same as mixed states. This
tends to ruin the interference effects upon which the advantages of
quantum computation are based.
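
A two-qubit toy model shows the effect (my own sketch, not from any of
the references): put the computer qubit in the superposition
(|0> + |1>)/sqrt(2), let it interact with a single environment qubit,
and look at the computer qubit alone afterwards:

import numpy as np

plus = np.array([1, 1]) / np.sqrt(2)   # computer qubit, (|0>+|1>)/sqrt(2)
env  = np.array([1, 0])                # environment qubit in |0>
state = np.kron(plus, env)             # joint state, basis |00>,|01>,|10>,|11>

CNOT = np.array([[1, 0, 0, 0],         # the "unwanted interaction":
                 [0, 1, 0, 0],         # computer qubit flips the
                 [0, 0, 0, 1],         # environment qubit
                 [0, 0, 1, 0]])
state = CNOT @ state                   # now (|00> + |11>)/sqrt(2)

rho = np.outer(state, state.conj())    # joint density matrix
rho_c = np.zeros((2, 2), dtype=complex)
for e in range(2):                     # trace out the environment qubit
    rho_c += rho[np.ix_([e, 2 + e], [e, 2 + e])]
print(rho_c.round(3))

The result is diag(0.5, 0.5): the off-diagonal terms that carried the
interference are gone, and the computer qubit alone looks like a
classical coin flip -- exactly the mixed state described above.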

Second, it seems difficult to incorporate error-correction mechanisms in
a quantum computer. Without such mechanisms, slight deviations of the
Hamiltonian of the computer from the design specifications will cause
the computation to drift away from what was intended to occur. Luckily,
it appears that this drift is only *linear* rather than *exponential* as
a function of time. (This impression is based on some simplifications
that might be oversimplifications, so anyone who wants to build a
quantum computer had better ponder this issue carefully.) Linear
increase of error with time sets an upper bound on how complicated a
computation one could do before the answer is junk, but if the rate of
error increase was made low enough, this might be acceptable.
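
The linear drift is easy to see in a toy model (my own sketch, subject
to the same simplifications just mentioned): run n steps of a slightly
wrong unitary U' = exp(-i(H + eps V)) next to the intended U = exp(-iH)
and compare. Since unitaries preserve norms, earlier errors are never
amplified, and ||U'^n psi - U^n psi|| <= n ||U' - U||:

import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
d, eps = 8, 1e-4
A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
H = (A + A.conj().T) / 2               # intended Hamiltonian (random)
B = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
V = (B + B.conj().T) / 2               # deviation from the design spec

U  = expm(-1j * H)
Up = expm(-1j * (H + eps * V))

psi = np.zeros(d, dtype=complex); psi[0] = 1.0
phi = psi.copy()
for n in range(1, 1001):
    psi, phi = U @ psi, Up @ phi
    if n % 200 == 0:
        print(n, np.linalg.norm(phi - psi))

The printed error grows roughly in proportion to n (until it eventually
saturates), rather than exponentially.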

Certainly as time goes by and computer technology becomes ever more
miniaturized, hardware designers will have to pay ever more attention to
quantum effects, for good or for ill! (Vaughan Pratt estimates that
quantum effects will be a serious concern by 2020.) The question is
just whether they are only a nuisance, or whether they can possibly be
harnessed. Some designs for quantum computers have already been
proposed (sorry, I have no reference for these), and seeing whether they
are workable should be a very interesting engineering problem, even if
they are not good enough to outdo ordinary computers.

4) The Chern-Simons invariant as the natural time variable for classical
and quantum cosmology, by Lee Smolin and Chopin Soo, 16 pages in LaTeX
format, available as gr-qc/9405015.

Let me just quote the abstract on this one:

We propose that the Chern-Simons invariant of the Ashtekar-Sen
connection (or its imaginary part in the Lorentzian case) is the natural
internal time coordinate for classical and quantum cosmology. The
reasons for this are: 1) It is a function on the gauge and
diffeomorphism invariant configuration space, whose gradient is
orthogonal to the two physical degrees of freedom, in the metric defined
by the Ashtekar formulation of general relativity. 2) The imaginary
part of the Chern-Simons form reduces in the limit of small cosmological
constant, Lambda, and solutions close to DeSitter spacetime, to the York
extrinsic time coordinate. 3) Small matter-field excitations of the
Chern-Simons state satisfy, by virtue of the quantum constraints, a
functional Schroedinger equation in which the matter fields evolve on a
DeSitter background in the Chern-Simons time. We then propose this is
the natural vacuum state of the theory for nonzero Lambda. 4) This time
coordinate is periodic on the Euclidean configuration space, due to the
large gauge transformations, which means that physical expectation
values for all states in non-perturbative quantum gravity will satisfy
the KMS condition, and may then be interpreted as thermal states.
Finally, forms for the physical hamiltonian and inner product are
suggested and a new action principle for general relativity, as a
geodesic principle on the connection superspace, is found.

5) Symplectic geometry, a series of lectures by Mikhail Gromov, compiled
by Richard Brown, edited by Robert Miner (le...@math.umd.edu).

Symplectic geometry is the geometry of classical phase spaces. That is,
it's the geometry of spaces on which one can take Poisson brackets of
functions in a manner given locally by the usual formulas. Gromov has
really revolutionized the subject, and these lectures look like a good
place to begin learning what is going on. There is also an appendix on
contact geometry (another aspect of classical physics) based on a
lecture by Eliashberg.
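
(For the uninitiated: the "usual formulas" are the ones in local
coordinates q_i, p_i on phase space, where the Poisson bracket of two
functions f and g is

{f,g} = sum_i (df/dq_i dg/dp_i - df/dp_i dg/dq_i)

with d denoting partial differentiation. In particular {q,p} = 1,
the classical shadow of the canonical commutation relations.)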

--------------------------------------------------------------------------
Previous issues of "This Week's Finds" and other expository articles on
mathematics and physics (as well as some of my research papers) can be
obtained by anonymous ftp from ucrmath.ucr.edu; they are in the
directory "baez." The README file lists the contents of all the papers.
On the World-Wide Web, you can attach to the address
http://info.desy.de/user/projects/Physics.html to access these files and
more on physics. Please do not ask me how to use hep-th or gr-qc;
instead, read the file preprint.info.

Kieran Mullen

May 27, 1994, 3:10:59 PM
In article <2rs9jn$b...@galaxy.ucr.edu>, John Baez <ba...@ucrmath.ucr.edu> wrote:
>This Week's Finds in Mathematical Physics (Week 34)
>John Baez

>A bit of a miscellany this week....

>Certainly as time goes by and computer technology becomes ever more
>miniaturized, hardware designers will have to pay ever more attention to
>quantum effects, for good or for ill! (Vaughan Pratt estimates that
>quantum effects will be a serious concern by 2020.) The question is
>just whether they are only a nuisance, or whether they can possibly be
>harnessed. Some designs for quantum computers have already been
>proposed (sorry, I have no reference for these), and seeing whether they
>are workable should be a very interesting engineering problem, even if
>they are not good enough to outdo ordinary computers.


Hmm... Not sure what you mean here. The field of mesoscopic physics
has been looking at quantum effects in electronic devices for over a
decade now. You can make single electron pumps and transistors, get
moderate variations in resistance (switching) due to interference effects,
put devices in superpositions, see non-local transport of electrons,
and see particle-in-a-box states in tunneling. Or do you mean something
else?

The phrase "see quantum effects in computer technology" is sort of
odd. Most computers are based on semiconductor physics, which involves
a fair bit of quantum mechanics. I don't mean to sound pedantic here,
but this is, after all, a physics discussion group...

Kieran Mullen


[Moderator's note: sorry to be vague. Yes, semiconductors rely upon
quantum theory, but the chips used in digital computers are designed to
function in such a way that one can ignore quantum effects when using
them, no? In a sense, the hardware engineers use quantum theory very
carefully to design hardware such that the software people don't need to
ever think about quantum theory. People in quantum computation like
those I cited, and people like Vaughan Pratt who are considering quantum
logic and other generalizations of Boolean logic for applications to
computer science, are looking ahead to a time when this will no longer
be the case. I don't know enough to tell whether this is really
practical, so I am simply reporting their speculations. In short, when
I said "see quantum effects in computer technology" I should have said
"design hardware in a manner such that the software people can no longer
model its operation as a deterministic, classical process." - jb]


Paul Budnik

May 31, 1994, 11:29:51 AM
: John Baez <ba...@ucrmath.ucr.edu> wrote:
: I said "see quantum effects in computer technology" I should have said
: "design hardware in a manner such that the software people can no longer
: model its operation as a deterministic, classical process."

You say that as if there were some other mathematical models available.
QM claims probabilities are irreducible, but the mathematical models
we have can at best produce deterministic classical pseudorandom
sequences. The superposition that occurs in the QM wave function *is*
a deterministic classical process. The unitary evolution of the wave
function is as solid an example as one can come up with of a deterministic
classical process. Standard classical theory has no problem with
parallel processes following multiple paths. Speculative execution
in high-end RISC processors does this today using classical logic.
In effect the processor is in a `superposition' of states corresponding
to different branches the program might take based on the value
of variables that are still being computed.
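
The classical notion being invoked here can be sketched in a few lines
(an illustration of the analogy only; no real processor is this simple,
and nothing quantum is involved):

from concurrent.futures import ThreadPoolExecutor

def speculative_if(cond_fn, then_fn, else_fn):
    # evaluate both arms before the condition is known, then
    # commit to one result and discard (`squash') the other
    with ThreadPoolExecutor() as pool:
        then_result = pool.submit(then_fn)
        else_result = pool.submit(else_fn)
        cond = cond_fn()               # the slow condition resolves last
        return then_result.result() if cond else else_result.result()

print(speculative_if(lambda: 2 + 2 == 4,
                     lambda: "took the then-branch",
                     lambda: "took the else-branch"))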

Many physicists believe that there is something in QM phenomena that is
outside the ability of classical mathematics to model, but
there is no evidence that this is the case, nor is there any
`non classical' mathematics that might provide a theoretical basis
for such a claim. The models we use to evolve the QM wave function are
useful in part because they are classical deterministic models.

The claim that there is something outside this domain in
QM arises because the wave function only allows us to predict
probabilities. While it is possible that there are irreducible random
processes or other new unique effects outside of classical
mathematics, it is also possible, and far more likely, that this apparent
randomness is simply due to our limited understanding of an underlying
physical process that is every bit as deterministic and classical as
the wave function model.

Paul Budnik

Vaughan R. Pratt

May 31, 1994, 11:31:36 AM
In article <CqH6M...@usenet.ucs.indiana.edu>,
Kieran Mullen <kie...@gibbs.physics.indiana.edu> wrote:
>In article <2rs9jn$b...@galaxy.ucr.edu>, John Baez <ba...@ucrmath.ucr.edu> wrote:
>>This Week's Finds in Mathematical Physics (Week 34)
>>John Baez
>...
>In short, when
>I said "see quantum effects in computer technology" I should have said
>"design hardware in a manner such that the software people can no longer
>model its operation as a deterministic, classical process." - jb]

John drew this exchange to my attention in case I had 2 cents to kick
in. KM's point is well taken, including his own assessment of its
pedantry, and more than adequately responded to by JB. On the other
hand, KM's "over a decade now" rather understates things. Thirty years
ago, as a third-year physics undergraduate at Sydney University, my term
project was to design computer circuits based on tunnel diodes. These
are the quintessential quantum-mechanical semiconductor devices; the
crucial kink in their current-voltage curve is due to electrons tunnelling
through a narrow region (10 nm if I remember right, or 100 angstroms, we
spoke Scandinavian back then), permitting tunnel diodes to serve as very
fast switching elements.

At barely submicron scales, VLSI designs are still a couple of orders
of magnitude away from feeling these effects strongly, and VLSI
designers can still reason classically, as John says. But VLSI
geometries are shrinking steadily, and as they do, this classical
reasoning will become progressively less reliable and will gradually
need to be displaced by quantum reasoning.

This is not to be confused with quantum computation, whose basic
principle is parallelism via path integrals, something to be taken
advantage of if we only knew how, in contrast to the emerging obstacle
to standard design techniques referred to above. Shor's remarkable
quantum factoring algorithm is easily the most striking application of
this principle to date. It depends critically on a certain "mass"
property that arises in modular arithmetic but that has no evident
generalization to arbitrary nondeterministic computation, giving no
additional support for the prospect that NP (classically defined) might
be in quantum polynomial time.

Here are the opening paragraphs from my PhysComp'92 paper on linear
logic as quantum logic, I think KM should find nothing objectionable
here.

VLSI designers will eventually need to reckon with quantum mechanics
(QM), certainly within 50 years and possibly by the 2010's.
Computer-aided design (CAD) tools will need to adapt accordingly.

Does this entail changing only the theory or also the logic of CAD
verification systems? That is, will it be enough just to change those
nonlogical axioms defining how electronics works, or will it prove
necessary to dig deeper and tamper with logic itself?

Birkhoff and von Neumann \cite{BN36} incorporate quantum uncertainty
into Boolean logic by interpreting conjunction $A\land B$ standardly
but modifying negation $A^\perp$ to mean certainly not $A$, in the sense
that $A^\perp$ holds in just those states having probability 0 of being
mistaken for a state in which $A$ holds. This relationship of
unmistakability is an irreflexive symmetric binary relation on states
called {\it orthogonality} and denoted $x\bot y$. Disjunction as the
De Morgan dual of conjunction does not distribute over conjunction in
this {\it quantum logic} (QL), suggesting that the emergence of quantum
phenomena in VLSI will force changes to CAD verifiers at as fundamental
a level as their Boolean logic.
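
(A standard toy example of the distributivity failure, not from the
paper: in the lattice of subspaces of the plane, take $X$ the x-axis,
$Y$ the y-axis, and $Z$ the diagonal spanned by (1,1). Then
$Z\land(X\lor Y) = Z$, since $X\lor Y$ is the whole plane, while
$(Z\land X)\lor(Z\land Y) = 0\lor 0 = 0$, since $Z$ meets each axis
only at the origin.)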

The paper can be retrieved by anonymous ftp (hence by Mosaic on WWW) as
boole.stanford.edu:/pub/ql.tex, see also ABSTRACTS in that directory
for abstracts of related papers on linear logic and concurrent
behavior.

I'm currently working on a follow-up paper, submitted to PhysComp'94,
giving a particularly simple setting for Heisenberg uncertainty, namely
Chu spaces, simpler even than the Fourier transform setting (which is
simpler than the uncertainty that obtains between say position and
momentum). Chu spaces don't involve rings or even groups, just plain
sets and functions, yet they exhibit the same basic uncertainty that
one associates with determining the precise time that a pure tone
starts and stops. This uncertainty can be found in Stone duality
(the logician's version of Pontrjagin duality, on which the Fourier
transform can be understood as resting, with truth values in place of
complex numbers on the unit circle). Chu spaces create a uniform and
mathematically very attractive home for the immense "Stone zoo" of
dualities treated in Peter Johnstone's tome "Stone Spaces." For my
money this is the mathematically most honest way for Wheeler's "it from
bit" to acquire the basic characteristics of quantum mechanics.
--
Vaughan Pratt pr...@cs.stanford.edu boole.stanford.edu:/pub/ABSTRACTS

Vaughan R. Pratt

Jun 1, 1994, 3:04:27 AM
In article <CqoB1...@murdoch.acc.Virginia.EDU>,
Paul Budnik <pa...@mtnmath.mtnmath.com> wrote:
>Standard classical theory has no problem with
>parallel processes following multiple paths. Speculative execution
>in high-end RISC processors does this today using classical logic.
>In effect the processor is in a `superposition' of states corresponding
>to different branches the program might take based on the value
>of variables that are still being computed.

Boy, talk about "speculative," sci.physics.research is impressively
tolerant.

No relationship is presently known between the performance benefits of
speculative execution in today's high-end RISC processors and those of
Peter Shor's polynomial time quantum factoring algorithm. A lucid
account of such a relationship would win its author fame and perhaps
fortune, especially if it follows from the account that all problems in
NP are as easy as factoring.

>Many physicists believe that there is something in QM phenomena that is
>outside the ability of classical mathematics to model, but
>there is no evidence that this is the case ...

It's a good thing your training didn't leave you in the sorry condition
it left these poor souls. Evidently they misunderstood Schroedinger's
cat puzzle. Good luck curing them.

Paul Budnik

Jun 4, 1994, 9:11:00 AM

Vaughan R. Pratt (pr...@sunburn.stanford.edu) wrote:
: In article <CqoB1...@murdoch.acc.Virginia.EDU>,
: Paul Budnik <pa...@mtnmath.mtnmath.com> wrote:
: >Standard classical theory has no problem with
: >parallel processes following multiple paths. Speculative execution
: >in high-end RISC processors does this today using classical logic.
: >In effect the processor is in a `superposition' of states corresponding
: >to different branches the program might take based on the value
: >of variables that are still being computed.

: Boy, talk about "speculative," sci.physics.research is impressively
: tolerant.

There is nothing speculative about it. Nondeterministic execution is
a standard part of computer science. What is of interest about Shor's
result is the extremely speculative possibility that one might be
able to exploit QM effects for extremely efficient nondeterministic
computation.

: No relationship is presently known between the performance benefits of
: speculative execution in today's high-end RISC processors and those of
: Peter Shor's polynomial time quantum factoring algorithm. A lucid
: account of such a relationship would win its author fame and perhaps
: fortune, especially if it follows from the account that all problems in
: NP are as easy as factoring.

That is a strange comment. My point is that there is nothing intrinsically
nonclassical in the mathematics that underlies these results.

: >Many physicists believe that there is something in QM phenomena that is
: >outside the ability of classical mathematics to model, but
: >there is no evidence that this is the case ...

: It's a good thing your training didn't leave you in the sorry condition
: it left these poor souls. Evidently they misunderstood Schroedinger's
: cat puzzle. Good luck curing them.

As I pointed out in the original article, the conceptual problems in QM
all stem from the projection of the deterministic evolution in configuration
space of the wave function to generate probabilistic predictions in physical
space. No existing theory has a good physical explanation for that and
so most explanations are relegated to metaphysics with the resulting
paradoxes like Schroedinger's cat. This does not mean that there is
not a physical explanation for this projection that is entirely in
the domain of classical mathematics.

The only thing we *know* is that no classical *local* model can reproduce
the standard predictions. For this reason QM is forced to use explicitly
nonlocal classical models. However these predictions (contrary to widely
held misconceptions) have never been verified experimentally. Thus we
cannot rule out the possibility that a local model within classical
mathematics can account for all experimental results.

Many physicists talk about nonclassical mathematics. However
no such thing exists. They all *use* only classical mathematics.

Paul Budnik

Paul Budnik

unread,
Jun 4, 1994, 9:12:59 AM6/4/94
to
Vaughan R. Pratt (pr...@sunburn.stanford.edu) wrote:
___
: At barely submicron scales, VLSI designs are still a couple of orders

: of magnitude away from feeling these effects strongly, and VLSI
: designers can still reason classically, as John says. But VLSI
: geometries are shrinking steadily, and as they do, this classical
: reasoning will become progressively less reliable and will gradually
: need to be displaced by quantum reasoning.

What do you mean by quantum reasoning? Certainly one needs to use
QM models at some point. However, the models that describe the evolution
of the wave function are `classical models'. The interpretation of
the wave function as a probability density is also something that is
essentially classical mathematics. There are special mathematical
models for QM. There is no QM `reasoning'.

: This is not to be confused with quantum computation, whose basic
: principle is parallelism via path integrals, something to be taken
: advantage of if we only knew how, in contrast to the emerging obstacle
: to standard design techniques referred to above. Shor's remarkable
: quantum factoring algorithm is easily the most striking application of
: this principle to date. It depends critically on a certain "mass"
: property that arises in modular arithmetic but that has no evident
: generalization to arbitrary nondeterministic computation, giving no
: additional support for the prospect that NP (classically defined) might
: be in quantum polynomial time.

These are indeed interesting, but they are also classical results
in the sense that the mathematics involved is entirely classical.
Nondeterministic and parallel algorithms are standard fare in
computer science and have nothing inherently to do with QM.

: Here are the opening paragraphs from my PhysComp'92 paper on linear
: logic as quantum logic, I think KM should find nothing objectionable
: here.

: VLSI designers will eventually need to reckon with quantum mechanics
: (QM), certainly within 50 years and possibly by the 2010's.
: Computer-aided design (CAD) tools will need to adapt accordingly.

: Does this entail changing only the theory or also the logic of CAD
: verification systems? That is, will it be enough just to change those
: nonlogical axioms defining how electronics works, or will it prove
: necessary to dig deeper and tamper with logic itself?

: Birkhoff and von Neumann \cite{BN36} incorporate quantum uncertainty
: into Boolean logic by interpreting conjunction $A\land B$ standardly
: but modifying negation $A^\perp$ to mean certainly not $A$, in the sense
: that $A^\perp$ holds in just those states having probability 0 of being
: mistaken for a state in which $A$ holds. This relationship of
: unmistakability is an irreflexive symmetric binary relation on states
: called {\it orthogonality} and denoted $x\bot y$. Disjunction as the
: De Morgan dual of conjunction does not distribute over conjunction in
: this {\it quantum logic} (QL), suggesting that the emergence of quantum
: phenomena in VLSI will force changes to CAD verifiers at as fundamental
: a level as their Boolean logic.

Again interesting, but again not anything outside of classical mathematics,
as you illustrate by giving a classical definition of the relationship.
People have been using non-Boolean logic to design specialized circuits
for some time now. Fuzzy logic is one obvious example.

: I'm currently working on a follow-up paper, submitted to PhysComp'94,
: giving a particularly simple setting for Heisenberg uncertainty, namely
: Chu spaces, simpler even than the Fourier transform setting (which is
: simpler than the uncertainty that obtains between say position and
: momentum). Chu spaces don't involve rings or even groups, just plain
: sets and functions, yet they exhibit the same basic uncertainty that
: one associates with determining the precise time that a pure tone
: starts and stops. This uncertainty can be found in Stone duality
: (the logician's version of Pontrjagin duality, on which the Fourier
: transform can be understood as resting, with truth values in place of
: complex numbers on the unit circle). Chu spaces create a uniform and
: mathematically very attractive home for the immense "Stone zoo" of
: dualities treated in Peter Johnstone's tome "Stone Spaces." For my
: money this is the mathematically most honest way for Wheeler's "it from
: bit" to acquire the basic characteristics of quantum mechanics.

This well illustrates the point I am making. The uncertainty relationships
in QM are mathematically similar to the uncertainty in time and frequency
for classical waves. A precise tone, i.e. one that is an impulse in
the frequency domain, can never start or stop. It just keeps going. There
is nothing in this mathematics that is outside of the domain of classical
mathematics. It is only a different model than classical physics and one
in which the claim is made that probabilities are irreducible.

Paul Budnik

Archimedes Plutonium

Sep 25, 2021, 4:39:42 PM
Would Marcella Carena, have asked the question which is the atom's real electron, muon or 0.5MeV particle if she had continued to read sci.physics and sci.math where AP constantly posted from 1993 to this day?

On Tuesday, May 10, 1994 at 8:59:17 PM UTC-5, John Baez wrote:
> This Week's Finds in Mathematical Physics (Week 33)
> John Baez
> With tremendous relief, I have finished writing a book, and will return
> to putting out This Week's Finds on a roughly weekly basis. Let me
> briefly describe my book, which took so much more work than I had
> expected... and then let me start catching up on listing some of the
> stuff that's cluttering my desk!

I suspect John will never realize he is a worthless sap head when it comes to physics, totally worthless and "out of it" for when John read that AP had discovered that the Oval was the slant cut in cone not the ellipse, that imp of math and physics of John Baez could never understand the discovery, instead, John wanted to redefine what an ellipse and an oval were. This goes to show that John has no logic brain but functions on memorization and guesswork. No power of logic or reason can come to John Baez, why, John cannot even multiply 9 x 105 and see that proton and neutron are in sigma error. John is a menace and pest to real physics, real math.

Would Marcella Carena, have asked the question which is the atom's real electron, muon or 0.5MeV particle if she had continued to read sci.physics and sci.math where AP constantly posted from 1993 to this day? How can Marcella or anyone in physics be called a "theoretical physicist" when the person is too stupid to even ask a simple question-- Yo, hey, hey yo, which of these two particles, the muon or the 0.5MeV, which is the Atom's true electron? Is it the muon? For that would instantly explain why Fermi Natl. Lab gets a "far more spinning muon" than expected.

Is Marcella as dumb and dirty and ignorant as John Baez, when told that the real slant cut in cone is a Oval, not the ellipse? Would Marcella then go and try to redefine ellipse and oval just because she never wants to give credit to the truth? Unlike John Baez who knows more about ad hominem than he ever will know about true physics and science.

Head of Theory Division Fermi National Accelerator Laboratory.
Quoting her in Scientific American, Oct2021 "What is this wobble effect that has me and other scientists so intrigued? It has to do with the way a muon spins when it travels through a magnetic field."

AP writes: yes, well if the muon is the real electron of atoms, then it has spin as it thrusts through proton toruses in a Faraday law.

AP writes: Good Marcella, in that sentence, but then when you wander over into the garbage fake physics of Standard Model, you lost it Marcella. Instead, after that sentence of yours above, your next sentence should have been this---

We, physicists are looking to discover that the real true electron of atoms is the muon and not what JJ Thomson found in 1897 of a 0.5MeV particle. Because if the muon is the true electron of atoms, it is packed inside a proton torus thrusting through the proton torus in a Faraday law and producing electricity, some of which this produced electricity is the 0.5MeV particle. And because that is the job, function and task of the muon as true electron of atoms, the muon has a intrinsic spin inside the protons and outside the proton. And the Standard Model was a bizarre theory of physics proposed by clownish imps of physics with no logical brain to speak of, for their particles have no job, no task, no function to tie them together and ask why in the world do they exist at all.

SAVE// AP's 151st book of science// TEACHING TRUE PHYSICS, 1st year College.

Alright, I am trying my hardest to get this published before September 2021 ends.

And down on the home stretch, I come to realize more than ever how much dependent I am on using biology's DNA structure to model the structure of light-waves-photons.

Just today I am writing the passages of where the Electric field in the AP-EM equations of E' = (V/(CB))' The Coulomb gravity law of EM-theory, that I need a Light wave to be both a pencil ellipse closed loop where the source is always connected until the light wave ends up, and that one end of the light wave is electromotive push, while the other end of the light wave at the source is electromotive pull. In other words, light waves are Double Transverse Waves, and in electricity, those are split up at the source and combined in the path traveled. And this is duplicated in DNA as when in reproduction, the DNA is two single strands coming together to form a double strand.

The only physics experiment to date that can corroborate AP's intuition on this is the Harvard Dr. Hau slow light experiment, if for some logical reason she finally decides to flick the switch of the light source off in BEC and watch the entire column of light vanish at the --same time---. But Harvard's Dr. Hau is too stubborn and thus ignorant in complying.

AP
King of Science, especially Physics


On Thursday, June 10, 1993 at 9:55:53 PM UTC-5, john baez wrote:
> In "week16" of my "This Week's Finds" series I described some of the
> recent work on discrete spacetime models in which there is NOT an
> absolute frame of reference. None of these have succeeded in modelling
> much 4d physics, but it's certainly a possibility to keep in mind.

AP writes: does it matter whatever John writes about science, when he cannot admit a slant cut in cone is an oval, not an ellipse. Earth may as well be flat in a mind like John Baez.



Here is an example of someone in physics, John Baez, Hartle, Hawking, who should never have been in physics. For they end up making big convoluted messes, never able to ask simple questions of true physics or math. For example, how simple is it to ask which is the real electron inside atoms-- is it the muon or the 0.5MeV particle. Or the simple question of , well Hawking you have the Pauli Exclusion principle that forbids collapse and black holes, so why are you freakishly thinking black holes exist at all.

Or the simple question of Ln(1.00005) does not come from a function Y = 1/x for that is a function (see AP's newsgroup of what is a function) unrelated to Ln(x) but close, still unrelated. And so the true value of Ln(1.00005) = 0.00005 and not what your computer or calculator throws at you.

John Baez is a failed physicist and mathematician, so caught up in abstraction messes and outright lies of science.

On Wednesday, March 17, 1993 at 6:28:03 PM UTC-5, john baez wrote:
> Watch the quantum cosmologists squirm. Hawking had for a while argued
> that according to his formula for the wavefunction of the universe
> (the Hartle-Hawking "no boundary boundary-value conditions") entropy
> would increase to the point halfway between big bang and big crunch, and
> then decrease. But in a recent paper (mentioned in my "This Week's
> Finds") he says he thinks one can *derive* time asymmetry from his
> wavefunction. I am very dubious of his argument, though, which seems to
> handwave at crucial points.
> In Andrei Linde's "chaotic inflation" scenario (another proposal for the
> wavefunction of the universe), it probably is not right to think of the
> hunk of the universe we see as being a closed system.
> In any event, if you are puzzled by this problem, you can take some
> consolation by knowing you are in good company!

Why bother John for you failed both math and physics, you cannot even acknowledge an oval is the slant cut in cone, never the ellipse. And when in the world will that small mind of yours ask the question which is the atom's real electron, the muon or the 0.5MeV particle. John, you have been a charlatan physicist all your life, but why not turn the corner and acknowledge an oval is the slant cut of cone.