On 27-04-2022 01:37, Bruce Kellett wrote:
On Tue, Apr 26, 2022 at 10:03 AM smitra <smi...@zonnet.nl> wrote:
On 24-04-2022 03:16, Bruce Kellett wrote:
A moment's thought should make it clear to you that this is not
possible. If both possibilities are realized, it cannot be the case
that one has twice the probability of the other. In the long run, if
both are realized they have equal probabilities of 1/2.
The probabilities do not have to be 1/2. Suppose one million people
participate in a lottery such that there will be exactly one winner.
The probability that one given person will win is then one in a
million. Suppose now that we create one million people using a machine
and then organize such a lottery. The probability that one given newly
created person will win is then also one in a million. The machine can
be adjusted to create any set of persons we like: it can create one
million identical persons, or almost identical persons, or totally
different persons. If we create one million almost identical persons,
the probability is still one in a million. This means that in the
limit of identical persons, the probability will be one in a million.

Why would the probability suddenly become 1/2 if the machine is set to
create exactly identical persons, when the probability would be one in
a million if we create persons that are almost, but not quite,
identical?
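The machine's settings do no work in the probability calculation, which a quick simulation makes plain. This is only a sketch: the population size of 100, the trial count, and the seed are illustrative stand-ins for the one-million case, not part of the original example.

```python
import random

def lottery_win_frequency(n_people, n_trials, seed=0):
    """Empirical frequency with which a fixed person (index 0) wins a
    one-winner lottery among n_people, over repeated independent draws."""
    rng = random.Random(seed)
    wins = sum(1 for _ in range(n_trials) if rng.randrange(n_people) == 0)
    return wins / n_trials

# Whether the n_people are distinct individuals or identical copies never
# enters the draw: the winner is chosen uniformly, so any one fixed person
# wins with probability 1/n_people.
freq = lottery_win_frequency(n_people=100, n_trials=100_000)
print(freq)  # close to 1/100 = 0.01
```

Nothing in the draw refers to how similar the participants are, so the limit of exactly identical persons changes nothing.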
Your lottery example is completely beside the point.
It provides an example of a case where your logic does not apply.
I think you
should pay more attention to the mathematics of the binomial
distribution. Let me explain it once more: If every outcome is
realized on every trial of a binary process, then after the first
trial, we have a branch with result 0 and a branch with result 1.
After two trials we have four branches, with results 00, 01, 10, and
11; after 3 trials, we have branches registering 000, 001, 010, 011,
100, 101, 110, and 111. Notice that these branches represent all
possible binary strings of length 3.
After N trials, there are 2^N distinct branches, representing all
possible binary sequences of length N. (This is just like Pascal's
triangle.) As N becomes very large, we can approximate the binomial
distribution with the normal distribution, with mean 0.5 and standard
deviation that decreases as 1/sqrt(N). In other words, the majority of
trials will have equal, or approximately equal, numbers of 0s and 1s.
Observers in these branches will naturally take the probability to be
approximated by the relative frequencies of 0s and 1s. In other words,
they will take the probability of each outcome to be 0.5.
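The counting claim being made here is easy to check exactly, using binomial coefficients instead of enumerating the 2^N strings. A sketch (N = 100 and the three-standard-deviation window are illustrative choices):

```python
from math import comb, sqrt

N = 100  # number of binary trials (illustrative)

# With every outcome realized on every trial, there are 2^N branches,
# one per binary string of length N; plain counting weighs them equally.
total = 2 ** N

# Fraction of branches whose relative frequency of 1s lies within three
# standard deviations of 0.5; under equal counting the standard
# deviation of the frequency is 0.5 / sqrt(N).
sd = 0.5 / sqrt(N)
near_half = sum(comb(N, k) for k in range(N + 1) if abs(k / N - 0.5) <= 3 * sd)
fraction_near_half = near_half / total

print(fraction_near_half)  # well above 0.99: almost all branches look "fair"
```

So under equal counting, observers in the overwhelming majority of branches do indeed see relative frequencies near 0.5, whatever the coefficients were.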
The problem with this is that you just assume that all branches are equally probable. You don't make that assumption explicit, but it is an assumption nonetheless. You are simply doing branch counting.
The important point to notice is that this result of all possible
binary sequences for N trials is independent of the coefficients in
the binary expansion of the state:

|psi> = a|0> + b|1>.
Changing the weights of the components in the superposition does not
change the conclusion of most observers that the actual probabilities
are 0.5 for each result. This is simple mathematics, and I am amazed
that even after all these years, and all the times I have spelled this
out, you still seek to deny the obvious result. Your logical and
mathematical skills are on a par with those of John Clark.
It's indeed simple mathematics. You apply it to branch counting to arrive at the result of equal probabilities. So the conclusion has to be that one should not do branch counting. The question is then whether this disproves the MWI. If by MWI we mean QM minus collapse, then clearly not: in that case we use the Born rule to compute the probabilities of outcomes, and we assume that after a measurement there are different sectors for the observers who have observed the different outcomes, with probabilities as given by the Born rule.
You then want to argue against that by claiming that your argument applies generally and would not allow one to give different sectors unequal probabilities. But that's nonsense, because you make the hidden assumption of equal probabilities right from the start. There is nothing in QM that says that branches must count equally, and the lottery example I gave makes it clear that you can have branching with unequal probabilities in classical physics.
Saibal
Bruce
--
You received this message because you are subscribed to the Google
Groups "Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send
an email to everything-li...@googlegroups.com.
To view this discussion on the web visit
https://groups.google.com/d/msgid/everything-list/CAFxXSLT22raNMhxUsrUqHni2P-T4Ww%3DXQh_HKUO7CBpTZv8q_Q%40mail.gmail.com
[1].
Links:
------
[1]
https://groups.google.com/d/msgid/everything-list/CAFxXSLT22raNMhxUsrUqHni2P-T4Ww%3DXQh_HKUO7CBpTZv8q_Q%40mail.gmail.com?utm_medium=email&utm_source=footer
On 28-04-2022 07:51, Bruce Kellett wrote:
> On Thu, Apr 28, 2022 at 3:24 PM Brent Meeker <meeke...@gmail.com>
> wrote:
>
>> On 4/26/2022 5:32 PM, smitra wrote:
>>
>>> On 27-04-2022 01:37, Bruce Kellett wrote:
>> Changing the weights of the components in the superposition does not
>> change the conclusion of most observers that the actual probabilities
>> are 0.5 for each result. This is simple mathematics, and I am amazed
>> that even after all these years, and all the times I have spelled this
>> out, you still seek to deny the obvious result. Your logical and
>> mathematical skill are on a par with those of John Clark.
>>
>> It's indeed simple mathematics. You apply that to branch counting to
>> arrive at the result of equal probabilities.
>
> I have not used branch counting. Please stop accusing me of that.
>
You are considering each branch to have an equal probability when there
is no logical reason to do so, and when that's also being contradicted
by QM.
>>> There is nothing in QM that says that branches must count equally,
>>> and the lottery example I gave makes it clear that you can have
>>> branching with unequal probabilities in classical physics.
>
> As I have said, there is no classical analogue of an interaction in
> which all outcomes necessarily occur. So your lottery example is
> useless. There is no concept of probability involved in any of this.
>
The lottery example I gave clearly is a classical example in which all
outcomes necessarily occur.
Your reasoning does not involve any QM at all; you just apply it to
the MWI. Your argument also goes through in the case of the lottery
example, where it leads to an obviously wrong conclusion. So it's your
reasoning that's at fault, not the MWI taken to be QM minus collapse.
> Is there any evidence that is NOT from collapse?
> How does it get recorded?
> Where is it?
> Collapse, after all, has a perfectly reasonable mechanism in terms of the flashes of relativistic GRW theory.
In
fact, that idea introduces a raft of problems of its own -- what is
the measure over this infinity of branches? What does it mean to
partition infinity in the ratio of 0.9:0.1? What is the mechanism
(necessarily outside the Schrodinger equation) that achieves this?
That simply means that there is as yet no good model for QM without the Born rule.
On 5/4/2022 12:27 PM, smitra wrote:
In fact, that idea introduces a raft of problems of its own -- what is
the measure over this infinity of branches? What does it mean to
partition infinity in the ratio of 0.9:0.1? What is the mechanism
(necessarily outside the Schrodinger equation) that achieves this?

That simply means that there is as yet no good model for QM without the Born rule.
But there is no mechanism for the Born rule. It is inconsistent with pure Schroedinger evolution of the wave function. I think the problem of measures on infinity is overcome if you simply postulate a very large but finite number of branches to split. Or why not a continuum probability, and just measure by the density around the eigenvalue... the measured values are never exact anyway. I don't think these things are wrong or show MWI is inconsistent, but I think they show it has just moved the problems it purported to solve off to some unobservable worlds, which is no better than CI.
On 04-05-2022 01:49, Bruce Kellett wrote:
>
> I have not introduced any concept of probability. The 2^N branches
> that are constructed when both outcomes are realized on each of N
> Bernoulli trials are all on the same basis.
If you ignore the amplitudes in the states, and that means modifying QM
into something else.
On Thu, May 5, 2022 at 8:04 AM Brent Meeker <meeke...@gmail.com> wrote:
On 5/4/2022 12:27 PM, smitra wrote:
In fact, that idea introduces a raft of problems of its own -- what is
the measure over this infinity of branches? What does it mean to
partition infinity in the ratio of 0.9:0.1? What is the mechanism
(necessarily outside the Schrodinger equation) that achieves this?

That simply means that there is as yet no good model for QM without the Born rule.
But there is no mechanism for the Born rule. It is inconsistent with pure Schroedinger evolution of the wave function. I think the problem of measures on infinity is overcome if you simply postulate a very large but finite number of branches to split.
The trouble, if the number of branches is finite, is that given the large number of splits since the beginning of time, you will eventually run out of branches to split.
Or why not a continuum probability, and just measure by the density around the eigenvalue
How do you measure the density? You still need to impose a measure on an infinite set.
...the measured values are never exact anyway. I don't think these things are wrong or show MWI is inconsistent, but I think they show it has just moved the problems it purported to solve off to some unobservable worlds, which is no better than CI.
They show that MWI, as proposed by Everett, cannot work without such extensive modification that it is no longer the same theory. What is more, the required modifications are all ad hoc patches -- you lose any claim to rigour.
Bruce
On 04-05-2022 01:49, Bruce Kellett wrote:
> On Tue, May 3, 2022 at 10:11 PM smitra <smi...@zonnet.nl> wrote:
>
What you are constructing is not the result of QM.
> I think you are being confused by the presence of coefficients in the
> expansion of the original state: the a and b in
>
> |psi> = a|0> + b|1>
>
> The linearity of the Schrodinger equation means that the coefficients,
> a and b, play no part in the construction of the 2^N possible
> branches; you get the same set of 2^N branches whatever the values of
> a and b. Think of it this way. If a = sqrt(0.9) and b = sqrt(0.1), the
> Born rule probability for |0> is 90%, and the Born rule probability
> for |1> is 10%. But, by hypothesis, both outcomes occur with certainty
> on each trial. There is a conflict here. You cannot rationally have a
> 10% probability for something that is certain to happen.
Of course you can. The lottery example shows that even in classical
physics you can imagine this happening. If a million copies of you are
made and one will win a lottery while the rest won't, then you have a
one in a million chance of experiencing winning the lottery, even though
both outcomes of winning and losing will occur with certainty.
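The numerical tension being argued over here can at least be displayed side by side. A sketch (N is an arbitrary choice; |a|^2 = 0.9 follows the example quoted above): the Born weight over length-N outcome strings peaks at 90% zeros, while unweighted branch counting peaks at 50% zeros.

```python
from math import comb

N = 1000          # number of trials (illustrative)
p = 0.9           # |a|^2, the Born-rule probability of outcome 0

def peak_zero_fraction(weight):
    """Fraction of zeros k/N at which the total weight assigned to the
    C(N, k) strings with k zeros is largest."""
    return max(range(N + 1), key=weight) / N

# Born rule: each string with k zeros carries weight p^k (1-p)^(N-k),
# and there are C(N, k) such strings.
born_peak = peak_zero_fraction(lambda k: comb(N, k) * p**k * (1 - p)**(N - k))
# Branch counting: every string carries the same weight, so only C(N, k) matters.
count_peak = peak_zero_fraction(lambda k: comb(N, k))

print(born_peak, count_peak)  # 0.9 vs 0.5
```

The two rules make sharply different predictions about what a typical observer records, which is exactly the point in dispute.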
On 5/4/2022 4:01 PM, Bruce Kellett wrote:
On Thu, May 5, 2022 at 8:04 AM Brent Meeker <meeke...@gmail.com> wrote:
On 5/4/2022 12:27 PM, smitra wrote:
In fact, that idea introduces a raft of problems of its own -- what is
the measure over this infinity of branches? What does it mean to
partition infinity in the ratio of 0.9:0.1? What is the mechanism
(necessarily outside the Schrodinger equation) that achieves this?

That simply means that there is as yet no good model for QM without the Born rule.
But there is no mechanism for the Born rule. It is inconsistent with pure Schroedinger evolution of the wave function. I think the problem of measures on infinity is overcome if you simply postulate a very large but finite number of branches to split.
The trouble, if the number of branches is finite, is that given the large number of splits since the beginning of time, you will eventually run out of branches to split.
There's always a bigger, but still finite number. Hilbert space already assumes continuous complex values.
Or why not a continuum probability, and just measure by the density around the eigenvalue
How do you measure the density? You still need to impose a measure on an infinite set.
The reals have natural measurable subsets which define the Lebesgue measure.
https://e.math.cornell.edu/people/belk/measuretheory/LebesgueMeasure.pdf
On Thu, May 5, 2022 at 9:57 AM Brent Meeker <meeke...@gmail.com> wrote:
On 5/4/2022 4:01 PM, Bruce Kellett wrote:
On Thu, May 5, 2022 at 8:04 AM Brent Meeker <meeke...@gmail.com> wrote:
On 5/4/2022 12:27 PM, smitra wrote:
In fact, that idea introduces a raft of problems of its own -- what is
the measure over this infinity of branches? What does it mean to
partition infinity in the ratio of 0.9:0.1? What is the mechanism
(necessarily outside the Schrodinger equation) that achieves this?

That simply means that there is as yet no good model for QM without the Born rule.
But there is no mechanism for the Born rule. It is inconsistent with pure Schroedinger evolution of the wave function. I think the problem of measures on infinity is overcome if you simply postulate a very large but finite number of branches to split.
The trouble, if the number of branches is finite, is that given the large number of splits since the beginning of time, you will eventually run out of branches to split.
There's always a bigger, but still finite number. Hilbert space already assumes continuous complex values.
You cannot adjust the total number of branches as you go: you can't manufacture more branches if you run short.
Or why not a continuum probability, and just measure by the density around the eigenvalue
How do you measure the density? You still need to impose a measure on an infinite set.
The reals have natural measurable subsets which define the Lebesgue measure.
https://e.math.cornell.edu/people/belk/measuretheory/LebesgueMeasure.pdf
Can you put the infinite set of branches in one-to-one correspondence with the reals? Are these, in fact, equivalent sets? What is the length of a set of branches?
I think there might be problems with using the Lebesgue measure over sets of branches. You can define a Lebesgue measure over the real line because there is a natural concept of the length of an interval. There is no such natural concept of length over a set of branches.
--
Bruce
On 5/4/2022 5:16 PM, Bruce Kellett wrote:
On Thu, May 5, 2022 at 9:57 AM Brent Meeker <meeke...@gmail.com> wrote:
On 5/4/2022 4:01 PM, Bruce Kellett wrote:
On Thu, May 5, 2022 at 8:04 AM Brent Meeker <meeke...@gmail.com> wrote:
On 5/4/2022 12:27 PM, smitra wrote:
In fact, that idea introduces a raft of problems of its own -- what is
the measure over this infinity of branches? What does it mean to
partition infinity in the ratio of 0.9:0.1? What is the mechanism
(necessarily outside the Schrodinger equation) that achieves this?

That simply means that there is as yet no good model for QM without the Born rule.
But there is no mechanism for the Born rule. It is inconsistent with pure Schroedinger evolution of the wave function. I think the problem of measures on infinity is overcome if you simply postulate a very large but finite number of branches to split.
The trouble, if the number of branches is finite, is that given the large number of splits since the beginning of time, you will eventually run out of branches to split.
There's always a bigger, but still finite number. Hilbert space already assumes continuous complex values.
You cannot adjust the total number of branches as you go: you can't manufacture more branches if you run short.
Why not? "Being a branch" is only a matter of degree anyway. There are a bazillion weakly decohered states every second which our instruments could not distinguish.
Or why not a continuum probability, and just measure by the density around the eigenvalue
How do you measure the density? You still need to impose a measure on an infinite set.
The reals have natural measurable subsets which define the Lebesgue measure.
https://e.math.cornell.edu/people/belk/measuretheory/LebesgueMeasure.pdf
Can you put the infinite set of branches in one-to-one correspondence with the reals? Are these, in fact, equivalent sets? What is the length of a set of branches?
I think there might be problems with using the Lebesgue measure over sets of branches. You can define a Lebesgue measure over the real line because there is a natural concept of the length of an interval. There is no such natural concept of length over a set of branches.
If the branches differ by a real parameter, like the time of the radioactive decay for Schrödinger's cat, it should work. In general you might have to come up with something like Zurek's quantum Darwinism to provide a measure.
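The decay-time idea can be sketched concretely (assumptions: the exponential decay law, and a one-hour half-life chosen purely for illustration). Branches are labeled by the continuous decay time t, and the measure of a set of branches is the integral of the decay density over it, with no branch counting anywhere:

```python
from math import exp, log

half_life = 1.0              # hours (illustrative)
lam = log(2) / half_life     # decay constant: density f(t) = lam * exp(-lam * t)

def measure_decayed_before(T):
    """Measure of the set of branches in which the atom decayed before
    time T: the integral of f(t) over [0, T], in closed form."""
    return 1.0 - exp(-lam * T)

print(measure_decayed_before(1.0))  # 0.5: one half-life splits the measure 50/50
print(measure_decayed_before(2.0))  # 0.75
```

The Lebesgue measure on the t-axis, weighted by the physical density, does the work that counting cannot do for a continuum of branches.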
On 05-05-2022 01:57, Bruce Kellett wrote:
> On Thu, May 5, 2022 at 5:27 AM smitra <smi...@zonnet.nl> wrote:
>>
>> Of course you can. The lottery example shows that even in classical
>> physics you can imagine this happening. If a million copies of you are
>> made and one will win a lottery whole the rest won't then you have one
>> in a million chance of experiencing winning the lottery, even though
>> both outcomes of winning and losing will occur with certainty.
>
> The trouble is that classically, a million copies of you cannot be
> made.
Then assume that I'm Mr. Data and just copy the software running Mr.
Data a million times. So this is not a fundamental problem with the
argument.
> The issue was that if the probability of an outcome is 10%, then
> it does not make sense to say that that outcome will certainly happen.
It does make sense in a scenario where there are multiple copies of the
same observer. If Alice makes 10 copies of Bob, and one copy of Bob is
going to experience outcome A while the rest will experience outcome B,
then Alice will see all the possible states for Bob. But from Bob's
point of view, things are different. After Bob is exposed to the result
(A or B) there are two versions of Bob, Bob_A and Bob_B, and if Bob
knows beforehand how the experiment is set up, he'll assign a
probability of 10% to finding himself in state Bob_A after the
experiment.
On 05-05-2022 01:15, Bruce Kellett wrote:
> On Thu, May 5, 2022 at 5:27 AM smitra <smi...@zonnet.nl> wrote:
>
>> On 04-05-2022 01:49, Bruce Kellett wrote:
>>>
>>> I have not introduced any concept of probability. The 2^N branches
>>> that are constructed when both outcomes are realized on each of N
>>> Bernoulli trials are all on the same basis.
>>
>> If you ignore the amplitudes in the states, and that means modifying
>> QM into something else.
>
> QM does not assume that all branches exist equally. In Everett you
> have already modified QM into something else.
>
> The Schrodinger equation is insensitive to the amplitudes. You get the
> same set of 2^N branches from the Schrodinger equation, whatever
> amplitudes you have. The weights of these branches certainly depend on
> the amplitudes: if there are n zeros in the set of N trials, there are
> N-n ones. The weight of the corresponding binary string is a^n
> b^(N-n), but without further assumption, this plays no role in the
> future development of the state or in the interpretation of the binary
> string. If you interpret it as the probability of the string, you
> again have a conflict, since all binary strings are constructed on an
> equal basis, the natural probability for each is 2^{-N}.
There is no conflict whatsoever with assuming the Born rule and the
Schrodinger equation. The "construction on an equal basis" is not at all
implied by the Schrödinger equation.
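That nothing inconsistent arises from weighting the 2^N strings by the Born rule can be checked directly. A sketch (N is arbitrary; a and b follow the |psi> = a|0> + b|1> expansion quoted above, with |a|^2 = 0.9):

```python
from math import comb

N = 50
a2, b2 = 0.9, 0.1   # |a|^2 and |b|^2 from |psi> = a|0> + b|1>

# Born weight of one length-N string with k zeros: a2^k * b2^(N-k);
# there are C(N, k) such strings, so sum over k with that multiplicity.
total_weight = sum(comb(N, k) * a2**k * b2**(N - k) for k in range(N + 1))
mean_zero_fraction = sum(
    (k / N) * comb(N, k) * a2**k * b2**(N - k) for k in range(N + 1)
)

print(total_weight)         # 1.0: the weights form a normalized distribution
print(mean_zero_fraction)   # 0.9: the weighted relative frequency is |a|^2
```

The weights a^n b^(N-n) (squared amplitudes) sum to one over all 2^N strings and reproduce |a|^2 as the expected relative frequency, so treating them as probabilities is internally consistent; only the extra assumption of equal counting conflicts with them.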
On 05-05-2022 00:04, Brent Meeker wrote:
On 5/4/2022 12:27 PM, smitra wrote:
In fact, that idea introduces a raft of problems of its own -- what is
the measure over this infinity of branches? What does it mean to
partition infinity in the ratio of 0.9:0.1? What is the mechanism
(necessarily outside the Schrodinger equation) that achieves this?

That simply means that there is as yet no good model for QM without
the Born rule.
But there is no mechanism for the Born rule. It is inconsistent with
pure Schroedinger evolution of the wave function. I think the problem
of measures on infinity is overcome if you simply postulate a very
large but finite number of branches to split. Or why not a continuum
probability, and just measure by the density around the
eigenvalue... the measured values are never exact anyway. I don't think
these things are wrong or show MWI is inconsistent, but I think they
show it has just moved the problems it purported to solve off to some
unobservable worlds, which is no better than CI.
The Born rule is not inconsistent with the Schrödinger equation; it just tells you that the wavefunction gives you the probability amplitudes. This is better than the CI, because the CI is inconsistent with the Schrödinger equation.

The issues with branches etc. are likely just artifacts of making hidden assumptions about branches. At the end of the day there are only a finite number of states an observer can be in. If an observer is modeled as an algorithm, take e.g. Star Trek's Mr. Data, then it's clear that there are only a finite number of bitstrings that can correspond to the set of all possible things Mr. Data can be aware of.
When you start to rely on subjective perspectives I think you've already violated the spirit of MWI which was proposed to apply to simple instrument records as well as consciousness. Decoherence is such an instrument that is implicit in the environment.
On 08-05-2022 06:04, Bruce Kellett wrote:
> On Sun, May 8, 2022 at 11:21 AM smitra <smi...@zonnet.nl> wrote:
>
>> The issues with branches etc. are likely just artifacts with making
>> hidden assumptions about branches. At the end of the day there are
>> only
>> a finite number of states an observer can be in. If an observer is
>> modeled as an algorithm, take e.g. Star Trek's Mr. Data then it's
>> clear
>> that there are only a finite number of bitstrings that can
>> correspond to
>> the set of all possible things Mr. Data can be aware of.
>
> Everett is supposed to be QM without observers. So the number of
> things that Mr Data can possibly be aware of is irrelevant. According
> to the SE, all branches are equivalent. All else flows from this --
> there are no further "hidden assumptions about branches".
>
Yes, but I'm not a big fan of "sticking to scripture". What matters for
me is that collapse is inconsistent with the SE; therefore we should
consider QM without collapse and see how best to move forward on that
basis.
On 08-05-2022 05:58, Bruce Kellett wrote:
> It is when you take the SE to imply that all possible outcomes exist
> on each trial. That gives all outcomes equal status.
All outcomes can exist without these being equally likely. One can make
models based on more branches for certain outcomes, but these are just
models that may not be correct.
What matters is that such models can be formulated in a mathematically
consistent way, which demonstrates that there is no contradiction. The
physical plausibility of such models is another issue.
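A minimal model of the kind described, with more branches for certain outcomes, is easy to write down. This is only a sketch: the multiplicities 9 and 1 are chosen to match a 0.9/0.1 split and are purely illustrative.

```python
from itertools import product

# Give outcome '0' nine sub-branches and outcome '1' one sub-branch at
# every split; equal counting over the final sub-branches then
# reproduces the 0.9 / 0.1 probabilities.
mult = {'0': 9, '1': 1}
N = 4  # number of trials, kept tiny so all histories can be enumerated

total = 0
weighted_zeros = 0.0
for history in product('01', repeat=N):
    count = 1
    for outcome in history:
        count *= mult[outcome]          # sub-branches sharing this history
    total += count
    weighted_zeros += count * history.count('0') / N

print(weighted_zeros / total)  # 0.9: counting sub-branches matches the Born weight
```

The model is mathematically consistent; whether the extra sub-branches correspond to anything physical is, as stated, a separate question.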
On Mon, May 9, 2022 at 6:37 AM smitra <smi...@zonnet.nl> wrote:
On 08-05-2022 05:58, Bruce Kellett wrote:
> It is when you take the SE to imply that all possible outcomes exist
> on each trial. That gives all outcomes equal status.
All outcomes can exist without these being equally likely. One can make
models based on more branches for certain outcomes, but these are just
models that may not be correct.
Such models are certainly inconsistent with the SE. So if your concern is that the SE does not contain provision for a collapse, then you should doubt other theories that violate the SE. You can't have it both ways: you can't reject collapse models because they violate the SE and then embrace other models that also violate the SE. Either the SE is universally correct, or it is not.

What matters is that such models can be
formulated in a mathematically consistent way, which demonstrates that
there is no contradiction. The physical plausibility of such models is
another issue.
This has been discussed. To allow for real number probabilities, the number of branches on each split must be infinite.
On 5/8/2022 3:42 PM, Bruce Kellett wrote:
On Mon, May 9, 2022 at 6:37 AM smitra <smi...@zonnet.nl> wrote:
On 08-05-2022 05:58, Bruce Kellett wrote:
> It is when you take the SE to imply that all possible outcomes exist
> on each trial. That gives all outcomes equal status.
All outcomes can exist without these being equally likely. One can make
models based on more branches for certain outcomes, but these are just
models that may not be correct.
Such models are certainly inconsistent with the SE. So if your concern is that the SE does not contain provision for a collapse, then you should doubt other theories that violate the SE. You can't have it both ways: you can't reject collapse models because they violate the SE and then embrace other models that also violate the SE. Either the SE is universally correct, or it is not.

What matters is that such models can be
formulated in a mathematically consistent way, which demonstrates that
there is no contradiction. The physical plausibility of such models is
another issue.
This has been discussed. To allow for real number probabilities, the number of branches on each split must be infinite.
I don't think that's a problem. The number of information bits within a Hubble sphere is something like the area in Planck units, which already implies the continuum is just a convenient approximation. If the area is N then something of order 1/N would be the smallest non-zero probability. Also there would be a cutoff for the off-diagonal terms of the density matrix. Once all the off-diagonal terms are zero then it's like a mixed density matrix, and one could say that one of the diagonal terms has "happened".
> The Everett program is to say that the SE is all that there is -- it explains everything.
> That is clearly false (no Born rule in the SE),
> so it might be wise to doubt the universal application of the SE.
On 5/8/2022 1:50 PM, smitra wrote:
>> That the CI is inconsistent with the Schrödinger equation is easy to
>> see. If the Schrödinger equation is valid, then the state of a system
>> evolves in a unitary way. But after a real collapse the state changes
>> in a non-unitary way.
> Which is only a problem if one insists that the Schroedinger equation is
the whole of the theory and it is ontic. CI denies the first and says
that measurements are projection operators, because a measurement is
necessarily a classical-like result. QBism says the whole theory is
epistemic.
>> If the measurement takes one minute, then the initial state of a patch
>> of one light-minute diameter around the location of the experiment
>> maps to a final state of that patch in a unitary way.
> You seem to overlook that this one-light-minute sphere also had incoming particles and radiation which could not be accounted for by the Schroedinger equation.
On Sun, May 8, 2022 at 7:00 PM Brent Meeker <meeke...@gmail.com> wrote:
On 5/8/2022 1:50 PM, smitra wrote:
>> That the CI is inconsistent with the Schrödinger equation is easy to
>> see. If the Schrödinger equation is valid, then the state of a system
>> evolves in a unitary way. But after a real collapse the state changes
>> in a non-unitary way.
> Which is only a problem if one insists that the Schroedinger equation is
the whole of the theory and it is ontic. CI denies the first and says
that measurements are projection operators, because a measurement is
necessarily a classical-like result. QBism says the whole theory is
epistemic.
And all of that is fundamentally the same as "shut up and calculate"; they're just dressed up in slightly different philosophical bafflegab.
>> If the measurement takes one minute, then the initial state of a patch
>> of one light-minute diameter around the location of the experiment
>> maps to a final state of that patch in a unitary way.
> You seem to overlook that this one-light-minute sphere also had incoming particles and radiation which could not be accounted for by the Schroedinger equation.
If Everett is right then when an electron makes an up/down decision it makes no difference if you think of it as the entire universe instantly splits or as the split expanding outward at the speed of light, either way something that happens on the surface of that expanding sphere can have no effect on its center because no signal can travel faster than light.
>> And all of that is fundamentally the same as "shut up and calculate ", they're just dressed up in slightly different philosophical bafflegab.
> They're not "dressed up", they are perfectly explicit in their interpretation and ontology.
>> If Everett is right then when an electron makes an up/down decision it makes no difference if you think of it as the entire universe instantly splits or as the split expanding outward at the speed of light, either way something that happens on the surface of that expanding sphere can have no effect on its center because no signal can travel faster than light.
> An electron makes an up/down decision?? That's a new interpretation! You miss the point that [...]
> parts of the universe not accounted for in your SE are acting on the instrument which is interacting with the electron as it's "making a decision".
> It's not the one-light-minute sphere expanding after the measurement event, it's the one-light-minute sphere contracting onto the measurement event before it. How does the SE account for it?
On 09-05-2022 00:42, Bruce Kellett wrote:
>
> Such models are certainly inconsistent with the SE. So if your concern
> is that the SE does not contain provision for a collapse, then you
> should doubt other theories that violate the SE. You can't have it
> both ways: you can't reject collapse models because they violate the
> SE and then embrace other models that also violate the SE. Either the
> SE is universally correct, or it is not.
>
>> What matters is that such models can be
>> formulated in a mathematically consistent way, which demonstrates that
>> there is no contradiction. The physical plausibility of such models
>> is another issue.
>
> This has been discussed. To allow for real number probabilities, the
> number of branches on each split must be infinite. The measure problem
> for infinite numbers of branches has not been solved. It is unlikely
> that any consistent measure over infinite numbers of branches can be
> defined. So this idea is probably a non-starter. At least other models
> have a reasonable chance of success.
>
As Brent has also pointed out, the amount of information in the
visible universe is finite. But one can also consider observers: each
observer has some finite memory, so there are only a finite number
of branches the observer can distinguish between.
On 09-05-2022 00:34, Bruce Kellett wrote:
> That still treats the SE as indubitably true. No theory in physics is
> 'indubitably true'.
>
> The Everett program is to say that the SE is all that there is -- it
> explains everything. That is clearly false (no Born rule in the SE),
> so it might be wise to doubt the universal application of the SE.
There is no good reason to doubt the SE without any experimental hints
that it breaks down, or any good theoretical reasons why it is likely to
break down in some regime.
On 11-05-2022 06:06, Bruce Kellett wrote:
> On Wed, May 11, 2022 at 1:56 PM smitra <smi...@zonnet.nl> wrote:
>
>> On 09-05-2022 00:34, Bruce Kellett wrote:
>>
>>> That still treats the SE as indubitably true. No theory in physics is
>>> 'indubitably true'.
>>>
>>> The Everett program is to say that the SE is all that there is -- it
>>> explains everything. That is clearly false (no Born rule in the SE),
>>> so it might be wise to doubt the universal application of the SE.
>>
>> There is no good reason to doubt the SE without any experimental hints
>> that it breaks down, or any good theoretical reasons why it is
>> likely to break down in some regime.
>
> Such faith would be touching if it weren't so naive. There are good
> theoretical and experimental reasons to believe that it cannot be the
> whole story.
>
As John Clark has also mentioned, the opposite is true. There are no
good arguments for collapse theories. There are no experimental hints
of real collapse, and if we argue from theory, we see that collapse
leads to many problems.
On 11-05-2022 06:01, Bruce Kellett wrote:
> On Wed, May 11, 2022 at 1:51 PM smitra <smi...@zonnet.nl> wrote:
>
>> On 09-05-2022 00:42, Bruce Kellett wrote:
>>>
>>> Such models are certainly inconsistent with the SE. So if your concern
>>> is that the SE does not contain provision for a collapse, then you
>>> should doubt other theories that violate the SE. You can't have it
>>> both ways: you can't reject collapse models because they violate the
>>> SE and then embrace other models that also violate the SE. Either the
>>> SE is universally correct, or it is not.
>>>
>>>> What matters is that such models can be
>>>> formulated in a mathematically consistent way, which demonstrates that
>>>> there is no contradiction. The physical plausibility of such models
>>>> is another issue.
>>>
>>
>> As Brent has also pointed out, the amount of information in the
>> visible universe is finite.
>
> That does not limit the number of branches. A finite universe does not
> limit the number of points in a line.
There is no such thing as a mathematical continuum in the real physical
world.
There are only a finite number of distinct quantum states
available for a finite universe.
This is clear for states below some total energy E, and there is an
upper limit on E itself: a finite region undergoes gravitational
collapse once its energy exceeds a certain bound.
>> But one can also consider observers and then
>> each observer has some finite memory so there are only a finite
>> number of branches the observer can distinguish between.
>
> That does not follow.
>
If there are only a finite number of states the entire universe can be
in, then that's also true for observers.
> The SE also has many problems
> Well, there's a big fat hint that it [SE] breaks down FAPP in every
> measurement, in every bit of physics that appears classical and
> irreversible.
> And it might just be an effective approximation as in QBism.
On 11-05-2022 07:42, Brent Meeker wrote:
>
> That's complete and audacious question begging. What you mean by
> "real" is "modeled within the SE". There is NOTHING BUT collapse
> experimentally; every result recorded in every notebook and every tape
> is evidence of a collapse.
>
There is effective collapse in experiments we do, but the experiments
nevertheless demonstrate that the fundamental processes proceed under
unitary time evolution.
On 11-05-2022 07:30, Bruce Kellett wrote:
>
> Who proved that the universe was finite?
>
If it's infinite, one can focus on only the visible part of it.
>> If there are only a finite number of states the entire universe can
>> be in, then that's also true for observers.
>
> That simply begs the question.
>
Finite or infinite universe, observers are always finite.
On 11-05-2022 08:14, Bruce Kellett wrote:
> On Wed, May 11, 2022 at 3:39 PM Brent Meeker <meeke...@gmail.com>
> wrote:
>
>> On 5/10/2022 9:43 PM, smitra wrote:
>>
>>> If there are only a finite number of states the entire universe can be
>>> in, then that's also true for observers.
>>
>> So what does the SE for this discrete universe look like? The one
>> everyone cites assumes a continuum. If the universe is finite then
>> there's a smallest non-zero probability, which, as Bruce says, raises
>> some problems.
>
> Not the least of these problems is the fact that a smallest non-zero
> probability makes the collapse real; destroys the ongoing
> superposition; renders everything absolutely irreversible; and screws
> the hell out of unitary evolution.
Counterexample: The internal state of an ideal quantum computer will
always evolve under unitary time evolution.
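[Editorial illustration: the counterexample can be made concrete with a two-qubit circuit. A minimal numpy sketch; the Hadamard + CNOT circuit is a standard textbook example, not anything specified in this thread.]

```python
import numpy as np

# A tiny "ideal quantum computer": Hadamard on the first qubit followed
# by a CNOT. The circuit matrix is a product of unitaries, hence itself
# unitary, and therefore exactly reversible.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
U = CNOT @ np.kron(H, np.eye(2))

assert np.allclose(U.conj().T @ U, np.eye(4))   # unitary

# Reversibility: U† undoes U exactly, recovering the input |00>.
psi_in = np.array([1.0, 0.0, 0.0, 0.0])
psi_out = U @ psi_in                            # a Bell state
assert np.allclose(U.conj().T @ psi_out, psi_in)
```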
> All that the experiments demonstrate is that the wave function evolves
> unitarily between state preparation and measurement. This is most
> easily accounted for by assuming that the wave function is a purely
> epistemic vehicle for the time evolution of probabilities. Since it is
> purely epistemic, collapse is not a problem since it is not a physical
> event. One does not have to go the whole way to QBism -- the wave
> function can still be objective (inter-subjectively agreed).
That's possible, but it means that QM is not a complete fundamental theory of reality. Anything that explains these probabilities is then possible, including the existence of a multiverse.
On 12-05-2022 01:36, Bruce Kellett wrote:
> On Thu, May 12, 2022 at 9:24 AM smitra <smi...@zonnet.nl> wrote:
>
>> On 11-05-2022 07:30, Bruce Kellett wrote:
>>> Who proved that the universe was finite?
>>
>> If it's infinite, one can focus on only the visible part of it.
>
> The visible part is only locally defined -- go to the edge and there
> is another, larger, region.
>
Yes, but in the end this doesn't really matter, because interactions
are local. After a finite time any finite system can only have
interacted with a finite number of degrees of freedom in its
environment.
>>>> If there are only a finite number of states the entire universe can
>>>> be in, then that's also true for observers.
>>>
>>> That simply begs the question.
>>>
>>
>> Finite or infinite universe, observers are always finite.
>
> The universe itself is not defined by observers.
The state of the observer can then factor out of the branches the
universe is in.
On 12-05-2022 00:44, Brent Meeker wrote:
> On 5/11/2022 1:06 PM, smitra wrote:
>
>> There is effective collapse in experiments we do, but the
>> experiments nevertheless demonstrate that the fundamental processes
>> proceed under unitary time evolution.
>
> Except when you measure them and actually get a result.
>
No, no experimental results exist that demonstrate that unitary time
evolution is not exactly valid. What you are referring to is that in
the experiments we do, the wavefunction of the measured system
(effectively) collapses. But because we also know from all the
experimental results that the wavefunction evolves in a unitary way,
and experiments are ultimately nothing more than many-particle
interactions, it follows that either unitary time evolution cannot be
exactly valid, or the collapse during measurement is an artifact of
decoherence, where the observer (and the local environment) gets into
an entangled superposition with the measured system. The former
hypothesis lacks experimental support.
> Explaining the values of the probabilities isn't the problem with MWI, it's explaining that there are probabilities
On 12-05-2022 22:18, Brent Meeker wrote:
>
> I agree. And in fact SE fails all the time. It fails to predict a
> definite outcome...which is OK if you accept probabilistic theories.
Physics doesn't work that way. You always need a well-defined
hypothesis first in order to interpret experimental results and to
test alternative hypotheses/theories. If you don't do this, you are
not doing physics.
> But then its real failure is that it doesn't tell you exactly when and
> where and why it stops unitary evolution and produces a result.
That's a failure of particular interpretations of QM that postulate
collapse, e.g. the CI.
> The Born rule tells us the probability of a result...IF there is one.
> Decoherence tells there's an asymptotic approach to a result and
> why...but not when and where it arrives.
Decoherence does tell you how the different sectors split over time.
On Thu, May 12, 2022 at 4:27 PM Brent Meeker <meeke...@gmail.com> wrote:
> Explaining the values of the probabilities isn't the problem with MWI,
> it's explaining that there are probabilities.
That's easy in MWI. Probabilities exist because until you actually look
at it there is no way to know if you are the Brent Meeker who lives in a
universe where the electron went left or the Brent Meeker who lives in a
universe where the electron went right, since the only difference
between the two Brent Meekers is what the electron does.
>>> Explaining the values of the probabilities isn't the problem with
>>> MWI, it's explaining that there are probabilities.
>> That's easy in MWI. Probabilities exist because until you actually
>> look at it there is no way to know if you are the Brent Meeker who
>> lives in a universe where the electron went left or the Brent Meeker
>> who lives in a universe where the electron went right, due to the fact
>> that the only difference between the two Brent Meekers is what the
>> electron does.
> But you don't think this applies with non-MWI duplication.