Re: The Nature of Contingency: Quantum Physics as Modal Realism

Brent Meeker

Apr 28, 2022, 1:23:25 AM
to Everything List

On 4/27/2022 10:38 AM, smitra wrote:
> On 27-04-2022 04:08, Brent Meeker wrote:
>> On 4/26/2022 5:32 PM, smitra wrote:
>>
>>> On 27-04-2022 01:37, Bruce Kellett wrote:
>>> On Tue, Apr 26, 2022 at 10:03 AM smitra <smi...@zonnet.nl> wrote:
>>>
>>> On 24-04-2022 03:16, Bruce Kellett wrote:
>>>
>>> A moment's thought should make it clear to you that this is not
>>> possible. If both possibilities are realized, it cannot be the case
>>> that one has twice the probability of the other. In the long run, if
>>> both are realized they have equal probabilities of 1/2.
>>>
>>> The probabilities do not have to be 1/2. Suppose one million people
>>> participate in a lottery such that there will be exactly one winner.
>>> The probability that one given person will win is then one in a
>>> million. Suppose now that we create one million people using a machine
>>> and then organize such a lottery. The probability that one given newly
>>> created person will win is then also one in a million. The machine can
>>> be adjusted to create any set of persons we like: it can create one
>>> million identical persons, or almost identical persons, or totally
>>> different persons. If we then create one million almost identical
>>> persons, the probability is still one in a million. This means that in
>>> the limit of identical persons, the probability will be one in a
>>> million.
>>>
>>> Why would the probability suddenly become 1/2 if the machine is set to
>>> create exactly identical persons, while the probability would be one
>>> in a million if we create persons that are almost, but not quite,
>>> identical?
>>
>> Your lottery example is completely beside the point.
>>
>> It provides an example of a case where your logic does not apply.
>>
>>> I think you should pay more attention to the mathematics of the
>>> binomial distribution. Let me explain it once more: if every outcome
>>> is realized on every trial of a binary process, then after the first
>>> trial, we have a branch with result 0 and a branch with result 1.
>>> After two trials we have four branches, with results 00, 01, 10, and
>>> 11; after 3 trials, we have branches registering 000, 001, 011, 010,
>>> 100, 101, 110, and 111. Notice that these branches represent all
>>> possible binary strings of length 3.
>>>
>>> After N trials, there are 2^N distinct branches, representing all
>>> possible binary sequences of length N. (This is just like Pascal's
>>> triangle.) As N becomes very large, we can approximate the binomial
>>> distribution with the normal distribution, with mean 0.5 and standard
>>> deviation that decreases as 1/sqrt(N). In other words, the majority of
>>> branches will have equal, or approximately equal, numbers of 0s and
>>> 1s. Observers in these branches will naturally take the probability to
>>> be approximated by the relative frequencies of 0s and 1s. In other
>>> words, they will take the probability of each outcome to be 0.5.
>>
>> The problem with this is that you just assume that all branches are
>> equally probable. You don't make that explicit, it's implicitly
>> assumed, but it's just an assumption. You are simply doing branch
>> counting.
>>
>> But it shows why you can't use branch counting.  There's no physical
>> mechanism for translating the _a_ and _b_ of _|psi> = a|0> + b|1>_
>> into numbers of branches.  To implement that you have to put it in "by
>> hand" that the branches have weights or numerosity of _a_ and _b_.
>> This is possible, but it gives the lie to the MWI mantra of "It's just
>> the Schroedinger equation."
>>
>
> The problem is with giving a physical interpretation to the
> mathematics here. If we take MWI to be QM without collapse, then we
> have not specified anything about branches yet. Different MWI
> advocates have published different ideas about this, and they can't
> all be right. But at heart MWI is just QM without collapse. To proceed
> in a rigorous way, one has to start with what counts as a branch. It
> seems to me that this has to involve the definition of an observer,
> and that requires a theory about what observation is. I.m.o., this has
> to be done by defining an observer as an algorithm, but many people
> think that you need to invoke environmental decoherence. People like
> Zurek, using the latter definition, have attempted to derive the Born
> rule based on that idea.
>
> I.m.o., one has to start working out a theory based on rigorous
> definitions and then see where that leads, instead of arguing based
> on vague, ill-defined notions.

"Observer as an algorithm" seems pretty ill defined to me.  Which
algorithm?  applied to what input?  How does the algorithm, a Platonic
construct, interface with the physical universe? Decoherence seems much
better defined.  And so does QBism.
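
For what it's worth, the arithmetic behind that branch-counting dispute
is easy to check numerically. A minimal sketch in plain Python, where the
"branches" are simply the 2^N binary strings and |b|^2 = 0.8 is an
arbitrary illustrative value:

from math import comb

def near_half_fraction(N, eps=0.05):
    # Fraction of the 2^N branches whose frequency of 1s is within eps of
    # 0.5, counting every branch equally (plain branch counting).
    good = sum(comb(N, k) for k in range(N + 1) if abs(k / N - 0.5) <= eps)
    return good / 2 ** N

def born_weight_near_half(N, p1=0.8, eps=0.05):
    # Total Born weight carried by those same branches when P(1) = |b|^2 = p1.
    return sum(comb(N, k) * p1 ** k * (1 - p1) ** (N - k)
               for k in range(N + 1) if abs(k / N - 0.5) <= eps)

for N in (10, 100, 1000):
    print(N, near_half_fraction(N), born_weight_near_half(N))

By sheer count, almost every branch ends up with a frequency of 1s near
0.5 as N grows, no matter what a and b are; the Born weight of those same
branches goes to zero unless |b|^2 = 0.5. The amplitudes only enter if
weights are put in by hand.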

Brent

smitra

May 3, 2022, 7:48:22 AM
to everyth...@googlegroups.com
Any human observer is arguably implemented by an algorithm run by a
brain. So, for any given observer at some time, there exists a precisely
defined algorithm that defines that observer. In practice we cannot
provide any such definition, but from the point of view of the theory,
it's important to take into account the way an observer should be
rigorously defined.

Decoherence should be irrelevant to this issue. It's a process that
happens at the macroscopic scale that is associated with observation. But
ultimately observation is just the entanglement of the state of the
observer with the measured system; it doesn't matter whether that
information also leaks out to an astronomically large number of other
degrees of freedom.

Saibal


> Brent

Brent Meeker

May 3, 2022, 1:47:01 PM
to everyth...@googlegroups.com
Plus sensors, plus environment....you call that "well defined"??

> So, for any given observer at some time, there exists a precisely
> defined algorithm that defines that observer. In practice we cannot
> provide any such definition, but from the point of view of the theory,
> it's important to take into account the way an observer should be
> rigorously defined.
>
> Decoherence should be irrelevant to this issue. It's a process that
> happens at the macroscopic scale that is associated with observation.
> But ultimately observation is just the entanglement of the state of
> the observer with the measured system; it doesn't matter whether that
> information also leaks out to an astronomically large number of other
> degrees of freedom.

The brain already has a bazillion degrees of freedom.  Without
decoherence the entanglement is reversible and nothing ever really
happens...and probability is meaningless.  You can only talk this way
because you've artificially bounded the observer-algorithm as though
it's an isolated system...which is OK FAPP...but then you want to treat
the evolution as though it's pure.  This is just muddling the CI problem
of Heisenberg's cut, not solving it.

Brent

>
> Saibal
>
>
>> Brent
>

smitra

May 4, 2022, 2:26:08 PM
to everyth...@googlegroups.com
What matters is that it's well defined in principle. That in practice it
looks like a big mess is irrelevant.

>> So, for any given observer at some time, there exists a precisely
>> defined algorithm that defines that observer. In practice we cannot
>> provide any such definition, but from the point of view of the theory,
>> it's important to take into account the way an observer should be
>> rigorously defined.
>>
>> Decoherence should be irrelevant to this issue. It's a process that
>> happens at the macroscopic scale that is associated with observation.
>> But ultimately observation is just the entanglement of the state of
>> the observer with the measured system; it doesn't matter whether that
>> information also leaks out to an astronomically large number of other
>> degrees of freedom.
>
> The brain already has a bazillion degrees of freedom.  Without
> decoherence the entanglement is reversible and nothing ever really
> happens...and probability is meaningless.  You can only talk this way
> because you've artificially bounded the observer-algorithm as though
> it's an isolated system...which is OK FAPP...but then you want to
> treat the evolution as though it's pure.  This is just muddling the CI
> problem of Heisenberg's cut, not solving it.
>

Even with decoherence, everything is still reversible. So, the idea that
reversibility implies that nothing can happen must be wrong. Locality
means that even in the case of an open system, all the interactions
involved in an observation involve only a finite number of degrees of
freedom. A process that takes 10 seconds can only involve the degrees of
freedom inside a radius of 10 light-seconds.

It's true that probability is a problematic concept; it's perhaps better
to interpret probability as information. So, in the case of a branching
where some branches have low probability, this low probability means
that more information is needed to describe the observations compared to
observations in high-probability branches.
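
To make that reading concrete, here is a minimal sketch, assuming
"information" is read as the standard surprisal measure: a branch of
probability p takes about -log2(p) bits to single out.

from math import log2

def surprisal_bits(p):
    # Bits needed to single out a branch that occurs with probability p.
    return -log2(p)

for p in (0.5, 0.25, 0.01, 1e-6):
    print(f"branch probability {p:g} -> {surprisal_bits(p):.1f} bits")

A 50/50 branch costs one bit, while a one-in-a-million branch costs about
20 bits; that is the sense in which observations in low-probability
branches need more information to describe.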

Saibal

> Brent
>
>>
>> Saibal
>>
>>
>>> Brent
>>