Horizons protect Church-Turing


ronaldheld

Mar 5, 2020, 6:42:27 AM
to Everything List
Any comments, especially from Bruno, and the Physicalists?
2003.01807.pdf (Susskind, arXiv:2003.01807)

Lawrence Crowell

Mar 5, 2020, 10:01:38 AM
to Everything List
There seems to be a bit of a gel-in-the-mould setting at work here: this and related ideas are appearing in a number of places. Read my paper on the FQXi contest


and my final comment there has a loose summary of some of this. The MIP* = RE result is interesting as well, and I am carving out time next week to study it seriously. I think the domain of computation there has a connection with Hogarth-Malament spacetimes and with the role of epistemic horizons; whether as topological obstructions tied to quantum entanglement or as event horizons, these appear to present barriers that protect the Church-Turing thesis.

LC

Bruno Marchal

Mar 6, 2020, 3:49:30 AM
to everyth...@googlegroups.com
On 5 Mar 2020, at 12:42, ronaldheld <ronal...@gmail.com> wrote:

Any comments, especially from Bruno, and the Physicalists?

Hmm… this is the “quantum Church-Turing thesis”, but OK, it is from Susskind, whom I often appreciate reading or listening to.
I will read it and comment next week. 
(Busy day, sorry.)

Good Week-End!

Bruno





Philip Thrift

Mar 6, 2020, 5:31:17 AM
to Everything List


Quantum mechanics places limits on what can be observed. Further, while it is deterministic wave theory, outcomes of specific measurements are purely stochastic.

The "deterministic wave theory" is one of the biggest mistakes in the history of theoretical physics. To get rid of this error, it should be formulated in terms of its foundational mathematics as a stochastic theory.

https://spiral.imperial.ac.uk/bitstream/10044/1/70797/1/Wilkes-H-2019-PhD-Thesis.pdf

I don't think quantum theory - or quantum+gravity/spacetime theory - will ever have anything to do with Church-Turing,  or Gödel,  or Cantor's Paradise, ...


@philipthrift

Lawrence Crowell

Mar 6, 2020, 6:40:08 AM
to Everything List
Szangolies [J. Szangolies, "Epistemic Horizons and the Foundations of Quantum Mechanics," https://arxiv.org/abs/1805.10668] works out a form of Cantor diagonalization for quantum measurements. A fully worked-out version of the CHSH or Bell-inequality-violation result is still waiting. There are exciting possibilities for connections between quantum mechanics, in particular quantum decoherence and measurement, and Gödel’s theorem.

If we think of all physics as a form of convex sets of states, then there are dualisms of measures p and q that obey 1/p + 1/q = 1. For quantum mechanics this is p = ½ as an L^2 measure theory. It then has a corresponding q = ½ measure system that I think is spacetime physics. A straight probability system has p = 1, sum of probabilities as unity, and the corresponding q → ∞ has no measure or distribution system. This is any deterministic system, think completely localized, that can be a Turing machine, Conway's Game of Life or classical mechanics. A quantum measurement is a transition between p = ½ for QM and ∞ for classicality or 1 for classical probability on a fundamental level.

What separates these different convex sets are these topological obstructions, such as the indices given by the Kirwan polytope. The distinction between entanglements is also given by these topological indices or obstructions. How these determine a measurement outcome, or the ontology of an element of a decoherent set, is not decidable. This is where Gödel’s theorem enters in. A quantum measurement is a way that quantum information or qubits encode other qubits as Gödel numbers.

The prospect that spacetime, or the entropy of spacetime via event-horizon areas, is a condensate or large-N entanglement of quantum states then implies a connection between quantum computation and the information accessible in spacetime configurations. These configurations may obey either the Bekenstein bound S = kA/4ℓ_p^2 or a quantum-modified version S = kA/4ℓ_p^2 + quantum corrections. The quantum processing, or quantum Church-Turing thesis, is then, I think, equivalent to the information processing of spacetime as black holes and maybe entire cosmologies.
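To get a rough feel for the scale the bound sets, here is a minimal numeric sketch (hand-typed SI constants, not taken from any of the cited papers) evaluating S = kA/4ℓ_p^2 for a one-solar-mass Schwarzschild horizon; it lands at the familiar ~10^77 in units of k_B:

import math

G, c, hbar, k_B = 6.674e-11, 2.998e8, 1.055e-34, 1.381e-23   # SI units
M = 1.989e30                       # one solar mass, kg

r_s = 2 * G * M / c**2             # Schwarzschild radius
A = 4 * math.pi * r_s**2           # horizon area
l_p2 = hbar * G / c**3             # Planck length squared

S = k_B * A / (4 * l_p2)           # Bekenstein-Hawking entropy, J/K
print(f"A = {A:.3e} m^2, S = {S:.3e} J/K (~{S / k_B:.2e} k_B)")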

These are exciting developments.

LC

Philip Thrift

Mar 6, 2020, 6:57:34 AM
to Everything List


While programming/computing in (hypothetical) infinite domains is interesting ...

Computing in Cantor’s Paradise With λ_ZFC

how any of this relates in any way to physical reality (the stuff of nature that is actually around us in the universe, vs. just some theoretical, mathematical concoction someone may come up with) is dubious.

(Consciousness is another thing, or subject: it may be "beyond" Turing, but in a way that has nothing to do with "super" or "hyper" Turing, or Cantor, or Gödel.)

@philipthrift

ronaldheld

Mar 6, 2020, 3:07:17 PM
to Everything List
Interesting responses, which I did expect. From the physical-universe POV, is CT relevant?

Brent Meeker

Mar 6, 2020, 5:25:08 PM
to everyth...@googlegroups.com


On 3/6/2020 3:40 AM, Lawrence Crowell wrote:
Szangolies [ J. Szangolies, "Epistemic Horizons and the Foundations of Quantum Mechanics," https://arxiv.org/abs/1805.10668  ] works a form of the Cantor diagonalization for quantum measurements. As yet a full up form of the CHSH or Bell inequality violation result is waiting. There are exciting possibilities for connections between quantum mechanics, in particular the subject of quantum decoherence and measurement, and Gödel’s theorem.

If we think of all physics as a form of convex sets of states, then there are dualisms of measures p and q that obey 1/p + 1/q = 1. For quantum mechanics this is p = ½ as an L^2 measure theory. It then has a corresponding q = ½

Which would give 1/p + 1/q = 4 ??

Brent


Lawrence Crowell

Mar 6, 2020, 6:21:02 PM
to Everything List
On Friday, March 6, 2020 at 4:25:08 PM UTC-6, Brent wrote:


On 3/6/2020 3:40 AM, Lawrence Crowell wrote:
Szangolies [ J. Szangolies, "Epistemic Horizons and the Foundations of Quantum Mechanics," https://arxiv.org/abs/1805.10668  ] works a form of the Cantor diagonalization for quantum measurements. As yet a full up form of the CHSH or Bell inequality violation result is waiting. There are exciting possibilities for connections between quantum mechanics, in particular the subject of quantum decoherence and measurement, and Gödel’s theorem.

If we think of all physics as a form of convex sets of states, then there are dualisms of measures p and q that obey 1/p + 1/q = 1. For quantum mechanics this is p = ½ as an L^2 measure theory. It then has a corresponding q = ½

Which would give 1/p + 1/q = 4 ??

Brent


Oops, I meant p = 2 and q = 2.

LC
 

Lawrence Crowell

Mar 6, 2020, 6:25:13 PM
to Everything List
On Friday, March 6, 2020 at 5:57:34 AM UTC-6, Philip Thrift wrote:


While programming/computing in (hypothetical) infinite domains is interesting ...

Computing in Cantor’s Paradise With λ_ZFC

how any of this relates in any way to physical reality (the stuff of nature that is actually around us in the universe, vs. just some theoretical, mathematical concoction someone may come up with) is dubious.

(Things like consciousness is another thing, or subject: It may be "beyond" Turing, bit in a way that has nothing to do with "super" or "hyper" Turing or Cantor or Godel.)

@philipthrift

λ-calculus is equivalent to Turing computation. In fact it is similar to an assembly language. It might be that some of these problems could be looked at in terms of the λ-calculus.

LC

Philip Thrift

Mar 7, 2020, 7:07:26 AM
to Everything List



This is about the λ_ZFC calculus, not the λ calculus.


λ_ZFC contains infinite terms. Infinitary languages are useful and definable: the infinitary lambda calculus [10] is an example, and Aczel’s broadly used work [2] on inductive sets treats infinite inference rules explicitly.

@philipthrift

Lawrence Crowell

Mar 7, 2020, 12:33:11 PM
to Everything List
On Saturday, March 7, 2020 at 6:07:26 AM UTC-6, Philip Thrift wrote:



This is about the λ_ZFC calculus, not the λ calculus.


λ_ZFC contains infinite terms. Infinitary languages are useful
and definable: the infinitary lambda calculus [10] is an example, and Aczel’s
broadly used work [2] on inductive sets treats infinite inference rules explicitly.

@philipthrift


I am aware of this. It is a bit like considering Peano arithmetic in a domain where the axioms of infinity and choice hold.

Bruno Marchal

Mar 9, 2020, 5:57:24 AM
to everyth...@googlegroups.com
On 5 Mar 2020, at 12:42, ronaldheld <ronal...@gmail.com> wrote:

Any comments, especially from Bruno, and the Physicalists?



The paper is quite interesting, but I will have to deepen my understanding of black holes (and GR) to better appreciate it.

There are some preliminary points where “I disagree” (or the universal machine disagrees), but they might not be relevant with respect to the paper itself; they are relevant to the plausible link you make between the paper and physicalism.

Typically, the first sentence of the paper is “physicalist”, which is not astonishing in this context. Susskind says that the original Church-Turing thesis may be regarded as a principle of physics (which it is not). 

This can be shown to be inconsistent with the mechanist assumption in Cognitive Science (not in physics). Indeed, with mechanism, a priori, the physical reality should be able to compute more than a Turing machine. The physical reality can simulate “real” random oracles, for example, and any physical object requires the entire universal dovetailing to be determined. That entails no-cloning, and a priori many more computability abilities (random oracles, “white rabbits”, infinite sums over infinitely many histories, the full set of true sigma_1 sentences, but also non-computable Pi_1 truths pertaining to the distribution of accessible states, etc.).
So much so that Mechanism must explain the apparent computability of nature from a non-computable subset of the arithmetical truth. So, here, the paper implicitly relies on physicalism, without the awareness that eventually the physical appearances have to be explained by the statistics on all computations in arithmetic, not just the “quantum ones”, and the quantum must be extracted from the machine’s theory of consciousness (as I did). So, normally, it can be expected that the (original) Church-Turing thesis (which is one half of Mechanism) might imply the falsity of the quantum Church-Turing thesis (due to Deutsch; I have to think about how much that is related to Susskind’s quantum-extended Church thesis).

This concerns only the original Church thesis and its impact on the possible physics available to the arithmetical machine, and as physics is not yet entirely derived either from arithmetic or from observation (cf. the GR + QM problems), it is hard at this stage to see how far mechanism will confirm or diminish the validity of Susskind’s idea on the extended quantum and physical versions of CT.
Yet, unlike the typical use of quantum mechanics to argue that an infinite computation cannot be realised in the physical universe (which would need digital states encodable below the Planck scale, and thus hardly usable by any concrete observer), Susskind’s idea is more subtle, and involves a notion of “complexity” related to the interior of black holes. I would need here to revise (to say the least) Finkelstein’s derivation of GR from a finitist or discrete approach to quantum mechanics, which is still above my head… (I mentioned the interesting book by Selesnick on it some time ago).

So, just to be clear:

CT = anything computable is computable by a Turing machine (or by a combinator, or by Robinson Arithmetic, or by LISP, etc.). This has a priori nothing to do with physics. 

I will write s-CT for Susskind’s physicalist version of CT: anything physically computable is computable by a Turing machine. (The physical reality does not compute more.) This is an open problem to me. It is not excluded that the physical reality which emerges from all computations in arithmetic might have non-Turing-computable components.

Then s-ECT is the thesis that anything computable *efficiently* (i.e. in polynomial time) physically, by nature, is computable in polynomial time by a Turing machine. This thesis is usually believed to be wrong, as Susskind says, and indeed, if it were not wrong, we would not invest in quantum computing. Most people today believe that factoring (large) numbers cannot be done in polynomial time by a Turing machine.
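To make “efficiently” concrete, a naive trial-division factoriser (a throwaway sketch, not anyone’s proposal here) runs in time roughly 2^(d/2) in the number of digits d; that exponential classical cost is the kind of thing s-ECT doubters point to, since Shor’s quantum algorithm factors in time polynomial in d.

def trial_division(n):
    # returns the smallest non-trivial factor of n, or n itself if n is prime
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    return n

print(trial_division(10**12 + 39))   # fine for ~12 digits; hopeless for ~1000 digits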

qECT (Susskind’s notation) is the (extended) thesis that if nature can compute something efficiently, then a quantum computer can compute it efficiently. That is mainly what I call the Deutsch thesis. And as I said, I do think that CT (+ YD, i.e. mechanism) entails its plausible falsity.

And Susskind argues in that direction, and this without Mechanism, which would make this yet another confirmation of Mechanism.

With Mechanism, and assuming the existence of black holes, it should be obvious that whatever happens in a black hole will not play a role in the working of your brain. A good thing, as you will not have to ask the doctor to emulate the interior of a black hole. But with mechanism, this means that a black hole is full of "crazy virtual particles" doing infinitely complex tasks, just because your state of mind is totally independent of the “content of the black hole (without its boundary)”.

At first sight, Susskind seems convincing on this, but again, to be able to assess this would require that I study much more the QM and GR of black holes. To compare with Mechanism, we first have the rather complex task of deriving GR from QM and QM from arithmetic, so this is a bit premature. (I still work hard to get a notion of space; its shadow is there, but it seems to require the existence of large cardinals in set theory.) As I said, with Mechanism, the ontology is extremely simple (Robinson arithmetic), but the phenomenology is of unbounded complexity.

So, a very interesting but complex idea by Susskind, but it touches on problems which are far from being treated with the mechanist hypothesis. If I progress in my understanding of Finkelstein, I might say more later. The paper confirms that there is something in the holographic idea, and when you compactify a universal dovetailing, you get a sort of similar principle, given that the first-person experiences are determined only on its “boundary”.

Bruno 












Philip Thrift

Mar 9, 2020, 7:00:09 AM
to Everything List



In the (subject of) programming in infinite domains


A Böhm tree is a (potentially infinite) tree-like mathematical object that can be used to provide denotational semantics (the "meaning") for terms of the lambda calculus (and programming languages in general, by using translations to lambda calculus). It is named after Corrado Böhm.

A simple way to read the meaning of a computation is to consider it as a mechanical procedure consisting of a finite number of steps that, when completed, yields a result. This interpretation is inadequate, however, for procedures that do not terminate after a finite number of steps, but nonetheless have an intuitive meaning. Consider, for example, a procedure for computing the decimal expansion of π; if implemented appropriately, it can provide partial output as it "runs", and this ongoing output is a natural way to assign meaning to the computation. This is in contrast to, say, a program that loops infinitely without ever providing output. These two procedures have very different intuitive meanings.

Since a computation described using lambda calculus is the process of reducing a lambda term to its normal form, this normal form itself is the result of the computation, and the entire process may be considered as "evaluating" the original term. For this reason Church's original suggestion was that the meaning of the computation (described by) a lambda term should be the normal form it reduces to, and that terms which do not have a normal form are meaningless.[1] This suffers exactly the inadequacy described above. Extending the π analogy, however, if "trying" to reduce a term to its normal form would give "in the limit" an infinitely long lambda term (if such a thing existed), that object could be considered this result. 


No such term exists in the lambda calculus, of course, and so Böhm trees are the objects used in this place.
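The π example above can be made concrete with a productive, never-terminating program that keeps emitting correct digits as it runs; this little sketch (my illustration, not part of the quoted text) uses Gibbons' unbounded spigot algorithm:

from itertools import islice

def pi_digits():
    # Gibbons' unbounded spigot: an endless, productive stream of digits of pi.
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while True:
        if 4*q + r - t < n*t:
            yield n
            q, r, n = 10*q, 10*(r - n*t), (10*(3*q + r))//t - 10*n
        else:
            q, r, t, k, n, l = q*k, (2*q + r)*l, t*l, k + 1, (q*(7*k + 2) + r*l)//(t*l), l + 2

print(list(islice(pi_digits(), 15)))   # 3, 1, 4, 1, 5, 9, ...

The generator never halts, yet each partial output is already meaningful, which is exactly the intuition the Böhm-tree construction is meant to capture.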



@philipthrift 

Bruno Marchal

Mar 9, 2020, 7:42:41 AM
to everyth...@googlegroups.com
On 6 Mar 2020, at 12:40, Lawrence Crowell <goldenfield...@gmail.com> wrote:

Szangolies [ J. Szangolies, "Epistemic Horizons and the Foundations of Quantum Mechanics," https://arxiv.org/abs/1805.10668  ] works a form of the Cantor diagonalization for quantum measurements. As yet a full up form of the CHSH or Bell inequality violation result is waiting. There are exciting possibilities for connections between quantum mechanics, in particular the subject of quantum decoherence and measurement, and Gödel’s theorem.

The Digital Mechanist thesis entails that physics is derivable from Gödel’s and Löb’s theorems, and indeed we find quantum logic exactly where expected.

All computations are executed/emulated, in the mathematical sense of the Logicians, in arithmetic. The physical appearances have to be justified by the calculus of the 1p (plural) indeterminacy in arithmetic. 

There is a natural, canonical “many-world” interpretation of arithmetic, developed by the majority of universal numbers in arithmetic.




If we think of all physics as a form of convex sets of states, then there are dualisms of measures p and q that obey 1/p + 1/q = 1. For quantum mechanics this is p = ½ as an L^2 measure theory. It then has a corresponding q = ½ measure system that I think is spacetime physics. A straight probability system has p = 1, sum of probabilities as unity, and the corresponding q → ∞ has no measure or distribution system. This is any deterministic system, think completely localized, that can be a Turing machine, Conway's Game of Life or classical mechanics. A quantum measurement is a transition between p = ½ for QM and ∞ for classicality or 1 for classical probability on a fundamental level.

What separates these different convex sets are these topological obstructions, such as the indices given by the Kirwan polytope. The distinction between entanglements is also given by these topological indices or obstructions. How these determine a measurement outcome, or the ontology of an element of a decoherent sets is not decidable. This is where Gödel’s theorem enters in. A quantum measurement is a way that quantum information or qubits encode other qubits as Gödel numbers.

The prospect spacetime, or the entropy of spacetime via event horizon areas, is a condensate or large N-entanglement of quantum states then implies there is a connection between quantum computation and information accessible in spacetime configurations. These configurations may either be the Bekenstein bound S = kA/4ℓ_p^2, or quantum modified version S = kA/4ℓ_p^2 + quantum corrections. Then the quantum processing or quantum Church-Turing thesis is I think equivalent to the information processing of spacetime as black holes and maybe entire cosmologies.

These are exciting developments.

Sure. 


Bruno





Bruno Marchal

Mar 9, 2020, 7:47:30 AM
to everyth...@googlegroups.com
On 6 Mar 2020, at 12:57, Philip Thrift <cloud...@gmail.com> wrote:



While programming/computing in (hypothetical) infinite domains is interesting ...

Computing in Cantor’s Paradise With λ_ZFC

how any of this relates in any way to physical reality (the stuff of nature that is actually around us in the universe, vs. just some theoretical, mathematical concoction someone may come up with) is dubious.

Not at all. It is necessary, once you postulate YD + CT. It is just logically unavoidable.




(Things like consciousness is another thing, or subject: It may be "beyond" Turing, bit in a way that has nothing to do with "super" or "hyper" Turing or Cantor or Godel.)

OK. If you abandon Mechanism, all paths are open. But the derivation of physics from arithmetic is just simpler with the usual “mechanism”. It is possible to weaken mechanism to form alpha-mechanism, where you ask that some relation related to alpha (an infinite ordinal, or an oracle) is preserved in the brain transplant, but you will need a complex infinite cardinal to begin to depart from G*. In fact, what I call “the theology of the machine” is valid for a much larger class of creatures.

Bruno






Bruno Marchal

Mar 9, 2020, 7:55:40 AM
to everyth...@googlegroups.com
On 6 Mar 2020, at 21:07, ronaldheld <ronal...@gmail.com> wrote:

interesting responses I did expect.   From the physical universe POV, CT is relevant?


Yes, CT is useful, if only to define a mechanical observer properly, as Everett did. It is also useful for the study of Susskind’s quantum-extended version. I say a bit more in my other posts. Note that the quantum-extended version considers complexity, and not so much computability. The use of “Church” and “Turing” is a bit misleading here.

At least, almost everyone agrees (in the literature) on the original CT thesis. But I have also long seen many misunderstandings on this. Maybe I will have to give *the* exercise again. There is a simple (but wrong) refutation of CT, whose correction can be made in many ways, and all of them lead to important theorems in computer science, including deep forms of incompleteness.

Bruno







Bruno Marchal

Mar 9, 2020, 8:04:13 AM
to everyth...@googlegroups.com
On 7 Mar 2020, at 00:25, Lawrence Crowell <goldenfield...@gmail.com> wrote:


λ-calculus is equivalent to Turing computation.

… to Turing computability, right. But the paper here defines a special λ-calculus based on ZFC. It is unclear whether this does not compute more. The author claims that his approach does not lead to hyper- or super-computation, but after a glance I would say that it is an invitation to descriptive set theory, which allows distinct ways to conceive notions of computation on the real numbers. So it does not address or criticise the usual CT.



In fact it is similar to Assembly language. It might be that some of these problems could be looked at according to λ-calculus.

.. and ZFC.

In fact, the axiom of choice can be proven to have no effect on the elementary theory of computations, but the axiom of choice (the C in ZFC) does say something about how to interpret the results.

Bruno





Bruno Marchal

Mar 9, 2020, 8:09:37 AM
to everyth...@googlegroups.com
On 7 Mar 2020, at 18:33, Lawrence Crowell <goldenfield...@gmail.com> wrote:


I am aware of this, It is a bit like considering Peano arithmetic in a domain where the axioms of infinity and choice hold.

Up to an abuse of language, ZF is mainly PA + the axiom of infinity. The set V_omega is basically the arithmetical reality, seen embedded in ZF set theory. ZF and ZFC prove the same arithmetical propositions, and many more than PA. ZFC + higher infinities proves even more. And there are simple combinatorial problems, notably on Laver tables, which today require super-high cardinals to be proven, notably the Laver cardinals. They might play a role in the derivation of space from arithmetic. They are related to the theory of braids and knots!

Bruno






Lawrence Crowell

Mar 9, 2020, 8:52:00 PM
to Everything List
On Monday, March 9, 2020 at 6:42:41 AM UTC-5, Bruno Marchal wrote:

The Digital Mechanist thesis enforces that physics is derivable from Gödel’s and Löbs’ theorem, and indeed we find quantum logic exactly were expected. 

All computations are executed/emulated, in the mathematical sense of the Logicians, in arithmetic. The physical appearances have to be justified by the calculus of the 1p (plural) indeterminacy in arithmetic. 

There is a natural, canonical “many-wold” interpretation of arithmetic, developed  by the “majority of universal numbers in arithmetic.



I will have to write more if possible. I am not sure that all of physics is derived from Gödel’s theorem. I see it more as: in going from classical to quantum mechanics there is a sort of forcing, to borrow from set theory, that extends a model with undecidable propositions. Where this undecidability enters is with the problem of measurement and decoherence.

As for an earlier comment, Turing's model sits in a grey zone between mathematics, often seen as pure, and physics. The tape and read head, looking a bit like a cart on a track, is a model of a physical system. That system is a computer.

LC
 



Philip Thrift

Mar 10, 2020, 12:54:15 AM
to Everything List


On Monday, March 9, 2020 at 7:52:00 PM UTC-5, Lawrence Crowell wrote:

I will have to write more if possible. I am not sure that all of physics is derived from Gödel’s theorem. I see is as more that from classical to quantum mechanics there is a sort of forcing, to borrow from set theory, to extend a model with undecidable propositions. Where this undecidable matter enters in is with the problem of measurement and decoherence.

 
There is nothing in any quantum mechanics theory that goes beyond a formulation in terms of a quantum Turing machine. 


Quantum Turing machines can be related to classical and probabilistic Turing machines in a framework based on transition matrices. That is, a matrix can be specified whose product with the matrix representing a classical or probabilistic machine provides the quantum probability matrix representing the quantum machine. This was shown by Lance Fortnow. [ https://arxiv.org/abs/quant-ph/0003035 ]
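A toy contrast (my sketch, not Fortnow's construction) shows the flavour of the matrix bookkeeping: a probabilistic machine pushes a probability vector through a stochastic matrix, a quantum machine pushes an amplitude vector through a unitary, and only in the second case can two steps interfere back to a deterministic outcome.

import numpy as np

coin = np.array([[0.5, 0.5],
                 [0.5, 0.5]])            # stochastic matrix: acts on probabilities
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)     # Hadamard: unitary, acts on amplitudes

p = np.array([1.0, 0.0])                 # classical bit, surely 0
a = np.array([1.0, 0.0])                 # qubit |0>

print(coin @ coin @ p)                   # [0.5 0.5]  -> still random after two flips
print(np.abs(H @ H @ a)**2)              # [1. 0.]    -> interference restores |0>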

Actually it can all be reduced to the SKIP calculus.


SKIP: Probabilistic SKI combinator calculus


Add to the S, K, and I combinators of the SKI combinator calculus the P combinator:

Sxyz = xz(yz)
Kxy = x
Ix = x
P = K or KI with equal probability (0.5)

It follows that Pxy evaluates to Kxy or KIxy, then to x or Iy = y with equal probability.
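A minimal executable rendering of this (my toy encoding, with S, K, I as curried Python functions; not anyone else's code here) behaves exactly as described: Pxy comes out as x or as y with equal probability.

import random

S = lambda x: lambda y: lambda z: x(z)(y(z))   # Sxyz = xz(yz)
K = lambda x: lambda y: x                      # Kxy  = x
I = lambda x: x                                # Ix   = x

def P(x):
    # P acts as K or as KI, chosen with probability 0.5 each,
    # so P(x)(y) reduces to x or to y.
    return K(x) if random.random() < 0.5 else K(I)(x)

samples = [P("x")("y") for _ in range(10_000)]
print(samples.count("x") / len(samples))       # ~0.5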

There is nothing "uncomputable" in any of this.

@philipthrift

Lawrence Crowell

Mar 10, 2020, 7:04:19 AM
to Everything List
Except predicting the outcome of any quantum measurement in a deterministic manner.

LC 

Philip Thrift

Mar 10, 2020, 7:27:46 AM
to Everything List
Hopelessly pursuing a theory for that is just a religious fantasy too many physicists believe in. It is a rabbit hole leading nowhere.

@philipthrift

Lawrence Crowell

Mar 10, 2020, 7:44:37 AM
to Everything List
On Monday, March 9, 2020 at 4:57:24 AM UTC-5, Bruno Marchal wrote:

From the perspective of a physicist who knows some things about Gödel’s theorem and even Löb’s theorem, I think in one sense you use language, or metaphysics, that is a bit outside of science. Terms such as “physicalist” are not used; “materialism” sometimes comes up, and it is not clear to me how that deviates from the term “physicalist”.

Gödel’s first theorem comes in when looking at a list of observations of a quantum system, such as a list of probabilities on the abscissa and actual measurements on the ordinate. This can then be used to perform the Cantor diagonal trick, which flips the outcome, and the result is then not predictable. The inability to predict the outcome of a particular measurement of a quantum system can thus be expressed through a Cantor diagonal argument. This then leads to a form of the incomputable nature of QM.
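The bare diagonal move here can be written out in a few lines (a schematic sketch of the trick, not of Szangolies' actual construction): against any listed family of binary predictors, flipping the diagonal defines an outcome sequence that disagrees with every one of them.

predictors = [
    lambda n: n % 2,        # predictor 0
    lambda n: 0,            # predictor 1
    lambda n: 1,            # predictor 2
]

def diagonal_outcome(n):
    # differ from predictor n on trial n
    return 1 - predictors[n](n)

for n in range(len(predictors)):
    print(n, predictors[n](n), diagonal_outcome(n))   # the last two columns always differ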

This then leads to the observation that measurements in QM require that, if one is to measure a spin in the z direction, any prior knowledge of the spin in the x direction is lost. In general relativity there are also event horizons that restrict the knowledge one can have of the quantum state of a black hole. Jacobson showed how spacetime can be viewed, from statistical mechanics, as composed of a distribution of states. An event horizon is also a surface of reduced dimension that carries quantum information. Van Raamsdonk also illustrated how spacetime can be seen as arising from large-N entanglement. So the loss of knowledge of a quantum spin in one direction, in a measurement along another, is in a general setting much the same as the redshifting of information from an event horizon that restricts access to information.

This then suggests, with the Cantor diagonalization, that the relationship between stochasticity and its dual, determinism, gives an incomputable relationship between quantum and spacetime physics. For stochasticity with p = 1 in a convex set, with dual q = ∞ and 1/p + 1/q = 1, there are associated L^2 systems with p = q = 2, or 1/p = 1/q = ½, which are relativity as a metric space and QM as a system of probabilities determined by the square of amplitudes.
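For reference, the 1/p + 1/q = 1 relation is the Hölder-conjugacy condition, and p = q = 2 is its self-dual case (Cauchy-Schwarz), which is presumably the L^2 pairing meant here:

\|fg\|_{1} \le \|f\|_{p}\,\|g\|_{q}, \qquad \frac{1}{p}+\frac{1}{q}=1,
\qquad\text{and for } p=q=2:\quad
\Big|\int f\,\overline{g}\,\Big| \le \|f\|_{2}\,\|g\|_{2}.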

LC

Bruno Marchal

Mar 10, 2020, 8:34:24 AM
to everyth...@googlegroups.com
You can do that. But it is simpler and more general to define a computation by what a universal machine does, when given an input or not. If P_i represents the i-th program in a complete enumeration of all programs in some Turing-universal system, and phi_i is the corresponding partial recursive (partial computable) function, the computation can be described by the succession of steps carried out by the universal machine able to run those programs. It is describable by phi_i(j)^s, where s numbers the steps. This works both for halting and non-halting computations, and is not so far from the Böhm tree (which requires the lambda calculus or the combinators). Church, like many logicians and mathematicians, was still focused too much on extensional functions, and not on the very processes which compute them.

This does not concern so much the measure problem (on all computations), because no machine can recognise whether a computation runs “its first-person experience”, and that is why we need a more abstract approach: we get the measure from the probability logic, restricted to the computational state, weighted by the histories going through that state, without ever specifying which state in particular is obtained. This works, as we get both the many histories and the quantum logics on them.
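A toy rendering of the phi_i(j)^s bookkeeping (my sketch, not Bruno's formalism; the "i-th program" is a stand-in rather than a real universal interpreter) is a dovetailer that gives every program/input pair ever more steps, whether or not it halts:

def phi_step(i, j, s):
    # Stand-in for "state of program i on input j after s steps" (phi_i(j)^s):
    # this toy program halts after i + j steps iff i + j is even, else runs forever.
    return "halted" if (i + j) % 2 == 0 and s >= i + j else "running"

def dovetail(rounds=5):
    # Round r visits every triple with i + j + s == r, so each computation
    # phi_i(j) is eventually simulated for arbitrarily many steps s.
    for r in range(rounds):
        for i in range(r + 1):
            for j in range(r + 1 - i):
                s = r - i - j
                print(f"phi_{i}({j}) after {s} steps: {phi_step(i, j, s)}")

dovetail()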

Bruno






Bruno Marchal

Mar 10, 2020, 8:50:32 AM
to everyth...@googlegroups.com
On 10 Mar 2020, at 01:52, Lawrence Crowell <goldenfield...@gmail.com> wrote:

I will have to write more if possible. I am not sure that all of physics is derived from Gödel’s theorem.


That might not be the case in general, but it has to be so when we assume the digital mechanist hypothesis. Of course it is not a direct derivation from Gödel, but from all the nuances that incompleteness imposes on the provability notion. Incompleteness gives sense to the Theaetetus-like variants of the rational opinion/belief, namely:

TRUTH (p, God, the One, …)
BELIEF ([]p, provability; incompleteness forbids seeing this as knowledge)
KNOWLEDGE ([]p & p, true belief, the soul, the first person, the owner of consciousness)

And the “two matters”:

INTELLIGIBLE MATTER ([]p & <>t) (with p restricted to the sigma_1 propositions, the partially computable ones)
SENSIBLE MATTER ([]p & <>t & p) (idem)

The soul provides an intuitionist logic; the two matters provide quantum logics when p is restricted to the partially computable (sigma_1) propositions (the true and the false ones, which makes things more complex).

See my papers for more on this. (This needs some understanding of the existence of all computations in the models of arithmetic.)






I see is as more that from classical to quantum mechanics there is a sort of forcing, to borrow from set theory, to extend a model with undecidable propositions. Where this undecidable matter enters in is with the problem of measurement and decoherence.

With mechanism, physics must become a study of the statistics of relative computational histories.



As for an earlier comment, Turing's model is in a grey zone between mathematics often seen as pure and with physics. The tape and reader, appearing a bit like a sort of cart on a track, is a model of a physical system. That system is a computer.

A computer is a universal machine implemented in the physical reality, but with mechanism, the physical reality is what emerges statistically from the first-person points of view of the computers executed in the arithmetical reality (or in any reality related to any universal machine). I use numbers only because most people are familiar with them.

I am summing up 40 years of work, in a taboo domain, and all of this is built on not-so-well-known, or not-so-well-understood, theorems in mathematical logic, so ask any question, and maybe also read the papers. The first thing to understand is the incompatibility between (a very weak form of) Digital Mechanism and (a very weak form of) Materialism or Physicalism.

Bruno





Bruno Marchal

Mar 10, 2020, 8:55:36 AM
to everyth...@googlegroups.com
Nor the simpler WM outcome in the simple classical WM self-duplication.

Arithmetic seen from inside might still look too random, and if that were shown to be the case, Mechanism would be refuted; but up to now, Nature confirms Mechanism’s immaterialist and non-computable predictions. Classical mechanics does contradict Mechanism, both intuitively, from the thought experiment, and formally, through the non-Booleanity of the logic of observation imposed by Mechanism.

Bruno





Bruno Marchal

Mar 10, 2020, 9:34:20 AM
to everyth...@googlegroups.com
My main point is to illustrate that when we assume the Mechanist hypothesis in cognitive science, the questions of metaphysics/theology become problems in mathematics and physics, and, in particular, that the assumption of (Digital) Mechanism is empirically testable.

My interest is in the mind-body problem. What I claim/explain is that if we take seriously the digital mechanist thesis (in cognitive science, not in physics!) then the mind-body problem is reduced to the problem of deriving the physical laws entirely from computer science/arithmetic/"machine theology”. More on this later, probably.





Terms such as physicalists are not used, and materialism sometimes comes up and it is not clear to me how this deviates from the term physicalist.


Materialism is only a “naïve” version of physicalism. You can equate them without problem, until we need, or not, to make a nuance.

Materialism is the belief in primary matter (the idea that there is really matter out there whose appearance cannot be explained without assuming its existence).

Physicalism is the belief that physics is the fundamental science, and here too, this means that we have to assume some physical theory (but not necessarily some primary matter).

That basically describes the theological and metaphysical paradigm of the last 1500 years, with of course some exceptions, but usually they are not well regarded.

The ancient Greeks used the Dream Argument to show that no experimentation at all can prove an ontological existence, be it a god or some Aristotelian primary matter. With the digital mechanism hypothesis, that "dream argument" and similar arguments can be made rigorous.






Gödel's first theorem comes in when looking at a list of observations of a quantum system, such as a list of probabilities on the abscissa and actual measurements on the ordinate.

Is there some typo? Gödel never assumed nor studied quantum mechanics. Bad tongues say that, when he was old and living near Einstein in Princeton, he was told by Einstein not to open his mouth on quantum mechanics. That is why Gödel would only use general relativity to tickle Einstein…

Gödel's first theorem is an easy consequence of the Church-Turing thesis (as Kleene saw, and Webb wrote an entire book around this). Somehow E. Post discovered this in the 1920s. Gödel's proof carefully avoids both the notion of machine and the notion of Truth.
Eventually, in 1936, Gödel became comfortable with both Turing's notion of machine and Tarski's notion of Truth, and understood them probably better than most people. Gödel understood that the thesis of Turing and Church, also proposed by Post and Kleene, is just incredible. But he missed the digital mechanist thesis for the mind, and of course its immaterialist consequences.





This can then be used to perform the Cantor diagonal trick, which flips the outcome, and this is then not predictable. The inability to predict the outcome of a particular measurement of a quantum system can then be expressed according to a Cantor diagonal argument. This then leads to a form of the incomputable nature of QM.
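To make the diagonal trick concrete, here is a toy Python sketch (mine, and much cruder than the construction in Szangolies' paper): given any finite list of predicted outcome sequences, flipping the k-th bit of the k-th prediction produces an outcome sequence that no listed predictor reproduces.

def diagonal_flip(predictions):
    """Given a list of predicted outcome sequences (each a list of 0/1),
    return a sequence that differs from the k-th prediction at position k."""
    return [1 - predictions[k][k] for k in range(len(predictions))]

# Toy example: four 'theories', each predicting four binary measurement outcomes.
predictions = [
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 1],
    [1, 0, 0, 1],
]

outcome = diagonal_flip(predictions)
print("Diagonal outcome:", outcome)
for k, p in enumerate(predictions):
    assert outcome[k] != p[k], "prediction k should fail at position k"
print("No listed prediction matches the diagonal outcome everywhere.")

The same move works against any countable enumeration, which is the sense in which the flipped outcome escapes every listed predictor.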


The non-computability related to predicting an outcome does not come from Gödel or Turing, which show limitations in the long run, for all theories about any universal machine or being.

With mechanism, the non-computability of an outcome comes from the first-person indeterminacy, which is partially related to Gödel, as made clear by Penrose's error and its correction: Gödel's theorem cannot be used to show that we are not machines, but it can be used to show that no machine can know which machine she is, and still less which computations execute her, among an infinity of computations (which exist in the arithmetical reality).




This then leads to the observation that measurements in QM require that if one measures a spin in the z direction, any prior knowledge of spin in the x direction is lost. In general relativity there are also event horizons that restrict the knowledge one can have of the quantum state of a black hole. Jacobson showed how spacetime can be viewed from statistical mechanics as composed of a distribution of states. An event horizon is also a surface of reduced dimension that carries quantum information. Raamsdonk also illustrated how spacetime can be looked at as due to large-N entanglement. So the loss of knowledge of a quantum spin in one direction in a measurement along another is, in a general setting, much the same as the redshifting of information from an event horizon that restricts access to information.


Interesting, and perhaps exploitable, for the mechanist mind-body problem.





This then suggests, with the Cantor diagonalization, that the duality between stochasticity and determinism carries an incomputable relationship between quantum and spacetime physics. For stochasticity with p = 1 in a convex set, with dual q = ∞ and 1/p + 1/q = 1, there are the associated L^2 systems with p = q = 2, or 1/p = 1/q = ½, which are relativity as a metric space and QM as a system of probabilities determined by the square of amplitudes.
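The p-q pairing used here is the Hölder conjugacy 1/p + 1/q = 1. Here is a quick numerical sanity check (a sketch with random vectors, not tied to any physical model) of the inequality |sum x_i y_i| <= ||x||_p ||y||_q for the pairs (1, ∞) and (2, 2):

import random

def p_norm(x, p):
    """l^p norm, with p = float('inf') giving the max norm."""
    if p == float('inf'):
        return max(abs(v) for v in x)
    return sum(abs(v)**p for v in x) ** (1.0 / p)

def holder_check(x, y, p, q):
    """Check Hoelder's inequality |<x,y>| <= ||x||_p * ||y||_q for conjugate p, q."""
    inner = abs(sum(a * b for a, b in zip(x, y)))
    return inner <= p_norm(x, p) * p_norm(y, q) + 1e-12  # tolerance for rounding

random.seed(0)
x = [random.uniform(-1, 1) for _ in range(10)]
y = [random.uniform(-1, 1) for _ in range(10)]

# The 'classical probability' pairing p = 1, q = infinity ...
print("p=1, q=inf :", holder_check(x, y, 1, float('inf')))
# ... and the self-dual L^2 pairing p = q = 2 (Cauchy-Schwarz).
print("p=2, q=2   :", holder_check(x, y, 2, 2))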


I am not sure how to get this by the diagonal (in the style of Kleene, Cantor). You might need to be more precise about your suggestion, at least for me. Keep also in mind that once we reduce the mind-body problem to the derivation of physics from arithmetic, we can no longer assume anything in physics, nor even in analysis or geometry. All those domains must be entirely justified from only, say, Kxy = x and Sxyz = xz(yz), or from the laws of addition and multiplication on the natural numbers; in fact, from any universal machinery, or from any universal machine "introspection".
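For readers who have not met the combinators mentioned here, a minimal sketch of K and S as Python closures; it only shows that Kxy = x and Sxyz = xz(yz) behave as stated, and is of course not the arithmetical embedding being referred to.

# K and S as curried Python functions.
K = lambda x: lambda y: x                     # K x y   = x
S = lambda x: lambda y: lambda z: x(z)(y(z))  # S x y z = x z (y z)

# The identity I is definable as S K K:  S K K z = K z (K z) = z
I = S(K)(K)
assert I(42) == 42

# K really throws its second argument away.
assert K("kept")("dropped") == "kept"

# A tiny check of the S rule on concrete functions.
add  = lambda x: lambda y: x + y   # curried addition
succ = lambda x: x + 1
# S add succ n = add(n)(succ(n)) = n + (n + 1)
assert S(add)(succ)(10) == 21

print("K, S behave as Kxy = x and Sxyz = xz(yz); SKK acts as the identity.")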

Bruno 




LC


Lawrence Crowell

unread,
Mar 10, 2020, 6:27:01 PM3/10/20
to Everything List
On Tuesday, March 10, 2020 at 7:50:32 AM UTC-5, Bruno Marchal wrote:

On 10 Mar 2020, at 01:52, Lawrence Crowell <goldenfield...@gmail.com> wrote:

On Monday, March 9, 2020 at 6:42:41 AM UTC-5, Bruno Marchal wrote:

On 6 Mar 2020, at 12:40, Lawrence Crowell <goldenfield...@gmail.com> wrote:

Szangolies [ J. Szangolies, "Epistemic Horizons and the Foundations of Quantum Mechanics," https://arxiv.org/abs/1805.10668  ] works a form of the Cantor diagonalization for quantum measurements. As yet a full up form of the CHSH or Bell inequality violation result is waiting. There are exciting possibilities for connections between quantum mechanics, in particular the subject of quantum decoherence and measurement, and Gödel’s theorem.

The Digital Mechanist thesis enforces that physics is derivable from Gödel's and Löb's theorems, and indeed we find quantum logic exactly where expected. 

All computations are executed/emulated, in the mathematical sense of the Logicians, in arithmetic. The physical appearances have to be justified by the calculus of the 1p (plural) indeterminacy in arithmetic. 

There is a natural, canonical "many-worlds" interpretation of arithmetic, developed by the "majority" of universal numbers in arithmetic.



I will have to write more if possible. I am not sure that all of physics is derived from Gödel’s theorem.


That might not be the case, but it has to be so when we assume the digital mechanist hypothesis. Of course it is not a direct derivation from Gödel, but from all the nuances that incompleteness imposes on the provability notion. Incompleteness gives sense to the Theaetetus-like variants of rational opinion/belief, namely:

TRUTH (p, God, the One, …)
BELIEF ([]p, provability; incompleteness forbids seeing this as knowledge)
KNOWLEDGE ([]p & p, true belief, the soul, the first person, the owner of consciousness)

And the “two matters”:

INTELLIGIBLE MATTER ([]p & <>t) (with p restricted to sigma_1 propositions, the partially computable ones)
SENSIBLE MATTER ([]p & <>t & p) (idem)

The soul provides an intuitionistic logic; the two matters and the soul provide quantum logics when p is restricted to the partially computable (sigma_1) propositions (the true and the false ones, which makes things more complex).

See my papers for more on this. This needs some understanding of the existence of all computations in the models of arithmetic.






I see it as more that, from classical to quantum mechanics, there is a sort of forcing, to borrow from set theory, that extends a model with undecidable propositions. Where this undecidable matter enters is with the problem of measurement and decoherence.

With mechanism, physics must become a study of the statistics of relative computational histories.



As for an earlier comment, Turing's model is in a grey zone between mathematics, often seen as pure, and physics. The tape and reader, looking a bit like a cart on a track, is a model of a physical system. That system is a computer.
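To keep the cart-on-a-track picture concrete, here is a minimal Turing-machine simulator (a toy sketch of my own, not taken from any of the papers): a two-symbol machine that appends a 1 to a block of 1s, i.e. computes the successor in unary.

def run_turing_machine(tape, rules, state="scan", blank=0, max_steps=1000):
    """Simulate a one-tape Turing machine.

    tape  : dict position -> symbol (the 'track')
    rules : dict (state, symbol) -> (new_symbol, move, new_state), move in {-1, +1}
    """
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        new_symbol, move, state = rules[(state, symbol)]
        tape[head] = new_symbol   # the 'reader' writes on the track...
        head += move              # ...and the cart moves one cell
    return tape

# Unary successor: move right over the 1s, write one more 1, then halt.
rules = {
    ("scan", 1): (1, +1, "scan"),
    ("scan", 0): (1, +1, "halt"),
}

tape = {i: 1 for i in range(3)}   # input: 3 written in unary as 111
result = run_turing_machine(tape, rules)
print("ones on tape:", sum(1 for v in result.values() if v == 1))   # 4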

A computer is a universal machine implemented in the physical reality, but with mechanism, the physical reality is what emerges statistically from the first-person points of view of the computers executed in the arithmetical reality (or in any reality related to any universal machine). I use numbers only because most people are familiar with them.

I sum up 40 years of work, in a taboo domain, and all this is built on not-so-well-known, or not-so-well-understood, theorems in mathematical logic, so ask any question, and maybe also read the papers. The first thing to understand is the incompatibility between (a very weak form of) Digital Mechanism and (a very weak form of) Materialism or Physicalism.

Bruno



I am not sure that all physics comes from this. My intent is more limited: showing that Gödel's theorem results in epistemic horizons, or limitations on the knowledge any observer can have. This is then, in general, a basis for the uncertainty principle and for the limits on observing qubits with black holes. I am not sure this encompasses the entirety of physical principles. It does not, for instance, tell us why dynamics involves the time derivative of momentum, or second-order differentiation of position. 

LC
 




If we think of all physics as a form of convex sets of states, then there are dualisms of measures p and q that obey 1/p + 1/q = 1. For quantum mechanics this is p = ½ as an L^2 measure theory. It then has a corresponding q = ½ measure system that I think is spacetime physics. A straight probability system has p = 1, sum of probabilities as unity, and the corresponding q → ∞ has no measure or distribution system. This is any deterministic system, think completely localized, that can be a Turing machine, Conway's <i>Game of life</i> or classical mechanics. A quantum measurement is a transition between p = ½ for QM and ∞ for classicality or 1 for classical probability on a fundamental level.

What separates these different convex sets are topological obstructions, such as the indices given by the Kirwan polytope. The distinction between entanglements is also given by these topological indices or obstructions. How these determine a measurement outcome, or the ontology of an element of a decoherent set, is not decidable. This is where Gödel's theorem enters. A quantum measurement is a way that quantum information, or qubits, encodes other qubits as Gödel numbers.


Lawrence Crowell

unread,
Mar 10, 2020, 7:35:25 PM3/10/20
to Everything List
The core idea there is given by Szangolies in https://arxiv.org/abs/1805.10668 , where the details can be found. The goal is to derive the full CHSH theorem, which is a variant of Bell's theorem on how QM violates classical inequalities. There is a polytope associated with CHSH, which I think bears a certain relationship to Kirwan polytopes for entanglement measures and Hamming distance.

The CHSH polytope is based on the relationship

I_{chsh} = A_1×B_1 + A_1×B_2 + A_2×B_1 - A_2×B_2,

for Alice and Bob experiments with two outcomes. This curiously is a type of metric that can be interpreted as pseudo-Euclidean. This is also a measure of entropy, for it may be expressed according to conditional probabilities. An arbitrary two-qubit state after Schmidt decomposition can always be written as

|ψ_n⟩ = c_0|n_+, n_+⟩ + c_1|n_−, n_−⟩.

We choose the measurement settings in the following way

A_1 = m_1·σ, A_2 =  m_2·σ,
B_1 = (1/√2)(m_1·σ + m_2·σ), B_2 = (1/√2)(m_1·σ − m_2·σ).

Here n,  m_1 and  m_2 are the unit vectors perpendicular to each other. Now find the expectation value of the CHSH operator in the state |ψ_n⟩. We get

⟨ψ_n|I_{chsh}|ψ_n⟩ = 2√2C.

The expectation of I_{chsh} is then bounded by 2√2, the Tsirelson bound, since C ≤ 1; the bound is saturated when C = 1.
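As a numerical check of the 2√2 bound, here is a short NumPy sketch using the standard textbook settings (Alice along z and x, Bob along (z ± x)/√2) and the maximally entangled state, i.e. the C = 1 case of the formula above; these are the usual optimal settings and not necessarily the exact configuration described in the post.

import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Measurement settings: Alice along z and x; Bob along (z ± x)/sqrt(2).
A1, A2 = sz, sx
B1 = (sz + sx) / np.sqrt(2)
B2 = (sz - sx) / np.sqrt(2)

# CHSH operator I = A1xB1 + A1xB2 + A2xB1 - A2xB2 (tensor products)
I_chsh = (np.kron(A1, B1) + np.kron(A1, B2)
          + np.kron(A2, B1) - np.kron(A2, B2))

# Maximally entangled state |psi> = (|00> + |11>)/sqrt(2), i.e. c_0 = c_1 = 1/sqrt(2)
psi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

expectation = np.real(psi.conj() @ I_chsh @ psi)
print(f"<I_chsh> = {expectation:.6f}   (2*sqrt(2) = {2*np.sqrt(2):.6f})")

Any local hidden-variable assignment caps the same combination at 2, which is the classical facet of the CHSH polytope.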

For measurements along the x and z directions this polytope has 16 vertices. The Kirwan polytope is a pure z (or x, if one prefers) lattice. This lattice is either a sublattice or an additional lattice in a 24-cell; in fact I think it is both. The 24-cell is the root space for the F4 exceptional group, which is a stabilizer with G2 in E8.
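For concreteness, the 24-cell's vertices can be taken, in the standard D4-root normalization, as all vectors with two entries ±1 and two entries 0; a short enumeration (a sketch, with that normalization assumed) confirms the count of 24 and the common length.

from itertools import combinations, product

# Vertices of the 24-cell in the D4-root normalization:
# all permutations of (+-1, +-1, 0, 0) in four dimensions.
vertices = set()
for i, j in combinations(range(4), 2):         # positions of the nonzero entries
    for si, sj in product((1, -1), repeat=2):  # their signs
        v = [0, 0, 0, 0]
        v[i], v[j] = si, sj
        vertices.add(tuple(v))

print("Number of vertices:", len(vertices))                 # 24
print("Squared norms:", {sum(x * x for x in v) for v in vertices})   # {2}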

I am in communication with Szangolies on these problems. This appears to be a fascinating prospect, where Gödel's theorem means there is no dynamics or computational determinism for how quantum states become classical, or equivalently for what a particular outcome is. Spacetime is a classical system, and as something composed of large-N entanglements it is emergent by entirely spontaneous means.

LC


 




Bruno Marchal

unread,
Mar 12, 2020, 11:43:03 AM3/12/20
to everyth...@googlegroups.com
I can explain that if we assume Mechanism (Indexical Digital Mechanism), i.e., roughly, the "yes Doctor" assumption (YD) plus the Church-Turing (or Post-Kleene) thesis (CT).

YD is the assumption that there is a substitution level such that we can survive with an artificial brain/body (in a broad sense of that expression).

CT is the thesis that the intuitively computable functions from N to N are the functions computable by the combinators (or Turing machines, or Lisp programs, or the quantum computer, or the Game of Life, etc.). All those theses are equivalent.

With this thesis, whatever you add to the ontology will be trouble, as you will have to justify its influence on the selection of the computational histories in the arithmetical reality (which we need just to enunciate the Church-Turing thesis).




My intent is more limited: showing that Gödel's theorem results in epistemic horizons, or limitations on the knowledge any observer can have.


That is coherent with the fact that incompleteness justifies all the variants of "opinion/belief" by showing that, with p a partially computable (sigma_1) proposition:

1) it is true that p is equivalent with []p, and with []p & p (Theaetetus), and with []p & <>t, and with []p & <>t & p. The true logic of provability, the modal logic G1* (Solovay's G* with the arithmetical interpretation restricted to the sigma_1 sentences; that is the universal dovetailer in arithmetic), proves those propositional equivalences. 

Yet,

2) G, the logic provable by any self-referentially correct machine, does not prove those equivalences, only a few of them. The universal machine knows that she is universal, so she can prove p -> []p (she proves her own sigma_1 completeness), but she still cannot prove []p -> p. Otherwise she would prove []f -> f, and thus ~[]f, and thus <>t (her own consistency!), contradicting Gödel's second theorem.
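A brute-force illustration of this asymmetry (a toy sketch on a three-world, transitive, irreflexive Kripke frame of the kind that is sound for the provability logic G/GL; the frame and the code are mine): Löb's axiom []([]p -> p) -> []p holds at every world for every valuation of p, while the reflection formula []p -> p already fails for some valuation.

from itertools import chain, combinations

# A small transitive, irreflexive Kripke frame (appropriate for the logic G/GL).
worlds = [0, 1, 2]
R = {(0, 1), (0, 2), (1, 2)}   # accessibility relation

def box(extension):
    """Worlds where []phi holds, given the set of worlds where phi holds."""
    return {w for w in worlds if all(v in extension for (u, v) in R if u == w)}

def implies(a, b):
    """Worlds where (a -> b) holds."""
    return {w for w in worlds if (w not in a) or (w in b)}

all_worlds = set(worlds)
valuations = chain.from_iterable(combinations(worlds, r) for r in range(len(worlds) + 1))

lob_always_valid = True
reflection_can_fail = False
for val in valuations:
    p = set(val)
    lob = implies(box(implies(box(p), p)), box(p))   # []([]p -> p) -> []p
    refl = implies(box(p), p)                        # []p -> p
    if lob != all_worlds:
        lob_always_valid = False
    if refl != all_worlds:
        reflection_can_fail = True

print("Loeb's axiom valid for every valuation:", lob_always_valid)   # True
print("Reflection []p -> p can fail:", reflection_can_fail)          # True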

So we have 5 variants, which are actually 8, as 3 of those self-reference modal logics split in two along the difference between G and G*. 

 p
 []p
 []p & p
 []p & <>t
 []p & <>t & p

The last three provide close but distinct quantum logics, and which one is closest to the observable/inferable logic of observation will measure a sort of degree of "idealism". 

If they all violate the logic inferred from Nature, that becomes experimental evidence that Mechanism is false (or that we are in a simulation made to fool us, but that is a bit too conspiratorial for me; it is just that this cannot be excluded logically).




This is then, in general, a basis for the uncertainty principle and for the limits on observing qubits with black holes.

The problem with my approach, and with mechanism, is that there is only one way to find out, which is to extract all physical notions from the universal machine's introspection, so that we can compare. And here there is a lot of work to do, because the mathematical problems are huge. It is already a miracle that this "theology" is decidable at the propositional level, but it can be shown highly undecidable at the first-order level (and the contrary would have been quite astonishing).

So, we do not yet have space (although the shadow of braids, and of a projective algebra, hides in the graded variants of the material modes: []^n p & <>^m t (m > n), and their quantisation []<>p).

Finkelstein gives indications that the "right quantum logic" will give not just (Everett) quantum mechanics (and the Gleason measure) but also general relativity. But that can take a millennium, doubly so when the guy simply saying that we can already listen to the machine (and that is what Gödel, Löb, Visser, ..., Solovay did) is not much listened to. 




I am not sure this encompasses the entirety of physical principles.

This means you have not meditated enough on the "Universal Dovetailer Paradox", or "Argument". With mechanism, the physical has to be given by a measure on all computations/histories going through our states. I had believed that this would easily lead to far too many histories, and be quickly refuted, but the math shows only a sort of quantum weirdness, so Digital Mechanism is not yet refuted. 




It does not, for instance, tell us why dynamics involves the time derivative of momentum, or second-order differentiation of position. 


I guess that information will be obtained through group theory, at some point. To get the measure right, the core of the physical reality has to be a very rich "symmetrical" object, and NOT Turing universal. Getting the derivative and the integral right will need time, space and patience. Just recalling the mind-body problem, or pulling it out from under the rug, requires a lot of patience and energy. BTW, energy remains a bit of a mystery too, with mechanism (and probably without it too).

Bruno




