Hutter's article on a complete theory of everything


Digital Physics

Mar 11, 2011, 4:39:35 AM
to everyth...@googlegroups.com
Rummaging through the archives, I realized that a highly relevant article by Marcus Hutter
apparently has not yet been discussed on this list, although many have downloaded it:

A Complete Theory of Everything (Will Be Subjective)
Algorithms 2010, 3(4), 329-350; doi:10.3390/a3040329
Part of the Special Issue
"Algorithmic Complexity in Physics & Embedded Artificial Intelligences"
In Memoriam Ray Solomonoff (1926-2009)

http://www.mdpi.com/1999-4893/3/4/329/

Abstract: Increasingly encompassing models have been suggested for our world. Theories
range from generally accepted to increasingly speculative to apparently bogus. The
progression of theories from ego- to geo- to helio-centric models to universe and multiverse
theories and beyond was accompanied by a dramatic increase in the sizes of the postulated
worlds, with humans being expelled from their center to ever more remote and random
locations. Rather than leading to a true theory of everything, this trend faces a turning point
after which the predictive power of such theories decreases (actually to zero). Incorporating
the location and other capacities of the observer into such theories avoids this problem
and allows to distinguish meaningful from predictively meaningless theories. This also
leads to a truly complete theory of everything consisting of a (conventional objective)
theory of everything plus a (novel subjective) observer process. The observer localization
is neither based on the controversial anthropic principle, nor has it anything to do with
the quantum-mechanical observation process. The suggested principle is extended to more
practical (partial, approximate, probabilistic, parametric) world models (rather than theories
of everything). Finally, I provide a justification of Ockham’s razor, and criticize the anthropic
principle, the doomsday argument, the no free lunch theorem, and the falsifiability dogma.

Keywords: world models; observer localization; predictive power; Ockham’s razor;
universal theories; inductive reasoning; simplicity and complexity; universal self-sampling;
no-free-lunch; computability

Remarkably, Prof. Hutter holds doctoral degrees in both physics and computer science,
fields to which he has made fundamental contributions.

Andrew Soltau

Mar 11, 2011, 4:26:59 PM
to everyth...@googlegroups.com
On 11/03/11 09:39, Digital Physics wrote:
Rummaging through the archives, I realized that a highly relevant article by Marcus Hutter
apparently has not yet been discussed on this list, although many have downloaded it:
Highly relevant indeed. He states in his summary: "I have demonstrated that a theory that perfectly describes our universe or multiverse, rather than being a Theory of Everything (ToE), might also be a theory of nothing."
This is just as Russell maintains: "The collection of all possible descriptions has zero complexity, or information content. ... There is a mathematical equivalence between the Everything, as represented by this collection of all possible descriptions, and Nothing, a state of no information." (2006, p. 5)



Digital Physics

Mar 16, 2011, 4:53:09 AM
to everyth...@googlegroups.com
You are right, the "theory of nothing" is old hat, published in the 1990s. Hutter's new contribution is observer localization: how many bits are necessary to identify the observer's location in the multiverse? Random locations cost many bits; non-random ones are much more likely and therefore more predictive in the Bayesian framework. This "allows to distinguish meaningful from predictively meaningless theories".
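
To make the bookkeeping concrete, here is a minimal sketch of the idea. It is my own toy illustration with made-up bit counts, not code from Hutter's paper; it only assumes the usual 2^-length prior over complete descriptions (objective ToE plus the bits needed to localize the observer in the world the ToE describes):

# Toy sketch, not code from Hutter's paper: made-up bit counts under a
# 2^-length prior over complete descriptions (objective ToE plus the bits
# needed to localize the observer inside the world it describes).

def complete_length(toe_bits, localization_bits):
    """Total description length, in bits, of ToE + observer localization."""
    return toe_bits + localization_bits

# Assumed numbers, chosen only to make the point:
# - a "small universe" theory has a longer objective description, but the
#   observer is easy to find in it;
# - an all-encompassing "multiverse" theory has a very short objective
#   description, but the observer sits at an essentially random location,
#   so localization costs roughly log2(number of candidate locations).
candidates = {
    "small-universe ToE": complete_length(toe_bits=10_000, localization_bits=100),
    "multiverse ToE": complete_length(toe_bits=1_000, localization_bits=1_000_000),
}

best = min(candidates.values())
for name, bits in sorted(candidates.items(), key=lambda kv: kv[1]):
    print(f"{name}: {bits} bits in total, prior weight 2^-{bits}, "
          f"i.e. a factor 2^{bits - best} below the best candidate")

Without the localization term, the 1,000-bit multiverse theory would win outright; charging for the observer's location is what produces the turning point mentioned in the abstract.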



Bruno Marchal

Mar 16, 2011, 1:03:12 PM
to everyth...@googlegroups.com
On 16 Mar 2011, at 09:53, Digital Physics wrote:

You are right, the "theory of nothing" is old hat, published in the 1990s. Hutter's new contribution is observer localization: how many bits are necessary to identify the observer's location in the multiverse? Random locations cost many bits; non-random ones are much more likely and therefore more predictive in the Bayesian framework. This "allows to distinguish meaningful from predictively meaningless theories".

Marcus Hutter, like Schmidhuber, seems unaware of the mind-body problem, notably as formulated in the computationalist theory of mind. His self-localization implicitly uses the mind-brain identity thesis, which cannot be sustained in any computationalist framework. I already gave you the references on sane04.
Please don't hesitate to ask questions if you find something unclear in it.
You will understand that the notion of self-localization of "myself" in a universe or a multiverse does not make sense, except in a precise *relative* way, which needs the classical theory of knowledge of Plato (Theaetetus), and its translation into arithmetic (something which can be done using a technique due to Gödel), to be defined precisely. This makes computationalism predictive and falsifiable. The key idea consists in distinguishing the logics of the first-person views (and the first-person-plural views) from the third-person views.

Bruno

Russell Standish

Mar 16, 2011, 5:40:17 PM
to everyth...@googlegroups.com

Yes, he is a little biased by the Kolmogorov complexity result that
random strings are the most complex. The problem lies in treating all
random bits as significant, when usually they aren't. I argue (On
complexity & emergence, 2001) that the relevant complexity measure
takes into account the significance of the information to the
observer, which really means that random strings are amongst the
simplest.

Of course this touches on your point that one cannot localise an
observer to a particular bitstring, but only to the set of all
consistent strings (those that have the same meaning to the observer).
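
A crude way to see the difference is the following sketch. It is my own toy illustration, not code from the 2001 paper: zlib compression stands in very roughly for algorithmic complexity, and the "observer" below is an invented interpreter that lumps all patternless strings into one equivalence class.

# Rough illustration only: zlib compression as a crude stand-in for
# description length; toy_meaning is a made-up observer interpretation.

import os
import zlib

def compressed_len(data):
    """Crude upper bound on the description length of data, in bytes."""
    return len(zlib.compress(data, 9))

def toy_meaning(data):
    """Toy observer: keeps structured content, but lumps essentially
    incompressible strings into a single equivalence class, 'noise'."""
    if compressed_len(data) >= 0.95 * len(data):
        return b"noise"
    return data

structured = b"abab" * 4096      # highly patterned string
random_bits = os.urandom(16384)  # patternless string of the same length

for name, s in [("structured", structured), ("random", random_bits)]:
    print(f"{name}: absolute ~{compressed_len(s)} bytes, "
          f"observer-relative ~{compressed_len(toy_meaning(s))} bytes")

Treating every bit as significant makes the random string the most complex object here; measuring the complexity of what the string means to the observer makes it one of the simplest.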

I would say this is my most significant departure from Schmidhuber and Hutter.

--

----------------------------------------------------------------------------
Prof Russell Standish                  Phone 0425 253119 (mobile)
Mathematics
UNSW SYDNEY 2052                       hpc...@hpcoders.com.au
Australia                              http://www.hpcoders.com.au
----------------------------------------------------------------------------

Bruno Marchal

Mar 17, 2011, 8:39:34 AM
to everyth...@googlegroups.com

You are right. This is old stuff, already refuted on this list a long
time ago by different people.
Then, as you say, we are obliged to restrict the measure to the
relative consistent extensions of the strings/theories/bodies, and
this forces us to take the self-referential logics into account,
together with their point-of-view (POV) variants. This leads to the G* theory, and the
G/G* splitting, and their corresponding POV-variant splittings, so
that we get a theory of qualia, with quanta as a particular sharable
case, making the whole theory testable, and rather well confirmed up
to now. Indeed, most quantum weirdness (indeterminacy, non-locality,
non-cloning, etc.) follows easily from the comp assumption (and
even from much weaker similar assumptions).

The problem with Schmidhuber, Hutter and also Tegmark is that they
continue to sweep the person's consciousness, if not the person per
se, under the rug. They do use a more Platonic framework
than most materialists, which makes their work a sort of
progress compared to purely physicalist approaches, but, like most
physicists, they are still blinded by the Aristotelian frames which
they continue to take for granted.
They missed the first-person indeterminacy, which makes their position
hard to maintain. Not only do they miss the qualia, but they also have a
theory of quanta that is inconsistent (with comp).

Such Aristotelian materialism/naturalism has probably been the root of
the apparent unsolvability of the mind-body problem. Nagel says that
some 'revolution' is needed to solve the mind-body problem.
Computationalism shows that such a revolution is only a return to
Plato's and Plotinus' conception of reality. This announces a
return of theology, sooner or later, to the scientific curriculum.
A good thing, I think, because the lack of modesty and rigor in
theology is responsible for a lot of unnecessary suffering on this
planet. Arguments from authority, conscious or not, are bad in
*all* fields, and are all the worse in fields concerned with
fundamental questions.

Bruno
http://iridia.ulb.ac.be/~marchal/
