Rummaging through the archives, I realized that a highly relevant article by Marcus Hutter
apparently has not yet been discussed on this list, although many have downloaded it:
A Complete Theory of Everything (Will Be Subjective)
Algorithms 2010, 3(4), 329-350; doi:10.3390/a3040329
Part of the Special Issue
"Algorithmic Complexity in Physics & Embedded Artificial Intelligences"
In Memoriam Ray Solomonoff (1926-2009)
http://www.mdpi.com/1999-4893/3/4/329/
Abstract: Increasingly encompassing models have been suggested for our world. Theories
range from generally accepted to increasingly speculative to apparently bogus. The
progression of theories from ego- to geo- to helio-centric models to universe and multiverse
theories and beyond was accompanied by a dramatic increase in the sizes of the postulated
worlds, with humans being expelled from their center to ever more remote and random
locations. Rather than leading to a true theory of everything, this trend faces a turning point
after which the predictive power of such theories decreases (actually to zero). Incorporating
the location and other capacities of the observer into such theories avoids this problem
and allows to distinguish meaningful from predictively meaningless theories. This also
leads to a truly complete theory of everything consisting of a (conventional objective)
theory of everything plus a (novel subjective) observer process. The observer localization
is neither based on the controversial anthropic principle, nor has it anything to do with
the quantum-mechanical observation process. The suggested principle is extended to more
practical (partial, approximate, probabilistic, parametric) world models (rather than theories
of everything). Finally, I provide a justification of Ockham's razor, and criticize the anthropic
principle, the doomsday argument, the no free lunch theorem, and the falsifiability dogma.
Keywords: world models; observer localization; predictive power; Ockham's razor;
universal theories; inductive reasoning; simplicity and complexity; universal self-sampling;
no-free-lunch; computability
Remarkably, Prof. Hutter holds doctoral degrees in both physics and computer science,
and has made fundamental contributions to both fields.
--
You received this message because you are subscribed to the Google Groups "Everything List" group.
To post to this group, send email to everyth...@googlegroups.com.
To unsubscribe from this group, send email to everything-li...@googlegroups.com.
For more options, visit this group at http://groups.google.com/group/everything-list?hl=en.
Highly relevant indeed. He states in his summary: "I have demonstrated that a theory that perfectly describes our universe or multiverse, […]"
You are right, the "theory of nothing" is an old hat, published in the 1990s. Hutter's new contribution is the observer localization: how many bits are necessary to identify the observer's location in the multiverse? Random locations cost many bits. Non-random ones are much more likely and therefore more predictive in the Bayesian framework. This "allows to distinguish meaningful from predictively meaningless theories".
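The observer-localization penalty can be sketched numerically. A minimal, hedged sketch in which raw description lengths stand in for the (uncomputable) Kolmogorov complexities; the theory names and bit counts below are illustrative assumptions, not figures from Hutter's paper:

```python
# Hypothetical theories: (name, bits to specify the theory itself,
# bits needed to localize the observer within the world it describes).
# All numbers are made up for illustration.
theories = [
    ("small universe, cheap localization",   400,  50),
    ("huge multiverse, random location",     100, 600),
    ("multiverse + short observer process",  150, 120),
]

# Algorithmic prior: weight(T) ~ 2^-(K(theory) + K(observer localization)).
# A short theory with a randomly located observer still pays a huge
# localization cost, so it loses posterior weight.
def weight(theory_bits: int, obs_bits: int) -> float:
    return 2.0 ** -(theory_bits + obs_bits)

total = sum(weight(t, o) for _, t, o in theories)
for name, t, o in theories:
    print(f"{name}: posterior ~ {weight(t, o) / total:.3g}")
```

The theory whose combined description-plus-localization length is shortest dominates the posterior, which is the sense in which a vast multiverse with a randomly located observer loses predictive power.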
Yes, he is a little biased by the Kolmogorov complexity result that
random strings are the most complex. The problem is in treating all
random bits as significant, when usually they aren't. I argue (On
complexity & emergence, 2001) that the relevant complexity measure
takes into account the significance of the information to the
observer, which really means that random strings are amongst the
simplest.
Of course this touches on your point that one cannot localise an
observer to a particular bitstring, but rather to all consistent
strings (that have the same meaning to the observer).
I would say this is my most significant departure from Schmidhuber and Hutter.
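The contrast between the two complexity measures discussed above can be made concrete. A hedged sketch using zlib compressed length as a crude stand-in for Kolmogorov complexity (which is uncomputable), with the meaning-equivalence class sizes being illustrative assumptions rather than anything computed:

```python
import random
import zlib

random.seed(0)

# Crude stand-in for Kolmogorov complexity K(s): zlib-compressed length.
def k_approx(s: bytes) -> int:
    return len(zlib.compress(s, 9))

structured = b"ab" * 500                                   # highly regular
noise = bytes(random.randrange(256) for _ in range(1000))  # incompressible

print("K(structured) ~", k_approx(structured), "bytes")    # small
print("K(noise)      ~", k_approx(noise), "bytes")         # near-maximal

# Observer-relative complexity, in the spirit of Standish's
# "On Complexity and Emergence" (2001):
#     C(s) = K(s) - log2 |[s]|,
# where [s] is the class of strings with the same meaning to the observer.
def observer_complexity(k_bits: float, log2_class_size: float) -> float:
    return k_bits - log2_class_size

# Illustrative assumption: the structured string is individually
# meaningful (class size 1), while "noise" is one meaning shared by
# ~2**7990 of the 2**8000 possible 1000-byte strings; we take K(noise)
# at its ideal ~8000 bits since random data is incompressible.
print("C(structured) ~", observer_complexity(8 * k_approx(structured), 0.0))
print("C(noise)      ~", observer_complexity(8000.0, 7990.0))
```

Under this measure the noise class comes out far simpler than the structured string, matching the claim above that, once meaning-equivalence is taken into account, random strings are amongst the simplest.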
--
----------------------------------------------------------------------------
Prof Russell Standish Phone 0425 253119 (mobile)
Mathematics
UNSW SYDNEY 2052 hpc...@hpcoders.com.au
Australia http://www.hpcoders.com.au
----------------------------------------------------------------------------
You are right. This is old stuff, already refuted on this list a long
time ago by different people.
Then, as you say, we are obliged to restrict the measure to the
relative consistent extensions of the strings/theories/bodies, and
this forces us to take the self-referential logics into account,
together with their point-of-view (POV) variants. This leads to the G*
theory and the G/G* splitting, and to their corresponding POV-variant
splittings, so that we get a theory of qualia, with quanta as a
particular sharable case, making the whole theory testable, and rather
well confirmed up to now. Indeed, most quantum weirdness
(indeterminacy, non-locality, no-cloning, etc.) follows easily from
the comp assumption (and even from much weaker similar assumptions).
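For readers unfamiliar with the notation: the G and G* referred to here are presumably the provability logics GL and GLS of Solovay. A minimal sketch of the standard axiomatizations, as background for the "splitting":

```latex
% G (a.k.a. GL) axiomatizes what Peano Arithmetic can prove about its
% own provability predicate \Box (Solovay's first completeness theorem):
% all classical tautologies, plus
\[
  \text{K:}\ \Box(p \to q) \to (\Box p \to \Box q), \qquad
  \text{L\"ob:}\ \Box(\Box p \to p) \to \Box p,
\]
% closed under modus ponens and necessitation (from p infer \Box p).
%
% G* (a.k.a. GLS) axiomatizes what is *true* of that predicate
% (Solovay's second completeness theorem): all theorems of G, plus
% the reflection axiom
\[
  \Box p \to p,
\]
% closed under modus ponens only. The G/G* splitting is the gap
% between the provable and the true: e.g.\ consistency, \neg\Box\bot,
% belongs to G* but not to G.
```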
The problem with Schmidhuber, Hutter, and also Tegmark is that they
continue to sweep the person's consciousness, if not the person per
se, under the rug. They do use a more Platonic framework than most
materialists, which makes their work a sort of progress compared to
purely physicalist approaches, but, like most physicists, they are
still blinded by the Aristotelian frames they continue to take for
granted.
They miss the first-person indeterminacy, which makes their position
hard to maintain. Not only do they miss the qualia, but they also end
up with a theory of quanta that is inconsistent with comp.
Such Aristotelian materialism/naturalism has probably been the root of
the apparent unsolvability of the mind-body problem. Nagel says that
some 'revolution' is needed to solve the mind-body problem.
Computationalism shows that such a revolution is only a return to
Plato's and Plotinus' conception of reality. This announces a return
of theology, sooner or later, to the scientific curriculum. A good
thing, I think, because the lack of modesty and rigor in theology is
responsible for a lot of unnecessary suffering on this planet.
Arguments from authority, conscious or not, are bad in *all* fields,
and worst of all in fields concerned with fundamental questions.