
Algorithmic Complexity and Randomness


Eray Ozkural exa
May 15, 2004, 10:07:52 PM
Greetings,

A random number of complexity n requires a program of complexity n
to generate it: a complexity-controlled pseudo-random number
generator. Ed thinks it's standard dynamical-systems stuff, like
LFSRs, or simple 1-dimensional (or higher) PRNG cellular automata... But I
think not; the complexity of the random numbers we see in nature is
overwhelming, and this must be attributed to program-size complexity.
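
[As an illustrative sketch, not from the thread: an LFSR of the kind Ed
means is only a few lines of code. Below is a minimal 16-bit Fibonacci
LFSR in Python; the tap set 16, 14, 13, 11 is a standard maximal-length
choice, and the seed is arbitrary.]

```python
def lfsr16_step(state):
    """One step of a 16-bit Fibonacci LFSR with taps at bits 16, 14, 13, 11
    (a standard maximal-length configuration, period 2**16 - 1)."""
    bit = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
    return (state >> 1) | (bit << 15)

def lfsr16_bits(seed, n):
    """Collect n output bits (the low bit of each successive state)."""
    out, s = [], seed
    for _ in range(n):
        s = lfsr16_step(s)
        out.append(s & 1)
    return out

stream = lfsr16_bits(0xACE1, 32)  # looks noisy, yet the generator is tiny
```

[The whole generator fits in a few lines; whatever its output looks like
statistically, its program-size (Kolmogorov) complexity is small, which is
exactly the tension raised above.]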

Comments welcome,

--
Eray Ozkural

Ed Fredkin
May 16, 2004, 3:44:42 PM
There are many cellular automata where it is possible to calculate the
proportion of f-cells within the past light cone of a particular cell. An
f-cell has the property that the state of the particular cell would be
different if the f-cell had a different state. Amazingly, this number
usually turns out to be 1/2 (for binary RUCA's). 1/2 is actually the
largest meaningful number, since 2 changes to a binary state are equivalent
to no change. This can be thought of as saying that the state of a bit in
an evolved RUCA that contains n bits has an algorithmic complexity of n/2.
In other words, the algorithmic complexity is essentially as large as is
mathematically possible. The reason is that even if every bit in the RUCA
affects the state of a particular bit, some cause it to be complemented
once, others twice, others 3 times, etc. It is reasonable to assume that
about half of the cells will affect the particular cell an even number of
times and half an odd number of times. Of course, the state of the
particular binary cell is only changed if it is changed an odd number of
times. Basically, it appears that there are no computational models that
generate better randomness than many CA's do.
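
[The f-cell fraction Ed describes can be estimated by brute force. An
illustrative sketch, substituting elementary rule 30 on a ring for a RUCA
(my choice purely for illustration; rule 30 is not reversible): flip each
initial cell in turn, re-run the evolution, and count how often a chosen
target cell changes.]

```python
import random

def ca_step(cells):
    """One step of elementary rule 30 on a ring:
    new[i] = left XOR (center OR right)."""
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
            for i in range(n)]

def target_bit(init, steps, pos):
    """Evolve `steps` steps and read off the cell at `pos`."""
    c = list(init)
    for _ in range(steps):
        c = ca_step(c)
    return c[pos]

def f_cell_fraction(n=101, steps=40, pos=50, seed=1):
    """Fraction of initial cells whose flip changes the target cell."""
    rng = random.Random(seed)
    init = [rng.randint(0, 1) for _ in range(n)]
    base = target_bit(init, steps, pos)
    hits = 0
    for i in range(n):
        flipped = list(init)
        flipped[i] ^= 1
        hits += target_bit(flipped, steps, pos) != base
    return hits / n
```

[Cells outside the past light cone (circular distance greater than `steps`)
can never be f-cells; for chaotic rules a substantial fraction of the cells
inside it are.]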

So, it's hard to separate program from data. Simple CA's (with simple
programs) can generate great complexity. In Kolmogorov-Solomonoff
complexity, it's obviously not just the size of the program, it's the size
of the package of both program and data. What Turing and Church taught us
is that a complex program can be emulated by a simple program with data.

Ed F

"Eray Ozkural exa" <er...@bilkent.edu.tr> wrote in message
news:fa69ae35.04051...@posting.google.com...

Eray Ozkural exa
May 17, 2004, 8:49:18 PM
Hello Ed,

Thanks for the thoughtful explanation.

"Ed Fredkin" <edfr...@yahoo.com> wrote in message news:<10afdki...@news.supernews.com>...


> There are many cellular automata where it is possible to calculate the
> proportion of f-cells within the past light cone of a particular cell. An
> f-cell has the property that the state of the particular cell would be
> different if the f-cell had a different state. Amazingly, this number
> usually turns out to be 1/2 (for binary RUCA's). 1/2 is actually the
> largest meaningful number since 2 changes to a binary state are equivalent
> to no change. This can be thought of as saying that the state of a bit in
> an evolved RUCA that contains n bits has an algorithmic complexity of n/2.

I am a little perplexed. Is there a reference for this that a novice
in algorithmic complexity can digest? (I guess the keyword is
"evolved" here)

> In other words, the algorithmic complexity is essentially as large as is
> mathematically possible. The reason is that even if every bit in the RUCA
> affects the state of a particular bit, some cause it to be complemented
> once, others twice, others 3 times, etc. It is reasonable to assume that
> about half of the cells will affect the particular cell an even number of
> times and half an odd number of times. Of course, the state of the
> particular binary cell is only changed if it is changed an odd number of
> times. Basically, it appears that there are no computational models that
> generate better randomness than many CA's do.

My point was that there must be a progression of randomness among UCAs
according to AIT. I found the paper below on the web, which perhaps
makes the connection:

http://www.cmi.univ-mrs.fr/~eforment/articles/cakolmo.ps

in which the authors define the complexity of a CA by the Kolmogorov
complexity of CA tables. More precisely they define a classification
parameter

             K(x | l(x)) + 1
  kappa(x) = ---------------
                  l(x)

I haven't read the paper thoroughly, but the authors suggest that
low-complexity tables will be more regular CAs (naturally?), and
furthermore that their normalized parameter may do a good job at
distinguishing CAs. They do seem to analyze the connection between the
degree of randomness of the computation triangle and the complexity of
a CA table.
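
[Since K is uncomputable, any practical use of such a parameter has to
approximate it from above, typically with a general-purpose compressor. A
rough sketch of that idea, with zlib standing in for K; the compressor, the
bit-packing, and the example tables are my assumptions, not the paper's.]

```python
import random
import zlib

def kappa_estimate(bits):
    """Crude analogue of kappa(x) = (K(x | l(x)) + 1) / l(x):
    compressed size in bits, divided by the length of x in bits.
    Values near 0 suggest a regular table; near (or above) 1, a random one."""
    # Pack the 0/1 list into bytes so the compressor sees the raw bit string.
    packed = bytes(
        sum(b << j for j, b in enumerate(bits[i:i + 8]))
        for i in range(0, len(bits), 8)
    )
    k_bits = 8 * len(zlib.compress(packed, 9))
    return (k_bits + 1) / len(bits)

rng = random.Random(0)
regular = [0] * 2048                              # maximally regular table
noisy = [rng.randint(0, 1) for _ in range(2048)]  # incompressible-looking table
```

[On these two inputs the estimate separates cleanly: the all-zero table
compresses to a few bytes, while the random table does not compress at all.]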

> So, it's hard to separate program from data. Simple CA's (with simple
> programs) can generate great complexity. In Kolmogorov-Solomonoff
> complexity, it's obviously not just the size of the program, it's the size
> of the package of both program and data. What Turing and Church taught us
> is that a complex program can be emulated by a simple program with data.
>
> Ed F

I see, so we must talk about the algorithmic complexity of the entire
triangle of computation (and of course its bounded initial triangles).
Is that correct?

I would imagine that, up to step n, triangles of computation of UCAs
with identical radii could have distinct degrees of randomness.

Kind Regards,

--
Eray Ozkural

PS: More generally, we could define orders of randomness per step of
evolution. At any rate, I think I must first read that theoretical
paper fully to avoid error of pointless speculation. It is a very
interesting subject and I must thank you for the excellent post.

Ed Fredkin
May 17, 2004, 9:08:43 PM

"Eray Ozkural exa" <er...@bilkent.edu.tr> wrote in message
news:fa69ae35.04051...@posting.google.com...
> Hello Ed,

>
> > So, it's hard to separate program from data. Simple CA's (with simple
> > programs) can generate great complexity. In Kolmogorov-Solomonoff
> > complexity, it's obviously not just the size of the program, it's the
> > size of the package of both program and data. What Turing and Church
> > taught us is that a complex program can be emulated by a simple program
> > with data.
> >
> > Ed F
>
> I see, so we must talk about the algorithmic complexity of the entire
> triangle of computation (and of course its bounded initial triangles).
> Is that correct?
>
> I would imagine that, up to step n, triangles of computation of UCAs
> with identical radii could have distinct degrees of randomness.
>
> Kind Regards,
>
> --
> Eray Ozkural
>
> PS: More generally, we could define orders of randomness per step of
> evolution. At any rate, I think I must first read that theoretical
> paper fully to avoid error of pointless speculation. It is a very
> interesting subject and I must thank you for the excellent post.
>
There is a much easier way to think about all this. The only kinds of CA's
we are considering are Universal CA's. What this means is that they are
computationally equivalent to ordinary computers. Now, if you are asked
"...what is the complexity of a computer?" the answer is, "...it can behave
in as complex a fashion as you like, considering the total number of
possible states." All you need to do is run any one of a number of known
programs that fill and modify memory with chaotic data. Anyone who uses MS
Windows can vouch for that! In a similar fashion, the behavior of a UCA or
RUCA can be as complex as you like! The fact that there are initial
conditions that do not lead to complexity is not really relevant. It's like
worrying about the complexity of a computer when there is nothing (all
zeroes) in its memory.

Ed F


Eray Ozkural exa
May 18, 2004, 6:00:04 AM
"Ed Fredkin" <edfr...@yahoo.com> wrote in message news:<10aikvs...@news.supernews.com>...

> There is a much easier way to think about all this. The only kinds of CA's
> we are considering are Universal CA's. What this means is that they are
> computationally equivalent to ordinary computers. Now, if you are asked
> "...what is the complexity of a computer?" the answer is, "...it can behave
> in as complex a fashion as you like, considering the total number of
> possible states."

Yes, that's right. I would not assume there is a fundamental
difference between a UCA and a UTM, they are simply different models
of computation.

The complexity of a universal computer U is K(x), where x is the
program it runs, since its system software and hardware description
are assumed to be constant. [I'm writing verbatim so that I don't get
things wrong.]

> All you need to do is run any one of a number of known
> programs that fill and modify memory with chaotic data. Anyone who uses MS
> Windows can vouch for that! In a similar fashion, the behavior of a UCA or
> RUCA can be as complex as you like! The fact that there are initial
> conditions that do not lead to complexity is not really relevant. It's like
> worrying about the complexity of a computer when there is nothing (all
> zeroes) in its memory.

I am no expert on UCAs, but I think it's rather difficult for a finite
computer, say a human, using only MS Windows and without gathering
entropy from the environment, to generate a document that is as
complex as he likes. That would seem to contradict a well-known
incompleteness result in Kolmogorov complexity (the one Chaitin says
may be thought of as based on Berry's paradox).

Theorem 3 from Shen's lecture notes:
http://user.it.uu.se/~vorobyov/Courses/KC/2000/all.ps

There exists a constant c such that all theorems of the form "K(x) > n"
have n < c (he means, of course, in a formal axiomatic system).

If, on the other hand, the human had access to a fair coin, a magical
statistical device of infinite complexity, then he could easily
generate maximally complex strings. But he has only a finite axiomatic
system at his disposal by our definition: himself. We don't let him
interact with the environment (neglecting that this is normally
impossible).

All of this, of course, may turn out to be an unnecessarily complex, and
therefore wrong, way to think about computation. [*] Excuse me if that is
the case. [!]

I had imagined maybe we shouldn't be worried about the initial
conditions, but we could determine how fast a UCA gains complexity. (I
still haven't proved in my mind whether there are UCAs that can get
arbitrarily complex as you have told me.)

Best Regards,

--
Eray Ozkural

[*] This is like a bootstrapping problem for computational complexity:
according to the above incompleteness result, it seems you can't add
what isn't already there. Complexity is not created out of nothing,
while to all appearances we humans weave complexity from thin air, in
endless patterns of creativity.

[!] An unnecessarily complex way to think about a problem indicates a
low standard of induction at work.

daniel B miller
May 18, 2004, 5:15:05 PM
Eray Ozkural exa wrote:
> "Ed Fredkin" <edfr...@yahoo.com> wrote in message news:<10aikvs...@news.supernews.com>...
>
...>

> I am no expert on UCAs, but I think it's rather difficult for a finite
> computer, say like a human, to use only MS Windows, without gathering
> entropy from the environment, to generate a document that is as
> complex as he likes. That would seem to contradict a well known
> incompleteness result in Kolmogorov complexity (the one Chaitin says
> may be thought to be based on Berry's paradox).
>
...

>
> [*] This is like a bootstrapping problem for computational complexity,
> it seems you can't add what isn't already there according to the above
> incompleteness result: complexity is not created out of nothing, while
> to all appearances we humans weave complexity from thin air, by
> endless patterns of creativity.
>
...

my 2c: from a practical point of view, it doesn't take much initial data
to create a heck of a lot of complexity. Consider: if just 1000 bits
were able to specify the entire universe (rules, initial conditions, etc.
-- and I mean compressed bits, i.e. the actual entropy) -- we would have
2**1000, or more than 10**300, possible universes. That is simply a
mind-boggling number.
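
[The arithmetic holds: 1000 * log10(2) is about 301, so 2**1000 indeed
exceeds 10**300. Python's arbitrary-precision integers verify it directly:]

```python
import math

assert 2**1000 > 10**300                 # exact integer comparison
print(len(str(2**1000)))                 # 302: decimal digits of 2**1000
print(math.floor(1000 * math.log10(2)))  # 301
```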

That's why, in my gut, I can't see why we would ever need 'true'
randomness, or how we could possibly distinguish it from deterministic,
but chaotic, processes.

-dbm

ueb
May 19, 2004, 5:09:18 PM
daniel B miller wrote:

> my 2c: from a practical point of view, it doesn't take much initial data
> to create a heck of a lot of complexity. Consider: if just 1000 bits
> were able to specify the entire universe (rules, initial conditions, etc
> -- and I mean compressed bits, ie the actual entropy) -- we would have
> 2**1000, or > 10**300 possible Universes. That is simply a
> mind-boggling number.

> That's why, in my gut, I can't see why we would ever need 'true'
> randomness, or how we could possibly distinguish it from deterministic,
> but chaotic, processes.

Humble question: Could at least one of these 2**1000 or 10**300 universes
behave like the real universe?

Ulrich

Eray Ozkural exa
May 20, 2004, 8:07:22 PM
ueb <Ulrich.B...@t-online.de> wrote in message news:<hceg8c...@Muse2.private.de>...

A good number of them would, it seems, but I'm not sure :) We can't
yet peek into those universes. :)

Cheers,

--
Eray Ozkural

ueb
May 22, 2004, 12:53:20 AM
Eray Ozkural exa wrote:
> ueb wrote in message news:<hceg8c...@Muse2.private.de>...

That would mean that quite simple rules, with little data, could lead
to the sophisticated rules of our universe. That seems a real challenge
to fathom. Do you see how important it is to know the valid rules
_exactly_, including their degree of determinism? And do you see how
important each humble realization is, and that it is by no means off
topic?

Ulrich

Eray Ozkural exa
May 22, 2004, 10:53:49 AM
ueb <Ulrich.B...@t-online.de> wrote in message news:<4cll8c...@Muse2.private.de>...

> Eray Ozkural exa wrote:
> >> Humble question: Could at least one of these 2**1000 or 10**300 universes
> >> behave like the real universe ?
>
> > A good number of them would, it seems, but I'm not sure :) We can't
> > yet peek into those universes. :)
>
> That would mean that quite simple rules, with little data, could lead
> to the sophisticated rules of our universe. That seems a real challenge
> to fathom. Do you see how important it is to know the valid rules
> _exactly_, including their degree of determinism? And do you see how
> important each humble realization is, and that it is by no means off
> topic?

The honest answer is "I don't know"; one needs a theoretical study of
the upper bounds of physical complexity that can be obtained from an
initial complexity of 1000 bits. My "engineer" guess is that 1000 bits
is too low, by a few orders of magnitude. Of course, by all means, feel
free to debunk my estimate.

I meant that this much complexity might behave *like* the real
universe, in that there might be relatively stable automata
(particles) and cosmic evolution, etc.

ueb
May 22, 2004, 11:44:08 PM
Eray Ozkural exa wrote:
> ueb wrote in message news:<4cll8c...@Muse2.private.de>...

That may be all well and good.
But how will one explore it without knowing the actual rules?

Ulrich

Tim Tyler
May 23, 2004, 3:40:01 AM
ueb <Ulrich.B...@t-online.de> wrote or quoted:
> daniel B miller wrote:

> > my 2c: from a practical point of view, it doesn't take much initial data
> > to create a heck of a lot of complexity. Consider: if just 1000 bits
> > were able to specify the entire universe (rules, initial conditions, etc
> > -- and I mean compressed bits, ie the actual entropy) -- we would have
> > 2**1000, or > 10**300 possible Universes. That is simply a

> > mind-boggling number. [...]


>
> Humble question: Could at least one of these 2**1000 or 10**300 universes
> behave like the real universe ?

This is one of the theses explored at:

http://www.digitalphysics.org/Faq/#3.3

In particular, they conjecture that some CAs result in *every* finite
configuration being produced somewhere in the evolution from a
trivial set of initial conditions.

If a CA produces every finite sequence at some stage - then some of those
will be exact spatio-temporal representations of our universe.

Empirical tests suggest that there are CAs for which the conjecture
is true - at least for small fields.

Whether there are CAs for which it is true for *all* finite fields
is not yet known.
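
[The conjecture can be probed at toy scale: evolve an elementary CA from a
single live cell and count how many distinct k-bit spatial windows ever
appear. Rule 30 and the parameters below are my choices for illustration;
the FAQ's conjecture concerns universal CAs and all finite configurations,
which this sketch only gestures at.]

```python
def ca_step(cells):
    """One step of elementary rule 30 on a ring."""
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
            for i in range(n)]

def window_coverage(k=8, width=501, steps=400):
    """Count distinct k-bit spatial windows over the whole evolution,
    starting from the 'trivial' initial condition of one live cell."""
    row = [0] * width
    row[width // 2] = 1
    seen = set()
    for _ in range(steps):
        row = ca_step(row)
        for i in range(width - k + 1):
            seen.add(tuple(row[i:i + k]))
    return len(seen)  # out of 2**k possible patterns
```

[Even this non-universal rule quickly turns up nearly all 256 possible
8-bit windows; the open question is whether some CA provably produces
every finite configuration.]
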
--
__________
|im |yler http://timtyler.org/ t...@tt1lock.org Remove lock to reply.

ueb
May 23, 2004, 8:25:45 PM
Tim Tyler wrote:
> ueb wrote or quoted:
>> daniel B miller wrote:

> http://www.digitalphysics.org/Faq/#3.3

Thanks for the reference and the kind explanation.
May I call your attention to results and realizations which only
seemingly have nothing to do with your realm, and which could indeed
help to clarify the connection of these discrete models with the real
world?
http://home.t-online.de/home/Ulrich.Bruchholz/
It is nothing less than the claim to know the fundamental rules of
even the real world. This claim is supported by results from numerical
simulations according to the Einstein-Maxwell equations. These
simulations significantly indicate discrete solutions of these tensor
equations, in which the integration constants (from initial conditions)
take on particle numbers.
These numerical simulations should also be technically interesting
to you, for their methods. Moreover, if time & space are indeed
fundamentally discrete, such simulations could indicate it.

Ulrich Bruchholz
info at bruchholz minus acoustics dot de

PS: Actual sources are in the Linux package ./field9n.tgz.
