Is there a "continuous" extension of Kolmogorov complexity

Aruhan Borzigin

May 19, 2017, 5:48:22 AM
to MAGIC
Hi,

K-complexity is defined for strings over a finite alphabet. As the title says, I wonder whether there is a "natural" way to measure the complexity of discrete/continuous time series? (Just as functional analysis is a "continuous" extension of linear algebra.)

To make such a definition, one probably needs to replace the notion of "Turing machine" with "dynamical system", but I'm not sure what the match for "universal Turing machine" would be.

Are there any well-established or promising theoretical results on this? I also posted a similar question on SE: https://cstheory.stackexchange.com/questions/37990/how-to-extend-solomonoff-induction-to-continuous-domain

Potential applications include using such a complexity measure as a regularizer for recurrent neural networks. It would be interesting to see whether performing gradient descent on such a cost function can approximate Solomonoff induction.

Thanks.

Laurent

May 19, 2017, 9:28:38 AM
to magic...@googlegroups.com
Hi Aruhan,

I'm not sure it is necessary to do that actually, since Turing machines can not only simulate any (computable) dynamical system with arbitrary precision, but can also, of course, do symbolic computation (like, say, Mathematica or Coq).

Furthermore, you want to be able to compute the complexity of a sequence of /observations/, which themselves will have finite precision (like the weights of a NN), encoded in a finite binary string.
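(Concretely, "finite precision, encoded in a finite binary string" might look like the sketch below; float32 and the function name are arbitrary illustrative choices, not anything established.)

import struct

def observations_to_bits(xs):
    """Encode a sequence of finite-precision observations as a finite
    binary string (here via IEEE-754 float32), i.e. exactly the kind of
    object whose K-complexity is already well defined."""
    payload = struct.pack(f"<{len(xs)}f", *xs)
    return "".join(f"{b:08b}" for b in payload)

print(observations_to_bits([0.5, -1.25])[:16])  # first 16 bits of the encoding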

Or maybe what you want is a *differentiable* version of K?



Aruhan Borzigin

May 19, 2017, 9:32:36 PM
to MAGIC
Hi, Laurent

Thanks for the reply.

> Or maybe what you want is a *differentiable* version of K?

Exactly. For NNs, or general dynamical systems, in theory we should assume their parameters have infinite precision. (Or perhaps a PDF with an infinitely precise parameter, so that the total information is still finite.) I have always thought that program search over a discrete space of TMs is very difficult to analyze in theory and to approximate in practice. Introducing a "natural" continuous definition might make certain theoretical analyses easier. Certain things may also behave differently in the continuous domain.
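(A minimal sketch of one crude way to make "description length" differentiable, in the spirit of MDL treatments of NN weights: take the code length of the weights to be the negative log-density under a smooth prior, so transmitting a weight to precision delta costs roughly -log2(p(w) * delta) bits. The Gaussian choice and all names below are illustrative assumptions, not an established method.)

import numpy as np

def weight_code_length_bits(w, sigma=1.0, delta=1e-4):
    """Differentiable stand-in for description length: bits needed to
    transmit the weights at precision delta under a Gaussian prior
    N(0, sigma^2), i.e. the sum of -log2(p(w_i) * delta)."""
    w = np.asarray(w, dtype=float)
    log_p = -0.5 * (w / sigma) ** 2 - 0.5 * np.log(2 * np.pi * sigma ** 2)
    return float(np.sum(-(log_p + np.log(delta)) / np.log(2.0)))

def weight_code_length_grad(w, sigma=1.0):
    """Gradient of the penalty with respect to the weights (the delta
    term is constant in w), ready to be added to an SGD update."""
    return np.asarray(w, dtype=float) / (sigma ** 2 * np.log(2.0))

Up to constants this is just weight decay, which also shows the limitation: such surrogates capture statistical simplicity, not the algorithmic simplicity that K measures.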

Aruhan Borzigin

May 19, 2017, 10:23:02 PM
to MAGIC
For an intuitive example:

Consider the following two Mathematica expressions:

expr1 = Sum[Sin[Exp[t] x] Exp[-t], {t, 1, Infinity}]; (* Table does not take an infinite iterator; Sum does *)
expr2 = x^2 + 0.1743557654 (* very long incompressible digit string *);

Of course the K-complexity of expr1 is much smaller than that of expr2. However, if we plot the two expressions, the first graph just "looks" much more complex than the second one. This example has its problems (machine precision is limited, for one), but I get the feeling that K-complexity on TMs is sometimes "unnatural" or too "artificial".
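(For anyone who wants to reproduce the comparison outside Mathematica: since the terms of expr1 decay like Exp[-t], truncating the series around t = 36 is already below double precision. A sketch; the truncation point and plotting choices are arbitrary.)

import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0.0, 2.0, 4000)

# Truncated expr1: terms with t > 36 are smaller than double-precision
# resolution relative to the O(1) partial sum, so they cannot affect the plot.
f1 = sum(np.sin(np.exp(t) * x) * np.exp(-t) for t in range(1, 37))
f2 = x ** 2 + 0.1743557654  # expr2

fig, (ax1, ax2) = plt.subplots(2, 1, figsize=(8, 6))
ax1.plot(x, f1); ax1.set_title("expr1: low K, 'looks' complex")
ax2.plot(x, f2); ax2.set_title("expr2: high K, 'looks' simple")
plt.tight_layout()
plt.show()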



Vaibhav Gavane

May 19, 2017, 11:13:07 PM
to magic...@googlegroups.com
Aruhan,

If you (hypothetically) "look" at the plot of y = 1 + \Omega, where
\Omega is Chaitin's halting probability, even that would "look"
extremely simple, even though K(\Omega) = \infty (every n-bit prefix
of \Omega has complexity at least n - O(1)).

The reason why *any* straight line graph "looks" simple is not because
our brain can accurately determine the exact equation of the line, but
because our brain constructs a *simple approximation* of the equation
of the line.

Eray Ozkural

May 20, 2017, 4:55:05 AM
to magic...@googlegroups.com
That ought to be trivial if you use computable reals instead of R.

There is no need to use R in any scientific problem anyhow.

Regards,
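(A minimal illustration of the computable-reals viewpoint: a computable real is a program mapping a precision request n to a rational within 2^-n of the target, and its complexity is just the length of the shortest such program. sqrt(2) here is only an example.)

from fractions import Fraction
from math import isqrt

def sqrt2(n: int) -> Fraction:
    """A computable real as a program: a dyadic rational within 2**-n
    of sqrt(2). K(sqrt(2)) is, up to a constant, the length of the
    shortest program with this behaviour."""
    # isqrt(2 * 4**n) = floor(sqrt(2) * 2**n), so the error is < 2**-n.
    return Fraction(isqrt(2 * 4 ** n), 2 ** n)

print(float(sqrt2(50)))  # ~1.4142135623730951

On this representation, nothing about Turing machines has to change; K is simply evaluated on programs that output approximations on demand.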
