Infinite entropy of universal prior and its implications for our universe


Gabriel Leuenberger

May 23, 2025, 7:42:45 AM
to Algorithmic Information Theory
The Shannon entropy of the universal prior is infinite (Li & Vitányi, Exercise 4.3.4). What are the implications of this for our reality? Here are two separate questions:

1. Since entropy is often defined as the 'expected surprisal', the infinite entropy of the universal prior may explain the rich diversity observed in nature, e.g., the many different and surprising chemical reactions, or Earth's biodiversity with its numerous lurking surprises. The Nature Communications paper by Dingle, Camargo & Louis relied on the universal prior to show that a wide range of complex systems, from economics to biochemistry, are biased toward simple outputs. But could one not similarly use the infinite entropy to explain the richness of nature, in addition to its simplicity bias?

2. We can assume our observations, or our universe, to be sampled from the universal prior, which may explain the relatively simple laws of physics as well as the relatively small initial size and small initial thermodynamic entropy of our universe. However, the infinite entropy of the universal prior means that the expected Kolmogorov complexity of our observations, or of our universe, should be infinite (sketched below). Of course, this high entropy can be explained away by pointing out that it may just be due to the possibility of random noise in simple stochastic ToEs (theories of everything). But my problem is: how can we correctly use AIT to show that the laws of physics and the initial conditions must be simple, in some sense, despite the infinite expected K?
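
For concreteness, here is the calculation I have in mind, written as a rough sketch (notation as in Li & Vitányi: m is the discrete universal prior, K is prefix Kolmogorov complexity, and additive constants are suppressed). By the coding theorem, -\log m(x) = K(x) + O(1), so the Shannon entropy of m is, up to an additive constant, the m-expected Kolmogorov complexity:

H(m) \;=\; -\sum_x m(x) \log m(x) \;=\; \sum_x m(x)\,K(x) + O(1) \;=\; \infty ,

which is the divergence asserted in Exercise 4.3.4.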

Aram Ebtekar

May 24, 2025, 11:08:41 PM
to Gabriel Leuenberger, Algorithmic Information Theory
Those are plausible connections! Reasoning through them carefully remains a challenge.

1. High entropy helps, but probably isn't the full story behind what we think of as richness. Charles Bennett invented logical depth to account for this (a rough definition is sketched after point 2 below), and more recently I hear Sean Carroll has been thinking about "complexogenesis".

2. This seems tricky too. I think Cole Wyeth and Alex Altair are also interested in this question.
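
(Regarding point 1: for anyone unfamiliar with logical depth, a rough version of Bennett's definition, with U a universal prefix machine, \ell(p) the length of program p, and s a significance parameter, is

\mathrm{depth}_s(x) \;=\; \min\{\, \mathrm{time}_U(p) \;:\; U(p) = x,\ \ell(p) \le K(x) + s \,\} ,

i.e. the running time of the fastest program that outputs x while being at most s bits longer than a shortest one. A string is deep when even its near-shortest descriptions take a long time to run, which captures a kind of richness that high entropy alone does not.)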

In general, I would argue that Kolmogorov complexity is more fundamental than Shannon entropy, and should be thought of as the "real meaning" of the word entropy; and also that expected values are only relevant when the law of large numbers applies (e.g., when you have a lot of independent samples). So instead of talking about expected surprisal, I might examine the probability of getting a high Kolmogorov complexity.
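
To illustrate that last distinction with a toy example: the Python sketch below samples from a hypothetical St. Petersburg-style "surprisal" distribution (nothing to do with the universal prior specifically, just the simplest heavy-tailed stand-in), whose expectation is infinite even though any individual draw is very likely to be small.

import random

# Toy "surprisal" S = 2**k with probability 2**-k for k = 1, 2, 3, ...
# E[S] = sum_k 2**-k * 2**k = sum_k 1 = infinity,
# yet P(S >= 2**k) = 2**-(k-1), so individual draws are usually small.

def sample_surprisal(rng: random.Random) -> int:
    """Draw S = 2**k, where k - 1 is the number of consecutive heads before the first tails."""
    k = 1
    while rng.random() < 0.5:
        k += 1
    return 2 ** k

def main() -> None:
    rng = random.Random(0)
    for n in (10**2, 10**4, 10**6):
        draws = sorted(sample_surprisal(rng) for _ in range(n))
        median = draws[n // 2]
        frac_small = sum(d <= 8 for d in draws) / n
        mean = sum(draws) / n
        # The median and P(S <= 8) stabilize, while the sample mean keeps
        # drifting upward with n, reflecting the infinite true expectation.
        print(f"n={n:>8}  median={median:>3}  P(S<=8)~{frac_small:.3f}  sample mean={mean:.1f}")

if __name__ == "__main__":
    main()

Here the sample mean grows without bound as n increases (roughly like log2(n) for this distribution), while the median and the probability of a small draw are stable; that is the sense in which an infinite expected surprisal need not make typical samples surprising.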


Aram Ebtekar


