Interesting quote about all that (and information)
--
You received this message because you are subscribed to the Google Groups "Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email to everything-li...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/everything-list/CA%2BBCJUgRo-xNors%2BWZbDVpboT3QwiHC_NS24_uQ9_QkiTd3fyQ%40mail.gmail.com.
"In some respects, information is a qualitatively different sort of entity from all others in
terms of which the physical sciences describe the world. It is not, for instance, a function
only of tensor fields on spacetime (as general relativity requires all physical quantities to
be), nor is it a quantum-mechanical observable.
But in other respects, information does resemble some entities that appear in laws of
physics: the theory of computation, and statistical mechanics, seem to refer directly to it
without regard to the specific media in which it is instantiated, just as conservation laws
do for the electromagnetic four-current or the energy-momentum tensor. We call that the
substrate-independence of information. Information can also be moved from one type of
medium to another while retaining all its properties qua information. We call this its
interoperability property; it is what makes human capabilities such as language and science
possible, as well as biological adaptations that use symbolic codes, such as the genetic
code.
Also, information is of the essence in preparation and measurement, both of which are
necessary for testing scientific theories. The output of a measurement is information; the
input of a preparation includes information, specifying an attribute with which a physical
system is to be prepared.
All these applications of information involve abstraction, in that one entity is represented
symbolically by another. But information is not abstract in the same sense as, say, the set
of all prime numbers, for it only exists when it is physically instantiated. So the laws
governing it, like those governing computation – but unlike those governing prime
numbers – are laws of physics. In this paper we conjecture what these laws are.
Also, despite being physical, information has a counter-factual character: an object in a
particular physical state cannot be said to carry information unless it could have been in a
different state. As Weaver (1949) put it, this word ‘information’ in communication theory relates not so much to what you *do* say, as to what you *could* say…." D.Deutsch, Constructor Theory, Arxiv
> If I write "tamaontietoa" is it information or gibberish? Is it about something?
> All the science of information is about encoding and decoding; it is not only substrate independent, it is content independent.
> The problem with this is that information, like complexity, has no physically definite operational meaning. You can't go into the lab and ask what's the information content of "this".
In 1948 Claude Shannon gave us an operational definition of information: the amount of uncertainty a message reduces, measured in bits.
There is also a thermodynamic definition of information, the amount by which entropy is reduced in a given system, and it too is measured in bits. The two definitions work harmoniously together.
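As a minimal sketch of Shannon's measure (using the standard per-symbol frequency estimate; the function name is mine, not anything from Shannon's paper):

```python
import math
from collections import Counter

def shannon_entropy_bits(message: str) -> float:
    """Estimate entropy in bits per symbol from the symbol frequencies in the message."""
    n = len(message)
    counts = Counter(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A source that uses two symbols equally often carries 1 bit per symbol:
print(shannon_entropy_bits("0101010101"))  # -> 1.0
```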
So if you know the encoding you can always determine how much information something has, or at least the maximum amount of information a message has the potential to hold. For example, we know from experiment that the human genome contains 3 billion base pairs, and we know there are 4 bases, so each base can represent 2 bits; at 8 bits per byte, the entire human genome only has the capacity to hold 750 MB of information. That's about the amount of information you could fit on an old-fashioned CD; not a DVD, just a CD. The true number must be considerably less than that because the human genome contains a huge amount of redundancy; 750 MB is just the upper bound.

Incidentally, that's why I now think the singularity is likely to happen sometime within the next 5 years. One year ago, before it became obvious that a computer had passed the Turing Test, I would've said 20 to 30 years.
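The genome arithmetic in the paragraph above can be checked in a few lines:

```python
# Back-of-the-envelope capacity of the human genome, using the numbers in the post:
base_pairs = 3_000_000_000      # ~3 billion base pairs (measured)
bits_per_base = 2               # 4 possible bases -> log2(4) = 2 bits each
total_bits = base_pairs * bits_per_base
megabytes = total_bits / 8 / 1_000_000   # 8 bits per byte, 10^6 bytes per MB
print(megabytes)                # -> 750.0 (an upper bound; redundancy lowers it)
```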
I think we can be as certain as we can be of anything that it should be possible to build a seed AI that can grow from knowing nothing to being super-intelligent, and the recipe for building such a thing must be less than 750 MB; a LOT less.
After all, Albert Einstein went from understanding precisely nothing in 1879 to being the first man to understand General Relativity in 1915, and the human genome only contains 750 MB of information, and yet that is enough information to construct an entire human being, not just a brain. So whatever algorithm Einstein used to extract information from his environment, it must have been pretty simple, much much less than 750 MB. That's why I've been saying for years that super-intelligence could be achieved just by scaling things up; no new scientific discovery was needed, just better engineering, although I admit I was surprised how little scaling up turned out to be required.
On Sun, Jan 21, 2024 at 2:31 PM Brent Meeker <meeke...@gmail.com> wrote:
> If I write "tamaontietoa" is it information or gibberish? Is it about something?

There's no reason it couldn't be both; Shannon would say it's definitely information, but he doesn't care whether that information contains a great profundity or is just gibberish, because that is a matter of opinion and there is no disputing matters of taste. And if you're designing the technology to send messages through a fiber-optic line, mathematically you don't need to know what the messages will be saying, or whether the information in them is saying anything important; you just need to know how big it is, and Shannon can tell you that.
> All the science of information is about encoding and decoding; it is not only substrate independent, it is content independent.
Yes, and Shannon was the first one to realize that is not a bug, it's a feature. Imagine the chaos that would result if Internet routers had to understand the information and determine whether it was important or just gibberish before they could transmit it!
>>> If I write "tamaontietoa" is it information or gibberish? Is it about something?

>> There's no reason it couldn't be both, Shannon would say it's definitely information,

> No he wouldn't.

> Shannon information is relative to the possible messages.
Interesting quote about all that (and information)
Frank Wilczek: "Information is another dimensionless quantity that plays a large and increasing role in our description of the world. Many of the terms that arise naturally in discussions of information have a distinctly physical character. For example we commonly speak of density of information and flow of information. Going deeper, we find far-reaching analogies between information and (negative) entropy, as noted already in Shannon's original work. Nowadays many discussions of the microphysical origin of entropy, and of foundations of statistical mechanics in general, start from discussions of information and ignorance. I think it is fair to say that there has been a unification fusing the physical quantity (negative) entropy and the conceptual quantity information. A strong formal connection between entropy and action arises through the Euclidean, imaginary-time path integral formulation of partition functions. Indeed, in that framework the expectation value of the Euclideanized action essentially is the entropy. The identification of entropy with Euclideanized action has been used, among other things, to motivate an algebraically simple (but deeply mysterious) "derivation" of black hole entropy. If one could motivate the imaginary-time path integral directly and insightfully, rather than indirectly through the apparatus of energy eigenvalues, Boltzmann factors, and so forth, then one would have progressed toward this general prediction of unification: Fundamental action principles, and thus the laws of physics, will be re-interpreted as statements about information and its transformations." http://arxiv.org/pdf/1503.07735v1.pdf
--

On 20/01/2024 01:10 +01, Jason Resch <jason...@gmail.com> wrote:

> I put together a short write up on the relationship between physics, information, and computation, drawing heavily from the work of Seth Lloyd and others. I thought it might be interesting to members of this list who often debate whether our reality is fundamentally computational/informational.
>
> Jason
On Sun, Jan 21, 2024 at 7:03 PM Brent Meeker <meeke...@gmail.com> wrote:
>>> If I write "tamaontietoa" is it information or gibberish? Is it about something?

>> There's no reason it couldn't be both, Shannon would say it's definitely information,

> No he wouldn't.
Of course Shannon would say "tamaontietoa" contains information, and he can even tell you how much. There are 26 letters in the alphabet, so 5 bits (2^5 = 32) are more than enough to specify a letter; there are 12 letters in your example, so "tamaontietoa" contains 60 bits of information.
Personally I don't think the information that tamaontietoa contains is very interesting, but that's just me; Shannon makes no value judgments.
> Shannon information is relative to the possible messages.

Yes, and there are 26^12 possible 12-letter strings, and tamaontietoa is one of them.
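The letter-count arithmetic above, as a quick sketch:

```python
import math

s = "tamaontietoa"
bits_per_letter = math.ceil(math.log2(26))  # 26 letters fit in 5 bits (2^5 = 32)
total_bits = bits_per_letter * len(s)
print(total_bits)       # -> 60
print(26 ** len(s))     # number of possible 12-letter strings over the full alphabet
```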
> His [Shannon's] measure of information is relative to a channel and depends on the counterfactual number of messages that could be sent. You're presuming that each letter could have been one of 25 other letters. But there are only seven different letters in "tamaontietoa" so maybe only 3 bits are needed for each one.
> Incidentally "tama on tietoa" is Finnish for "this is information".
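Brent's smaller-alphabet point can be checked the same way: only 7 distinct letters actually occur in the string, so 3 bits per letter suffice over that alphabet.

```python
import math

s = "tamaontietoa"
alphabet = set(s)                                      # the letters actually used
bits_per_letter = math.ceil(math.log2(len(alphabet)))  # ceil(log2(7)) = 3
print(len(alphabet))                 # -> 7
print(bits_per_letter * len(s))      # -> 36 bits, versus 60 over the full 26-letter alphabet
```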
John K Clark

See what's on my new list at Extropolis