http://blog.rudnyi.ru/2012/02/entropy-and-information.html
No doubt, this is my personal viewpoint. If you see that I have missed
something, please let me know.
Evgenii
Very nice summary of the history of the terms in physics and
information theory. I think it is valuable to have centralized this
information.
While I was reading your discussion of thermodynamic entropy reaching
zero at 0K a thought occurred to me which can restore meaning and
comparability between the terms:
Thermodynamic entropy refers only to the entropy (uncertainty)
associated with particle velocities.
Thus at 0K thermodynamic entropy is 0, and 0 bits of information are
required to describe the relative velocities of all the particles at 0K.
Memory devices do not increase their storage capacity as they become
hotter because no memory device today encodes information in terms of
particle velocities.
Under this view it is apparent that thermodynamic entropy is not the
same thing as entropy or information; it is a qualified term.
However, it seems thermodynamic entropy and "thermodynamic
information" could perhaps be considered synonymous.
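A toy sketch in Python of the counting involved (assuming, purely for
illustration, a discrete set of velocity microstates):

```python
import math

def shannon_entropy_bits(probs):
    """H = -sum p*log2(p), in bits (the +0.0 folds -0.0 into 0.0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0) + 0.0

# At 0K every particle sits in its single lowest-energy state: the
# distribution over velocity microstates is deterministic, so 0 bits
# are needed to describe it.
frozen = [1.0]
print(shannon_entropy_bits(frozen))   # 0.0

# At finite temperature the velocities spread over many states; a
# uniform spread over 8 states costs 3 bits per particle.
warm = [1.0 / 8] * 8
print(shannon_entropy_bits(warm))     # 3.0
```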
Jason
> Perhaps a more basic, and more pertinent, question related to
> entropy and information in the context of this list is the relation
> of computability, living beings, the arrow of time and entropy.
>
> What the paper (http://qi.ethz.ch/edu/qisemFS10/papers/
> 81_Bennett_Thermodynamics_of_computation.pdf) that initiated the
> discussion suggests is that in practical terms a driving force that
> avoids random reversibility is necessary to execute practical
> computations; this driving force implies dissipation of energy and
> thus an increase of entropy. This is so because most if not all
> practical computations are exponentially branched (Fig. 10).
But all computations can be done in a non-branched way.
There exist reversible universal machines, and indeed a quantum
computer is such a machine.
From Bennett 73 (Logical reversibility of computation, IBM J. Res.
Dev., 17, 525).
Energy is dissipated only when information is erased (Landauer's
principle), and, as the logicians already knew (Hao Wang, for example),
computations can be done without erasing any information.
Bennett convinced me also that we can extract work from (Shannon)
information, and that energy might be a local measure of information
content.
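Landauer's bound is easy to put in numbers (a minimal sketch; the
function name is mine, the constant and formula are the standard ones):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def landauer_limit_joules(temperature_kelvin, bits_erased=1):
    """Minimum heat dissipated when `bits_erased` bits are erased at the
    given temperature: bits * k_B * T * ln(2)."""
    return bits_erased * K_B * temperature_kelvin * math.log(2)

# Erasing one bit at room temperature (300 K) costs at least ~2.9e-21 J;
# reversible steps that erase nothing have no such lower bound.
print(landauer_limit_joules(300.0))
```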
>
> And here come the living beings. As the paper says in the
> introduction, living beings perform computations at the molecular
> level and, it must be said, at the neural level.
I am OK with this, but of course we don't know that.
> Therefore, given the above,
> life must proceed from less to more entropy, and this
> defines the arrow of time.
>
> Although the paper concentrates on what happens inside a computation,
> some concepts can be used to elucidate what happens in the interaction
> of a living being and its surrounding reality. Reality behaves like a
> form of ballistic computer at the microscopic level, with elementary
> particles ruled by the forces of nature instead of elastic macroscopic
> collisions. At the macroscopic level, however, there is a destruction
> of information and irreversibility.
OK.
>
> However, in the direction of entropy dissipation it is possible to
> perform calculations in order to predict the future at the macroscopic
> level. That's a critical function of living beings. An extreme example
> of the difference between macro and micro computation is to "predict"
> the distribution of water in a water collector after rain. It is not
> necessary to know the position and velocity of every water molecule,
> not even the position and velocity of each drop of water. It is this
> erasure of information, performed by the increase of entropy at the
> macroscopic level (which is indeed the reason for the very concept of
> macro-state in statistical mechanics), that permits economically
> feasible computations. Since computation is expensive, and the process
> of discovery of the world by living beings through natural selection
> is very slow (the aggregation of complexity and sophistication by
> natural selection is on the order of magnitude of the age of the
> universe: thousands of millions of years), the macroscopic laws of
> nature must be simple enough, and there must be a privileged direction
> of easy computation for life to exist.
OK.
>
> The fact that the time for evolution of intelligent life and the age
> of the Universe are of the same order of magnitude means that this
> universe is constrained to the maximum discoverable-by-evolution
> complexity in the computationally privileged direction of the arrow
> of time.
OK, in terms of "this branch of the multiverse".
Bruno
I do not get your point. The JANAF Tables were created to solve a
particular problem. If you need changes in concentration, surface
effects, or magnetization effects, you have to extend the JANAF Tables,
and this too has been done to solve particular problems. Experimental
thermodynamics is not limited to the JANAF Tables; for example, the
databases in Thermo-Calc already include dependence on concentration.
> Yet
> magnetization of small domains is how information is stored on hard
> disks, cf. David MacKay's book "Information Theory, Inference, and
> Learning Algorithms", chapter 31.
Do you mean that when we consider magnetization, the entropy becomes
subjective and context-dependent, and will finally be filled with
information?
> Did you actually read E. T. Jaynes 1957 paper in which he introduced
> the idea of basing entropy in statistical mechanics (which you also
> seem to dislike) on information? He wrote "The mere fact that the
> same mathematical expression -SUM[p_i log(p_i)] occurs in both
> statistical mechanics and in information theory does not in itself
> establish a connection between these fields. This can be done only by
> finding new viewpoints from which the thermodynamic entropy and
> information-theory entropy appear as the same /concept/." Then he
I have missed this quote; I have to add it. In general, Jaynes's first
paper is in a way reasonable. I wanted to understand it better, as I
like maximum likelihood, which I have used a lot in my own research.
However, when I read the following two quotes in Jaynes's second paper,
I gave up.
“With such an interpretation the expression ‘irreversible process’
represents a semantic confusion; it is not the physical process that is
irreversible, but rather our ability to follow it. The second law of
thermodynamics then becomes merely the statement that although our
information as to the state of a system may be lost in a variety of
ways, the only way in which it can be gained is by carrying out further
measurements.”
“It is important to realize that the tendency of entropy to increase is
not a consequence of the laws of physics as such, … . An entropy
increase may occur unavoidably, due to our incomplete knowledge of the
forces acting on a system, or it may be an entirely voluntary act on
our part.”
This I do not understand. Do you agree with these two quotes? If yes,
could you please explain what he means?
> goes on to show how the principle of maximum entropy can be used to
> derive statistical mechanics. That it *can* be done in some other
> way, and was historically as you assert, is not to the point. As an
> example of how the information view of statistical mechanics extends
> its application he calculates how much the spins of protons in water
> would be polarized by rotating the water at 36,000rpm. It seems you
> are merely objecting to "new viewpoints" on the grounds that you can
> see all that you /want/ to see from the old viewpoint.
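Jaynes's maximum-entropy recipe can be checked numerically. A sketch
with made-up energy levels: the maximum-entropy distribution under a
mean-energy constraint is known to take the Boltzmann form
p_i ∝ exp(-beta·E_i), and the code below only has to find beta by
bisection:

```python
import math

def maxent_distribution(energies, mean_energy, tol=1e-12):
    """Maximum-entropy distribution over `energies` subject to a fixed
    mean energy. The solution has the Boltzmann form
    p_i proportional to exp(-beta * E_i); beta is found by bisection."""
    def mean_at(beta):
        weights = [math.exp(-beta * e) for e in energies]
        z = sum(weights)
        return sum(e * w for e, w in zip(energies, weights)) / z

    lo, hi = -50.0, 50.0          # mean_at is decreasing in beta
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if mean_at(mid) > mean_energy:
            lo = mid              # mean too high: beta must grow
        else:
            hi = mid
    beta = (lo + hi) / 2.0
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)
    return [w / z for w in weights]

# Four hypothetical energy levels with the mean energy pinned at 1.0:
levels = [0.0, 1.0, 2.0, 3.0]
p = maxent_distribution(levels, mean_energy=1.0)
print(p)   # decreasing Boltzmann weights
```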
> Your quotation of Arnheim, from his book on the theory of entropy in
> art, just shows his confusion. The Shannon information, which is
> greatest when the system is most disordered in some sense, does not
> imply that the most disordered message contains the greatest
> information. The Shannon information is that information we receive
> when the *potential messages* are most disordered. It's a property of
> an ensemble or a channel, not of a particular message.
This is not confusion on Arnheim's part; his book is quite good. To
this end, let me quote the second sentence of your message.
> I think you are ignoring the conceptual unification provided by
> information theory and statistical mechanics.
You see, I would love to understand the conceptual unification. To this
end, I have created many simple problems to understand this better.
Unfortunately you do not want to discuss them; you just say general
words but do not want to apply them to my simple practical problems.
Hence it is hard for me to understand you.
Speaking of confusion, just one example. You say that the higher the
temperature, the more information a system has. Yet engineers seem
unwilling to employ this knowledge in practice. Why is that? Why do
engineers seem unimpressed by the conceptual unification?
Evgenii
> Brent
>
And you don't get my point. Of course all forms of entropy can be measured and tabulated,
but the information theory viewpoint shows how they are unified by the same concept.
>
>> Yet
>> magnetization of small domains is how information is stored on hard
>> disks, cf. David MacKay's book "Information Theory, Inference, and
>> Learning Algorithms", chapter 31.
>
> Do you mean that when we consider magnetization, the entropy becomes subjective and
> context-dependent, and will finally be filled with information?
It is context dependent in that we consider the magnetization. What does the JANAF table
assume about the magnetization of the materials it tabulates?
>
>> Did you actually read E. T. Jaynes 1957 paper in which he introduced
>> the idea of basing entropy in statistical mechanics (which you also
>> seem to dislike) on information? He wrote "The mere fact that the
>> same mathematical expression -SUM[p_i log(p_i)] occurs in both
>> statistical mechanics and in information theory does not in itself
>> establish a connection between these fields. This can be done only by
>> finding new viewpoints from which the thermodynamic entropy and
>> information-theory entropy appear as the same /concept/." Then he
>
> I have missed this quote; I have to add it. In general, Jaynes's first paper is in a
> way reasonable. I wanted to understand it better, as I like maximum likelihood, which I
> have used a lot in my own research. However, when I read the following two quotes in
> Jaynes's second paper, I gave up.
>
> “With such an interpretation the expression ‘irreversible process’ represents a semantic
> confusion; it is not the physical process that is irreversible, but rather our ability
> to follow it. The second law of thermodynamics then becomes merely the statement that
> although our information as to the state of a system may be lost in a variety of ways,
> the only way in which it can be gained is by carrying out further measurements.”
>
> “It is important to realize that the tendency of entropy to increase is
> not a consequence of the laws of physics as such, … . An entropy increase may occur
> unavoidably, due to our incomplete knowledge of the forces acting on a system, or it may
> be an entirely voluntary act on our part.”
>
> This I do not understand. Do you agree with these two quotes? If yes, could you please
> explain what he means?
Yes. The physical processes are not irreversible. The fundamental physical laws are time
reversible. The free-expansion of a gas is *statistically* irreversible because we cannot
follow the individual molecules and their correlations, so when we consider only the
macroscopic variables of pressure, density, temperature,... it seems irreversible. In
very simple systems we might be able to actually follow the microscopic evolution of the
state, but we can choose to ignore it and calculate the entropy increase as though this
information were lost. Whether and how the information is lost is the crux of the
measurement problem in QM. Almost everyone on this list assumes Everett's multiple worlds
interpretation in which the information is not lost but is divided up among different
continuations of the observer.
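A minimal numerical sketch of that point, with a random permutation
standing in for reversible dynamics and a coarse-graining of my own
choosing:

```python
import math, random

def entropy_bits(probs):
    """Shannon/Gibbs entropy in bits (the +0.0 folds -0.0 into 0.0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0) + 0.0

random.seed(0)
n = 16
# Our knowledge of the microstate: spread over the first 4 of 16 states,
# i.e. 2 bits of fine-grained uncertainty, all of it inside "cell 0".
micro = [0.25] * 4 + [0.0] * 12

# Reversible dynamics modeled as a permutation of microstates: it only
# relabels states, so the fine-grained entropy cannot change.
perm = list(range(n))
random.shuffle(perm)
evolved = [0.0] * n
for i, p in enumerate(micro):
    evolved[perm[i]] = p

def coarse(probs, cells=4):
    """Macroscopic description: lump microstates into equal-size cells."""
    size = len(probs) // cells
    return [sum(probs[c * size:(c + 1) * size]) for c in range(cells)]

print(entropy_bits(micro), entropy_bits(evolved))        # both 2.0 bits
print(entropy_bits(coarse(micro)), entropy_bits(coarse(evolved)))
# coarse entropy starts at 0.0 and typically rises: the "increase" lives
# in the macroscopic bookkeeping, not in the reversible dynamics
```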
>
>> goes on to show how the principle of maximum entropy can be used to
>> derive statistical mechanics. That it *can* be done in some other
>> way, and was historically as you assert, is not to the point. As an
>> example of how the information view of statistical mechanics extends
>> its application he calculates how much the spins of protons in water
>> would be polarized by rotating the water at 36,000rpm. It seems you
>> are merely objecting to "new viewpoints" on the grounds that you can
>> see all that you /want/ to see from the old viewpoint.
>
>> Your quotation of Arnheim, from his book on the theory of entropy in
>> art, just shows his confusion. The Shannon information, which is
>> greatest when the system is most disordered in some sense, does not
>> imply that the most disordered message contains the greatest
>> information. The Shannon information is that information we receive
>> when the *potential messages* are most disordered. It's a property of
>> an ensemble or a channel, not of a particular message.
>
> It is not a confusion of Arnheim. His book is quite good.
Then why doesn't he know that Shannon's information does not refer to particular messages?
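The distinction can be made concrete (a sketch with invented message
probabilities):

```python
import math

def entropy_bits(probs):
    """Entropy of an ensemble of messages, in bits per message."""
    return -sum(p * math.log2(p) for p in probs.values() if p > 0)

# Entropy is a property of the ensemble (the channel), not of one message.
ensemble = {"00": 0.25, "01": 0.25, "10": 0.25, "11": 0.25}
print(entropy_bits(ensemble))        # 2.0 bits per message

# The same string carries different surprisal under a different ensemble:
skewed = {"00": 0.97, "01": 0.01, "10": 0.01, "11": 0.01}
print(-math.log2(skewed["00"]))      # about 0.044 bits: same string, little news
```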
> To this end, let me quote your second sentence in your message.
>
> > I think you are ignoring the conceptual unification provided by
> > information theory and statistical mechanics.
>
> You see, I would love to understand the conceptual unification.
I doubt this. If you really wanted to understand it, there is lots of material
online as well as good books which Russell and I have suggested.
> To this end, I have created many simple problems to understand this better.
> Unfortunately you do not want to discuss them; you just say general words but do not
> want to apply them to my simple practical problems. Hence it is hard for me to
> understand you.
>
> Speaking of confusion, just one example. You say that the higher the temperature, the
> more information a system has. Yet engineers seem unwilling to employ this knowledge
> in practice. Why is that? Why do engineers seem unimpressed by the conceptual
> unification?
I'm not an expert on this subject and it has been forty years since I studied statistical
mechanics, which is why I prefer to refer you to experts. Engineers are generally not
impressed with conceptual unification; they are interested in what can be most easily and
reliably applied. RF engineers generally don't care that EM waves are really photons.
Structural engineers don't care about interatomic forces; they just look up yield strength
in tables. Engineers are not at all concerned with 'in principle' processes which can only
be realized in carefully contrived laboratory experiments. But finding unifying
principles is the job of physicists.
Brent
First, thank you for your time. Your answers, as well as the answers of
other participants, have helped me to understand the opposite viewpoint
better and to organize my thoughts.
You are right that the best thing would be to study a good book, but
first, I already have a stack of books to read, and second, reading a
book alone is usually boring. I much prefer to clear up a question in a
discussion; it is more enjoyable.
What I will probably do is read more about the historical development
of Maxwell's demon that John has mentioned. Somehow I have missed it.
I believe that it is normal that we have different opinions, but I hope
that my emails have helped you to understand the opposite viewpoint better.
As for engineers and physicists, I do not know. Let us take for example
a landfill. There are too many of them in modern society, and engineers
are trying to find solutions to reuse waste. An open question: will
your statement that physical processes are not irreversible help
engineers to find a better solution?
Finally the JANAF Tables assume that the magnetic field is zero. If it
is not, then one has to add a corresponding term.
Best wishes,
Evgenii
On 27.02.2012 20:43 meekerdb said the following:
I am a thermodynamicist and I do not know exactly what information and
computation are. You have written that living beings perform
computations. Several questions in this respect.
Are computations limited to living beings only?
Does a bacterium perform computations as well?
If yes, then what is the difference between a ballcock in the toilet
and a bacterium (provided we exclude reproduction from consideration)?
Evgenii
On 27.02.2012 12:16 Alberto G.Corona said the following:
> A thing that I often ask myself concerning the MMH is the question
> about what is mathematical and what is not. The set of real numbers
> is a mathematical structure, but so is the set of real numbers plus
> the point (1,1) in the plane.
Sure. Now, with comp, that mathematical structure is more easily
handled in the "mind" of the universal machine. For the ontology we
can use arithmetic, on which everyone agrees. It is absolutely
undecidable that there is more than that (with the comp assumption).
So for the math, comp invites us to assume only what is called "the
sharable part of intuitionist and classical mathematics".
> The set of randomly chosen numbers {1.4, 3.4, .34, 3} is one too,
> because it can be described with the same descriptive language of
> math. But the first of these structures has properties and the others
> do not. The first can be infinite but can be described with a single
> equation, while the last must be described extensively. At least some
> random universes (the finite ones) can be described extensively with
> the tools of mathematics, but they don't count in the intuitive sense
> as mathematical.
Why? If they can be finitely described, then I don't see why they
would be non mathematical.
>
> What is usually considered genuinely mathematical is any structure
> that can be described briefly.
Not at all. In classical math any particular real number is
mathematically real, even if it cannot be described briefly. Chaitin's
Omega cannot be described briefly, even if we can define it briefly.
> Also it must have good properties,
> operations, symmetries or isomorphisms with other structures, so the
> structure can be navigated and related with other structures and the
> knowledge can be reused. These structures have a low Kolmogorov
> complexity, so they can be "navigated" with low computing resources.
But they are a tiny part of bigger mathematical structures. That's why
we use big mathematical universes, like the models of ZF, or category
theory.
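As a crude, computable stand-in for "low Kolmogorov complexity",
compressed size already shows the contrast (a sketch; zlib gives only an
upper-bound proxy, not Kolmogorov complexity itself):

```python
import random
import zlib

def proxy_complexity(data: bytes) -> int:
    """Compressed size in bytes: a crude, computable stand-in for
    Kolmogorov complexity (an upper bound, not the real thing)."""
    return len(zlib.compress(data, 9))

structured = bytes(range(256)) * 40                 # generated by a tiny rule
random.seed(42)
noisy = bytes(random.randrange(256) for _ in range(10240))  # no short rule known

print(proxy_complexity(structured) < proxy_complexity(noisy))  # True
```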
>
> So the demand of computation in each living being forces us to admit
> that universes too random or too simple, with no linear or
> discontinuous macroscopic laws, have no complex spatio-temporal
> volutes (which may be the aspect of life as seen from outside our
> four-dimensional universe). The macroscopic laws are the macroscopic
> effects of the underlying mathematical structures with which our
> universe is isomorphic (or identical).
We need both, if only to make precise that very reasoning. Even in
comp, such math is better seen as epistemological than ontological.
>
> And our very notion of what is intuitively considered mathematical,
> "something general, simple and powerful enough", has the hallmark of
> scarcity of computation resources. (And absence of contradictions fits
> in the notion of simplicity, because exceptions to rules have to be
> memorized and dealt with extensively, one by one.)
>
> Perhaps it is not only that way; it may even be that the absence of
> contradictions (the main rule of simplicity) or, in computational
> terms, the rule of low Kolmogorov complexity _creates_ the
> mathematics itself.
Precisely not. Kolmogorov complexity is too shallow, and lacks the
needed redundancy, depth, etc. to allow a reasonable solution to the
comp measure problem.
> That is, for example, it may be that boolean logic
> is what it is not because it is consistent, simple and
> beautiful, but because it is the shortest logic in terms of the
> length of the description of its operations, and this is the reason
> we perceive it as simple and beautiful and consistent.
It is not the shortest logic. It has the simplest semantics, at the
propositional level. Combinator logic is far simpler conceptually,
but has far more complex semantics.
But the main problem with the MMH is that nobody can define what it is,
and it is a priori too big to have a notion of first-person
indeterminacy. Comp puts *much* order into this, and needs no more math
than arithmetic, or elementary mathematical computer science, at the
ontological level. Tegmark seems unaware of the whole
foundations-of-math progress made by the logicians.
Bruno
>> Dear Albert,
>>
>> One brief comment. In your Google paper you wrote, among other
>> interesting things, "But life and natural selection demands a
>> mathematical universe
>> <https://docs.google.com/Doc?docid=0AW-x2MmiuA32ZGQ1cm03cXFfMTk4YzR4cn...
>> >somehow".
>> Could it be that this is just another implication of the MMH idea? If
>> the physical implementation of computation acts as a selective
>> pressure
>> on the multiverse, then it makes sense that we would find ourselves
>> in a
>> universe that is representable in terms of Boolean algebras with
>> their
>> nice and well behaved laws of bivalence (A or not-A), etc.
>>
>> Very interesting ideas.
>>
>> Onward!
>>
>> Stephen
>
On 29 feb, 11:20, Bruno Marchal <marc...@ulb.ac.be> wrote:
> On 29 Feb 2012, at 02:20, Alberto G.Corona wrote (to Stephen):
>> A thing that I often ask myself concerning the MMH is the question
>> about what is mathematical and what is not. The set of real numbers
>> is a mathematical structure, but so is the set of real numbers plus
>> the point (1,1) in the plane.
> Sure. Now, with comp, that mathematical structure is more easily
> handled in the "mind" of the universal machine. For the ontology we
> can use arithmetic, on which everyone agrees. It is absolutely
> undecidable that there is more than that (with the comp assumption).
> So for the math, comp invites us to assume only what is called "the
> sharable part of intuitionist and classical mathematics".
I do not think of computations in terms of "minds of universal
machines" in the abstract sense but in terms of the needs of
computability of living beings.
>> The set of randomly chosen numbers {1.4, 3.4, .34, 3} is one too,
>> because it can be described with the same descriptive language of
>> math. But the first of these structures has properties and the others
>> do not. The first can be infinite but can be described with a single
>> equation, while the last must be described extensively. At least some
>> random universes (the finite ones) can be described extensively with
>> the tools of mathematics, but they don't count in the intuitive sense
>> as mathematical.
> Why? If they can be finitely described, then I don't see why they
> would be non-mathematical.
It is not mathematical in the intuitive sense, just as the list of the
points of a randomly folded paper is not. That intuitive sense, more
restrictive, is what I use here.
>> What is usually considered genuinely mathematical is any structure
>> that can be described briefly.
> Not at all. In classical math any particular real number is
> mathematically real, even if it cannot be described briefly. Chaitin's
> Omega cannot be described briefly, even if we can define it briefly.
A real number in the sense I said above is not mathematical. In fact
there is no mathematical theory about particular real numbers; the set
of all the real numbers, on the contrary, is.
>> Also it must have good properties, operations, symmetries or
>> isomorphisms with other structures, so the structure can be navigated
>> and related with other structures and the knowledge can be reused.
>> These structures have a low Kolmogorov complexity, so they can be
>> "navigated" with low computing resources.
> But they are a tiny part of bigger mathematical structures. That's why
> we use big mathematical universes, like the models of ZF, or category
> theory.
If math is all that can be described finitely, then of course you are
right. But I am intuitively sure that the structures that are
interesting can be defined briefly, using an evolutionary sense of what
is interesting.
>> So the demand of computation in each living being forces us to admit
>> that universes too random or too simple, with no linear or
>> discontinuous macroscopic laws, have no complex spatio-temporal
>> volutes (which may be the aspect of life as seen from outside our
>> four-dimensional universe). The macroscopic laws are the macroscopic
>> effects of the underlying mathematical structures with which our
>> universe is isomorphic (or identical).
> We need both, if only to make precise that very reasoning. Even in
> comp, such math is better seen as epistemological than ontological.
There is a hole in the transition from certain mathematical properties
in macroscopic laws to simple mathematical theories of everything: the
fact that a strange but relatively simple mathematical structure (M
theory) includes islands of macroscopic laws that are warm for life. I
do not know the necessity of this greed for reduction. The macroscopic
laws could reign in a Hubble sphere, sustained by a giant at the top of
a turtle swimming in an ocean.
>> And our very notion of what is intuitively considered mathematical,
>> "something general, simple and powerful enough", has the hallmark of
>> scarcity of computation resources. (And absence of contradictions
>> fits in the notion of simplicity, because exceptions to rules have to
>> be memorized and dealt with extensively, one by one.)
>> Perhaps it is not only that way; it may even be that the absence of
>> contradictions (the main rule of simplicity) or, in computational
>> terms, the rule of low Kolmogorov complexity _creates_ the
>> mathematics itself.
> Precisely not. Kolmogorov complexity is too shallow, and lacks the
> needed redundancy, depth, etc. to allow a reasonable solution to the
> comp measure problem.
I cannot grasp from your terse definitions what the comp measure
problem is. What I know is that Kolmogorov complexity is critical for
life, if living beings compute inputs to create appropriate outputs
for survival. And they do.
>> That is, for example, it may be that boolean logic is what it is not
>> because it is consistent, simple and beautiful, but because it is the
>> shortest logic in terms of the length of the description of its
>> operations, and this is the reason we perceive it as simple and
>> beautiful and consistent.
> It is not the shortest logic. It has the simplest semantics, at the
> propositional level. Combinator logic is far simpler conceptually,
> but has far more complex semantics.
I meant the shortest binary logic. I mean that any structure with
contradictions has a longer description than one without them, so the
first is harder to discover and harder to deal with.
Evgenii
On 29.02.2012 15:58 Alberto G.Corona said the following:
I start from the idea of whatever model of a universe that can locally
evolve computers. A continuous mathematical structure with an
infinitely small substitution measure, and thus non-computable, can
evolve computers. Well, not just computers, but problem-adaptive
systems, clearly separated from the environment, that respond to
external environmental situations in order to preserve their internal
structures, to reproduce, and so on.
> The list advocates that 'everything' is simpler than 'something'. But
> this leads to a measure problem. It happens that the comp hypothesis
> gives crucial constraints on that measure problem.
> I agree with you. The little numbers are the real stars :)
> But the fact is that quickly, *some* rather little numbers have
> behaviors which we can't explain without referring to big numbers or
> even infinities. A diophantine polynomial of degree 4, with 54
> variables, perhaps less, is already Turing universal. There are
> programs which do not halt, but you will need quite elaborate
> transfinite mathematics to prove it is the case.
That is not a problem as long as diophantine polynomials don't usurp
the role of boolean logic in our universe, and transfinite mathematics
doesn't vindicate a role in the second law of Newton. ;)
> Kolmogorov complexity might be the key to the measure problem, but few
> people have succeeded in using it to make progress. It might play some
> role in the selection of some particular dovetailer, but it can't
> work, by being non-computable, and depending on constants. I don't
> know. I'm afraid that the possible role for Kolmogorov complexity will
> have to be derived, not assumed. Or you might find an alternative
> formulation of comp.
As I said above, I do not see why a model of the universe as a whole
has to be restricted to the requirement of simulation. I see (local)
and macroscopic computability as an "anthropic" requirement of Life,
but not more.
>>> That is, for example, it may be that boolean logic is what it is not
>>> because it is consistent, simple and beautiful, but because it is
>>> the shortest logic in terms of the length of the description of its
>>> operations, and this is the reason we perceive it as simple and
>>> beautiful and consistent.
>> It is not the shortest logic. It has the simplest semantics, at the
>> propositional level. Combinator logic is far simpler conceptually,
>> but has far more complex semantics.
> I meant the shortest binary logic.
Classical logic is not the shortest binary logic, in terms of the
length of its possible formal descriptions.
> I mean that any structure with contradictions has a longer description
> than one without them.
? No logic gets contradictions, with the notable exception of the
paraconsistent logics. Intuitionist logic is a consistent (free of
contradiction) weakening of classical logic. Quantum logic too. Note
also that the term logic is vague. Strictly speaking I don't need
logic at the ..
I can define a set and arbitrary operations with contradictions. I
can say True AND True is False half of the time and True the other
half, depending on a third stochastic boolean variable that flips
according to some criteria. I can define multiplication of numbers
in weird ways so that I break the symmetric and distributive
properties in certain cases, and so on. All of them can be defined
algorithmically or mathematically. In the broader sense, these
structures will be mathematical, but with larger Kolmogorov complexity
than the good ones (and useless).
Chemical reactions take place not only in the cell. Let us take for
example the famous Belousov-Zhabotinsky reaction (a nonlinear chemical
oscillator; several reactions in this process take place
simultaneously). Could we say that a reactor with this reaction also
computes?
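For concreteness, such chemical oscillations are easy to reproduce in a
toy model. The sketch below uses the Brusselator, a textbook abstract
model of an oscillating reaction (my substitution: the real
Belousov-Zhabotinsky kinetics are stiff and need a more careful
solver), integrated with a naive Euler step:

```python
def brusselator(steps=200000, dt=0.001, a=1.0, b=3.0):
    """Forward-Euler integration of the Brusselator rate equations:
    dx/dt = a + x^2 y - (b+1) x,  dy/dt = b x - x^2 y."""
    x, y = 1.0, 1.0
    xs = []
    for _ in range(steps):
        dx = a + x * x * y - (b + 1.0) * x
        dy = b * x - x * x * y
        x += dt * dx
        y += dt * dy
        xs.append(x)
    return xs

xs = brusselator()
late = xs[len(xs) // 2:]     # discard the transient
print(min(late), max(late))  # sustained swing: for b > 1 + a^2 the
                             # steady state is unstable, so x keeps oscillating
```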
Evgenii
On 01.03.2012 02:13 Alberto G.Corona said the following:
> Dear Stephen,
> A thing that I often ask myself concerning the MMH is the question about what is
> mathematical and what is not. The set of real numbers is a mathematical structure, but
> so is the set of real numbers plus the point (1,1) in the plane. The set of randomly
> chosen numbers {1.4, 3.4, .34, 3} is one too, because it can be described with the same
> descriptive language of math. But the first of these structures has properties and the
> others do not. The first can be infinite but can be described with a single equation,
> while the last must be described extensively. At least some random universes (the
> finite ones) can be described extensively with the tools of mathematics, but they don't
> count in the intuitive sense as mathematical.
> What is usually considered genuinely mathematical is any structure that can be
> described briefly. Also it must have good properties, operations, symmetries or
> isomorphisms with other structures, so the structure can be navigated and related with
> other structures and the knowledge can be reused. These structures have a low
> Kolmogorov complexity, so they can be "navigated" with low computing resources.
> So the demand of computation in each living being forces us to admit that universes too
> random or too simple, with no linear or discontinuous macroscopic laws, have no complex
> spatio-temporal volutes (which may be the aspect of life as seen from outside our
> four-dimensional universe). The macroscopic laws are the macroscopic effects of the
> underlying mathematical structures with which our universe is isomorphic (or identical).
> And our very notion of what is intuitively considered mathematical, "something general,
> simple and powerful enough", has the hallmark of scarcity of computation resources.
> (And absence of contradictions fits in the notion of simplicity, because exceptions to
> rules have to be memorized and dealt with extensively, one by one.)
> Perhaps it is not only that way; it may even be that the absence of contradictions (the
> main rule of simplicity) or, in computational terms, the rule of low Kolmogorov
> complexity _creates_ the mathematics itself. That is, for example, it may be that
> boolean logic is what it is not because it is consistent, simple and beautiful, but
> because it is the shortest logic in terms of the length of the description of its
> operations, and this is the reason we perceive it as simple and beautiful and
> consistent.