For Evgenii: the-unavoidable-cost-of-computation-revealed


Russell Standish

Mar 12, 2012, 8:43:40 PM
to everyth...@googlegroups.com
http://www.nature.com/news/the-unavoidable-cost-of-computation-revealed-1.10186

This is about experimentally testing Landauer's principle that
computation has thermodynamic constraints.
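
For scale, Landauer's bound is kT ln 2 of heat per erased bit. A quick
back-of-envelope check in Python (assuming room temperature, T = 300 K):

import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed room temperature, K

# Minimum heat dissipated when one bit of information is erased
E_bit = k_B * T * math.log(2)
print(f"kT ln 2 at {T:.0f} K = {E_bit:.2e} J per bit")   # ~2.87e-21 J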

--

----------------------------------------------------------------------------
Prof Russell Standish                    Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics        hpc...@hpcoders.com.au
University of New South Wales            http://www.hpcoders.com.au
----------------------------------------------------------------------------

Bruno Marchal

Mar 13, 2012, 9:48:36 AM
to everyth...@googlegroups.com

On 13 Mar 2012, at 01:43, Russell Standish wrote:

> http://www.nature.com/news/the-unavoidable-cost-of-computation-revealed-1.10186
>
> This is about experimentally testing Landauer's principle that
> computation has thermodynamic constraints.


I was worried a bit by that title, thinking Landauer's principle had
been refuted, but on the contrary, it is confirmed.

The title is misleading, because it makes it look like computation
itself would need energy; the energy is needed only for erasing
information, and, as a paper by Hao Wang showed, (universal, and thus
all) computations can be done without ever erasing information.

This also confirms that we can transform information into energy, and
we can transform energy into information, and perhaps a quantum
nuclear computing machine can appear on the horizon.

Note that the self-duplication, à la W/M, produces one first person
bit of information ("W or M"), which suggests that consciousness might
always be associated with an energy => information flow. Energy, mass
and information would be, and probably are, different aspects of the
self-multiplication in arithmetic, as it should be by the UDA.

The problem is that such matter, even a vacuum, should contain an
infinite amount of energy; but similar problems occur in quantum field
theory, where they led to renormalization theories, and we can expect
important renormalization in the comp physics too.

Last, but not least, this also confirms Bennett's analysis of
Maxwell's demon. I think.

We are not just machines, we are steam engines.
All we need is a bit of heat, to *forget* taxes and death ...

Thanks for the link, Russell.

Bruno


http://iridia.ulb.ac.be/~marchal/

John Clark

Mar 13, 2012, 10:38:05 AM
to everyth...@googlegroups.com
On Tue, Mar 13, 2012  Bruno Marchal <mar...@ulb.ac.be> wrote:

> It makes it look like computation itself would need energy; the energy is needed only for erasing information, and, as a paper by Hao Wang showed, (universal, and thus all) computations can be done without ever erasing information.

Yes, with a reversible computer information is not erased. You'd still
need energy, but the amount needed to make a calculation can be made
arbitrarily small; the only way to do that, however, is to slow the
calculation down. Fortunately even a small reduction in speed helps a
lot with energy saving: the power dissipation (per unit of time) falls
as the square of the speed. We won't know if that's good enough to
allow infinite computation until we know more about the basic laws of
physics and cosmology, and it may not be possible to learn that from
mathematics alone.
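
A minimal sketch of that scaling, assuming the usual adiabatic-switching
model in which the energy dissipated per operation is proportional to
the clock rate f:

# Reversible (adiabatic) switching: energy lost per operation ~ f,
# so power = (energy per op) x (ops per second) ~ f * f = f^2.
def dissipated_power(f, c=1.0):
    """Relative power dissipation at clock rate f (arbitrary units)."""
    energy_per_op = c * f    # slower clock -> less dissipation per op
    return energy_per_op * f

print(dissipated_power(1.0))   # 1.0
print(dissipated_power(0.5))   # 0.25: half the speed, a quarter the power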

Today's computers are not reversible because there is no reason for
them to be; in our big clunking machines the theoretical limits from
the thermodynamics of computation are just not important compared with
other factors. But a nanocomputer would almost have to be reversible;
otherwise it would generate so much heat it would be more like a bomb
than a computer.

  John K Clark






Evgenii Rudnyi

Mar 13, 2012, 11:29:52 AM
to everyth...@googlegroups.com
Russell,

Thanks for the link. Yet it is unclear to me what exactly information
and computation are in this experiment. To be more precise, let us
take physicalism:

http://plato.stanford.edu/entries/physicalism/

This is probably what I believe right now, although, as I have
described recently in "Two Mathematicians in a Bunker and Existence of
Pi", I see problems with mathematics under physicalism.

We could repeat the same experiment with two computer scientists in a
bunker who discuss information and computation. The question is what
happens, under the physicalism hypothesis, to information and
computation in the bunker. Do we have information and computation in
the bunker while the two computer scientists are still alive? Do we
have information and computation there when they are dead?

Evgenii
--
Two Mathematicians in a Bunker and Existence of Pi
http://blog.rudnyi.ru/2012/03/two-mathematicians-in-a-bunker.html



meekerdb

Mar 13, 2012, 12:53:43 PM
to everyth...@googlegroups.com
On 3/12/2012 5:43 PM, Russell Standish wrote:
> http://www.nature.com/news/the-unavoidable-cost-of-computation-revealed-1.10186
>
> This is about experimentally testing Landauer's principle that
> computation has thermodynamic constraints.

"This energy consumption is getting ever lower, and Lutz says it’ll be approaching the Landauer limit within the next couple of decades. “Our experiment clearly shows that you cannot go below Landauer’s limit,” says Lutz. “Engineers will soon have to face that.”

Meanwhile, in fledgling quantum computers, which exploit the rules of quantum physics to achieve greater processing power, this limitation is already being confronted. “Logic processing in quantum computers already is well within the Landauer regime,” says physicist Seth Lloyd of the Massachusetts Institute of Technology in Cambridge. “One has to worry about Landauer's principle all the time.”"

But theoretically computation can be done without erasing information, as Feynman already noted.
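
A concrete instance (my example, not from the article): the Toffoli
(controlled-controlled-NOT) gate is universal for Boolean logic yet
bijective on its inputs, so it never discards information:

def toffoli(a, b, c):
    """Reversible CCNOT: flips c iff a and b are both 1."""
    return a, b, c ^ (a & b)

bits = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
# Bijective on 3-bit states: nothing is erased, so Landauer's bound
# imposes no minimum dissipation.
assert sorted(toffoli(*x) for x in bits) == sorted(bits)
print(toffoli(1, 1, 0))   # (1, 1, 1): with c = 0 the third output is AND(a, b)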

Brent

meekerdb

Mar 13, 2012, 1:20:18 PM
to everyth...@googlegroups.com
On 3/13/2012 6:48 AM, Bruno Marchal wrote:
>
> On 13 Mar 2012, at 01:43, Russell Standish wrote:
>
>> http://www.nature.com/news/the-unavoidable-cost-of-computation-revealed-1.10186
>>
>> This is about experimentally testing Landauer's principle that
>> computation has thermodynamic constraints.
>
>
> I was worried a bit by that title, thinking Landauer's principle had
> been refuted, but on the contrary, it is confirmed.
>
> The title is misleading, because it makes it look like computation
> itself would need energy; the energy is needed only for erasing
> information, and, as a paper by Hao Wang showed, (universal, and thus
> all) computations can be done without ever erasing information.
>
> This also confirms that we can transform information into energy,

That's not quite right. We can use information to make energy available for work, i.e.
reduce entropy. The energy is already there, it's just not accessible for useful work.
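
The textbook illustration is Szilard's engine: knowing which half of a
box a single molecule occupies - one bit - lets you extract at most

    W_max = k_B * T * ln 2    (about 2.9e-21 J at 300 K)

of work from the surrounding heat bath, which is exactly the Landauer
cost of erasing that bit afterwards, so the second law's books balance.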

Brent

> and we can transform energy into information, and perhaps a quantum nuclear computing
> machine can appear on the horizon.
>

> Note that the self-duplication, à la W/M, produces one first person bit of information

Evgenii Rudnyi

Mar 13, 2012, 1:28:27 PM
to everyth...@googlegroups.com
On 13.03.2012 18:20 meekerdb said the following:

> On 3/13/2012 6:48 AM, Bruno Marchal wrote:
>>
>> On 13 Mar 2012, at 01:43, Russell Standish wrote:
>>
>>> http://www.nature.com/news/the-unavoidable-cost-of-computation-revealed-1.10186
>>>
>>>
>>> This is about experimentally testing Landauer's principle that
>>> computation has thermodynamic constraints.
>>
>>
>> I was worried a bit by that title, thinking Landauer's principle
>> had been refuted, but on the contrary, it is confirmed.
>>
>> The title is misleading, because it makes it look like computation
>> itself would need energy; the energy is needed only for erasing
>> information, and, as a paper by Hao Wang showed, (universal, and thus
>> all) computations can be done without ever erasing information.
>>
>> This also confirms that we can transform information into energy,
>
> That's not quite right. We can use information to make energy available
> for work, i.e. reduce entropy. The energy is already there, it's just
> not accessible for useful work.
>
> Brent

Could you please give one example from physics (not a thought
experiment, please) where information allows us to reduce entropy?

As far as I remember, you have said previously that information in
physics is entropy, so your statement now looks a bit strange.

Evgenii

meekerdb

Mar 13, 2012, 3:09:35 PM
to everyth...@googlegroups.com
On 3/13/2012 10:28 AM, Evgenii Rudnyi wrote:
> On 13.03.2012 18:20 meekerdb said the following:
>> On 3/13/2012 6:48 AM, Bruno Marchal wrote:
>>>
>>> On 13 Mar 2012, at 01:43, Russell Standish wrote:
>>>
>>>> http://www.nature.com/news/the-unavoidable-cost-of-computation-revealed-1.10186
>>>>
>>>>
>>>> This is about experimentally testing Landauer's principle that
>>>> computation has thermodynamic constraints.
>>>
>>>
>>> I was worried a bit by that title, thinking Landauer's principle
>>> had been refuted, but on the contrary, it is confirmed.
>>>
>>> The title is misleading, because it makes it look like computation
>>> itself would need energy; the energy is needed only for erasing
>>> information, and, as a paper by Hao Wang showed, (universal, and thus
>>> all) computations can be done without ever erasing information.
>>>
>>> This also confirms that we can transform information into energy,
>>
>> That's not quite right. We can use information to make energy available
>> for work, i.e. reduce entropy. The energy is already there, it's just
>> not accessible for useful work.
>>
>> Brent
>
> Could you please give one example from physics (not a thought experiment, please)
> where information allows us to reduce entropy?

http://www.nature.com/news/2010/101114/full/news.2010.606.html

>
> As far as I remember, you have said previously that information in physics is entropy,

I think your memory is wrong. Please cite where I said that.

Brent

Evgenii Rudnyi

Mar 13, 2012, 3:26:12 PM
to everyth...@googlegroups.com
On 13.03.2012 20:09 meekerdb said the following:

> On 3/13/2012 10:28 AM, Evgenii Rudnyi wrote:

...

>>
>> Could you please give one example from physics (not a thought
>> experiment, please) where information allows us to reduce entropy?
>
> http://www.nature.com/news/2010/101114/full/news.2010.606.html

Thanks a lot. I will look at this.

>>
>> As far as I remember, you have said previously that information in
>> physics is entropy,
>
> I think your memory is wrong. Please cite where I said that.

In my collection I have this quote, for example:

http://blog.rudnyi.ru/2012/01/information.html

25.01.2012 21:25 Brent: “The thermodynamic entropy is a measure of the
information required to locate the possible states of the plates in the
phase space of atomic configurations constituting them. Note that the
thermodynamic entropy you quote is really the *change* in entropy per
degree at the given temperature. It’s a measure of how much more phase
space becomes available to the atomic states when the internal energy is
increased. More available phase space means more uncertainty of the
exact actual state and hence more information entropy. This information
is enormous compared to the “01” stamped on the plate, the shape of the
plate or any other aspects that we would normally use to convey
information. It would only be in case we cooled the plate to near
absolute zero and then tried to encode information in its microscopic
vibrational states that the thermodynamic and the encoded information
entropy would become similar.”

Evgenii

meekerdb

Mar 13, 2012, 3:32:28 PM
to everyth...@googlegroups.com

Yes, that clearly states that entropy is equal to the information that would be required
to eliminate the uncertainty as to the exact state in phase space. It's *the missing*
information when you only specify the thermodynamic variables. So what is strange about
that? Dollars are a measure of debt, but that doesn't mean you have a lot of dollars when
you have a lot of debt.

Brent

Stephen P. King

Mar 13, 2012, 3:35:07 PM
to everyth...@googlegroups.com
On 3/13/2012 9:48 AM, Bruno Marchal wrote:
>
> On 13 Mar 2012, at 01:43, Russell Standish wrote:
>
>> http://www.nature.com/news/the-unavoidable-cost-of-computation-revealed-1.10186
>>
>>
>> This is about experimentally testing Landauer's principle that
>> computation has thermodynamic constraints.
>
>
> I was worried a bit by that title, thinking Landauer's principle
> had been refuted, but on the contrary, it is confirmed.
>
> The title is misleading, because it makes it look like computation
> itself would need energy; the energy is needed only for erasing
> information, and, as a paper by Hao Wang showed, (universal, and thus
> all) computations can be done without ever erasing information.
>
> This also confirms that we can transform information into energy, and
> we can transform energy into information, and perhaps a quantum
> nuclear computing machine can appear on the horizon.
>
> Note that the self-duplication, à la W/M, produces one first person
> bit of information ("W or M"), which suggests that consciousness might
> always be associated with an energy => information flow. Energy, mass
> and information would be, and probably are, different aspects of the
> self-multiplication in arithmetic, as it should be by the UDA.
>
> The problem is that such matter, even a vacuum, should contain an
> infinite amount of energy; but similar problems occur in quantum field
> theory, where they led to renormalization theories, and we can expect
> important renormalization in the comp physics too.
>
> Last, but not least, this also confirms Bennett's analysis of
> Maxwell's demon. I think.
>
> We are not just machines, we are steam engines.
> All we need is a bit of heat, to *forget* taxes and death ...
>
> Thanks for the link, Russell.
>
> Bruno
>
>
> http://iridia.ulb.ac.be/~marchal/
>
Dear Bruno,

Forgive me, but there is one resource without which computation
cannot occur: memory. Without the genus invariant manifolds upon which
the machines can write their equations, the machines cannot run, just
as without our chalkboards and computer screens we cannot do our
calculations.

Onward!

Stephen

Evgenii Rudnyi

Mar 13, 2012, 3:44:47 PM
to everyth...@googlegroups.com

How does this differ from what I said previously? Entropy and
information are related; that is, if I know the entropy, I can infer
the information and vice versa, so in essence entropy is information.

Evgenii

meekerdb

Mar 13, 2012, 3:59:37 PM
to everyth...@googlegroups.com

But the thermodynamic entropy, the quantity you get from the JANAF
tables, is the *missing* information when you specify only the
thermodynamic variables. If you specify more variables, less will be
missing and the entropy will be lower. If you specified the exact state
of every atom, the entropy of the system would be zero. So the two are
not the same; they are complementary, like debt and wealth: both are
measured in money, but more of one means less of the other.
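
To put a number on it (using the standard molar entropy of O2, about
205 J/(mol K) - value quoted from memory):

import math

R = 8.314462618    # gas constant, J/(mol*K); R = N_A * k_B
S_molar = 205.0    # approximate standard molar entropy of O2, J/(mol*K)

# Missing information per molecule, in bits: S / (N_A * k_B * ln 2)
bits_per_molecule = S_molar / (R * math.log(2))
print(f"~{bits_per_molecule:.0f} bits of missing information per molecule")

That's about 36 bits per molecule - enormous compared with anything you
could stamp on the plate.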

Brent

Bruno Marchal

Mar 14, 2012, 11:08:52 AM
to everyth...@googlegroups.com

On 13 Mar 2012, at 18:20, meekerdb wrote:

> On 3/13/2012 6:48 AM, Bruno Marchal wrote:
>>
>> On 13 Mar 2012, at 01:43, Russell Standish wrote:
>>
>>> http://www.nature.com/news/the-unavoidable-cost-of-computation-revealed-1.10186
>>>
>>> This is about experimentally testing Landauer's principle that
>>> computation has thermodynamic constraints.
>>
>>
>> I was worried a bit by that title, thinking Landauer's principle
>> had been refuted, but on the contrary, it is confirmed.
>>
>> The title is misleading, because it makes it look like computation
>> itself would need energy; the energy is needed only for erasing
>> information, and, as a paper by Hao Wang showed, (universal, and thus
>> all) computations can be done without ever erasing information.
>>
>> This also confirms that we can transform information into energy,
>
> That's not quite right. We can use information to make energy
> available for work, i.e. reduce entropy. The energy is already
> there, it's just not accessible for useful work.

Yes, you are probably right, for thermodynamics. Assuming comp and its
'reversal' consequence, what is "energy" is an open problem. It looks
like a constant obtained from the high symmetry of the core bottom
physical reality. But it seems infinite, intuitively. I guess we need
the full (first order) solution of the measure problem to say more.

Bruno

http://iridia.ulb.ac.be/~marchal/

Evgenii Rudnyi

Mar 14, 2012, 2:51:13 PM
to everyth...@googlegroups.com
On 13.03.2012 20:59 meekerdb said the following:

> On 3/13/2012 12:44 PM, Evgenii Rudnyi wrote:
>> On 13.03.2012 20:32 meekerdb said the following:
>>> On 3/13/2012 12:26 PM, Evgenii Rudnyi wrote:

...

Then the thermodynamic entropy is subjective. Try to convince
engineers who develop engines, or chemists who compute equilibria, of
this, and see what happens.

I will read the paper that you found (though it may take some time
until I find time for it). Let me come back to your definition then.

Evgenii

meekerdb

Mar 14, 2012, 2:58:59 PM
to everyth...@googlegroups.com

It is relative not just to the information but to the use of that
information. Even if you told an engineer designing a steam turbine the
position and momentum of each molecule of steam, he would ignore it,
because he has no practical way of using it to take advantage of the
lower entropy that is in principle available. He has no way to flex and
deform the turbine blades billions of times per second in order to get
more power from the steam. The experiment I linked to is extremely
simple, so that it is possible to use the information.

Brent

Russell Standish

Mar 14, 2012, 6:34:47 PM
to everyth...@googlegroups.com
On Wed, Mar 14, 2012 at 07:51:13PM +0100, Evgenii Rudnyi wrote:
>
> Then the thermodynamic entropy is subjective. Try to convince
> engineers who develop engines, or chemists who compute
> equilibria, of this, and see what happens.

I take Denbigh & Denbigh's position that entropy is not subjective, but
rather fixed by convention. Conventions can be entirely
objective. This should assuage those engineers you speak of.

Evgenii Rudnyi

Mar 15, 2012, 2:25:01 PM
to everyth...@googlegroups.com
On 14.03.2012 23:34 Russell Standish said the following:

> On Wed, Mar 14, 2012 at 07:51:13PM +0100, Evgenii Rudnyi wrote:
>>
>> Then the thermodynamic entropy is subjective. Try to convince
>> engineers who develop engines, or chemists who compute
>> equilibria, of this, and see what happens.
>
> I take Denbigh & Denbigh's position that entropy is not subjective,
> but rather fixed by convention. Conventions can be entirely
> objective. This should assuage those engineers you speak of.
>
>

Could you please explain a bit more what you mean? What does it mean to
fix something by convention?

Evgenii

Russell Standish

Mar 15, 2012, 5:58:34 PM
to everyth...@googlegroups.com

We take certain macro variables as thermodynamic state variables,
rather than others. A Laplace daemon would not agree with that.

It's better explained in Denbigh & Denbigh, but Brent Meeker has also
been making the same point.

Cheers

Evgenii Rudnyi

Mar 25, 2012, 9:44:20 AM
to everyth...@googlegroups.com
On 14.03.2012 19:58 meekerdb said the following:

> On 3/14/2012 11:51 AM, Evgenii Rudnyi wrote:

...

>> Then the thermodynamic entropy is subjective. Try to convince engineers
>> who develop engines, or chemists who compute equilibria, of this, and
>> see what happens.
>
> It is relative not just to the information but to the use of that
> information. Even if you told an engineer designing a steam turbine the
> position and momentum of each molecule of steam he would ignore it
> because he has no practical way of using it to take advantage of the
> lower entropy that is in principle available. He has no way to flex and
> deform the turbine blades billions of times per second in order to get
> more power from the steam. The experiment I linked to is extremely
> simple so that it is possible to use the information.
>
> Brent
>

I have looked at the paper that you linked:

On 13.03.2012 20:09 meekerdb said the following:
> On 3/13/2012 10:28 AM, Evgenii Rudnyi wrote:
...
>> Could you please give one example from physics (not a thought
>> experiment, please) where information allows us to reduce entropy?
>
> http://www.nature.com/news/2010/101114/full/news.2010.606.html
>

Experimental demonstration of information-to-energy conversion and
validation of the generalized Jarzynski equality
Shoichi Toyabe, Takahiro Sagawa, Masahito Ueda, Eiro Muneyuki & Masaki Sano
Nature Physics, Volume: 6, Pages: 988–992 (2010)
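
For reference, the generalized Jarzynski equality that the paper is
supposed to validate reads, if I reproduce it correctly,

    < exp( -(W - ΔF) / k_B T ) > = γ,

where W is the work done on the system, ΔF is the free-energy
difference, and γ is the "efficacy" of the feedback; without feedback
γ = 1 and the ordinary Jarzynski equality is recovered.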

I should say that I am not impressed. One can indeed make a feedback
mechanism (by the way, it is quite common in engineering), but then, in
my view, we should consider the whole system at once. What, then, is
the information, and what is its relationship with the entropy of the
whole system?

By the way, the information about the position of the bead has nothing
to do with its entropy. This is exactly what happens in any feedback
system. One can introduce information, especially with digital control,
but it has nothing to do with the thermodynamic entropy.

Then I like this:

"In microscopic systems, thermodynamic quantities such as work, heat and
internal energy do not remain constant".

The authors seem to forget that work and heat are not state functions.
How could work and heat remain constant even in a macroscopic system?

I also find the assumption at the beginning of the paper

"Note that, in the ideal case, energy to place the block can be
negligible; this implies that the particle can obtain free energy
without any direct energy injection."

funny. After the block is there, the particle will jump in the
direction of the block and will interact with it. This interaction will
force the particle to jump in the other direction, and I would say the
energy is there. The authors should have defined better what they mean
by direct energy injection.

In essence, in my view the title "information-to-energy conversion" is
a word game. It could work when, instead of considering the whole
system in question, one concentrates on a small subsystem. Say, if I
consider a thermostat, then I could also say that information about the
current temperature is transferred to the heater and thus converted to
energy. I am not sure this makes sense, though.

Evgenii

Evgenii Rudnyi

Mar 25, 2012, 9:49:17 AM
to everyth...@googlegroups.com
On 15.03.2012 22:58 Russell Standish said the following:

> On Thu, Mar 15, 2012 at 07:25:01PM +0100, Evgenii Rudnyi wrote:
>> On 14.03.2012 23:34 Russell Standish said the following:
>>> On Wed, Mar 14, 2012 at 07:51:13PM +0100, Evgenii Rudnyi wrote:
>>>>
>>>> Then the thermodynamic entropy is subjective. Try to convince
>>>> engineers who develop engines, or chemists who compute
>>>> equilibria, of this, and see what happens.
>>>
>>> I take Denbigh & Denbigh's position that entropy is not subjective,
>>> but rather fixed by convention. Conventions can be entirely
>>> objective. This should assuage those engineers you speak of.
>>>
>>>
>>
>> Could you please explain a bit more what you mean? What does it mean
>> to fix something by convention?
>>
>> Evgenii
>>
>
> We take certain macro variables as thermodynamic state variables,
> rather than others. A Laplace daemon would not agree with that.
>
> It's better explained in Denbigh & Denbigh, but Brent Meeker has also
> been making the same point.
>
> Cheers
>

Do you mean that the Laplace daemon would not agree with the Second Law?

Evgenii

Russell Standish

Mar 25, 2012, 5:19:52 PM
to everyth...@googlegroups.com
On Sun, Mar 25, 2012 at 03:49:17PM +0200, Evgenii Rudnyi wrote:
>
> Do you mean that the Laplace daemon would not agree with the Second Law?
>
> Evgenii
>

Yes - there is no second law for the Laplace daemon. Each microstate
is distinct and equiprobable.
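
To spell it out in Boltzmann's terms: S = k_B ln W, and the number of
microstates W compatible with what the daemon knows is always exactly 1,
so S = 0 identically. There is no entropy left to increase.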

meekerdb

Mar 25, 2012, 6:44:13 PM
to everyth...@googlegroups.com
> Nature Physics, Volume: 6, Pages: 988–992 (2010)

>
> I should say that I am not impressed. One can indeed make a feedback mechanism (by the
> way, it is quite common in engineering), but then, in my view, we should consider the
> whole system at once. What, then, is the information, and what is its relationship with
> the entropy of the whole system?

What you asked for was an example of using information to reduce entropy: not obtaining
information AND using it to reduce entropy.

"The experiment does not actually violate the second law of thermodynamics, because in the
system as a whole, energy must be consumed by the equipment � and the experimenters � to
monitor the bead and switch the voltage as needed."

>
> By the way, the information about the position of the bead has nothing to do with its
> entropy.

It has to do with the entropy of the system of bead plus medium. The
rotating bead could be used to do mechanical work via energy extracted
from the random motion of the molecules in the medium. This is Gibbs
free energy, so the bead plus medium plus information has a lower
entropy than just the bead plus medium.

> This is exactly what happens in any feedback system. One can introduce information,
> especially with digital control, but it has nothing to do with the thermodynamic entropy.

Because it is not extracting energy from random molecular motion, aka "heat".


>
> Then I like
>
> "In microscopic systems, thermodynamic quantities such as work, heat and internal energy
> do not remain constant".
>
> The authors seem to forget that work and heat are not state functions. How could work
> and heat remain constant even in a macroscopic system?

They don't remain constant, but their statistical fluctuations are very small compared to
their absolute value. Of course if you had information about these fluctuations you could
use it to extract energy and decrease the entropy of the system.
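
Rough scale (the standard 1/sqrt(N) estimate):

import math

N = 6.022e23   # roughly one mole of molecules
print(f"relative fluctuation ~ 1/sqrt(N) = {1 / math.sqrt(N):.1e}")   # ~1.3e-12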

>
> I also find the assumption at the beginning of the paper
>
> "Note that, in the ideal case, energy to place the block can be negligible; this implies
> that the particle can obtain free energy
> without any direct energy injection."
>
> funny. After the block is there, the particle will jump in the direction of the block and
> will interact with it. This interaction will force the particle to jump in the
> other direction

The molecular motion of the medium forces it to jump one way or the
other at random; the information is used to keep it from jumping back.
So the work is extracted from the heat energy of the medium, not from
the interaction with the blocks.

> and I would say the energy is there. The authors should have defined better what they
> mean by direct energy injection.
>
> In essence, in my view the title "information-to-energy conversion" is a word game.
> It could work when, instead of considering the whole system in question, one concentrates
> on a small subsystem.

Any demonstration of the principle is going to concentrate on a small
system, because it is impossible to use information about 1e26
molecules. And of course it will be a "subsystem" in the sense that
some other device has to be used to get the information; and if that
device is included as part of a closed system, then the 2nd law will
apply - since it applies to closed systems.

You seem to be arguing against claims that were not made by saying a laboratory
demonstration isn't a practical application.

Brent

Evgenii Rudnyi

Mar 27, 2012, 2:26:27 PM
to everyth...@googlegroups.com
Brent,

I have nothing against fundamental science, and I do not expect a
practical application from this paper.

Yet I do not see fundamental results; what is in the paper is just a
change of vocabulary. I would say that we are free to choose a
definition. Well, right now, when free will is in question, such a
statement might be ill-posed, but I guess you understand what I mean.

Let me start with "extracting energy from random molecular motion".
Let us consider the following example. A macroscopic ball is flying in
one direction. We suddenly erect a potential barrier in its way, and it
flies back after the collision with this potential wall. Do we inject
energy into the system to change the ball's trajectory or not? Could
you please compare this example with the experiment described in the
paper? What is the difference between the wall in the example and the
potential walls in the experiment?

My point above is that I am not yet convinced that the energy in the
experiment is extracted from random molecular motion. It might be
possible to state this, but then, in my view, some change in the normal
vocabulary is needed. The paper takes this for granted. Hence, I am not
convinced.

Then

"What you asked for was an example of using information to reduce
entropy: not obtaining information AND using it to reduce entropy."

What do you mean here? I see two statements

"using information to reduce entropy"

and

"obtaining information AND using it to reduce entropy"

What, in your view, has been done in the paper, and what difference do
you see between these two statements?

Finally when I have quoted a statement from the paper

"In microscopic systems, thermodynamic quantities such as work, heat and
internal energy do not remain constant"

I meant the following. A thermodynamic system has an internal energy,
a Gibbs energy, an entropy and other state functions. However, a
thermodynamic system does not possess work or heat; they are not state
functions. A thermodynamic system can perform work or produce heat when
it goes from one state to another. Hence the statement above, as such,
is just sloppy.

Evgenii



meekerdb

Mar 27, 2012, 4:50:47 PM
to everyth...@googlegroups.com
On 3/27/2012 11:26 AM, Evgenii Rudnyi wrote:
> Brent,
>
> I have nothing against fundamental science, and I do not expect a practical application
> from this paper.
>
> Yet I do not see fundamental results; what is in the paper is just a change of
> vocabulary. I would say that we are free to choose a definition. Well, right now, when
> free will is in question, such a statement might be ill-posed, but I guess you
> understand what I mean.
>
> Let me start with "extracting energy from random molecular motion". Let us consider the
> following example. A macroscopic ball is flying in one direction. We suddenly erect a
> potential barrier in its way, and it flies back after the collision with this potential
> wall. Do we inject energy into the system to change the ball's trajectory or not?

Not.

> Could you please compare this example with the experiment described in the paper? What
> is the difference between the wall in the example and potential walls in the experiment?

To be like the experiment, the ball would have its trajectory changed
again by the random heat energy of the medium.

>
> My point above is that I am not yet convinced that the energy in the experiment is
> extracted from random molecular motion.

As I read it, no energy was actually extracted; it was a demonstration
of principle. In principle one could have a tiny shaft attached to the
bead so that, as it rotated, the shaft could be used to do work. But of
course this is impractical.

> It might be possible to state this, but then, in my view, some change in the normal
> vocabulary is needed. The paper takes this for granted. Hence, I am not
> convinced.
>
> Then
>
> "What you asked for was an example of using information to reduce entropy: not obtaining
> information AND using it to reduce entropy."
>
> What do you mean here? I see two statements
>
> "using information to reduce entropy"
>
> and
>
> "obtaining information AND using it to reduce entropy"
>
> What, in your view, has been done in the paper, and what difference do you see between
> these two statements?

The latter is of course what is done in the paper. The difference is
that if you include the obtaining of the information in the balance
sheet, that costs free energy; so even though you use information to
gain free energy, the second law is still upheld for the whole system.
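
In symbols, as I recall the theory behind the experiment: with feedback
based on a measured mutual information I, the extractable work is
bounded by

    W_extracted <= -ΔF + k_B T I,

while acquiring and later erasing that information costs at least
k_B T I, so over the full cycle of measurement, feedback and erasure
the second law survives.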

>
> Finally when I have quoted a statement from the paper
>
> "In microscopic systems, thermodynamic quantities such as work, heat and internal energy
> do not remain constant"
>
> I meant the following. A thermodynamic system has an internal energy, a Gibbs
> energy, an entropy and other state functions. However, a thermodynamic system does not
> possess work or heat; they are not state functions. A thermodynamic system can perform
> work or produce heat when it goes from one state to another. Hence the statement above,
> as such, is just sloppy.

Complain to the authors.

Brent

Evgenii Rudnyi

Mar 28, 2012, 2:16:42 PM
to everyth...@googlegroups.com
Brent,

Thank you for your answer. I have thought about it more, and I believe
that I now understand the paper better.

I would agree that an ideal potential barrier, provided it is created
intelligently, does not inject energy into either a micro- or a
macrosystem. Well, if a potential barrier has some thickness, then when
we insert it, it has to push the medium away. Also, if we do not know
the position of the bead exactly, it might well be that the wall pushes
the bead directly. Hence one cannot exclude that the potential wall
injects energy as well, but presumably one can neglect it.

Still, I do not understand exactly how to describe the influence of the
wall on the physical system. In the ideal case, it does not change the
energy of the system, but it definitely changes the momentum in the
case of the ball. In a microsystem, provided the wall goes through the
medium only, the momentum could stay the same, though, as the changes
from the two sides of the wall might cancel each other. It could be.

In any case, what happens with information is more interesting. I also
agree that in this case the information is processed by the controller;
that is, there are some measurements, the results go into the
controller, and after some processing it takes an action.

Therefore, in my view, the title of the paper, "information-to-energy
conversion", is misleading. By the way, the authors are talking about
energy, not entropy.

What happens is that we have a multidomain system with different
interactions between different subsystems. Using some very specific
vocabulary, one can presumably find a meaning in a statement such as
"information-to-energy conversion" (or, if you want, "information-to-
entropy conversion"). As I have already mentioned, this could work if
we limit the analysis to one subsystem of the whole system. Yet then
information will be context dependent, so I am not sure that it will be
possible to extract a strict definition of information as a property of
a physical system from such an experiment.

Again, I have nothing against the experiment as such; it looks
interesting. What is missing is a good theoretical analysis in which
one starts from the whole system, including the controller (I guess
there are computations there), and writes down all the assumptions made
to reach the conclusion "information-to-energy conversion". It would be
nice to understand how information emerges from the movements of atoms
and molecules in the whole system, including the controller.

Evgenii


