Sep 6, 2022, 4:57:40 AM

Arthur Eddington: "The law that entropy always increases—the Second Law of Thermodynamics—holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations—then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation—well these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation." https://todayinsci.com/E/Eddington_Arthur/EddingtonArthur-Entropy-Quotations.htm

Athel Cornish-Bowden: "The concept of entropy was introduced to thermodynamics by Clausius, who deliberately chose an obscure term for it, wanting a word based on Greek roots that would sound similar to "energy". In this way he hoped to have a word that would mean the same to everyone regardless of their language, and, as Cooper [2] remarked, he succeeded in this way in finding a word that meant the same to everyone: NOTHING. From the beginning it proved a very difficult concept for other thermodynamicists, even including such accomplished mathematicians as Kelvin and Maxwell; Kelvin, indeed, despite his own major contributions to the subject, never appreciated the idea of entropy [3]. The difficulties that Clausius created have continued to the present day, with the result that a fundamental idea that is absolutely necessary for understanding the theory of chemical equilibria continues to give trouble, not only to students but also to scientists who need the concept for their work." https://www.beilstein-institut.de/download/712/cornishbowden_1.pdf

Scientific American: "When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one knows what entropy really is, so in a debate you will always have the advantage.'" https://www.esalq.usp.br/lepse/imgs/conteudo_thumb/Energy-and-Information.pdf

See more here: https://twitter.com/pentcho_valev

Pentcho Valev

Sep 6, 2022, 5:27:35 PM

"Entropy was discovered when it was noticed to be a quantity that behaves as a function of state, as a consequence of the second law of thermodynamics." https://ipfs.io/ipfs/QmXoypizjW3WknFiJnKLwHCnL72vedxjQkDDP1mXWo6uco/wiki/Entropy.html

It was Clausius who "noticed" that. Here is the story:

If you define the entropy S as a quantity obeying the equation dS = dQ_rev/T, you will find that, so defined, the entropy is a state function FOR AN IDEAL GAS. Clausius set out to prove that the entropy (so defined) is a state function for ANY system. He based his proof on the assumption that any cycle can be decomposed into small Carnot cycles, and to this day that proof remains the only justification of "Entropy is a state function":
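For the ideal-gas case conceded above, the standard calculation is short (a sketch, assuming a constant heat capacity C_V and the ideal-gas law pV = nRT):

\[
\frac{dQ_{\mathrm{rev}}}{T} \;=\; \frac{dU + p\,dV}{T} \;=\; nC_V\,\frac{dT}{T} \;+\; nR\,\frac{dV}{V},
\]

which is an exact differential and integrates to \(S(T,V) = nC_V \ln T + nR \ln V + \text{const}\), a function of the state (T, V) alone, independent of path.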

"Carnot Cycles: S is a State Function. Any reversible cycle can be thought of as a collection of Carnot cycles - this approximation becomes exact as cycles become infinitesimal. Entropy change around an individual cycle is zero. Sum of entropy changes over all cycles is zero." http://mutuslab.cs.uwindsor.ca/schurko/introphyschem/lectures/240_l10.pdf

The assumption on which "Entropy is a state function" is based - that any cycle can be subdivided into small Carnot cycles - is obviously false. An isothermal cycle CANNOT be subdivided into small Carnot cycles. A cycle involving the action of conservative forces CANNOT be subdivided into small Carnot cycles.
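In symbols, the isothermal objection reads as follows (a restatement of the claim above, not a new argument): for a cycle traversed entirely at a single temperature T_0,

\[
\oint \frac{dQ}{T} \;=\; \frac{1}{T_0}\oint dQ,
\]

whereas a Carnot cycle by definition operates between two distinct temperatures T_h > T_c, so, the argument goes, no Carnot sub-cycle can be inscribed in such a cycle.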

Conclusion: The belief that the entropy is a state function is totally unjustified. Any time scientists use the term "entropy", they don't know what they are talking about.

More here: https://twitter.com/pentcho_valev

Pentcho Valev
