
Entropy : the Not Even Wrong Concept That Always Wins


Pentcho Valev

Sep 6, 2022, 4:57:40 AM
Arthur Eddington: "The law that entropy always increases—the Second Law of Thermodynamics—holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations—then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation—well these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation."

Athel Cornish-Bowden: "The concept of entropy was introduced to thermodynamics by Clausius, who deliberately chose an obscure term for it, wanting a word based on Greek roots that would sound similar to "energy". In this way he hoped to have a word that would mean the same to everyone regardless of their language, and, as Cooper [2] remarked, he succeeded in this way in finding a word that meant the same to everyone: NOTHING. From the beginning it proved a very difficult concept for other thermodynamicists, even including such accomplished mathematicians as Kelvin and Maxwell; Kelvin, indeed, despite his own major contributions to the subject, never appreciated the idea of entropy [3]. The difficulties that Clausius created have continued to the present day, with the result that a fundamental idea that is absolutely necessary for understanding the theory of chemical equilibria continues to give trouble, not only to students but also to scientists who need the concept for their work."

Scientific American: "When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one knows what entropy really is, so in a debate you will always have the advantage.'"

See more here:

Pentcho Valev

Pentcho Valev

Sep 6, 2022, 5:27:35 PM
"Entropy was discovered when it was noticed to be a quantity that behaves as a function of state, as a consequence of the second law of thermodynamics."

It was Clausius who "noticed" that. Here is the story:

If you define the entropy S as a quantity that obeys the equation dS = dQ_rev/T, you will find that, so defined, the entropy is a state function FOR AN IDEAL GAS. Clausius set out to prove that the entropy (so defined) is a state function for ANY system. He based his proof on the assumption that any reversible cycle can be decomposed into small Carnot cycles, and to this day that proof remains the only justification of "Entropy is a state function":
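The ideal-gas half of the claim is easy to verify directly. The sketch below (standard ideal-gas relations, not taken from the post; the numbers chosen are arbitrary illustrative states) integrates dS = dQ_rev/T to the textbook result ΔS = n·Cv·ln(T2/T1) + n·R·ln(V2/V1) and checks numerically that the result is path-independent and vanishes around a closed cycle:

```python
import math

# For n moles of ideal gas, dQ_rev = n*Cv*dT + (n*R*T/V)*dV, so dividing by T
# and integrating gives  S(T2,V2) - S(T1,V1) = n*Cv*ln(T2/T1) + n*R*ln(V2/V1),
# which depends only on the endpoints - the defining property of a state function.

R = 8.314            # J/(mol*K), gas constant
n, Cv = 1.0, 1.5 * R # 1 mole of a monatomic ideal gas (Cv = 3R/2)

def dS(state1, state2):
    """Entropy change of an ideal gas between two (T, V) states."""
    (T1, V1), (T2, V2) = state1, state2
    return n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)

# Two different reversible routes from A to B give the same entropy change...
A, B, C = (300.0, 0.01), (600.0, 0.02), (300.0, 0.02)
path1 = dS(A, B)             # direct
path2 = dS(A, C) + dS(C, B)  # via intermediate state C
print(abs(path1 - path2) < 1e-12)  # True: path-independent

# ...and the change around a closed cycle is zero:
cycle = dS(A, B) + dS(B, C) + dS(C, A)
print(abs(cycle) < 1e-12)  # True
```

Note that dQ_rev itself is not exact (heat is path-dependent); 1/T is the integrating factor that makes it exact for the ideal gas. The disputed step is extending this from the ideal gas to arbitrary systems.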

"Carnot Cycles: S is a State Function. Any reversible cycle can be thought of as a collection of Carnot cycles - this approximation becomes exact as cycles become infinitesimal. Entropy change around an individual cycle is zero. Sum of entropy changes over all cycles is zero."

The assumption on which "Entropy is a state function" rests - that any cycle can be subdivided into small Carnot cycles - is obviously false. An isothermal cycle CANNOT be subdivided into small Carnot cycles. A cycle involving the action of conservative forces CANNOT be subdivided into small Carnot cycles.

Conclusion: The belief that the entropy is a state function is totally unjustified. Any time scientists use the term "entropy", they don't know what they are talking about.

More here:

Pentcho Valev