Discussion:
Not Even Wrong Concepts in Physics: Entropy
Pentcho Valev
2017-06-16 17:13:34 UTC
The following argument is obviously valid:

If there is no evidence that the entropy is a state function for ANY system, then the concept of entropy is not even wrong.

Is there any evidence that the entropy is a state function for ANY system? No. If you define the entropy S as a quantity that obeys the equation dS = dQrev/T, you will find that, so defined, the entropy is a state function FOR AN IDEAL GAS (a quick numerical check is sketched after the quotations below). Clausius was very impressed by this state-function property and set out to prove that the entropy, so defined, is a state function for ANY system. Thus "Entropy is a state function" became a fundamental theorem of thermodynamics. Clausius deduced it from the assumption that any cycle can be decomposed into small Carnot cycles, and to this day that deduction remains the only justification of "Entropy is a state function":

http://mutuslab.cs.uwindsor.ca/schurko/introphyschem/lectures/240_l10.pdf
"Carnot Cycles: S is a State Function. Any reversible cycle can be thought of as a collection of Carnot cycles - this approximation becomes exact as cycles become infinitesimal. Entropy change around an individual cycle is zero. Sum of entropy changes over all cycles is zero."

http://ronispc.chem.mcgill.ca/ronis/chem213/hnd8.pdf
"Entropy Changes in Arbitrary Cycles. What if we have a process which occurs in a cycle other than the Carnot cycle, e.g., the cycle depicted in Fig. 3. If entropy is a state function, cyclic integral of dS = 0, no matter what the nature of the cycle. In order to see that this is true, break up the cycle into sub-cycles, each of which is a Carnot cycle, as shown in Fig. 3. If we apply Eq. (7) to each piece, and add the results, we get zero for the sum."

The assumption on which "Entropy is a state function" is based - that any cycle can be subdivided into small Carnot cycles - is obviously false. An isothermal cycle CANNOT be subdivided into small Carnot cycles: a Carnot cycle requires heat exchange at two different temperatures, while an isothermal cycle takes place at a single temperature. A cycle involving the action of conservative forces CANNOT be subdivided into small Carnot cycles either.

Conclusion: The belief that the entropy is a state function is totally unjustified. Any time scientists use the term "entropy", they don't know what they are talking about.

https://en.wikipedia.org/wiki/History_of_entropy
"My greatest concern was what to call it. I thought of calling it 'information', but the word was overly used, so I decided to call it 'uncertainty'. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons: In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage."

Pentcho Valev
Markus Klyver
2017-06-16 18:26:39 UTC
You obviously don't know what approximations and models are.
Pentcho Valev
2017-06-16 21:06:18 UTC
The version of the second law of thermodynamics stated as "Entropy always increases" (a version which, according to A. Eddington, holds "the supreme position among the laws of Nature") is in fact a theorem deduced by Clausius in 1865:

http://philsci-archive.pitt.edu/archive/00000313/
Jos Uffink, Bluff your Way in the Second Law of Thermodynamics, p. 37: "Hence we obtain: THE ENTROPY PRINCIPLE (Clausius' version) For every nicht umkehrbar [irreversible] process in an adiabatically isolated system which begins and ends in an equilibrium state, the entropy of the final state is greater than or equal to that of the initial state. For every umkehrbar [reversible] process in an adiabatical system, the entropy of the final state is equal to that of the initial state."

Clausius' deduction was based on three postulates:

Postulate 1 (implicit): The entropy is a state function.

Postulate 2: Clausius' inequality (the cyclic integral of dQ/T is less than or equal to zero; formula 10 on p. 33 in Uffink's paper) is correct.

Postulate 3: Any irreversible process can be closed by a reversible process to become a cycle.
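
For orientation, the deduction runs roughly as follows (my paraphrase of the standard argument that Uffink reconstructs), and it shows where each postulate enters:

1. Consider an irreversible process in an adiabatically isolated system, leading from an equilibrium state i to an equilibrium state f; adiabatic isolation means dQ = 0 along this leg.
2. By Postulate 3, close it with a reversible process leading from f back to i, so that the two legs together form a cycle.
3. By Postulate 2, the cyclic integral of dQ/T over this cycle is less than or equal to zero.
4. The irreversible leg contributes nothing (dQ = 0), and by Postulate 1 the reversible leg contributes S(i) - S(f).
5. Hence S(i) - S(f) ≤ 0, that is, S(f) ≥ S(i): "Entropy always increases" (or stays constant).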

All three postulates remain unjustified even today; Postulate 3 is almost obviously false:

Uffink, p.39: "A more important objection, it seems to me, is that Clausius bases his conclusion that the entropy increases in a nicht umkehrbar [irreversible] process on the assumption that such a process can be closed by an umkehrbar [reversible] process to become a cycle. This is essential for the definition of the entropy difference between the initial and final states. But the assumption is far from obvious for a system more complex than an ideal gas, or for states far from equilibrium, or for processes other than the simple exchange of heat and work. Thus, the generalisation to all transformations occurring in Nature is somewhat rash."

Note that, even if Clausius' theorem were correct (it is not), it only holds for "an adiabatically isolated system which begins and ends in an equilibrium state". This means that all applications of "Entropy always increases" to processes that do not begin and end in equilibrium are unjustified (even if the theorem were correct!). Needless to say, scientists couldn't care less about that.

Pentcho Valev
