Straight Dope Message Board

#101
05-03-2019, 03:30 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by Half Man Half Wit That's flat wrong. Take a room of gas, and pile up all the molecules in the left corner; this is an isolated system, and what's gonna happen is that all the gas will soon fill the room, until the whole is filled, which is the state of maximum entropy for the system.
Maybe you and I have different classical definitions of entropy. I quote: "∫ dQ/T = 0". dQ would be the increment of heat exchanged with external reservoirs, and T would be temperature. Heat moving around within a system does not count towards dQ, therefore entropy doesn't change no matter how the particles are configured.

Clausius went on to say that there were such things as irreversible processes (which is in my opinion wrong) and refined the formula to an effective ∫ dQ/T ≥ 0. Even so, both formulas are true.

Clausius said entropy of a closed system can only stay the same or increase. I say it never changes at all. You seem to be saying that it can increase or decrease. Somebody has to be wrong about something.
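To check my reading of the reversible-cycle equality, here is a toy numerical sketch in Python. The reservoir temperatures and heat quantities are my own made-up Carnot-cycle numbers, not anything from the paper; the point is only that the reservoir terms in Σ Q/T cancel exactly for a reversible cycle.

```python
# Numerical check of Clausius's N = sum(Q/T) for a reversible (Carnot) cycle.
# Illustrative, assumed numbers: reservoirs at T_hot = 500 K and T_cold = 300 K.
T_hot, T_cold = 500.0, 300.0      # reservoir temperatures (K)
Q_hot = 1000.0                    # heat taken from the hot reservoir (J)
Q_cold = Q_hot * T_cold / T_hot   # heat rejected, fixed by Carnot efficiency (J)

# Sign convention as in the quoted passage: heat *received by* a reservoir
# counts positive. The hot reservoir loses Q_hot, the cold one gains Q_cold.
N = (-Q_hot) / T_hot + Q_cold / T_cold
print(N)  # 0.0 for a reversible cycle
```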

I've reproduced the relevant section of his paper in the spoiler below[1].

SPOILER:
According to this, the second fundamental theorem in the mechanical theory of heat, which in this form might appropriately be called the theorem of the equivalence of transformations, may be thus enunciated:
If two transformations which, without necessitating any other permanent change, can mutually replace one another, be called equivalent, then the generation of the quantity of heat Q of the temperature t from work, has the equivalence-value
Q/T
and the passage of the quantity of heat Q from the temperature t1 to the temperature t2, has the value
Q(1/T2 - 1/T1),
wherein T is a function of the temperature, independent of the nature of the process by which the transformation is effected.

If, to the last expression, we give the form
Q/T2 - Q/T1,
it is evident that the passage of the quantity of heat Q, from the temperature t1 to the temperature t2, has the same equivalence-value as a double transformation of the first kind, that is to say, the transformation of the quantity Q from heat, at the temperature t1 into work, and from work into heat at the temperature t2. A discussion of the question how far this external agreement is based upon the nature of the process itself would be out of place here; but at all events, in the mathematical determination of the equivalence-value, every transmission of heat, no matter how effected, can be considered as such a combination of two opposite transformations of the first kind.

By means of this rule, it will be easy to find a mathematical expression for the total value of all the transformations of both kinds, which are included in any circular process, however complicated. For instead of examining what part of a given quantity of heat received by a reservoir of heat, during the circular process, has arisen from work, and whence the other part has come, every such quantity received may be brought into calculation as if it had been generated by work, and every quantity lost by a reservoir of heat, as if it had been converted into work. Let us assume that the several bodies K1, K2, K3, &c., serving as reservoirs of heat at the temperatures t1, t2, t3, &c., have received during the process the quantities of heat Q1, Q2, Q3, &c., whereby the loss of a quantity of heat will be counted as the gain of a negative quantity of heat; then the total value N of all the transformations will be
N = Q1/T1 + Q2/T2 + Q3/T3 + &c. = Σ Q/T
It is here assumed that the temperatures of the bodies K1, K2, K3, &c. are constant, or at least so nearly constant, that their variations may be neglected. When one of the bodies, however, either by the reception of the quantity of heat Q itself, or through some other cause, changes its temperature during the process so considerably, that the variation demands consideration, then for each element of heat dQ we must employ that temperature which the body possessed at the time it received it, whereby an integration will be necessary. For the sake of generality, let us assume that this is the case with all the bodies; then the foregoing equation will assume the form
N = ∫ dQ/T,
wherein the integral extends over all the quantities of heat received by the several bodies.

If the process is reversible, then, however complicated it may be, we can prove, as in the simple process before considered, that the transformations which occur must exactly cancel each other, so that their algebraical sum is zero.

For were this not the case, then we might conceive all the transformations divided into two parts, of which the first gives the algebraical sum zero, and the second consists entirely of transformations having the same sign. By means of a finite or infinite number of simple circular processes, the transformations of the first part must admit of being made in an opposite manner, so that the transformations of the second part would alone remain without any other change. Were these transformations negative, i.e. from heat into work, and the transmission of heat from a lower to a higher temperature, then of the two the first could be replaced by transformations of the latter kind, and ultimately transmissions of heat from a lower to a higher temperature would alone remain, which would be compensated by nothing, and therefore contrary to the above principle. Further, were those transformations positive, it would only be necessary to execute the operations in an inverse manner to render them negative, and thus obtain the foregoing impossible case again. Hence we conclude that the second part of the transformations can have no existence.

Consequently the equation
∫ dQ/T = 0
is the analytical expression of the second fundamental theorem in the mechanical theory of heat.

~Max

[1] Clausius, M. R. (1856, August). On a modified Form of the second Fundamental Theorem in the Mechanical Theory of Heat. London, Edinburgh and Dublin Philosophical Magazine and Journal of Science, Series 4, 12(77), 92-98. Retrieved from https://archive.org/stream/londonedi...ge/92/mode/2up

Last edited by Max S.; 05-03-2019 at 03:31 PM.
#102
05-03-2019, 03:40 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by Half Man Half Wit All the cold particles moving right, and the hot particles moving left, is therefore a reduction in entropy---both statistical (since we're going from a macrostate with many microscopic realizations to one with fewer) and in terms of heat transfer (since heat flows from a colder system to a hotter one). As you may recall, this is the setup of Maxwell's demon: he sits at the boundary between both sides, sorting hot particles to one, and cold particles to the other side. The trick is now simply that we don't need the demon at all: the whole thing can happen purely by chance, should all the molecule's velocities align in the right way; which they will, for generic initial conditions, after you've waited long enough.
Emphasis mine. The heat transfer is between sub-systems which were never isolated to begin with. The classical entropy of the overall system is unaffected.

And I thought we said microstatic particles still had to obey classical laws of physics? The movement of particles would not be fundamentally random; it would be absolutely determined by the microstate at any previous instant. You can theoretically create such a lopsided box of cold and hot gases, but only by the demon expending energy outside the system to make it so orderly and expending energy again to remove the barrier.

~Max
#103
05-03-2019, 03:46 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by Half Man Half Wit Simple: there's more ways to increase than decrease the entropy, thus, any given change is more likely to increase it, and hence, on average, the entropy will increase.
You are missing a premise, which I believe to be equiprobability of microstates at every instant, regardless of previous microstates.
• There are more high-entropy* states than low entropy states.
• ?
• Therefore, any given change in microstate is more likely to increase entropy.

~Max
#104
05-03-2019, 04:04 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by Half Man Half Wit Neither. The statistical part comes in because we have incomplete information about the underlying microstate, and thus, can't make certain predictions. It is, in the end, not in any way different or more complicated than assigning a probability of 1/6 to a thrown die showing any given number. Perhaps as an analogy: thermodynamics tells you, after having observed lots of dice throws, that the die shows each number 1/6 of the time; statistical mechanics models the way the dice gets thrown, its possible trajectories, and derives that it's gonna come up with every number once in six throws.
Thermodynamics does not give me a probability distribution for anything, that would be statistical mechanics operating on top of thermodynamic observations. The problem is that you are assigning 1/6 to each dice roll when the underlying classical physics are purely deterministic. That is what makes statistical mechanics a set of heuristics rather than physical laws, so far as strict causality is concerned.

And it could very well be that the fundamental reality is random, but it was my impression from the other thread and your posts here that you do not subscribe to that idea, or at least do not present that argument now.

~Max
#105
05-03-2019, 04:44 PM
 Guest Join Date: Jun 2007 Posts: 6,672
Quote:
 Originally Posted by Max S. Clausius said entropy of a closed system can only stay the same or increase. I say it never changes at all. You seem to be saying that it can increase or decrease. Somebody has to be wrong about something.
Well, that's a settled question: theory predicts both increase and decrease, and both are experimentally well-confirmed.

Quote:
 Originally Posted by Max S. Emphasis mine. The heat transfer is between sub-systems which were never isolated to begin with. The classical entropy of the overall system is unaffected.
This is simply wrong. The 'classical' entropy (which, again, is the very same thing as the entropy of statistical mechanics, so there's a distinction that doesn't make a difference) of a system with an unequal distribution of heat is lower than that of a system in equilibrium.
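To see this concretely, just count microstates. A quick Python sketch (the particle number is illustrative): with N particles split between the two halves of a box, the number of microstates with k particles on the left is C(N, k), and S = ln W in units of k_B.

```python
from math import comb, log

# Boltzmann-style count: N particles distributed over two halves of a box.
# W = C(N, k) microstates have exactly k particles on the left; S = ln W.
N = 100

def S(k):
    """Entropy (in units of k_B) of the macrostate with k particles on the left."""
    return log(comb(N, k))

for k in (0, 10, 30, 50):
    print(f"{k} particles on the left: S = {S(k):.1f} k_B")
# The even split (k = 50) has by far the most microstates, hence the highest
# entropy; piling every particle into one half gives W = 1, i.e. S = 0.
```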

Quote:
 And I thought we said microstatic particles still had to obey classical laws of physics? The movement of particles would not be fundamentally random, it would be absolutely determined by the microstate at any previous instant.
Which is not in conflict with anything I said. Giving a statistical description of a system does not entail that its behavior must be fundamentally probabilistic. A die roll is completely deterministic: at the instant it leaves your hand, it's already determined what number it will show. That doesn't mean saying 'it will show 6 with probability 1/6' is meaningless.

Quote:
 Originally Posted by Max S. You are missing a premise, which I believe to be equiprobability of microstates at every instant, regardless of previous microstates. There are more high-entropy* states than low entropy states. ? Therefore, any given change in microstate is more likely to increase entropy. ~Max
There's nothing missing. Take the die; say that four of the numbers it can show are high-entropy states. Then, each die roll is more likely to end up in a high-entropy state, simply by virtue of there being more of them. That's in no sense in conflict with the fact that the trajectory of the die is completely deterministic once it's left your hand.
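If you like, simulate it. A minimal Python sketch, with the labeling of faces as "high-entropy" being my arbitrary choice for illustration:

```python
import random

random.seed(0)
# Arbitrary labeling for illustration: faces 1-4 count as "high-entropy"
# macrostates, faces 5-6 as "low-entropy".
HIGH = {1, 2, 3, 4}
rolls = [random.randint(1, 6) for _ in range(60_000)]
frac_high = sum(r in HIGH for r in rolls) / len(rolls)
print(round(frac_high, 2))  # close to 4/6 ≈ 0.67
```

Each individual roll is (in the classical picture) fully determined by how the die leaves the hand; the 2/3 frequency emerges purely from there being more high-entropy faces.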

Quote:
 Originally Posted by Max S. Thermodynamics does not give me a probability distribution for anything, that would be statistical mechanics operating on top of thermodynamic observations.
What I meant is, that thermodynamics gives you an effective, ultimately heuristic picture. Statistical mechanics connects this heuristic picture with the exact fundamental dynamics, and tells us how it comes about. At the microphysical level, there are no quantities like temperature or pressure; each is just an effective quantity that derives from more fundamental quantities of the underlying system, such as the kinetic energy of the individual particles. Entropy, likewise, emerges as an effective accounting of the number of states of the more fundamental system that realize the same state of the approximate, coarse-grained thermodynamic picture.
#106
05-03-2019, 05:06 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by Half Man Half Wit There's nothing missing. Take the die; say that four of the numbers it can show are high-entropy states. Then, each die roll is more likely to end up in a high-entropy state, simply by virtue of there being more of them. That's in no sense in conflict with the fact that the trajectory of the die is completely deterministic once it's left your hand.
Your argument is still an enthymeme, and quite possibly a quaternio terminorum too:
• This die has more marked sides than unmarked sides.
• ?
• Therefore, a dice roll is more likely to show a marked side.

~Max

Last edited by Max S.; 05-03-2019 at 05:06 PM. Reason: Hi, Opal!
#107
05-03-2019, 05:17 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528

## The definition of entropy

Quote:
 Originally Posted by Half Man Half Wit Well, that's a settled question: theory predicts both increase and decrease, and both is experimentally well-confirmed.
The theory of statistical mechanics, maybe. But not in classical thermodynamics, to my knowledge.

Quote:
 Originally Posted by Half Man Half Wit This is simply wrong. The 'classical' entropy (which, again, is the very same thing as the entropy of statistical mechanics, so there's a distinction that doesn't make a difference) of a system with an unequal distribution of heat is lower than that of a system in equilibrium.
I had just come to the conclusion that our definitions of classical entropy are different, with cites from the man who coined the word "entropy" (although not in that paper). The definition I cited is a state function on state variables. The definition you give is a function of microscopic variables. According to my definition, entropy will never change (or at least never decrease) in an isolated system. In your definition entropy fluctuates. How can you still say your definition is the same, with all of these differences?

~Max
#108
05-03-2019, 05:17 PM
 Guest Join Date: Jun 2007 Posts: 6,672
Quote:
 Originally Posted by Max S. Your argument is still an enthymeme, and quite easily quaternio terminorum too: This die has more marked sides than unmarked sides. ? Therefore, a dice roll is more likely to show a marked side. ~Max
We should get together for a game of craps sometime.
#109
05-03-2019, 05:33 PM
 Guest Join Date: Jun 2007 Posts: 6,672
Quote:
 Originally Posted by Max S. The theory of statistical mechanics, maybe. But not in classical thermodynamics, to my knowledge.
Because thermodynamics is just an approximate theory, and thus, doesn't always apply. That's what's meant by 'violation of the second law'.

Quote:
 I had just come to the conclusion that our definitions of classical entropy are different, with cites from the man who coined the word "entropy" (although not in that paper). The definition I cited is a state function on state variables. The definition you give is a function of microscopic variables. According to my definition, entropy will never change (or at least never decrease) in an isolated system. In your definition entropy fluctuates. How can you still say your definition is the same, with all of these differences? ~Max
Simple: Clausius was wrong about entropy, and you still are. When he came up with the whole thing, he thought it had certain properties, like being always non-decreasing; later on, a more complete theory was found, and it turns out that entropy doesn't actually have those properties. That's science: theories are formulated, overturned, and refined.

That something can be expressed in terms of state variables doesn't preclude that same thing being expressed on the basis of microscopic variables---just as how temperature can be expressed as the average molecular energy, as well as in terms of internal energy and entropy. Those aren't two different things, but merely two different ways of talking about the same thing. Like the morning star and the evening star both refer to Venus, even though, quite clearly, morning and evening are different things. You can't logically pry them apart; and you can't logically pry thermodynamic entropy and the entropy of statistical mechanics apart.

Last edited by Half Man Half Wit; 05-03-2019 at 05:34 PM.
#110
05-06-2019, 12:48 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528

## Re: Fluctuation theorem experiment

Quote:
 Originally Posted by Half Man Half Wit I searched the text, and the word 'mitochondria' doesn't appear. That's all the effort I'm really willing to expend on that... On another note: Max S., what do you make of papers like this one? Experimental Demonstration of Violations of the Second Law of Thermodynamics for Small Systems and Short Time Scales Do you think the experimental methodology was suspect? Are they misinterpreting their results?
I would like to wave that paper and all variants of the fluctuation theorem away as straw man arguments. Fluctuation theorem only contradicts a statistical formulation of the second law of thermodynamics, the entropy formulation with a statistical definition of entropy. From the paper:

Quote:
 That is, heat energy from the surroundings will be converted into useful work allowing the engine to run backwards. For larger engines, we would describe this as a violation of the Second Law of Thermodynamics, as entropy is consumed rather than generated. This has received little attention in the nanotechnology literature as there was no quantitative description of the probability of entropy consumption in such small engines. The only thermodynamic statement available was the Second Law itself, stating that, for large systems and over long times, the entropy production rate is necessarily positive.
Now that paper was a bit dense for a layman like myself, but my reading is that they suspended tiny latex particles in a solution with a somewhat even distribution. Then they pointed a laser beam at one specific particle; the particle moves towards the beam due to micro-forces, and they measure the refraction of the beam coming out the other end. From this they calculate the trajectory of the particle and say, look at this! For a second there, if we move a particle with a laser, the entropy of the solution as a system decreases. Therefore, the second law of thermodynamics is violated on this tiny scale for a short period of time!

I mean, of course you can physically move particles to states of lesser (statistical) entropy. It takes work to do so but I'm not sure if they accounted for the powering of the laser and adjusted their heat sinks accordingly. They didn't give the temperature of the solution or heat sinks but it would be near impossible to keep internal equilibrium after imposing an artificial temperature gradient. Not that internal equilibrium is necessary for the second law of thermodynamics...

Unless you are using the statistical definition of entropy and the entropy formulation of the second law of thermodynamics. In that case the law only holds absolutely true in an isolated system when the internal motions of atomic particles are exactly periodical, or when the temperature of the system is zero Kelvin. In all other cases the statistical entropy of a system will fluctuate, and this is the extent of any conclusions I can support from that paper.
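Here is what I mean by fluctuation, as a toy Python simulation. The hopping model (an Ehrenfest-urn-style box where one random particle switches halves per step) is my own simplification, not the paper's setup; it just shows that for small particle numbers the occupation of one half, and hence the statistical entropy, swings far from its maximum, while for large numbers the swings are tiny.

```python
import random

random.seed(1)

def simulate(n, steps=5000):
    """Ehrenfest urn: each step, one randomly chosen particle hops to the
    other half of the box. Returns the left-half occupation fraction over time."""
    k = n // 2                      # start at the even (maximum-entropy) split
    fracs = []
    for _ in range(steps):
        if random.random() < k / n:
            k -= 1                  # the chosen particle was on the left
        else:
            k += 1                  # the chosen particle was on the right
        fracs.append(k / n)
    return fracs

max_dev = {}
for n in (10, 1000):
    max_dev[n] = max(abs(f - 0.5) for f in simulate(n))
    print(n, max_dev[n])
# With 10 particles the occupation wanders far from 50/50 (entropy visibly
# dips below its maximum); with 1000 particles it barely moves.
```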

As I keep saying this is all fine and good within the theory of statistical mechanics, but it would seem that statistical mechanics assumes a priori that every microstate has equal probability at every moment, regardless of previous microstates. If it were not so, there would be no probability distributions to base fluctuation theory upon. That means every law of dynamics is reduced from an absolute law to "very probable". The same argument about the second law of thermodynamics in that paper can be made to show individual particles are merely unlikely to jump across my desk faster than the speed of light in a vacuum. Or rather, the laws of physics are violated all the time but the chances of it having any noticeable effect are so low that it may as well be impossible. But I don't believe this is the position you take.

~Max
#111
05-06-2019, 01:24 PM
 Guest Join Date: May 2016 Posts: 3,033
1. What are you saying is the problem with the experiment in that paper, or the theorem they proved?

2. It's hard for me to parse what you are saying here, but on the one hand, yes, as I mentioned, there are certain "assumptions" about systems (like your gas in a box) tending towards thermal equilibrium under certain conditions; on the other hand, there is research on setting up systems involving e.g. ultracold atoms that resist thermalization and exhibit quantum scarring and similar phenomena. It goes to show you that there are still interesting things to do in condensed matter physics.
#112
05-06-2019, 01:30 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by Half Man Half Wit Because thermodynamics is just an approximate theory, and thus, doesn't always apply. That's what's meant by 'violation of the second law'.
If by "approximate... thus, doesn't always apply" you mean the classical second law of thermodynamics is probabilistic, I strongly disagree with you. I am of the opinion that the second law of thermodynamics, as I learned it in secondary school, has never been and can never be violated, theoretically or experimentally. In your example of three particles hopping between three boxes within a system, random hopping is impossible as it contradicts not only the laws of thermodynamics but also the speed of light in a vacuum and strict causality. In the example with billiard balls I pointed out that the second law is not violated because no heat is transferred external to the table as a system; billiard balls moving across the middle of the table do not contradict the second law because neither half of the table is an isolated system. That is, the ball moving across the table changes the aggregate velocity of all billiard balls, which satisfies the second law of thermodynamics.

Quote:
 Originally Posted by Half Man Half Wit Simple: Clausius was wrong about entropy, and you still are. When he came up with the whole thing, he thought it had certain properties, like being always non-decreasing; later on, a more complete theory was found, and it turns out that entropy doesn't actually have those properties. That's science: theories are formulated, overturned, and refined.
I think he was wrong about irreversible processes, but the formulations I cited are still proper physical laws, to my knowledge. Would you care to explain why you disagree? What is the more complete theory?

Quote:
 Originally Posted by Half Man Half Wit That something can be expressed in terms of state variables doesn't preclude that same thing being expressed on the basis of microscopic variables---just as how temperature can be expressed as the average molecular energy, as well as in terms of internal energy and entropy. Those aren't two different things, but merely two different ways of talking about the same thing. Like the morning star and the evening star both refer to Venus, even though, quite clearly, morning and evening are different things. You can't logically pry them apart; and you can't logically pry thermodynamic entropy and the entropy of statistical mechanics apart.
You are saying they are the same thing, but I do not understand how you came to this conclusion. I'm still working through Boltzmann's lectures (in German), between passages of Consciousness Explained. Since Boltzmann defined "statistical" entropy, I hope he will clear things up for me. My guess is that he assumes the equiprobability of microstates at every instant, regardless of microstates at previous instants. That is the axiom you rejected.

~Max
#113
05-06-2019, 01:46 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528

## Re: Fluctuation theorem experiment

Quote:
 Originally Posted by DPRK 1. What are you saying is the problem with the experiment in that paper, or the theorem they proved?
I'm not a physicist and have no background in physics or science so there is a good chance that I have no idea what I'm talking about. But it seemed that they pointed a laser at a particle in the water, thus moving the particle. The laser constitutes work and the movement of the particle constitutes heat. When calculating entropy using the classical definition, you are supposed to consider external work as external heat[1]. It doesn't seem like this was accounted for in the experiment, which makes sense because they were using a different definition of entropy.

[1] "... every such quantity received may be brought into calculation as if it had been generated by work, and every quantity lost by a reservoir of heat, as if it had been converted into work."
Clausius, R. Supra at #101, in spoiler.

Quote:
 Originally Posted by DPRK 2. It's hard for me to parse what you are saying here but on one hand, yes, as I mentioned there are certain "assumptions" about systems (like your gas in a box) tending towards thermal equilibrium under certain conditions, and on the other hand at the same time there is research on setting up systems involving eg ultracold atoms that resist thermalization and exhibit quantum scarring and similar phenomena. It goes to show you that there are still interesting things to do in condensed matter physics.
I think I agree with you. Maybe? It seems to me that the authors of the paper in question also assumed the equiprobability of microstates at every instant. I'm not sure what quantum scarring or condensed matter physics are though.

~Max
#114
05-07-2019, 12:12 AM
 Guest Join Date: Jun 2007 Posts: 6,672
Quote:
 Originally Posted by Max S. I would like to wave that paper and all variants of the fluctuation theorem away as straw man arguments. Fluctuation theorem only contradicts a statistical formulation of the second law of thermodynamics, the entropy formulation with a statistical definition of entropy.
No; the fluctuation theorem is a consequence of entropy being, fundamentally, a statistical quantity (not 'statistical entropy', but entropy; there's only one such quantity, and it's the same one as in classical thermodynamics, which we just these days understand better than they did back then).

Quote:
 Then they pointed a laser beam at one specific particle, the particle moves towards the laser beam due to micro-forces, and they measure the refraction of the laser beam coming out the other end. From this they calculate the trajectory of the particle and say, look at this!
I haven't had a chance to look at the paper in detail, but I don't think that's right. The laser doesn't move the bead; it acts as an optical trap, keeping it in place. From there, you can calculate how much the bead should move around (oscillate around the mean position), and if it moves around more than that, it's received energy (heat) from the environment, as a statistical effect, thus violating the second law.

Quote:
 As I keep saying this is all fine and good within the theory of statistical mechanics, but it would seem that statistical mechanics assumes a priori that every microstate has equal probability at every moment, regardless of previous microstates.
It doesn't, though. It's assumed that each initial condition consistent with a certain macrostate is as likely as any other; from there, since the vast majority of all possible microstates leads to evolutions that visit each possible state of the system (ergodicity), we can derive that whenever we 'look' at the system, it's as likely to be in one microstate as it is to be in another. But the microscopic succession of microstates is perfectly deterministic.
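A minimal sketch of that point in Python. The irrational-rotation map here is just a stand-in for real molecular dynamics (it's the simplest ergodic system I know of): the update rule is perfectly deterministic, yet the time-averaged behavior is exactly what a "probability 1/2" statistical description predicts.

```python
import math

# Deterministic microdynamics, statistical macro-behavior: a point stepping
# around a circle by a fixed irrational amount visits the circle uniformly
# (ergodicity), so "in the left half with probability 1/2" emerges with no
# randomness anywhere in the dynamics.
step = math.sqrt(2) - 1          # irrational rotation step
x, left_count, steps = 0.1, 0, 100_000
for _ in range(steps):
    x = (x + step) % 1.0         # perfectly deterministic update
    left_count += (x < 0.5)
frac = left_count / steps
print(frac)                      # ≈ 0.5
```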

Quote:
 Originally Posted by Max S. If by "approximate... thus, doesn't always apply" you mean the classical second law of thermodynamics is probabilistic, I strongly disagree with you. I am of the opinion that the second law of thermodynamics, as I learned in secondary school, have never been and can never be violated, theoretically or experimentally.
It's nice that you have an opinion; it's just that your opinion is contradicted by both physical theory and experiment, at which point, in particular because you have no justification for your opinion beyond 'but I really think it's like that', it is reasonable to revise your opinion.

Quote:
 In the example with billiard balls I pointed out that the second law is not violated because no heat is transfered external to the table as a system; billiard balls moving across the middle of the table do not contradict the second law because neither half of the table is an isolated system, that is, the ball moving across the table changes the aggregate velocity of all billiard balls which satisfies the second law of thermodynamics.
This is confused. Heat can never be transferred to an isolated system, because if you can transfer heat to a system, it's not isolated anymore. For heat transfer, two systems must be in contact; and the two halves of the billiard table are two systems in contact, just as, e. g., two halves of a volume of gas after you've removed some divider between them (a common example in thermodynamics) are.

Quote:
 I think he was wrong about irreversible processes, but the formulations I cited are still proper physical laws, to my knowledge. Would you care to explain why you disagree? What is the more complete theory?
The more complete theory is statistical mechanics---that's simply a matter of logic, since you can use it to derive all the predictions (in the proper limit) of thermodynamics, but also, additional effects, which are in fact experimentally observed. Statistical mechanics subsumes and extends thermodynamics. There is no question about that.

Quote:
 You are saying they are the same thing, but I do not understand how you came to this conclusion.
Because they agree across the domain to which both apply. Consider relativistic speed. Wherever the Newtonian approximation applies, deviations between relativistic speed and the classical Newtonian notion are so slight as to be imperceptible. Thus, in the proper limit, relativity reproduces all the predictions of Newtonian mechanics.

However, relativity makes additional predictions in the case where the Newtonian theory no longer applies. In particular, relativistic speed has a maximum value, which it can't exceed. The Newtonian theory knows no such limit; but it would be spurious to say that therefore, there are really two notions of speed, one Newtonian, and one relativistic. You can't say, but my spaceship flies with Newtonian speed, and so, can go as fast as it likes!

But that's exactly what you're saying regarding entropy. The relativistic notion of speed has revised and extended the Newtonian one, which is now understood to be only (approximately) valid in the regime of the very small (absolute value); the statistical notion of entropy has revised and extended the classical thermodynamical one, which is now understood to be only (approximately) valid in the regime of the very large (number of particles).
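The speed analogy in two lines of Python (the 0.9c values are just illustrative): composing two velocities the Newtonian way overshoots c, while the relativistic composition law, which reduces to the Newtonian one for small speeds, never does.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def add_newton(u, v):
    """Galilean velocity addition: w = u + v."""
    return u + v

def add_rel(u, v):
    """Relativistic velocity addition: w = (u + v) / (1 + u*v/c^2)."""
    return (u + v) / (1 + u * v / C**2)

u = v = 0.9 * C
print(add_newton(u, v) / C)  # 1.8: the Newtonian sum exceeds c
print(add_rel(u, v) / C)     # ≈ 0.9945: the relativistic result stays below c
```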

Quote:
 My guess is that he assumes the equiprobability of microstates at every instant, regardless of microstates at previous instants. That is the axiom you rejected.
He doesn't. Nobody ever did.
#115
05-07-2019, 12:23 PM
 Guest Join Date: Jun 2007 Posts: 6,672
Quote:
 Originally Posted by Max S. I'm still working through Boltzmann's lectures (in German), between passages of Consciousness Explained.
Maybe you should replace both with Seager's Theories of Consciousness: early on, he discusses the notion of 'intrusions from below', where the more fundamental notions make themselves apparent at a coarse-grained level. One of his examples is, naturally, thermodynamics and its relation to statistical mechanics:
Quote:
 Originally Posted by William Seager A more rarefied example is the mighty domain of thermodynamics, whose power to order the world is everywhere visible. But while the laws of thermodynamics are exceptionless as written, they are subject to statistical fluctuation and the very slight possibility of reversal. This is because the ‘implementation’ of thermodynamical properties is via a vast system of micro-states which can, in principle, be ordered so as to lead to violations of the letter of thermodynamic law.
Seager's example is Loschmidt's paradox: Loschmidt pointed out that, for any entropy-increasing evolution of a physical system consisting of an aggregate of many particles, one can obtain an entropy-decreasing one by simply multiplying all the particle momenta with -1. This is dynamically as valid as the original configuration.

If one assumes that the past is (for whatever reason) in a low-entropy state, there is no dilemma here: we're still overwhelmingly likely to observe a steady increase of entropy. But it does point out that there are perfectly physically allowable evolutions of a system that decrease entropy.

Furthermore, it also shows that by considering the microscopic dynamics, you can infer how, e. g., heat gets transferred to a hotter body: just take the molecules involved in, e. g., some convective process, and invert their momenta. And lo and behold, suddenly heat flows in the 'wrong' direction. But of course, we knew already that this must, on occasion, be possible.
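Loschmidt's construction can be made concrete in a toy model (my own sketch, not anything from Seager or the papers under discussion): non-interacting particles bouncing elastically between hard walls. Evolve the gas forward, flip every momentum, evolve for the same time again, and the initial state is recovered exactly:

```python
import numpy as np

rng = np.random.default_rng(0)
L_box = 1.0
n = 1000
x0 = rng.uniform(0.0, L_box, n)  # initial positions in the box
v0 = rng.normal(0.0, 1.0, n)     # initial velocities

def evolve(x, v, t):
    """Exact free flight with elastic walls at 0 and L_box: unfold the
    trajectory onto a circle of circumference 2*L_box, advance, fold back.
    Returns (positions, velocities); no time-stepping error."""
    y = np.mod(x + v * t, 2.0 * L_box)
    x_new = np.where(y <= L_box, y, 2.0 * L_box - y)
    v_new = np.where(y <= L_box, v, -v)
    return x_new, v_new

t = 3.7
x1, v1 = evolve(x0, v0, t)    # run the gas forward
x2, v2 = evolve(x1, -v1, t)   # Loschmidt: flip every momentum, run again
print(np.allclose(x2, x0), np.allclose(v2, -v0))
```

The reversed evolution is exactly as lawful as the forward one, which is the whole point of the paradox.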
#116
05-07-2019, 12:31 PM
 Guest Join Date: May 2016 Posts: 3,033
Quote:
 Originally Posted by Max S. I'm not a physicist and have no background in physics or science so there is a good chance that I have no idea what I'm talking about. But it seemed that they pointed a laser at a particle in the water, thus moving the particle. The laser constitutes work and the movement of the particle constitutes heat. When calculating entropy using the classical definition, you are supposed to consider external work as external heat[1]. It doesn't seem like this was accounted for in the experiment, which makes sense because they were using a different definition of entropy.
IIRC that experiment was designed to probe non-equilibrium steady-state conditions, and that's exactly what they did.

There is no "different definition" of entropy, just the usual nuances in defining state variables for a nonequilibrium system. There are also other experiments studying entropy in complicated nonequilibrium conditions if you are interested.
#117
05-07-2019, 01:36 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by Half Man Half Wit I haven't had a chance to look at the paper in detail, but I don't think that's right. The laser doesn't move the bead; it acts as an optical trap, keeping it in place. From there, you can calculate how much the bead should move around (oscillate around the mean position), and if it moves around more than that, it's received energy (heat) from the environment, as a statistical effect, thus violating the second law.
I probably misunderstand something. I thought the optical trap worked because the bead was attracted to the laser beam. If they aren't using the laser to move the bead and reduce entropy, it seems they are just capturing a moving particle in a tractor beam and seeing how much it wiggles before momentum dies down and it stops moving. Sort of like tossing a marble into a bowl. I'm not sure how this counteracts the second law because one does not simply shoot a laser into a system and declare an isolated system has spontaneously reduced entropy. I could just as easily drop a triangle in the middle of our billiard table and point out how unlikely it is that three billiard balls are always observed in the middle of the table.

Quote:
Originally Posted by Half Man Half Wit
Quote:
 Originally Posted by Max S. You are saying [statistical and classical entropy] are the same thing, but I do not understand how you came to this conclusion.
Because they agree across the domain to which both apply. Consider relativistic speed. Wherever a Newtonian approximation applies, deviations between it and the classical notion due to Newton are so slight as to be imperceptible. Thus, in the proper limit, relativity reproduces all the predictions of Newtonian mechanics.

However, relativity makes additional predictions in the case where the Newtonian theory no longer applies. In particular, relativistic speed has a maximum value, which it can't exceed. The Newtonian theory knows no such limit; but it would be spurious to say that therefore, there are really two notions of speed, one Newtonian, and one relativistic. You can't say, but my spaceship flies with Newtonian speed, and so, can go as fast as it likes!

But that's exactly what you're saying regarding entropy. The relativistic notion of speed has revised and extended the Newtonian one, which is now understood to be only (approximately) valid in the regime of the very small (absolute value); the statistical notion of entropy has revised and extended the classical thermodynamical one, which is now understood to be only (approximately) valid in the regime of the very large (number of particles).
The difference between relativity and statistical mechanics to me is that I know and accept the results and conclusions of experiments confirming relativity's superiority over other theories of motion. These are the Michelson-Morley and Ives-Stilwell experiments as well as Mercury's orbit and observations of solar eclipses.

~Max
#118
05-07-2019, 01:41 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
Originally Posted by Half Man Half Wit
Quote:
 Originally Posted by Max S. I would like to wave that paper and all variants of the fluctuation theorem away as straw man arguments. Fluctuation theorem only contradicts a statistical formulation of the second law of thermodynamics, the entropy formulation with a statistical definition of entropy.
No; the fluctuation theorem is a consequence of entropy being, fundamentally, a statistical quantity (not 'statistical entropy', but entropy; there's only one such quantity, and it's the same one as in classical thermodynamics, which we just these days understand better than they did back then).

...

Quote:
 Originally Posted by Max S. I think he was wrong about irreversible processes, but the formulations I cited are still proper physical laws, to my knowledge. Would you care to explain why you disagree? What is the more complete theory?
The more complete theory is statistical mechanics---that's simply a matter of logic, since you can use it to derive all the predictions (in the proper limit) of thermodynamics, but also, additional effects, which are in fact experimentally observed. Statistical mechanics subsumes and extends thermodynamics. There is no question about that.
Then this is the root of our disagreement. I have not yet reached the conclusion that the two definitions of entropy are the same, neither have I concluded that statistical mechanics extends thermodynamics. To me they appear as two different ways of describing the same phenomena: thermodynamics is more (theoretically perfectly) accurate but statistical mechanics is easier to calculate.

~Max
#119
05-07-2019, 01:53 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
Originally Posted by Half Man Half Wit
Quote:
 Originally Posted by Max S. As I keep saying this is all fine and good within the theory of statistical mechanics, but it would seem that statistical mechanics assumes a priori that every microstate has equal probability at every moment, regardless of previous microstates.
It doesn't, though. It's assumed that each initial condition consistent with a certain macrostate is as likely as any other; from there, since the vast majority of all possible microstates leads to evolutions that visit each possible state of the system (ergodicity), we can derive that whenever we 'look' at the system, it's as likely to be in one microstate as it is to be in another. But the microscopic succession of microstates is perfectly deterministic.
If I measure (statistical) entropy at time t0 then measure entropy again at time t1, can I compare the two measurements? I think if only the initial microstate is random, it is inappropriate to compare two measurements of statistical entropy. Each measurement of entropy assumes that all microstates are equally probable by virtue of having the number of microstates Ω in the definition. I am assuming that when calculating entropy for an isolated thermodynamic system, Ω remains constant over time. Otherwise statistical mechanics loses a lot of its usability as we would have to calculate the microscopic dynamics of every state to determine Ω for any time tn>0.
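To make the role of Ω concrete, here is a toy two-state model (my own construction, not from Boltzmann's lectures). Ω = C(N, n) counts the microstates compatible with a given macrostate, so it is fixed by the macrostate rather than by the clock; it changes over time only insofar as the macrostate does:

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def entropy(N, n):
    """S = k ln Ω for the macrostate 'n of N two-state particles excited'.
    Ω = C(N, n) is the number of microstates compatible with that macrostate."""
    return k_B * log(comb(N, n))

N = 100
# Different macrostates of the same system have different Ω, hence different S:
print(entropy(N, 10) < entropy(N, 50))  # True
```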

~Max

Last edited by Max S.; 05-07-2019 at 01:54 PM.
#120
05-07-2019, 02:11 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by DPRK IIRC that experiment was designed to probe non-equilibrium steady-state conditions, and that's exactly what they did. There is no "different definition" of entropy, just the usual nuances in defining state variables for a nonequilibrium system. There are also other experiments studying entropy in complicated nonequilibrium conditions if you are interested.
I don't yet understand how entropy in statistical mechanics supersedes classical entropy; to me they seem to describe entirely different things. If they are different, it makes no sense to use the statistical definition in the entropic formulation of the second law of thermodynamics, which was derived using the classical definition of entropy.

The point is, I don't see how that experiment contradicts the second law of thermodynamics as cited in the original post here. The "law" the paper purports to contradict is not the second law of thermodynamics, in my opinion.

~Max
#121
05-07-2019, 02:16 PM
 Guest Join Date: Dec 2009 Location: The Land of Smiles Posts: 19,119
When natural gas is burned, there are intermediate processes where molecules devolve randomly into states with high free energy. For example the reaction CO + H2O --> CO2 + H2 requires that a water molecule be torn apart. Yet it happens, and methane couldn't burn properly without such high free-energy intermediate states.

These states aren't considered to violate the Second Law because they affect only a tiny portion of the burning gas at any one time, and only very briefly.

Quote:
 Originally Posted by DPRK ... There are also other experiments studying entropy in complicated nonequilibrium conditions if you are interested.
I Googled a bit, mainly hitting paywalls, but did find a video lecture of that paper. It may be slightly interesting; it includes motion pictures of DNA molecules (which were chosen for their size, shape and flexibility rather than any special biologic properties) traversing a "tilted washboard." However I do not see any sense of "2nd Law violation" beyond obvious fluctuations similar to those that arise trivially, as in the burning gas mentioned above.
#122
05-07-2019, 02:28 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by Half Man Half Wit Seager's example is Loschmidt's paradox: Loschmidt pointed out that, for any entropy-increasing evolution of a physical system consisting of an aggregate of many particles, one can obtain an entropy-decreasing one by simply multiplying all the particle momenta with -1. This is dynamically as valid as the original configuration. If one assumes that the past is (for whatever reason) in a low-entropy state, there is no dilemma here: we're still overwhelmingly likely to observe a steady increase of entropy. But it does point out that there are perfectly physically allowable evolutions of a system that decrease entropy. Furthermore, it also shows that by considering the microscopic dynamics, you can infer how, e. g., heat gets transferred to a hotter body: just take the molecules involved in, e. g., some convective process, and invert their momenta. And lo and behold, suddenly heat flows in the 'wrong' direction. But of course, we knew already that this must, on occasion, be possible.
Clausius only offered the entropic formulation of the second law, where entropy must always increase, in situations where there are irreversible processes. It would therefore be impossible to prove a contradiction by literally reversing the process; if you can reverse the process, the change in entropy should be exactly zero.

I disagree with Clausius that irreversible processes exist on the grand scale of things, and he himself gave Maxwell's demon as an example of such an irreversible process, some ten years before Maxwell thought of the idea and decades before the thought experiment was refuted.

So as far as I am concerned, in classical thermodynamics there is no dilemma to begin with. Heat doesn't flow in the wrong direction: either there is zero net heat flow or the system is not isolated.

~Max
#123
05-07-2019, 02:51 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by septimus When natural gas is burned, there are intermediate processes where molecules devolve randomly into states with high free energy. For example the reaction CO + H2O --> CO2 + H2 requires that a water molecule be torn apart. Yet it happens, and methane couldn't burn properly without such high free-energy intermediate states. These states aren't considered to violate the Second Law because they affect only a tiny portion of the burning gas at any one time, and only very briefly.
I don't see why they would violate the second law to begin with. What does free energy have to do with the second law? Changes in the distribution of heat within a system do not contradict the second law of thermodynamics as stated by Clausius, Kelvin, or Carathéodory.

Quote:
 Originally Posted by septimus I Googled a bit, mainly hitting paywalls, but did find a video lecture of that paper. It may be slightly interesting; it includes motion pictures of DNA molecules (which were chosen for their size, shape and flexibility rather than any special biologic properties) traversing a "tilted washboard." However I do not see any sense of "2nd Law violation" beyond obvious fluctuations similar to those that arise trivially, as in the burning gas mentioned above.

~Max

Last edited by Max S.; 05-07-2019 at 02:51 PM. Reason: fixed quote
#124
05-07-2019, 03:07 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by DPRK It is valid that two systems are in mutual (thermal) equilibrium if there is no net flow of heat when they are in diathermal contact. However, imagine just one isolated system: if it is not in equilibrium, then you can imagine that under some extreme conditions it might not even have a well-defined temperature. On the other hand, once the system is in equilibrium all the "random movements" make no difference to the macrostate, assuming it is sufficiently "macro". If you are suggesting that certain transitions are allowed but others are not then it is important to note that that is not the case: these transitions are reversible microscopic fluctuations and each transition is as likely to occur as its converse. Again, nothing is defied: in the macroscopic limit these fluctuations will be too small to detect.
I'm not sure if I ever responded to this. One isolated system would by my definition be considered in a state of thermal equilibrium with its surroundings. Self-equilibrium only makes sense when you divide an isolated system into two or more sub-systems, or when the temperature of the isolated system is zero kelvin.

Your definition of equilibrium makes sense but it does not describe the same property.

~Max
#125
05-07-2019, 03:59 PM
 Guest Join Date: Jun 2007 Posts: 6,672
Quote:
 Originally Posted by Max S. I probably misunderstand something. I thought the optical trap worked because the bead was attracted to the laser beam. If they aren't using the laser to move the bead and reduce entropy, it seems they are just capturing a moving particle in a tractor beam and seeing how much it wiggles before momentum dies down and it stops moving.
They are moving the bead, and the system has a certain entropy production along its trajectory. According to the second law, that entropy production ought to always be positive; according to the fluctuation theorem, it occasionally won't be. They measure, and do find negative entropy production.

Quote:
 The difference between relativity and statistical mechanics to me is that I know and accept the results and conclusions of experiments confirming relativity's superiority over other theories of motion. These are the Michelson-Morley and Ives-Stilwell experiments as well as Mercury's orbit and observations of solar eclipses. ~Max
Well, now you at least know the theory and experiment regarding violations of the second law. Acceptance, usually, comes eventually (it might be a 'going through the stages' thing), although you're showing some enviably strong convictions on a matter I had thought unlikely to really inspire controversy.

Quote:
 Originally Posted by Max S. Then this is the root of our disagreement. I have not yet reached the conclusion that the two definitions of entropy are the same, neither have I concluded that statistical mechanics extends thermodynamics. To me they appear as two different ways of describing the same phenomena: thermodynamics is more (theoretically perfectly) accurate but statistical mechanics is easier to calculate. ~Max
This gets things the wrong way around. Thermodynamics is like the science of many coin throws: for a large enough number, half of them will be heads, half of them tails. If all you ever observe are large numbers of coin throws, you might think it's a natural law that there are always equal amounts of heads and tails (say, you only know the rough weight of the coins that came up heads vs. those that came up tails).

Then, you start, either theoretically or experimentally, to study smaller numbers of coin throws. And you see, well, your 'law' doesn't hold anymore: you throw ten coins, and well, it's not always the case that five come up heads, and five tails. Rather, the whole thing turns out to be statistical: the chance of each coin toss coming up heads is 50%, but that doesn't mean all ten can't come up heads.

So, you've traded in the 'exactness' of your law---which has turned out to be only approximate---for a statistical law describing more fundamental dynamics.

You can use this new-found knowledge to precisely calculate how likely it is that a given ensemble of N coins contains half heads and half tails. This is how you can derive the laws of the ensemble from those of the more fundamental theory.
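That likelihood is a short exact computation. A quick sketch (plain binomial arithmetic; the sample sizes and the 1% window are choices of mine for illustration):

```python
from math import comb, ceil, floor

def p_half(n, eps=0.01):
    """Probability that the heads fraction of n fair tosses lies
    within eps of 1/2 (exact binomial sum, no approximation)."""
    lo = ceil((0.5 - eps) * n)
    hi = floor((0.5 + eps) * n)
    return sum(comb(n, k) for k in range(lo, hi + 1)) / 2**n

for n in (10, 1000, 10000):
    print(n, round(p_half(n), 3))
# The 'half heads' law sharpens from a weak tendency to near-certainty
# as n grows; it was never exact, only overwhelmingly probable.
```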

So it is with statistical mechanics. You can calculate the expected entropy increase for any given system, then measure it, and find agreement. This is also how you know that the entropy from statistical mechanics is nothing different from that of thermodynamics; it just yields a more complete picture.

Quote:
 Originally Posted by Max S. If I measure (statistical) entropy at time t0 then measure entropy again at time t1, can I compare the two measurements?
You can't measure 'statistical' entropy, you can only measure entropy. You can calculate the entropy based on statistical methods, then make a measurement, and see that it agrees.

Quote:
 I think if only the initial microstate is random, it is inappropriate to compare two measurements of statistical entropy. Each measurement of entropy assumes that all microstates are equally probable by virtue of having the number of microstates Ω in the definition.
This is confused. A system has entropy because we don't have complete knowledge of the microstate. Given that particular macrostate, it can be equally well in either microstate compatible with that description. It isn't any more likely to be in either of these states, because if we could say that it is, we would have more knowledge about the microstate than the macrostate allows, and hence, would not actually be in that macrostate.

Perhaps it helps if you think of it as a betting matter. Given the macrostate, which microstate would you bet the system's in? And there's only one way to bet: if all the information you have is the macrostate, then it could equally well be in either microstate consistent with that.

This isn't an assumption of all microstates being equally likely at any moment, it's merely a statement of ignorance.
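In information-theoretic terms (a standard illustration, not specific to this thread): among all probability assignments over the compatible microstates, the uniform one is exactly the one that claims the least knowledge, and any bias is a claim of extra knowledge.

```python
from math import log

def shannon(p):
    """Shannon entropy -Σ p ln p of a probability distribution."""
    return -sum(q * log(q) for q in p if q > 0)

uniform = [0.25] * 4            # 'all I know is the macrostate'
biased = [0.7, 0.1, 0.1, 0.1]   # 'I claim extra knowledge of the microstate'
print(shannon(uniform))  # ln 4 ≈ 1.386, the maximum for four alternatives
print(shannon(biased))   # strictly smaller
```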

Quote:
 Originally Posted by Max S. Clausius only offered the entropic formulation of the second law, where entropy must always increase, in situations where there are irreversible processes. It would therefore be impossible to prove a contradiction by literally reversing the process; if you can reverse the process, the change in entropy should be exactly zero.
But you do accept that a gas, say, is ultimately made out of smaller particles, atoms? And do you accept that their motions are completely reversible? That is, that if an atom can move to the right at velocity v, then it can as well move to the left at velocity v?

If so, and if the gas is completely described in terms of its constituent atoms, and it's in a state where each atom has a certain velocity in a certain direction, then what does prevent the state in which each atom has the exactly opposite velocity in the exactly opposite direction from being possible?

Clausius' irreversible systems exist only in an approximate sense: if we start out in a low entropy state, then it will almost never happen that entropy further decreases. Hence, macroscopic systems are almost always irreversible. But, just as it's not in principle impossible that the shards of a broken cup, if shaken in a box, spontaneously reform into a cup, so it is not impossible for the velocities of the atoms in a gas to be aligned in such a way as to lower the entropy---say, by all collecting in the lower left corner of a box.
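Just how unlikely "all in the lower left corner" is can be put in numbers. A quick sketch, assuming independent molecule positions (ideal gas) and asking only for the left half of the box; for a mole of gas the probability is too small to represent directly, so it is given as a base-10 logarithm:

```python
from math import log10

# Chance that all N independent gas molecules are found in the left half
# of the box at a given instant: (1/2)**N.
for N in (10, 100, 6.022e23):
    print(f"N = {N:g}: P = 10^({-N * log10(2):.4g})")
```

Not impossible, merely suppressed by a factor of 10^(2e23) or so: exactly the sense in which macroscopic irreversibility is approximate.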

Quote:
 So as far as I am concerned, in classical thermodynamics there is no dilemma to begin with. Heat doesn't flow in the wrong direction, either there is zero net heat flow or the system is not isolated. ~Max
This doesn't make sense. If the system is isolated, then obviously there's zero net heat flow, since there's no place where the heat could flow.

But, in the case of heat flowing from body A to body B (hence, neither being isolated), say, by transfer of a mass of hot gas (convection), there is no contradiction in thinking that all the atoms in the gas could have the opposite velocity, heat thus flowing to the colder system.
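Clausius's own equivalence-value Q(1/T2 - 1/T1), quoted earlier in the thread, makes the bookkeeping for that case explicit. A minimal sketch (reservoir temperatures and the 1 J of heat are illustrative choices of mine):

```python
def delta_S(Q, T_from, T_to):
    """Clausius entropy change of the two-reservoir composite when heat Q
    flows from the reservoir at T_from to the one at T_to (both large
    enough that their temperatures stay fixed)."""
    return -Q / T_from + Q / T_to

# Heat flowing the usual way, hot (400 K) to cold (300 K):
print(delta_S(1.0, 400.0, 300.0))  # +1/1200 J/K, positive
# The momentum-reversed evolution, cold to hot:
print(delta_S(1.0, 300.0, 400.0))  # -1/1200 J/K, negative
```

The reversed flow is the same expression with the temperatures swapped, and it gives a negative total entropy change: that is what the second law forbids in its classical form, and what the microscopic picture says is merely extremely improbable.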

Quote:
 Originally Posted by Max S. I'm not sure if I ever responded to this. One isolated system would by my definition be considered in a state of thermal equilibrium with its surroundings.
How could an isolated system be in equilibrium with its surroundings? If the system is isolated, there is no heat flow, and thus, no means by which to achieve equilibrium. I mean, that's why we keep our hot beverages in specially-designed flasks that limit the interaction with the surrounding---to keep the coffee hot, and slow down equilibration as much as possible!
#126
05-07-2019, 04:15 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by Half Man Half Wit How could an isolated system be in equilibrium with its surroundings? If the system is isolated, there is no heat flow, and thus, no means by which to achieve equilibrium. I mean, that's why we keep our hot beverages in specially-designed flasks that limit the interaction with the surrounding---to keep the coffee hot, and slow down equilibration as much as possible!
That is exactly my definition of thermal equilibrium - zero net heat flow between the system and its surroundings.

~Max
#127
05-07-2019, 04:21 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528

Quote:
 Originally Posted by Half Man Half Wit This doesn't make sense. If the system is isolated, then obviously there's zero net heat flow, since there's no place where the heat could flow. But, in the case of heat flowing from body A to body B (hence, neither being isolated), say, by transfer of a mass of hot gas (convection), there is no contradiction in thinking that all the atoms in the gas could have the opposite velocity, heat thus flowing to the colder system.
If there is zero net heat flow then by definition:
Let δQ = 0
ΔS = ∫ δQ/T
= ∫ 0/T
= ∫ 0
= 0; Q.E.D.
That is the resolution to Loschmidt's paradox, as related by DPRK. Apparently the resolution is different using your definition of entropy rather than Clausius's.

~Max

Last edited by Max S.; 05-07-2019 at 04:22 PM. Reason: indent
#128
05-07-2019, 04:28 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by Half Man Half Wit You can't measure 'statistical' entropy, you can only measure entropy. You can calculate the entropy based on statistical methods, then make a measurement, and see that it agrees.
When I used "measure" I meant calculate. I am not aware of a physical method to measure the entropy of a given system, of any analog to the thermometer or pressure gauge.

~Max
#129
05-07-2019, 04:35 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by Half Man Half Wit But you do accept that a gas, say, is ultimately made out of smaller particles, atoms? And do you accept that their motions are completely reversible? That is, that if an atom can move to the right at velocity v, then it can as well move to the left at velocity v? If so, and if the gas is completely described in terms of its constituent atoms, and it's in a state where each atom has a certain velocity in a certain direction, then what does prevent the state in which each atom has the exactly opposite velocity in the exactly opposite direction from being possible?
This would be the law of conservation of momentum (a consequence of Newton's laws of motion), which to my knowledge is preserved in relativity so long as we are talking about one inertial frame of reference.

~Max
#130
05-07-2019, 04:46 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by Half Man Half Wit This is confused. A system has entropy because we don't have complete knowledge of the microstate. Given that particular macrostate, it can be equally well in either microstate compatible with that description. It isn't any more likely to be in either of these states, because if we could say that it is, we would have more knowledge about the microstate than the macrostate allows, and hence, would not actually be in that macrostate. Perhaps it helps if you think of it as a betting matter. Given the macrostate, which microstate would you bet the system's in? And there's only one way to bet: if all the information you have is the macrostate, then it could equally well be in either microstate consistent with that. This isn't an assumption of all microstates being equally likely at any moment, it's merely a statement of ignorance.
We can't know every detail about real experiments, but theoretical thermodynamic systems can be fully defined by the laws of physics. This is the central premise of strong physicalism as we discussed in the Dualism thread. There's no reason to bet; I can look at the description of a theoretical system and tell you exactly how it is.

Are you to say a fully defined theoretical system has no entropy? How is this consistent with Clausius's definition of entropy as a function of heat flow and temperature?

Is it impossible to fully define a theoretical system? Are you denying local reality? How is that compatible with monistic physicalism?

~Max

Last edited by Max S.; 05-07-2019 at 04:48 PM.
#131
05-07-2019, 05:02 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by Half Man Half Wit This gets things the wrong way around. Thermodynamics is like the science of many coin throws: for a large enough number, half of them will be heads, half of them tails. If all you ever observe are large numbers of coin throws, you might think it's a natural law that there are always equal amounts of heads and tails (say, you only know the rough weight of the coins that came up heads vs. those that came up tails). Then, you start, either theoretically or experimentally, to study smaller numbers of coin throws. And you see, well, your 'law' doesn't hold anymore: you throw ten coins, and well, it's not always the case that five come up heads, and five tails. Rather, the whole thing turns out to be statistical: the chance of each coin toss coming up heads is 50%, but that doesn't mean all ten can't come up heads. So, you've traded in the 'exactness' of your law---which has turned out to be only approximate---for a statistical law describing more fundamental dynamics. You can use this new-found knowledge to precisely calculate how likely it is that a given ensemble of N coins contains half heads and half tails. This is how you can derive the laws of the ensemble from those of the more fundamental theory. So it is with statistical mechanics. You can calculate the expected entropy increase for any given system, then measure it, and find agreement. This is also how you know that the entropy from statistical mechanics is nothing different from that of thermodynamics; it just yields a more complete picture. ... Clausius' irreversible systems exist only in an approximate sense: if we start out in a low entropy state, then it will almost never happen that entropy further decreases. Hence, macroscopic systems are almost always irreversible. 
But, just as it's not in principle impossible that the shards of a broken cup, if shaken in a box, spontaneously reform into a cup, so it is not impossible for the velocities of the atoms in a gas to be aligned in such a way as to lower the entropy---say, by all collecting in the lower left corner of a box.
This is not my understanding of classical thermodynamics, it is my understanding of statistical mechanics as relayed to me by you. My interpretation of classical thermodynamics is based on reading papers at face value. I linked the papers in the original post. Your saying I have it backwards is truly perplexing.

If you must say there is no difference between classical thermodynamics and statistical mechanics, just replace my usage of "classical thermodynamics" with "Max S.'s personal science of thermodynamics". When there's a law in my personal science of thermodynamics, that means absolutely no violations. The definition of entropy in Max S.'s thermodynamics is given by the equation:

ΔS = ∫ δQ/T
where δQ is the net heat flow into the system and T is the temperature at which it is transferred.

The second law of thermodynamics is given in three different, equally inviolable forms in the original post. There are two valid corollaries to the second law of Max S.'s thermodynamics:

"The entropy of an isolated system shall not change".
"The entropy of an isolated system shall not decrease".

Where "isolated system" means zero net heat flow external to the system. This is my plain interpretation of documents linked in the original post. Why is the physics wrong?

We are back to square one.

~Max

Last edited by Max S.; 05-07-2019 at 05:03 PM.
#132
05-07-2019, 05:15 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by Half Man Half Wit They are moving the bead, and the system has a certain entropy production along its trajectory. According to the second law, that entropy production ought to always be positive; according to the fluctuation theorem, it occasionally won't be. They measure, and do find negative entropy production. Well, now you at least know the theory and experiment regarding violations of the second law. Acceptance, usually, comes eventually (it might be a 'going through the stages' thing), although you're showing some enviably strong convictions on a matter I had thought unlikely to really inspire controversy.
To me it seems everybody is redefining entropy and the second law of thermodynamics. I take the laws as defined by Clausius, Kelvin, and Carathéodory and find no contradiction. Neither do I yet understand why everybody redefined entropy to begin with, or why the second law of thermodynamics as expressed in a corollary at the end of Clausius's paper should still hold true after changing the definition of entropy.

~Max
#133
05-08-2019, 12:00 AM
 Guest Join Date: Dec 2010 Posts: 7,618
Quote:
 Originally Posted by Max S. I take the laws as defined by Clausius, Kelvin, and Carathéodory and find no contradiction.
So what if there's no contradiction? Classical thermodynamics is false. It doesn't describe the universe. Temperature, heat, entropy, etc. aren't some goo that permeates the universe; they're emergent properties of an ensemble of particles. If there were no granularity to the universe--if it were smooth all the way down and temperature were a real property of things--then classical thermodynamics would be fine. But it's not, and so statistical mechanics is the real description, and with it comes the fluctuation theorem and the possibility that the 2LoT will only be true on average.
#134
05-08-2019, 12:45 AM
 Guest Join Date: May 2016 Posts: 3,033
Quote:
 Originally Posted by Max S. I don't yet understand how entropy in statistical mechanics supersedes classical entropy, to me they seem to describe entirely different things. If they are different it makes no sense to use the statistical definition in the entropic formulation of the second law of thermodynamics, which was derived using the classical definition of entropy.
It's the same thing. One can examine the microscopic origin of properties of matter, which is where statistical methods come in.
Quote:
 The point is, I don't see how that experiment contradicts the second law of thermodynamics as cited in the original post here. The "law" the paper purports to contradict is not the second law of thermodynamics, in my opinion.
I never said any such thing, and neither does the paper!? So there may be a misunderstanding.

Again, one way to think about it is that if you have a particle subjected to Brownian motion in a fluid, and you pull on it by moving a laser with constant velocity, then some of the time the force does positive work on the particle, and some of the time it does negative work, but the former is exponentially more probable than the latter. Far from purporting to contradict this (or any related) law, the experiment exactly confirms it.
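The dragged-bead picture can be caricatured numerically. The sketch below is a toy overdamped Langevin simulation, not the model from the cited paper; every parameter value is invented. It only shows the qualitative point: over short trajectories the trap force sometimes does negative total work on the particle, while the mean work stays positive.

```python
import random

random.seed(1)  # reproducible

# Toy overdamped Langevin particle dragged by a harmonic trap moving at
# constant velocity. All parameter values are made up for illustration.
k, gamma, D, v = 1.0, 1.0, 0.2, 0.5   # trap stiffness, drag, diffusion, speed
dt, steps, ntraj = 1e-3, 500, 2000

works = []
for _ in range(ntraj):
    x, w = 0.0, 0.0
    for i in range(steps):
        f = -k * (x - v * i * dt)         # trap force on the particle
        dx = (f / gamma) * dt + random.gauss(0.0, (2 * D * dt) ** 0.5)
        w += f * dx                        # work done by the trap force
        x += dx
    works.append(w)

neg_frac = sum(w < 0 for w in works) / ntraj
print(f"mean work {sum(works)/ntraj:.4f}, negative-work fraction {neg_frac:.3f}")
```

The negative-work fraction is substantial for these short trajectories and would shrink rapidly for longer ones, which is the qualitative content of the transient fluctuation theorem being discussed.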
#135
05-08-2019, 01:02 AM
 Guest Join Date: Jun 2007 Posts: 6,672
Quote:
 Originally Posted by Max S. That is exactly my definition of thermal equilibrium - zero net heat flow between the system and its surroundings. ~Max
That's indeed the case if a system is in equilibrium. However, it's a necessary, not a sufficient, condition for equilibrium: my hot coffee in a perfect thermos has no net heat flow to the environment; yet, it's not in equilibrium with it.

Quote:
 Originally Posted by Max S. This would be the law of conservation of momentum, also known as Newton's first law of motion, which to my knowledge is preserved in relativity so long as we are talking about one inertial frame of reference. ~Max
I didn't say that the momentum spontaneously changes to pointing in the other direction. I said that it's a valid state for the atom to move to the left at v, it's a valid state for it to move to the right at v. Both will be solutions of the equations of motion.

Quote:
 Originally Posted by Max S. We can't know every detail about real experiments, but theoretical thermodynamic systems can be fully defined by the laws of physics. This is the central premise of strong physicalism as we discussed in the Dualism thread. There's no reason to bet, I can look at the description of a theoretical system and tell you exactly how it is. Are you to say a fully defined theoretical system has no entropy? How is this consistent with Clausius's definition of entropy as a function of heat flow and temperature? Is it impossible to fully define a theoretical system? Are you denying local reality? How is that compatible with monistic physicalism? ~Max
I have no idea what any of that is supposed to mean. A thermodynamic system is defined (theoretically or experimentally) as one in which we use macroscopic, averaged quantities in order to describe what would otherwise be an impossible amount of data by few variables. If you're wanting to describe it in terms of microscopic physics, you're leaving the thermodynamic level.

Quote:
 Originally Posted by Max S. When there's a law in my personal science of thermodynamics, that means absolutely no violations. The definition of entropy in Max S.'s thermodynamics is given by the equation: ΔS = ∫ δQ/T, where δQ is the net heat flow and T is temperature.
Which is the same as in statistical mechanics. It's just that statistical mechanics shows us that dS >= 0 no longer always holds.

The Wikipedia article shows you how to derive it from statistical physics. The fact that this equation is derivable, rather than having to be postulated, means that statistical mechanics is the more fundamental science.
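As a toy version of the derivation's starting point only: take Boltzmann's S = k_B ln Ω literally and count arrangements, with k_B set to 1 and the "system" being N two-state coins. This is a deliberate simplification for illustration, not the article's full derivation.

```python
import math

# Boltzmann's S = k_B * ln(Omega) with k_B = 1, for N two-state 'coins':
# Omega(n) = number of microstates with n heads (a binomial coefficient).
N = 100
S = [math.log(math.comb(N, n)) for n in range(N + 1)]

peak = max(range(N + 1), key=lambda n: S[n])
print(f"entropy peaks at n = {peak} of {N}; S(0) = {S[0]:.2f}, S(50) = {S[50]:.2f}")
```

The single all-heads arrangement has S = ln 1 = 0, while the 50/50 macrostate has overwhelmingly many realizations, which is the counting fact behind "many more ways to realize 50% heads, 50% tails".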

Quote:
 Originally Posted by Max S. To me it seems everybody is redefining entropy and the second law of thermodynamics. I take the laws as defined by Clausius, Kelvin, and Carathéodory and find no contradiction.
Sure. You may also take the laws of Newton, and find no contradiction. You can write down all manner of consistent theories, it's just that nature has no need to oblige you. And nature doesn't: the laws of thermodynamics, just as the laws of Newton, are valid exactly only in their proper domain.

Quote:
 Neither do I yet understand why everybody redefined entropy to begin with, or why the second law of thermodynamics as expressed in a corollary at the end of Clausius's paper should still hold true after changing the definition of entropy. ~Max
There is no change, merely an explanation. Entropy was introduced as a phenomenological quantity; a more accurate, more inclusive theory was found that explains what that quantity is.

But this isn't going to lead anywhere, is it? I've shown you quotes, arguments, and experiments, all of which explicitly disagree with you, and yet, to you, it still 'seems like' the second law should hold inviolable. You don't give any reason for that, beyond your own belief.

So let's try something else. I'm going to try breaking things down as much as possible, so we can figure out where you actually don't understand what I'm saying. Let's start with a simple toy example.
1. Suppose you have a million coins.
2. At every time-step t, a subset k of those coins are flipped.
3. Which coins are flipped depends on which coins were flipped in the previous step, and how they landed.
4. Coin-flipping is a perfectly deterministic process: given perfect knowledge of the initial state, the outcome would be exactly predictable.
5. It is nevertheless meaningful to talk about a coin-flip coming up heads with 50% probability, since we never have that perfect knowledge.
6. At any time-step, roughly half of the k coins are going to come up heads.
7. Nevertheless, it is possible for all of them to come up tails.
9. Each time-step is going to drive the system closer to 50% heads, 50% tails, on average.
10. Suppose, for instance, that heads and tails are colored black and white, respectively, and you can only observe the aggregate color; then, the system is going to get closer to 'grey' over time.
11. There is exactly one way to realize the initial state.
12. There are many more ways to realize the state '50% heads, 50% tails'.
13. Virtually all the time, we're going to see a state that's away from '50% heads, 50% tails' evolve towards that state.
14. Nevertheless, occasionally, at some time-step, all of the coins (or a sizable majority of them) are going to come up heads.
15. Occasionally, if we wait long enough, we're going to see the system evolve away from '50% heads, 50% tails'.
16. These departures can take on an arbitrary length---two time-steps in which there is an evolution into the other direction may occur one after the other.
17. Waiting long enough, we're going to see the system get as far from '50% heads, 50% tails' as we want.
18. Suppose we had, after watching the system for some time, formulated a law: 'the system will always become more grey over time'.
19. Alternatively, we could say that the quantity G = 1 - (1 - 2K)^2, where K is the fraction of coins showing heads, always tends to the maximum. This quantity assumes its maximum at K = 1/2, and is zero both for K = 0 (all white) and K = 1 (all black), and thus captures what we mean by 'the system will always become more grey over time', at least in that respect.
20. This law, even though we had thought it to be exact, in fact only holds on average.
21. If we wait long enough, we will observe violations of this law. Sometimes, the system will become more white, or more black; equivalently, sometimes, a time-step will lead to a decrease of G.

If you disagree with any of the above, please tell me exactly with what, and why.
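The numbered toy model above can be simulated directly. The sketch below simplifies it: 10,000 coins instead of a million, and which coins get flipped is simply random rather than determined by the previous step (points 2-3). It tracks the greyness G = 1 - (1 - 2K)^2 and counts the steps on which G decreases, i.e. "violations" of the greyness law.

```python
import random

random.seed(0)  # reproducible

# Simplified version of the coin toy model: start 'all black' (all heads)
# and flip a random-sized, randomly chosen subset of coins each step.
N, steps = 10_000, 200
heads = N
decreases = 0
G_prev = 1 - (1 - 2 * heads / N) ** 2      # G = 0 initially
for _ in range(steps):
    k = random.randrange(1, N // 10)       # how many coins get flipped
    # each flipped coin was heads with probability heads/N (with-replacement
    # approximation, fine for a sketch):
    flipped_heads = sum(random.random() < heads / N for _ in range(k))
    heads += (k - flipped_heads) - flipped_heads   # tails->heads minus heads->tails
    G = 1 - (1 - 2 * heads / N) ** 2
    if G < G_prev:
        decreases += 1
    G_prev = G

print(f"final K = {heads/N:.3f}, steps where G decreased: {decreases}/{steps}")
```

The system relaxes to K near 1/2 as the 'law' predicts, yet a sizable share of individual time-steps move G downward, which is exactly the point of items 20-21: the law holds on average, not step by step.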
#136
05-08-2019, 02:30 AM
 Guest Join Date: Feb 2006 Location: Shanghai Posts: 8,939
Quote:
 Originally Posted by Dr. Strangelove So what if there's no contradiction? Classical thermodynamics is false. It doesn't describe the universe. Temperature, heat, entropy, etc. aren't some goo that permeates the universe; they're emergent properties of an ensemble of particles. If there were no granularity to the universe--if it were smooth all the way down and temperature were a real property of things--then classical thermodynamics would be fine. But it's not, and so statistical mechanics is the real description, and with it comes the fluctuation theorem and the possibility that the 2LoT will only be true on average.
Agreed, but I would say we don't need to say it's false per se:
We can have different scientific models for the same phenomenon that are at different levels of abstraction or precision.

If I am calculating where a cannon ball will land, I'm not going to go to the level of electron orbitals, because it would take forever to calculate and the chance of a quantum phenomenon happening on a macro scale (e.g. the whole ball quantum tunnelling) is absurdly small. For this kind of calculation Newtonian mechanics is probably the right choice.
Meanwhile the laws of thermodynamics are the right fit for almost all experiments and real world applications. But not all.

Last edited by Mijin; 05-08-2019 at 02:34 AM.
#137
05-08-2019, 09:10 AM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by DPRK I never said any such thing, and neither does the paper !? So there may be a misunderstanding. Again, one way to think about it is that if you have a particle subjected to Brownian motion in a fluid, and you pull on it by moving a laser with constant velocity, then some of the time the force does positive work on the particle, and some of the time it does negative work, but the former is exponentially more probable than the latter. Far from purporting to contradict this (or any related) law, the experiment exactly confirms it.
The paper in question is titled "Experimental Demonstration of Violations of the Second Law of Thermodynamics for Small Systems and Short Timescales" and Half Man Half Wit clearly cited it as an experiment demonstrating violations of the second law. From the paper:

Quote:
 In other words, the [fluctuation] theorem predicts appreciable and measurable violations of the Second Law for small systems over short timescales... In this letter we demonstrate and quantitatively confirm the predictions of the FT for transient systems [3] by experimentally following the trajectory of a colloidal particle in an optical trap.
~Max
#138
05-08-2019, 09:12 AM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by Dr. Strangelove So what if there's no contradiction? Classical thermodynamics is false. It doesn't describe the universe. Temperature, heat, entropy, etc. aren't some goo that permeates the universe; they're emergent properties of an ensemble of particles. If there were no granularity to the universe--if it were smooth all the way down and temperature were a real property of things--then classical thermodynamics would be fine. But it's not, and so statistical mechanics is the real description, and with it comes the fluctuation theorem and the possibility that the 2LoT will only be true on average.
Sorry, I meant to say I found no contradiction between those laws and the thought experiments or real experiments in this thread. If a thermodynamic system is so small that it has no thermodynamic properties, such that you can't even theoretically assign temperature or heat, it doesn't make sense to apply the laws of thermodynamic systems at that level, does it? If such a small system exists without those properties, it isn't really a thermodynamic system.

But a single molecule of gas in the smallest of boxes could be a thermodynamic system.

~Max

Last edited by Max S.; 05-08-2019 at 09:16 AM.
#139
05-08-2019, 09:58 AM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
Originally Posted by Half Man Half Wit
Quote:
 Originally Posted by Max S. That is exactly my definition of thermal equilibrium - zero net heat flow between the system and its surroundings. ~Max
That's indeed the case if a system is in equilibrium. However, it's a necessary, not sufficient condition for equilibrium: my coffee in a perfect thermos has no net heat flow to the environment; yet, it's not in equilibrium with it.
I don't understand, why is your coffee not in equilibrium with its surroundings?

Quote:
 Originally Posted by Half Man Half Wit I didn't say that the momentum spontaneously changes to pointing in the other direction. I said that it's a valid state for the atom to move to the left at v, it's a valid state for it to move to the right at v. Both will be solutions of the equations of motion.
Loschmidt's paradox as you described it involves negating the velocity of particles. I thought you provided this paradox to show a violation of the second law of thermodynamics. But there is no violation, because an isolated system subject only to reversible processes has zero change in entropy, by my definition. The point is, Loschmidt's paradox does not contradict Max S.'s second law of thermodynamics. How can you say my definition of entropy is the same as yours, when using your definition of entropy invites a paradox? No matter which definition is correct, you must admit that they are different.

Quote:
 Originally Posted by Half Man Half Wit A thermodynamic system is defined (theoretically or experimentally) as one in which we use macroscopic, averaged quantities in order to describe what would otherwise be an impossible amount of data by few variables. If you're wanting to describe it in terms of microscopic physics, you're leaving the thermodynamic level.
I don't understand why laws of thermodynamics cease to apply at microscopic levels. So long as you can define heat and temperature you can define entropy and the second law of thermodynamics should hold for a system consisting of one atom (closed but not isolated) or two atoms (isolated).

~Max
#140
05-08-2019, 10:30 AM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by Half Man Half Wit Which is the same as in statistical mechanics. It's just that statistical mechanics shows us that dS >= 0 doesn't always hold any more. The wikipedia article shows you how to derive it from statistical physics. The fact that this equation is derivable, rather than having to be postulated, means that statistical mechanics is the more fundamental science.
I'm having trouble understanding how Wikipedia justifies its definition of temperature (step 1), since the article it links to doesn't show a similar form. Neither do I know what an eigenstate is. I can keep studying these, but it will take me some time to make any sense of it.

~Max
#141
05-08-2019, 10:51 AM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528

## Re: Flipping coins

At step 5. you diverge from classical dynamics into statistical mechanics, and any pronouncement of a law thereafter (18) is not a proper physical law. You have disproved a law that I never defended. The laws of physics don't care whether you know all the details or not; a law is still inviolable, and a single actual violation disproves the law. The process that determines k is salient to any law, and applying the analogy to thermodynamics, there is no process k which can flip a coin from white to black without flipping exactly one other coin from black to white, and vice versa. Thus the average color of all coins never changes. That is the second law of thermodynamics applied to your hypothetical.

~Max
#142
05-08-2019, 11:06 AM
 You mean he's STILL here? Charter Member Join Date: Apr 1999 Posts: 25,715
IANAPhysicist, but it seems to me the Laws of Physics don't have any legal standing. If any of them were "routinely" violated, wouldn't the physics community simply stop calling it a Law?
#143
05-08-2019, 01:00 PM
 Guest Join Date: Dec 2006 Location: the hypersphere Posts: 475
It's historical verbiage.
#144
05-08-2019, 01:19 PM
 Guest Join Date: May 2016 Posts: 3,033
Quote:
 Originally Posted by Max S. The paper in question is titled "Experimental Demonstration of Violations of the Second Law of Thermodynamics for Small Systems and Short Timescales" and Half Man Half Wit clearly cited it as an experiment demonstrating violations of the second law. From the paper:
Please revisit the quote from Searles and Evans in Post #74, or simply observe that the (slightly misleading?) title refers to "small systems and short timescales" which is the precise opposite of the thermodynamic limit. But why lose sleep over the provocative title when the mathematical formulation is clear, and is related to a very general phenomenon?
#145
05-08-2019, 01:52 PM
 Guest Join Date: Jun 2007 Posts: 6,672
Quote:
 Originally Posted by Max S. At step 5. you diverge from classical dynamics into statistical mechanics, and any pronouncement of laws thereafter (18) are not proper physical laws.
This is both wrong and irrelevant. Irrelevant, because I wasn't talking about classical dynamics, and hence, couldn't well diverge from them---I was talking about coins. Wrong, because it's completely well-defined to talk about probabilities in classical physics. A die, once thrown, is completely describable classically. Yet, it's meaningful to say that it comes up any given number 1/6th of the time. This you can take as a shortening of the following statement: 'one sixth of all possible evolutions of the die end with it showing any given number'. As you already accept that one can choose initial conditions randomly, it follows that with probability 1/6th, we have chosen an initial condition that'll lead to it showing, say, a three, once it comes to rest.

Anyway, I presume that, if that was your only objection, you accept the conclusion---that, observing the system (talking just about the coins here), one might formulate a law to the effect that 'greyness only increases', and further, that this law can be violated?
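The 'one sixth of all possible evolutions' reading can be caricatured in a few lines: the "die" below is a perfectly deterministic function of its initial condition, yet over uniformly sampled initial conditions each face shows up close to 1/6 of the time. The particular map is an arbitrary stand-in for real dice dynamics, invented for this sketch.

```python
import random

random.seed(42)  # reproducible sampling of initial conditions

# A deterministic 'die': the outcome is a fixed function of the initial
# condition x0, but small changes in x0 change the outcome.
def deterministic_die(x0):
    """Outcome fixed entirely by the initial condition x0 in [0, 1)."""
    return int(x0 * 6000) % 6 + 1

counts = [0] * 6
trials = 60_000
for _ in range(trials):
    counts[deterministic_die(random.random()) - 1] += 1

freqs = [c / trials for c in counts]
print([round(f, 3) for f in freqs])   # each close to 1/6
```

Nothing random happens once x0 is fixed; the 1/6 arises entirely from the measure of initial conditions leading to each face, which is the claim under discussion.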
#146
05-08-2019, 01:55 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by DPRK Please revisit the quote from Searles and Evans in Post #74, or simply observe that the (slightly misleading?) title refers to "small systems and short timescales" which is the precise opposite of the thermodynamic limit. But why lose sleep over the provocative title when the mathematical formulation is clear, and is related to a very general phenomenon?
I don't think the underlying fluctuation theorem is clear at all, because it doesn't make sense to me to compare the statistical entropy of a system at two different times. The probability distribution of microstates at time t1 should be different from the equiprobability assumed at t0.

Neither do I think the second law of thermodynamics is statistical in nature, nor that it only applies for large systems. So long as a system has heat, temperature, and pressure, it should be a thermodynamic system subject to the laws of thermodynamics.

~Max
#147
05-08-2019, 02:52 PM
 Guest Join Date: May 2016 Posts: 3,033
Quote:
 Originally Posted by Max S. I don't think the underlying fluctuation theorem is clear at all, because it doesn't make sense to me to compare statistical entropy of a system at two different times. The probability distribution of microstates at time t1 should be different than the equiprobability assumed at t0. Neither do I think the second law of thermodynamics is statistical in nature, nor that it only applies for large systems. So long as a system has heat, temperature, and pressure, it should be a thermodynamic system subject to the laws of thermodynamics. ~Max
OK, so you are wondering how entropy can be produced in a system that is not at equilibrium? The precise details depend on the dynamics of the (possibly many-body) system in question and can be tricky to work out, but one can see what happens in various models: for instance, there could be a stochastic noise term (as in Brownian motion), or your system can otherwise be modelled as a Markov process incorporating some randomness, or it may arise when your system satisfies chaotic/mixing/ergodic properties. These are good questions, and people have written books on the subject.
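A minimal instance of the 'model it as a Markov process' route: a two-state chain with a symmetric per-step switching probability. The chain and its parameters are an invented toy, not a model from any paper cited in the thread. Its Gibbs entropy S = -Σ p ln p relaxes towards the maximum ln 2.

```python
import math

# Toy two-state Markov chain with symmetric switching probability a.
# All numbers are made up for illustration.
def gibbs_entropy(p):
    """S = -sum p ln p for a two-state distribution (p, 1 - p)."""
    return -(p * math.log(p) + (1 - p) * math.log(1 - p))

a = 0.1    # per-step probability of switching state
p = 0.99   # start far from equilibrium (state 1 almost certain)
for _ in range(40):
    p = p * (1 - a) + (1 - p) * a    # one step of the master equation
print(f"p = {p:.4f}, S = {gibbs_entropy(p):.4f}, ln 2 = {math.log(2):.4f}")
```

The randomness in the transition rule is what drives the distribution, and hence this entropy, towards its maximum, which is the mechanism the post describes.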
#148
05-08-2019, 03:55 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by DPRK OK, so you are wondering how entropy can be produced in a system that is not at equilibrium? The precise details depend on the dynamics of the (possibly many-body) system in question and potentially tricky to work out, but one can see what happens in various models, for instance there could be a stochastic noise term (like in Brownian motion), or your system can otherwise be modelled as a Markov process incorporating some randomness, or it may arise when your system satisfies chaotic/mixing/ergodic properties. These are good questions, and people have written books on the subject.
If statistical and classical entropy really are the same thing, classical non-equilibrium thermodynamics says a heat flow δQ implies an entropy change dS. There's no reason to get into stochastic or chaotic microscopic models to make that point.

Actually I was wondering how entropy can increase (or decrease!) in an isolated system as Half Man Half Wit seems to imply it does.

~Max
#149
05-08-2019, 04:02 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528
Quote:
 Originally Posted by Half Man Half Wit This is both wrong and irrelevant. Irrelevant, because I wasn't talking about classical dynamics, and hence, couldn't well diverge from them---I was talking about coins. Wrong, because it's completely well-defined to talk about probabilities in classical physics. A dice, once thrown, is completely describable classically. Yet, it's meaningful to say that it comes up any given number 1/6th of the time. This you can take for a shortening of the following statement: 'one sixth of all possible evolutions of the dice end with it showing any given number'. As you already accept that one can choose initial conditions randomly, it follows that with probability 1/6th, we have chosen an initial condition that'll lead to it showing, say, a three, once it comes to rest. Anyway, I presume that, if that was your only objection, you accept the conclusion---that, observing the system (talking just about the coins here), one might formulate a law to the effect that 'greyness only increases', and further, that this law can be violated?
I meant to say your analogy of flipping coins ceased to apply to the current discussion at step 5.

I will concede that one might formulate a law to the effect that 'greyness only increases', and further, that law can be violated. What is your point?

~Max
#150
05-08-2019, 04:14 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 528

## Dice logic

Quote:
 Originally Posted by Half Man Half Wit Wrong, because it's completely well-defined to talk about probabilities in classical physics. A dice, once thrown, is completely describable classically. Yet, it's meaningful to say that it comes up any given number 1/6th of the time. This you can take for a shortening of the following statement: 'one sixth of all possible evolutions of the dice end with it showing any given number'. As you already accept that one can choose initial conditions randomly, it follows that with probability 1/6th, we have chosen an initial condition that'll lead to it showing, say, a three, once it comes to rest.
It is meaningful to say the die has a 1/6 probability of turning up any particular number. It is meaningful to say the die will turn up any particular number about 1/6 of the time over many trials. But neither of these statements follows from the premises. You never did fill in my syllogism:
• Max's die is a six-sided die.
• ?
• Therefore, Max's die is a die with a 1/6 probability of landing on any particular side.

I think the minor premise should read: "A six-sided die is a die with a 1/6 probability of landing on any particular side".

Then we have another blank here:
• Max's die is a die with a 1/6 probability of landing on any particular side.
• ?
• Therefore, Max's die is a die that will land on a particular side 1/6 of the time.

~Max
