Is the second law of thermodynamics routinely violated?

Maybe you and I have different classical definitions of entropy. I quote: “∫ dQ/T = 0”. Here dQ is the increment of heat exchanged with external reservoirs, and T is the temperature. Heat moving around within a system does not count towards dQ; therefore entropy doesn’t change no matter how the particles are configured.
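To illustrate what I mean, here is a minimal sketch in Python (my own toy numbers, not from any source) checking that a reversible Carnot cycle satisfies Σ Q/T = 0, where Q is the heat exchanged with the external reservoirs, signed from the system’s point of view:

[CODE]
# Classical check: for a reversible Carnot cycle, the Clausius sum of Q/T over
# all heat exchanged with external reservoirs vanishes. Q is signed heat
# received by the working gas; the adiabats exchange no heat at all.
import math

R = 8.314                       # gas constant, J/(mol K)
n = 1.0                         # moles of ideal gas
T_hot, T_cold = 500.0, 300.0    # reservoir temperatures, K
V1, V2 = 1.0, 2.0               # isothermal expansion volumes, m^3

Q_hot = n * R * T_hot * math.log(V2 / V1)     # heat absorbed at T_hot
Q_cold = -n * R * T_cold * math.log(V2 / V1)  # heat rejected at T_cold

print(f"sum of Q/T over the cycle = {Q_hot / T_hot + Q_cold / T_cold:.2e}")  # ~0
[/CODE]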

Clausius went on to say there were such things as irreversible processes, which is in my opinion wrong, and refined the formula to an effective ∫ dQ/T > 0 for those. Even so, both formulas are true.

Clausius said entropy of a closed system can only stay the same or increase. I say it never changes at all. You seem to be saying that it can increase or decrease. Somebody has to be wrong about something.

I’ve reproduced the relevant section of his paper in the spoiler below[1].

[SPOILER]According to this, the second fundamental theorem in the mechanical theory of heat, which in this form might appropriately be called the theorem of the equivalence of transformations, may be thus enunciated:
If two transformations which, without necessitating any other permanent change, can mutually replace one another, be called equivalent, then the generation of the quantity of heat Q of the temperature t from work, has the equivalence-value
Q/T,
and the passage of the quantity of heat Q from the temperature t[SUB]1[/SUB] to the temperature t[SUB]2[/SUB], has the value
Q(1/T[SUB]2[/SUB] - 1/T[SUB]1[/SUB]),
wherein T is a function of the temperature, independent of the nature of the process by which the transformation is effected.

If, to the last expression, we give the form
Q/T[SUB]2[/SUB] - Q/T[SUB]1[/SUB],
it is evident that the passage of the quantity of heat Q, from the temperature t[SUB]1[/SUB] to the temperature t[SUB]2[/SUB], has the same equivalence-value as a double transformation of the first kind, that is to say, the transformation of the quantity Q from heat, at the temperature t[SUB]1[/SUB] into work, and from work into heat at the temperature t[SUB]2[/SUB]. A discussion of the question how far this external agreement is based upon the nature of the process itself would be out of place here; but at all events, in the mathematical determination of the equivalence-value, every transmission of heat, no matter how effected, can be considered as such a combination of two opposite transformations of the first kind.

By means of this rule, it will be easy to find a mathematical expression for the total value of all the transformations of both kinds, which are included in any circular process, however complicated. For instead of examining what part of a given quantity of heat received by a reservoir of heat, during the circular process, has arisen from work, and whence the other part has come, every such quantity received may be brought into calculation as if it had been generated by work, and every quantity lost by a reservoir of heat, as if it had been converted into work. Let us assume that the several bodies K[SUB]1[/SUB], K[SUB]2[/SUB], K[SUB]3[/SUB], &c., serving as reservoirs of heat at the temperatures t[SUB]1[/SUB], t[SUB]2[/SUB], t[SUB]3[/SUB], &c., have received during the process the quantities of heat Q[SUB]1[/SUB], Q[SUB]2[/SUB], Q[SUB]3[/SUB], &c., whereby the loss of a quantity of heat will be counted as the gain of a negative quantity of heat; then the total value N of all the transformations will be
N = Q[SUB]1[/SUB]/T[SUB]1[/SUB] + Q[SUB]2[/SUB]/T[SUB]2[/SUB] + Q[SUB]3[/SUB]/T[SUB]3[/SUB] + &c. = Σ Q/T
It is here assumed that the temperatures of the bodies K[SUB]1[/SUB], K[SUB]2[/SUB], K[SUB]3[/SUB], &c. are constant, or at least so nearly constant, that their variations may be neglected. When one of the bodies, however, either by the reception of the quantity of heat Q itself, or through some other cause, changes its temperature during the process so considerably, that the variation demands consideration, then for each element of heat dQ we must employ that temperature which the body possessed at the time it received it, whereby an integration will be necessary. For the sake of generality, let us assume that this is the case with all the bodies; then the foregoing equation will assume the form
N = ∫ dQ/T,
wherein the integral extends over all the quantities of heat received by the several bodies.

If the process is reversible, then, however complicated it may be, we can prove, as in the simple process before considered, that the transformations which occur must exactly cancel each other, so that their algebraical sum is zero.

For were this not the case, then we might conceive all the transformations divided into two parts, of which the first gives the algebraical sum zero, and the second consists entirely of transformations having the same sign. By means of a finite or infinite number of simple circular processes, the transformations of the first part must admit of being made in an opposite manner, so that the transformations of the second part would alone remain without any other change. Were these transformations negative, i.e. from heat into work, and the transmission of heat from a lower to a higher temperature, then of the two the first could be replaced by transformations of the latter kind, and ultimately transmissions of heat from a lower to a higher temperature would alone remain, which would be compensated by nothing, and therefore contrary to the above principle. Further, were those transformations positive, it would only be necessary to execute the operations in an inverse manner to render them negative, and thus obtain the foregoing impossible case again. Hence we conclude that the second part of the transformations can have no existence.

Consequently the equation
∫ dQ/T = 0
is the analytical expression of the second fundamental theorem in the mechanical theory of heat.[/SPOILER]

~Max

[1] Clausius, R. (1856, August). On a Modified Form of the Second Fundamental Theorem in the Mechanical Theory of Heat. London, Edinburgh and Dublin Philosophical Magazine and Journal of Science, Series 4, 12(77), 92-98. Retrieved from the Internet Archive.

Emphasis mine. The heat transfer is between sub-systems which were never isolated to begin with. The classical entropy of the overall system is unaffected.

And I thought we said microscopic particles still had to obey classical laws of physics? The movement of particles would not be fundamentally random; it would be absolutely determined by the microstate at any previous instant. You can theoretically create such a lopsided box of cold and hot gases, but only by the demon expending energy outside the system to make it so orderly, and expending energy again to remove the barrier.

~Max

You are missing a premise, which I believe to be equiprobability of microstates at every instant, regardless of previous microstates.

[ul][li]There are more high-entropy* states than low-entropy states.[/li][li]?[/li][li]Therefore, any given change in microstate is more likely to increase entropy.[/li][/ul]
~Max

Thermodynamics does not give me a probability distribution for anything; that would be statistical mechanics operating on top of thermodynamic observations. The problem is that you are assigning probability 1/6 to each outcome of a die roll when the underlying classical physics is purely deterministic. That is what makes statistical mechanics a set of heuristics rather than physical laws, so far as strict causality is concerned.

And it could very well be that the fundamental reality is random, but it was my impression from the other thread and your posts here that you do not subscribe to that idea, or at least do not present that argument now.

~Max

Well, that’s a settled question: theory predicts both increase and decrease, and both are experimentally well confirmed.

This is simply wrong. The ‘classical’ entropy (which, again, is the very same thing as the entropy of statistical mechanics, so there’s a distinction that doesn’t make a difference) of a system with an unequal distribution of heat is lower than that of a system in equilibrium.

Which is not in conflict with anything I said. Giving a statistical description of a system does not entail that its behavior must be fundamentally probabilistic. A die roll is completely deterministic: at the instant it leaves your hand, it’s already determined what number it will show. That doesn’t mean saying ‘it will show 6 with probability 1/6’ is meaningless.

There’s nothing missing. Take the die; say that four of the numbers it can show are high-entropy states. Then, each die roll is more likely to end up in a high-entropy state, simply by virtue of there being more of them. That’s in no sense in conflict with the fact that the trajectory of the die is completely deterministic once it’s left your hand.
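If it helps, here is a minimal sketch of that point in Python (a toy of my own construction, not anything from the literature): the ‘roll’ below is a perfectly deterministic function of its initial condition, yet with four of six faces designated high-entropy, about two thirds of uniformly distributed initial conditions land on a high-entropy face:

[CODE]
# A deterministic 'die': the outcome is a fixed function of the initial
# condition; probability enters only through our ignorance of that condition.
import random

def roll(initial_condition):
    # Stands in for the (deterministic, but practically unpredictable)
    # flight of a real die: same input, same face, every time.
    return int(initial_condition * 6) + 1

HIGH_ENTROPY_FACES = {1, 2, 3, 4}   # four of the six faces, by stipulation

trials = 100_000
hits = sum(roll(random.random()) in HIGH_ENTROPY_FACES for _ in range(trials))
print(f"fraction ending in a high-entropy state: {hits / trials:.3f}")  # ~2/3
[/CODE]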

What I meant is that thermodynamics gives you an effective, ultimately heuristic picture. Statistical mechanics connects this heuristic picture with the exact fundamental dynamics, and tells us how it comes about. At the microphysical level, there are no quantities like temperature or pressure; each is just an effective quantity that derives from more fundamental quantities of the underlying system, such as the kinetic energy of the individual particles. Entropy, likewise, emerges as an effective accounting of the number of states of the more fundamental system that realize the same state of the approximate, coarse-grained thermodynamic picture.
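As a rough illustration of what ‘effective quantity’ means here (a sketch with arbitrary numbers, not any particular system): at the micro level there are only particle velocities, and ‘temperature’ is just a statistic computed from them via equipartition, ⟨½mv²⟩ = (3/2)k[SUB]B[/SUB]T:

[CODE]
# 'Temperature' as an effective quantity: at the micro level there are only
# particle velocities; T is a statistic computed from their kinetic energies.
import random

k_B = 1.380649e-23   # Boltzmann constant, J/K
m = 6.6e-27          # particle mass, kg (roughly a helium atom)
T_true = 300.0       # temperature we secretly sample from, K

# Maxwell-Boltzmann: each velocity component is Gaussian, variance k_B*T/m.
sigma = (k_B * T_true / m) ** 0.5
N = 100_000
vels = [(random.gauss(0, sigma), random.gauss(0, sigma), random.gauss(0, sigma))
        for _ in range(N)]

# Equipartition: <(1/2) m v^2> = (3/2) k_B T, hence T = m <v^2> / (3 k_B).
mean_v2 = sum(vx * vx + vy * vy + vz * vz for vx, vy, vz in vels) / N
print(f"T recovered from microscopic velocities: {m * mean_v2 / (3 * k_B):.1f} K")
[/CODE]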

Your argument is still an enthymeme, and quite possibly a quaternio terminorum too:

[ul][li]This die has more marked sides than unmarked sides.[/li][li]?[/li][li]Therefore, a die roll is more likely to show a marked side.[/li][/ul]
~Max

The theory of statistical mechanics, maybe. But not in classical thermodynamics, to my knowledge.

I had just come to the conclusion that our definitions of classical entropy are different, with cites from the man who coined the word “entropy” (although not in that paper). The definition I cited is a state function on state variables. The definition you give is a function of microscopic variables. According to my definition, entropy will never change (or at least never decrease) in an isolated system. In your definition entropy fluctuates. How can you still say your definition is the same, with all of these differences?

~Max

We should get together for a game of craps sometime.

Because thermodynamics is just an approximate theory, and thus, doesn’t always apply. That’s what’s meant by ‘violation of the second law’.

Simple: Clausius was wrong about entropy, and you still are. When he came up with the whole thing, he thought it had certain properties, like being always non-decreasing; later on, a more complete theory was found, and it turns out that entropy doesn’t actually have those properties. That’s science: theories are formulated, overturned, and refined.

That something can be expressed in terms of state variables doesn’t preclude that same thing being expressed on the basis of microscopic variables—just as how temperature can be expressed as the average molecular energy, as well as in terms of internal energy and entropy. Those aren’t two different things, but merely two different ways of talking about the same thing. Like the morning star and the evening star both refer to Venus, even though, quite clearly, morning and evening are different things. You can’t logically pry them apart; and you can’t logically pry thermodynamic entropy and the entropy of statistical mechanics apart.

I would like to wave that paper, and all variants of the fluctuation theorem, away as straw-man arguments. The fluctuation theorem only contradicts a statistical formulation of the second law of thermodynamics: the entropy formulation with a statistical definition of entropy.

Now that paper was a bit dense for a layman like myself, but my reading is that they suspended tiny latex particles in a solution with a somewhat even distribution. Then they pointed a laser beam at one specific particle, the particle moves towards the laser beam due to micro-forces, and they measure the refraction of the laser beam coming out the other end. From this they calculate the trajectory of the particle and say, look at this! For a second there, if we move a particle with a laser the entropy of the solution as a system decreases. Therefore, the second law of thermodynamics is violated on this tiny scale for a short period of time!

I mean, of course you can physically move particles to states of lesser (statistical) entropy. It takes work to do so, but I’m not sure if they accounted for the powering of the laser and adjusted their heat sinks accordingly. They didn’t give the temperature of the solution or the heat sinks, but it would be nearly impossible to keep internal equilibrium after imposing an artificial temperature gradient. Not that internal equilibrium is necessary for the second law of thermodynamics…

Unless you are using the statistical definition of entropy and the entropy formulation of the second law of thermodynamics. In that case the law only holds absolutely true in an isolated system when the internal motions of atomic particles are exactly periodic, or when the temperature of the system is zero kelvin. In all other cases the statistical entropy of a system will fluctuate, and this is the extent of any conclusions I can support from that paper.

As I keep saying, this is all well and good within the theory of statistical mechanics, but it would seem that statistical mechanics assumes a priori that every microstate has equal probability at every moment, regardless of previous microstates. If that were not so, there would be no probability distributions to base fluctuation theorems upon. That means every law of dynamics is reduced from an absolute law to “very probable”. The same argument the paper makes about the second law of thermodynamics could be used to show that individual particles are merely unlikely to jump across my desk faster than the speed of light in a vacuum. Or rather, that the laws of physics are violated all the time, but the chances of it having any noticeable effect are so low that it may as well be impossible. But I don’t believe this is the position you take.

~Max

  1. What are you saying is the problem with the experiment in that paper, or the theorem they proved?

  2. It’s hard for me to parse what you are saying here but on one hand, yes, as I mentioned there are certain “assumptions” about systems (like your gas in a box) tending towards thermal equilibrium under certain conditions, and on the other hand at the same time there is research on setting up systems involving eg ultracold atoms that resist thermalization and exhibit quantum scarring and similar phenomena. It goes to show you that there are still interesting things to do in condensed matter physics.

If by “approximate… thus, doesn’t always apply” you mean the classical second law of thermodynamics is probabilistic, I strongly disagree with you. I am of the opinion that the second law of thermodynamics, as I learned it in secondary school, has never been and can never be violated, theoretically or experimentally. In your example of three particles hopping between three boxes within a system, random hopping is impossible, as it contradicts not only the laws of thermodynamics but also the speed of light in a vacuum and strict causality. In the example with billiard balls, I pointed out that the second law is not violated because no heat is transferred external to the table as a system; billiard balls moving across the middle of the table do not contradict the second law because neither half of the table is an isolated system. That is, a ball moving across the table changes the aggregate velocity of all the billiard balls, which satisfies the second law of thermodynamics.

I think he was wrong about irreversible processes, but the formulations I cited are still proper physical laws, to my knowledge. Would you care to explain why you disagree? What is the more complete theory?

You are saying they are the same thing, but I do not understand how you came to this conclusion. I’m still working through Boltzmann’s lectures (in German), between passages of Consciousness Explained. Since Boltzmann defined “statistical” entropy, I hope he will clear things up for me. My guess is that he assumes the equiprobability of microstates at every instant, regardless of microstates at previous instants. That is the axiom you rejected.

~Max

I’m not a physicist and have no background in physics or science so there is a good chance that I have no idea what I’m talking about. But it seemed that they pointed a laser at a particle in the water, thus moving the particle. The laser constitutes work and the movement of the particle constitutes heat. When calculating entropy using the classical definition, you are supposed to consider external work as external heat[1]. It doesn’t seem like this was accounted for in the experiment, which makes sense because they were using a different definition of entropy.

[1] “… every such quantity received may be brought into calculation as if it had been generated by work, and every quantity lost by a reservoir of heat, as if it had been converted into work.”
Clausius, R., [POST=21623293]supra at #101, in spoiler[/POST].

I think I agree with you. Maybe? It seems to me that the authors of the paper in question also assumed the equiprobability of microstates at every instant. I’m not sure what quantum scarring or condensed matter physics are though.

~Max

No; the fluctuation theorem is a consequence of entropy being, fundamentally, a statistical quantity (not ‘statistical entropy’, but entropy; there’s only one such quantity, and it’s the same one as in classical thermodynamics, which we just these days understand better than they did back then).

I haven’t had a chance to look at the paper in detail, but I don’t think that’s right. The laser doesn’t move the bead; it acts as an optical trap, keeping it in place. From there, you can calculate how much the bead should move around (oscillate around the mean position), and if it moves around more than that, it’s received energy (heat) from the environment, as a statistical effect, thus violating the second law.
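I haven’t reproduced their actual setup, but the idea is easy to sketch in a toy model (an overdamped bead in a harmonic trap dragged at constant speed; every parameter value below is made up): over short observation times an appreciable fraction of trajectories has negative entropy production, i.e. the bead transiently draws energy from the bath, and that fraction shrinks as the observation time grows:

[CODE]
# Toy version of a dragged-optical-trap experiment (all values made up):
# an overdamped bead in a harmonic trap moving at constant speed v.
# Entropy production over [0, t] is sigma = (k*v/kBT) * integral (x_trap - x) dt;
# the fluctuation theorem implies P(sigma < 0) > 0, shrinking as t grows.
import random

k = 1.0        # trap stiffness (arbitrary units)
gamma = 1.0    # drag coefficient
kBT = 1.0      # thermal energy of the bath
v = 0.5        # trap speed
dt = 1e-3      # integration time step

def entropy_production(t_obs):
    """Simulate one trajectory; return its dimensionless entropy production."""
    x, x_trap, sigma = 0.0, 0.0, 0.0
    noise = (2 * kBT * dt / gamma) ** 0.5
    for _ in range(int(t_obs / dt)):
        x_trap += v * dt
        sigma += (k * v / kBT) * (x_trap - x) * dt
        x += (k / gamma) * (x_trap - x) * dt + noise * random.gauss(0, 1)
    return sigma

for t_obs in (0.05, 0.5, 2.0):
    runs = [entropy_production(t_obs) for _ in range(2000)]
    frac = sum(s < 0 for s in runs) / len(runs)
    print(f"t = {t_obs}: fraction of entropy-consuming trajectories = {frac:.3f}")
[/CODE]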

It doesn’t, though. It’s assumed that each initial condition consistent with a certain macrostate is as likely as any other; from there, since the vast majority of all possible microstates leads to evolutions that visit each possible state of the system (ergodicity), we can derive that whenever we ‘look’ at the system, it’s as likely to be in one microstate as it is to be in another. But the microscopic succession of microstates is perfectly deterministic.
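A toy demonstration of that last step (my own sketch, using the Chirikov standard map rather than a gas): the dynamics below is perfectly deterministic and area-preserving, yet an ensemble of initial conditions spread uniformly over the phase space is still uniform whenever you look:

[CODE]
# Deterministic, area-preserving dynamics (the Chirikov standard map):
# a uniform ensemble of initial conditions stays uniform under evolution,
# even though each individual trajectory is fully fixed by its start.
import math, random

K = 7.0                  # kick strength (strongly chaotic regime)
TWO_PI = 2 * math.pi

def evolve(theta, p, steps):
    for _ in range(steps):
        p = (p + K * math.sin(theta)) % TWO_PI
        theta = (theta + p) % TWO_PI
    return theta, p

# Uniformly distributed initial conditions, deterministically evolved:
ensemble = [(random.uniform(0, TWO_PI), random.uniform(0, TWO_PI))
            for _ in range(20_000)]
evolved = [evolve(th, p, 50) for th, p in ensemble]

# Coarse-grain theta into 8 bins: each occupancy stays near 1/8.
bins = [0] * 8
for th, _ in evolved:
    bins[min(int(th / TWO_PI * 8), 7)] += 1
print([round(b / len(evolved), 3) for b in bins])
[/CODE]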

It’s nice that you have an opinion, it’s just that your opinion is contradicted by both physical theory and experiment, at which point, in particular because you have no justification for your opinion beyond ‘but I really think it’s like that’, it is reasonable to revise your opinion.

This is confused. Heat can never be transferred to an isolated system, because if you can transfer heat to a system, it’s not isolated anymore. For heat transfer, two systems must be in contact; and the two halves of the billiard table are two systems in contact, just as, e. g., two halves of a volume of gas after you’ve removed some divider between them (a common example in thermodynamics) are.

The more complete theory is statistical mechanics—that’s simply a matter of logic, since you can use it to derive all the predictions (in the proper limit) of thermodynamics, but also, additional effects, which are in fact experimentally observed. Statistical mechanics subsumes and extends thermodynamics. There is no question about that.

Because they agree across the domain to which both apply. Consider relativistic speed. Wherever the Newtonian approximation applies, deviations between the relativistic notion and the classical one due to Newton are so slight as to be imperceptible. Thus, in the proper limit, relativity reproduces all the predictions of Newtonian mechanics.

However, relativity makes additional predictions in the case where the Newtonian theory no longer applies. In particular, relativistic speed has a maximum value, which it can’t exceed. The Newtonian theory knows no such limit; but it would be spurious to say that therefore, there are really two notions of speed, one Newtonian, and one relativistic. You can’t say, but my spaceship flies with Newtonian speed, and so, can go as fast as it likes!

But that’s exactly what you’re saying regarding entropy. The relativistic notion of speed has revised and extended the Newtonian one, which is now understood to be only (approximately) valid in the regime of the very small (absolute value); the statistical notion of entropy has revised and extended the classical thermodynamical one, which is now understood to be only (approximately) valid in the regime of the very large (number of particles).
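The parallel can be made concrete with the textbook velocity-addition formula (the numbers below are just examples): at everyday speeds the relativistic rule is indistinguishable from naive addition, but it never lets a combined speed exceed c, a limit the Newtonian rule knows nothing about:

[CODE]
# Relativistic velocity addition vs. the Newtonian rule it supersedes.
C = 299_792_458.0  # speed of light, m/s

def add_newton(u, v):
    return u + v

def add_relativistic(u, v):
    return (u + v) / (1 + u * v / C**2)

# Everyday speeds: the two rules agree to one part in ~10^14.
print(add_newton(30.0, 30.0), add_relativistic(30.0, 30.0))

# Near-light speeds: only the relativistic rule respects the maximum.
u = v = 0.9 * C
print(add_newton(u, v) / C)        # 1.8 c: the Newtonian rule knows no limit
print(add_relativistic(u, v) / C)  # ~0.994 c: always capped below c
[/CODE]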

He doesn’t. Nobody ever did.

Maybe you should replace both with Seager’s Theories of Consciousness: early on, he discusses the notion of ‘intrusions from below’, where the more fundamental notions make themselves apparent at a coarse-grained level. One of his examples is, naturally, thermodynamics and its relation to statistical mechanics:

Seager’s example is Loschmidt’s paradox: Loschmidt pointed out that, for any entropy-increasing evolution of a physical system consisting of an aggregate of many particles, one can obtain an entropy-decreasing one by simply multiplying all the particle momenta by -1. This is dynamically just as valid as the original evolution.

If one assumes that the past is (for whatever reason) in a low-entropy state, there is no dilemma here: we’re still overwhelmingly likely to observe a steady increase of entropy. But it does point out that there are perfectly physically allowable evolutions of a system that decrease entropy.

Furthermore, it also shows that by considering the microscopic dynamics, you can infer how, e. g., heat gets transferred to a hotter body: just take the molecules involved in, e. g., some convective process, and invert their momenta. And lo and behold, suddenly heat flows in the ‘wrong’ direction. But of course, we knew already that this must, on occasion, be possible.
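Loschmidt’s reversal is easy to watch in a toy model (my own sketch; the particles don’t interact, so it’s only an illustration of the principle): let a gas that starts in the left half of a box spread out, then flip every velocity, and the coarse-grained entropy marches right back down:

[CODE]
# Loschmidt's reversal in a toy gas: non-interacting particles in a box with
# reflecting walls. Coarse-grained entropy rises as the gas spreads out, then
# falls back to its initial value after every velocity is multiplied by -1.
import math, random

N, L, BINS, dt = 5000, 1.0, 10, 0.01

def step(xs, vs):
    for i in range(N):
        xs[i] += vs[i] * dt
        while xs[i] < 0 or xs[i] > L:             # reflect off the walls
            if xs[i] < 0: xs[i], vs[i] = -xs[i], -vs[i]
            if xs[i] > L: xs[i], vs[i] = 2 * L - xs[i], -vs[i]

def coarse_entropy(xs):
    """S = -sum p ln p over spatial bins (Boltzmann counting, k_B = 1)."""
    counts = [0] * BINS
    for x in xs:
        counts[min(int(x / L * BINS), BINS - 1)] += 1
    return -sum(c / N * math.log(c / N) for c in counts if c > 0)

xs = [random.uniform(0, L / 2) for _ in range(N)]  # start in the left half
vs = [random.gauss(0, 1) for _ in range(N)]
print("initial:         S =", round(coarse_entropy(xs), 3))   # ~ln 5 = 1.609

for _ in range(100): step(xs, vs)
print("after spreading: S =", round(coarse_entropy(xs), 3))   # ~ln 10 = 2.303

vs = [-v for v in vs]                              # Loschmidt's reversal
for _ in range(100): step(xs, vs)
print("after reversal:  S =", round(coarse_entropy(xs), 3))   # back to ~1.609
[/CODE]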

IIRC that experiment was designed to probe non-equilibrium steady-state conditions, and that’s exactly what they did.

There is no “different definition” of entropy, just the usual nuances in defining state variables for a nonequilibrium system. There are also other experiments studying entropy in complicated nonequilibrium conditions if you are interested.

I probably misunderstand something. I thought the optical trap worked because the bead was attracted to the laser beam. If they aren’t using the laser to move the bead and reduce entropy, it seems they are just capturing a moving particle in a tractor beam and seeing how much it wiggles before its momentum dies down and it stops moving. Sort of like tossing a marble into a bowl. I’m not sure how this counteracts the second law, because one does not simply shoot a laser into a system and declare that an isolated system has spontaneously reduced entropy. I could just as easily drop a triangle in the middle of our billiard table and point out how unlikely it is that three billiard balls are always observed in the middle of the table.

The difference between relativity and statistical mechanics to me is that I know and accept the results and conclusions of experiments confirming relativity’s superiority over other theories of motion. These are the Michelson-Morley and Ives-Stilwell experiments as well as Mercury’s orbit and observations of solar eclipses.

~Max

Then this is the root of our disagreement. I have not yet reached the conclusion that the two definitions of entropy are the same, nor have I concluded that statistical mechanics extends thermodynamics. To me they appear as two different ways of describing the same phenomena: thermodynamics is more accurate (theoretically, perfectly so), but statistical mechanics is easier to calculate with.

~Max

If I measure (statistical) entropy at time t[SUB]0[/SUB] then measure entropy again at time t[SUB]1[/SUB], can I compare the two measurements? I think if only the initial microstate is random, it is inappropriate to compare two measurements of statistical entropy. Each measurement of entropy assumes that all microstates are equally probable by virtue of having the number of microstates Ω in the definition. I am assuming that when calculating entropy for an isolated thermodynamic system, Ω remains constant over time. Otherwise statistical mechanics loses a lot of its usability as we would have to calculate the microscopic dynamics of every state to determine Ω for any time t[SUB]n>0[/SUB].
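To make my worry concrete, here is the counting I have in mind (a toy sketch in Python, using the usual particles-in-two-halves example and taking k[SUB]B[/SUB] = 1). If Ω means the number of microstates compatible with the macrostate observed at each measurement, it changes between measurements; if it means the number of all accessible microstates, it is constant:

[CODE]
# Boltzmann counting for N distinguishable particles in two halves of a box.
# Macrostate: n = number of particles in the left half. Omega(n) counts the
# microstates realizing that macrostate; S = ln Omega (taking k_B = 1).
import math

N = 100

def S(n_left):
    omega = math.comb(N, n_left)   # microstates compatible with this macrostate
    return math.log(omega)

# Omega, and hence S, depends on which macrostate the system is in:
for n_left in (50, 60, 75, 100):
    print(f"n_left = {n_left:3d}:  S = {S(n_left):7.3f}")
[/CODE]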

~Max

I don’t yet understand how entropy in statistical mechanics supersedes classical entropy, to me they seem to describe entirely different things. If they are different it makes no sense to use the statistical definition in the entropic formulation of the second law of thermodynamics, which was derived using the classical definition of entropy.

The point is, I don’t see how that experiment contradicts the second law of thermodynamics as cited in the original post here. The “law” the paper purports to contradict is not the second law of thermodynamics, in my opinion.

~Max