Is the second law of thermodynamics routinely violated?

In science, a “law” has no exceptions. Occasionally, one hears a claim that some “law” or other is being violated, but invariably the true explanation eventually comes to light and negates the claim.

The possible internal states of a system should be restricted by the properties of the system. This is a correlation made at every instant. The actual cause for the molecule’s behavior should be determined by previous interactions.

~Max

I’m sorry, but that flew right over my head. How is he defining “spins” and “entropy” and what does that have to do with the second law of thermodynamics? And I don’t understand the CD or dye example at all. :frowning:

~Max

Note that TWO quite unrelated sorts of Second Law violations (or apparent violations) are being discussed here.

Further adding confusion is the use of two different definitions of entropy. One formula is dS = δq/T which relates entropy to heat and temperature. Another formula, used by Penrose and other theoretical physicists, is S = k log V where V is the volume of a macrostate in phase space. The first formula is a special case of the second.
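
To see how the two hang together (a sketch of my own, not from the thread): take an ideal gas of N molecules expanding isothermally from volume V1 to V2. Both definitions give the same entropy change.

```latex
% Boltzmann/Penrose side: the spatial part of the phase-space volume
% scales as (box volume)^N, so
\Delta S = k \ln\frac{V_2^{N}}{V_1^{N}} = N k \ln\frac{V_2}{V_1}

% Clausius side: a reversible isothermal expansion absorbs heat
% Q = N k T \ln(V_2/V_1), and integrating dS = \delta q / T gives
\Delta S = \frac{Q}{T} = N k \ln\frac{V_2}{V_1}
```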

I think that in Mr. Wit’s example the molecules are all inside a closed system, but one molecule moves from one region to another within that system.

Penrose gives examples of systems which appear to be in large boring “macrostates” but which in fact have useful information. In the CD example, apparently random imperfections can be transformed into a Hollywood movie, so disorder would seem to decrease when the disk is played. But this is just because macrostates are too fuzzy a notion to define easily.

Disclaimer: I am unqualified. If there are no major errors in the preceding please give me six brownie points!

Septimus has it right: I was talking about a single system where, in order to realize this sort of thing, the movements of individual molecules would somehow have to be restricted by the overall macrostate. Picture a gas, homogeneously distributed in a box; in order to ensure that it stays that way, and does not, very rarely, bunch up in one corner, something would have to direct the movements of individual molecules so as to prohibit such a thing from occurring. That’s basically magic: the molecule itself might not interact with anything, but freely drift through space, when all of a sudden it is turned away, lest the corner get too full.
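
To put a rough number on “unlikely” (a back-of-envelope sketch of my own, not from the thread): treating the molecules as independent, the chance of finding all N of them in one chosen half of the box at a given instant is (1/2)^N.

```java
// Back-of-envelope: probability that N independent molecules all happen
// to sit in one chosen half of the box at a given instant.
public class FluctuationOdds {
    public static void main(String[] args) {
        for (int n : new int[] {2, 10, 100}) {
            System.out.printf("N = %3d:  P = (1/2)^N = %.3e%n", n, Math.pow(0.5, n));
        }
        // For a macroscopic sample, N ~ 1e23, so log10(P) ~ -N * log10(2):
        System.out.printf("N = 1e23: log10(P) = %.3e%n", -1e23 * Math.log10(2));
    }
}
```

Nonzero, but so small that you would wait many ages of the universe to see it. That is the whole point: not forbidden by magic, just suppressed by statistics.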

And again, I would like to point out that basically everybody agrees on this. Boltzmann himself, who pretty much came up with this picture, considered the possibility of random fluctuations into lower-entropy states. Why are you so invested in that not occurring?

Picture again the gas, trillions and trillions of molecules zipping around. What should prevent their velocities, after who knows how many collisions, from aligning such that they point in more or less the same direction? All that happens is little billiard balls caroming off one another. Unlikely? Of course. But calling it impossible would entail something entirely mysterious occurring, some invisible guiding hand turning away excess molecules.

The two sentences are actually in direct contradiction. If only previous interactions determine the molecule’s behavior at any given time, then there’s nothing stopping it from moving such that (in concert with others) the entropy is lowered.

If, however, you insist on macroscopic properties restricting microscopic motion, then there needs to be some determining factor of the motion of a particle that goes beyond just its past interactions.

Picture a billiard table with a single ball (and we’re doing physics here, so the billiard ball is a point mass and there is no friction). Set it in motion. If you haven’t taken care to set up some simple periodic path, it’ll eventually visit every point on the table, with none being particularly distinguished.

Now, add a second ball. The two balls can interact by careening off one another.

Divide the table into two halves. Suppose you can only distinguish which half of the table a ball is in. There are more ways for both balls to be in separate halves than there are for both to be in the same half. Just mentally chop the table into ball-sized areas: if there are four such areas (two in each half), there are two ways for both balls to be in a given half, and eight ways for them to be in separate halves. Increasing the number of areas increases the disparity.
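
A minimal enumeration of that four-cell picture confirms the count (my sketch, not anything from the thread): cells 0 and 1 form the left half, cells 2 and 3 the right half, and the two distinguishable balls A and B occupy distinct cells.

```java
// Enumerate all placements of two distinguishable balls A and B into
// four ball-sized cells (0,1 = left half; 2,3 = right half), one per cell.
public class CountMicrostates {
    public static void main(String[] args) {
        int bothLeft = 0, bothRight = 0, split = 0;
        for (int a = 0; a < 4; a++) {          // cell occupied by ball A
            for (int b = 0; b < 4; b++) {      // cell occupied by ball B
                if (a == b) continue;          // at most one ball per cell
                boolean aLeft = a < 2, bLeft = b < 2;
                if (aLeft && bLeft) bothLeft++;
                else if (!aLeft && !bLeft) bothRight++;
                else split++;
            }
        }
        System.out.println("both in left half:  " + bothLeft);   // 2
        System.out.println("both in right half: " + bothRight);  // 2
        System.out.println("separate halves:    " + split);      // 8
    }
}
```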

What you’re saying now amounts to this: if a ball already occupies one half, the other can’t enter that half anymore. That is, whenever it threatens to cross the boundary, it must somehow get turned away, without interacting with the other ball or the walls.

Now, just increase the number of balls. Just set a mighty lot of them adrift on the table. They’ll bump into each other, and simply by virtue of there being many more ways to be roughly evenly distributed, they’ll spend most of their time that way.

But there’s nothing about the way they’re bouncing off of each other that forbids them from, occasionally, bunching up in one half of the table—and occasionally, that’s what they will do. That’s all there is to it. Anything else would require intervention by some mysterious agency we have no reason at all to believe should exist.

Maxwell’s demon? I saw one on the subway last week, I swear! Or was that a Tasmanian devil? Norwegian something?

This example is much easier for me to understand.

I assume the walls of the table are perfect and reflect 100% of the energy, and there is no friction from air or the tabletop. The billiard table itself, by virtue of being a perfect box, is an isolated system. Therefore the table expresses but one macrostate, as you defined it. There are always N balls on the billiard table, and the sum kinetic energy of all the billiard balls remains a constant no matter how many times they collide. No energy flows into or out of the system - the table remains in equilibrium with its surroundings the entire time. That is consistent with the first and second laws of thermodynamics.

Then you divide the table into two halves and call each half its own thermodynamic system. You observe the table and find the majority of the billiard balls to be on one half, and then the other, and then the first half again, etcetera. You could say each half of the table represents a thermodynamic system, although the two halves are clearly not in equilibrium.

Let us freeze time to examine the billiard table. At this moment there are exactly N/3 billiard balls on the near half of the table and 2N/3 billiard balls on the far half. Now let us unfreeze time for just a moment, and we see that there are 0 billiard balls on the near half and N billiard balls on the far half. Has the second law of thermodynamics been violated?

Under Clausius’s form of the law, heat cannot pass from a colder body to a warmer body without some other change occurring. If we redefine heat as a function of the billiard balls in a system, heat has been transferred from the near half to the far half. But at the same time, there has been a change - namely, in the gross translational momentum of the billiard balls in the far half (accounting for perfect bouncing against the three closed walls of the table). Because we have done away with friction and other such things, every collision of billiard balls is perfectly elastic - kinetic energy is always conserved. It follows that, if at least one billiard ball crosses from one half to the other, the two systems will never reach equilibrium and heat will forever fluctuate from one half to the other.
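
As a sanity check on the “kinetic energy is always conserved” point, here is a sketch of my own using the standard 1D head-on elastic-collision formulas (the masses and velocities are made up):

```java
// Verify numerically that a perfectly elastic collision conserves both
// kinetic energy and momentum, using the standard 1D head-on formulas.
public class ElasticCollision {
    public static void main(String[] args) {
        double m1 = 1.0, m2 = 2.0;     // arbitrary masses
        double v1 = 2.5, v2 = -1.0;    // arbitrary initial velocities
        double keBefore = 0.5 * m1 * v1 * v1 + 0.5 * m2 * v2 * v2;
        double pBefore = m1 * v1 + m2 * v2;
        double u1 = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2);  // post-collision velocities
        double u2 = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2);
        double keAfter = 0.5 * m1 * u1 * u1 + 0.5 * m2 * u2 * u2;
        double pAfter = m1 * u1 + m2 * u2;
        System.out.printf("KE: %.6f -> %.6f%n", keBefore, keAfter);  // identical
        System.out.printf("p:  %.6f -> %.6f%n", pBefore, pAfter);    // identical
    }
}
```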

Lord Kelvin’s formulation is equivalent to Clausius’s.

Carathéodory’s formulation is never invoked, because the billiard ball moving from the near side to the far side is not an adiabatic process as far as the near-side or far-side thermodynamic system is concerned.

Also, it is possible to imagine a billiard table with N billiard balls constantly moving while the number of billiard balls on each half of the table remains constant. For example, each billiard ball could bounce perpendicular to the long edges of the table, therefore never entering the other half (see the sketch below). Another situation would be one where each billiard ball moves parallel to the long edges of the table and collides with an opposite ball at the exact midpoint of the table. Another situation would be where each ball moves parallel to the long edges of the table, but none ever collide with each other, and each ball is “paired” with another which, on a different parallel, is always equidistant from the midline of the table. There are myriad other ways to make it work without invoking magic, besides the perfect initial state and suspension of friction, etc.
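
That first scenario is easy to check in a toy simulation (my own sketch, with made-up positions and speeds): if every velocity is purely perpendicular to the long edges, the x coordinates never change, so the count of balls on each half is a constant of the motion.

```java
// Toy check: balls moving only in y (perpendicular to the long edges)
// bounce off the long rails forever, and their x coordinates, and hence
// the per-half counts, never change. Balls on different parallels with
// vx = 0 never collide, consistent with the scenario above.
public class PerpendicularBounce {
    public static void main(String[] args) {
        double width = 2.0, height = 1.0;             // table: x in [0,2], y in [0,1]
        double[] x  = {0.3, 0.7, 1.2, 1.8};           // two balls start in each half
        double[] y  = {0.1, 0.5, 0.9, 0.4};
        double[] vy = {0.31, -0.17, 0.23, -0.41};     // arbitrary y-speeds; vx = 0 for all
        double dt = 0.01;
        for (int step = 0; step < 100_000; step++) {
            for (int i = 0; i < x.length; i++) {
                y[i] += vy[i] * dt;
                if (y[i] < 0)      { y[i] = -y[i];             vy[i] = -vy[i]; }  // bottom rail
                if (y[i] > height) { y[i] = 2 * height - y[i]; vy[i] = -vy[i]; }  // top rail
            }
        }
        int left = 0;
        for (double xi : x) if (xi < width / 2) left++;
        System.out.println("balls in left half after simulation: " + left);  // still 2
    }
}
```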

~Max

I’m not sure what you’re saying, but if you start with a bunch of billiard balls or ideal gas molecules on a table/in a box, they fly across the middle of the table/box all the time and don’t start bouncing perpendicular to the sides. And if the system is in equilibrium then the temperature is everywhere the same and not “forever fluctuating” from one (arbitrary, imaginary?) half of the box to another. Moreover, even if you start with the gas confined to one half of the box and let it freely expand, equilibrium will be achieved pretty quickly (not “never”), the temperature won’t even change, but the entropy will increase, completely in accordance with the second law of thermodynamics.
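
For what it’s worth, here is a toy version of that free expansion (my own sketch, not the applet linked earlier): non-interacting particles start in the left half of a box and just bounce off the walls. The left-half fraction settles near 1/2 almost immediately and then merely fluctuates; the corresponding entropy change is ΔS = N k ln 2 > 0, even though the temperature never changes.

```java
import java.util.Random;

// Toy free expansion: non-interacting particles start in the left half of
// a 1D box [0,1] and bounce ballistically off the walls. The left-half
// fraction relaxes to about 0.5 very quickly, then merely fluctuates.
public class FreeExpansion {
    public static void main(String[] args) {
        Random rng = new Random(42);
        int n = 1000;
        double[] x = new double[n], vx = new double[n];
        for (int i = 0; i < n; i++) {
            x[i] = rng.nextDouble() * 0.5;       // start confined to the left half
            vx[i] = rng.nextDouble() * 2 - 1;    // random velocity in [-1, 1]
        }
        double dt = 0.05;
        for (int step = 0; step <= 200; step++) {
            if (step % 40 == 0) {
                int left = 0;
                for (double xi : x) if (xi < 0.5) left++;
                System.out.printf("t = %5.2f  left-half fraction = %.3f%n",
                        step * dt, (double) left / n);
            }
            for (int i = 0; i < n; i++) {
                x[i] += vx[i] * dt;
                if (x[i] < 0) { x[i] = -x[i];    vx[i] = -vx[i]; }  // left wall
                if (x[i] > 1) { x[i] = 2 - x[i]; vx[i] = -vx[i]; }  // right wall
            }
        }
    }
}
```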

Also, to be clear, in these “macroscopic” considerations we are dealing with lots of very small particles, on large timescales.

Blue. Lovely plumage.

The point was, such a system can exist with billiard balls bouncing perpendicular to the walls. Ideal gas molecules assume random behavior and distributions by definition. Billiard balls do not, but the simulation you linked to starts off with pseudorandom positions and momenta anyway.

```java
// ...
Molecule m = new Molecule();
// ...
m.x = getrand(winSize.width*10)*.1;            // pseudorandom x position
m.y = getrand(areaHeight*10)*.1+upperBound;    // pseudorandom y position
m.dy = java.lang.Math.sqrt(1-m.dx*m.dx);       // dy chosen so (dx, dy) has unit length
if (getrand(10) > 4)
    m.dy = -m.dy;                              // sign of dy randomized
```

Also I thought a system could be in equilibrium (with its surroundings) without all possible internal subsystems being in equilibrium.

~Max

Rereading this and your previous post, I was using equilibrium in the sense of mutual thermodynamic equilibrium between at least two systems. Not internal equilibrium.

Although that last part about billiard balls bouncing perpendicular to the walls would count as the billiard table being in a state of internal and external thermodynamic equilibrium.

~Max

It seems to me that no matter how many billiard balls you put on the table, if all collisions are perfectly elastic and at least one ball crosses the geometric midline of the table just one time, that thermodynamic system will never fully reach internal thermodynamic equilibrium between the two halves of the table. This is assuming perfectly elastic walls and no other forces such as friction, gravity, electromagnetic forces, etcetera.

~Max

There is a reason they call it “statistical” mechanics. You can talk about a thermodynamic limit in which the number of particles grows effectively infinite, but in your example of billiard balls and the second law of thermodynamics this isn’t even relevant. Consider even a single billiard ball (or 2, or 3, or whatever you prefer) bouncing around on half of, or an entire, billiard table. The (Gibbs) entropy of this system does not fluctuate. And when you remove the barrier across the middle of the table, it gets bigger. As for your observation that there might be some special or periodic trajectories with all the billiards bouncing in concert in some way, these have measure zero and can be neglected; nor would such fine structure survive in realistic physical systems.
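
To spell out the definition being used there (my gloss, not the poster’s words): the Gibbs entropy is a functional of the probability distribution ρ over phase space, not of any single trajectory, which is why it does not fluctuate as the balls rattle around.

```latex
S_G = -k \int \rho \,\ln\rho \; d\Gamma
```

For ρ uniform over an accessible region of phase-space volume V, this reduces to S_G = k ln V, i.e. Penrose’s formula from upthread. Removing the barrier enlarges the accessible region, so S_G goes up and then stays put.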

Anyway, in short, no, the second law of thermodynamics is not routinely violated, especially not by (ideal or real) gases expanding in containers.

A law in physics is a description of behavior - under these circumstances, the system will behave in this manner. Someone can state a proposed law of behavior, with that law being incorrect or incomplete. That doesn’t make it not a law; it makes it not a valid law.

That is the position that Max S. is trying to defend.

No. The number of macrostates the table can be in is the number of distinguishable configurations (balls in different areas). Assume you have two, very crude, detectors—say, scales weighing each half of the table. They can tell you how many balls are on each side, but not their precise configuration. Then, a tuple (n balls left, k balls right) is a macrostate.

Suppose now that there are in fact four places each ball can be in. A microstate then is a quadruple specifying what occupies each of the top left, bottom left, top right, and bottom right areas of the table.

Assume there are two balls on the table, balls A and B. The macrostate (2,0) (both balls in the left half) could come about in two ways, corresponding to the microstates (A,B,0,0) or (B,A,0,0). The macrostate (1,1), on the other hand, has eight realizations: (A,0,B,0), (A,0,0,B), (0,A,B,0), (0,A,0,B), and four more with A and B switched. This is a higher-entropy state.

But, there’s nothing—absolutely nothing—that keeps the state (A,0,0,B) (say) from evolving into (A,B,0,0). A just lies there, and B transitions from bottom right to bottom left. That’s all.

Again, no: I’m considering the table to be a single system.

Which is why I added this:

(In fact, this is just the thing about the equiprobability of microstates again.)

And I’d really like your opinion on these two quotations from Ludwig Boltzmann:

(Both are from Boltzmann’s paper ‘On Zermelo’s Paper “On the Mechanical Explanation of Irreversible Processes”’.)

Do you think he’s wrong here?

I admit, in my profound ignorance, I had never heard about Boltzmann’s physics before yesterday. I’ve heard of the Boltzmann brain, though. I spent part of last night trying to read his Vorlesungen über die Principe der Mechanik [1]. It’s been slow going, since I don’t read German.

From what I’ve read of your quotes, Boltzmann is also assuming a random distribution of possible internal states and redefining both thermodynamic equilibrium and the second law of thermodynamics to work upon such a basis.

[1] Boltzmann, L. (1904). Vorlesungen über die Principe der Mechanik (II. Theil). Leipzig: Johann Ambrosius Barth. Available on the Internet Archive.

Specifically, I don’t think he is right to say a system possessing absolute internal thermodynamic equilibrium can fluctuate. He would need to redefine thermodynamic equilibrium to mean “almost thermodynamic equilibrium” or “to the best of our measurements, thermodynamic equilibrium, but not actually so”.

Neither am I convinced that the entropy of an isolated system will ever decrease over time. I cannot derive this without assuming, as a premise, that microscopic reality is determined by stochastic rather than corporeal processes.

~Max

I am of the opinion that, by measuring subdivisions within a system, you have divided that system into two sub-systems. The laws of thermodynamics hold for whatever level of detail you want, so long as it is consistent. You can observe heat flow within a system even when there is no heat flow in or out of it; it is affirming the consequent to say the second law is violated. The second law of thermodynamics only concerns net heat flow to and from a system, as if a Carnot engine existed along the border. It does not apply to the internal state of a system until you define that internal state as two or more sub-systems.

~Max