Is the second law of thermodynamics routinely violated?

Either you have redefined macro- and micro-states, or I misunderstand you. Going by the definition in post #6 (top quote), I thought the macrostate of the billiard table would be, for example, the number of billiard balls on the table. So we could have a state of 5 billiard balls, or a state of 3 billiard balls, etcetera. Microstate would be the arrangement of individual particles (billiard balls), for example, giving each ball x and y coordinates and momentum. Entropy is then the number of microstates.

If we are to measure the number of billiard balls on the left side of the table, that number is neither a property of the table nor a property of an individual billiard ball. Thus, I conclude you have designated the left and right sides of the table as their own systems, and the two of them combined comprise the billiard table as a whole. Here is the hierarchy of systems and states, with their properties in brackets:

[ul][li]Top level. Billiard table { 3 balls, equilibrium with surroundings }[/li][li]Mid level. Left side of the table { 2 balls, not in equilibrium with surroundings }, Right side of the table { 1 ball, not in equilibrium with surroundings }[/li][li]Fundamental level. Billiard ball A { x position < 0.5, y position ?, momentum ? }, Billiard ball B { x position < 0.5, y position ?, momentum ? }, Billiard ball C { x position > 0.5, y position ?, momentum ? }[/li][/ul]
~Max

@Max could you please rephrase your question about the billiard balls?

Entropy is not “the number of microstates”, if that’s the issue. (It is proportional to the integral over phase space of -p(a) log p(a), where p(a) is the probability of state a.)
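For concreteness, here is that definition in its discrete form, S = -Σ p(a) ln p(a), as a minimal Python sketch (in units where k_B = 1). The uniform case recovers Boltzmann’s S = ln W, which is presumably where “the number of microstates” comes from; a non-uniform distribution over the same states gives less entropy, so the count alone isn’t the whole story:

```python
import math

def gibbs_entropy(probs):
    """Discrete form of -sum p(a) log p(a), in units where k_B = 1."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# A uniform distribution over W microstates recovers Boltzmann's S = ln W:
W = 8
uniform = [1 / W] * W
print(gibbs_entropy(uniform))   # equals math.log(8) up to rounding

# A non-uniform distribution over the same W states has lower entropy,
# which is why counting microstates alone isn't the full definition:
skewed = [0.5] + [0.5 / (W - 1)] * (W - 1)
print(gibbs_entropy(skewed))
```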

Half Man Half Wit’s [POST=21615595]post #17[/POST] and the threaded conversation preceding it imply that the second law of thermodynamics is contradicted by a theoretical billiard table. My responses are to the effect of, “no, it isn’t”.

~Max

Then somebody is misunderstanding something, because the fluctuations (the random caroming of billiard balls on a table) that he or she is talking about do not contradict the second law of thermodynamics.

But doesn’t Maxwell’s Demon turn information into energy, coming to the relief of entropy?

Basically, yes. The energy stays the same or even increases, but the entropy (the missing information) goes down. You can think of a genuine, bona fide Maxwell’s Demon as a combination air conditioner and perpetuum mobile.

OK, so you actually think Boltzmann misunderstood his own theory in a naive way you’re able to spot immediately, but that no scientist in the century since has managed to. That’s confidence, I like it!

So, what do you think about Feynman’s example from his Lectures on Physics, typically considered to be one of the best resources for teaching physics available?

Or, what do you make of the following quote of Maxwell (from his Theory of Heat)?

And what do you make of Eddington’s words in his entertainingly titled “The End of the World: from the Standpoint of Mathematical Physics”?

I don’t intend to bludgeon you over the head with these quotes from the big shots (well, maybe a little, but gently). But I do want to know if this doesn’t at least hint to you that maybe, you’ve got it wrong somehow—that it’s not everybody else (and I could dig up innumerably more similar quotes) who misunderstands, but rather, you? Or, failing that, do you at least recognize that I’m not saying something wildly out there, but merely, what’s been the consensus ever since the statistical mechanics foundation of thermodynamics was proposed?

Well, you haven’t. All you’ve done is introduce what’s often called a coarse-grained set of observables, analogous to thermodynamic quantities such as temperature, volume, and pressure—quantities that take account of the state of the system in aggregated form, without being sensitive to microscopic details.

That’s simply wrong. As the first sentence of the relevant wiki entry puts it:

It says nothing about net heat flow, and it applies to the internal state without dividing anything up. (Of course, you can state the second law in terms of heat flow, but that’s not the only formulation.)

Sorry if I wasn’t clear regarding my definitions. A macrostate takes account of the system in terms of coarse-grained observables, like temperature, pressure, and volume, without being sensitive to the microscopic details.

The coarse-grained observables I have introduced in the billiard ball example are ‘number of billiard balls in the left half’ and ‘number of billiard balls in the right half’. The microstate is given by the arrangement of billiard balls in the subdivided areas.

It’s a property of the system of billiard balls on the table. Temperature, likewise, isn’t a property of the room a gas is in, nor of any individual molecule.

No. I’m just describing the table at different levels of abstraction. The total number of billiard balls gives me relatively little information about the state of the system; the number of billiard balls per half a little more; and the configuration of billiard balls in the subdivided areas yet more. You can view the entropy of a given macrostate as the information we lack about the microstate, if that helps.

That’s not what I’m saying. Rather, if we view the table in terms of the coarse-grained observables I’ve introduced (‘balls per left half’, ‘balls per right half’), this macrostate has a certain entropy depending on the number of microstates that can realize it (the phase space volume it occupies).

By using a model of the billiard table that restricts the available microstates to at most one ball per quarter of the table, we see that the state (‘one ball per half’) has a higher entropy than the state (‘both balls in one half’). The transition from the former to the latter—which is perfectly well possible—then decreases the entropy of the system.

I’m surprised none of the experts gave an answer related to the one I gave in a recent thread. Is the following in error? @ OP — would this have addressed your question?

Unlike the Laws of Newton, Maxwell and Einstein, the Second Law isn’t a dynamical law; it’s just a statistical fact, closely akin to the Law of Large Numbers.

A ten-gallon tank of gas at standard temperature and pressure contains about a septillion molecules — easy to remember :slight_smile: and large enough to qualify for the Large Number Law — but deviations from “entropy maximization” can still occur with non-zero probability just as a matter of simple statistics. (For example, consider 52 — less than a septillion but still largish compared with some integers. If you shuffle and deal four poker hands from a deck of 52, after a septillion trials you can expect a dozen or so occasions where there was a four-way tie for best hand, all royal flushes!)
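The royal-flush figure is easy to check. A minimal sketch, dealing the four hands in sequence (hand 1 must be one of the 4 royal flushes, hand 2 one of the 3 remaining, and so on):

```python
from math import comb

# Probability that four 5-card hands dealt from one shuffled deck are
# all royal flushes, dealing the hands one after another:
p = (4 / comb(52, 5)) * (3 / comb(47, 5)) * (2 / comb(42, 5)) * (1 / comb(37, 5))

septillion = 10**24
expected = p * septillion
print(f"p = {p:.3e}; expected occurrences in a septillion deals: {expected:.1f}")
```

The expected count comes out around sixteen, consistent with “a dozen or so” occurrences in a septillion trials.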

Are you querying me or Max S. ? Of course there are thermal fluctuations, therefore the Boltzmann entropy of the system is also fluctuating, and from one particular moment to the next may decrease.

My only claim is that this does not contradict the second law of thermodynamics, unless of course one formulates it in a way that does not apply to mesoscopic or microscopic systems and then tries to apply it to such a system.

The ball system is a good concrete example of a model where one can calculate everything, which is why I asked Max S. if there were any outstanding questions about that model.

What formulation of the second law are you referring to? Because I think it’s entirely reasonable, as e. g. Maxwell also does in the quote above, to speak of a violation of the second law when entropy fluctuates downward.

The argumentum ad populum certainly makes me wonder whether I have it wrong. That’s why I made this thread: not to convince you, but to convince me. I can’t seem to understand why I am wrong. It seems their arguments all rely on a fundamental premise which I am not yet prepared to accept. As you stated in [POST=21613848]post #8[/POST], this premise is the equiprobability of micro-states.

So far I can accept that fundamental premise as a heuristic, but not as an absolute physical law. The things you implied if the premise were false don’t seem so certain to me, see [POST=21615087]post #13[/POST] and [POST=21615132]#14[/POST] and the related discussions, which are ongoing.

I will add all of those cites to my reading list, but I can’t guarantee when I will get around to reading them. And as always, I will admit my ignorance of modern or even older sciences in an instant.

~Max

I would like to press this issue. When dealing with a thermodynamic system, I thought properties of the system must represent the system as a whole, dependent on the internal state but not necessarily on a specific internal state.

Specifically, you should be able to measure a system property without ascertaining the internal state. Would you agree?

My thought is that the number of balls on the left half of the table belongs to a mid-layer of abstraction. A property of the system need not reveal much specific information about the system’s internal state, in fact it shouldn’t reveal too much. Otherwise I might just as well define the position of ball 1 as a property of the system, the momentum of ball 1, the position of ball 2, etcetera. The more specific we allow system properties to be, the less generic our system becomes and the less useful our conclusions are. In computer science this generic-ness of systems (classes) is called polymorphism.

~Max

This definition is not acceptable to me because, when looking at a system, I cannot draw a hard line between what is a macroscopic property and a microscopic property. Is the weight of paper in a printing machine a macroscopic property of the machine, or a microscopic property? The number of numpad keys on a keyboard? The number of files in the top drawer of my filing cabinet? The aggregate color of paint in the top half of a canister filled with two half-mixed paints?

I propose redefining macroscopic to mean “of the whole” and microscopic to mean “of the constituent parts”. That way the answers are clear-cut.

~Max

I did not include that form of the second law of thermodynamics in the original post. I did this because the definition of entropy I was familiar with, dS = δQ/T, depends on the other form of the second law (Clausius or Kelvin) being true. So I don’t really consider the entropy definition to be the second law of thermodynamics, especially since temperature and therefore entropy is undefined for isolated systems not in equilibrium.

You can use the statistical definition of entropy and claim the entropy form of the second law of thermodynamics is a law. But this is setting yourself up for failure - the statistical definition of entropy depends on a fundamental premise of equiprobable microstates. That same premise is strictly incompatible with the second law of thermodynamics in any form, including the entropy form.

~Max

It’s not a physical law; it’s the absence of a law stating anything else. If you have five colored marbles in a jar, and you draw one, the odds are 1/5 that you’ll get any given color. If you find that this equiprobability is violated, you’ll start looking for a cause: if, say, you draw the red marble 50% of the time, something must make it so that this marble is drawn more often.

We can consider this as a physical system with five states available to it. These states are equiprobable, not because that’s a physical law, but because there’s no law, nor any other cause, that would make them not be.

That’s what I’m trying to get at with the billiard balls. Barring special initial conditions, such as carefully arranged periodic paths (which would themselves be something in need of explanation), whenever you look at the table with a generic starting configuration (that is, balls set in motion with arbitrary initial velocities), you’re equally likely to find it in any of the possible microstates available to it.

If you want to set it up differently, you’d have to add some further element—either carefully set-up initial conditions, or further forces acting on the balls to forbid certain states or change their likelihood, and so on. But with just billiard balls bouncing off each other, you can be certain you’ll observe a decrease in entropy eventually.
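To put a number on that “eventually”: the sketch below isn’t billiard dynamics, just the Ehrenfest urn model as a crude stand-in (one randomly chosen ball crosses the midline per step), with Boltzmann entropy S = ln C(N, n) for the macrostate ‘n balls in the left half’. Even so, for a handful of balls the entropy dips downward on a large fraction of the steps:

```python
import math
import random

random.seed(1)

N = 4                  # a handful of 'billiard balls'
n_left = N // 2        # start in the maximum-entropy macrostate

def S(n):
    """Boltzmann entropy (k_B = 1) of the macrostate 'n balls in the left half'."""
    return math.log(math.comb(N, n))

steps = 10_000
decreases = 0
prev = S(n_left)
for _ in range(steps):
    # crude stand-in for the dynamics: one random ball crosses the midline
    if random.random() < n_left / N:
        n_left -= 1
    else:
        n_left += 1
    cur = S(n_left)
    if cur < prev:
        decreases += 1
    prev = cur

print(f"entropy decreased on {decreases} of {steps} steps")
```

For N = 4 roughly half the steps lower the entropy; for macroscopic N the same fluctuations become utterly negligible in relative size.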

I don’t know what you mean by that. What’s the internal state?

Thermodynamic quantities are aggregate observables, which tell you something about the state of the system, but not all you could know. It’s a statistical description, appropriate when you have limited knowledge about the system.

‘Mid-layer’ simply isn’t a meaningful term here. You can consider any system at different levels of exactness; the quantities you use to describe it, including the entropy, will depend on the level you’re considering it at.

A microscopic property gives you the full information about the system. A macroscopic one doesn’t, but instead, refers to the system in aggregate. That is, the microstate of an ideal gas would be the positions and momenta of all the particles; a macrostate is anything that leaves out some of that information.

[QUOTE]
I propose redefining macroscopic to mean “of the whole” and microscopic to mean “of the constituent parts”. That way the answers are clear-cut.

~Max
[/QUOTE]

Neither ‘whole’ nor ‘parts’ have any objective meaning, though. Is the whole the whole table? If so, are its parts the legs and the top, or the atoms that constitute them? If it’s the atoms, is the whole they form the leg they’re part of, or the table the leg is a part of?

The microscopic definition is the more fundamental one, though. The one usually considered in thermodynamics only really holds in the thermodynamic limit of infinite systems.

Well, if you hold that the second law should, for whatever reason, be always and universally valid, then yes, I suppose that must be the conclusion to come to. But that’s of course question-begging: there’s no reason beyond you wishing it to be so to suppose that the second law should be universally valid. In fact, what Boltzmann did with his formulation of statistical mechanics was to show that the second law emerges as an approximate notion from more fundamental dynamics we were previously ignorant of; given that, it’s no wonder that the second law isn’t universally valid.

In a logical sense there is no default interpretation, every logical truth must be axiomatic or derived from axioms. The equiprobability of microstates is an axiom. Either it is absolute (a law) or it is not (heuristic).

It is fine in philosophy to say reality is fundamentally based on heuristics rather than absolute physics. You can build consistent physics on top of heuristics, but the trade-off is predictive value at the lowest levels. Causality and scientific laws (including laws of conservation) are reduced to heuristics, because at the fundamental level changes in state are random. On the other hand, you can no longer support strong physicalism - the view that all phenomena can be explained by physics. To do so you need to contort the word “explained” to include pointing to this axiom that says it’s fundamentally random. You can do that, but it’s counter-intuitive and leads to… interesting philosophical conclusions.

Is that consistent with your position? With the positions of all those scientists? I thought you advocated a form of strong physicalism in the dualism thread.

~Max

@ Max — I’ve been trying to follow the conversation, but with growing confusion.

Assuming you thought I knew what I was talking about, would that sentence change your viewpoint?

You’re confusing the world and its description there. Axioms, laws and the like don’t make anything happen, they describe what happens.

But anyway. What’s the law that makes it such that the probability of five colored marbles in an urn comes out 1/5 for each color? Nothing beyond their mere number. Likewise with the microstates. You essentially want to say that there needs to be a separate law dictating these probabilities, and that they could just as well come out another way.

But anyway. I’ve actually shown you how the equiprobability can be derived. For an ensemble of billiard balls, for generic initial conditions (i.e., initial conditions not set up to lead to special, periodic motions, which form a measure-zero subset of all possible initial conditions), it will be the case that the time spent in any possible microstate will be the same. Hence, there’s an even chance of finding the system in any given microstate. And that means that entropy will tend to increase if you start in some low-entropy initial state, but may also decrease, with very low probability. That probability can be calculated via the fluctuation theorem:
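The equal-time claim itself can be illustrated with a toy model (again the random-midline-crossing stand-in, not actual billiard dynamics): track which of the 2^N microstates a small system occupies over many steps, and the visit counts come out roughly uniform:

```python
import random
from collections import Counter

random.seed(0)

N = 3                        # three 'billiard balls'
state = (0,) * N             # which half each ball is in: 0 = left, 1 = right
counts = Counter()

steps = 80_000
for _ in range(steps):
    i = random.randrange(N)  # one random ball crosses the midline
    state = state[:i] + (1 - state[i],) + state[i + 1:]
    counts[state] += 1

# all 2**N = 8 microstates are visited about equally often (~steps/8 each)
for microstate in sorted(counts):
    print(microstate, counts[microstate])
```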

The predictions of the fluctuation theorem have actually been tested, both using computer simulations, and in the laboratory:

This isn’t what this leads to, but it’s also wrong. Current physics is by mainstream opinion indeterministic—random at the base. That doesn’t entail anything regarding the possibility of physicalism.

By the way, is your username just a happy accident, or are you ‘Max S.’ exactly because of your belief that entropy always only tends to the max?

If Maxwell says that the Second Law is continually being violated for any sufficiently small group of molecules, then we could take him at his word that what he has in mind strictly applies in the thermodynamic limit where the number of molecules is (effectively) infinite, yet not be surprised that 3 gas particles could all end up on one side of a room. 3 or 1000 is “sufficiently small”, yet we don’t hold our breath (heh) for any detectable decrease of entropy of a roomful of air.

At any rate, I think it is quantitatively clear what is going on, especially for the classical systems statistically studied by Maxwell, Boltzmann, Gibbs, et al.

If I were to assume as premises:
[ul][li]all microstates of a system are equiprobable[/li][li]the entropy of a system is the count of microstates that express a given macrostate[/li][li]the second law of thermodynamics states: “the entropy of an isolated system can never decrease over time”[/li][/ul]
Then the second law of thermodynamics is, in theory, routinely violated in tiny systems. At which point the law is reduced to a statistical fact: “the entropy of an isolated system is unlikely to decrease over time”.
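The “routinely violated in tiny systems” part is easy to make quantitative. Under the equiprobability premise, the chance of catching N independent particles in the extreme fluctuation “all on one side of the box” is 2/2^N, which is everyday business for N = 3 and hopeless for macroscopic N:

```python
# Under equiprobable microstates, each particle is equally likely to be
# in either half, so P(all N in one half, either half) = 2 / 2**N.
probs = {N: 2 / 2 ** N for N in (3, 10, 100)}
for N, p in probs.items():
    print(f"N = {N:3d}: P(all on one side) = {p:.3e}")
```

For three particles the fluctuation shows up a quarter of the time; for even a hundred it is already beyond any realistic observation, which is why the statistical reading costs nothing in practice.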

I can’t wrap my head around the equiprobability of microstates, plus I was using other definitions of the second law of thermodynamics. So I haven’t reached that conclusion yet.

~Max