It is not obvious. If you observe the system for twenty seconds, then no matter where the particle starts out, it will be at the far-left position only once, while it will be observed at the 1-meter mark twice.
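Here is a quick Python sketch of the counting I mean. The specific numbers are my assumptions for concreteness: a 10-meter box, a speed of 1 m/s, the particle starting at the left wall and moving right, and one observation per second, so the 20 samples cover exactly one period.

[CODE]
# Count how often each position is observed over one full 20-second period.
# Assumed setup: box from 0 to 10 m, speed 1 m/s, start at x = 0 moving right,
# position sampled once per second.
L, v, x0 = 10.0, 1.0, 0.0

def position(t):
    """Triangle-wave position of a particle bouncing between 0 and L."""
    s = (x0 + v * t) % (2 * L)
    return s if s <= L else 2 * L - s

samples = [position(t) for t in range(20)]
print("observations at x = 0 m:", samples.count(0.0))   # 1
print("observations at x = 1 m:", samples.count(1.0))   # 2
[/CODE]

Over each full period the two turning points show up once, while every interior meter mark shows up twice.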
And the “gas” is not at internal equilibrium, because we have moving particles within it. That’s why I don’t like thinking of thermal equilibrium as internal thermal equilibrium. I could draw a line at 3 meters and consider the entire system to consist of sub-system A to the left of that line, [0,3], and sub-system B to the right of it, (3,10]. Then at every instant the temperatures of the two subsystems are unequal, the two subsystems are not in thermal equilibrium, and the combined system as a whole is not in a state of internal equilibrium. Not being in equilibrium, its entropy can fluctuate (both ways, if we aren’t assuming the second law).
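With the same assumed numbers (my choice of starting point; the qualitative point doesn’t depend on it), you can check which side of the 3-meter line holds the particle at each observation - the other side is always empty:

[CODE]
# Which sub-system, A = [0,3] or B = (3,10], contains the particle at each second?
# Same assumed setup as before: 10 m box, 1 m/s, start at x = 0 moving right.
L, v, x0 = 10.0, 1.0, 0.0

def position(t):
    s = (x0 + v * t) % (2 * L)
    return s if s <= L else 2 * L - s

sides = ["A" if position(t) <= 3 else "B" for t in range(20)]
print("".join(sides))   # AAAABBBBBBBBBBBBBAAA - one side is always empty
[/CODE]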
That is also what Mr. Eastman told me - I could not substitute dQ_{A∪B}/T_{A∪B} for dS_{A∪B}, because that formula only applies if the combined system is in a state of (internal) equilibrium. I hate bothering him, but I still don’t understand why the entropy formula requires internal equilibrium. Couldn’t we just use an average value for T, weighted by volume, since T is intensive?
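In symbols (my own shorthand; T̄ is the volume-weighted average temperature I have in mind, written here for the two sub-systems A and B):

$$dS_{A\cup B} \overset{?}{=} \frac{\delta Q_{A\cup B}}{\bar T}, \qquad \bar T = \frac{V_A T_A + V_B T_B}{V_A + V_B}$$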
To which his response was the same, and I still don’t understand why the system must be in a state of internal equilibrium to use the entropy formula. It would certainly resolve Loschmidt’s Paradox if the entropy of an isolated system never changed, and I can’t think of any other thermodynamic law it would violate - except possibly the assertion that two otherwise-isolated systems connected by a diathermal wall will reach equilibrium and stay there, but that was already disproven by Carathéodory using Poincaré recurrence. I couldn’t say whether it would violate some law of statistical mechanics.
OK, let’s be precise: the probability of the particle being observed in any interval is proportional to the length of that interval.
But all gases consist of moving particles. “Internal thermal equilibrium” simply means no heat flows between any two subsystems. If the temperature of two subsystems is unequal, then heat flows between them and the system as a whole is not in thermal equilibrium.
And, of course, a system at thermal equilibrium will experience microscopic fluctuations as usual. It is incorrect to say it is not at equilibrium; “equilibrium” does not mean “no motion”.
What reasoning leads you to believe it does not? How exactly do you propose to define the temperature of a system that is not at thermal equilibrium? Don’t say, “use a thermometer,” because one side might be hot and the other cold. And the microscopic definition assumes equilibrium as well.
Look, I honestly think you should carefully work through all the exercises on that web page (or any similar text). You can’t help but encounter all these definitions and concepts.
But the probabilities are not equal for every microstate unless the observation interval is t = 0.
I could just as arbitrarily divide the number line into ten sub-systems: A = [0,1], B = (1,2], …, J = (9,10]. In a real gas it would be tedious, but I could divide the gas into innumerable tiny boxes, each the size of one particle. There would be heat flowing between sub-systems at every time-step. Does this mean “internal thermal equilibrium” depends on how I arbitrarily divide the system into parts?
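Here is the same toy model carved into those ten unit cells (same assumed numbers as before). The cell holding the particle - and with it all of the energy - changes at nearly every time-step, pausing only at a turning point:

[CODE]
# Which of the ten unit cells contains the particle at each 1-second observation?
# Same assumed setup: 10 m box, 1 m/s, start at x = 0 moving right.
L, v, x0 = 10.0, 1.0, 0.0

def position(t):
    s = (x0 + v * t) % (2 * L)
    return s if s <= L else 2 * L - s

def cell(x):
    return min(int(x), 9)   # cells 0..9; clamp x = 10 into the last cell

print([cell(position(t)) for t in range(20)])
# [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 9, 9, 8, 7, 6, 5, 4, 3, 2, 1]
[/CODE]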
I used the average of the subsystem temperatures, weighted in proportion to volume:
T = (V_A·T_A + V_B·T_B + … + V_Z·T_Z) / V
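As a throwaway numerical illustration - the sub-system temperatures here are made-up placeholder values, since how to assign them is exactly what is in question:

[CODE]
# Evaluate the volume-weighted average written above, nothing more.
def volume_weighted_T(volumes, temperatures):
    V = sum(volumes)
    return sum(v * t for v, t in zip(volumes, temperatures)) / V

# Hypothetical numbers for the earlier [0,3] / (3,10] split:
print(volume_weighted_T([3.0, 7.0], [300.0, 280.0]))   # 286.0
[/CODE]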
I’ve done chapters 1-6 of that course and still don’t know why I am wrong. The classical entropy formula (Clausius inequality) is not actually part of the course.
You wrote that you were considering a 1-D system of length 10 in which a particle moves at a constant speed of 1. In that case the probability of the particle being in a given interval of length l is l/10. I know you came up with something different, but your calculation was wrong.
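If you want to convince yourself numerically, here is a quick check (the interval [2.5, 4.0] and the starting point are arbitrary choices of mine; averaged over a full period the answer doesn’t depend on them):

[CODE]
# Fraction of time the particle spends in [a, b] over one 20-second period,
# sampled on a fine time grid, compared with (b - a) / 10.
L = 10.0

def position(t, x0=0.0, v=1.0):
    s = (x0 + v * t) % (2 * L)
    return s if s <= L else 2 * L - s

a, b = 2.5, 4.0                      # arbitrary interval, length 1.5
N = 200000                           # fine time grid over one period
inside = sum(1 for i in range(N) if a <= position(i * 20.0 / N) <= b)
print(inside / N, (b - a) / L)       # both come out to about 0.15
[/CODE]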
No. But considering a box with 1, or 0, molecules in it is not going to be a very good model of thermal equilibrium. Here’s an exercise: what is the temperature of a subsystem containing 1 particle? How about 0 particles?
For which there is no theoretical justification, and which leads to nonsense.
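To spell out one standard example of the nonsense (nothing specific to your set-up - it is the textbook case of two sub-systems, each internally at equilibrium, exchanging a small amount of heat δQ from the hotter B to the colder A across a diathermal wall):

$$dS_{A\cup B} = dS_A + dS_B = \frac{\delta Q}{T_A} - \frac{\delta Q}{T_B} > 0 \quad (T_B > T_A)$$

whereas δQ_{A∪B} = 0 because the combined system as a whole exchanges no heat with anything, so δQ_{A∪B}/T̄ = 0 for any choice of averaged T̄. The average gets the entropy change wrong no matter how you weight it.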