Yes, it is a flow from cold to hot. Major error on my part.
But that still can’t be right… I don’t think it is correct to simply add the change in entropy of subsystems. Those fractions are ratios and I don’t think you can derive
And here I made the same mistake twice. Let me try again:
But that still can’t be right… I don’t think it is correct to simply add the change in entropy of subsystems. Those fractions are ratios and I don’t think you can derive
The entropy of subsystems is explicitly additive. That’s the reason one formulates it as the logarithm of the number of available microstates: upon combining the microstates of two systems into one, you’ll get a total number of microstates that’s the product of the microstates of the two subsystems (if system one can be in microstates A, B, and C, and system two can be in microstates D and E, the total system can be in the microstates AD, AE, BD, BE, CD, and CE). Taking the logarithm of a product is equal to the sum of the logarithms of the factors.
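Here's a quick Python sketch of that counting argument (the state labels and the log check are purely illustrative, nothing from thermodynamics itself):

[CODE]
# Combining two systems multiplies their microstate counts, and the logarithm
# turns that product into a sum - which is why S = k ln(W) is additive.
import math
from itertools import product

system_one = ["A", "B", "C"]   # microstates of system one
system_two = ["D", "E"]        # microstates of system two

combined = [a + b for a, b in product(system_one, system_two)]
print(combined)        # ['AD', 'AE', 'BD', 'BE', 'CD', 'CE']
print(len(combined))   # 6 = 3 * 2

w1, w2 = len(system_one), len(system_two)
print(math.isclose(math.log(w1 * w2), math.log(w1) + math.log(w2)))   # True
[/CODE]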
I have not conceded that (classical thermodynamic) entropy is the logarithm of the number of available microstates. Why else would entropy be additive?
One formulates the entropy in statistical mechanics in that way because the entropy, in classical thermodynamics, is an extensive property, which implies its additivity. If you have problems with classical thermodynamics, too, then that’s a completely separate issue.
In gases, molecules are moving quite rapidly, and conduction and convection will transport heat, as well as mix up the gas, bringing it to equilibrium pretty fast.
Even if gas molecules were mathematical billiard balls, which they are not, any hypothetical recurrence or other macroscopic fluctuation that could arise because the number of molecules is not truly infinite, the system is not observed for an actually infinite amount of time, and so on, would play out on cosmological timescales, completely beyond the thermodynamic equilibrium picture. Actual gases are zipping around pretty fast, and the relaxation time to equilibrium under everyday conditions is not long.
None of this is right, I’m afraid. You can, for example, also increase entropy without any heat transfer—from the first law of thermodynamics, we get (if we leave the amount of stuff invariant)
dU = TdS - pdV
So if we let the energy stay the same, this yields
dS = pdV/T.
In consequence, if we let a system expand, its entropy will increase.
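As a concrete check, here's a small numerical sketch, assuming an ideal gas of one mole with pV = nRT (the specific numbers are only illustrative): integrating dS = p dV / T at constant internal energy, which for an ideal gas means constant T, reproduces ΔS = nR ln(V[SUB]2[/SUB]/V[SUB]1[/SUB]).

[CODE]
# Integrate dS = p dV / T for an ideal gas expanding at constant energy
# (so constant T), and compare with n R ln(V2/V1).
import math

R = 8.314462618    # J/(mol K), gas constant
n = 1.0            # mol (illustrative)
T = 300.0          # K (cancels out, since p/T = n R / V for an ideal gas)
V1, V2 = 1.0, 2.0  # m^3, initial and final volumes (illustrative)

steps = 100_000
dV = (V2 - V1) / steps
S = 0.0
for k in range(steps):
    V = V1 + (k + 0.5) * dV   # midpoint of each volume slice
    p = n * R * T / V         # ideal-gas pressure
    S += p * dV / T           # dS = p dV / T

print(S)                          # ~5.763 J/K
print(n * R * math.log(V2 / V1))  # n R ln(2), also ~5.763 J/K
[/CODE]

Since entropy is a state function, this is also the entropy change for a free expansion in which no heat is exchanged at all.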
Yes. Think about it this way: if you increase the system’s size, the extensive quantity changes in proportion, while the intensive one stays constant; consequently, their ratio also increases in proportion to system size.
You cannot derive the fundamental thermodynamic relation unless temperature is defined, and because temperature is intensive it is only defined when temperature is uniform. (Unless you are willing to accept my notion of “average temperature”, but apparently I made that up the other day)
Specifically you cannot substitute TdS for δQ without using the second law of thermodynamics, and you cannot use the second law of thermodynamics unless T[SUB]A[/SUB] = T[SUB]B[/SUB].
I must remind you that the walls of an isolated system are immovable and the average temperature of an isolated system is constant.
I don’t disagree with you, but in theory a formulation of the second law of thermodynamics that says “the entropy of an isolated system never decreases over time” is invalid.
You are missing some qualifiers (e.g., “thermodynamic limit”, “ensemble”) and precise definitions before you can argue (are you?) that Boltzmann’s law or his kinetic theory is invalid. What are the properties of your system: still elastic spheres? What are the space and time scales used to compute W? What about the fact that on some scale, like your Poincaré time, no system is isolated?
I gather you are ultimately bothered that systems are constantly fluctuating and nothing in the equations of motion seems to preclude apparently thermodynamically impossible behavior like all the air spontaneously rushing to one side of the room, but we have already seen in this thread that as soon as you are dealing with realistic numbers it becomes impossible to observe any macroscopic fluctuations (say of entropy, however you measure it, in the wrong direction).
What about the other end of things? Would you suggest that a single elastic sphere, bouncing around in a container, is constantly violating the second law of thermodynamics because during any short interval of time it isn’t occupying the entire volume or because if you look at it with time running backwards things seem the same?
OK, then it seems I underestimated the task before me. I thought I merely had to explain how statistical mechanics and classical thermodynamics connect; but if you’ve got misgivings about classical thermodynamics as well (such as the incredibly basic notion that entropy is additive), then I don’t think I can provide the material you require, to the depth you insist on, here on a message board. If you’re interested, I found a web course here you might want to have a look at; it gives exactly my example in Eq. (3).
I don’t expect that to convince you, though. On the one hand, your resistance to accepting the judgment of experts on a subject shows an admirable confidence; but on the other, you just can’t study every aspect of the world to the necessary depth, so at some point you have to accept what experts say as more likely to be correct than the whims of your imagination. I had hoped that I could at least make the expert consensus plausible to you; I’m sorry I couldn’t.
Imagine a theoretical system in one-dimensional space, a single point particle on the number line marked 0 to 10 meters. This particle is always moving left or right at a constant speed of 1 meter per second, and whenever it reaches either end of the line it starts moving in the opposite direction. The position of this particle, and therefore the microstate of the whole system, can be expressed like so:
10/π * cos^-1(cos(π(t+h)/10))
where h is the initial position of the particle at time t = 0.
Let t be a non-negative integer representing the number of time-steps, such that t ∈ {0, 1, 2, …}.
Let c be the number of seconds allocated to each time-step, where c > 0.
Let h represent the initial microstate of the system, such that 0 ≤ h ≤ 10.
Let the function f(t,h) = 10/π * cos[SUP]-1[/SUP][cos(π(t+h)/10)].
Let the set X be a microscopic representation of the system indexed by t, such that X[SUB]t[/SUB] = f(t * c,h).
Let Ω be the total number of distinct microstates, such that Ω = 10 * c[SUP]-1[/SUP] + 1, where c is the number of seconds that pass during one time-step t (10 is the amplitude of the above triangular wave function f, + 1 for the initial case X[SUB]0[/SUB]). For example, if c = 0.1, Ω = 10 * 0.1[SUP]-1[/SUP] + 1 = 10 * 10 + 1 = 101. For simplicity’s sake let c = 1, therefore Ω = 11.
Now statistical entropy = k[SUB]B[/SUB] ln Ω = 1.38064852 × 10[SUP]-23[/SUP] * ln(11) ≈ 3.31065 × 10[SUP]-23[/SUP] J/K… wait, I’m confused again. How can statistical entropy change at all in an isolated system like this? Oh yeah, the formula uses W, not Ω. W only equals Ω when all the microstates correspond to a single macrostate.
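Just to check these numbers, here is a small Python sketch (the function name is mine):

[CODE]
# Omega = 10 * c^-1 + 1 distinct positions on integer time-steps,
# and S = k_B ln(Omega) for c = 1.
import math

k_B = 1.38064852e-23   # J/K, Boltzmann constant (value used above)

def omega(c):
    """Number of distinct microstates for a time-step of c seconds."""
    return 10 / c + 1

print(omega(0.1))                # 101.0, matching the c = 0.1 example
print(omega(1))                  # 11.0
print(k_B * math.log(omega(1)))  # ~3.31e-23 J/K, i.e. k_B ln(11)
[/CODE]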
I thought and still think that the macrostate of an isolated system does not change no matter how many time-steps pass - the pressure, volume, number of particles, entropy, heat - all of that remains constant because otherwise we would have to violate the first law of thermodynamics, the conservation of energy. After much discussion with Half Man Half Wit it is apparent to me that we can divide this system into arbitrary [DEL]sub-systems[/DEL] parts, and the macroscopic properties of each part constitute a different macrostate for the purposes of system-wide entropy. This still doesn’t make sense to me but I also don’t know statistical mechanics. Very well, we will divide the system into two [DEL]sub-systems[/DEL] parts: the left and right sides of the particle.
Now I must define two additional state variables: v[SUB]L[/SUB] and v[SUB]R[/SUB]. The v stands for volume, but in this case it represents the length of the line segment going from the particle to the left (L) or right (R) end of the system. Therefore v[SUB]L[/SUB] = X[SUB]t[/SUB], v[SUB]R[/SUB] = 10 - X[SUB]t[/SUB], and v[SUB]L∪R[/SUB] = 10. This has the effect of making every distinct microstate a distinct macrostate; thus the W for all microstates represented by 1 is different from the W for all microstates represented by 10.
In order to actually calculate W I need to assign a probability to each microstate. Because this system is theoretical and I only wish to theoretically disprove the “second law of thermodynamics”, I am allowed to calculate W with an observation period equal to the period of the system, 20 seconds. Here is a list of microstates over twenty seconds, X[SUB]1[/SUB] to X[SUB]20[/SUB], where c = 1 and h = 0.
{ 1,2,3,4,5,6,7,8,9,10,9,8,7,6,5,4,3,2,1,0 }
Changing the value of h results in the same set, although individual elements are shifted to the right by the value of h. Now the probability P(i) is the count of i in the above set divided by 20. Let b equal the period of f(t,h), 20.
Let i be a number such that 0 ≤ i ≤ 10.
Let the function C(i) be the number of elements of the set { X[SUB]1[/SUB], … X[SUB]b[/SUB] } that are equal to i. Let the probability function P(i) = C(i)/b.
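Here is a Python sketch that generates one period of the sequence with f and tallies C(i) and P(i) (the variable names are mine):

[CODE]
# Generate X_1 ... X_20 for c = 1, h = 0 using
# f(t, h) = 10/pi * arccos(cos(pi (t + h) / 10)), then tally C(i) and P(i).
import math
from collections import Counter

def f(t, h):
    return 10 / math.pi * math.acos(math.cos(math.pi * (t + h) / 10))

c, h, b = 1, 0, 20                       # seconds per time-step, start, period
X = [round(f(t * c, h)) for t in range(1, b + 1)]
print(X)  # [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]

C = Counter(X)                           # C(i): how often position i occurs
P = {i: C[i] / b for i in range(0, 11)}  # P(i) = C(i) / b
print(P)  # 0 and 10 occur once (P = 0.05), 1 through 9 twice (P = 0.1)
[/CODE]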
I will now replace Boltzmann’s entropy with Gibbs entropy and plug in P(i) for p[SUB]i[/SUB]: Let the entropy S of a system be the Gibbs entropy, such that S = -k[SUB]B[/SUB] Σ P(i) ln P(i).
Therefore if c = 1 and h = 0, this is the entropy for the system realized by every possible microstate:
Also, the signs of all the entropy values in this table are wrong; they should all be positive. This doesn’t change the conclusion that entropy decreases on 2 out of every 20 time-steps.
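For reference, here is a Python sketch of that calculation: the total Gibbs entropy from the P(i) values above, plus a per-time-step value taken to be the single term -k[SUB]B[/SUB] P(X[SUB]t[/SUB]) ln P(X[SUB]t[/SUB]) for the microstate occupied at step t (that reading of the table is an assumption on my part).

[CODE]
# Total Gibbs entropy S = -k_B * sum_i P(i) ln P(i), plus (assumed to be what
# the table listed) the term -k_B * P(X_t) ln P(X_t) at each time-step.
import math

k_B = 1.38064852e-23
X = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]  # one period
b = len(X)
P = {i: X.count(i) / b for i in set(X)}

S = -k_B * sum(p * math.log(p) for p in P.values())
print(S)   # ~3.27e-23 J/K for this distribution, independent of t

for t, x in enumerate(X, start=1):
    term = -k_B * P[x] * math.log(P[x])
    print(t, x, term)   # the term is smaller only at the turning points 0 and 10
[/CODE]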
I’m sorry to see you go, but in case you see this message I did read the Stanford page and I’m still not convinced. I don’t think they are wrong, it is probably something I misunderstand. But this is my thought process:
Q is extensive
[INDENT]dQ = dQ[SUB]A[/SUB] + dQ[SUB]B[/SUB]
the system is isolated
(2) dQ[SUB]B[/SUB] = -dQ[SUB]A[/SUB]
Divide by dQ[SUB]A[/SUB]
(14) 1/T[SUB]B[/SUB] = 1/T[SUB]A[/SUB]
Multiply both sides by T[SUB]B[/SUB]T[SUB]A[/SUB]
(15) T[SUB]A[/SUB] = T[SUB]B[/SUB][/INDENT]
I am left concluding that A) it is impossible for an isolated system to have an inconsistent temperature, B) heat is not extensive, or C) entropy is not extensive. I think the last conclusion is correct.
I suppose I could email Mr. Eastman for his opinion. I might do that.
I think that page is pretty straightforward. Also, again, a system not at equilibrium does not have its temperature defined, which should also be clear from the page.
OK, so this is the example of an ideal gas, except in one dimension and with only one (!) particle. In which case it is obvious that the particle is equally likely to be at any given point, and spends an equal amount of time in intervals of equal length.
As for the entropy, no, it doesn’t change with time. Nor does the entropy of any gas at equilibrium.