Think about it this way: it's more likely that a low entropy state evolves to a high entropy state than that a high entropy state evolves to a low entropy state.
Say you’ve got a box with all the white marbles to the left, and all the black marbles to the right. Then you give it a good shake. What do you think will happen, and why?
Thank you for your patience, by the way.
You have yet to show the truth of this statement. In the evolution matrix example it turned out that the probability of entropy decreasing was 1/27 and the probability of entropy increasing was 1/27 - exactly the same probability. The chance of a high entropy state evolving into a low entropy state was zero, but increases and decreases still occurred between low and intermediate entropy states.
Remember this is your definition of entropy, too.
~Max
To answer one question at a time instead of getting ahead of myself: the marbles will probably mix a little, because as I shake the box the walls collide with the marbles, thereby injecting kinetic energy into the system.
~Max
You’re mixing micro- and macrolevel (although I’m probably partially to blame for this myself). Entropy is a property of the macrostate; the probabilities you’re talking about are probabilities of microstates evolving into one another.
At the macro level, however, you can only distinguish between high-, medium-, and low-entropy states. So if you find the system in a medium entropy state, the chance is 1/18 that it will evolve to a low entropy state; if you find the system in a low entropy state, the chance is 1/3 that it will evolve into a medium entropy state.
So one time out of 18 you find the system spontaneously lowering its entropy, while one time out of three it spontaneously increases it.
What do you expect is gonna happen if you shake it some more? And, more importantly, why?
No, you cannot compare those two probabilities without multiplying 1/18 by 18/21 and 1/3 by 3/21. To do otherwise constitutes a statistical error. And besides, you cannot apply those probabilities to the marbles example without including the 1/6 possibility that the system starts out in a high entropy state.
~Max
The more you shake the box, the more energy you impart to the system. Assuming there are no other forces (gravity, friction, air resistance), and that the walls of the box are still perfectly bouncy, the velocity of the marbles in the box will continue to increase. It is important that the walls of the box are perfectly bouncy, and this is unrealistic: if the walls perfectly reflect the marbles' kinetic energy, you do not have to shake faster, because the energy from the marbles bouncing against the box is never sent back to counteract the movement of your arm.
So the more you shake the box, the more the system’s energy increases. So too does entropy* increase. In fact it is impossible for the entropy of the system to decrease while you are shaking the box.
*my definition of entropy, dQ/T.
~Max
If you think about a system in thermal equilibrium with a heat bath, that corresponds to Gibbs's canonical ensemble. The temperature is fixed, but the energy is not. If you calculate the equilibrium state, you will find that (big surprise here) the Gibbs entropy is maximized.
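One standard way to see that: among all probability distributions with a given average energy, the Boltzmann distribution is the one with the largest Gibbs entropy. Here's a throwaway numerical check (three made-up energy levels, k_B = T = 1; nothing about it is specific to our marbles example):

[code]
import numpy as np

# Toy check: at fixed mean energy, the Gibbs entropy S = -sum p_i ln p_i
# is maximized by the canonical (Boltzmann) distribution.
# Three energy levels and k_B = T = 1 are arbitrary choices for illustration.
E = np.array([0.0, 1.0, 2.0])
p_canon = np.exp(-E)
p_canon /= p_canon.sum()
U = p_canon @ E                      # mean energy of the canonical state

def gibbs_entropy(p):
    return -np.sum(p * np.log(p))

# Move along a direction that preserves both normalization and mean energy:
# v sums to zero and is orthogonal to E, so p(t) = p_canon + t*v keeps U fixed.
v = np.array([1.0, -2.0, 1.0])
assert abs(v.sum()) < 1e-12 and abs(v @ E) < 1e-12

for t in np.linspace(-0.05, 0.05, 11):
    p = p_canon + t * v
    if np.all(p > 0):
        print(f"t = {t:+.3f}  U = {p @ E:.6f}  S = {gibbs_entropy(p):.6f}")
# The printout shows S peaking at t = 0, i.e. at the canonical distribution.
[/code]

Any move away from the canonical distribution that keeps the normalization and the average energy fixed only ever lowers the entropy.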
No. We’re talking about conditional probabilities here. 1/27 is the probability of the system in any state whatsoever evolving to a (microstate realizing a) higher (or, indeed, lower) entropy macrostate. But, you have additional information, on which you must condition your probability assignment.
By the formula for conditional probability, the probability that the system moves to a state of lower entropy, given that it is in a state of medium entropy, is (1/27) / (18/27): the probability of being in a medium entropy state AND moving to a lower entropy, which occurs for one of the 27 possible cases, divided by the probability of being in a medium entropy state. Naturally, that’s 1/18.
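Just to spell the arithmetic out (the 18 intermediate-entropy and 3 low-entropy microstates, and the single raising and lowering transitions, are the figures from the 27-state example; the 3 is the same count behind your own 3/21 weighting):

[code]
from fractions import Fraction as F

# Counts from the thread's 27-microstate example, as stated or implied there:
total = 27
medium_states, low_states = 18, 3
down_transitions = 1      # the one medium -> low transition
up_transitions = 1        # the one low -> medium transition

# Unconditional probabilities, as in the 1/27 vs 1/27 comparison:
p_down = F(down_transitions, total)
p_up = F(up_transitions, total)

# Conditional on what you actually see when you look at the system:
p_down_given_medium = p_down / F(medium_states, total)    # = 1/18
p_up_given_low = p_up / F(low_states, total)               # = 1/3

print(p_down, p_up)                         # 1/27 1/27
print(p_down_given_medium, p_up_given_low)  # 1/18 1/3
[/code]

The unconditional numbers are equal, as you say; the conditional ones, which are what matter once you've looked at the system, are not.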
You only need to impart more energy to counter losses due to friction; if those weren’t there, the balls would keep on bouncing, and mixing, with any initial impulse (internal energy).
So, again: you keep shaking (countering friction; alternatively, you give the box a single whack, setting up an initial state, if there is no friction)—what do you expect happens? Don’t think about entropy, for now, or heat, or any such confounders. Just regarding the mixture of marbles: will they tend to stay separated? Or is it more likely that they become more and more mixed?
Yes, but you are in error to directly compare 1/18 and 1/3.
~Max
They will probably mix some, but I cannot say whether they become more or less mixed; that would depend on the specifics of how I shake the box and how exactly the marbles are arranged. And I was assuming a scenario without friction.
~Max
Why would that be? I want to know which one is more likely: that the system moves to a lower entropy state from a higher entropy state, or that it moves to a higher entropy state from a lower entropy state. So these are exactly the numbers I have to compare.
If I were to do as you say, and weight the two probabilities by the probability of finding the system in a low- or a high-entropy state, respectively, I would instead compare the probability of finding the system in a low entropy state and having it transition to a higher one with that of finding it in a high entropy state and having it transition to a lower one.
That’s a different situation: there, I have not yet looked at the system, and want to know what to expect if I look; but what we want to know is, after we have looked at the system, what (most likely) happens next.
You don’t have those specifics. That’s the point: you don’t have microscopic control over the system, but have to predict what’s more likely to happen, given the state it’s in now.
In other words, if I offer you a hundred bucks if you shake the system and it gets less mixed, and you have to give me a hundred bucks if it gets more mixed, do you take that bet?
I will admit that it is more likely for our system to transfer from a low entropy state to an intermediate entropy state than it is to transfer from an intermediate entropy state to a low entropy state. Those respective probabilities are 1/3 and 1/18.
I was myself wrong to dispute your statement in [POST=21635893]post #201[/POST].
But that is not what you set out to prove, is it?
~Max
It is not clear whether I keep shaking the box forever, or when I measure how mixed the marbles are.
I hate to fight the hypothetical but I have no place predicting the behavior of the marbles with so little information. Especially not for a $100 bet.
~Max
It’s not? How so? What I set out to demonstrate was how somebody observing a system could come away with a law of the form ‘the entropy increases or stays constant’, even though it’s possible for it to also decrease. My claim was that for this, it’s enough to appreciate the difference in the numbers of high(er)- vs. low(er)-entropy states.
So let’s think about how one might come away with that law. The experimenter may set the system up in either a low, intermediate, or high entropy state. If they set it up in a high entropy state, they’ll always find the entropy to stay constant. If they set it up in a low entropy state, they’ll find it either to stay constant or, with some sizeable probability that increases sharply once we get to larger system sizes, to increase.
If, now, they set it up in an intermediate entropy state, they will find that, with overwhelming likelihood, the entropy will stay the same. Indeed, for a sufficiently large system, they will likely never observe it decrease.
Consequently, they’ll formulate a law, generalizing from their observations, that entropy always increases, or stays constant.
We have more information, however. We know that this quantity, entropy, is realized by a system deterministically evolving among microstates. We can thus conclude that once in a (very large) while, entropy must also decrease. The law that was formulated thus is not an exact, but merely a statistical one.
For any sufficiently large system, that we’ll ever observe such a violation is vanishingly unlikely (think a tornado rushing through a junkyard spontaneously assembling a jumbo jet unlikely). But for small enough systems, these violations may be observable.
Moreover, we can predict just how likely such violations are going to be. That’s the fluctuation theorem. And, indeed, if we do the experiment, this prediction is confirmed.
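If you'd rather see the statistics than the formalism, here's a little simulation sketch. The hopping rule is just my own stand-in dynamics (an Ehrenfest-style urn), not the evolution matrix from before, but it shows the same pattern: a well-mixed system does spontaneously un-mix, just ever more rarely the bigger it gets.

[code]
import math
import random

# Toy "shaken box" of N marbles: each step, one marble hops to the other half.
# This Ehrenfest-urn dynamics is my own stand-in, not the evolution matrix
# from earlier in the thread; it only illustrates how rarely a well-mixed
# system spontaneously un-mixes as N grows.
def entropy(n, N):
    # Log of the number of microstates with n marbles in the left half.
    return math.log(math.comb(N, n))

def fraction_far_below_max(N, steps=200_000, seed=1):
    random.seed(seed)
    n = N // 2                              # start well mixed (high entropy)
    s_max = entropy(N // 2, N)
    low = 0
    for _ in range(steps):
        # Pick a random marble; it hops to the other side.
        n += -1 if random.random() < n / N else +1
        if entropy(n, N) < 0.5 * s_max:     # a large spontaneous "un-mixing"
            low += 1
    return low / steps

for N in (4, 10, 40, 100):
    print(f"N = {N:3d}: fraction of time far below max entropy = "
          f"{fraction_far_below_max(N):.5f}")
# Big entropy dips are common for N = 4, rare for N = 10, and essentially
# never observed for N = 40 or 100 over this many steps -- which is why the
# 'entropy never decreases' law looks exact for macroscopic systems.
[/code]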
You can’t keep shaking it forever. Unless you’re a very different entity from what I take you to be, you (I’m sorry to say) won’t be around forever. So let’s just say you shake it good until you get bored, or hungry, or need a bathroom break.
This is, by the way, completely irrelevant to the hypothetical, because of course, if you shake it, it will get more mixed. I mean, you can’t really reasonably think otherwise; you’re just trying special pleading to get out of accepting an unwelcome conclusion.
(One awkwardness of the current setup, I should point out before we bump into it, is that the absolute number of high entropy states is smaller than the number of intermediate entropy states. Thus, you could have all high entropy states transition to intermediate entropy states; this is no longer the case for larger systems, where the high entropy states will vastly dominate.)
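A quick way to see how that counting changes with system size (two halves rather than three cells, and the 45-55% "well mixed" band is just an arbitrary cutoff of mine):

[code]
import math

# Rough check of the parenthetical point above: as the system grows, the
# microstates that look "well mixed" at the macro level come to dominate.
# Here a microstate is which half each of N marbles sits in, and "well mixed"
# means 45-55% of them are in the left half (my cutoff choice).
for N in (10, 100, 1_000, 10_000):
    lo, hi = math.ceil(0.45 * N), math.floor(0.55 * N)
    mixed = sum(math.comb(N, n) for n in range(lo, hi + 1))
    print(f"N = {N:6d}: share of 'well mixed' microstates = {mixed / 2**N:.4f}")
# The share climbs from about a quarter at N = 10 toward 1 as N grows, which
# is why high-entropy macrostates dominate for anything remotely macroscopic.
[/code]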