There is no question of probability at this point any more. The second throw is perfectly determined by the initial state. You just choose that state, then the evolution proceeds deterministically, like clockwork.
No, that’s not possible. If a column were all zeroes, then a certain microstate would evolve to nothing. If a row were all zeroes, then some other row must have more than one 1, since we still need to specify where all 27 possible states evolve to; two states would then evolve to the same state, and we would violate reversibility.
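If it helps to see it mechanically, here’s a quick Python sketch (using the convention that column j holds the state S[SUB]j[/SUB] evolves to): a valid reversible evolution matrix is exactly a permutation matrix, one with a single 1 in every row and every column.

[code]
import numpy as np

def is_reversible_evolution(M):
    """A 0/1 matrix is a valid reversible evolution law iff it is a
    permutation matrix: every column has exactly one 1 (each state
    evolves somewhere) and every row has exactly one 1 (no two states
    evolve to the same state)."""
    M = np.asarray(M)
    return (np.all((M == 0) | (M == 1))
            and np.all(M.sum(axis=0) == 1)
            and np.all(M.sum(axis=1) == 1))

# A 3-state example, S1 -> S2 -> S3 -> S1 (columns are "from", rows are "to"):
M = np.array([[0, 0, 1],
              [1, 0, 0],
              [0, 1, 0]])
print(is_reversible_evolution(M))  # True
[/code]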
Oh, I see. I misread your last post, and now concede that the chance on the second throw is 50%.
Why do you need to specify where all 27 states evolve to? It should be enough that S[SUB]i[/SUB] eventually loops back to S[SUB]i[/SUB], whether that takes two steps or 26 steps. I can conceive of a system that never reaches every possible microstate: two moving points on a line segment which cannot occupy the same spot and which bounce off both each other and the ends of the segment. No matter how you arrange the two points on the line, there exist at least two microstates that will never be reached: the points cannot switch places at the ends of the line segment. If I made an evolution matrix of that system, the columns (or is it rows?) for those two microstates would be undefined.
You would, however, need to add a rule that the evolution matrix has a value for the initial state vector, and that each path through the matrix comes full circle.
Because otherwise, you’ll end up with some states for which you don’t have any evolution law. I mean, what’ll happen if you set up the system in that state? If the matrix doesn’t say, what does?
Cyclic motions are not a problem for this setup: state S[SUB]1[/SUB] could evolve to S[SUB]2[/SUB], and S[SUB]2[/SUB] to S[SUB]1[/SUB] (for example).
If you can’t set up the system in such a way that the points start out at every place, then the states you can’t set up are just not possible states of the system—you’ve got additional constraints, such that certain configurations are just impossible. If you can set the system up in this state, then the evolution law will have to specify how it evolves.
That’s right, some evolution matrices will be incompatible with any given initial state. This is a consequence of the initial state being but one instant in a periodic cycle.
The other option is to have multiple independent paths in an evolution matrix, such that perhaps M[27] points to itself (does not evolve) and all of the other 26 states form a loop: 1->2->3->…->26->1->2->…
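In Python, one arbitrary way to build such a matrix could look like this (columns are "from", rows are "to"):

[code]
import numpy as np

n = 27
M = np.zeros((n, n), dtype=int)

# States 1..26 form one loop, 1 -> 2 -> ... -> 26 -> 1 (0-based
# indices 0..25 below), while state 27 (index 26) maps to itself.
for i in range(26):
    M[(i + 1) % 26, i] = 1
M[26, 26] = 1

# Still a permutation matrix, hence still reversible:
assert np.all(M.sum(axis=0) == 1) and np.all(M.sum(axis=1) == 1)
[/code]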
Actually that makes more sense.
The points can start out in any configuration, so long as the two points do not occupy the same position, which I think you will allow me to rule out. But they can never switch sides - if point A is initially on the left and point B is initially on the right, the evolution of that system will never allow point A to be on the right and point B on the left. If point A is initially on the right and point B is initially on the left, again the system will never evolve so as to allow the points to switch sides.
Let’s say this system has exactly six distinct microstates:
[ul][li]A on the left, B in the middle[/li][li]A on the left, B on the right[/li][li]A in the middle, B on the left[/li][li]A in the middle, B on the right[/li][li]A on the right, B on the left[/li][li]A on the right, B in the middle[/li][/ul]
The evolution matrix would therefore be 6x6, and the state vector six-dimensional.
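For instance, a matrix for this system might look like the sketch below (hypothetical, since the positions alone don’t say which way the points are moving; the particular assignment is arbitrary, chosen only so that the points never swap sides):

[code]
import numpy as np

# States, numbered as in the list above:
#   1: A left, B mid     2: A left, B right    3: A mid, B left
#   4: A mid, B right    5: A right, B left    6: A right, B mid
# States {1, 2, 4} have A left of B; states {3, 5, 6} have B left of A.
# Let each group cycle among itself: 1 -> 2 -> 4 -> 1 and 3 -> 5 -> 6 -> 3.
succ = {1: 2, 2: 4, 4: 1, 3: 5, 5: 6, 6: 3}

M = np.zeros((6, 6), dtype=int)
for frm, to in succ.items():
    M[to - 1, frm - 1] = 1   # column = "from", row = "to"

state = np.zeros(6, dtype=int)
state[0] = 1                 # start in state 1 (A left, B mid)
print(M @ state)             # [0 1 0 0 0 0], i.e. state 2
[/code]

Note that this matrix never takes an {A left of B} state to a {B left of A} state or vice versa, which is the "no switching sides" constraint.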
Do I correctly understand your idea of evolution matrices? If so, it does not make sense to count all of the states in any particular sub-matrix because it is not guaranteed that any evolutionary path visits all of those states.
Then it’s just not an evolution law. Like any physical law, it must apply to every state of the system to yield a valid dynamics.
This is perfectly possible, of course. But it doesn’t change the conclusion.
I can’t really make sense of your notation, sorry. I will try to explain the evolution matrices more thoroughly, but I might not get to it for a couple of days. Basically, the matrix changes one vector into another via matrix multiplication; perhaps read up on that in the meantime.
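In the meantime, here’s a minimal two-state sketch of the idea in Python, using the S[SUB]1[/SUB]/S[SUB]2[/SUB] cycle from earlier:

[code]
import numpy as np

# The state vector is all zeroes except for a single 1 marking the
# current microstate; multiplying by the evolution matrix M moves
# that 1 to the successor state.
M = np.array([[0, 1],    # a 2-state toy: S1 <-> S2
              [1, 0]])
v = np.array([1, 0])     # system is in state S1
print(M @ v)             # [0 1], now in state S2
print(M @ (M @ v))       # [1 0], back to S1 after two steps
[/code]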
The point you’re making isn’t valid, however: while, of course, you can still set up cyclical evolutions, this doesn’t change the fact that since there are more ways for a state to evolve to a higher entropy, typically, you will observe entropy increase.
Indeed, it’s not hard to see that eventually, any evolution will get back to the initial state. This doesn’t change the conclusion: if you observe the system in a low entropy state, there are more cases that the next state is a higher entropy one than a lower entropy one.
I mean, if it’s a minimum entropy state, this should be clear: every step either yields constant entropy, or an increase. If every lowest-entropy state leads to another lowest-entropy state, then, in particular, none of the intermediate or high entropy states can evolve into a lowest-entropy state.
Then, for every intermediate-entropy state, each next step will either increase entropy, or will leave it constant. And so on.
The thing is, you need to consider every possible state. If you eliminate the possibility of entropy increase for one state, then another state must allow it (except for the limiting case where entropy is always constant, e.g. if every state maps to itself). Since there are more ways to increase entropy, some of them must go uncompensated.
Sorry about the notation. And right, that fact is exactly what I don’t understand.
I still don’t see how there are necessarily more ways to increase entropy. Those entropy-changing steps could all be part of a different evolutionary cycle within the matrix. For example, S[SUB]1-6[/SUB] could form a cycle, S[SUB]7-24[/SUB] could form another cycle, and S[SUB]25-27[/SUB] could form a third cycle. In this matrix, entropy never changes, in direct contradiction to your assertion.
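A sketch of what I mean, labelling (hypothetically) 1-6, 7-24, and 25-27 as three entropy classes and checking that entropy is constant on every step:

[code]
# Hypothetical entropy class of each state (s is 1-based):
def entropy_class(s):
    return 'high' if s <= 6 else ('intermediate' if s <= 24 else 'low')

# Successor map: three separate loops, each inside a single class.
succ = {}
for lo, hi in [(1, 6), (7, 24), (25, 27)]:
    for s in range(lo, hi + 1):
        succ[s] = s + 1 if s < hi else lo   # ... -> hi wraps back to lo

# Entropy never changes on any step:
print(all(entropy_class(s) == entropy_class(succ[s]) for s in succ))  # True
[/code]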
I don’t think this example of some states (27?) evolving according to an arbitrary permutation matrix is a good illustration of mixing or ergodicity or entropy production. For instance, suppose your matrix M above is the identity matrix.
I explicitly allowed for entropy staying constant (see my last post). After all, that’s the second law: there’s a greater-or-equal sign there. Also, I have stipulated from the beginning that I’m not considering such limiting cases, as in for instance where all of the gas molecules just oscillate in lockstep between one side of the room and the other.
What I said was that you can’t have an uncompensated entropy decrease, and if you have the chance of a decrease, there will be a greater chance of increase. Take the case where two of the minimum entropy states just oscillate among each other, one evolves to an intermediate state, one intermediate state evolves to that low entropy state, and the high entropy states just cycle among themselves.
Then, if the system is in a high-entropy state, it will stay high. In an intermediate state, one out of 18 times we will observe a reduction; otherwise, entropy stays constant.
In a low entropy state, however, one out of three times, you get an entropy increase.
Hence, the chance of observing an increase (or no change) is much higher than of observing a decrease.
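To make that concrete, here’s one arbitrary realization of such a matrix as a successor map in Python, with the conditional probabilities computed directly:

[code]
from fractions import Fraction

# One arbitrary realization of the construction above: highs 1-6 cycle
# among themselves, lows 25 and 26 oscillate with each other, and
# states 7-24 plus 27 form one big loop, so exactly one intermediate
# state (24) steps down to a low state (27), which steps back up.
succ = {s: (s % 6) + 1 for s in range(1, 7)}     # 1 -> 2 -> ... -> 6 -> 1
succ.update({s: s + 1 for s in range(7, 24)})    # 7 -> 8 -> ... -> 24
succ.update({24: 27, 27: 7, 25: 26, 26: 25})

def entropy(s):
    return 2 if s <= 6 else (1 if s <= 24 else 0)   # high / mid / low

mids = [s for s in succ if entropy(s) == 1]
lows = [s for s in succ if entropy(s) == 0]

# P(decrease | intermediate) and P(increase | low):
print(Fraction(sum(entropy(succ[s]) < 1 for s in mids), len(mids)))  # 1/18
print(Fraction(sum(entropy(succ[s]) > 0 for s in lows), len(lows)))  # 1/3
[/code]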
Well, if you turn off the dynamics, then you’ll indeed have constant entropy, but that’s also the case with a gas. Plus, you can write every evolution law in this way (if you discretize the system), so it’s sufficiently general.
If I understand you correctly you are arranging the evolution matrix such that S[SUB]1-2[/SUB] make one cycle, S[SUB]3-24[/SUB] make a second cycle, and S[SUB]25-27[/SUB] make a third cycle. In S[SUB]3-24[/SUB] there is exactly one low-entropy state that evolves into an intermediate entropy state, and exactly one intermediate entropy state that evolves into a low-entropy state. Then you say the entropy does not change during the evolution cycles S[SUB]1-2[/SUB] or S[SUB]25-27[/SUB]. All fine so far.
You say the probability of observing a reduction in entropy during a random step in cycle S[SUB]3-24[/SUB] is 1/18 and the probability of observing no change is 17/18. That doesn’t follow at all - there are 22 states in that cycle. The probability of any random step increasing entropy is 1/22, the probability of entropy remaining constant is 20/22, and the probability of entropy decreasing is 1/22.
No, that’s not quite it. 1-6 evolve among themselves (whether they make one cycle, or more than one), and one state from 7-24 evolves to one from 25-27, while one from 25-27 evolves to one from 7-24.
So if the system is in one of the states 1-6, entropy will remain constant. For one state among 7-24, entropy will decrease. For one state from 25-27, entropy will increase.
Hence, the probability of observing a reduction given that the system is in an intermediate entropy state is 1/18. The probability of observing an increase in entropy given that the system is in a low-entropy state is 1/3.
The ‘given that’ is what’s usually called the ‘past hypothesis’. We find the universe in a low entropy state; the second law concerns the probability of what happens given that we do so.
In other words, of the low entropy states that we could find the system in, more lead to an increase (or to constant entropy) than to a decrease.
OK, but aren’t we getting away from dynamic considerations of entropy (and note, by the way, that considered as a discrete dynamical system the Kolmogorov-Sinai entropy of your process is always zero) and saying things that verge on the tautological, along the lines of, if we partition our space into a big subset and a little subset, then permute all the states, then a majority of points land in the big subset? Or that the system is most probably in the most probable state?
Classically (too classically?), it seems that the entropy of a body describes its average properties when at equilibrium, or at least observed over some non-infinitesimal period of time so that some probability distribution arises.
Sure, but that’s not really a relevant notion here. We’re considering the amount of information we may gain by discovering the precise microstate, basically.
Pretty much, yes. That’s ultimately all the second law comes down to (as I think I said earlier).
No. Basically, if there’s a way in, there’s a way out. So if one of the intermediate entropy states evolves to a low entropy one, then (at least) one of the low entropy states can’t evolve to another low entropy state (since every state has a unique precursor, and the precursor of one of the low entropy states is an intermediate entropy state, and thus, can’t be a low entropy state), and hence, must evolve to a higher entropy state.
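You can check this mechanically: for any one-to-one evolution and any set of states, the transitions out of the set exactly balance the transitions into it. A quick sketch:

[code]
import random

# For any reversible (one-to-one) evolution and any set of states S,
# the number of transitions leaving S equals the number entering S:
# "if there's a way in, there's a way out."
random.seed(0)
n = 27
perm = list(range(n))
random.shuffle(perm)                  # a random reversible evolution
low = set(range(3))                   # say, 3 low-entropy states

ways_out = sum(1 for s in low if perm[s] not in low)
ways_in = sum(1 for s in range(n) if s not in low and perm[s] in low)
print(ways_out == ways_in)            # True, for every permutation
[/code]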
There are 27 possible states, each equally likely to be the initial state: 1/27.
The probability of the initial state being a high-entropy state is 6/27.
The probability of the initial state being an intermediate-entropy state is 18/27.
The probability of the initial state being a low-entropy state is 3/27.
6+18+3=27, so that checks out.
The probability of a random observation of a high entropy state showing a change in entropy is 0.
The probability of a random observation of a high entropy state showing constant entropy is 1.
The probability of a random observation of an intermediate entropy state changing to a low entropy state is 1/18.
The probability of a random observation of an intermediate entropy state showing constant entropy is 17/18.
The probability of a random observation of an intermediate entropy state changing to a high entropy state is 0.
The probability of a random observation of a low entropy state showing constant entropy is 2/3.
The probability of a random observation of a low entropy state changing to an intermediate entropy state is 1/3.
The probability of a random observation of a low entropy state changing to a high entropy state is 0.
Therefore, the probability of a random observation showing an increase in entropy is 1/3 * 3/27 = 1/27.
The probability of a random observation showing no change in entropy is 6/27 + 17/18 * 18/27 + 2/3 * 3/27 = 25/27.
The probability of a random observation showing a decrease in entropy is 1/18 * 18/27 = 1/27.
1/27 + 25/27 + 1/27 = 27/27, so that checks out.
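For completeness, a short Python sketch that reproduces this bookkeeping:

[code]
from fractions import Fraction

# P(class) is the chance the equiprobable initial state lies in that
# class; the nested values are the conditional step probabilities
# worked out above.
P = {'high': Fraction(6, 27), 'mid': Fraction(18, 27), 'low': Fraction(3, 27)}
step = {
    'high': {'same': Fraction(1)},
    'mid':  {'same': Fraction(17, 18), 'down': Fraction(1, 18)},
    'low':  {'same': Fraction(2, 3),   'up':   Fraction(1, 3)},
}

for outcome in ('up', 'same', 'down'):
    p = sum(P[c] * step[c].get(outcome, 0) for c in P)
    print(outcome, p)   # up 1/27, same 25/27, down 1/27
[/code]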
But wasn’t your assertion that the probability of observing an increase is greater than the probability of observing a decrease?