

#151




Quote:
You say it has been violated, but in each case I can't see the violation without assuming some stochastic fundamental reality which also leaves the door open for violations of every other law of physics. For example, the random hopping in your first hypothetical means microscopic particles don't obey physics at all. I never saw a violation to begin with for the billiard table or the laser experiment. ~Max Last edited by Max S.; 05-08-2019 at 04:28 PM. Reason: examples
#152




Quote:
Now, to the next step.
I expect greater resistance with this example. But again, try not to think ahead to the rest of this discussion; just consider the above system, as I have presented it. Do you agree that the conclusion is reasonable, here? That there is once again a law that appears valid thanks to the limited observations made at the macroscale, which we can see must be violated once we know about the microscopic level? 
#153




Quote:
Quote:
Last edited by DPRK; 05-08-2019 at 04:59 PM.
#154




Quote:
Quote:
~Max 


#155




Re: Marbles
Quote:
Quote:
Quote:
~Max 
#156




Quote:
~Max 
#157




And we still call Pluto a "planet," but science books have a big fat asterisk next to that.

#158




Quote:
Quote:
Quote:
So assume some 'less grey' state G_{<} can be realized by means of white and black marbles in ten different ways (corresponding to microstates G_{<}^{1} through G_{<}^{10}). Assume there are more ways to realize the 'more grey' state G_{>} than there are ways of realizing the 'even less grey' state G_{<<}. Then, more of these ten states will evolve towards the more grey state than will evolve towards the less grey one. Say there are eight states G_{>}^{1} through G_{>}^{8}, and the states G_{<<}^{1} and G_{<<}^{2}. These are the states available from either of the states realizing G_{<} on the next timestep. Then, eight out of ten times, if the system is in the state G_{<}, we will see it evolve into G_{>}. (Perhaps it helps to recall, here, that a reversible evolution always takes different states to different states, as, if it didn't, i. e. by taking two different states to the same following state, you can't tell which was the original state from looking at the later one, and thus can't reverse the evolution.) Thus, at each timestep, we're more likely to observe an increase in greyness, simply because there are more ways to get more grey. Last edited by Half Man Half Wit; 05-08-2019 at 11:35 PM.
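The counting argument in this post can be checked numerically. A minimal sketch, assuming only the state counts given above (ten microstates realizing G_{<}, whose available successors comprise eight G_{>} microstates and two G_{<<} microstates); a "random reversible evolution" is modelled as a random one-to-one assignment of successors:

```python
import random

random.seed(0)

# Successor states available to the ten G_< microstates:
# eight realize the 'more grey' macrostate G_>, two realize G_<<.
successor_labels = ['more'] * 8 + ['less'] * 2

trials = 10_000
more = 0
for _ in range(trials):
    random.shuffle(successor_labels)   # a random reversible (one-to-one) evolution
    start = random.randrange(10)       # the system sits in a random G_< microstate
    if successor_labels[start] == 'more':
        more += 1

print(more / trials)   # close to 0.8: eight out of ten times, greyness increases
```

Nothing here depends on which particular one-to-one map is chosen; only the 8-to-2 split of available successors matters, which is the point of the post.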
#159




It's the same with the second law of thermodynamics; textbooks generally point out that it's only valid statistically.



#160




Quote:
(OK, I know with "randomness" there's a degree of interpretation involved and it's possible there could be nonlocal hidden variables. Half Man Half Wit will know better than me. But I'm not aware of any effort to interpret away chaotic behaviour. Or anyone that claims it breaks physical laws.) Last edited by Mijin; 05-09-2019 at 12:38 AM.
#161




I continue to feel that my claim ...
Unlike the Laws of Newton, Maxwell and Einstein, the Second Law isn't a dynamical law; it's just a statistical fact, closely akin to the Law of Large Numbers. ... is correct, and should have quiesced this bickering! Everything.
Quote:
Quote:

#162




Quote:
So far the only violations of the second law have been violations of Clausius's corollary, using different definitions of entropy. And it seems to me that said definitions of entropy also imply the possibility that the laws of Newton, Einstein, and Maxwell can also be violated, although the probability is vanishingly small. ~Max 
#163




Re: Re: Re: Flipping Coins
Quote:
I agreed that the law could be formulated but I did not agree that the steps between 5 and 18 of that example were valid, logical steps. I never agreed with your logic past step 5, because you missed a very important premise. See my post on dice logic, #150. ~Max 
#164




Quote:
~Max 


#165




Greyness
Quote:
We can see now that a point on the leftmost position, -10, has a 19/21 probability of becoming "more grey" or "less entropic" at the next step, and a 2/21 probability of remaining at the same distance from grey or of having no change in entropy. So does a point on the rightmost position. Here are the other values (you may need to switch forum themes at the bottom left of the page): Code:
Position to Entropy
Pos   Less   Same   More
-10   19/21  2/21    0/21
 -9   17/21  2/21    2/21
 -8   15/21  2/21    4/21
 -7   13/21  2/21    6/21
 -6   11/21  2/21    8/21
 -5    9/21  2/21   10/21
 -4    7/21  2/21   12/21
 -3    5/21  2/21   14/21
 -2    3/21  2/21   16/21
 -1    1/21  2/21   18/21
  0    0/21  1/21   20/21
  1    1/21  2/21   18/21
  2    3/21  2/21   16/21
  3    5/21  2/21   14/21
  4    7/21  2/21   12/21
  5    9/21  2/21   10/21
  6   11/21  2/21    8/21
  7   13/21  2/21    6/21
  8   15/21  2/21    4/21
  9   17/21  2/21    2/21
 10   19/21  2/21    0/21
And this is after assuming the equiprobability of microstates at each measure of entropy, which I do not wish to assume. ~Max
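Max's table can be reproduced mechanically. The sketch below assumes the model the numbers imply (an assumption, not stated in the post): the next position is drawn uniformly from all 21 positions -10..10, and "less entropic" means the new position lies closer to grey, i.e. has smaller absolute value:

```python
from fractions import Fraction as F

positions = range(-10, 11)   # the 21 positions -10 .. 10

def row(p):
    # 'less entropic' here means closer to grey, i.e. smaller |position|
    less = sum(1 for q in positions if abs(q) < abs(p))
    same = sum(1 for q in positions if abs(q) == abs(p))
    more = sum(1 for q in positions if abs(q) > abs(p))
    return F(less, 21), F(same, 21), F(more, 21)

for p in positions:
    less, same, more = row(p)
    print(f"{p:3}  {less}  {same}  {more}")
```

The loop prints exactly the Less/Same/More columns of the table, including the special middle row, where staying at the same distance from grey is only possible by landing on 0 itself.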
#166




Quote:
There are three (or four) forms of the second law of thermodynamics which are not heuristic, and these are the ones I was taught in school and cited in the first post. So in order to disprove those laws, you would need to show a contradiction. That's what this thread is about. ~Max Last edited by Max S.; 05-09-2019 at 10:01 AM.
#167




Quote:
Or, in other words, consider the following. You let a ball drop a hundred (a thousand, a million...) times. It always falls down. From this, you formulate a law: stuff falls down. Now suppose that whatever deity has created the universe has made it so that the actual law is: stuff falls down, except once every sextillion times, when it just hovers in place. You've got no data to support that the law is actually the latter, and never will observe any. Thus, you still formulate the law as 'stuff falls down'. You're completely justified in doing so; however, you happen to be wrong. That's the situation we're in here: whenever we try, we will find greyness increasing with overwhelming probability. That is, in any concrete, reasonably large series of trials, we won't observe a violation. Thus, we formulate a law to the effect that greyness always increases. This law stands on equal footing with every other physical law: it's a generalization from finitely many observations. Quote:
Take a coin. Suppose you can only throw it in two different ways: way A and way B. Way A always lands heads up; way B always lands tails up. You don't have control over whether you've thrown it according to way A or way B (random initial conditions, you remember). Then, the probability that it comes up heads is 50% on each throw. No further assumptions necessary. Suppose now you can throw the coin in 20 different ways, 10 of which come up heads, 10 of which come up tails. This yields the same conclusion. As does supposing that there are 100, or 1000, and so on, different ways. What matters is that from the set of possible initial conditions, half of them yield heads, and half of them yield tails. Quote:
Quote:
You assume that entropy gain and loss are equally likely. You get out that entropy gain and loss are equally likely. This isn't surprising. 
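The coin argument from this post is easy to simulate. A minimal sketch: the 20 ways of throwing and their 10/10 heads-tails split come from the post; the uniform choice among ways stands in for the "random initial conditions" (that uniformity is the one modelling assumption):

```python
import random

random.seed(42)

# 20 ways to throw the coin; ten always land heads, ten always land tails.
ways = ['H'] * 10 + ['T'] * 10

throws = 100_000
heads = sum(1 for _ in range(throws) if random.choice(ways) == 'H')

print(heads / throws)   # close to 0.5: no fair-coin assumption was made,
                        # only a uniform choice among the ways of throwing
```

The coin itself is fully deterministic here (each way always lands the same side up); the 50% emerges purely from the split of the initial conditions.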
#168




Quote:
~Max 
#169




Quote:
~Max 


#170




Quote:
If by 'the system tends to greyness' you mean the disparity in color between the left and right boxes will always decrease over time, again this is disproven with 100% probability so long as the observer has enough time to watch the system. If by 'the system tends to greyness' you mean 'after removing the wall, each box will usually be some shade of grey as opposed to absolute black or white', I have no qualms. But that is not a law. Quote:
~Max 
#171




Quote:
Quote:
Quote:
You're right to point out that this is, ultimately, wrong; but the observer has no way to know that, without having the microscopic theory of greyness. Quote:
We're talking about generalizations made from actually feasible observations. Given this, while there is an astronomically small probability of actually observing a violation of the law of increasing greyness, the far more probable course is going to be that, during the tens or hundreds or thousands of times that the experiment is repeated, no violation is observed, and thus, the law has the status of any other physical law ever formulated. Quote:
Quote:
Take my introductory example. Here's again the distinguishable macrostates, together with the number of their microscopic realizations:
No matter what the microscopic dynamics are, each of the three microstates realizing, say, (A2B1C0) has 6 states corresponding to macrostates of higher entropy it can evolve to, 17 states of equal entropy, but only 3 states of lower entropy. Does that help clear things up? 
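The counts used here (27 microstates in total; 6 realizing the single maximum-entropy macrostate, 17 others of equal entropy for an intermediate state like (A2B1C0), and 3 of lower entropy) are exactly what you get for three distinguishable marbles distributed over three boxes. A sketch enumerating them under that assumption:

```python
from collections import Counter
from itertools import product

boxes = "ABC"

# A microstate says which box each of three distinguishable marbles is in;
# the macrostate only records the occupation numbers, e.g. (2, 1, 0).
microstates = list(product(boxes, repeat=3))

def macrostate(m):
    c = Counter(m)
    return tuple(c.get(b, 0) for b in boxes)

multiplicity = Counter(macrostate(m) for m in microstates)

print(len(microstates))            # 27 microstates in total
print(multiplicity[(1, 1, 1)])     # 6 realizations of the max-entropy macrostate
print(sorted(multiplicity.values(), reverse=True))   # [6, 3, 3, 3, 3, 3, 3, 1, 1, 1]
```

That last line shows the split: one macrostate with 6 microstates, six intermediate macrostates with 3 each (18 in total), and three minimal macrostates with 1 each.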
#172




If you are studying these marbles, once you have an ergodic Markov chain, then it will satisfy the asymptotic equipartition theorem; this is a mathematical result in information theory. You can compare the entropy rate of the forward and time-reversed process and it will satisfy the fluctuation theorem. At this point, you are talking about mathematical theorems, like the Law of Large Numbers, so there really isn't anything open to interpretation unless one wants to philosophize about it like Rosencrantz and Guildenstern.

#173




Quote:
Quote:
Quote:
Quote:
Quote:
~Max 
#174




Quote:
~Max 


#175




Quote:
Quote:
Quote:
Quote:
That's what our experimenter sees: exactly what you would see. Quote:
Let's take the smallest possible change of something like the coin system. Take one pixel, and flip its color. Suppose you start with an all-black state. Then, flipping one pixel's color will make the system more grey with certainty: every other possible state is one with higher 'greyness'. Then, take the resulting state, and again, flip one pixel: if you flip any pixel other than the one you've flipped before, you will again move towards a state of higher greyness. And so on: there will be more states of higher greyness to flip to, until you've reached a 50/50 distribution of black vs. white pixels. If you flip a pixel there, you will get to a state that's ever so slightly (but undetectably) less grey. The next flip will return you to the equilibrium with a probability of 50% (actually, very slightly more than that, since there is now either one more black or white pixel).

Now, the important thing to realize is that this doesn't depend on the microdynamics; in particular, it doesn't assume that this dynamics is random. Rather, this is a conclusion that applies to generic deterministic and reversible dynamics. Think again back to the original, all-black system: any possible microscopic evolution will lead to a more grey state. After that initial change, still almost every possible evolution will lead to a more grey state. And so on, up until you reach equilibrium. There, half of all evolution laws will lead to a less grey state; but, even for that half that lead away from equilibrium, most will return quickly to it, with some holding out longer, and only one making it all the way to the all-white state, before going back.

In conclusion, as long as you don't take care to set up a very special evolution of the system (a point I had stressed from the beginning, you will recall), any generic microdynamics will, at any point in the evolution of the system, tend to increase the overall greyness with overwhelming probability.
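The pixel-flipping story can be simulated directly. In the sketch below, the 100-pixel size and the flip-one-randomly-chosen-pixel dynamics are illustrative choices (the post describes flipping single pixels, but doesn't fix a system size or how the pixel is picked):

```python
import random

random.seed(1)

N = 100                 # number of pixels (an illustrative choice)
state = [0] * N         # start all black, as in the post

towards_grey = away_from_grey = 0
for _ in range(10_000):
    before = abs(sum(state) - N // 2)   # distance from the 50/50 mix
    state[random.randrange(N)] ^= 1     # flip one randomly chosen pixel
    after = abs(sum(state) - N // 2)
    if after < before:
        towards_grey += 1
    else:
        away_from_grey += 1

print(towards_grey, away_from_grey)
```

Starting from all black, moves towards grey outnumber moves away; once the walk reaches the 50/50 region it hovers there, with the slight restoring bias the post describes (flipping is more likely to hit a pixel of the majority color).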
#176




Quote:
Quote:
Quote:
Quote:
Quote:
~Max Last edited by Max S.; 05-09-2019 at 04:49 PM.
#177




I was under the impression you were discussing processes that randomly mix up marbles and wanted to understand why they inevitably get scrambled rather than unscrambled.

#178




Quote:
~Max 
#179




Quote:
Quote:
Quote:
Quote:
Take my original system. It has 27 microstates. We can represent the microstate by means of a 27-dimensional vector, that has a '1' in the row corresponding to the microstate, and a '0' everywhere else, like this: Code:
        (0)
        (0)
        (.)
        (.)
        (.)
S_{i} = (1)
        (.)
        (.)
        (.)
        (0)
Any given evolution law of this system will be a 27 x 27 matrix M, which takes one 27-dim vector to a different 27-dim vector by ordinary matrix multiplication: S_{j} = M*S_{i} (That is, the system starts out in S_{i}, is evolved for one timestep, and ends up in S_{j}.) The matrix evolves S_{i} into S_{j} if and only if the ith entry in its jth row is '1'.

This matrix must obey some constraints. First, any given state can only evolve into one other state (this is determinism). Consequently, the matrix can only have one entry different from 0 in each column. Likewise, only one state can evolve into any given state (this is reversibility: each state must have a unique precursor). Consequently, there can be only one nonzero entry per row, as well. Thus, all the valid laws of evolution for the system take the form of a matrix M such that each row and column have exactly one entry equal to 1, and the rest 0 (there are thus 27 nonzero entries in the matrix).

Now, we can count how often each of these evolutions will increase vs. decrease entropy. Take first the upper left 6 x 6 submatrix: any entry here will only 'mix' the maximum-entropy states among themselves, thus yielding constant entropy. Then, take the 18 x 18 submatrix 'in the middle'. This mixes the intermediate-entropy states, thus likewise leaving entropy unchanged. Same for the 3 x 3 submatrix in the lower right corner, which mixes the min-entropy states.

Now take the lower left 3 x 24 submatrix (that is, everything left over if you take the lower right 3 x 3 away). These are more interesting: they are entries that take any of the 24 states of nonminimal entropy, and transform them into states of minimal entropy. These are, thus, entropy decreasing. Because of the properties of the matrix, there can be at most three such entries. Then, take the left 18 x 6 strip: they are the entries that take any of the 6 maximum entropy states to one of the 18 intermediate ones.
These are, likewise, entropy decreasing; again, there can be maximally 6 entries here (but, if there are any entries here, we have to be careful not to double count: if there is an entry in the first three columns, there can't be any entries in the lower left 3 x 3 submatrix, since this would entail a maximum entropy state both transitioning to a lower entropy and a minimum entropy one, in violation of determinism). These are all the entries that may lower entropy: transitioning from a maximum entropy state to either an intermediate or minimum entropy state, or transitioning from an intermediate entropy state to a minimum entropy state.

Consequently (and here's the kicker) any evolution law we can write down for the system can have at most nine entries that lead to a lowering of entropy, and must, consequently, have 18 entries that keep entropy constant or increase it. Hence, no matter what the microdynamics are (no matter which particular M we choose) it follows, just from the count of microstates, that it takes any given state to a lower-entropy one at most one out of three (9/27) times. Consequently, entropy increases or stays constant more often than it decreases. And that's all I've been saying.
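The "at most nine entropy-decreasing entries" bound can be spot-checked by brute force. In this sketch, an evolution law is a permutation of the 27 states, with 6 maximal-entropy, 18 intermediate, and 3 minimal-entropy states as in the argument above (the 0-based state numbering is an implementation choice):

```python
import random

random.seed(0)

# Entropy class of each of the 27 states (0-based indices):
# 2 = maximal (6 states), 1 = intermediate (18 states), 0 = minimal (3 states).
level = [2] * 6 + [1] * 18 + [0] * 3

def decreasing_transitions(perm):
    # perm[i] is the successor of state i under a deterministic,
    # reversible evolution law, i.e. a permutation of the 27 states
    return sum(1 for i, j in enumerate(perm) if level[j] < level[i])

worst = 0
for _ in range(20_000):
    perm = list(range(27))
    random.shuffle(perm)
    worst = max(worst, decreasing_transitions(perm))

print(worst)   # never exceeds 9, as the counting argument predicts
```

Random sampling can't prove the bound, of course, but the counting argument does: at most 3 states can transition into the 3 minimal-entropy states, and at most 6 out of the 6 maximal-entropy states, for 9 decreasing entries in total.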


#180




Quote:
Quote:
Quote:
Could not the row and column both be all zeroes? For situations where some microstates are never visited. My intuition is that most systems do not visit every microstate. ~Max 
#181




Quote:
Quote:

#182




Quote:
Quote:
You would, however, need to add a rule that the evolution matrix has a value for the initial state vector, and that each path through the matrix comes full circle. ~Max 
#183




Because otherwise, you'll end up with some states for which you don't have any evolution law. I mean, what'll happen if you set up the system in that state? If the matrix doesn't say, what does?
Quote:
Quote:
Last edited by Half Man Half Wit; 05-10-2019 at 10:53 AM.
#184




Quote:
The other option is to have multiple independent paths in an evolution matrix, such that perhaps M[27] points to itself (does not evolve) and all of the other 26 states form a loop: 1 -> 2 -> 3 -> ... -> 26 -> 1 -> 2 -> ... Actually that makes more sense. Quote:
Let's say this system has exactly six distinct microstates:
The evolution matrix would therefore be 6x6 and the state vector index six-dimensional, and is represented as such (I am representing S_{i} as a decimal): Code:
S_{j} =     1 2 3 4 5 6
S_{i} = 1   0 0 0 1 0 0
S_{i} = 2   0 1 0 0 0 0
S_{i} = 3   0 0 0 0 0 1
S_{i} = 4   1 0 0 0 0 0
S_{i} = 5   0 0 0 0 1 0
S_{i} = 6   0 0 1 0 0 0
~Max Last edited by Max S.; 05-10-2019 at 11:37 AM.
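Reading each row of Max's matrix as "state S_i evolves to the column holding the 1" gives the successor map below; a quick sketch confirms the map is reversible and splits into independent cycles, which is the structure Max is describing:

```python
# Successor of each state, read off the rows of the 6x6 matrix above
succ = {1: 4, 2: 2, 3: 6, 4: 1, 5: 5, 6: 3}

# Reversibility: every state occurs exactly once as a successor
assert sorted(succ.values()) == sorted(succ)

# Extract the independent cycles of the evolution
seen, cycles = set(), []
for s in succ:
    if s in seen:
        continue
    cycle, cur = [], s
    while cur not in seen:
        seen.add(cur)
        cycle.append(cur)
        cur = succ[cur]
    cycles.append(cycle)

print(cycles)   # [[1, 4], [2], [3, 6], [5]]
```

So this particular law has two fixed points (states 2 and 5) and two 2-cycles; any valid deterministic, reversible law decomposes into such cycles.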


#185




Quote:
Quote:
Quote:
The point you're making isn't valid, however: while of course, you can still set up cyclical evolutions, this doesn't change the fact that since there's more ways for a state to evolve to a higher entropy, typically, you will observe entropy increase. Indeed, it's not hard to see that eventually, any evolution will get back to the initial state. This doesn't change the conclusion: if you observe the system in a low entropy state, there are more cases that the next state is a higher entropy one than a lower entropy one. I mean, if it's a minimum entropy state, this should be clear: every step either yields constant entropy, or an increase. If every lowest-entropy state leads to another lowest-entropy state, then, in particular, none of the intermediate or high entropy states can evolve into a lowest-entropy state. Then, for every intermediate-entropy state, each next step will either increase entropy, or will leave it constant. And so on. The thing is, you need to consider every possible state. If you eliminate the possibility of entropy increase for one, then another must allow it (except for the limiting case where entropy always is constant, as e. g. if every state stays the same). Since there are more ways to increase entropy, some of them must go uncompensated.
#186




Quote:
I still don't see how there are necessarily more ways to increase entropy. Those entropy-changing steps could all be part of a different evolutionary cycle within the matrix. For example, S_{1-6} could form a cycle, S_{7-24} could form another cycle, and S_{25-27} could form a third cycle. In this matrix entropy never changes, in direct contradiction to your assertion. ~Max
#187




I don't think this example of some states (27?) evolving according to an arbitrary permutation matrix is a good illustration of mixing or ergodicity or entropy production. For instance, suppose your matrix M above is the identity matrix.

#188




Quote:
What I said was that you can't have an uncompensated entropy decrease, and if you have the chance of a decrease, there will be a greater chance of increase. Take the case where two of the minimum entropy states just oscillate among each other, one evolves to an intermediate state, one intermediate state evolves to that low entropy state, and the high entropy states just cycle among themselves. Then, if the system is in a high-entropy state, it will stay high. In an intermediate state, one out of 18 times, we will observe a reduction; otherwise, entropy stays constant. In a low entropy state, however, one out of three times, you get an entropy increase. Hence, the chance of observing an increase (or no change) is much higher than of observing a decrease.
#189




Well, if you turn off the dynamics, then you'll indeed have constant entropy, but that's also the case with a gas. Plus, you can write every evolution law in this way (if you discretize the system), so it's sufficiently general.



#190




Quote:
You say the probability of observing a reduction in entropy during a random step in cycle S_{324} is 1/18 and the probability of observing no change is 17/18. That doesn't follow at all: there are 21 steps in that cycle. The probability of any random step increasing entropy is 1/21, the probability of entropy remaining constant is 19/21, and the probability of entropy decreasing is 1/21. ~Max
#191




Quote:
Quote:
~Max 
#192




Quote:
So if the system is in one of the states 1-6, entropy will remain constant. For one state among 7-24, entropy will decrease. For one state from 25-27, entropy will increase. Hence, the probability of observing a reduction given that the system is in an intermediate entropy state is 1/18. The probability of observing an increase in entropy given that the system is in a low-entropy state is 1/3. The 'given that' is what's usually called the 'past hypothesis'. We find the universe in a low entropy state; the second law concerns the probability of what happens given that we do so. In other words, for more of the low entropy states that we could find the system in, we observe an increase (or constancy) rather than a decrease.
#193




OK, but aren't we getting away from dynamic considerations of entropy (and note, by the way, that considered as a discrete dynamical system the Kolmogorov-Sinai entropy of your process is always zero) and saying things that verge on the tautological, along the lines of: if we partition our space into a big subset and a little subset, then permute all the states, then a majority of points land in the big subset? Or that the system is most probably in the most probable state?
Classically (too classically?), it seems that the entropy of a body describes its average properties when at equilibrium, or at least observed over some noninfinitesimal period of time so that some probability distribution arises. Last edited by DPRK; 05-10-2019 at 02:07 PM.
#194




Quote:
~Max Last edited by Max S.; 05-10-2019 at 02:06 PM.


#195




Quote:
Quote:

#196




No. Basically, if there's a way in, there's a way out. So if one of the intermediate entropy states evolves to a low entropy one, then (at least) one of the low entropy states can't evolve to another low entropy state (since every state has a unique precursor, and the precursor of one of the low entropy states is an intermediate entropy state, and thus, can't be a low entropy state), and hence, must evolve to a higher entropy state.

#197




Quote:
Quote:
~Max 
#198




1-6 are the high-entropy states, corresponding to the (single) macrostate (A1B1C1), see above.

#199




Quote:
So we have the following rules:
There are 27 possible states.
The initial state is equiprobable: 1/27.
The probability of the initial state being a high-entropy state is 6/27.
The probability of the initial state being an intermediate-entropy state is 18/27.
The probability of the initial state being a low-entropy state is 3/27.
6 + 18 + 3 = 27, so that checks out.
The probability of any random observation of a high entropy state changing entropy is 0.
The probability of any random observation of a high entropy state keeping consistent entropy is 1.
The probability of any random observation of an intermediate entropy state changing to a low entropy state is 1/18.
The probability of any random observation of an intermediate entropy state keeping consistent entropy is 17/18.
The probability of any random observation of an intermediate entropy state changing to a high entropy state is 0.
The probability of any random observation of a low entropy state keeping consistent entropy is 2/3.
The probability of any random observation of a low entropy state changing to an intermediate entropy state is 1/3.
The probability of any random observation of a low entropy state changing to a high entropy state is 0.

Therefore, the probability of a random observation showing an increase in entropy is 1/3 * 3/27 = 1/27. The probability of a random observation showing no change in entropy is 6/27 + 17/18 * 18/27 + 2/3 * 3/27 = 25/27. The probability of a random observation showing a decrease in entropy is 1/18 * 18/27 = 1/27. 1/27 + 25/27 + 1/27 = 27/27, so that checks out.

But wasn't your assertion that the probability of observing an increase is greater than the probability of observing a decrease? ~Max
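Max's arithmetic can be verified with exact fractions; this only re-computes the numbers above under his stated assumptions (uniform initial state over the 27 states, and the particular cycle structure under discussion):

```python
from fractions import Fraction as F

# Probability of finding the system in each entropy class
# (uniform over the 27 states)
p_high, p_mid, p_low = F(6, 27), F(18, 27), F(3, 27)

# Marginal probabilities of what the next step shows
p_inc  = p_low * F(1, 3)     # only a low-entropy state can increase entropy
p_dec  = p_mid * F(1, 18)    # only one intermediate state decreases entropy
p_same = p_high * 1 + p_mid * F(17, 18) + p_low * F(2, 3)

print(p_inc, p_same, p_dec)   # 1/27 25/27 1/27
```

So under these assumptions the marginal probabilities of increase and decrease are indeed equal, which is exactly the tension Max is pointing at (the asymmetry in the thread's argument is conditional on the entropy class the system is found in, not marginal).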


#200




Quote:
~Max 

