

#201




Think about it this way: it is more likely that a low entropy state evolves to a high entropy state than that a high entropy state evolves to a low entropy state.

#202




Say you've got a box with all the white marbles to the left, and all the black marbles to the right. Then, you give it a good shake. What do you think will happen, and why?

#203




Quote:
You have yet to show the truth of this statement. In the evolution matrix example it turned out the probability of entropy decreasing is 1/27 and the probability of entropy increasing is 1/27, exactly the same probability. The chance of a high entropy state evolving into a low entropy state was zero but the increases and decreases still occurred with low<>intermediate entropy states. Remember this is your definition of entropy, too. ~Max 
#204




Quote:
~Max Last edited by Max S.; 05-10-2019 at 04:24 PM. 


#205




Quote:
At the macro level, however, you can only distinguish between high-, medium-, and low-entropy states. So if you find the system in a medium entropy state, the chance is 1/18 that it will evolve to a low entropy state; if you find the system in a low entropy state, the chance is 1/3 that it will evolve into a medium entropy state. So one out of 18 times you find the system to spontaneously lower its entropy, while one out of three times, it spontaneously increases. 
#206




What do you expect is gonna happen if you shake it some more? And, more importantly, why?

#207




Quote:
~Max 
#208




Quote:
So the more you shake the box, the more the system's energy increases. So too does entropy* increase. In fact it is impossible for the entropy of the system to decrease while you are shaking the box. *my definition of entropy, dQ/T. ~Max Last edited by Max S.; 05-10-2019 at 04:47 PM. 
#209




If you think about a system in thermal equilibrium with a heat bath, that corresponds to Gibbs's canonical ensemble. The temperature is fixed, but the energy is not. If you calculate the equilibrium state you will find that, big surprise here, the Gibbs entropy is maximized.



#210




Quote:
By the formula for conditional probability, the probability that the system moves to a state of lower entropy, given that it is in a state of medium entropy, is (1/27) / (18/27): the probability of being in a medium entropy state AND moving to a lower entropy, which occurs for one of the 27 possible cases, divided by the probability of being in a medium entropy state. Naturally, that's 1/18. 
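The conditional-probability arithmetic in this post can be checked mechanically; here is a minimal Python sketch (the 1/27 and 18/27 figures are taken from the evolution-matrix example discussed earlier in the thread):

```python
from fractions import Fraction

# Figures from the 27-case evolution-matrix example in the thread:
p_medium_and_down = Fraction(1, 27)  # in a medium-entropy state AND moving down
p_medium = Fraction(18, 27)          # in a medium-entropy state at all

# Conditional probability: P(down | medium) = P(medium and down) / P(medium)
p_down_given_medium = p_medium_and_down / p_medium
print(p_down_given_medium)  # 1/18
```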
#211




Quote:
So, again: you keep shaking (countering friction; alternatively, you give the box a single whack, setting up an initial state, if there is no friction): what do you expect happens? Don't think about entropy, for now, or heat, or any such confounders. Just regarding the mixture of marbles: will they tend to stay separated? Or is it more likely that they become more and more mixed? 
#212




Quote:
~Max 
#213




Quote:
~Max 
#214




Why would that be? I want to know which one is more likely: that the system moves to a lower entropy state from a higher entropy state, or that it moves to a higher entropy state from a lower entropy state. So these are exactly the numbers I have to compare.
If I were to do as you say, and weight the two probabilities with the probability of finding the system in a low or high-entropy state, respectively, I would instead compare the probability of finding the system in a low entropy state and having it transition to a higher one with that of finding it in a high entropy state and having it transition to a low entropy one. That's a different situation: there, I have not yet looked at the system, and want to know what to expect if I look; but what we want to know is, after we have looked at the system, what (most likely) happens next. 


#215




Quote:
In other words, if I offer you a hundred bucks if you shake the system and it gets less mixed, and you have to give me a hundred bucks if it gets more mixed, do you take that bet? 
#216




Quote:
I was myself wrong to dispute your statement in post #201. But that is not what you set out to prove, is it? ~Max 
#217




Quote:
I hate to fight the hypothetical but I have no place predicting the behavior of the marbles with so little information. Especially not for a $100 bet. ~Max 
#218




Quote:
So let's think about how one might come away with that law. The experimenter may set the system up in either a low, intermediate, or high entropy state. If they set it up in a high entropy state, they'll always find it to stay constant. If they set it up in a low entropy state, they'll always find it to stay constant, or, with some sizeable probability that increases sharply once we get to larger system sizes, to increase. If, now, they set it up in an intermediate entropy state, they will find that, with overwhelming likelihood, the entropy will stay the same. Indeed, for some sufficiently large system, they likely never will observe it to decrease. Consequently, they'll formulate a law, generalizing from their observations, that entropy always increases, or stays constant. We have more information, however. We know that this quantity, entropy, is realized by a system deterministically evolving among microstates. We can thus conclude that once in a (very great) while, entropy must also decrease. The law that was formulated thus is not an exact, but merely a statistical one. For any sufficiently large system, that we'll ever observe such a violation is vanishingly unlikely (think 'a tornado rushing through a junkyard spontaneously assembling a jumbo jet' unlikely). But for small enough systems, these violations may be observable. Moreover, we can predict just how likely such violations are going to be. That's the fluctuation theorem. And, indeed, if we do the experiment, this prediction is confirmed. 
#219




Quote:
This is, by the way, completely irrelevant to the hypothetical, because of course, if you shake it, it will get more mixed. I mean, you can't really reasonably think otherwise; you're just trying special pleading to get out of accepting an unwelcome conclusion. 


#220




(One awkwardness of the current setup, I should point out before we bump into it, is that the absolute number of high entropy states is smaller than the number of intermediate entropy states. Thus, you could have all high entropy states transition to intermediate entropy states; this is no longer the case for larger systems, where the high entropy states will vastly dominate.)

#221




Quote:
You cannot simply assume that the system starts out in an intermediary state, or you are rigging your experiment. By doing so you have reduced 1/27, 25/27, 1/27 to 1/18, 17/18, and 0/18. Your contention, if I recall, is that the probability of observing an increase is astronomically higher than observing a decrease. ~Max 
#222




Quote:
As a relevant mathematical exercise, flip a fair coin a bunch of times (N times) and compute the average value, where heads count as 0 and tails as 1. Now compute the probability that the average is greater than x (e.g. x could be 0.6). This average value is your "macrostate". You will find that the probability decreases exponentially as a function of minus the "entropy". You are more likely to observe a higher-entropy state. Last edited by DPRK; 05-11-2019 at 12:51 PM. 
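This exercise can be carried out exactly with the binomial distribution; a sketch in Python, using the example threshold x = 0.6 from the post:

```python
import math

def tail_prob(n, x=0.6):
    """P(average of n fair coin flips > x), computed exactly from the binomial."""
    k_min = math.floor(x * n) + 1  # smallest count with average strictly above x
    return sum(math.comb(n, k) for k in range(k_min, n + 1)) / 2 ** n

# The probability of seeing a macrostate that far from 0.5 falls off
# exponentially with the number of flips:
for n in (10, 100, 1000):
    print(n, tail_prob(n))
```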
#223




Quote:
~Max 
#224




Quote:
~Max 


#225




Quote:
Within a specific evolutionary cycle there need not be an equal number of relatively high and low entropy states. But the cycle is periodic and by definition any increases in entropy must be matched with decreases of equal total magnitude. With only two accessible levels of entropy the probability of randomly observing an increase in entropy must be equal to the probability of observing a decrease. With more than two accessible levels of entropy it is possible for a random observation to favor an increase over a decrease in entropy, for example if entropy was plotted as a sawtooth wave. But without defining the evolution cycle this is not a given; entropy could just as well be plotted as a sine wave. ~Max 
#226




Quote:

#227




Quote:
~Max 
#228




For our purposes it does not matter if the flips are "absolutely random"; you may assume so. Or instead of coins think of them as non-interacting atoms that may be in one of two spin states. That is all beside the point, which is not that there is a "probability distribution", but to calculate what it is and verify the exponential falloff of the probability of any deviation.

#229




Quote:
For a fully deterministic system where the initial state and number of steps and all other variables are given, there is only one accessible state at the end of the trial. That gives me S = k_{B} * ln(1) = 0. So let's say everything is defined except the initial state. In the coin flipping-over example that gives us two accessible states at the end of the trial, depending on which of two initial states was chosen. For a system with two equiprobable states, that gives me system entropy of S = k_{B} * ln(2) or about 9.56993x10^{-24} joules per kelvin. But this definition of entropy is macroscopic: it says nothing about which microstate is "high entropy" and which state is "low entropy". As soon as we apply the actual microstate the macroscopic entropy drops to zero. Could you explain what you mean by "ending up in a smaller region becomes exponentially unlikely"? I'm not making any connection between the probability of any end state in a coin flipping exercise and that conclusion. ~Max 


#230




Quote:
The states as I have given them initially are what we have macroscopic control over; they're how we set up the system in the experiment. We don't have control over which of the possible microstates realizes that particular macrostate; that's where the probabilistic part comes in. Your description would correspond to an 'experiment' where we just set up the system randomly, and see what happens. In that case, we'd most likely (again, in the sense of 'with virtual certainty') just always see systems at nearly maximum entropy not doing much at all. But real experiments will typically involve setting up a low-entropy state, and then looking to see what happens. Think about the historic experiments with steam engines and the like. In that case, the probabilities as I have given them are the only appropriate ones. Quote:
Quote:
The maximum entropy state is 'half the coins showing heads, half showing tails'. This state can be realized in (100 choose 50) ~ 10^{29} ways. There are thus 10^{29} microstates corresponding to the maximum entropy state, and 1 corresponding to the minimum entropy state. Simple continuity shows that each state with an in-between amount of entropy must have an in-between number of microscopic realizations, as well. Hence, vastly more microstates correspond to high-entropy states than to low-entropy states. Or, being even more explicit, look at the fraction of microstates of the form 'x coins out of a 100 showing heads' compared to the maximum-entropy '50 showing heads'. You can easily calculate that 96.5% of all microstates lie in the range between 40 and 60 coins flipped. If we double the number, with 200 coins, 99.6% of all microstates lie in the interval between 80 and 120 coins showing heads. For any remotely macroscopic system, thus, with on the order of 10^{23} atoms (instead of 200), each of which can be in a huge number of states (rather than a coin's two), if you just randomly look at the system, you're basically guaranteed to find it in a state extremely close to the maximum entropy state. So this idea, that the real-world experiment is modeled by just observing the system in a random state, simply doesn't work, because then, with virtual certainty, you just won't ever observe it doing anything. Rather, a real-world experiment involves setting up a low-entropy state. Upon doing so, you will, with virtual certainty, as I've shown, observe it evolving into a higher-entropy state, simply by virtue of the relative number of accessible microstates. From these observations, the second law may be, and historically was, abstracted. With increased understanding of the microscopic basis of matter, it became clear that the second law can only apply statistically. And so it does. Last edited by Half Man Half Wit; 05-12-2019 at 10:08 AM. 
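The counting claims in this post are easy to verify with exact binomial coefficients; a quick Python sketch:

```python
from math import comb

# Microstates of 100 coins realizing the maximum-entropy '50 heads' macrostate:
print(comb(100, 50))  # about 1.01 * 10**29

# Fraction of all 2**100 microstates with between 40 and 60 heads:
frac_100 = sum(comb(100, k) for k in range(40, 61)) / 2 ** 100

# Same for 200 coins, between 80 and 120 heads:
frac_200 = sum(comb(200, k) for k in range(80, 121)) / 2 ** 200

print(round(frac_100, 3), round(frac_200, 3))  # 0.965 0.996
```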
#231




Quote:
Quote:
In our example the macrostate is the observed "mean energy" x. What I was getting at is, some of these states are more probable than others, the most probable state being x = 0.5. How improbable are the other states? Well, the probability of observing a mean greater than or equal to x is bounded by [let's assume 1.0 > x > 0.5 here to get the correct signs] exp(-N(log 2 + x log x + (1-x) log(1-x))); by symmetry there is the same probability of a state with mean less than 1-x. You can see that for a fixed deviation size, the probability of being off by that much or more decays exponentially fast as N grows; an overwhelming number of states are very close to the most probable state; this is the big, i.e., maximal entropy, region. 
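The quoted bound (a Chernoff-style large-deviation bound for the fair coin) can be checked against the exact binomial tail; a sketch, using x = 0.6:

```python
import math

def rate(x):
    # Rate function from the bound above: log 2 + x log x + (1-x) log(1-x)
    return math.log(2) + x * math.log(x) + (1 - x) * math.log(1 - x)

def tail(n, x):
    # P(mean of n fair flips >= x), computed exactly
    k_min = math.ceil(x * n)
    return sum(math.comb(n, k) for k in range(k_min, n + 1)) / 2 ** n

# The exact tail probability always sits below exp(-N * rate(x)):
for n in (20, 50, 100):
    assert tail(n, 0.6) <= math.exp(-n * rate(0.6))
    print(n, tail(n, 0.6), math.exp(-n * rate(0.6)))
```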
#232




Quote:
Quote:
Quote:
~Max 
#233




Quote:
Quote:
~Max 
#234




Quote:
Quote:
Quote:
Quote:
Quote:
Perhaps it helps if you think about all of the states of the system. Say there are k states. Now, thus, at any given time t, the system is going to be in one of those states. At time t + 1, it likewise will be in one of those states; any evolution thus is specified by specifying what state each of the system's states evolves to. (This is the idea behind the evolution matrices I introduced.) Start with the trivial evolution, which takes every state to itself. You can represent it like this: Code:
+ > +
+ > +
+ > +
+ > +
+ > +
* > *
* > *
* > *
x > x
Now, introduce any variation to that diagram. Say, have one of the high-entropy states evolve to a low-entropy state, like so: Code:
+ > +
+ > +
+ > +
+ > +
+ > *
* > +
* > *
* > *
x > x
I would encourage you to play around with this a little. See if you can find an evolution such that entropy increase won't, overall, occur more often than entropy decrease. If that doesn't make things clearer, let's just go to the most stupendously simple case, a system with two high-entropy states, and one low-entropy state. (Or more accurately, a system with one macrostate realized by two microstates, and another one realized by a single microstate.) This yields either: Code:
+ > +
+ > +
x > x
or: Code:
+ > +
+ > x
x > +
Quote:
Sure. For a single evolution, starting in some microstate, it may well be that certain states are never visited. But that means those states, then, aren't accessible anymore for evolutions starting in another microstate, which consequently will have to visit others; and carrying that through, we arrive at the general consequence that observing entropy increase is always more likely than observing decrease. Quote:



#235




Quote:
Quote:
If N = 0, macrostate_{H} = { H }
If N = 1, macrostate_{H} = { H, T }
If N = 2, macrostate_{H} = { H, T, H }
If N = 3, macrostate_{H} = { H, T, H, T }
If N = 4, macrostate_{H} = { H, T, H, T, H }
If N = 5, macrostate_{H} = { H, T, H, T, H, T }
Let us next define a function to calculate the average of heads given a particular macrostate. Let f(x) be the function calculating the ratio of heads to tails in set x. Therefore:
f(macrostate_{H}) = (floor(N/2) + 1) / (N + 1)
f(macrostate_{T}) = floor(N/2) / (N + 1)
Now you ask, what is the probability that the ratio of heads to tails is greater than 0.6? This is simply asking whether f(macrostate) > 0.6. If we pass in macrostate_{H}, we get the inequality (floor(N/2) + 1) / (N + 1) > 0.6, with two positive integer solutions of N=0 and N=2. If we pass in macrostate_{T}, this gives me floor(N/2) / (N + 1) > 0.6 which has no solutions. I take as a premise that the initial macrostate of the system is random, that is, the probability of the initial state being macrostate_{H} vs macrostate_{T} is 50%/50%. So to answer that question, if N∈{ 0, 2 } then the probability that the ratio of heads to tails is greater than 0.6 is 50%. For any other N, the probability is 0%. What does this have to do with entropy? ~Max 
#236




Three State System
Quote:
SPOILER:
You claim that, given the above system with an initial macrostate of macrostate_{L}, in the next time step entropy will either remain constant or increase with certainty. That is correct and I agree. Then you claim that, given the above system with an initial macrostate of macrostate_{H}, in the next time step entropy will either stay the same or decrease. That is also correct and I agree, although I would not use the verbiage "in one out of two cases" because that could be misinterpreted as a 50%/50% probability, which does not follow. In evolution_3 and evolution_6, for example, the probability of consistent entropy/decreasing entropy is 100%/0%. Then you claim that a system with an initial macrostate of macrostate_{L} is "virtually certain" to evolve into a higher-entropy state, simply by virtue of the relative number of accessible microstates. You have not assigned any sort of probability to the different possible evolutions of a system, so it doesn't make sense to assert that an observer is likely to observe any particular macrostate after one time step. Try as you may, unless you flesh out the evolution matrix (which means you know the microscopic dynamics) or assume the microscopic evolution is as random as the initial microstate, you cannot make that conclusion. Once you assume a random evolution, it is easy to show that a macrostate_{L} has a 10/24 probability of keeping the same macrostate after one timestep compared to a 14/24 probability of changing to macrostate_{H}; that a macrostate_{H} has a 7/24 probability of changing to a macrostate_{L} while the probability of staying macrostate_{H} is 17/24. ~Max 
#237




Quote:
Quote:
Quote:
Quote:
Quote:
Quote:

#238




Quote:
Code:
     (1 0 0)        (1 0 0)        (0 1 0)
M1 = (0 1 0)   M2 = (0 0 1)   M3 = (1 0 0)
     (0 0 1)        (0 1 0)        (0 0 1)

     (0 1 0)        (0 0 1)        (0 0 1)
M4 = (0 0 1)   M5 = (1 0 0)   M6 = (0 1 0)
     (1 0 0)        (0 1 0)        (1 0 0)
The relevance of the preceding exercise is just to establish that no matter which of those describes the correct microscopic evolution law, the conclusion holds that entropy will more often increase (or stay constant) than decrease. There's no sense in assigning probabilities to these evolutions; the laws of physics don't get chosen anew upon each experiment (if they did, this whole science business would be right out of the window). We don't know which one is true, and can't tell based on our macroscopic observations (remember, we only know the macrostate, not that it's, for example, realized by two versus two hundred or two million microstates). But no matter which one it is, we'll come away describing the macroscopic world that's accessible to our investigations by means of the second law (with overwhelming likelihood). 
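This claim can be brute-forced over all six evolutions. A sketch, assuming (as in the simple two-plus-one example above) a three-microstate system where microstates 0 and 1 realize the high-entropy macrostate and microstate 2 the low-entropy one; the six permutations correspond to M1 through M6:

```python
from itertools import permutations

# Macrostate of each microstate: 0, 1 -> high ('H'); 2 -> low ('L')
macro = {0: 'H', 1: 'H', 2: 'L'}

# Each permutation of (0, 1, 2) is one possible deterministic evolution law.
for perm in permutations(range(3)):
    # Conditional probabilities, given the observed macrostate:
    p_down_given_H = sum(macro[perm[s]] == 'L' for s in (0, 1)) / 2
    p_up_given_L = sum(macro[perm[s]] == 'H' for s in (2,)) / 1
    # Whichever law happens to be true, observing a decrease is never
    # more likely than observing an increase:
    assert p_down_given_H <= p_up_given_L
    print(perm, p_down_given_H, p_up_given_L)
```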
#239




Quote:
Quote:
Let N = 3 and let X = { 0, 1, 1, 1, 2, 2, 2, 3 } Code:
i │ state │ total heads (X_{i})
──┼───────┼─────────────────
1 │ 0 0 0 │ 0
2 │ 1 0 0 │ 1
3 │ 0 1 0 │ 1
4 │ 0 0 1 │ 1
5 │ 0 1 1 │ 2
6 │ 1 0 1 │ 2
7 │ 1 1 0 │ 2
8 │ 1 1 1 │ 3
SPOILER:
Therefore the standard deviation σ = sqrt(3)/2 SPOILER:
Therefore variance σ² = 3/4 So far everything checks out. Now what is the probability that the number of heads for a given state X_{i} >= 2? This is just a binomial distribution so it would be the binomial coefficients divided by the number of states, 2^{N}. The probability that a random state has 2 heads is 3/8. SPOILER:
The probability that a random state has 3 heads is 1/8. SPOILER:
Therefore the probability that a random state has at least two heads is 1/2. SPOILER:
Alright, so the math checks out. But I'm not sure how you derived "exp(-0.02 N)", which seems to be an arbitrary number. I have yet to fit in entropy or the second law of thermodynamics. ~Max 
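For what it's worth, the 0.02 is not arbitrary: it is the rate function from the earlier bound, log 2 + x log x + (1-x) log(1-x), evaluated at the example threshold x = 0.6. A quick check in Python:

```python
import math

x = 0.6
# Rate function from the large-deviation bound, at the example threshold:
rate = math.log(2) + x * math.log(x) + (1 - x) * math.log(1 - x)
print(round(rate, 4))  # 0.0201 -- hence the exp(-0.02 N) falloff
```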


#240




Quote:
~Max 
#241




Quote:
Code:
+ > + > + > + > + > * > * > x > * > x > If we were sure the system is currently in an intermediate-entropy macrostate then we have a 2/3 chance of observing a decrease in entropy at the next instant. There is only a 1/3 chance of observing an increase in entropy at the next instant. Higher entropy means more homogeneity, and so it would seem this particular evolution directly contradicts step 8 above. Not that I'm making the connection between step 8 and step 9. ~Max 
#242




Quote:
Quote:
But that doesn't impact the overall conclusion, since the second law still holds if nothing changes. I could have formulated that more stringently, but I am still hoping that these examples are accepted in the spirit they're given. After all, while you can of course come up with all sorts of contrivances such that a box full of white and black balls doesn't mix when it's shaken (perhaps the balls are glued into place, or are magnetic, or maybe weigh a ton each), the intent, and the conclusion it leads to, isn't really threatened by such nitpicking: it's perfectly clear, if I hand you a box, with white balls on the one side, and black balls on the other, and you shake it, the balls will mix. Quote:
It's irrelevant, but the same holds true for prolonged observation, as well: after all, we can take each further step as the first one of a new experiment. Quote:
Quote:
Quote:
And yes, even for macroscopic systems, the difference between the number of maximum-entropy states (microstates corresponding to, I should always say) and almost-maximum entropy states will not be so great. So indeed, there will be fluctuations away from equilibrium, which will typically be so small as to be undetectable. But I'm not aiming at mathematical proof here. That's readily available in a multitude of textbooks. I want to make it intuitively clear to you that the reason all the gas in a room doesn't bunch up in the left corner isn't because there's a law that states that 'gas atoms must always tend to the greatest possible dispersal', which would introduce a weird form of teleology and downward causation, but rather, because there's so many more ways to be evenly distributed rather than bunched up, and hence, any change to a bunched-up system is much more likely to lead to a less bunched-up system than the other way around. A teapot, once shattered, will not reassemble itself, no matter how much you shake the parts, even though that is theoretically possible, and one could contrive laws of physics that privilege 'teapotness' as a state of certain assemblies in order to have it spontaneously reassemble. But note that even in your example, for the majority of (macro)states, 'on average' in a suitable sense, the second law will hold; and for the majority of evolutions, again 'on average', it will hold for all states. Also note that the violation will be short-lived: after a first-step reduction, we will with certainty observe an increase, again. So, for the typical universe (with typical laws of physics), and for typical states, we will, typically, observe an entropy increase for large enough systems. Claiming that it's not so needs at least an argument as to why our universe shouldn't be typical. A more sophisticated treatment of this sort of thing would go into the notion of ergodicity, and when and how it's justified. 
I can't provide that here, not to the level of detail necessary to satisfy your curiosity. So, if you're not willing to accept a little waffling on the notion of 'typical', and agree that, provided we're in a typical world, we'll typically see entropy increase, I'm afraid there's nothing short of a full course in statistical mechanics that will suffice. There's only so much you can simplify without becoming flat wrong, and I've at least skirted that edge so far; so I'm afraid you'll have to take one horn of the dilemma: either accept a certain degree of appeal to intuition, like that a box full of black and white marbles will mix if stirred, thus obtaining an intuitive and easily apprehended picture of how the second law works and how it can be violated, or insist on a full-on formal treatment, thus needing the corresponding full-on formalism of statistical mechanics to go along with it. Try as I might, I don't think there's a road through the middle here: I can't both be perfectly accurate and keep this manageably simple. If that's what you require, I'm sorry; but I think if you're willing to work with me just a little, and accept certain obvious, but, I agree, not rigorously proven statements, like that black and white balls tend to mix upon being shaken, I think you can achieve a much better understanding of the second law. Quote:
Last edited by Half Man Half Wit; 05-15-2019 at 01:11 PM. 
#243




Quote:
Quote:
This is all just going through the definition of entropy in terms of complexity (maybe it would be more instructive to analyse a non-ideal gas or a solid or 2D Ising model or something). These microscopic calculations all have to do with explaining how behaviour like heat spontaneously flowing from a hotter body to a colder one, but not the other way around, arises. Some hypothetical long-term recurrence, or lack thereof, doesn't really have anything to do with it. 
#244




Quote:
Quote:
For comparison, if the system starts out in high-entropy macrostate_{H}, the probability of observing a decrease in entropy at the next step is 50%. The probability of observing a net decrease in entropy after two steps is 0%. The probability of observing a net decrease in entropy over n steps is 50% if n is odd and 0% if n is even. The takeaway is that with this system, the probability of observing a net decrease in entropy over an even number of steps is exactly the same as the probability of observing a net increase in entropy: 0%. If the number of steps observed is random, there is a 50% chance that net entropy will not change at all. Quote:
Quote:
I do appreciate your participation in this thread and I am willing to drop my objections to the marbles example. The conclusion from that example was never controversial, and is in fact identical to the conclusion from the coinflipping example. I only took issue in the steps taken to reach said conclusion. ~Max 


#245




Quote:
~Max Last edited by Max S.; 05-15-2019 at 03:19 PM. Reason: emphasis 
#246




Quote:
Quote:
So (as, really, always in this thread), we'll have to look at what's more likely. And here, the possibility that one observes 100 coin throws yielding heads in a row doesn't impact on the fact that anybody who merely observes aggregated coin throw results will not see such an outcome. Great! I think we have come a really long way, here. Note how you've been opposed to the possibility that a law, formulated at the macroscopic level and assumed to be perfectly valid there, may receive a deeper explanation at the microscopic level, where it becomes clear that the law isn't, in fact, inviolable. This law concerned a macroscopic quantity, greyness, that doesn't directly map to the microscopic properties of the system (the marbles are either black or white), but is explained by them. Now, what remains to do is to convince you that this sort of thing is what actually happened with respect to the second law. That is:
Do you agree that that's roughly what needs to be established? 
#247




Quote:
We can start at the top, point #1. I think I agree, but it depends on what you mean by "a macroscopic quantity of a system". If you mean that my definition of entropy makes it a property of the whole system, we are in agreement. ~Max 
#248




Quote:
That means, in this definition, it's only meaningful in the context where these quantities are meaningful. Temperature, for example, is just an average property, only applicable to the macroscopic world. Entropy, thus, is likewise; and, since we can define temperature in terms of microscopic quantities, we can do so for entropy, as well. 
#249




Quote:
Quote:
~Max Last edited by Max S.; 05-16-2019 at 09:03 AM. Reason: strikethrough 


#250




Quote:
Now suppose that the gas volumes have different temperatures. Let's say, volume A is hotter, volume B is colder. Then, once you've removed the partition, heat will (tend to) flow from A to B, until both systems are in equilibrium with one another. This is the second law from one perspective. Alternatively, we can think of the gas after the partition has been removed as one system in a state that's far from equilibrium, with one side of it hotter, the other colder. The move towards equilibrium is one towards a more homogeneous, more 'disordered' state, a move of the sort we've by now examined copiously: towards higher entropy in the sense of transitioning to a more likely state. The fact that the entropy of the total system can't decrease yields the fact that no heat can be transferred from the colder system to the hotter one. If an amount of heat dQ flows into B, its entropy will be increased by, as you know, dS_{B} = dQ/T_{B}. Likewise, via an amount of heat dQ flowing out of A, the entropy is decreased, by dS_{A} = -dQ/T_{A}. The total change in entropy of the combined system is then: dS = dS_{B} + dS_{A} = dQ/T_{B} - dQ/T_{A} Now, for T_{A} = T_{B}, that's zero; but in that case, also no heat can get transferred. If T_{A} > T_{B}, we have a positive dS, and likewise, heat flowing from the hotter to the colder system. If, however, T_{A} < T_{B}, dS would be negative, and heat would flow from the colder to the hotter system. Consequently, if heat flows from the colder to the hotter system, the total entropy of the combined system decreases, and vice versa: if heat flows from the hotter to the colder system, total entropy increases. Thus, we can state the second law in two equivalent ways:
The second formulation of the second law is the one we will connect with the notion of microstates. The thing to realize here is simply that heat is like greyness: there are more microstates corresponding to states of equally distributed heat than there are corresponding to states of inhomogeneous heat distribution. For the moment, I'll leave you merely with the intuitive basis of that claim: there are, simply put, more ways of having 'fast' and 'slow' gas atoms (black and white marbles) evenly distributed in the box (leading to an overall intermediate average speed, the analogue of greyness), than there are ways of having all the 'fast' atoms in part A and all the 'slow' ones in part B. Thus, starting out with such an 'uneven' distribution, we would expect, given the prior discourse, to observe, typically, an evening out of the distribution of heat, which entails, thus, heat flowing from the hotter to the colder part, purely on the same basis as in the marble example, where 'darkness' flows from the blacker part to the lighter one, to yield an overall grey result. I'm going to stop here for the moment, though, because experience teaches me you're not going to accept this simply at face value (on the other hand, if you feel inclined to do so, please don't feel obliged to further your inquiry just for my sake!). So let's see what we bump into now. 
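The sign argument in this post condenses into a few lines of Python; a sketch with illustrative temperatures (the numbers are mine, not from the thread):

```python
def dS_total(dQ, T_A, T_B):
    """Total entropy change when heat dQ flows out of A (at T_A) into B (at T_B),
    per dS = dQ/T_B - dQ/T_A from the post above."""
    return dQ / T_B - dQ / T_A

# Hot A (400 K) to cold B (300 K): total entropy increases
assert dS_total(1.0, 400.0, 300.0) > 0
# Cold A (300 K) to hot B (400 K): total entropy would have to decrease
assert dS_total(1.0, 300.0, 400.0) < 0
# Equal temperatures: no net change (and no heat flow)
assert dS_total(1.0, 350.0, 350.0) == 0.0
```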

