  #201  
Old 05-10-2019, 04:07 PM
Half Man Half Wit
Quote:
Originally Posted by Max S. View Post
But wasn't your assertion that the probability of observing an increase is greater than the probability of observing a decrease?



~Max
Think about it this way: that a low entropy state evolves to a high entropy state is more likely than that a high entropy state evolves to a low entropy state.
  #202  
Old 05-10-2019, 04:09 PM
Half Man Half Wit
Quote:
Originally Posted by Max S. View Post
I'm still not making the connection between 7. and 8. in the marbles example.



~Max
Say you've got a box with all the white marbles to the left, and all the black marbles to the right. Then, you give it a good shake. What do you think will happen, and why?
  #203  
Old 05-10-2019, 04:18 PM
Max S.
Quote:
Originally Posted by Half Man Half Wit View Post
Think about it this way: that a low entropy state evolves to a high entropy state is more likely than that a high entropy state evolves to a low entropy state.
Thank you for your patience, by the way.

You have yet to show the truth of this statement. In the evolution matrix example it turned out the probability of entropy decreasing is 1/27 and the probability of entropy increasing is 1/27 - exactly the same probability. The chance of a high entropy state evolving into a low entropy state was zero but the increases and decreases still occurred with low<->intermediate entropy states.

Remember this is your definition of entropy, too.

~Max
  #204  
Old 05-10-2019, 04:20 PM
Max S.
Quote:
Originally Posted by Half Man Half Wit View Post
Say you've got a box with all the white marbles to the left, and all the black marbles to the right. Then, you give it a good shake. What do you think will happen, and why?
To answer one question at a time instead of getting ahead of myself: the marbles will probably mix a little, because as I shake the box, the walls collide with the marbles, thereby injecting kinetic energy into the system.

~Max

  #205  
Old 05-10-2019, 04:30 PM
Half Man Half Wit
Quote:
Originally Posted by Max S. View Post
Thank you for your patience, by the way.



You have yet to show the truth of this statement. In the evolution matrix example it turned out the probability of entropy decreasing is 1/27 and the probability of entropy increasing is 1/27 - exactly the same probability. The chance of a high entropy state evolving into a low entropy state was zero but the increases and decreases still occurred with low<->intermediate entropy states.



Remember this is your definition of entropy, too.



~Max
You're mixing micro- and macrolevel (although I'm probably partially to blame for this myself). Entropy is a property of the macrostate; the probabilities you're talking about are probabilities of microstates evolving into one another.

At the macro level, however, you can only distinguish between high-, medium-, and low-entropy states. So if you find the system in a medium entropy state, the chance is 1/18 that it will evolve to a low entropy state; if you find the system in a low entropy state, the chance is 1/3 that it will evolve into a medium entropy state.

So, if you find the system in a medium entropy state, one out of 18 times it spontaneously lowers its entropy; while if you find it in a low entropy state, one out of three times it spontaneously increases.
  #206  
Old 05-10-2019, 04:32 PM
Half Man Half Wit
Quote:
Originally Posted by Max S. View Post
To answer one question at a time instead of getting ahead of myself: the marbles will probably mix a little, because as I shake the box, the walls collide with the marbles, thereby injecting kinetic energy into the system.



~Max
What do you expect is gonna happen if you shake it some more? And, more importantly, why?
  #207  
Old 05-10-2019, 04:37 PM
Max S.
Quote:
Originally Posted by Half Man Half Wit View Post
You're mixing micro- and macrolevel (although I'm probably partially to blame for this myself). Entropy is a property of the macrostate; the probabilities you're talking about are probabilities of microstates evolving into one another.

At the macro level, however, you can only distinguish between high-, medium-, and low-entropy states. So if you find the system in a medium entropy state, the chance is 1/18 that it will evolve to a low entropy state; if you find the system in a low entropy state, the chance is 1/3 that it will evolve into a medium entropy state.

So one out of 18 times you find the system to spontaneously lower its entropy, while one out of three times, it spontaneously increases.
No, you cannot compare those two probabilities without multiplying 1/18 by 18/21 and 1/3 by 3/21. To do otherwise constitutes a statistical error. And besides, you cannot apply those probabilities to the marbles example without including the 1/6 possibility that the system starts out in a high entropy state.

~Max
  #208  
Old 05-10-2019, 04:46 PM
Max S.
Quote:
Originally Posted by Half Man Half Wit View Post
What do you expect is gonna happen if you shake it some more? And, more importantly, why?
The more you shake the box, the more energy you impart upon the system. Assuming there are no other forces (gravity, friction, air resistance), and that the walls of the box are still perfectly bouncy, the velocity of the marbles in the box will continue to increase. It is important that the walls of the box are perfectly bouncy, and this is unrealistic: if the walls perfectly reflect the marbles' kinetic energy, you do not have to shake faster, because the energy of the marbles bouncing against the box is never sent back to counteract the movement of your arm.

So the more you shake the box, the more the system's energy increases. So too does entropy* increase. In fact it is impossible for the entropy of the system to decrease while you are shaking the box.

*my definition of entropy, dQ/T.

~Max

  #209  
Old 05-10-2019, 05:11 PM
DPRK
If you think about a system in thermal equilibrium with a heat bath, that corresponds to Gibbs's canonical ensemble. The temperature is fixed, but the energy is not. If you calculate the equilibrium state you will find that, big surprise here, the Gibbs entropy is maximized.
  #210  
Old 05-10-2019, 05:14 PM
Half Man Half Wit
Quote:
Originally Posted by Max S. View Post
No, you cannot compare those two probabilities without multiplying 1/18 by 18/21 and 1/3 by 3/21. To do otherwise constitutes a statistical error. And besides, you cannot apply those probabilities to the marbles example without including the 1/6 possibility that the system starts out in a high entropy state.



~Max
No. We're talking about conditional probabilities here. 1/27 is the probability of the system in any state whatsoever evolving to a (microstate realizing a) higher (or, indeed, lower) entropy macrostate. But, you have additional information, on which you must condition your probability assignment.

By the formula for conditional probability, the probability that the system moves to a state of lower entropy, given that it is in a state of medium entropy, is (1/27) / (18/27): the probability of being in a medium entropy state AND moving to a lower entropy, which occurs for one of the 27 possible cases, divided by the probability of being in a medium entropy state. Naturally, that's 1/18.
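To spell out that arithmetic, here's a quick Python sketch of my own (the counts, 3 low- and 18 medium-entropy microstates out of 27, are the ones from the toy model we've been using):
Code:
from fractions import Fraction

TOTAL = 27                      # microstates in the three-balls toy model
N_LOW, N_MED = 3, 18            # microstates realizing the low / medium macrostates

# From the evolution matrix: one of the 27 cases is "in a medium macrostate
# AND moving to lower entropy", one is "in a low macrostate AND moving up".
p_med_and_down = Fraction(1, TOTAL)
p_low_and_up = Fraction(1, TOTAL)

# Conditional probability: P(A | B) = P(A and B) / P(B)
print(p_med_and_down / Fraction(N_MED, TOTAL))  # 1/18
print(p_low_and_up / Fraction(N_LOW, TOTAL))    # 1/3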
  #211  
Old 05-10-2019, 05:20 PM
Half Man Half Wit
Quote:
Originally Posted by Max S. View Post
The more you shake the box, the more energy you impart upon the system.
You only need to impart more energy to counter losses due to friction; if those weren't there, the balls would keep on bouncing, and mixing, with any initial impulse (internal energy).

So, again: you keep shaking (countering friction; alternatively, you give the box a single whack, setting up an initial state, if there is no friction)---what do you expect happens? Don't think about entropy, for now, or heat, or any such confounders. Just regarding the mixture of marbles: will they tend to stay separated? Or is it more likely that they become more and more mixed?
  #212  
Old 05-10-2019, 06:18 PM
Max S.
Quote:
Originally Posted by Half Man Half Wit View Post
No. We're talking about conditional probabilities here. 1/27 is the probability of the system in any state whatsoever evolving to a (microstate realizing a) higher (or, indeed, lower) entropy macrostate. But, you have additional information, on which you must condition your probability assignment.

By the formula for conditional probability, the probability that the system moves to a state of lower entropy, given that it is in a state of medium entropy, is (1/27) / (18/27): the probability of being in a medium entropy state AND moving to a lower entropy, which occurs for one of the 27 possible cases, divided by the probability of being in a medium entropy state. Naturally, that's 1/18.
Yes, but you are in error to directly compare 1/18 and 1/3.

~Max
  #213  
Old 05-10-2019, 06:22 PM
Max S.
Quote:
Originally Posted by Half Man Half Wit View Post
You only need to impart more energy to counter losses due to friction; if those weren't there, the balls would keep on bouncing, and mixing, with any initial impulse (internal energy).

So, again: you keep shaking (countering friction; alternatively, you give the box a single whack, setting up an initial state, if there is no friction)---what do you expect happens? Don't think about entropy, for now, or heat, or any such confounders. Just regarding the mixture of marbles: will they tend to stay separated? Or is it more likely that they become more and more mixed?
They will probably mix some, but I cannot say whether they become more or less mixed; that would depend on the specifics of how I shake the box and how exactly the marbles are arranged. And I was assuming a scenario without friction.

~Max
  #214  
Old 05-10-2019, 11:56 PM
Half Man Half Wit
Quote:
Originally Posted by Max S. View Post
Yes, but you are in error to directly compare 1/18 and 1/3.



~Max
Why would that be? I want to know which one is more likely: that the system moves to a lower entropy state from a higher entropy state, or that it moves to a higher entropy state from a lower entropy state. So these are exactly the numbers I have to compare.

If I were to do as you say, and weight the two probabilities with the probability of finding the system in a low- respectively high-entropy state, I would instead compare the probability of finding the system in a low entropy state and having it transition to a higher one with that of finding it in a high entropy state and having it transition to a low entropy one.

That's a different situation: there, I have not yet looked at the system, and want to know what to expect if I look; but what we want to know is, after we have looked at the system, what (most likely) happens next.
  #215  
Old 05-10-2019, 11:59 PM
Half Man Half Wit
Quote:
Originally Posted by Max S. View Post
They will probably mix some, but I cannot say whether they become more or less mixed; that would depend on the specifics of how I shake the box and how exactly the marbles are arranged. And I was assuming a scenario without friction.



~Max
You don't have those specifics. That's the point: you don't have microscopic control over the system, but have to predict what's more likely to happen, given the state it's in now.

In other words, if I offer you a hundred bucks if you shake the system and it gets less mixed, and you have to give me a hundred bucks if it gets more mixed, do you take that bet?
  #216  
Old 05-11-2019, 02:38 AM
Max S.
Quote:
Originally Posted by Half Man Half Wit View Post
Why would that be? I want to know which one is more likely: that the system moves to a lower entropy state from a higher entropy state, or that it moves to a higher entropy state from a lower entropy state. So these are exactly the numbers I have to compare.
I will admit that it is more likely for our system to transfer from a low entropy state to an intermediate entropy state than it is to transfer from an intermediate entropy state to a low entropy state. Those respective probabilities are 1/3 and 1/18.

I was myself wrong to dispute your statement in post #201.

But that is not what you set out to prove, is it?

~Max
  #217  
Old 05-11-2019, 02:42 AM
Max S.
Quote:
Originally Posted by Half Man Half Wit View Post
You don't have those specifics. That's the point: you don't have microscopic control over the system, but have to predict what's more likely to happen, given the state it's in now.

In other words, if I offer you a hundred bucks if you shake the system and it gets less mixed, and you have to give me a hundred bucks if it gets more mixed, do you take that bet?
It is not clear whether I keep shaking the box forever, or when I measure how mixed the marbles are.

I hate to fight the hypothetical but I have no place predicting the behavior of the marbles with so little information. Especially not for a $100 bet.

~Max
  #218  
Old 05-11-2019, 02:54 AM
Half Man Half Wit
Quote:
Originally Posted by Max S. View Post
I will admit that it is more likely for our system to transfer from a low entropy state to an intermediate entropy state than it is to transfer from an intermediate entropy state to a low entropy state. Those respective probabilities are 1/3 and 1/18.



I was myself wrong to dispute your statement in post #201.



But that is not what you set out to prove, is it?



~Max
It's not? How so? What I set out to demonstrate was how somebody observing a system could come away with a law of the form 'the entropy increases or stays constant', even though it's possible for it to also decrease. My claim was that for this, it's enough to realize the difference in number regarding high(er)- vs low(er)-entropy states.

So let's think about how one might come away with that law. The experimenter may set the system up in either a low, intermediate, or high entropy state. If they set it up in a high entropy state, they'll always find it to stay constant. If they set it up in a low entropy state, they'll always find it to stay constant, or, with some sizeable probability that increases sharply once we get to larger system sizes, to increase.

If, now, they set it up in an intermediate entropy state, they will find that, with overwhelming likelihood, the entropy will stay the same. Indeed, for some sufficiently large system, they likely never will observe it to decrease.

Consequently, they'll formulate a law, generalizing from their observations, that entropy always increases, or stays constant.

We have more information, however. We know that this quantity, entropy, is realized by a system deterministically evolving among microstates. We can thus conclude that once in a (very large) while, entropy must also decrease. The law that was formulated thus is not an exact, but merely a statistical one.

For any sufficiently large system, that we'll ever observe such a violation is vanishingly unlikely (think a tornado rushing through a junkyard spontaneously assembling a jumbo jet unlikely). But for small enough systems, these violations may be observable.

Moreover, we can predict just how likely such violations are going to be. That's the fluctuation theorem. And, indeed, if we do the experiment, this prediction is confirmed.
  #219  
Old 05-11-2019, 02:57 AM
Half Man Half Wit
Quote:
Originally Posted by Max S. View Post
It is not clear whether I keep shaking the box forever, or when I measure how mixed the marbles are.



I hate to fight the hypothetical but I have no place predicting the behavior of the marbles with so little information. Especially not for a $100 bet.



~Max
You can't keep shaking it forever. Unless you're a very different entity from what I take you to be, you (I'm sorry to say) won't be around forever. So let's just say you shake it good until you get bored, or hungry, or need a bathroom break.

This is, by the way, completely irrelevant to the hypothetical, because of course, if you shake it, it will get more mixed. I mean, you can't really reasonably think otherwise; you're just trying special pleading to get out of accepting an unwelcome conclusion.
  #220  
Old 05-11-2019, 03:36 AM
Half Man Half Wit
(One awkwardness of the current setup, I should point out before we bump into it, is that the absolute number of high entropy states is smaller than the number of intermediate entropy states. Thus, you could have all high entropy states transition to intermediate entropy states; this is no longer the case for larger systems, where the high entropy states will vastly dominate.)
  #221  
Old 05-11-2019, 12:17 PM
Max S.
Quote:
Originally Posted by Half Man Half Wit View Post
It's not? How so? What I set out to demonstrate was how somebody observing a system could come away with a law of the form 'the entropy increases or stays constant', even though it's possible for it to also decrease. My claim was that for this, it's enough to realize the difference in number regarding high(er)- vs low(er)-entropy states.

So let's think about how one might come away with that law. The experimenter may set the system up in either a low, intermediate, or high entropy state. If they set it up in a high entropy state, they'll always find it to stay constant. If they set it up in a low entropy state, they'll always find it to stay constant, or, with some sizeable probability that increases sharply once we get to larger system sizes, to increase.

If, now, they set it up in an intermediate entropy state, they will find that, with overwhelming likelihood, the entropy will stay the same. Indeed, for some sufficiently large system, they likely never will observe it to decrease.

Consequently, they'll formulate a law, generalizing from their observations, that entropy always increases, or stays constant.

We have more information, however. We know that this quantity, entropy, is realized by a system deterministically evolving among microstates. We can thus conclude that once in a (very large) while, entropy must also decrease. The law that was formulated thus is not an exact, but merely a statistical one.

For any sufficiently large system, that we'll ever observe such a violation is vanishingly unlikely (think a tornado rushing through a junkyard spontaneously assembling a jumbo jet unlikely). But for small enough systems, these violations may be observable.

Moreover, we can predict just how likely such violations are going to be. That's the fluctuation theorem. And, indeed, if we do the experiment, this prediction is confirmed.
I already admitted that such an observer subject to the "experiment" could formulate such a law. But if there is an equiprobability of initial microstates, it is not given that the system starts out in an intermediary state and you must use the probabilities for the system as a whole - 1/27, 25/27, and 1/27.

You cannot simply assume that the system starts out in an intermediary state, or you are rigging your experiment. By doing so you have reduced 1/27, 25/27, 1/27 to 1/18, 17/18, and 0/18.

Your contention, if I recall, is that the probability of observing an increase is astronomically higher than observing a decrease.

~Max
  #222  
Old 05-11-2019, 12:46 PM
DPRK
Quote:
Originally Posted by Max S. View Post
Your contention, if I recall, is that the probability of observing an increase is astronomically higher than observing a decrease.
The entropy of your gas (or whatever) is proportional to the logarithm of the number of accessible states. Therefore ending up in a smaller region becomes exponentially unlikely.

As a relevant mathematical exercise, flip a fair coin a bunch of times (N times) and compute the average value, where heads count as 0 and tails as 1. Now compute the probability that the average is greater than x (eg x could be 0.6). This average value is your "macrostate". You will find that the probability decreases exponentially as a function of minus the "entropy". You are more likely to observe a higher-entropy state.
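Here's a quick numerical version of that exercise (my own sketch, computing the exact binomial tail rather than actually flipping coins):
Code:
from math import comb

def tail_prob(N, x):
    # P(average of N fair flips > x), exact binomial sum (tails count as 1)
    return sum(comb(N, k) for k in range(N + 1) if k > N * x) / 2 ** N

for N in (10, 100, 1000):
    print(N, tail_prob(N, 0.6))
# the tail probability falls off exponentially as N grows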

  #223  
Old 05-11-2019, 03:56 PM
Max S.
Quote:
Originally Posted by DPRK View Post
The entropy of your gas (or whatever) is proportional to the logarithm of the number of accessible states. Therefore ending up in a smaller region becomes exponentially unlikely.

As a relevant mathematical exercise, flip a fair coin a bunch of times (N times) and compute the average value, where heads count as 0 and tails as 1. Now compute the probability that the average is greater than x (eg x could be 0.6). This average value is your "macrostate". You will find that the probability decreases exponentially as a function of minus the "entropy". You are more likely to observe a higher-entropy state.
I don't understand. Flipping a coin over n times represents a system with two states: state 0 for heads and state 1 for tails. The initial state is already random - 50% probability of being heads or tails. The number of trials is also random - 50% probability of being even or odd. I should not have to point out that the probability of a coin showing heads after n flips is exactly 50%. To ask whether a random variable 0 < x < 1 is greater than 50% is equally trivial - the probability approaches 50% depending on how precisely you define x.

~Max
  #224  
Old 05-11-2019, 04:05 PM
Max S.
Quote:
Originally Posted by Half Man Half Wit View Post
You can't keep shaking it forever. Unless you're a very different entity from what I take you to be, you (I'm sorry to say) won't be around forever. So let's just say you shake it good until you get bored, or hungry, or need a bathroom break.

This is, by the way, completely irrelevant to the hypothetical, because of course, if you shake it, it will get more mixed. I mean, you can't really reasonably think otherwise; you're just trying special pleading to get out of accepting an unwelcome conclusion.
If you are asking whether the marbles would be more mixed after shaking the box than when I started, I would take that bet unless the marbles filled the box to the brim (no room to mix). I thought you were asking about the degree of mixed-ness after the fact, which I would not bet on.

~Max
  #225  
Old 05-11-2019, 04:19 PM
Max S.
Quote:
Originally Posted by Half Man Half Wit View Post
(One awkwardness of the current setup, I should point out before we bump into it, is that the absolute number of high entropy states is smaller than the number of intermediate entropy states. Thus, you could have all high entropy states transition to intermediate entropy states; this is no longer the case for larger systems, where the high entropy states will vastly dominate.)
I think larger systems also have the same number of high and low entropy states if all you are counting is the ensemble of possible atomic configurations. However, many of those configurations will be inaccessible from each other.

Within a specific evolutionary cycle there needs not be an equal number of relatively high and low entropy states. But the cycle is periodic and by definition any increases in entropy must be matched with decreases of equal total magnitude. With only two accessible levels of entropy the probability of randomly observing an increase in entropy must be equal to the probability of observing a decrease. With more than two accessible levels of entropy it is possible for a random observation to favor an increase over a decrease in entropy - for example if entropy was plotted as a sawtooth wave. But without defining the evolution cycle this is not a given, entropy could just as well be plotted as a sine wave.

~Max
  #226  
Old 05-11-2019, 06:17 PM
DPRK
Quote:
Originally Posted by Max S. View Post
I don't understand. Flipping a coin over n times represents a system with two states: state 0 for heads and state 1 for tails. The initial state is already random - 50% probability of being heads or tails. The number of trials is also random - 50% probability of being even or odd. I should not have to point out that the probability of a coin showing heads after n flips is exactly 50%. To ask whether a random variable 0 < x < 1 is greater than 50% is equally trivial - the probability approaches 50% depending on how precisely you define x.
The trials are independent, and the state consists of the results of N coin flips. The number of states equals 2^N and the probability of each one is 2^(-N). Now x is a fixed parameter and you ask what is the probability of observing a mean value M greater than x after N trials. For instance, if x = 0.5, then the limit of this probability as N→∞ is 50%, but if x = 0.55 then it is zero. But let's say x = 0.55 and N starts to grow. What is the probability as a function of N?
  #227  
Old 05-11-2019, 07:45 PM
Max S.
Quote:
Originally Posted by DPRK View Post
The trials are independent, and the state consists of the results of N coin flips. The number of states equals 2^N and the probability of each one is 2^(-N).
Do you mean to say that flipping a coin is absolutely random? I thought you meant the deterministic process of flipping a coin over. If a random process determines whether a coin lands on heads or tails, I would agree by all means that for any given n trials there is a probability distribution over 2n states.

~Max
  #228  
Old 05-11-2019, 08:14 PM
DPRK
For our purposes it does not matter if the flips are "absolutely random"; you may assume so. Or instead of coins think of them as non-interacting atoms that may be in one of two spin states. That is all beside the point, which is not that there is a "probability distribution", but to calculate what it is and verify the exponential falloff of the probability of any deviation.
  #229  
Old 05-11-2019, 10:24 PM
Max S.
Quote:
Originally Posted by DPRK View Post
For our purposes it does not matter if the flips are "absolutely random"; you may assume so. Or instead of coins think of them as non-interacting atoms that may be in one of two spin states. That is all beside the point, which is not that there is a "probability distribution", but to calculate what it is and verify the exponential falloff of the probability of any deviation.
I'm becoming confused, and I likely misunderstand your argument. You were saying the entropy of a system is proportional to the logarithm of the number of accessible states. I take it this is because you define entropy as S = k_B ln Ω, where k_B is Boltzmann's constant and Ω is the number of accessible microstates. That's fine, there's clearly a logarithm in that formula so everything checks out.

For a fully deterministic system where the initial state and number of steps and all other variables are given, there is only one accessible state at the end of the trial. That gives me S = k_B * ln(1) = 0.

So let's say everything is defined except the initial state. In the coin flipping-over example that gives us two accessible states at the end of the trial, depending on which of two initial states was chosen. For a system with two equiprobable states, that gives me a system entropy of S = k_B * ln(2), or about 9.56993×10^(-24) joules per kelvin. But this definition of entropy is macroscopic - it says nothing about which microstate is "high entropy" and which state is "low entropy". As soon as we apply the actual microstate the macroscopic entropy drops to zero.
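(As a sanity check of that number, in Python:)
Code:
from math import log

k_B = 1.380649e-23       # Boltzmann's constant, J/K
print(k_B * log(2))      # ~9.57e-24 J/K, the figure quoted above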

Could you explain what you mean by "ending up in a smaller region becomes exponentially unlikely"? I'm not making any connection between the probability of any end state in a coin flipping exercise and that conclusion.

~Max
  #230  
Old 05-12-2019, 10:07 AM
Half Man Half Wit
Quote:
Originally Posted by Max S. View Post
I already admitted that such an observer subject to the "experiment" could formulate such a law. But if there is an equiprobability of initial microstates, it is not given that the system starts out in an intermediary state and you must use the probabilities for the system as a whole - 1/27, 25/27, and 1/27.
Equiprobability of microstates means microstates consistent with a given macrostate. That is, if the system is in the macrostate (A1B1C1), then it is with equal probability in either of the states (A:1,B:2,C:3), (A:3,B:1,C:2), (A:2,B:3,C:1), (A:1,B:3,C:2), (A:2,B:1,C:3) or (A:3,B:2,C:1). (Where the numbers denote which of the three balls is in what box.)

The states as I have given them initially are what we have macroscopic control over; they're how we set up the system in the experiment. We don't have control over which of the possible microstates realizes that particular macrostate; that's where the probabilistic part comes in.

Your description would correspond to an 'experiment' where we just set up the system randomly, and see what happens. In that case, we'd most likely (again, in the sense of 'with virtual certainty') just always see systems at nearly maximum entropy not doing much at all.

But real experiments will typically involve setting up a low-entropy state, and then looking to see what happens. Think about the historic experiments with steam engines and the like. In that case, the probabilities as I have given them are the only appropriate ones.

Quote:
Originally Posted by Max S. View Post
If you are asking whether the marbles would be more mixed after shaking the box than when I started, I would take that bet unless the marbles filled the box to the brim (no room to mix). I thought you were asking about the degree of mixed-ness after the fact, which I would not bet on.

~Max
I don't understand the difference between the two settings you propose. You would bet that the box is more mixed after the shaking, but you wouldn't bet on whether it's more or less mixed...?

Quote:
Originally Posted by Max S. View Post
I think larger systems also have the same number of high and low entropy states if all you are counting is the ensemble of possible atomic configurations.
I never know where you take these things from. That's patently false: think about a system of 100 coins. The minimum entropy states are 'all coins showing heads' and 'all coins showing tails'. Each of these can be realized in exactly one way.

The maximum entropy state is 'half the coins showing heads, half showing tails'. This state can be realized in (100 choose 50) ~ 10^29 ways. There are thus ~10^29 microstates corresponding to the maximum entropy state, and 1 corresponding to the minimum entropy state. Simple continuity shows that each state with an in-between amount of entropy must have an in-between number of microscopic realizations, as well. Hence, vastly more microstates correspond to high-entropy states than to low-entropy states.

Or, being even more explicit, look at the fraction of microstates of the form 'x coins out of 100 showing heads' compared to the maximum-entropy '50 showing heads'. You can easily calculate that 96.5% of all microstates lie in the range between 40 and 60 coins showing heads. If we double the number, with 200 coins, 99.6% of all microstates lie in the interval between 80 and 120 coins showing heads.
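(Those percentages are easy to verify; a little Python sketch of mine:)
Code:
from math import comb

def frac_between(n, lo, hi):
    # fraction of all 2^n coin microstates showing between lo and hi heads
    return sum(comb(n, k) for k in range(lo, hi + 1)) / 2 ** n

print(frac_between(100, 40, 60))   # ~0.965
print(frac_between(200, 80, 120))  # ~0.996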

For any remotely macroscopic system, thus, with on the order of 10^23 atoms (instead of 200), each of which can be in a huge number of states (rather than a coin's two), if you just randomly look at the system, you're basically guaranteed to find it in a state extremely close to the maximum entropy state.

So this idea, that the real-world experiment is modeled by just observing the system in a random state, simply doesn't work---because then, with virtual certainty, you just won't ever observe it doing anything.

Rather, a real-world experiment involves setting up a low-entropy state. Upon doing so, you will, with virtual certainty, as I've shown, observe it evolving into a higher-entropy state, simply by virtue of the relative number of accessible microstates.

From these observations, the second law may be---and historically, was---abstracted. With increased understanding of the microscopic basis of matter, it became clear that the second law can only apply statistically. And so it does.

  #231  
Old 05-12-2019, 04:33 PM
DPRK
Quote:
Originally Posted by Max S. View Post
I'm becoming confused, and I likely misunderstand your argument. You were saying the entropy of a system is proportional to the logarithm of the number of accessible states. I take it this is because you define entropy as S = k_B ln Ω, where k_B is Boltzmann's constant and Ω is the number of accessible microstates. That's fine, there's clearly a logarithm in that formula so everything checks out.

For a fully deterministic system where the initial state and number of steps and all other variables are given, there is only one accessible state at the end of the trial. That gives me S = k_B * ln(1) = 0.
There is definitely a misunderstanding here. Deterministic or not, the final microstate does not have an "entropy"; the entropy is associated to the MACRO-state in question or, more generally, to a probability distribution.
Quote:
So let's say everything is defined except the initial state. In the coin flipping-over example that gives us two accessible states at the end of the trial, depending on which of two initial states was chosen. For a system with two equiprobable states, that gives me a system entropy of S = k_B * ln(2), or about 9.56993×10^(-24) joules per kelvin. But this definition of entropy is macroscopic - it says nothing about which microstate is "high entropy" and which state is "low entropy". As soon as we apply the actual microstate the macroscopic entropy drops to zero.

Could you explain what you mean by "ending up in a smaller region becomes exponentially unlikely"? I'm not making any connection between the probability of any end state in a coin flipping exercise and that conclusion.

~Max
In our toy example, the "system entropy" is just proportional to N log(2) like you say. Individual microstates don't have entropy associated to them; they are just points in a phase space.

In our example the macrostate is the observed "mean energy" x. What I was getting at is, some of these states are more probable than others, the most probable state being x = 0.5. How improbable are the other states? Well, the probability of observing a mean greater than or equal to x is bounded by [let's assume 1.0 > x > 0.5 here to get the correct signs] exp(-N(log 2 + x log x + (1-x) log(1-x))); by symmetry there is the same probability of a state with mean less than 1-x. You can see that for a fixed deviation size, the probability of being off by that much or more decays exponentially fast as N grows; an overwhelming number of states are very close to the most probable state; this is the big, i.e., maximal entropy, region.
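To make that concrete, here's a sketch of mine comparing the exact tail probability with that bound:
Code:
from math import comb, exp, log

def exact_tail(N, x):
    # P(mean of N fair flips >= x), exact binomial sum
    return sum(comb(N, k) for k in range(N + 1) if k >= N * x) / 2 ** N

def bound(N, x):
    # exp(-N(log 2 + x log x + (1-x) log(1-x))), valid for 0.5 < x < 1.0
    return exp(-N * (log(2) + x * log(x) + (1 - x) * log(1 - x)))

x = 0.55
for N in (100, 400, 1600):
    print(N, exact_tail(N, x), bound(N, x))
# the exact tail always sits below the bound, and both decay exponentially in N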
  #232  
Old 05-13-2019, 11:39 AM
Max S.
Quote:
Originally Posted by Half Man Half Wit View Post
Equiprobability of microstates means microstates consistent with a given macrostate. That is, if the system is in the macrostate (A1B1C1), then it is with equal probability in either of the states (A:1,B:2,C:3), (A:3,B:1,C:2), (A:2,B:3,C:1), (A:1,B:3,C:2), (A:2,B:1,C:3) or (A:3,B:2,C:1). (Where the numbers denote which of the three balls is in what box.)

The states as I have given them initially are what we have macroscopic control over; they're how we set up the system in the experiment. We don't have control over which of the possible microstates realizes that particular macrostate; that's where the probabilistic part comes in.

Your description would correspond to an 'experiment' where we just set up the system randomly, and see what happens. In that case, we'd most likely (again, in the sense of 'with virtual certainty') just always see systems at nearly maximum entropy not doing much at all.

But real experiments will typically involve setting up a low-entropy state, and then looking to see what happens. Think about the historic experiments with steam engines and the like. In that case, the probabilities as I have given them are the only appropriate ones.
Very well, I think (hope) I understand you. I was still confusing macrostate with thermodynamic state, but it is clear to me now that these are not the same thing. If an isolated thermodynamic system starts out in an intermediate-entropy macrostate, and all microstates consistent with that macrostate are equiprobable, then a single observation over one time step might show the system evolving into a higher-entropy or lower-entropy state depending on the internal dynamics. Without knowing the rules of evolution, the probability of observing such a change is not given, but depending on how the rules of evolution are specified it is possible for the probabilities of observing an increase versus a decrease in entropy to be unequal, or even one-sided.


Quote:
Originally Posted by Half Man Half Wit View Post
I don't understand the difference between the two settings you propose. You would bet that the box is more mixed after the shaking, but you wouldn't bet on whether it's more or less mixed...?
Right, without many more details I would not bet, for example, on whether the ratio of white to black on each side is within 10%.

Quote:
Originally Posted by Half Man Half Wit View Post
I never know where you take these things from. That's patently false: think about a system of 100 coins. The minimum entropy states are 'all coins showing heads' and 'all coins showing tails'. Each of these can be realized in exactly one way.

The maximum entropy state is 'half the coins showing heads, half showing tails'. This state can be realized in (100 choose 50) ~ 10^29 ways. There are thus ~10^29 microstates corresponding to the maximum entropy state, and 1 corresponding to the minimum entropy state. Simple continuity shows that each state with an in-between amount of entropy must have an in-between number of microscopic realizations, as well. Hence, vastly more microstates correspond to high-entropy states than to low-entropy states.

Or, being even more explicit, look at the fraction of microstates of the form 'x coins out of 100 showing heads' compared to the maximum-entropy '50 showing heads'. You can easily calculate that 96.5% of all microstates lie in the range between 40 and 60 coins showing heads. If we double the number, with 200 coins, 99.6% of all microstates lie in the interval between 80 and 120 coins showing heads.

For any remotely macroscopic system, thus, with on the order of 10^23 atoms (instead of 200), each of which can be in a huge number of states (rather than a coin's two), if you just randomly look at the system, you're basically guaranteed to find it in a state extremely close to the maximum entropy state.

So this idea, that the real-world experiment is modeled by just observing the system in a random state, simply doesn't work---because then, with virtual certainty, you just won't ever observe it doing anything.

Rather, a real-world experiment involves setting up a low-entropy state. Upon doing so, you will, with virtual certainty, as I've shown, observe it evolving into a higher-entropy state, simply by virtue of the relative number of accessible microstates.

From these observations, the second law may be---and historically, was---abstracted. With increased understanding of the microscopic basis of matter, it became clear that the second law can only apply statistically. And so it does.
I'm not sure what I was thinking there.

~Max
  #233  
Old 05-13-2019, 12:24 PM
Max S.
Quote:
Originally Posted by Half Man Half Wit View Post
Rather, a real-world experiment involves setting up a low-entropy state. Upon doing so, you will, with virtual certainty, as I've shown, observe it evolving into a higher-entropy state, simply by virtue of the relative number of accessible microstates.
I don't yet understand this conclusion. If a thermodynamic system starts out in a low-entropy macrostate, nothing I am aware of dictates that the system must evolve into a higher-entropy macrostate. Even if there are more higher-entropy macrostates, it is not given that those macrostates are accessible. Without giving microscopic details (including microscopic evolution), it is impossible to even assign a probability as to whether the multitudes of higher-entropy macrostates are accessible from any given lower-entropy macrostate.

Quote:
Originally Posted by Half Man Half Wit View Post
From these observations, the second law may be---and historically, was---abstracted. With increased understanding of the microscopic basis of matter, it became clear that the second law can only apply statistically. And so it does.
I'm not sure if I should respond to this paragraph while the above issue is still outstanding. Nevertheless I still do not equate your definition of entropy with my definition of entropy, therefore your definition of the second law is materially different from my definition and a violation of your law does not imply a violation of mine.

~Max
  #234  
Old 05-13-2019, 01:18 PM
Half Man Half Wit
Quote:
Originally Posted by Max S. View Post
Very well, I think (hope) I understand you. I was still confusing macrostate with thermodynamic state, but it is clear to me now that these are not the same thing.
A thermodynamic state---the way I would use the term---is a kind of macrostate, described in terms of thermodynamic variables (temperature, pressure) which are aggregates of microscopic variables (kinetic energy, momentum).

Quote:
If an isolated thermodynamic system starts out in an intermediate-entropy macrostate, and all microstates consistent with that macrostate are equiprobable, then a single observation over one time step might show the system evolving into a higher-entropy or lower-entropy state depending on the internal dynamics. Without knowing the rules of evolution, the probability of observing such a change is not given, but depending on how the rules of evolution are specified it is possible for the probabilities of observing an increase versus a decrease in entropy to be unequal, or even one-sided.
But still, as I've demonstrated, no matter the evolution, we can say that observing an entropy increase is more likely than observing a decrease. Just take the extreme case: there's 100 intermediate-entropy states (by which I mean, microstates corresponding to intermediate-entropy macrostates), 1 low-entropy state (same), and 10000 high-entropy states. Only one of the intermediate-entropy states can, under any evolution whatever, evolve to a low-entropy one; so for any given dynamics, the chance of observing an increase (or entropy staying constant), given that the system starts out in an intermediate-entropy state, is at least 99%, given only that information.
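If you want to convince yourself numerically, here's a little sketch of mine, with random permutations standing in for 'any dynamics whatever':
Code:
import random

# The extreme case above: 1 low-, 100 intermediate-, 10000 high-entropy
# microstates. A deterministic one-step evolution is a permutation of
# these 10101 states; sample a few at random.
states = ['L'] * 1 + ['M'] * 100 + ['H'] * 10000

for trial in range(5):
    image = states[:]
    random.shuffle(image)    # state i evolves into a state of class image[i]
    med_down = sum(1 for s, t in zip(states, image) if s == 'M' and t == 'L')
    print('P(decrease | intermediate) =', med_down, '/ 100')
# med_down can only ever be 0 or 1, since the low macrostate has a single
# microstate; so the chance of a decrease, given an intermediate start,
# never exceeds 1/100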

Quote:
Right, without many more details I would not bet, for example, on whether the ratio of white to black on each side is within 10%.
OK, but you do now agree to the general principle of the thing, right? In other words, that 8 indeed follows from 7 in the list I gave?

Quote:
Originally Posted by Half Man Half Wit View Post
  1. Suppose you have two boxes, A and B.
  2. Box A is filled with white marbles, and box B is filled with black ones.
  3. Both boxes are placed on a vibrating plate, such that the marbles in them bounce around.
  4. The walls of the boxes are removable.
  5. Suppose you put both boxes next to one another, and remove the now adjacent walls, creating one big box.
  6. Marbles from the white box A will bounce into the black box, and marbles from the black box B will bounce into the white box.
  7. There are more ways of realizing a state that's pretty uniformly grey, than there are to realize a state that's (say) all white in box A, and all black in box B.
  8. Consequently, there are more ways to go from a state that's slightly inhomogeneous to one that's more homogeneous, than there are ways to go to a state that's even more inhomogeneous.
  9. To any observer who, as before, is only capable of seeing gross colors, the formerly black-and-white separated box will gradually tend to a shade of even gray.
  10. That observer might formulate a law, stating that black and white, if brought into contact, eventually even out to a uniform gray.
  11. Knowing the microscopic description, we know that this is just, again, a law of averages: there is nothing that prohibits, say, all or a sizable fraction of the white marbles from bouncing back into box A.
  12. Given a long enough timescale, the even grey will, eventually, separate into black and white patches.
Quote:
Originally Posted by Max S. View Post
I don't yet understand this conclusion. If a thermodynamic system starts out in a low-entropy macrostate, nothing I am aware of dictates that the system must evolve into a higher-entropy macrostate.
Well, I never said so---virtual certainty, i. e. with a probability so ridiculously high that it's not really worth distinguishing from absolute certainty, but still, in principle, merely statistically.

Perhaps it helps if you think about all of the states of the system. Say, there's k states. Now, thus, at any given time t, the system is going to be in one of those states. At time t + 1, it likewise will be in one of those states; any evolution thus is specified by specifying what state each of the system's states evolves to. (This is the idea behind the evolution matrices I introduced.)

Start with the trivial evolution, which takes every state to itself. You can represent it like this:
Code:
+ ----> +
+ ----> +
+ ----> +
+ ----> +
+ ----> +
* ----> *
* ----> *
* ----> *
x ----> x
Here, the + are high-entropy states, the * are intermediate-entropy states, and the x is a low-entropy state.

Now, introduce any variation to that diagram. Say, have one of the high-entropy states evolve to an intermediate-entropy state, like so:

Code:
+ ----> +
+ ----> +
+ ----> +
+ ----> +
+ ----> *
* ----> +
* ----> *
* ----> *
x ----> x
What has to happen, as a consequence, is that one of the intermediate-entropy states must evolve to a high-entropy state. But then, we're done: a higher fraction of the intermediate-entropy states evolves to a high-entropy state, than high-entropy states evolve to lower-entropy states. This must be the case; nothing else is possible.

I would encourage you to play around with this a little. See if you can find an evolution such that entropy increase won't, overall, occur more often than entropy decrease.

If that doesn't make things clearer, let's just go to the most stupendously simple case, a system with two high-entropy states, and one low-entropy state. (Or more accurately, a system with one macrostate realized by two microstates, and another one realized by a single microstate.)

This yields either:
Code:
+ ----> +
+ ----> +
x ----> x
or:

Code:
+ ----> +
+ ----> x
x ----> +
That is, a system where either nothing changes, or, if you're in the low-entropy state, entropy increases with certainty, while in the high-entropy state, in one out of two cases, entropy decreases.
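The whole space of evolutions here is just the six permutations of the three microstates, so you can check this exhaustively; a small Python sketch of my own:
Code:
from itertools import permutations

micro = ('A', 'B', 'C')              # A, B realize the high macrostate; C the low one
level = {'A': 'high', 'B': 'high', 'C': 'low'}

for image in permutations(micro):
    evo = dict(zip(micro, image))    # one possible deterministic evolution
    dec = sum(1 for s in micro if level[s] == 'high' and level[evo[s]] == 'low')
    inc = sum(1 for s in micro if level[s] == 'low' and level[evo[s]] == 'high')
    print(evo, f'P(dec | high) = {dec}/2', f'P(inc | low) = {inc}/1')
# In each evolution the raw counts match (dec == inc), but conditioned on the
# macrostate: whenever anything changes at all, the low state increases with
# certainty (1/1), while a high state decreases only half the time (1/2).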

Quote:
Even if there are more higher-entropy macrostates, it is not given that those macrostates are accessible. Without giving microscopic details (including microscopic evolution), it is impossible to even assign a probability as to whether the multitudes of higher-entropy macrostates are accessible from any given lower-entropy macrostate.
I have no idea what you mean by accessibility here. If a state isn't accessible, then it's not really a valid state of the system.

Sure. For a single evolution, starting in some microstate, it may well be that certain states are never visited. But the states that are visited then aren't accessible anymore for evolutions starting in another microstate, which consequently will have to visit the others; and carrying that through, we arrive at the general consequence that observing entropy increase is always more likely than observing decrease.

Quote:
I'm not sure if I should respond to this paragraph while the above issue is still outstanding. Nevertheless I still do not equate your definition of entropy with my definition of entropy, therefore your definition of the second law is materially different from my definition and a violation of your law does not imply a violation of mine.

~Max
Don't worry, I haven't forgotten. We'll get to that in due course. The argument is, of course, very simple---basically amounting to showing that there are more ways in a system for heat to be evenly distributed, and that thus, whenever 'my' entropy increases, so does 'yours', and hence, whenever 'my' second law is violated, so is 'yours'---but no, I don't expect you to follow that for now.
  #235  
Old 05-13-2019, 01:27 PM
Max S.
Quote:
Originally Posted by DPRK View Post
There is definitely a misunderstanding here. Deterministic or not, the final microstate does not have an "entropy"; the entropy is associated to the MACRO-state in question or, more generally, to a probability distribution.
Thank you for clearing that up; my posts must read like an ignorant fool's. I do appreciate both you and Half Man Half Wit (and others) for putting up with me.

Quote:
Originally Posted by DPRK View Post
In our toy example, the "system entropy" is just proportional to N log(2) like you say. Individual microstates don't have entropy associated to them; they are just points in a phase space.

In our example the macrostate is the observed "mean energy" x. What I was getting at is, some of these states are more probable than others, the most probable state being x = 0.5. How improbable are the other states? Well, the probability of observing a mean greater than or equal to x is bounded by [let's assume 1.0 > x > 0.5 here to get the correct signs] exp(-N(log 2 + x log x + (1-x) log(1-x))); by symmetry there is the same probability of a state with mean less than 1-x. You can see that for a fixed deviation size, the probability of being off by that much or more decays exponentially fast as N grows; an overwhelming number of states are very close to the most probable state; this is the big, i.e., maximal entropy, region.
This part flew over my head. I don't see how N fits into the entropy formula from statistical mechanics. In a system of one coin being flipped over N times, N determines the number of steps, not independent trials. There are two possible macrostates and two accessible microstates. One macrostate is given if the coin starts on heads (macrostateH), the other if the coin starts on tails (macrostateT). If N = 0, the microstates and macrostates are the same: StateH { H }, StateT { T }.

If N = 0,
macrostateH = { H }
macrostateT = { T }
If N = 1,
macrostateH = { H, T }
macrostateT = { T, H }
If N = 2,
macrostateH = { H, T, H }
macrostateT = { T, H, T }
If N = 3,
macrostateH = { H, T, H, T }
macrostateT = { T, H, T, H }
If N = 4,
macrostateH = { H, T, H, T, H }
macrostateT = { T, H, T, H, T }
If N = 5,
macrostateH = { H, T, H, T, H, T }
macrostateT = { T, H, T, H, T, H }
Let us next define a function to calculate the average of heads given a particular macrostate.

Let f(x) be the function calculating the fraction of flips showing heads in set x.
Therefore:
f(macrostateH) = (floor(N/2) + 1) / (N + 1)
f(macrostateT) = floor(N/2) / (N + 1)

Now you ask, what is the probability that the fraction of heads is greater than 0.6? This is simply asking whether f(macrostate) > 0.6. If we pass in macrostateH, we get the inequality (floor(N/2) + 1) / (N + 1) > 0.6, with two non-negative integer solutions: N=0 and N=2.

If we pass in macrostateT, this gives me floor(N/2) / (N + 1) > 0.6 which has no solutions.

I take as a premise that the initial macrostate of the system is random, that is, the probability of the initial state being macrostateH vs macrostateT is 50%/50%.

So to answer that question, if N∈{ 0, 2 } then the probability that the fraction of heads is greater than 0.6 is 50%. For any other N, the probability is 0%. What does this have to do with entropy?

~Max
  #236  
Old 05-13-2019, 06:01 PM
Max S.

Three State System


Quote:
Originally Posted by Half Man Half Wit View Post
If that doesn't make things clearer, let's just go to the most stupendously simple case, a system with two high-entropy states, and one low-entropy state. (Or more accurately, a system with one macrostate realized by two microstates, and another one realized by a single microstate.)

This yields either:
Code:
+ ----> +
+ ----> +
x ----> x
or:

Code:
+ ----> +
+ ----> x
x ----> +
That is, a system where either nothing changes, or, if you're in the low-entropy state, entropy increases with certainty, while in the high-entropy state, in one out of two cases, entropy decreases.
Half Man Half Wit's Three State System
SPOILER:
The tiny system has two possible macrostates: high-entropy macrostateH and low-entropy macrostateL

There are two microstates consistent with macrostateH: microstate A and microstate B.

There is one microstate consistent with macrostateL: microstate C.

Consider now all of the possible evolutions of this system. There are twenty-four possible evolutions, many of which are redundant, so I have reduced that to six. Each evolution has between one and three evolutionary "cycles". Here is a list of all six different evolutions, with their probability in parentheses (assuming a random distribution, which has not yet been established):
  • evolution_1 (3/24)
    • cycle_1: A->B->C->
  • evolution_2 (3/24)
    • cycle_1: A->C->B->
  • evolution_3 (4/24)
    • cycle_1: A->B->
    • cycle_2: C->
  • evolution_4 (4/24)
    • cycle_1: A->
    • cycle_2: B->C->
  • evolution_5 (4/24)
    • cycle_1: A->C->
    • cycle_2: B->
  • evolution_6 (6/24)
    • cycle_1: A->
    • cycle_2: B->
    • cycle_3: C->

Or consider the pictoral cycles in the spoiler.
SPOILER:
Code:
         evolution_1 (3/24)
           cycle_1
           A──>──B
           │     │
           ^     v
           │     │
           └──<──C

         evolution_2 (3/24)
           cycle_1
           A──>──C
           │     │
           ^     v
           │     │
           └──<──B

         evolution_3 (4/24)
     cycle_1    cycle_2
     A──>──┐    C──>──┐
     │     │    │     │
     ^     v    ^     v
     │     │    │     │
     └──<──B    └──<──┘

         evolution_4 (4/24)
     cycle_1    cycle_2
     A──>──┐    B──>──┐
     │     │    │     │
     ^     v    ^     v
     │     │    │     │
     └──<──┘    └──<──C

         evolution_5 (4/24)
     cycle_1    cycle_2
     A──>──┐    B──>──┐
     │     │    │     │
     ^     v    ^     v
     │     │    │     │
     └──<──C    └──<──┘

         evolution_6 (6/24)
cycle_1    cycle_2    cycle_3
A──>──┐    B──>──┐    C──>──┐
│     │    │     │    │     │
^     v    ^     v    ^     v
│     │    │     │    │     │
└──<──┘    └──<──┘    └──<──┘


For the raw table I made of all 24 evolutions, see this spoiler.
SPOILER:
Code:
# mapping cycle_1 cycle_2 cycle_3
 1 1 abc
 2 2 acb
 3 2 bac
 4 1 bca
 5 1 cab
 6 2 cba

 7 3 ab c
 8 5 ac b
 9 3 ba c
10 4 bc a
11 5 ca b
12 4 cb a
13 4 a bc
14 4 a cb
15 5 b ac
16 5 b ca
17 3 c ab
18 3 c ba

19 6 a b c
20 6 a c b
21 6 b a c
22 6 b c a
23 6 c a b
24 6 c b a


You claim that, given the above system with an initial macrostate of macrostateL, in the next time step entropy will either remain constant or increase with certainty. That is correct and I agree.

Then you claim that, given the above system with an initial macrostate of macrostateH, in the next time step entropy will either stay the same or decrease. That is also correct and I agree, although I would not use the verbiage "in one out of two cases" because that could be misinterpreted as a 50%/50% probability, which does not follow. In evolution_3 and evolution_6, for example, the probability of constant entropy/decreasing entropy is 100%/0%.

Then you claim that a system with an initial macrostate of macrostateL is "virtually certain" to evolve into a higher-entropy state, simply by virtue of the relative number of accessible microstates.

You have not assigned any sort of probability to the different possible evolutions of a system so it doesn't make sense to assert that an observer is likely to observe any particular macrostate after one time step. Try as you may, unless you flesh out the evolution matrix (which means you know the microscopic dynamics) or assume the microscopic evolution is as random as the initial microstate, you cannot make that conclusion.

Once you assume a random evolution, it is easy to show that a macrostateL has a 10/24 probability of keeping the same macrostate after one timestep compared to a 14/24 probability of changing to macrostateH; that a macrostateH has a 7/24 probability of changing to a macrostateL while the probability of staying macrostateH is 17/24.
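
Those fractions can be checked mechanically. Here is a Python sketch that enumerates the same twenty-four listings as my raw table, each listing being an ordering of A, B, C cut into consecutive cycles (the ordering-times-composition scheme is my own reading of that table):

Code:
from itertools import permutations
from fractions import Fraction

STATES = "ABC"
# compositions of 3: the cycle sizes of one listing, in order
COMPOSITIONS = [(3,), (2, 1), (1, 2), (1, 1, 1)]

def listing_to_map(order, comp):
    # turn an ordering of the states plus a composition into a
    # microstate -> next-microstate map (each group is one cycle)
    mapping, i = {}, 0
    for size in comp:
        cycle = order[i:i + size]
        for j, s in enumerate(cycle):
            mapping[s] = cycle[(j + 1) % size]
        i += size
    return mapping

maps = [listing_to_map(order, comp)
        for order in permutations(STATES)
        for comp in COMPOSITIONS]

n = len(maps)  # 24 listings, matching the raw table
p_L_stays = Fraction(sum(m["C"] == "C" for m in maps), n)
# from macrostateH, the microstate is A or B with probability 1/2 each
p_H_to_L = Fraction(sum((m["A"] == "C") + (m["B"] == "C") for m in maps), 2 * n)
print(p_L_stays, 1 - p_L_stays, p_H_to_L, 1 - p_H_to_L)
# -> 5/12 7/12 7/24 17/24, i.e. 10/24, 14/24, 7/24 and 17/24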

~Max
  #237  
Old 05-13-2019, 06:11 PM
DPRK is offline
Guest
 
Join Date: May 2016
Posts: 3,033
Quote:
Originally Posted by Max S. View Post
I don't see how N fits into the entropy formula from statistical mechanics.
It should not come as too much of a surprise that the entropy will be proportional to N, the number of particles. For instance, consider the entropy of the entire system. There are 2^N equally probable states. The logarithm of this number is N log 2.
Quote:
In a system of one coin being flipped over N times, N determines the number of steps, not independent trials.
The system consists of N atoms, not one atom observed at N discrete times. You may regard the state of each atom as determined by an independent coin flip because I stipulated that there was no interaction among different atoms. In our example they are just randomly flipping around; the details are irrelevant.
Quote:
There are two possible macrostates and two accessible microstates.
The possible macrostates are the possible number of heads: 0, 1, 2, ..., N. The state with k heads consists of N! / (k! (N-k)!) micro-states; the logarithm of this number is the entropy of being in that particular state (obtained by counting all of them, because in this case they are equally probable, then taking the logarithm), up to Boltzmann's constant.
Quote:
Let us next define a function to calculate the average of heads given a particular macrostate.
If N is not too small, you may approximate the number of heads as normally distributed with mean N/2 and variance N/4.
Quote:
Now you ask, what is the probability that the ratio of heads to tails is greater than 0.6?
0.6 was supposed to be more simply the proportion of heads. I did actually do some calculations, which you should check; my conclusion was that this is less than exp(-0.02 N). For example, if N = 3, the probability of at least 2 heads is 0.5 < exp(-0.06) = 0.9418

Quote:
What does this have to do with entropy?
It goes to demonstrate that, even in a wildly fluctuating system like this one, as soon as you have more than a few atoms the physical properties will be dominated by the most probable (= maximal entropy) state, and also that the probability of the system being in a state with entropy S will be e^S / (total # of microstates) (here I have again suppressed Boltzmann's constant), which will be indistinguishable from zero if S is not the maximum possible value (and N is at least some reasonable number). Run a computer simulation if you don't believe it...
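
For instance, something as crude as this Python sketch will do (nothing here is optimized; one 'atom' is one coin, and re-randomizing every atom each step is as wildly fluctuating as the dynamics can get):

Code:
import random

N, steps = 100, 10_000
extremes = 0
for _ in range(steps):
    heads = sum(random.getrandbits(1) for _ in range(N))
    if abs(heads - N / 2) > 0.1 * N:  # proportion of heads outside [0.4, 0.6]
        extremes += 1
print(extremes / steps)  # about 0.035 for N = 100; vanishingly small by N = 1000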
  #238  
Old 05-14-2019, 12:07 AM
Half Man Half Wit's Avatar
Half Man Half Wit is online now
Guest
 
Join Date: Jun 2007
Posts: 6,672
Quote:
Originally Posted by Max S. View Post
You have not assigned any sort of probability to the different possible evolutions of a system so it doesn't make sense to assert that an observer is likely to observe any particular macrostate after one time step. Try as you may, unless you flesh out the evolution matrix (which means you know the microscopic dynamics) or assume the microscopic evolution is as random as the initial microstate, you cannot make that conclusion.
The exact opposite is the case. The evolution is just the laws of physics for the given system; I trust you will grant me that they don't change between experiments. We may not know those laws, we may not even have any idea that there are such further laws (after all, we only see the macrostate), but they're there, and they are given by one of the possible evolutions; one of the six matrices:

Code:
     (1 0 0)          (1 0 0)          (0 1 0)
M1 = (0 1 0)     M2 = (0 0 1)     M3 = (1 0 0)
     (0 0 1)          (0 1 0)          (0 0 1)

     (0 1 0)          (0 0 1)          (0 0 1)
M4 = (0 0 1)     M5 = (1 0 0)     M6 = (0 1 0)
     (1 0 0)          (0 1 0)          (1 0 0)
Whatever way we set up the system, its underlying dynamics will always be given by one---and just one---of those matrices, just like the underlying dynamics of the molecules of a classical gas are always given by the same Newtonian physics.

The relevance of the preceding exercise is just to establish that no matter which of those describes the correct microscopic evolution law, the conclusion holds that entropy will more often increase (or stay constant) than decrease.

There's no sense to assigning probabilities to these evolutions; the laws of physics don't get chosen anew upon each experiment (if they did, this whole science business would be right out of the window). We don't know which one is true, and can't tell based on our macroscopic observations (remember, we only know the macrostate, not that it's, for example, realized by two versus two hundred or two million microstates). But no matter which one it is, we'll come away describing the macroscopic world that's accessible to our investigations by means of the second law (with overwhelming likelihood).
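
If it helps, a few lines of Python can walk through all six candidate laws and tally, over the three microstates, how often entropy increases, stays constant, or decreases (a sketch: I encode each matrix by the permutation it effects):

Code:
from itertools import permutations

MACRO = {"A": "H", "B": "H", "C": "L"}  # two high-entropy microstates, one low

for perm in permutations("ABC"):
    step = dict(zip("ABC", perm))       # one candidate microscopic law (M1..M6)
    inc = sum(MACRO[s] == "L" and MACRO[step[s]] == "H" for s in "ABC")
    dec = sum(MACRO[s] == "H" and MACRO[step[s]] == "L" for s in "ABC")
    print(perm, "increase:", inc, "constant:", 3 - inc - dec, "decrease:", dec)
# under every law, increases exactly balance decreases and the rest stay
# constant, so entropy increases or stays constant at least as often as
# it decreases, whichever law nature happens to have picked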
  #239  
Old 05-15-2019, 01:03 AM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by DPRK View Post
It should not come as too much of a surprise that the entropy will be proportional to N, the number of particles. For instance, consider the entropy of the entire system. There are 2N equally probable states. The logarithm of this number is N log 2.

The system consists of N atoms, not one atom observed at N discrete times. You may regard the state of each atom as determined by an independent coin flip because I stipulated that there was no interaction among different atoms. In our example they are just randomly flipping around; the details are irrelevant.

The possible macrostates are the possible number of heads: 0, 1, 2, ..., N. The state with k heads consists of N! / k! (N-k)! micro-states; the logarithm of this number is the entropy of being in that particular state (obtained by counting all of them, because in this case they are equally probably, then taking the logarithm), up to Boltzmann's constant.
OK, so it is a system of N coins and each coin can be heads or tails. I agree that makes 2^N equally probable states; 2^N = Ω. I can derive log_2(Ω) = N. How are you getting N log 2 and what does that signify? It can't be the entropy: S = k_B log_e(Ω) = k_B log_e(2^N)

Quote:
If N is not too small, you may approximate the number of heads as normally distributed with mean N/2 and variance N/4.

0.6 was supposed to be more simply the proportion of heads. I did actually do some calculations, which you should check; my conclusion was that this is less than exp(-0.02 N). For example, if N = 3, the probability of at least 2 heads is 0.5 < exp(-0.06) = 0.9418
Bear with me as I relearn basic statistics...

Let N = 3 and let X = { 0, 1, 1, 1, 2, 2, 2, 3 }
Code:
i │ state │ total heads (Xi)
──┼───────┼─────────────────
1 │ 0 0 0 │ 0
2 │ 1 0 0 │ 1
3 │ 0 1 0 │ 1
4 │ 0 0 1 │ 1
5 │ 0 1 1 │ 2
6 │ 1 0 1 │ 2
7 │ 1 1 0 │ 2
8 │ 1 1 1 │ 3
Therefore the mean = 3/2
SPOILER:
(0+1+1+1+2+2+2+3)/8 = 12/8 = 3/2

Therefore the standard deviation σ = sqrt(3)/2
SPOILER:
Code:
σ = sqrt( (1/Ω) * Σ_{i=1..Ω} (X_i - mean)^2 )

σ = sqrt( (1/2^3) * ((0 - 3/2)^2 + (1 - 3/2)^2 + (1 - 3/2)^2 + (1 - 3/2)^2
                   + (2 - 3/2)^2 + (2 - 3/2)^2 + (2 - 3/2)^2 + (3 - 3/2)^2) )

σ = sqrt( (1/8) * (9/4 + 1/4 + 1/4 + 1/4 + 1/4 + 1/4 + 1/4 + 9/4) )

σ = sqrt( (1/8) * (24/4) )

σ = sqrt(3/4)

σ = sqrt(3)/2

Therefore the variance σ^2 = 3/4

So far everything checks out. Now what is the probability that the number of heads X_i for a given state is >= 2? This is just a binomial distribution, so it would be the binomial coefficients divided by the number of states, 2^N.

The probability that a random state has 2 heads is 3/8.
SPOILER:
Code:
(N choose k) / Ω = N! / (Ω * k! * (N-k)!)

(3 choose 2) / 8 = 3! / (8 * 2! * (3-2)!) = (1*2*3) / (8 * 1*2 * 1) = 6/16 = 3/8

The probability that a random state has 3 heads is 1/8.
SPOILER:
Code:
(3 choose 3) / 8 = 3! / (8 * 3! * (3-3)!) = (1*2*3) / (8 * 1*2*3 * 1) = 6/48 = 1/8

Therefore the probability that a random state has at least two heads is 1/2.
SPOILER:
Code:
Σ_{k=2..N} (N choose k) / Ω = (3 choose 2)/8 + (3 choose 3)/8 = 3/8 + 1/8 = 4/8 = 1/2


Alright, so the math checks out. But I'm not sure how you derived "exp(-0.02 N)", which seems to be an arbitrary number. I have yet to fit in entropy or the second law of thermodynamics.

~Max
  #240  
Old 05-15-2019, 02:52 AM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by Half Man Half Wit View Post
The relevance of the preceding exercise is just to establish that no matter which of those describes the correct microscopic evolution law, the conclusion holds that entropy will more often increase (or stay constant) than decrease.
I will concede this general rule for the first time step in systems where there are more microstates corresponding to relatively high-entropy macrostates than there are corresponding to relatively low-entropy macrostates.

~Max
  #241  
Old 05-15-2019, 03:23 AM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by Half Man Half Wit View Post
Quote:
Originally Posted by Half Man Half Wit View Post
7. There are more ways of realizing a state that's pretty uniformly grey, than there are to realize a state that's (say) all white in box A, and all black in box B.
8. Consequently, there are more ways to go from a state that's slightly inhomogeneous to one that's more homogeneous, than there are ways to go to a state that's even more inhomogeneous.
I would encourage you to play around with this a little. See if you can find an evolution such that entropy increase won't, overall, occur more often than entropy decrease.
First, the equilibrium evolution you presented satisfies your request and contradicts step 8. I must also point out that the two statements I quoted are not equivalent - I can create an example that contradicts step 8 while entropy increases do not, overall, occur more often than entropy decreases. And again I must object to the verbiage "occur more often" because we are only talking about a single time step, not prolonged observation of a system. Continual observation of a system will necessarily show that entropy is periodic no matter the particular microscopic evolution.

Code:
+ -> + -> + -> + ->

+ -> * ->

* -> x -> * -> x ->
I've added an extra microstate consistent with the low-entropy macrostate. I have also split the evolution into three cycles (three lines in the code box). At the end of the cycle it starts anew at the first microstate on the line. You can see that there are 5 microstates consistent with a high-entropy macrostate, 3 microstates consistent with an intermediate-entropy macrostate, and 2 microstates consistent with a low-entropy macrostate. Thus the system complies with step 7.

If we were sure the system is currently in an intermediate-entropy macrostate then we have a 2/3 chance of observing a decrease in entropy at the next instant. There is only a 1/3 chance of observing an increase in entropy at the next instant. Higher entropy means more homogeneity, and so it would seem this particular evolution directly contradicts step 8 above.
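
Here is a mechanical check of that 2/3, in the same Python sketch style as before (the encoding of the cycles is my own):

Code:
# the three cycles from the code box above; each symbol is one microstate
# (+ = high entropy, * = intermediate, x = low)
cycles = [list("++++"), list("+*"), list("*x*x")]

symbol, succ = {}, {}
for ci, cyc in enumerate(cycles):
    for pos, sym in enumerate(cyc):
        symbol[(ci, pos)] = sym
        succ[(ci, pos)] = (ci, (pos + 1) % len(cyc))  # wrap to start of line

mid = [s for s in symbol if symbol[s] == "*"]  # intermediate-entropy microstates
to_low = sum(symbol[succ[s]] == "x" for s in mid)
to_high = sum(symbol[succ[s]] == "+" for s in mid)
print(len(mid), to_low, to_high)  # -> 3 2 1: a 2/3 chance of a decrease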

Not that I'm making the connection between step 8 and step 9.

~Max
  #242  
Old 05-15-2019, 01:08 PM
Half Man Half Wit's Avatar
Half Man Half Wit is online now
Guest
 
Join Date: Jun 2007
Posts: 6,672
Quote:
Originally Posted by Max S. View Post
I will concede this general rule for the first time step in systems where there are more microstates corresponding to relatively high-entropy macrostates than there are corresponding to relatively low-entropy macrostates.

~Max
And since the further steps are just the same as the first step, with the same evolution law, it holds for those, as well.

Quote:
Originally Posted by Max S. View Post
First, the equilibrium evolution you presented satisfies your request and contradicts step 8.
Agreed. If there is no way to either go to a higher- or lower-entropy state---if nothing ever changes, in other words---then there are indeed not more ways to go to a higher-entropy state.

But that doesn't impact the overall conclusion, since the second law still holds if nothing changes. I could have formulated that more stringently, but I am still hoping that these examples are accepted in the spirit they're given.

After all, while you can of course come up with all sorts of contrivances such that a box full of white and black balls doesn't mix when it's shaken---perhaps the balls are glued into place, or are magnetic, or maybe weigh a ton each---the intent, and the conclusion it leads to, isn't really threatened by such nitpicking: it's perfectly clear, if I hand you a box, with white balls on the one side, and black balls on the other, and you shake it, the balls will mix.

Quote:
And again I must object to the verbiage "occur more often" because we are only talking about a single time step, not prolonged observation of a system.
'Occur more often' here means in repetitions of the same experiment: i. e. we set the system up in the same macrostate, and observe what happens.

It's irrelevant, but the same holds true for prolonged observation, as well: after all, we can take each further step as the first one of a new experiment.

Quote:
Continual observation of a system will necessarily show that entropy is periodic no matter the particular microscopic evolution.
Sure, but only if we could observe the system for arbitrary lengths of time. That, however, we can't, being finite beings and all. And for any even slightly macroscopic system, the length of time we'd need to observe it in order to have even a tiny chance at observing an entropy reversal far exceeds the age of the universe.

Quote:
I've added an extra microstate consistent with the low-entropy macrostate. I have also split the evolution into three cycles (three lines in the code box). At the end of the cycle it starts anew at the first microstate on the line. You can see that there are 5 microstates consistent with a high-entropy macrostate, 3 microstates consistent with a intermediate-entropy macrostate, and 2 microstates consistent with a low-entropy macrostate. Thus the system complies with step 7.
Sure. For these tiny systems, you can set up certain pathological cases. As I pointed out earlier myself:
Quote:
Originally Posted by Half Man Half Wit View Post
(One awkwardness of the current setup, I should point out before we bump into it, is that the absolute number of high entropy states is smaller than the number of intermediate entropy states. Thus, you could have all high entropy states transition to intermediate entropy states; this is no longer the case for larger systems, where the high entropy states will vastly dominate.)
But again, this doesn't impact the basic point. For a larger system, there will be more than twice as many intermediate-entropy states than there are low-entropy states; for the coins, for example, there are (100 choose 1) = 100 next-to-minimal entropy states, and (100 choose 2) = 4950 next higher entropy states, and so on.

And yes, even for macroscopic systems, the difference between the number of maximum-entropy states (microstates corresponding to, I should always say) and almost-maximum entropy states will not be so great. So indeed, there will be fluctuations away from equilibrium, which will typically be so small as to be undetectable.

But I'm not aiming at mathematical proof here. That's readily available in a multitude of textbooks. I want to make it intuitively clear to you that the reason all the gas in a room doesn't bunch up in the left corner isn't because there's a law that states that 'gas atoms must always tend to the greatest possible dispersal'---which would introduce a weird form of teleology and downward-causation---but rather, because there's so many more ways to be evenly distributed rather than bunched up, and hence, any change to a bunched-up system is much more likely to lead to a less bunched-up system than the other way around.

A teapot, once shattered, will not reassemble itself, no matter how much you shake the parts---even though that is theoretically possible, and one could contrive laws of physics that privilege 'teapotness' as a state of certain assemblies in order to have it spontaneously reassemble.

But note that even in your example, for the majority of (macro)states---'on average' in a suitable sense---the second law will hold; and for the majority of evolutions---again, 'on average'---it will hold for all states.

Also note that the violation will be short-lived: after a first-step reduction, we will with certainty observe an increase, again.

So, for the typical universe (with typical laws of physics), and for typical states, we will, typically, observe an entropy increase for large enough systems. Claiming that it's not so needs at least an argument as to why our universe shouldn't be typical.

A more sophisticated treatment of this sort of thing would go into the notion of ergodicity, and when and how it's justified. I can't provide that here, not to the level of detail necessary to satisfy your curiosity.

So, if you're not willing to accept a little waffling on the notion of 'typical', and agree that, provided we're in a typical world, we'll typically see entropy increase, I'm afraid there's nothing short of a full course in statistical mechanics that will suffice. There's only so much you can simplify without becoming flat wrong, and I've at least skirted that edge so far; so I'm afraid you'll have to take one horn of the dilemma---either accept a certain degree of appeal to intuition, like that a box full of black and white marbles will mix if stirred, thus obtaining an intuitive and easily apprehended picture of how the second law works and how it can be violated, or insist on a full-on formal treatment, thus needing the corresponding full-on formalism of statistical mechanics to go along with it.

Try as I might, I don't think there's a road through the middle here---I can't both be perfectly accurate and keep this manageably simple. If that's what you require, I'm sorry; but I think if you're willing to work with me just a little, and accept certain obvious, but, I agree, not rigorously proven statements---like that black and white balls tend to mix upon being shaken---I think you can achieve a much better understanding of the second law.

Quote:
Originally Posted by Max S. View Post
Not that I'm making the connection between step 8 and step 9.

~Max
So does that mean you're retracting your earlier admission of that point?

Quote:
Originally Posted by Max S. View Post
I will concede that one might formulate a law to the effect that 'greyness only increases', and further, that law can be violated. What is your point?

~Max

Last edited by Half Man Half Wit; 05-15-2019 at 01:11 PM.
  #243  
Old 05-15-2019, 02:27 PM
DPRK is offline
Guest
 
Join Date: May 2016
Posts: 3,033
Quote:
Originally Posted by Max S. View Post
OK, so it is a system of N coins and each coin can be heads or tails. I agree that makes 2N equally probable states; 2N = Ω. I can derive log2(Ω) = N. How are you getting N log 2 and what does that signify? It can't be the entropy: S = kB loge(Ω) = kB loge(N2)
Thermodynamic entropy is proportional to the logarithm of the total number of states. log(2^N) = N log(2), in any base (this is all up to a constant anyway). The thing to notice is that you have a system composed of independent subsystems, and the entropy of the total system is the sum of the entropies of the subsystems.
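
To put numbers on that proportionality, here is a small Python sketch (math.comb is the binomial coefficient); it also shows how quickly the single most probable macrostate comes to carry nearly all of the entropy:

Code:
import math

# S = log Ω; for the full system Ω = 2^N, so S = N log 2
for N in (10, 100, 1000, 10_000):
    S_total = N * math.log(2)
    S_peak = math.log(math.comb(N, N // 2))  # entropy of the k = N/2 macrostate
    print(N, round(S_peak / S_total, 4))
# the ratio climbs toward 1 (roughly 0.80, 0.96, 0.99, 0.999): the
# maximal-entropy macrostate alone accounts for almost all microstates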
Quote:

Alright, so the math checks out. But I'm not sure how you derived "exp(-0.02 N)", which seems to be an arbitrary number. I have yet to fit in entropy or the second law of thermodynamics.
I did say you should check the calculation yourself. Use Chernoff's inequality or something similar. The resulting bounds are more interesting if you take N = 100 or 1000 instead of 2 or 3.
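
If you would rather have exact numbers than a bound, brute force over the binomial suffices; a sketch:

Code:
import math

def p_above(N, frac=0.6):
    # exact P(#heads > frac * N) for N independent fair coins
    k_min = math.floor(frac * N) + 1
    return sum(math.comb(N, k) for k in range(k_min, N + 1)) / 2**N

for N in (10, 100, 1000):
    print(N, p_above(N), math.exp(-0.02 * N))
# the exact tail probability stays below exp(-0.02 N), and collapses far
# faster than that bound as N grows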

This is all just going through the definition of entropy in terms of complexity (maybe it would be more instructive to analyse a non-ideal gas or a solid or a 2-D Ising model or something). These microscopic calculations all have to do with explaining how behaviour like heat spontaneously flowing from a hotter body to a colder one, but not the other way around, arises. Some hypothetical long-term recurrence, or lack thereof, doesn't really have anything to do with it.
  #244  
Old 05-15-2019, 03:17 PM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by Half Man Half Wit View Post
Quote:
Originally Posted by Max S. View Post
Quote:
Originally Posted by Half Man Half Wit View Post
The relevance of the preceding exercise is just to establish that no matter which of those describes the correct microscopic evolution law, the conclusion holds that entropy will more often increase (or stay constant) than decrease.
I will concede this general rule for the first time step in systems where there are more microstates corresponding to relatively high-entropy macrostates than there are corresponding to relatively low-entropy macrostates.

~Max
And since the further steps are just the same as the first step, with the same evolution law, it holds for those, as well.
...
Quote:
Originally Posted by Max S. View Post
And again I must object to the verbiage "occur more often" because we are only talking about a single time step, not prolonged observation of a system.
'Occur more often' here means in repetitions of the same experiment: i. e. we set the system up in the same macrostate, and observe what happens.

It's irrelevant, but the same holds true for prolonged observation, as well: after all, we can take each further step as the first one of a new experiment.
I think you are wrong about this. Let's take the tiny three state system with evolution_4:

Quote:
Originally Posted by Max S. View Post
The tiny system has two possible macrostates: high-entropy macrostateH and low-entropy macrostateL

There are two microstates consistent with macrostateH: microstate A and microstate B.

There is one microstate consistent with macrostateL: microstate C.

Consider now all of the possible evolutions of this system. There are twenty-four possible evolutions, many of which are redundant so I have reduced that to six. Each evolution has between one and three evolutionary "cycles". Here is a list of all six different evolutions, with their probability in parenthesis (assuming a random distribution, which has not yet been established):
...
  • evolution_4 (4/24)
    • cycle_1: A->
    • cycle_2: B->C->
...
Code:
...
    evolution_4 (4/24)
cycle_1    cycle_2
A──>──┐    B──>──┐
│     │    │     │
^     v    ^     v
│     │    │     │
└──<──┘    └──<──C
...
If the system starts out in low-entropy macrostateL, the probability of observing an increase in entropy at the next step is 100%. The probability of observing a net increase in entropy after two steps is 0%. Given a random number of steps n, the probability of observing a net increase in entropy over n steps is 100% if n is odd and 0% if n is even.

For comparison, if the system starts out in high-entropy macrostateH, the probability of observing a decrease in entropy at the next step is 50%. The probability of observing a net decrease in entropy after two steps is 0%. The probability of observing a net decrease in entropy over n steps is 50% if n is odd and 0% if n is even.

The takeaway is that with this system, the probability of observing a net increase in entropy over an even number of steps is exactly the same as the probability of observing a net decrease - 0%. If the number of steps observed is random, there is at least a 50% chance that net entropy will not change at all.
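
That parity behaviour is easy to confirm by iterating evolution_4 directly; a small Python sketch:

Code:
# evolution_4: A -> A, B -> C, C -> B; macrostate H = {A, B}, L = {C}
step = {"A": "A", "B": "C", "C": "B"}
S = {"A": 1, "B": 1, "C": 0}  # entropy label: 1 = high, 0 = low

def net_change(start, n):
    # net entropy change after n time steps from a given microstate
    s = start
    for _ in range(n):
        s = step[s]
    return S[s] - S[start]

for n in (1, 2, 3, 4):
    print(n, {x: net_change(x, n) for x in "ABC"})
# odd n : C gains entropy with certainty, B loses it, A is unchanged
# even n: every microstate is back where it started, so net change is 0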

Quote:
Originally Posted by Half Man Half Wit View Post
Try as I might, I don't think there's a road through the middle here---I can't both be perfectly accurate and keep this manageably simple. If that's what you require, I'm sorry; but I think if you're willing to work with me just a little, and accept certain obvious, but, I agree, not rigorously proven statements---like that black and white balls tend to mix upon being shaken---I think you can achieve a much better understanding of the second law.
I'm willing to move past steps 8 and 9 for the sake of argument. After all, you haven't revealed (or rather, I haven't considered) your full argument as to how the second law of thermodynamics is regularly violated. I have yet to see how this business about macrostates has any relevance to the second law of thermodynamics.

Quote:
Originally Posted by Half Man Half Wit View Post
So does that mean you're retracting your earlier admission of that point?
I think I am being consistent in saying that it is possible for the observer to observe only increases in entropy, and therefore formulate a law to the effect that 'entropy only increases' or 'greyness only increases'. I can take this position without admitting that the observer necessarily or even likely observes only increases in entropy.

I do appreciate your participation in this thread and I am willing to drop my objections to the marbles example. The conclusion from that example was never controversial, and is in fact identical to the conclusion from the coin-flipping example. I only took issue with the steps taken to reach said conclusion.

~Max
  #245  
Old 05-15-2019, 03:18 PM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by Half Man Half Wit View Post
Don't worry about that for now. This is going to be a bit of a journey, and you keep running into confusions by getting ahead of yourself. But for now, the first step has been taken: we agree that it's possible to have a law, valid to all appearances in the macroscopic realm, concerning a certain quantity ('greyness') such that once that quantity has been understood on a microscopic level, we understand that the original law can only hold in an approximate sense. We can build from here.

Now, to the next step.
  1. Suppose you have two boxes, A and B.
  2. Box A is filled with white marbles, and box B is filled with black ones.
  3. Both boxes are placed on a vibrating plate, such that the marbles in them bounce around.
  4. The walls of the boxes are removable.
  5. Suppose you put both boxes next to one another, and remove the now adjacent walls, creating one big box.
  6. Marbles from the white box A will bounce into the black box, and marbles from the black box B will bounce into the white box.
  7. There are more ways of realizing a state that's pretty uniformly grey, than there are to realize a state that's (say) all white in box A, and all black in box B.
  8. Consequently, there are more ways to go from a state that's slightly inhomogeneous to one that's more homogeneous, than there are ways to go to a state that's even more inhomogeneous.
  9. To any observer who, as before, is only capable of seeing gross colors, the formerly black-and-white separated box will gradually tend to a shade of even gray.
  10. That observer might formulate a law, stating that black and white, if brought into contact, eventually even out to a uniform gray.
  11. Knowing the microscopic description, we know that this is just, again, a law of averages: there is nothing that prohibits, say, all or a sizable fraction of the white marbles from bouncing back into box A.
  12. Given a long enough timescale, the even grey will, eventually, separate into black and white patches.

I expect greater resistance with this example. But again, try not to think ahead to the rest of this discussion; just consider the above system, as I have presented it. Do you agree that the conclusion is reasonable, here? That there is once again a law that appears valid thanks to the limited observations made at the macroscale, which we can see must be violated once we know about the microscopic level?
Conceded (emphasis mine).

~Max

Last edited by Max S.; 05-15-2019 at 03:19 PM. Reason: emphasis
  #246  
Old 05-16-2019, 12:10 AM
Half Man Half Wit's Avatar
Half Man Half Wit is online now
Guest
 
Join Date: Jun 2007
Posts: 6,672
Quote:
Originally Posted by Max S. View Post
I think you are wrong about this. Let's take the tiny three state system with evolution_4:
My point was merely that the laws of physics don't change for the next 'hop' in the evolution, so to speak. So that for sufficiently large systems, with many billions of microstates, you will typically observe a continuous increase of entropy.

Quote:
I think I am being consistent in saying that it is possible for the observer to observe only increases in entropy, and therefore formulate a law to the effect that 'entropy only increases' or 'greyness only increases'. I can take this position without admitting that the observer necessarily or even likely observes only increases in entropy.
For there to be a law, I don't think 'possible' would work. After all, laws aren't just formulated on the basis of single-shot observations; rather, experiments are repeated to exclude the possibility of chance events.

So (as, really, always in this thread), we'll have to look at what's more likely. And here, the possibility that one observes 100 coin throws yielding heads in a row doesn't impact on the fact that anybody who merely observes aggregated coin throw results will not see such an outcome.

Quote:
Originally Posted by Max S. View Post
Conceded (emphasis mine).

~Max
Great! I think we have come a really long way, here. Note how you've been opposed to the possibility that a law, formulated at the macroscopic level and assumed to be perfectly valid there, may receive a deeper explanation at the microscopic level, where it becomes clear that the law isn't, in fact, inviolable. This law concerned a macroscopic quantity---greyness---that doesn't directly map to the microscopic properties of the system (the marbles are either black or white), but is explained by them.

Now, what remains to do is to convince you that this sort of thing is what actually happened with respect to the second law. That is:
  1. Entropy was first assumed to be a macroscopic quantity of a system.
  2. Entropy can be explained in terms of the microscopic, by connecting it with the notion of how many microstates there are per macrostate.
  3. A formulation of the second law is 'heat does not spontaneously pass from a colder to a hotter body'.
  4. The distribution of heat can be explained by microscopic transport phenomena.
  5. A transport of heat from the colder to the hotter body is a transition from a state with more microstates per macrostate to one with fewer.
  6. Hence, the second law may be violated if the system evolves against likelihood, from a higher-entropy to a lower-entropy state.

Do you agree that that's roughly what needs to be established?
  #247  
Old 05-16-2019, 01:18 AM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by Half Man Half Wit View Post
Now, what remains to do is to convince you that this sort of thing is what actually happened with respect to the second law. That is:
  1. Entropy was first assumed to be a macroscopic quantity of a system.
  2. Entropy can be explained in terms of the microscopic, by connecting it with the notion of how many microstates there are per macrostate.
  3. A formulation of the second law is 'heat does not spontaneously pass from a colder to a hotter body'.
  4. The distribution of heat can be explained by microscopic transport phenomena.
  5. A transport of heat from the colder to the hotter body is a transition from a state with more microstates per macrostate to one with fewer.
  6. Hence, the second law may be violated if the system evolves against likelihood, from a higher-entropy to a lower-entropy state.

Do you agree that that's roughly what needs to be established?
If the next goal is to admit that the second law of thermodynamics can be violated, conceding each of those points would satisfy the goal.

We can start at the top, point #1. I think I agree, but it depends on what you mean by "a macroscopic quantity of a system". If you mean that my definition of entropy makes it a property of the whole system, we are in agreement.

~Max
  #248  
Old 05-16-2019, 03:13 AM
Half Man Half Wit's Avatar
Half Man Half Wit is online now
Guest
 
Join Date: Jun 2007
Posts: 6,672
Quote:
Originally Posted by Max S. View Post
If the next goal is to admit that the second law of thermodynamics can be violated, conceding each of those points would satisfy the goal.



We can start at the top, point #1. I think I agree, but it depends on what you mean by "a macroscopic quantity of a system". If you mean that my definition of entropy makes it a property of the whole system, we are in agreement.



~Max
I mean a quantity that's a function of the macrostate of the system; that is, a function of state variables like temperature, volume, and pressure, as opposed to the microscopic variables like (generalized) positions and momenta of individual particles.

That means, in this definition, it's only meaningful in the context where these quantities are meaningful. Temperature, for example, is just an average property, only applicable to the macroscopic world. Entropy, thus, is likewise; and, since we can define temperature in terms of microscopic quantities, we can do so for entropy, as well.
  #249  
Old 05-16-2019, 08:59 AM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by Half Man Half Wit View Post
I mean a quantity that's a function of the macrostate of the system; that is, a function of state variables like temperature, volume, and pressure, as opposed to the microscopic variables like (generalized) positions and momenta of individual particles.

That means, in this definition, it's only meaningful in the context where these quantities are meaningful. Temperature, for example, is just an average property, only applicable to the macroscopic world. Entropy, thus, is likewise; and, since we can define temperature in terms of microscopic quantities, we can do so for entropy, as well.
Very well. But how do you explain #2?

Quote:
2. Entropy can be explained in terms of the microscopic, by connecting it with the notion of how many microstates there are per macrostate.
What does the count of an isolated system's microstates have to do with entropy? If the system is isolated there cannot be any heat flow external to the system, δQ. I will admit that heat and temperature can each be a function of the microstate of a system - for example, the count of particles and the sum of particle velocities, respectively. But that doesn't involve the count of microstates, and due to the first law of thermodynamics it won't change over time. Besides, if δQ is zero, it doesn't matter what value T has because the change in entropy will still be zero.

~Max

Last edited by Max S.; 05-16-2019 at 09:03 AM. Reason: strikethrough
  #250  
Old 05-16-2019, 12:42 PM
Half Man Half Wit's Avatar
Half Man Half Wit is online now
Guest
 
Join Date: Jun 2007
Posts: 6,672
Quote:
Originally Posted by Max S. View Post
Very well. But how do you explain #2?



What does the count of an isolated system's microstates have to do with entropy? If the system is isolated there can not be any change in heat flow external to the system δQ. I will admit that heat and temperature can be a function over the microstate of a system, for example the count of particles and sum of particle velocities respectively. But that doesn't involve the count of microstates and due to the first law of thermodynamics it won't change over time. Besides, if δQ is zero, it doesn't matter what value T has because the change in entropy will still be zero.

~Max
I'm not completely sure, but I think maybe you're getting hung up on the definition of a 'system'. But ultimately, a 'system' just is what you consider to be a system. I can think of two volumes of gas as two systems, or as a single one. Just consider the classic thermodynamic case of a box with a partition that can be inserted and removed frictionlessly, i. e. without performing any work. As long as the partition is present, we will have two isolated systems; once it is removed, we can think of it either as one system, or as two, no longer isolated from each other.

Now suppose that the gas volumes have different temperatures. Let's say, volume A is hotter, volume B is colder. Then, once you've removed the partition, heat will (tend to) flow from A to B, until both systems are in equilibrium with one another. This is the second law from one perspective.

Alternatively, we can think of the gas after the partition has been removed as one system in a state that's far from equilibrium---with one side of it hotter, the other colder. The move towards equilibrium is one towards a more homogeneous, more 'disordered' state---a move of the sort we've by now examined copiously: towards higher entropy in the sense of transitioning to a more likely state.

The fact that the entropy of the total system can't decrease yields the fact that no heat can be transferred from the colder system to the hotter one. If an amount of heat dQ flows into B, its entropy will be increased by, as you know, dS_B = dQ/T_B. Likewise, via an amount of heat dQ flowing out of A, the entropy is decreased, by dS_A = -dQ/T_A.

The total change in entropy of the combined system is then:

dS = dS_B + dS_A = dQ/T_B - dQ/T_A

Now, for T_A = T_B, that's zero; but in that case, also no heat can get transferred. If T_A > T_B, we have a positive dS, and likewise, heat flowing from the hotter to the colder system.

If, however, T_A < T_B, dS would be negative, and heat would flow from the colder to the hotter system. Consequently, if heat flows from the colder to the hotter system, the total entropy of the combined system decreases, and vice versa---if heat flows from the hotter to the colder system, total entropy increases.
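
Plugging in numbers makes the sign argument vivid (a trivial Python sketch; the temperatures are arbitrary):

Code:
dQ = 1.0  # joules, flowing out of A and into B
for T_A, T_B in [(400.0, 300.0), (300.0, 300.0), (300.0, 400.0)]:
    dS = dQ / T_B - dQ / T_A
    print(f"T_A={T_A:.0f} K, T_B={T_B:.0f} K: dS = {dS:+.6f} J/K")
# hot -> cold gives dS > 0; equal temperatures give dS = 0;
# cold -> hot gives dS < 0, which is exactly what the second law forbids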

Thus, we can state the second law in two equivalent ways:
  1. Heat only flows from a hotter (A) to a colder system (B)
  2. The total entropy of a system (A + B) never decreases

The second formulation of the second law is the one we will connect with the notion of microstates. The thing to realize here is simply that heat is like greyness: there are more microstates corresponding to states of equally distributed heat than there are corresponding to states of inhomogeneous heat distribution.

For the moment, I'll leave you merely with the intuitive basis of that claim: there are, simply put, more ways of having 'fast' and 'slow' gas atoms (black and white marbles) evenly distributed in the box (leading to an overall intermediate average speed---greyness), than there are ways of having all the 'fast' atoms in part A and all the 'slow' ones in part B. Thus, starting out with such an 'uneven' distribution, we would expect, given the prior discourse, to observe, typically, an evening out of the distribution of heat---which entails, thus, heat flowing from the hotter to the colder part, purely on the same basis as in the marble-example, where 'darkness' flows from the blacker part to the lighter one, to yield an overall grey result.
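
If it helps, here is a toy simulation of exactly that intuition (a sketch: 'atoms' are just numbers standing for speeds, and each step swaps one randomly chosen atom between the two halves):

Code:
import random

random.seed(1)
left = [2.0] * 50   # fast atoms: the hot half, A
right = [1.0] * 50  # slow atoms: the cold half, B
for t in range(2001):
    if t % 500 == 0:
        # the average speed plays the role of the temperature here
        print(t, sum(left) / len(left), sum(right) / len(right))
    i, j = random.randrange(len(left)), random.randrange(len(right))
    left[i], right[j] = right[j], left[i]
# the two averages drift toward each other: 'heat' flows from hot to
# cold purely because mixed arrangements vastly outnumber sorted ones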

I'm going to stop here for the moment, though, because experience teaches me you're not going to accept this simply at face value (on the other hand, if you feel inclined to do so, please don't feel obliged to further your inquiry just for my sake!). So let's see what we bump into now.