  #51  
Old 05-01-2019, 10:08 AM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by Half Man Half Wit View Post
OK, so you actually think Boltzmann misunderstood his own theory in a naive way you're able to spot immediately, but no scientist in the century since has managed to. That's confidence, I like it!

So, what do you think about Feynman's example from his Lectures on Physics, typically considered to be one of the best resources for teaching physics available?


Or, what do you make of the following quote of Maxwell (from his Theory of Heat)?


And what do you make of Eddington's words in his entertainingly titled "The End of the World: from the Standpoint of Mathematical Physics"?


I don't intend to bludgeon you over the head with these quotes from the big shots (well, maybe a little, but gently). But I do want to know if this doesn't at least hint to you that maybe, you've got it wrong somehow---that it's not everybody else (and I could dig up innumerably more similar quotes) who misunderstands, but rather, you? Or, failing that, do you at least recognize that I'm not saying something wildly out there, but merely, what's been the consensus ever since the statistical mechanics foundation of thermodynamics was proposed?
The argumentum ad populum certainly makes me wonder whether I have it wrong. That's why I made this thread, not to convince you but to convince me. I can't seem to understand why I am wrong. It seems their arguments all rely on a fundamental premise which I am not yet convinced to accept. As you stated in post #8, this premise is the equiprobability of micro-states.

So far I can accept that fundamental premise as a heuristic, but not as an absolute physical law. The consequences you say would follow if the premise were false don't seem so certain to me; see posts #13 and #14 and the related discussions, which are ongoing.

I will add all of those cites to my reading list, but I can't guarantee when I will get around to reading them. And as always, I will admit my ignorance of modern or even older sciences in an instant.

~Max
  #52  
Old 05-01-2019, 10:27 AM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by Half Man Half Wit View Post
Quote:
Originally Posted by Max S. View Post
I am of the opinion that, by measuring subdivisions within a system you have divided that system into two sub-systems.
Well, you haven't. All you've done is introduce what's often called a coarse-grained set of observables, analogous to thermodynamic quantities such as temperature, volume, and pressure---quantities that take account of the state of the system in aggregated form, without being sensitive to microscopic details.

...

Quote:
Originally Posted by Max S. View Post
If we are to measure the number of billiard balls on the left side of the table, that number is neither a property of the table nor a property of an individual billiard ball.
It's a property of the system of billiard balls on the table. Temperature, likewise, isn't a property of the room a gas is in, nor of any individual molecule.
I would like to press this issue. When dealing with a thermodynamic system, I thought properties of the system must represent the system as a whole, dependent on the internal state but not necessarily on a specific internal state.

Specifically, you should be able to measure a system property without ascertaining the internal state. Would you agree?

Quote:
Originally Posted by Half Man Half Wit View Post
Quote:
Originally Posted by Max S. View Post
Thus, I conclude you have designated the left and right sides of the table as their own systems, and the two of them combined comprise the billiard table as a whole.
No. I'm just describing the table at different levels of abstraction. The total number of billiard balls gives me relatively little information about the state of the system; the number of billiard balls per half a little more; and the configuration of billiard balls in the subdivided areas yet more. You can view the entropy of a given macrostate as the information we lack about the microstate, if that helps.
My thought is that the number of balls on the left half of the table belongs to a mid-layer of abstraction. A property of the system need not reveal much specific information about the system's internal state; in fact, it shouldn't reveal too much. Otherwise I might just as well define the position of ball 1 as a property of the system, the momentum of ball 1, the position of ball 2, etcetera. The more specific we allow system properties to be, the less generic our system becomes and the less useful our conclusions are. In computer science this genericity of systems (classes) is called polymorphism.

~Max

Last edited by Max S.; 05-01-2019 at 10:29 AM. Reason: fixed links
  #53  
Old 05-01-2019, 10:42 AM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by Half Man Half Wit View Post
Sorry if I wasn't clear regarding my definitions. A macrostate takes account of the system in terms of coarse-grained observables, like temperature, pressure, and volume, without being sensitive to the microscopic details.

The coarse-grained observables I have introduced in the billiard ball example are 'number of billiard balls in the left half' and 'number of billiard balls in the right half'. The microstate is given by the arrangement of billiard balls in the subdivided areas.
This definition is not acceptable to me because, when looking at a system, I cannot draw a hard line between what is a macroscopic property and a microscopic property. Is the weight of paper in a printing machine a macroscopic property of the machine, or a microscopic property? The number of numpad keys on a keyboard? The number of files in the top drawer of my filing cabinet? The aggregate color of paint in the top half of a canister filled with two half-mixed paints?

I propose redefining macroscopic to mean "of the whole" and microscopic to mean "of the constituent parts". That way the answers are clear-cut.

~Max
  #54  
Old 05-01-2019, 11:05 AM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by Half Man Half Wit View Post
Quote:
Originally Posted by Max S.
The second law of thermodynamics only concerns net heat flow to and from a system, as if a Carnot engine existed along the border. It does not apply to the internal state of a system until you define the internal state as two or more sub-systems.

~Max
That's simply wrong. As the first sentence of the relevant wiki entry puts it:
Quote:
Originally Posted by wikipedia
The second law of thermodynamics states that the total entropy of an isolated system can never decrease over time.
Nothing about net heat flow, applies to the internal state without dividing anything up. (Of course, you can state the second law in terms of heat flow, but that's not the only formulation.)
I did not include that form of the second law of thermodynamics in the original post. I did this because the definition of entropy I was familiar with, dS = δQ/T, depends on the other form of the second law (Clausius or Kelvin) being true. So I don't really consider the entropy definition to be the second law of thermodynamics, especially since temperature, and therefore entropy, is undefined for isolated systems not in equilibrium.

You can use the statistical definition of entropy and claim the entropy form of the second law of thermodynamics is a law. But this is setting yourself up for failure - the statistical definition of entropy depends on a fundamental premise of equiprobable microstates. That same premise is strictly incompatible with the second law of thermodynamics in any form, including the entropy form.

~Max
  #55  
Old 05-01-2019, 11:39 AM
Half Man Half Wit's Avatar
Half Man Half Wit is offline
Guest
 
Join Date: Jun 2007
Posts: 6,672
Quote:
Originally Posted by Max S. View Post
The argumentum ad populum certainly makes me wonder whether I have it wrong. That's why I made this thread, not to convince you but to convince me. I can't seem to understand why I am wrong. It seems their arguments all rely on a fundamental premise which I am not yet convinced to accept. As you stated in post #8, this premise is the equiprobability of micro-states.

So far I can accept that fundamental premise as a heuristic, but not as an absolute physical law.
It's not a physical law; it's the absence of a law stating anything else. If you have five differently colored marbles in a jar, and you draw one, the odds are 1/5 that you'll get any particular color. If you find that this equiprobability is violated, you'll start looking for a cause: if, say, you draw the red marble 50% of the time, something must make it so that this marble is drawn more often.

We can consider this as a physical system with five states available to it. These states are equiprobable, not because that's a physical law, but because there's no law, nor any other cause, that would make them not be.

That's what I'm trying to get at with the billiard balls. Barring special initial conditions, such as carefully-arranged periodic paths, which would in themselves be something in need of explanation, whenever you look at the table, for a generic starting configuration (that is, balls set in motion with arbitrary initial velocities), you're equally likely to find it in any of the possible microstates available to it.

If you want to set it up differently, you'd have to add some further element---either carefully set up initial conditions, or further forces acting on the balls to prohibit certain states/change their likelihood, and so on. But with just billiard balls bouncing off of each other, you can be certain you'll observe a decrease in entropy eventually.
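As a toy illustration of why the near-even macrostates dominate (a sketch I'm adding here, assuming only that every left/right microstate of N distinguishable balls is equally likely):

```python
from math import comb

def macrostate_probs(n_balls):
    """P(k balls in the left half) when each of the 2**n_balls
    left/right microstates is equally likely: the number of
    microstates realizing macrostate k is the binomial C(n, k)."""
    total = 2 ** n_balls
    return {k: comb(n_balls, k) / total for k in range(n_balls + 1)}

probs = macrostate_probs(16)
print(probs[0])   # all-left macrostate: exactly 1 of 65536 microstates
print(probs[8])   # even split: 12870 of 65536 microstates, ~0.196
```

So even for just 16 balls, the maximally uneven macrostate is nearly 13,000 times less likely than the even split; for gas-sized N the ratio becomes astronomical.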

Quote:
Originally Posted by Max S. View Post
I would like to press this issue. When dealing with a thermodynamic system, I thought properties of the system must represent the system as a whole, dependent on the internal state but not necessarily on a specific internal state.

Specifically, you should be able to measure a system property without ascertaining the internal state. Would you agree?
I don't know what you mean by that. What's the internal state?

Thermodynamic quantities are aggregate observables, which tell you something about the state of the system, but not all you could know. It's a statistical description if you have limited knowledge about the system.

Quote:
My thought is that the number of balls on the left half of the table belongs to a mid-layer of abstraction.
'Mid-layer' simply isn't a meaningful term here. You can consider any system at different levels of exactness; the quantities you use to describe it, including the entropy, will depend on the level you're considering it at.

Quote:
Originally Posted by Max S. View Post
This definition is not acceptable to me because, when looking at a system, I cannot draw a hard line between what is a macroscopic property and a microscopic property.
A microscopic property gives you the full information about the system. A macroscopic one doesn't, but instead, refers to the system in aggregate. That is, the microstate of an ideal gas would be the positions and momenta of all the particles; a macrostate is anything that leaves out some of that information.

Quote:
Originally Posted by Max S. View Post
I propose redefining macroscopic to mean "of the whole" and microscopic to mean "of the constituent parts". That way the answers are clear-cut.

~Max
Neither 'whole' nor 'parts' have any objective meaning, though. Is the whole the whole table? If so, are its parts the legs and the top, or the atoms that constitute them? If it's the atoms, is the whole they form the leg they're part of, or the table the leg is a part of?

Quote:
Originally Posted by Max S. View Post
I did not include that form of the second law of thermodynamics in the original post. I did this because the definition of entropy I was familiar with, dS = δQ/T, depends on the other form of the second law (Clausius or Kelvin) being true. So I don't really consider the entropy definition to be the second law of thermodynamics, especially since temperature, and therefore entropy, is undefined for isolated systems not in equilibrium.
The microscopic definition is the more fundamental one, though. The one usually considered in thermodynamics only really holds in the thermodynamic limit of infinite systems.

Quote:
You can use the statistical definition of entropy and claim the entropy form of the second law of thermodynamics is a law. But this is setting yourself up for failure - the statistical definition of entropy depends on a fundamental premise of equiprobable microstates. That same premise is strictly incompatible with the second law of thermodynamics in any form, including the entropy form.

~Max
Well, if you hold that the second law should, for whatever reason, be always and universally valid, then yes, I suppose that must be the conclusion to come to. But that's of course question-begging: there's no reason beyond you wishing it to be so to suppose that the second law should be universally valid. In fact, what Boltzmann did with his formulation of statistical mechanics was to show that the second law emerges as an approximate notion from more fundamental dynamics we were previously ignorant of; given that, it's no wonder that the second law isn't universally valid.
  #56  
Old 05-01-2019, 01:12 PM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by Half Man Half Wit View Post
It's not a physical law; it's the absence of a law stating anything else. If you have five differently colored marbles in a jar, and you draw one, the odds are 1/5 that you'll get any particular color. If you find that this equiprobability is violated, you'll start looking for a cause: if, say, you draw the red marble 50% of the time, something must make it so that this marble is drawn more often.

We can consider this as a physical system with five states available to it. These states are equiprobable, not because that's a physical law, but because there's no law, nor any other cause, that would make them not be.
In a logical sense there is no default interpretation; every logical truth must be axiomatic or derived from axioms. The equiprobability of microstates is an axiom. Either it is absolute (a law) or it is not (a heuristic).

It is fine in philosophy to say reality is fundamentally based on heuristics rather than absolute physics. You can build consistent physics on top of heuristics, but the trade-off is predictive value at the lowest levels. Causality and scientific laws (including laws of conservation) are reduced to heuristics, because at the fundamental level changes in state are random. On the other hand, you can no longer support strong physicalism - the view that all phenomena can be explained by physics. To do so you need to contort the word "explained" to include pointing to this axiom that says it's fundamentally random. You can do that, but it's counter-intuitive and leads to... interesting philosophical conclusions.

Is that consistent with your position? With the positions of all those scientists? I thought you advocated a form of strong physicalism in the dualism thread.

~Max
  #57  
Old 05-01-2019, 01:46 PM
septimus's Avatar
septimus is offline
Guest
 
Join Date: Dec 2009
Location: The Land of Smiles
Posts: 19,119
@ Max — I've been trying to follow the conversation, but with growing confusion.

Quote:
Originally Posted by septimus View Post
Unlike the Laws of Newton, Maxwell and Einstein, the Second Law isn't a dynamical law; it's just a statistical fact, closely akin to the Law of Large Numbers.
Assuming you thought I knew what I was talking about, would that sentence change your viewpoint?
  #58  
Old 05-01-2019, 04:09 PM
Half Man Half Wit's Avatar
Half Man Half Wit is offline
Guest
 
Join Date: Jun 2007
Posts: 6,672
Quote:
Originally Posted by Max S. View Post
In a logical sense there is no default interpretation; every logical truth must be axiomatic or derived from axioms. The equiprobability of microstates is an axiom. Either it is absolute (a law) or it is not (a heuristic).
You're confusing the world and its description there. Axioms, laws and the like don't make anything happen, they describe what happens.

But anyway. What's the law that makes it such that the probability of five colored marbles in an urn comes out 1/5 for each color? Nothing beyond their mere number. Likewise with the microstates. You essentially want to say that there needs to be a separate law dictating these probabilities, and that they could just as well come out another way.

Anyway, I've actually shown you how the equiprobability can be derived. For an ensemble of billiard balls, for generic initial conditions (i. e. initial conditions not set up to lead to special, periodic motions, which form a measure zero subset of all possible initial conditions), it will be the case that the time spent in any possible microstate will be the same. Hence, there's an even chance of finding the system in any given microstate. And that means that entropy will tend to increase if you start in some low-entropy initial state, but may also decrease, with very low probability. That probability can be calculated via the fluctuation theorem:
Quote:
Originally Posted by wikipedia
While the second law of thermodynamics predicts that the entropy of an isolated system should tend to increase until it reaches equilibrium, it became apparent after the discovery of statistical mechanics that the second law is only a statistical one, suggesting that there should always be some nonzero probability that the entropy of an isolated system might spontaneously decrease; the fluctuation theorem precisely quantifies this probability.
The predictions of the fluctuation theorem have actually been tested, both using computer simulations, and in the laboratory:

Quote:
Originally Posted by wikipedia
The first laboratory experiment that verified the validity of the FT was carried out in 2002. In this experiment, a plastic bead was pulled through a solution by a laser. Fluctuations in the velocity were recorded that were opposite to what the second law of thermodynamics would dictate for macroscopic systems.
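For concreteness, the fluctuation theorem can be stated compactly. In the Evans-Searles form (stated here from memory; conventions for the entropy production vary between papers), with σ̄_t the entropy production rate averaged over a time t:

```latex
\frac{P(\bar{\sigma}_t = A)}{P(\bar{\sigma}_t = -A)} = e^{A t}
```

Positive entropy production is exponentially more probable than the corresponding negative fluctuation, and the suppression of "reversals" grows with both observation time and system size, which is why they are only seen in very small systems over very short times.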
Quote:
Originally Posted by Max S. View Post
Causality and scientific laws (including laws of conservation) are reduced to heuristics, because at the fundamental level changes in state are random. On the other hand, you can no longer support strong physicalism - the view that all phenomena can be explained by physics.
That isn't what this leads to, and it's wrong besides. Current physics is, by mainstream opinion, indeterministic---random at the base. That doesn't entail anything regarding the possibility of physicalism.

By the way, is your username just a happy accident, or are you 'Max S.' exactly because of your belief that entropy always only tends to the max?

Last edited by Half Man Half Wit; 05-01-2019 at 04:11 PM.
  #59  
Old 05-01-2019, 04:14 PM
DPRK is offline
Guest
 
Join Date: May 2016
Posts: 3,033
Quote:
Originally Posted by Half Man Half Wit View Post
What formulation of the second law are you referring to? Because I think it's entirely reasonable, as e. g. Maxwell also does in the quote above, to speak of a violation of the second law when entropy fluctuates downward.
If Maxwell says that the Second Law is continually being violated for any sufficiently small group of molecules, then we could take him at his word that what he has in mind strictly applies in the thermodynamic limit where the number of molecules is (effectively) infinite, yet not be surprised that 3 gas particles could all end up on one side of a room. 3 or 1000 is "sufficiently small", yet we don't hold our breath (heh) for any detectable decrease of entropy of a roomful of air.

At any rate, I think it is quantitatively clear what is going on, especially for the classical systems statistically studied by Maxwell, Boltzmann, Gibbs, et al.
  #60  
Old 05-01-2019, 05:02 PM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by septimus View Post
@ Max — I've been trying to follow the conversation, but with growing confusion.
Quote:
Originally Posted by septimus View Post
Unlike the Laws of Newton, Maxwell and Einstein, the Second Law isn't a dynamical law; it's just a statistical fact, closely akin to the Law of Large Numbers.
Assuming you thought I knew what I was talking about, would that sentence change your viewpoint?
If I were to assume as premises:
  • all microstates of a system are equiprobable
  • the entropy of a system is (proportional to the logarithm of) the count of microstates that express a given macrostate
  • the second law of thermodynamics states: "the entropy of an isolated system can never decrease over time"
Then the second law of thermodynamics is, in theory, routinely violated in tiny systems. At which point the law is reduced to a statistical fact, "the entropy of an isolated system is unlikely to decrease over time".
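To put a number on "tiny" versus macroscopic (my own sketch, assuming those premises, with each molecule independently and equally likely to be in either half):

```python
def p_all_left(n_particles):
    """Probability of the extreme macrostate in which every
    particle sits in the left half, under the equiprobability
    premise: one microstate out of 2**n_particles."""
    return 0.5 ** n_particles

print(p_all_left(3))    # 0.125 - routine for a 3-particle "system"
print(p_all_left(100))  # ~7.9e-31 - never expected in practice
```

That is the sense in which the law would be "routinely violated" in tiny systems, yet remain a near-certainty for anything macroscopic.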

I can't wrap my head around the equiprobability of microstates, plus I was using other definitions of the second law of thermodynamics. So I haven't reached that conclusion yet.

~Max
  #61  
Old 05-01-2019, 05:12 PM
Lumpy's Avatar
Lumpy is offline
Charter Member
 
Join Date: Aug 1999
Location: Minneapolis, Minnesota US
Posts: 16,352
Even if 2nd law reversals happen, it's still not possible to use these to do work, because even if you had some mechanism that could supposedly exploit such reversals, it's equally possible that random fluctuations could undo any temporary gains. It's sort of like blackjack: you can get a small temporary reversal against the dealer but in the long run you always lose.
  #62  
Old 05-01-2019, 05:36 PM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by Half Man Half Wit View Post
But anyway. What's the law that makes it such that the probability of five colored marbles in an urn comes out 1/5 for each color? Nothing beyond their mere number.
The law would go something like this: "There is an equal probability that you will draw any particular marble from an urn." I think the generalized law is the law of total probability, combined with a discrete uniform distribution. This in turn depends on the marbles being picked randomly.

~Max
  #63  
Old 05-01-2019, 05:40 PM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by Lumpy View Post
Even if 2nd law reversals happen, it's still not possible to use these to do work, because even if you had some mechanism that could supposedly exploit such reversals, it's equally possible that random fluctuations could undo any temporary gains. It's sort of like blackjack: you can get a small temporary reversal against the dealer but in the long run you always lose.
Unless the system starts in a high entropy state, then randomly jumps to a low entropy state, and by purely random chance stays in the low entropy state forever.

The laws of probability are only useful for prediction to an extent - if the only constraint is probability, it is theoretically possible for every coin toss to come up tails forever.

~Max
  #64  
Old 05-01-2019, 06:03 PM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by Half Man Half Wit View Post
Anyway, I've actually shown you how the equiprobability can be derived. For an ensemble of billiard balls, for generic initial conditions (i. e. initial conditions not set up to lead to special, periodic motions, which form a measure zero subset of all possible initial conditions), it will be the case that the time spent in any possible microstate will be the same. Hence, there's an even chance of finding the system in any given microstate.
Here's what I don't understand. Is every microstate equiprobable at every moment? Is the probability of a microstate at time t constrained by the microstate at time t-1?

If so, let's say the billiard table starts out with an initial microstate that would not classically lead to periodic motion. The distribution of billiard balls is somewhat homogeneous. Step forward in time one moment. Surely there is a chance that the arrangement of billiard balls is now a special microstate which would classically lead to periodic motion. Even if this isn't the state now, at some point you would expect the arrangement of billiard balls to reach such a special microstate. The reverse holds true - one moment the system can have a special microstate and the next moment it might not. This is where I find the disconnect between statistical mechanics and causality. The effect is that all the "dynamic" laws of physics mentioned by septimus are reduced to heuristics. There is a chance, however minuscule, that the eight ball will randomly jump to the opposite side of the table faster than the classical speed of light in a vacuum.

If not, then only the initial microstate of the table is random, and then only for the purposes of the hypothetical. From there on out everything follows classical laws and the second law of thermodynamics as classically defined is never violated. We would still use statistical mechanics to describe systems where the internal state cannot be determined, but that is just a heuristic. The underlying reality would be classical.

~Max
  #65  
Old 05-01-2019, 06:48 PM
DPRK is offline
Guest
 
Join Date: May 2016
Posts: 3,033
Yes, the classical billiards studied by Boltzmann exhibit Poincaré recurrence, which may seem surprising, but it goes to show that one needs to be careful about how to formulate and interpret the second law of thermodynamics for a finite number of particles, as discussed above, and about how it technically arises, as in Boltzmann's H-theorem.
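The standard toy model for exactly this tension is the Kac ring; the sketch below is my own illustration, not anything from Boltzmann. The dynamics is fully deterministic and reversible, the color imbalance (a stand-in for low entropy) relaxes toward zero, and yet the exact initial state provably recurs after 2N steps:

```python
import random

def kac_ring(n_sites=100, n_marked=30, seed=1):
    """Deterministic, reversible Kac ring dynamics.

    n_sites balls of color +1/-1 sit on a ring; n_marked edges are
    'active'. Each step, every ball moves one site clockwise and
    flips color when it crosses an active edge. Returns the
    trajectory of the total color imbalance, and whether the exact
    initial state recurred after 2 * n_sites steps."""
    rng = random.Random(seed)
    marked = set(rng.sample(range(n_sites), n_marked))
    colors = [1] * n_sites            # low-entropy start: all +1
    initial = list(colors)
    imbalance = [sum(colors)]
    for _ in range(2 * n_sites):      # run out to the recurrence time
        colors = [(-c if i in marked else c) for i, c in enumerate(colors)]
        colors = colors[-1:] + colors[:-1]   # rotate one site clockwise
        imbalance.append(sum(colors))
    return imbalance, colors == initial

trajectory, recurred = kac_ring()
print(trajectory[0])   # 100: fully ordered start
print(trajectory[1])   # 40 = 100 - 2*30: imbalance drops immediately
print(recurred)        # True: exact recurrence at t = 2N
```

In between, the imbalance behaves much as an H-theorem argument predicts, which is the point: macroscopic irreversibility and exact microscopic recurrence coexist in one deterministic system.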
  #66  
Old 05-02-2019, 12:08 AM
Half Man Half Wit's Avatar
Half Man Half Wit is offline
Guest
 
Join Date: Jun 2007
Posts: 6,672
Quote:
Originally Posted by Max S. View Post
The law would go something like this: "There is an equal probability that you will draw any particular marble from an urn." I think the generalized law is the law of total probability, combined with a discrete random distribution. This in turn depends on the marbles being picked randomly.

~Max
Right. So if I present you with two urns, each having five balls of different color inside, and ask you to pick from both; and you find that, picking from one, you get colors in proportion to their rate of occurrence, each 1/5 of the time, and from the other, you don't, but, say, obtain red 50% of the time, that wouldn't strike you as odd?

Quote:
Originally Posted by Max S. View Post
Here's what I don't understand. Is every microstate equiprobable at every moment? Is the probability of a microstate at time t constrained by the microstate at time t-1?
Fully, of course. These are deterministic systems: the microstate at t-1 completely fixes the microstate at t (and, in fact, at all future and past times, if the system just evolves in isolation).

The probabilities in statistical mechanics stem from ignorance. We don't have full knowledge of the microstate, and thus, we don't have full control over it. Setting a large enough system up always means drawing a certain microstate from a distribution consistent with the macroscopic variables we can control. And, for virtually all (all but an impossibly tiny subset) of these initial states, if we wait long enough, we're gonna see entropy dipping down. As has in fact happened in the laboratory.

If you throw a die, it'll end up showing any given number randomly with probability 1/6. If you trust that's what's going to happen, you should not have any trouble accepting the equiprobability of microstates---because it's exactly that. If the die fails to behave that way, at least in the limit of large numbers of throws, you'll call it unfair---you'll assume it has been tampered with in some way. That you're justified in concluding this is exactly because anything other than equiprobability would require some special intervention.
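That limiting behavior is easy to check numerically (a seeded sketch, so the run is reproducible):

```python
import random
from collections import Counter

def die_frequencies(n_throws=60_000, seed=42):
    """Empirical frequency of each face of a simulated fair die."""
    rng = random.Random(seed)
    counts = Counter(rng.randint(1, 6) for _ in range(n_throws))
    return {face: counts[face] / n_throws for face in range(1, 7)}

freqs = die_frequencies()
# Every frequency lands close to 1/6 ~ 0.1667; a persistent large
# deviation would be evidence the die had been tampered with.
```

Nothing forces the 1/6 here beyond the absence of anything favoring one face, which is exactly the situation with microstates.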

Quote:
If so, let's say the billiard table starts out with an initial microstate that would not classically lead to periodic motion. The distribution of billiard balls is somewhat homogenous. Step forward in time one moment. Surely there is a chance that the arrangement of billiard balls is now a special microstate which would classically lead to periodic motion.
No. If it's in a periodic microstate at t, it was in a periodic microstate at t-1, because the state at t-1 completely dictates the state at t. (Of course, as DPRK rightly points out, for any finite system, there is always at least Poincaré periodicity, but the sorts of periodic movement relevant here are cycles on much shorter timescales that don't visit every possible microstate.)

Quote:
If not, then only the initial microstate of the table is random, and then only for the purposes of the hypothetical. From there on out everything follows classical laws and the second law of thermodynamics as classically defined is never violated.
No. For almost any microstate of the system, classical, deterministic evolution of its constituents will lead to lower-entropy states after having visited higher-entropy states. In the limit of continuous variables, this happens with probability 1 (i. e. the microstates for which this isn't true form a measure zero subset of all microstates you could set up initially).
  #67  
Old 05-02-2019, 01:58 AM
Mijin's Avatar
Mijin is offline
Guest
 
Join Date: Feb 2006
Location: Shanghai
Posts: 8,939
Quote:
Originally Posted by Lumpy View Post
Even if 2nd law reversals happen, it's still not possible to use these to do work, because even if you had some mechanism that could supposedly exploit such reversals, it's equally possible that random fluctuations could undo any temporary gains. It's sort of like blackjack: you can get a small temporary reversal against the dealer but in the long run you always lose.
This doesn't follow.

The reason we can't use entropy reversals to do useful work is that the chance of a thimbleful of water spontaneously developing a 1 °C temperature gradient is probably such that you'd need to wait many times the universe's current age to get even odds (I haven't done the math though).

But were such a fluctuation to happen, there's no reason it needs to behave any differently from any other heterogeneous material at that point. There's no reason why the system should "know" it needs to leap back to some former state, and so no obvious reason you couldn't do work.


Last edited by Mijin; 05-02-2019 at 02:01 AM.
  #68  
Old 05-02-2019, 02:08 AM
Half Man Half Wit's Avatar
Half Man Half Wit is offline
Guest
 
Join Date: Jun 2007
Posts: 6,672
And this sort of thing, of course, routinely happens in the right circumstances. As per the above-cited wiki article on the fluctuation theorem:

Quote:
There are many important implications from the Fluctuation Theorem. One is that small machines (such as nanomachines or even mitochondria in a cell) will spend part of their time actually running in "reverse". What we mean with "reverse" is that it is possible to observe that these small molecular machines are able to generate work by taking heat from the environment.
  #69  
Old 05-02-2019, 03:15 AM
septimus's Avatar
septimus is offline
Guest
 
Join Date: Dec 2009
Location: The Land of Smiles
Posts: 19,119
Quote:
Originally Posted by Half Man Half Wit View Post
And this sort of thing, of course, routinely happens in the right circumstances. As per the above-cited wiki article on the fluctuation theorem:
The claim about mitochondria seems amazing, though I don't really understand it. The only citation in that Wikipedia section is to this 2006 paper. I hope someone above my paygrade will determine if that paper even makes the claim the above-quoted Wiki summary shows.
  #70  
Old 05-02-2019, 09:06 AM
Half Man Half Wit's Avatar
Half Man Half Wit is offline
Guest
 
Join Date: Jun 2007
Posts: 6,672
Quote:
Originally Posted by septimus View Post
The claim about mitochondria seems amazing, though I don't really understand it. The only citation in that Wikipedia section is to this 2006 paper. I hope someone above my paygrade will determine if that paper even makes the claim the above-quoted Wiki summary shows.
I searched the text, and the word 'mitochondria' doesn't appear. That's all the effort I'm really willing to expend on that...

On another note: Max S., what do you make of papers like this one?

Experimental Demonstration of Violations of the Second Law of Thermodynamics for Small Systems and Short Time Scales

Do you think the experimental methodology was suspect? Are they misinterpreting their results?
  #71  
Old 05-02-2019, 09:15 AM
Nava is offline
Member
 
Join Date: Nov 2004
Location: Hey! I'm located! WOOOOW!
Posts: 41,190
I haven't looked at the paper, but my 20€ are on "improperly isolated systems".
  #72  
Old 05-02-2019, 09:26 AM
Half Man Half Wit's Avatar
Half Man Half Wit is offline
Guest
 
Join Date: Jun 2007
Posts: 6,672
Quote:
Originally Posted by Nava View Post
I haven't looked at the paper, but my 20€ are on "improperly isolated systems".
So how do I collect?
  #73  
Old 05-02-2019, 10:00 AM
Nava is offline
Member
 
Join Date: Nov 2004
Location: Hey! I'm located! WOOOOW!
Posts: 41,190
First you begin by proving that the systems are properly defined and truly separated from any contact with the rest of the universe, then we talk ISBNs.
__________________
Evidence gathered through the use of science is easily dismissed through the use of idiocy. - Czarcasm.
  #74  
Old 05-02-2019, 10:30 AM
DPRK is offline
Guest
 
Join Date: May 2016
Posts: 3,033
That experiment purports to demonstrate the applicability of the Fluctuation Theorem, which, like Boltzmann's theorem, is a mathematical theorem resting on certain assumptions. It concerns the entropy production rate of certain classical dynamical systems, and it is genuinely interesting to put to experimental test because, we should not forget, all this mathematical discussion of idealized billiard balls and the like is supposed to have measurable and testable implications for real physical systems, such as the Second Law of thermodynamics, the subject of this thread.

Back to the Fluctuation Theorem: it gives, for a system (satisfying certain assumptions...) that is not at equilibrium, so that its entropy could either increase or decrease(?!), the ratio between the probability of positive and the probability of negative entropy production over any interval of time; positive production is exponentially more likely, by a factor that grows with both the size of the fluctuation and the length of the interval. In particular, it can happen (though this is only at all likely in a small volume and over a short interval) that the observed entropy production is negative. However, note that the average entropy production, no matter how small the volume or how short the observation time, is always positive. In the authors' own words, "the [Fluctuation Theorem]... is completely consistent with the Second Law of Thermodynamics, and it can be considered a generalization of this Law to small systems observed for short periods of time."
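The exponential asymmetry the theorem describes shows up exactly in a toy model (my own illustration, not from the paper): take a biased "driven" coin with forward probability p. The entropy produced by a trajectory with k forward steps out of n is (2k - n)·ln(p/q) in units of k_B, and the ratio of the probabilities of opposite-sign fluctuations is exactly e^sigma:

```python
from math import comb, exp, isclose, log

p, q, n = 0.7, 0.3, 20      # toy driven system: a biased coin, 20 steps

def prob(k):
    """Probability of exactly k forward steps out of n."""
    return comb(n, k) * p**k * q**(n - k)

for k in range(11, n + 1):
    sigma = (2 * k - n) * log(p / q)    # entropy production, units of k_B
    ratio = prob(k) / prob(n - k)       # P(+sigma) / P(-sigma)
    assert isclose(ratio, exp(sigma))   # the fluctuation relation, exactly
```

Negative-production trajectories exist, but are exponentially suppressed; averaged over all trajectories the production is positive, consistent with what's said above.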

So, while one should of course critically examine the experimental setup (which is described in the paper, involving 100 latex particles in a 1.0 ml glass cell full of water), the reported results are completely consistent with the mathematical model, precisely the same type of mathematical model considered and used by Boltzmann and others to derive the Second Law of thermodynamics (which, again, is in fact observed in real life; if it be "routinely violated" then I have overlooked something in this discussion).
  #75  
Old 05-02-2019, 10:39 AM
DPRK is offline
Guest
 
Join Date: May 2016
Posts: 3,033
Quote:
Originally Posted by Nava View Post
First you begin by proving that the systems are properly defined and truly separated from any contact with the rest of the universe, then we talk ISBNs.
Before you cough up the dough: the system consists of latex powder in water in a glass cell, being zapped with a laser and observed through a microscope. In their computer simulation, particles along the walls act as a momentum and heat source/sink. So it depends what you mean by "isolated"... (cf. the different possible "ensembles" in statistical mechanics)
  #76  
Old 05-02-2019, 12:23 PM
Half Man Half Wit's Avatar
Half Man Half Wit is offline
Guest
 
Join Date: Jun 2007
Posts: 6,672
Quote:
Originally Posted by Nava View Post
First you begin by proving that the systems are properly defined and truly separated from any contact with the rest of the universe, then we talk ISBNs.
Well, no. It's a published, peer reviewed scientific study in one of the highest regarded journals in physics; if you want to claim it's wrong, the burden of proof lies squarely on your end. I don't really think you could want it to be acceptable to claim, confronted with such a study, that it's just wrong until proven right.

Besides, it's 'just' an experimental confirmation of a well-studied theoretical result, which came out just as theory predicted. To have either the theory wrong, and the experiment as well, in just the right way to 'confirm' the theoretical prediction, or to have the experiment mistakenly show the predicted theoretical effect, would both be considerably less parsimonious explanations than just accepting that the experiment simply observed what it was set up to observe (and again, to assign validity to such rebuttals against scientific work would likewise require assigning validity to all manner of creationist, climate-skeptic, and other quackery).

DPRK, I think we have an issue of terminology here, so let's try to sort that out. The story I was taught is as follows. In studying the theoretical capacity of heat engines, people like Carnot and Clausius came up with certain laws bounding their efficiency, like the second law of thermodynamics. This can be stated in the form 'for an isolated system, entropy can never decrease'. This was mainly based on empirical evidence.

Later on, Boltzmann, Gibbs, Maxwell and so on formulated kinetic theory, which was able to derive the earlier laws as statistical predictions, valid on average, but not, as was thought previously, inviolable. Entropy can, in fact, spontaneously decrease. This, to me, is all that is meant by a violation of the second law---it was thought to be an exact, universally applicable law, but turned out to be merely statistical. (Which, perhaps counterintuitively, is exactly what makes it so---virtually---inviolable: I can imagine universes where things move faster than light, where there's no quantum theory, no weak or strong force, or what have you; but I can't imagine one in which there's no second law, because fundamentally, it just boils down to the fact that more likely things happen more often.)

I think you agree with me that entropy, even in a closed system, may spontaneously decrease, solely by virtue of the effectively random microdynamics. However, you don't want to call that a violation of the second law. Hence, I repeat my earlier question: do you appeal to some different formulation of the second law in that case, one that refers perhaps to the average entropy production?
  #77  
Old 05-02-2019, 12:35 PM
Half Man Half Wit's Avatar
Half Man Half Wit is offline
Guest
 
Join Date: Jun 2007
Posts: 6,672
Quote:
Originally Posted by Half Man Half Wit View Post
DPRK, I think we have an issue of terminology here, so let's try to sort that out. The story I was taught is as follows. In studying the theoretical capacity of heat engines, people like Carnot and Clausius came up with certain laws bounding their efficiency, like the second law of thermodynamics. This can be stated in the form 'for an isolated system, entropy can never decrease'. This was mainly based on empirical evidence.

Later on, Boltzmann, Gibbs, Maxwell and so on formulated kinetic theory, which was able to derive the earlier laws as statistical predictions, valid on average, but not, as was thought previously, inviolable. Entropy can, in fact, spontaneously decrease. This, to me, is all that is meant by a violation of the second law---it was thought to be an exact, universally applicable law, but turned out to be merely statistical.
I just went back and checked---the textbook from which I was taught statistical mechanics in fact does refer to downward fluctuations of entropy as violations of the second law, so if I'm somehow not up on current parlance there, I at least have an excuse.
  #78  
Old 05-02-2019, 03:49 PM
DPRK is offline
Guest
 
Join Date: May 2016
Posts: 3,033
Quote:
Originally Posted by Half Man Half Wit View Post
I just went back and checked---the textbook from which I was taught statistical mechanics in fact does refer to downward fluctuations of entropy as violations of the second law, so if I'm somehow not up on current parlance there, I at least have an excuse.
Is there some variation among textbooks, then? For instance, in Landau & Lifshitz (hardly current parlance!) we read:
Quote:
If a closed system is at some instant in a non-equilibrium macroscopic state, the most probable consequence at later instants is a steady increase in the entropy of the system. This is the law of increase of entropy or second law of thermodynamics, discovered by R. Clausius (1865); its statistical explanation was given by L. Boltzmann in the 1870s.

In speaking of the "most probable" consequence, we must remember that in reality the probability of transition to states of higher entropy is so enormous in comparison with that of any appreciable decrease in entropy that in practice the latter can never be observed in Nature. Ignoring decreases of entropy due to negligible fluctuations, we can therefore formulate the law of increase of entropy as follows: if at some instant the entropy of a closed system does not have its maximal value, then at subsequent instants the entropy will not decrease; it will increase or at least remain constant.
Note that we are considering macroscopic bodies/systems, not individual particles. Now, even if very improbable downward fluctuations of entropy are to be regarded as a violation of the second law, an important point is that "in practice... in Nature" nobody is ever going to observe such a violation. It wouldn't be much of a Law if they did.

Last edited by DPRK; 05-02-2019 at 03:52 PM.
  #79  
Old 05-02-2019, 09:12 PM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by DPRK View Post
Is there some variation among textbooks, then? For instance, in Landau & Lifshitz (hardly current parlance!) we read:

Note that we are considering macroscopic bodies/systems, not individual particles. Now, even if very improbable downward fluctuations of entropy are to be regarded as a violation of the second law, an important point is that "in practice... in Nature" nobody is ever going to observe such a violation. It wouldn't be much of a Law if they did.
Goodness, my "physics" textbook was some generic mid-2000s Prentice Hall textbook for physical science with pictures of balloons on the front. I don't think it mentioned thermodynamics at all, except how to convert centigrade to Fahrenheit.

~Max
  #80  
Old 05-02-2019, 09:23 PM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by Half Man Half Wit View Post
Right. So if I present you with two urns, each having five balls of different color inside, and ask you to pick from both; and you find that, picking from one, you get colors in proportion to their rate of occurrence, each 1/5 of the time, and from the other, you don't, but, say, obtain red 50% of the time, that wouldn't strike you as odd?
Odd? Very. Impossible? No. I've always been hesitant to assign absolute or near-absolute certainty to probability theory. It's like an educated guess.

You can't build absolute laws on probability, and at least three formulations of the second law of thermodynamics are absolute laws: the three in the original post and the entropy corollary, using the classical definition of entropy. I have yet to be convinced that any of these laws is contradicted by theory or observation.

The only law that might be violated is the entropy version, using the statistical definition of entropy. But it would seem the original proof of that formulation relied on both Clausius's version of the second law and a classical definition of entropy. So I'm not sure the statistical-entropy version of the second law was ever a "law" to begin with.

~Max
  #81  
Old 05-02-2019, 09:26 PM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by Half Man Half Wit View Post
...the microstate at t-1 completely fixes the microstate at t (and, in fact, at all future and past times, if the system just evolves in isolation).
Ah, I think I am beginning to understand. It must have been "random hopping" that threw me off, because all this time I had thought you mean equiprobability of microstates at every instant, independent of microstate at the former instant. Forgive me for the misunderstanding.

I am perfectly fine allowing for equiprobability of the initial microstate of a toy thermodynamic system. Now we can return to the main question: is the second law of thermodynamics routinely violated? Also, which version is violated and why?

~Max
  #82  
Old 05-02-2019, 10:39 PM
DPRK is offline
Guest
 
Join Date: May 2016
Posts: 3,033
Quote:
Originally Posted by Max S. View Post
Goodness, my "physics" textbook was some generic mid-2000s Prentice Hall textbook for physical science with pictures of balloons on the front. I don't think it mentioned thermodynamics at all, except how to convert centigrade to Fahrenheit.
So we shouldn't judge a book by its cover? Because balloons sound pretty thermodynamic. Or should we automatically mistrust any text that doesn't come in plain buckram?
  #83  
Old 05-02-2019, 10:59 PM
DPRK is offline
Guest
 
Join Date: May 2016
Posts: 3,033
Quote:
Originally Posted by Max S. View Post
Ah, I think I am beginning to understand. It must have been "random hopping" that threw me off, because all this time I had thought you mean equiprobability of microstates at every instant, independent of microstate at the former instant. Forgive me for the misunderstanding.
Boltzmann introduced a "molecular chaos" hypothesis to deal with entropy production in time reversible systems. You need some ergodic condition like that to prove his H-theorem, or a Fluctuation theorem.
  #84  
Old 05-02-2019, 11:38 PM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by DPRK View Post
So we shouldn't judge a book by its cover? Because balloons sound pretty thermodynamic. Automatically mistrust any text that doesn't come in plain buckram?
I guess you're right. I remember knowing things like how to read a thermometer, temperature conversion, the three states of matter, melting and boiling points and pressure, conduction vs convection vs radiation, and somehow the parts and process of a four-stroke engine. But I don't remember if that came from class or a video game called Genius Physics.

~Max
  #85  
Old 05-02-2019, 11:59 PM
Half Man Half Wit's Avatar
Half Man Half Wit is offline
Guest
 
Join Date: Jun 2007
Posts: 6,672
Quote:
Originally Posted by DPRK View Post
Is there some variation among textbooks, then? For instance, in Landau & Lifshitz (hardly current parlance!) we read:
Maybe, but L&L there basically say that they'll ignore downward fluctuations, since they're effectively impossible to observe, and formulate the second law on that basis; this isn't in tension with the claim that when such rare fluctuations do occur, the second law is violated.

Quote:
Originally Posted by Max S. View Post
Odd? Very. Impossible? No.
Then tell me how it could be. Tell me how, for a system that's fully described as 'five differently colored balls in an urn' (that is, without adding any ad-hoc explanations, such as 'maybe the red ball is attracted to my hand'), the probability of drawing any given color could be different from 1/5.

Quote:
You can't build absolute laws on probability and at least three formulations of the second law of thermodynamics are absolute laws: the three in the original post and the entropy corollary, using the classical definition of entropy. I have yet to be convinced that any of these laws are contradicted by theory or observation.
So, are you just choosing Nava's route of ignoring experimental results you don't want to accept?

Quote:
The only law that might be violated is the entropy version, using the statistical definition of entropy. But it would seem the original proof of that formulation relied on both Clausius's version of the second law and a classical definition of entropy. So I'm not sure the statistical-entropy version of the second law was ever a "law" to begin with.

~Max
The different formulations are equivalent. If the entropy of a closed system can fluctuate downwards, then a colder body can heat up a hotter one, for example (again, see the experiment I quoted above).

Quote:
Originally Posted by Max S. View Post
Ah, I think I am beginning to understand. It must have been "random hopping" that threw me off, because all this time I had thought you mean equiprobability of microstates at every instant, independent of microstate at the former instant. Forgive me for the misunderstanding.

I am perfectly fine allowing for equiprobability of the initial microstate of a toy thermodynamic system. Now we can return to the main question: is the second law of thermodynamics routinely violated? Also, which version is violated and why?

~Max
Yes, it's violated (albeit, for macroscopic systems, 'routinely' means, as has been stressed, about once in several lifetimes of the universe), all of its versions are, and the reason is that the second law only holds on a statistical basis.

Last edited by Half Man Half Wit; 05-02-2019 at 11:59 PM.
  #86  
Old 05-03-2019, 08:38 AM
doubleminus is offline
Guest
 
Join Date: Jan 2013
Posts: 323
So, going back a little, is Boltzmann's own brain an instance of a Boltzmann brain? And everyone else's brains along with it? That seems far too high a probability, so maybe very improbable states are much too common.

Now, of course, we have evolution, powered by an inflow of energy and acting as an invisible hand directing matter towards preferred states. But this has always seemed to me a weaselly explanation: the inflow of energy is necessary, but by far not sufficient. So my question is a re-framing of the Fermi paradox: is the start of an evolutionary chain a thermodynamic low-entropy random fluctuation? And taking into account the self-sustaining and self-amplifying character of this low-entropy fluctuation, and supposing we could look (really) far into the future, should this have any effect on the homogeneity of the (very) future universe?

Maybe we can start simpler. Let's say that a quirky random fluctuation somewhere near an energy source (a star?) creates something much simpler, like a Maxwell's demon, or just an air conditioner. What happens then to the probability distributions over time in that portion of space?
  #87  
Old 05-03-2019, 10:29 AM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by doubleminus View Post
... a thermodynamic low-entropy random fluctuation ...
Does such a thing exist without assuming equiprobability of microstates at every instant, without regard to previous microstates?

~Max
  #88  
Old 05-03-2019, 10:49 AM
Half Man Half Wit's Avatar
Half Man Half Wit is offline
Guest
 
Join Date: Jun 2007
Posts: 6,672
Quote:
Originally Posted by doubleminus View Post
So, my question is a re-framing of the Fermi paradox: is the start of an evolutionary chain a thermodynamic low-entropy random fluctuation?
There are many interesting ideas relating thermodynamics and the origin of life; but, for the most part, they actually center on life being a very efficient way to increase (overall) entropy, and thus being thermodynamically favored, see e. g. here.

Quote:
Originally Posted by Max S. View Post
Does such a thing exist without assuming equiprobability of microstates at every instant, without regard to previous microstates?

~Max
I sense the sticking point here is the notion of 'random'. These downward fluctuations are random with respect to the macrostate, in the sense that knowing the macrostate does not allow you to predict that a downward fluctuation will occur; but they're completely deterministic at the level of the microstate. Picture something like a gas evenly spread out in a box, with all the gas molecules' velocities aimed towards the center: at the macroscopic level, taking the gas as described by its volume, temperature, and pressure, you see nothing but an equilibrium, with no indication that anything is going to happen except it remaining in equilibrium; but in fact, the gas molecules will bunch up at the center, thus decreasing entropy.
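That picture is easy to simulate (a toy sketch of my own: free-flying particles in one dimension, no collisions). The coarse-grained entropy of the position distribution drops even though every trajectory is perfectly deterministic:

```python
import math
import random

def coarse_entropy(xs, nbins=10):
    """Shannon entropy of the binned (coarse-grained) position distribution."""
    counts = [0] * nbins
    for x in xs:
        counts[min(int(x * nbins), nbins - 1)] += 1
    n = len(xs)
    return -sum(c / n * math.log(c / n) for c in counts if c)

random.seed(0)
xs = [random.random() for _ in range(1000)]   # gas spread evenly over [0, 1]
vs = [0.5 - x for x in xs]                    # every molecule aimed at center

s_before = coarse_entropy(xs)                                    # near ln 10
s_after = coarse_entropy([x + 0.9 * v for x, v in zip(xs, vs)])  # much lower
```

The initial macrostate looks like equilibrium; only the microstate "knows" the entropy is about to drop.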
  #89  
Old 05-03-2019, 11:24 AM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528

Macroscopic vs microscopic


Quote:
Originally Posted by Half Man Half Wit View Post
Quote:
Originally Posted by Max S. View Post
I propose redefining macroscopic to mean "of the whole" and microscopic to mean "of the constituent parts". That way the answers are clear-cut.
Neither 'whole' nor 'parts' have any objective meaning, though. Is the whole the whole table? If so, are its parts the legs and the top, or the atoms that constitute them? If it's the atoms, is the whole they form the leg they're part of, or the table the leg is a part of?
The whole is the whole table, because the thermodynamic system under discussion is the whole table. Both the legs of the table and their individual constituent atoms are "parts" of the whole table. Individual atoms could also be "parts" of the legs of the table, where the leg of the table is a sub-system of the whole table. I'm willing to drop this and go on using your definitions now that you have clarified them.

~Max
  #90  
Old 05-03-2019, 11:34 AM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528

Is the count of particles on the left half of the table a macroscopic property of the whole table?


Quote:
Originally Posted by Half Man Half Wit View Post
Quote:
Originally Posted by Max S. View Post
I would like to press this issue. When dealing with a thermodynamic system, I thought properties of the system must represent the system as a whole, dependent on the internal state but not necessarily on a specific internal state.

Specifically, you should be able to measure a system property without ascertaining the internal state. Would you agree?
I don't know what you mean by that. What's the internal state?

Thermodynamic quantities are aggregate observables, which tell you something about the state of the system, but not all you could know. It's a statistical description if you have limited knowledge about the system.
That is confusing - I know what I meant but let me rephrase:

When dealing with a thermodynamic system, I thought properties of the system must represent the system as a whole, dependent on the microscopic state but not necessarily on a specific microscopic state.

Quote:
Originally Posted by Half Man Half Wit View Post
'Mid-layer' simply isn't a meaningful term here. You can consider any system at different levels of exactness; the quantities you use to describe it, including the entropy, will depend on the level you're considering it at.
You cannot measure half of a thermodynamic system without defining a boundary between the one half and the other. The boundary need not have impermeable walls and it need not be isolated. It could be an imaginary line through the middle of the table. As soon as you define the boundary, you have defined a fully functional "mid-level" thermodynamic sub-system with its own set of thermodynamic state variables, including temperature and entropy and yes, the number of particles.

I say mid-level because our frame of reference is the table as a whole system. The fundamental level would be atomic*, or at least however specific we wish to go. Surely you can understand my description of systems between the frame of reference and the fundamental level as mid-level abstractions, or sub-systems?

* atomic as in indivisible, not necessarily atomic theory

Quote:
Originally Posted by Half Man Half Wit View Post
A microscopic property gives you the full information about the system. A macroscopic one doesn't, but instead, refers to the system in aggregate. That is, the microstate of an ideal gas would be the positions and momenta of all the particles; a macrostate is anything that leaves out some of that information.
By this definition, every macroscopic property belongs to a thermodynamic system. "Macroscopic property" is the same as saying "state variable", as I learned thermodynamics.

Therefore, the number of particles on the left side of the table is not a macroscopic property of the table in aggregate, but a macroscopic property of the left side of the table in aggregate. The reason for this is that determining which particles are on the left side of the table depends on the microscopic positions of those particles. If you can only measure the system as a whole, you cannot observe which particles are on the left side, and therefore that count is not a coarse-grained observable.

If you fail to admit the left side of the table as a thermodynamic system, then by your own definitions there is no such property as "the number of particles on the left side of the table".

~Max
  #91  
Old 05-03-2019, 12:02 PM
Half Man Half Wit's Avatar
Half Man Half Wit is offline
Guest
 
Join Date: Jun 2007
Posts: 6,672
Quote:
Originally Posted by Max S. View Post
Therefore, the number of particles on the left side of the table is not a macroscopic property of the table in aggregate, but a macroscopic property of the left side of the table in aggregate.
The point is that a macrostate is a set of quantities---however defined---that give you information about the aggregate properties of a system. (number of particles in the left half, number of particles in the right half) is a perfectly good description of the system.
  #92  
Old 05-03-2019, 12:24 PM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528

Improbable vs Impossible


Quote:
Originally Posted by Half Man Half Wit View Post
Quote:
Originally Posted by Max S. View Post
Quote:
Originally Posted by Half Man Half Wit View Post
Right. So if I present you with two urns, each having five balls of different color inside, and ask you to pick from both; and you find that, picking from one, you get colors in proportion to their rate of occurrence, each 1/5 of the time, and from the other, you don't, but, say, obtain red 50% of the time, that wouldn't strike you as odd?
Odd? Very. Impossible? No.
Then tell me how it could be. Tell me how, for a system that's fully described as 'five differently colored balls in an urn' (that is, without adding any ad-hoc explanations, such as 'maybe the red ball is attracted to my hand'), it could be that the probability of drawing either color is different from 1/5.
If two urns contain five balls each, and each urn has exactly one red ball, the probability of randomly picking red from the second urn is 20%.

The probability of randomly picking red from the second urn every time, over n trials, is 0.2^n. The probability of picking red at least half of the time over n trials isn't a single power, though; it's a binomial tail, the sum over k >= n/2 of C(n,k) 0.2^k 0.8^(n-k). With 10 trials that's about 3.3%. With 1,000 trials it's roughly 4x10^-99. Improbable. Unlikely. But by definition, not impossible.

Improbable things happen all the time, just not on purpose. I'm not sure if that answers your question, or where you are going with this.
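The "at least half red" probability is a binomial tail rather than a single power of 20%; here's a quick exact check (my own sketch) using Python's arbitrary-precision integers:

```python
from fractions import Fraction
from math import comb

def binom_tail(n, k_min, num, den):
    """Exact P(X >= k_min) for X ~ Binomial(n, num/den), as a Fraction."""
    total = sum(comb(n, k) * num**k * (den - num)**(n - k)
                for k in range(k_min, n + 1))
    return Fraction(total, den**n)

p10 = float(binom_tail(10, 5, 1, 5))        # about 0.033, i.e. ~3.3%
p1000 = float(binom_tail(1000, 500, 1, 5))  # about 4e-99: tiny, not zero
```

Tiny beyond imagining for 1,000 trials, but still strictly greater than zero, which is the whole point.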

~Max
  #93  
Old 05-03-2019, 12:47 PM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by Half Man Half Wit View Post
The point is that a macrostate is a set of quantities---however defined---that give you information about the aggregate properties of a system. (number of particles in the left half, number of particles in the right half) is a perfectly good description of the system.
But you must understand that a change in the number of particles on the left half of the table does not constitute a change in the aggregate temperature, heat, volume, pressure, number of (aggregate) particles, or classical entropy of the table.

Therefore the only form of the second law of thermodynamics that could possibly be violated when a particle moves within an isolated system is the entropy formulation using a statistical definition of entropy. I'm not sure how anyone justified that formulation to begin with.

~Max
  #94  
Old 05-03-2019, 12:52 PM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by Half Man Half Wit View Post
I sense the sticking point here is the notion of 'random'. These downward flucturations are random with respect to the macrostate, in the sense that knowing the macrostate does not allow you to predict that a downward fluctuation will occur; but they're completely deterministic on the level of the microstate. Picture something like a gas evenly spread out in a box, with all the gas molecule's velocities aimed towards the center: from the macroscopic level, taking the gas as being described by its volume, temperature, and pressure, you see nothing but an equilibrium, with no indication that anything's going to happen but it remaining in equilibrium; but in fact, the gas molecules will bunch up at the center, thus decreasing entropy.
That makes sense, but only when using the statistical definition of entropy. None of the versions of the second law of thermodynamics in the original post are contradicted by such a gas. Those laws are absolute and apply at every level of detail.

It seems that statistical mechanics is a useful but imprecise abstraction of the underlying classical reality.

~Max

Last edited by Max S.; 05-03-2019 at 12:54 PM.
  #95  
Old 05-03-2019, 01:07 PM
Half Man Half Wit's Avatar
Half Man Half Wit is offline
Guest
 
Join Date: Jun 2007
Posts: 6,672
Quote:
Originally Posted by Max S. View Post
But you must understand that changes in the number of particles on the left half of the table does not constitute a change in the aggregate temperature, heat, volume, pressure, number of (aggregate) particles, or classical entropy of the table.

Therefore the only form of the second law of thermodynamics that could possibly be violated when a particle moves within an isolated system is the entropy formulation using a statistical definition of entropy. I'm not sure how anyone justified that formulation to begin with.

~Max
The formulation is justified because what's postulated based on empirical observations in thermodynamics can be derived as theorems valid in the statistical limit from statistical mechanics. Additionally, statistical mechanics predicts entropy fluctuating downwards, and heat flowing from a colder to a hotter body, with a certain probability. These predictions are empirically confirmed. Hence, the earlier formulations are merely approximations of the more fundamental statistical notions.

Quote:
Originally Posted by Max S. View Post
That makes sense, but only when using the statistical definition of entropy. None of the versions of the second law of thermodynamics in the original post are contradicted by such a gas. Those laws are absolute and apply at every level of detail.

It seems that statistical mechanics is a useful but imprecise abstraction of the underlying classical reality.

~Max

Statistical mechanics is the more fundamental theory, and explains the empirical findings of thermodynamics, completing the theory at the microscopic level.

Last edited by Half Man Half Wit; 05-03-2019 at 01:08 PM.
  #96  
Old 05-03-2019, 01:15 PM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by Half Man Half Wit View Post
By the way, is your username just a happy accident, or are you 'Max S.' exactly because of your belief that entropy always only tends to the max?
Neither; I don't believe entropy always only tends to the max. The classical definition of entropy is that it never changes in an isolated system.

I still don't see how you can get to "in an isolated system, entropy tends to increase over time" from the statistical definition of entropy.

~Max
  #97  
Old 05-03-2019, 01:17 PM
Half Man Half Wit is offline
Guest
 
Join Date: Jun 2007
Posts: 6,672
The equivalence of the classical thermodynamic form of the second law and the statistical mechanics form really isn't hard to see. Consider a gas that's a mixture of hot (fast) and cold (slow) molecules (in reality it will, of course, contain particles with velocities drawn from a continuous Maxwell-Boltzmann distribution, but let's idealize here). Now, there are more microstates realizing the configuration where both sorts of particles are evenly distributed than there are microstates realizing the configuration where all the 'hot' particles are on the left side and all the 'cold' particles are on the right side. All the cold particles moving right, and all the hot particles moving left, is therefore a reduction in entropy: statistical, since we're going from a macrostate with many microscopic realizations to one with fewer; and in terms of heat transfer, since heat flows from the colder half to the hotter one.

As you may recall, this is the setup of Maxwell's demon: he sits at the boundary between both sides, sorting hot particles to one side and cold particles to the other. The trick is simply that we don't need the demon at all: the whole thing can happen purely by chance, should all the molecules' velocities align in the right way; which they will, for generic initial conditions, after you've waited long enough.
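Just how rare that chance sorting is can be sketched with a toy count of microstates (my own illustration, not from the posts; it assumes N distinguishable particles, each equally likely to sit in either half of the box):

```python
def sorted_fraction(n_particles):
    """Fraction of the 2**n equally likely left/right microstates in
    which every particle sits in its 'sorted' half -- the state the
    demon would produce, arising here purely by chance."""
    # Each particle is independently on the left or the right, so there
    # are 2**n microstates and exactly one that is fully sorted.
    return 1 / 2 ** n_particles

# Possible, but fantastically improbable as particle numbers grow:
print(sorted_fraction(10))    # 0.0009765625, i.e. 1 in 1024
print(sorted_fraction(1000))  # vanishingly small for macroscopic N
```

So the demon-free sorting is never forbidden; for generic initial conditions it merely takes absurdly long before the velocities happen to align.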
  #98  
Old 05-03-2019, 01:20 PM
Half Man Half Wit is offline
Guest
 
Join Date: Jun 2007
Posts: 6,672
Quote:
Originally Posted by Max S. View Post
Neither, I don't believe entropy always only tends to the max. The classical definition of entropy is that it never changes in an isolated system.
That's flat wrong. Take a room of gas and pile up all the molecules in the left corner; this is an isolated system, and what's going to happen is that the gas will spread until it fills the whole room, which is the state of maximum entropy for the system.

Quote:
I still don't see how you can get to "in an isolated system, entropy tends to increase over time" from the statistical definition of entropy.

~Max
Simple: there are more ways to increase the entropy than to decrease it; thus, any given change is more likely to increase it, and hence, on average, the entropy will increase.
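That counting argument can be made concrete with a small simulation (my own sketch, not from the posts). Take N particles that can sit in either half of a box: the macrostate "k particles on the left" is realized by C(N, k) microstates, so the statistical entropy is ln C(N, k) in units of Boltzmann's constant. Starting from the lowest-entropy state and moving one randomly chosen particle per step:

```python
import math
import random

def log_multiplicity(n, k):
    """ln of the number of microstates with k of n particles on the
    left: ln C(n, k), i.e. the statistical entropy in units of k_B."""
    return math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)

def relax(n=1000, steps=20000, seed=1):
    """Start with all n particles on the left (a single microstate,
    entropy 0) and repeatedly move one randomly chosen particle to
    the other half, recording the entropy after each step."""
    random.seed(seed)
    k = n  # particles currently on the left
    entropies = []
    for _ in range(steps):
        if random.randrange(n) < k:
            k -= 1  # picked a left-half particle; it moves right
        else:
            k += 1  # picked a right-half particle; it moves left
        entropies.append(log_multiplicity(n, k))
    return k, entropies

k, s = relax()
# k drifts toward n/2 = 500, where C(n, k) -- and hence the entropy --
# is maximal; individual steps can lower it, but on average it rises.
```

Any single step has some chance of lowering the entropy, exactly as in the fluctuation discussion above; it's only on average that it increases.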
  #99  
Old 05-03-2019, 01:26 PM
Max S. is offline
Guest
 
Join Date: Aug 2017
Location: Florida, USA
Posts: 528
Quote:
Originally Posted by Half Man Half Wit View Post
The formulation is justified because what's postulated based on empirical observations in thermodynamics can be derived as theorems valid in the statistical limit from statistical mechanics. Additionally, statistical mechanics predicts entropy fluctuating downwards, and heat flowing from a colder to a hotter body, with a certain probability.
How is it derived, and from what? And what is the difference between derivation and justification? It seems to me you must either uphold the fundamental axiom of equiprobability of microstates at every instant, regardless of the previous state, thereby severing causality and reducing all physical laws to "very likely"; or you must admit that statistical mechanics is but an approximation of an underlying reality.

And if statistical mechanics predicts a violation of the classical second law of thermodynamics, I would like to hear an explanation.

~Max
  #100  
Old 05-03-2019, 01:34 PM
Half Man Half Wit is offline
Guest
 
Join Date: Jun 2007
Posts: 6,672
Quote:
Originally Posted by Max S. View Post
How is it derived and from what is the difference between deriving and justification? It seems to me you must either uphold the fundamental axiom of equiprobability of microstate at every instant regardless of previous state, thereby severing causality and reducing all physical laws to "very likely", or you must admit that statistical mechanics is but an approximation of an underlying reality.
Neither. The statistical part comes in because we have incomplete information about the underlying microstate, and thus, can't make certain predictions. It is, in the end, not in any way different or more complicated than assigning a probability of 1/6 to a thrown die showing any given number.

Perhaps as an analogy: thermodynamics tells you, after you've observed lots of throws, that the die shows each number 1/6 of the time; statistical mechanics models the way the die gets thrown and its possible trajectories, and derives that each number comes up, on average, once in six throws.
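To make the analogy concrete (my own toy sketch, not part of the original post): the 'thermodynamic' side is just tallying observed frequencies, which a quick simulation reproduces:

```python
import random

def face_frequencies(throws=600_000, seed=0):
    """Throw a fair six-sided die many times and record the relative
    frequency of each face -- the 'empirical law' thermodynamics would
    report, which statistical mechanics derives as exactly 1/6."""
    random.seed(seed)
    counts = [0] * 6
    for _ in range(throws):
        counts[random.randrange(6)] += 1
    return [c / throws for c in counts]

freqs = face_frequencies()
# every entry is close to 1/6 ~ 0.1667, with small fluctuations
```

The fluctuations around 1/6 shrink as the number of throws grows, just as entropy fluctuations shrink with particle number.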

Quote:
And if statistical mechanics predicts a violation of the classical second law of thermodynamics, I would like to hear an explanation.

~Max
See my first post in this thread. Also, the post I made before this one. And in fact pretty much every post in between.