  #1  
Old 12-03-2019, 01:09 PM
pigtwo is offline
Guest
 
Join Date: Apr 2019
Posts: 31

Entropy - Why can't it decrease


I've been reading a pop-sci book and it's gotten me thinking about entropy and the second law of thermodynamics. One thing I've never fully understood is why it is said that entropy can never decrease for a closed system. My understanding is that entropy is the number of microstates that produce a certain macrostate, i.e., a system where lots of microstates produce the same macrostate is high in entropy, whereas a system where only a few microstates produce that macrostate is low in entropy.
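(To make that counting concrete, here's a minimal Python sketch -- purely an illustration, with 20 coins standing in for a tiny physical system. The macrostate is "how many heads", a microstate is the exact head/tail sequence, and Boltzmann's formula S = k_B ln W just takes the log of the count.)

[code]
from math import comb

N = 20  # 20 coins standing in for a tiny system

# Macrostate = total number of heads; microstates = exact head/tail sequences.
for heads in range(N + 1):
    microstates = comb(N, heads)   # C(N, heads) sequences produce this macrostate
    print(f"{heads:2d} heads: {microstates:6d} microstates")

# "All heads" has exactly 1 microstate (low entropy), while "10 heads" has
# C(20, 10) = 184756 (high entropy). A random kick to the system is
# overwhelmingly likely to land it in one of the crowded middle macrostates.
[/code]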

The second law of thermodynamics states that systems will tend towards higher entropy states via random processes. The processes individually do not prefer any states and essentially just act as random perturbations to the system. So if we take a low entropy state and apply a random perturbation to it, it is more likely to end up in a higher entropy state merely because there are just more configurations that result in a higher entropy state.

The example that comes to mind is keeping neatly wrapped up headphones in your pocket. When you first put headphones in your pocket nicely coiled up, they are in a low entropy state, because of all the configurations of the cable only a small number are nicely coiled up. Many more states are tangled and a mess. So the cord starts in the low entropy coiled state, your walking and jostling apply random changes to it, and the cord essentially selects a new random state. Since there are more knotted states than non-knotted ones, you would expect the headphones to have a higher likelihood of coming out knotted.

So, to get to the part I don't understand: it appears to me that everything here is probabilistic. There is nothing preventing a knotted headphone cable from randomly forming into a nicely coiled state. It's just very unlikely, and it's even more unlikely that it would continue to stay in that state. But it's not impossible. It could just stay in that state forever (however unlikely).

So what gives? It seems like entropy can decrease, it's just unlikely to do so. I don't expect to collect my Nobel Prize for shattering all of physics any time soon. So there must be something I'm missing here.
  #2  
Old 12-03-2019, 01:26 PM
mixdenny is offline
Guest
 
Join Date: Apr 2016
Location: Cleveland suburbs
Posts: 2,077
I know I should let Chronos handle this but here goes. I'm not sure if a coiled cord is a good example of entropy or not. But it took energy to make it coiled and that energy came from you. And if somehow it started out messy and ended up coiled by walking around, you are still adding energy by walking.

And your energy came from eating beef and grain producing heat and energy and that beef and grain needed tractors to plow and harvest, machinery to process and refrigerators to cool it, trucks to deliver it, on and on. And all that accumulated energy is now GONE. Dumped into a cord that you just had to have neatly coiled. All that resulted in a huge increase in entropy vs the small decrease from coiling it.

But I bet the low entropy state of the cord is lying in a straight line rather than coiled. After all, it is probably trying to straighten itself; you are forcing it to be coiled. OK, now someone who really knows the answer can come along.

Dennis
  #3  
Old 12-03-2019, 01:30 PM
Shodan is offline
Charter Member
 
Join Date: Jul 2000
Location: Milky Way Galaxy
Posts: 40,199
No, you aren't missing anything. It is possible, just very, very, very unlikely, that enough random perturbations in a system will lead to a lower entropy state.

Another thing to keep in mind is the number of interactions involved. It is possible to flip a coin ten times and get ten heads in a row. It is also possible, but much less likely, to get a thousand heads in a row. It is also possible to flip a coin once a second, every second, since the Big Bang, and have it come up heads every time. But not very likely.
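(Just to put a rough number on that last one -- back-of-the-envelope, assuming roughly 13.8 billion years since the Big Bang:)

[code]
from math import log10

# One fair-coin flip per second since the Big Bang, heads every single time.
seconds_since_big_bang = 13.8e9 * 365.25 * 24 * 3600    # roughly 4.4e17 flips
log10_probability = -seconds_since_big_bang * log10(2)  # log10 of (1/2)^N

print(f"flips: about {seconds_since_big_bang:.1e}")
print(f"probability of all heads: about 10^({log10_probability:.1e})")
# i.e. about 10^(-1.3e17) -- "not very likely" is putting it mildly.
[/code]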

Regards,
Shodan
  #4  
Old 12-03-2019, 01:56 PM
Machine Elf is offline
Guest
 
Join Date: Feb 2007
Location: Challenger Deep
Posts: 12,401
Quote:
Originally Posted by mixdenny View Post
I know I should let Chronos handle this but here goes. I'm not sure if a coiled cord is a good example of entropy or not. But it took energy to make it coiled and that energy came from you. And if somehow it started out messy and ended up coiled by walking around, you are still adding energy by walking.

And your energy came from eating beef and grain producing heat and energy and that beef and grain needed tractors to plow and harvest, machinery to process and refrigerators to cool it, trucks to deliver it, on and on. And all that accumulated energy is now GONE. Dumped into a cord that you just had to have neatly coiled. All that resulted in a huge increase in entropy vs the small decrease from coiling it.

But I bet the low entropy state of the cord is laying in a straight line rather than coiled. After all, it is probably trying to straighten itself, you are forcing it to be coiled. OK, now someone who really knows the answer can come along.

Dennis
You can have a closed system (i.e. no heat or work moving across system boundary) with a fixed amount of energy, and it could exist in any of a number of different entropy levels. Example: a hot brick next to a cold brick, no heat exchange with the rest of the universe. With hot brick hot and cold brick cold, this is a low-entropy condition; you could extract mechanical work from this configuration using a heat engine. But if you wait a while the hot brick will warm the cold brick until they are at the same temperature. This is a high-entropy condition, and you cannot extract useful mechanical work from it with a heat engine.
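(For anyone who wants actual numbers on the brick picture, here's a rough Python sketch; the mass, heat capacity, and temperatures are made up purely for illustration:)

[code]
from math import log

# Two identical bricks, isolated from the rest of the universe.
m, c = 1.0, 900.0              # kg and J/(kg*K), made-up values
T_hot, T_cold = 400.0, 200.0   # starting temperatures in kelvin

T_final = (T_hot + T_cold) / 2   # equal masses and heat capacities -> simple average

# Entropy change of each brick: integral of m*c*dT/T from start to finish.
dS_hot = m * c * log(T_final / T_hot)     # negative: the hot brick loses entropy
dS_cold = m * c * log(T_final / T_cold)   # positive, and larger in magnitude

print(f"hot brick:  {dS_hot:+7.1f} J/K")
print(f"cold brick: {dS_cold:+7.1f} J/K")
print(f"total:      {dS_hot + dS_cold:+7.1f} J/K  (the pair's entropy went up)")
[/code]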

So now use this isothermal pair of bricks to pose the OP's question: what prevents a particular random occurrence of atomic collisions from resulting in one brick spontaneously becoming hotter than the other, resulting in a reduction of entropy in this closed system and once again allowing a heat engine to produce work?
  #5  
Old 12-03-2019, 02:08 PM
Shodan is offline
Charter Member
 
Join Date: Jul 2000
Location: Milky Way Galaxy
Posts: 40,199
Nothing prevents it; it could happen. It might take a few billion years, but it could happen. It is also possible that both bricks will randomly heat up to the same temperature, so that no work could be produced. It is also possible that it never happens until the heat death of the universe.

I have even heard theories that, after the heat death of the universe, everything sits around until things randomly fluctuate back into a low-entropy state and the Big Bang happens again.

Regards,
Shodan
  #6  
Old 12-03-2019, 02:18 PM
octopus's Avatar
octopus is offline
Guest
 
Join Date: Apr 2015
Posts: 9,356
Quote:
Originally Posted by mixdenny View Post
I know I should let Chronos handle this but here goes. I'm not sure if a coiled cord is a good example of entropy or not. But it took energy to make it coiled and that energy came from you. And if somehow it started out messy and ended up coiled by walking around, you are still adding energy by walking.

And your energy came from eating beef and grain producing heat and energy and that beef and grain needed tractors to plow and harvest, machinery to process and refrigerators to cool it, trucks to deliver it, on and on. And all that accumulated energy is now GONE. Dumped into a cord that you just had to have neatly coiled. All that resulted in a huge increase in entropy vs the small decrease from coiling it.

But I bet the low entropy state of the cord is laying in a straight line rather than coiled. After all, it is probably trying to straighten itself, you are forcing it to be coiled. OK, now someone who really knows the answer can come along.

Dennis
The energy is not going anywhere. Energy is conserved.

Entropy tends to increase in closed systems because there are more configurations of states that have higher homogeneity (reduced gradients).
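(A toy version of "more configurations are homogeneous": start every particle on the left side of a box and let randomly chosen particles hop back and forth. This is just a sketch -- the particle count and step count are arbitrary.)

[code]
import random

random.seed(0)
N = 1000          # particles in the box
left = N          # start with all of them in the left half (a low entropy state)

for step in range(1, 21):
    for _ in range(N):                  # N random single-particle hops per step
        if random.random() < left / N:
            left -= 1                   # picked a left-side particle; it hops right
        else:
            left += 1                   # picked a right-side particle; it hops left
    print(f"step {step:2d}: {left:4d} of {N} particles on the left")

# The count drifts toward ~500 and then just jiggles around it, because there
# are vastly more arrangements with "about half on each side" than with "all
# on one side". Nothing forbids drifting back to 1000; it's just absurdly unlikely.
[/code]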
  #7  
Old 12-03-2019, 02:39 PM
septimus's Avatar
septimus is online now
Guest
 
Join Date: Dec 2009
Location: The Land of Smiles
Posts: 20,160
I've given my answer the last 2 times the question arose. This time, I'll just mention Isaac Asimov's answer, which can be found on the 'Net by Googling "The Last Question by Isaac Asimov."
  #8  
Old 12-03-2019, 03:31 PM
pigtwo is offline
Guest
 
Join Date: Apr 2019
Posts: 31
I feel like maybe I'm misinterpreting something in the 2nd law of thermodynamics (or maybe there is some semantic thing I'm missing). The first line of the wiki on the 2nd law of thermodynamics is "The second law of thermodynamics states that the total entropy of an isolated system can never decrease over time...". That seems like it's incorrect. It would appear that entropy can decrease. It's insanely unlikely, but there is an enormous difference between insanely unlikely and not possible. I feel like when you read about decreasing entropy it is treated as something like exceeding the speed of light, in that it's completely impossible, i.e. total entropy monotonically increases.

Maybe the interpretation is that if you're in the system you cannot force it into a lower entropy state. But that lower entropy state could come about randomly.
  #9  
Old 12-03-2019, 03:50 PM
Triskadecamus is offline
Charter Member
 
Join Date: Oct 1999
Location: I'm coming back, now.
Posts: 7,601
For a coiled headphone cable, three "very"s might be a marginally forgivable understatement of how unlikely a complete spontaneous recoiling would be. For a glass of ice forming out of room temperature water, you need a whole lot more "very"s. Even all the "billions and billions" Carl Sagan was misquoted as saying over his entire lifetime aren't enough "very"s for how unlikely it would be. And the glass of water is a very very very very very very very very very very very small fraction of the observed universe at any given picosecond. Figure out how many "very"s you need for the whole universe. Don't try the math at home.
  #10  
Old 12-03-2019, 05:40 PM
Nava is offline
Member
 
Join Date: Nov 2004
Location: Hey! I'm located! WOOOOW!
Posts: 43,053
Quote:
Originally Posted by pigtwo View Post
"The second law of thermodynamics states that the total entropy of an isolated system can never decrease over time...".
Underline mine. The entropy can decrease over a short period, but give the system a bit longer and entropy will increase again. Your confusion seems to me to stem from missing the underlined bit: the 2nd law does not state that entropy can never decrease at all, but that given enough time it will not have decreased (it doesn't even say that it will always increase: a given system will have an entropy maximum).
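(You can put rough numbers on "can decrease in a short period": small dips below the entropy maximum happen constantly, big ones essentially never. A sketch with a made-up 100-particle box:)

[code]
from math import comb

N = 100            # particles in a toy box, split between left and right halves
total = 2 ** N     # total number of left/right arrangements

# Chance of catching the box at least k particles away from the 50/50 split:
for k in (5, 20, 40, 50):
    ways = sum(comb(N, n) for n in range(N + 1) if abs(n - N // 2) >= k)
    print(f"fluctuation of {k:2d}+ particles: probability {ways / total:.2e}")

# A 5-particle dip is routine; the "everything back on one side" fluctuation
# (k = 50) has probability 2 / 2^100, about 1.6e-30 -- and a real glass of
# water has something like 10^25 molecules, not 100.
[/code]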
__________________
Some people knew how to kill a conversation. Cura, on the other hand, could make it wish it had never been born.

Last edited by Nava; 12-03-2019 at 05:41 PM.
  #11  
Old 12-03-2019, 05:41 PM
Dewey Finn is offline
Charter Member
 
Join Date: Apr 2003
Posts: 29,249
Quote:
Originally Posted by septimus View Post
I've given my answer the last 2 times the question arose. This time, I'll just mention Isaac Asimov's answer, which can be found on the 'Net by Googling "The Last Question by Isaac Asimov."
Insufficient data for meaningful answer?
  #12  
Old 12-03-2019, 05:56 PM
Senegoid is offline
Guest
 
Join Date: Sep 2011
Location: Sunny California
Posts: 14,964
Quote:
Originally Posted by pigtwo View Post
I feel like maybe I'm misinterpreting something in the 2nd law of thermodynamics(or maybe there is some semantic thing I'm missing). The first line of the wiki on the 2nd law of thermodynamics is "The second law of thermodynamics states that the total entropy of an isolated system can never decrease over time...". That seems like it's incorrect. It would appear that entropy can decrease. It's insanely unlikely but there is an enormous difference between insanely unlikely and not possible. I feel like when you read about decreasing entropy it is seen as something like exceeding the speed of light in that it's completely impossible. IE Total entropy monotonically increases.
(My emphasis added.)

Douglas Adams may be relevant here:
Quote:
Originally Posted by Douglas Adams
The impossible often has a kind of integrity which the merely improbable lacks.
__________________
=========================================
  #13  
Old 12-03-2019, 07:16 PM
pigtwo is offline
Guest
 
Join Date: Apr 2019
Posts: 31
Quote:
Originally Posted by Nava View Post
Underline mine. The entropy can decrease in a short period, but give the system a bit longer and entropy will increase again. Your confusion seems to me to stem from missing the underlined bit: the 2nd law does not state that entropy can never decrease at all, but that given enough time it will not decrease (it doesn't even say that it will always increase: a given system will have an entropy maximum).
Why is it restricted to short time scales? What forces entropy to eventually increase on long time scales but not short? From my understanding, entropy increase is probabilistic. So if we start in some low entropy state, there is a very high likelihood that it will transition into a higher entropy state. But there is a very small chance that it just stays in that low entropy state, or even transitions into a lower entropy state. I don't see any reason it couldn't do this indefinitely. As stated before, the probability of that happening is indescribably unlikely, but it's not impossible.

I kind of feel like the second law is really 'Entropy will probably increase'. Maybe I'm just nitpicking, but the law has so much weight in physics that it feels a little odd for it to have such a weak point to make (i.e. improbable vs impossible).
  #14  
Old 12-03-2019, 08:34 PM
Exapno Mapcase is online now
Charter Member
 
Join Date: Mar 2002
Location: NY but not NYC
Posts: 31,839
Stackexchange got the same question: Is there any proof for the 2nd law of thermodynamics?

The short answer is: it's complicated and depends on some technical assumptions. You should click on the link there about Boltzmann's H Theorem. But the discussion is mostly in English.
  #15  
Old 12-03-2019, 09:08 PM
Hari Seldon is offline
Member
 
Join Date: Mar 2002
Location: Trantor
Posts: 13,227
First I want to comment on the heat death of the universe. It is asymptotic. The amount of free energy (energy available for work) decreases but never hits 0. One consequence is that anything that can happen now could happen at any future time; it just takes much longer as the free energy declines.

My second point is that infinity is a loooong time and that anything that can happen in a finite time will almost certainly happen eventually. Including rolling a googol of heads in a row. In particular a big bang producing a system of minimum entropy.
  #16  
Old 12-03-2019, 10:31 PM
DPRK is offline
Guest
 
Join Date: May 2016
Posts: 4,042
Quote:
Originally Posted by pigtwo View Post
Why is it restricted to short time scales? What forces entropy to eventually increase on long time scales but not short?. From my understanding entropy increasing is probabilistic. So if we start in some low entropy state there is a very high likelihood that it will transition into a higher entropy state. But there is a very small chance that it just stays in that low entropy state. Or it even could transition into a lower entropy state. I don't see any reason it couldn't do this indefinitely. As stated before the probability of that happening is indescribably unlikely but it's not impossible.

I kind of feel like the second law is really 'Entropy will probably increase'. Maybe I'm just nitpicking but the law has so much weight in physics it feels a little odd for it to have such a weak point to make(IE improbable vs impossible).
I think you do not quite grasp the magnitude of the probabilities involved. Like, is there anything keeping you from tossing heads on a fair coin 10^(10^10) times in a row? No? Then try it, and see if mysterious physical forces keep you from making progress "indefinitely".
  #17  
Old 12-04-2019, 01:18 AM
Jragon's Avatar
Jragon is offline
Guest
 
Join Date: Mar 2007
Location: Miskatonic University
Posts: 10,708
The difference between "short" and "long" here is largely the difference between "finite" and "infinite". Like, you could argue that the existence of life and a climate on this planet is essentially a big low-entropy pocket, one that's lasted billions of years, but ultimately a pocket on an infinite time scale.

Now, for every second you add onto your calculation, the chance of it being lower entropy than the starting point is much smaller, so "short" vs "long" is also in some sense true there given the magnitude of the probabilities, but here the short/long distinction is largely with regard to infinities, or at best, cosmic time scales.

Last edited by Jragon; 12-04-2019 at 01:19 AM.
  #18  
Old 12-04-2019, 01:35 AM
ASL v2.0's Avatar
ASL v2.0 is offline
Guest
 
Join Date: Jul 2019
Location: Various
Posts: 633
Quote:
Originally Posted by pigtwo View Post
I feel like maybe I'm misinterpreting something in the 2nd law of thermodynamics(or maybe there is some semantic thing I'm missing). The first line of the wiki on the 2nd law of thermodynamics is "The second law of thermodynamics states that the total entropy of an isolated system can never decrease over time...". That seems like it's incorrect. It would appear that entropy can decrease. It's insanely unlikely but there is an enormous difference between insanely unlikely and not possible. I feel like when you read about decreasing entropy it is seen as something like exceeding the speed of light in that it's completely impossible. IE Total entropy monotonically increases.

Maybe the interpretation is that if you're in the system you cannot force it into a lower entropy state. But that lower entropy state could come about randomly.
First, I want to note that you’ve slightly modified your description of the second law here, as opposed to what you put in your first post. Most notably, here you said "isolated system" whereas initially you said "closed system." It should indeed be "isolated" rather than "closed." A closed system, while not allowing for the transfer of matter in or out, does allow for the transfer of energy. You can decrease the entropy of a closed system through energy transfer with another system.

Second, maybe don’t think of it so much as "can't" as, "on all but the most trivial of time scales, it has been observed that entropy of an isolated system increases, to the point that we can take it as a fundamental law of physics, but of course all physical laws are subject to change, just as soon as new evidence, counteracting thousands of years of observation and experimentation, arises to contradict it, at which point we may have to revise the model."

Last edited by ASL v2.0; 12-04-2019 at 01:37 AM.
  #19  
Old 12-04-2019, 05:52 AM
Napier is offline
Charter Member
 
Join Date: Jan 2001
Location: Mid Atlantic, USA
Posts: 9,621
We have two different hypotheticals floating around here. In an isolated system, two bricks could get further apart in temperature; it's just fantastically unlikely. But an isolated brick or glass of water can't change temperature by itself, because that would create or destroy energy, so that's actually impossible.
  #20  
Old 12-04-2019, 06:54 AM
DPRK is offline
Guest
 
Join Date: May 2016
Posts: 4,042
We could not even define the temperature of a brick or glass of water if things did not work the way they do with objects reasonably quickly coming to thermal equilibrium.

ps there may be a typo where my stack of 10's posted above has one extra layer; a glass of water has approximately, let's say, 10^25 molecules. But the conclusion stands: once it is at equilibrium, don't hold your breath watching for time to run backwards and entropy to spontaneously reverse itself.
  #21  
Old 12-04-2019, 11:20 AM
pigtwo is offline
Guest
 
Join Date: Apr 2019
Posts: 31
Quote:
Originally Posted by DPRK View Post
I think you do not quite grasp the magnitude of the probabilities involved. Like, is there anything keeping you from tossing heads on a fair coin 10^(10^10) times in a row? No? Then try it, and see if mysterious physical forces keep you from making progress "indefinitely".
I understand how unlikely it is. My main point is that the implications of something being impossible are much different from something being extremely* unlikely. As an example, there are phenomena like the De Broglie wavelength which (to my shallow understanding) dictates the probability of a particle being in a certain location. The probability that that particle is 1 light year away is extremely low but not impossible. If it were truly impossible, that would have big implications for how the physics works. That's kind of where I'm coming at this from. Maybe a better example: what if you could go faster than the speed of light without infinite energy, but you just had to get really lucky? That would completely change the nature of the speed of light even though functionally nothing would change, because you would never expect to see it happen.

The first line of the wiki says total entropy never decreases. That's what got me interested in this. I was wondering what mechanism prevents entropy from decreasing. But it appears that the answer is nothing and that it can decrease. Empirically it never does this but there is nothing expressly forbidding it.

*There's no good way to describe how unlikely it is so I just default to calling it some large amount of unlikely. I guess you could say something like less than 1/TREE(3) chance or something but it doesn't really add anything.

Quote:
Originally Posted by ASL v2.0
First, I want to note that you've slightly modified your description of the second law here, as opposed to what you put in your first post. Most notably, here you said "isolated system" whereas initially you said "closed system." It should indeed be "isolated" rather than "closed." A closed system, while not allowing for the transfer of matter in or out, does allow for the transfer of energy. You can decrease the entropy of a closed system through energy transfer with another system.

Second, maybe don't think of it so much as "can't" as, "on all but the most trivial of time scales, it has been observed that entropy of an isolated system increases, to the point that we can take it as a fundamental law of physics, but of course all physical laws are subject to change, just as soon as new evidence, counteracting thousands of years of observation and experimentation, arises to contradict it, at which point we may have to revise the model."
Thank you for clarifying the difference between isolated and closed. To your second paragraph, that is my current understanding. Basically, what started all this for me was that when you read layman stuff about the second law of thermodynamics, they make it sound like entropy cannot decrease (including over very long timescales), which doesn't seem to be precisely accurate. I'm pretty sure it's just done for convenience, and explaining the real meaning of the second law to layman readers wouldn't be very effective.

So I've probably just devolved into nitpicking, and the general sentiment is understood.
  #22  
Old 12-04-2019, 11:36 AM
ASL v2.0's Avatar
ASL v2.0 is offline
Guest
 
Join Date: Jul 2019
Location: Various
Posts: 633
Something else to consider. Some, rather than referring to closed or isolated systems, would refer instead to the "whole system." That is, if you take two closed systems and put them together, you may get one to experience an entropy drop, but the other will see entropy rise, and overall the rise will be greater than the drop.

And when you really, really pick a nit, there is no such thing as a truly isolated system. What is significant isn't the action of molecules in individual bricks or beakers and the probability that they may do this thing versus that, but rather that the universe as a whole, or at least the observable portion of it, does appear to obey the second law (and the others, too). That is, the entropy of the universe appears to be increasing, irrevocably so. As for whether empirical evidence has ever suggested the entropy of the universe decreased overall, well... not since a very small amount of time after the Big Bang. And before that time... who knows.
  #23  
Old 12-04-2019, 11:39 AM
DPRK is offline
Guest
 
Join Date: May 2016
Posts: 4,042
You have probably read the recent long thread about this, in which there were a number of illustrative examples to the effect that, since things (like gases) are constantly moving around dynamically, if you want entropy to absolutely never decrease you can probably come up with a technical or qualified definition that has this property, but for any actual macroscopic system this is a distinction without a difference. Still, at some point it may be useful to leave aside the pop-sci stuff and consider what Boltzmann et al. actually argued, like his "molecular chaos" assumption.
  #24  
Old 12-04-2019, 02:45 PM
TemporalFix is offline
Guest
 
Join Date: Apr 2018
Location: The Netherlands
Posts: 11
There are two distinct branches of physics involved here: thermodynamics and statistical physics. According to thermodynamics, decreasing entropy is impossible. According to statistical physics, decreasing entropy is improbable. That is exactly the difference at issue: impossible versus improbable. A source of confusion is that many people use statistical physics to reason about entropy but call that reasoning thermodynamics, as if the difference between the two branches didn't exist.
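(For reference, the two definitions being contrasted -- Clausius's thermodynamic entropy on the one hand, the Boltzmann/Gibbs statistical entropy on the other:)

[code]
% Thermodynamic (Clausius) entropy: defined via heat flow, with the
% second law taken as a postulate -- no probabilities anywhere.
dS = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad \Delta S_{\mathrm{isolated}} \ge 0

% Statistical (Boltzmann / Gibbs) entropy: a count over microstates,
% which is why it can fluctuate downward with tiny probability.
S = k_B \ln W, \qquad S = -k_B \sum_i p_i \ln p_i
[/code]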
  #25  
Old 12-04-2019, 03:32 PM
Buck Godot's Avatar
Buck Godot is offline
Guest
 
Join Date: Mar 2010
Location: MD outside DC
Posts: 6,140
Quote:
Originally Posted by pigtwo View Post
I understand how unlikely it is. My main point is that the implications of something being impossible is much different than something being extremely* unlikely. As an example there are phenomena like the De Broglie wavelength which(to my shallow understanding) dictates the probability of particle being in a certain location. The probability that that particle is 1 light year away is extremely low but not impossible. If it were truly impossible that would have big implications of how the physics works. That's kind of where I'm coming at this from. Maybe a better example, what if the you could go faster than the speed of light without infinite energy, but you just had to get really luckily. That would completely change the nature of the speed of light even though functionally nothing would change because you would never expect to see it happen.
Not really. If the wave function of an electron collapses such that the location of that electron is actually 1 light year away, then the electron didn't actually move faster than the speed of light; it stayed exactly where it was, which, as it happens, is actually one light year away. Now since you couldn't force this event to occur (without an improbability drive), you can't actually pass information this way, so it doesn't break causality. Quantum mechanically, pretty much all laws of physics are more suggestions than laws, but if they work every single time you repeat the experiment, who's to say they aren't laws.

Last edited by Buck Godot; 12-04-2019 at 03:34 PM.
  #26  
Old 12-04-2019, 04:57 PM
pigtwo is offline
Guest
 
Join Date: Apr 2019
Posts: 31
Quote:
Originally Posted by TemporalFix View Post
There are two distinct branches of physics involved here: thermodynamics and statistical physics. According to thermodynamics, decreasing entropy is impossible. According to statistical physics, decreasing entropy is improbable. There is a difference between thermodynamics and statistical physics here, the difference between impossible and improbable. A source of confusion is: many people use statistical physics to reason about entropy, but mis-categorize it by calling that reasoning thermodynamics, as if the difference between thermodynamics and statistical physics doesn't exist.
Ah, well that would make a lot of sense. I did not know these were different. Looking at the wiki for entropy in statistical physics, that appears to be how I've been thinking about it. The equation for entropy there is definitely the one I'm familiar with. I notice that the wording for the second law is different as well: "the total entropy of any isolated thermodynamic system tends to increase over time, approaching a maximum value." Thanks for the clarification; that is basically what I was looking for.

Quote:
Originally Posted by Buck Godot
Not really. If the wave function of an electron collapses such that the location of that electron is actually 1 light year away, then the electron didn't actually move faster than the speed of light, it stayed exactly where it was which, as it happens, is actually one light year away. Now since you couldn't force this event to occur (without an improbability drive) you can't actually pass information this way, so it doesn't break causality. Quantum mechanically pretty much all laws of physics are more suggestions than laws, but if they work every single time you repeat the experiment whose to say they aren't laws.
I'm not sure what you're saying "not really" to. I wasn't making any statement about particles moving faster than light. I was just trying to give an example of changing something in known physics from very improbable to impossible, and how that is a very meaningful difference that would have lots of impacts.

Maybe a better example could be something like The Pigtwo Coin Flipping Law, which states that it is highly unlikely for a coin to be flipped and land on heads consecutively more than a Graham's number of times. If I told you this you'd yawn. But if instead The Pigtwo Coin Flipping Law stated that a coin cannot be flipped and land on heads consecutively more than a Graham's number of times, that would be really weird. You'd be able to figure out more stuff about physics by determining what is preventing that last coin flip from landing heads. The fact that it's impossible vs improbable would give hints about the underlying physics.

This is fairly aside from the point though, and I understand now (thanks to TemporalFix) that in thermodynamics it is truly impossible for entropy to decrease, while in statistical physics it's merely very unlikely.
  #27  
Old 12-04-2019, 06:52 PM
Buck Godot's Avatar
Buck Godot is offline
Guest
 
Join Date: Mar 2010
Location: MD outside DC
Posts: 6,140
The point that I was trying to make is that post-quantum mechanics, pretty much all of physics fits into that improbable category. "If I drop a ball, it will fall" is really only "if I drop a ball, it will probably fall, unless I flip a number of heads equal to Graham's number."

Last edited by Buck Godot; 12-04-2019 at 06:55 PM.
  #28  
Old 12-04-2019, 07:37 PM
DPRK is offline
Guest
 
Join Date: May 2016
Posts: 4,042
Quote:
Originally Posted by pigtwo View Post
This is fairly aside from the point thought and I understand now(thanks to TemporalFix) that in thermodynamic physics it is truly impossible for entropy to decrease while in statistical physics it's merely very unlikely.
One way to look at it is that in the thermodynamic limit, the number of particles tends to infinity. Thus fluctuations of the temperature and entropy vanish.
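(A quick way to see that limit at work: the relative size of fluctuations shrinks like 1/sqrt(N) as the particle count grows. A sketch with simulated coin-flip "particles"; the sizes are chosen arbitrarily.)

[code]
import random
from statistics import pstdev

random.seed(1)

# Each particle is a coin bit: a 1 means "in the left half of the box".
# Watch the relative spread of the left-half count shrink as N grows.
for N in (100, 10_000, 1_000_000):
    counts = [bin(random.getrandbits(N)).count("1") for _ in range(200)]
    rel = pstdev(counts) / (N / 2)   # spread relative to the mean of N/2
    print(f"N = {N:9d}: relative fluctuation ~ {rel:.1e}  (1/sqrt(N) = {N ** -0.5:.1e})")

# At N ~ 10^23 the relative fluctuation would be ~3e-12: temperature and
# entropy become effectively sharp numbers, which is why the thermodynamic
# limit can state "entropy never decreases" as exact rather than merely probable.
[/code]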