I understand how unlikely it is. My main point is that the implications of something being impossible are much different from something being extremely* unlikely. As an example, there are phenomena like the De Broglie wavelength which (to my shallow understanding) dictates the probability of a particle being in a certain location. The probability that the particle is 1 light year away is extremely low but not impossible. If it were truly impossible, that would have big implications for how the physics works. That’s kind of where I’m coming at this from. Maybe a better example: what if you could go faster than the speed of light without infinite energy, but you just had to get really lucky? That would completely change the nature of the speed of light, even though functionally nothing would change because you would never expect to see it happen.
The first line of the wiki says total entropy never decreases. That’s what got me interested in this. I was wondering what mechanism prevents entropy from decreasing. But it appears that the answer is nothing and that it can decrease. Empirically it never does this but there is nothing expressly forbidding it.
*There’s no good way to describe how unlikely it is, so I just default to calling it some large amount of unlikely. I guess you could say something like a less than 1/TREE(3) chance or something, but it doesn’t really add anything.
Thank you for clarifying the difference between isolated and closed. To your second paragraph, that is my current understanding. Basically what started all this for me was that when you read layman stuff about the second law of thermodynamics, they make it sound like entropy cannot decrease (including over very long timescales), which doesn’t seem to be precisely accurate. I’m pretty sure it’s just done for convenience, and explaining the real meaning of the second law to layman readers wouldn’t be very effective.
So I’ve probably just devolved into nitpicking, and the general sentiment is understood.
Something else to consider. Some, rather than referring to closed or isolated systems, would refer instead to the “whole system.” That is, if you take two closed systems and put them together, you may get one to experience an entropy drop, but the other will see entropy rise, and overall the rise will be greater than the drop.
And when you really, really pick a nit, there is no such thing as a truly isolated system. What is significant isn’t the action of molecules in individual bricks or beakers and the probability that they may do this thing versus that, but rather that the universe as a whole, or at least the observable portion of it, does appear to obey the second law (and the others, too). That is, the entropy of the universe appears to be increasing, irrevocably so. As for whether empirical evidence has ever suggested the entropy of the universe might have decreased overall, well… not since a very small amount of time after the Big Bang. And before that time… who knows.
You have probably read the recent long thread about this, in which there were a number of illustrative examples to the effect that, since things (like gases) are constantly moving around dynamically, if you want entropy to absolutely never decrease you can probably come up with a technical or qualified definition that has this property, but for any actual macroscopic system this is a distinction without a difference. Still, at some point it may be useful to leave aside the pop-sci stuff and consider what Boltzmann et al. actually argued, like his “molecular chaos” assumption.
There are two distinct branches of physics involved here: thermodynamics and statistical physics. According to thermodynamics, decreasing entropy is impossible. According to statistical physics, decreasing entropy is improbable. That is the difference between the two: the difference between impossible and improbable. A source of confusion is that many people use statistical physics to reason about entropy, but mis-categorize that reasoning by calling it thermodynamics, as if the difference between the two branches didn’t exist.
Not really. If the wave function of an electron collapses such that the location of that electron is actually 1 light year away, then the electron didn’t actually move faster than the speed of light; it stayed exactly where it was, which, as it happens, is one light year away. Now since you couldn’t force this event to occur (without an improbability drive) you can’t actually pass information this way, so it doesn’t break causality. Quantum mechanically, pretty much all laws of physics are more suggestions than laws, but if they work every single time you repeat the experiment, who’s to say they aren’t laws?
Ah, well that would make a lot of sense. I did not know these were different. Looking at the wiki for entropy in statistical physics, that appears to be how I’ve been thinking about it. The equation for entropy is definitely the one I’m familiar with. I notice that the wording for the second law is different as well: “the total entropy of any isolated thermodynamic system tends to increase over time, approaching a maximum value.” Thanks for the clarification; that basically was what I was looking for.
I’m not sure what you’re saying “not really” to. I wasn’t making any statement about particles moving faster than light. I was just trying to give an example of what it would mean to change something in known physics from very improbable to impossible, and how that is a very meaningful difference that would have lots of impacts.
Maybe a better example could be something like the Pigtwo Coin Flipping Law, which states that it is highly unlikely for a coin to be flipped and land on heads consecutively more than a Graham’s number of times. If I told you this you’d yawn. But if instead the Pigtwo Coin Flipping Law stated that a coin cannot be flipped and land on heads consecutively more than a Graham’s number of times, that would be really weird. You’d be able to figure out more stuff about physics by determining what is preventing that last coin flip from landing heads. The fact that it’s impossible vs improbable would give hints to the underlying physics.
This is fairly aside from the point though, and I understand now (thanks to TemporalFix) that in thermodynamics it is truly impossible for entropy to decrease, while in statistical physics it’s merely very unlikely.
The point that I was trying to make is that post-quantum mechanics, pretty much all of physics fits into that improbable category. “If I drop a ball it will fall” is really only “if I drop a ball it will probably fall, unless I flip a number of heads equal to Graham’s number.”
One way to look at it is that in the thermodynamic limit the number of particles tends to infinity. Thus fluctuations of quantities like the temperature and entropy vanish.
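As a rough sketch of that scaling (my own illustration, not anything from this thread), here’s a bit of Python showing that the relative fluctuation in the fraction of particles found on one side of a box shrinks like 1/(2√N) as N grows:

```python
# Each of N particles independently sits in the left or right half of a box.
# The spread in the "fraction on the left" shrinks as N grows, which is one
# way to see why fluctuations vanish in the thermodynamic limit.
import numpy as np

rng = np.random.default_rng(0)

for n in (100, 10_000, 1_000_000):
    fractions = rng.binomial(n, 0.5, size=10_000) / n   # fraction in the left half
    print(f"N = {n:>9}: std of fraction = {fractions.std():.5f}"
          f"  (theory 1/(2*sqrt(N)) = {1 / (2 * np.sqrt(n)):.5f})")
```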
Thought of what might be a better analogy for the issue of whether both bricks get hot: you have a horizontal rectangular enclosure, say 1 meter square and 0.1 meter tall. You put 1000 pennies on its floor, heads up, and close the lid. Now you give it a gentle bump on the bottom, so you hear a few of the pennies getting tossed up, and they may land tails up. You open the lid and find, say 950 are still heads up and 50 are tails up. So you get interested in how bumping it changes the score, and find that generally you can make the pennies less and less preferentially heads up, and that statistically the score tends towards 500 and 500.
You never find both bricks hot, and in this analogy, you never get more than 1000 heads up. It’s impossible.
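If it helps, here’s a quick Python sketch of that penny analogy (the choice of 20 pennies tossed per bump is just an assumption for illustration): starting from all 1000 heads up, repeated random bumps drive the count toward 500 and it then just fluctuates around there; it essentially never climbs back to 1000, and it can never exceed 1000.

```python
# Hedged sketch of the penny analogy: each "bump" tosses a small random
# handful of pennies, which then land heads or tails at random.
import numpy as np

rng = np.random.default_rng(1)
pennies = np.ones(1000, dtype=int)   # 1 = heads, 0 = tails; start all heads up

for bump in range(1, 2001):
    tossed = rng.choice(1000, size=20, replace=False)   # ~20 pennies get tossed
    pennies[tossed] = rng.integers(0, 2, size=20)       # each lands at random
    if bump % 400 == 0:
        print(f"after {bump:4d} bumps: {pennies.sum()} heads up")
```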
If all the pennies are flipped completely randomly, then the number of heads follows a binomial distribution. This can model something like 1000 molecules of gas in a box, which move randomly between two halves of the box, and the highest-entropy (most likely) states are ones where they are evenly distributed.
The “gently nudging” could be a situation where the particles start out in a low-entropy state, and gradually move towards a higher-entropy state as they mix up.
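To make that concrete (a sketch I’m adding under the same assumptions as above: 1000 molecules, each randomly in the left or right half), counting microstates for each macrostate “n molecules on the left” shows the even split has overwhelmingly the most microstates (highest entropy, S ∝ ln Ω), while the all-on-one-side state has probability 2^-1000:

```python
# Multiplicity Omega(n) = C(1000, n) microstates for the macrostate with n
# molecules in the left half; ln(Omega) is the (dimensionless) entropy.
from math import comb, log

N = 1000
total = 2 ** N                      # all equally likely microstates
for n in (0, 250, 400, 500):
    omega = comb(N, n)
    print(f"n = {n:4d}: ln(Omega) = {log(omega):7.1f}, "
          f"probability = {omega / total:.3e}")
```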
On the other hand, it might make you think of a system of non-interacting particles in a magnetic field which causes them to have a different energy level depending on whether the spin faces up or down. In this case, the binomial distribution tells you the number of spin states with a given energy. You could imagine the system as being at thermal equilibrium at some fixed temperature and calculate the probability of being in a given energy state, which results in a Boltzmann distribution. The entropy in this case will be an (increasing) function of the temperature.
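A small Python sketch of that spin example (my own illustration; units chosen so k_B = 1 and the field splitting is ±1): the occupation probabilities follow the Boltzmann distribution, and the entropy per spin rises with temperature, saturating at ln 2.

```python
# Two-level non-interacting spin in a magnetic field: energies -1 (up) and +1
# (down) in units where k_B = mu*B = 1. Entropy per spin S = -sum(p * ln p).
import numpy as np

def entropy_per_spin(T):
    energies = np.array([-1.0, +1.0])      # spin up / spin down
    weights = np.exp(-energies / T)        # Boltzmann factors
    p = weights / weights.sum()
    return -np.sum(p * np.log(p))

for T in (0.2, 0.5, 1.0, 2.0, 10.0):
    print(f"T = {T:5.1f}: S per spin = {entropy_per_spin(T):.4f}")
print(f"ln(2)      = {np.log(2):.4f}")
```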
The entropy of a closed system can be described classically in terms of heat flow, and the second law of thermodynamics then states that in any process it can never decrease.
The key to understanding what all this coin-flipping stuff has to do with entropy and thermodynamic laws is that entropy can be calculated by analysing the statistics of the microscopic configurations of the system; roughly speaking, it measures how much the state is spread among these different configurations.
It’s not supposed to be completely obvious that these describe the same thing and that the laws of thermodynamics can be proved this way; that is what all the statistical mechanical theory can do for us: explain this behaviour. Originally, people observed the 2nd law in forms like “heat spontaneously flows from a warm body to a cold one”, but did not explain why. In fact, it is a good point, and maybe the source of your question, that the law can be formulated in this way without using the word “entropy”. This word was apparently invented by Clausius in 1865, so to understand the equivalence you need to understand Clausius’s principle of heat flow.
To be precise, if your closed system is undergoing some kind of thermodynamic process then the change in entropy satisfies T dS = dU + P dV, where U is the amount of internal energy in your system. This gives a relation between “entropy”, which we claim cannot decrease, and the amount of heat going in/out of the system, the amount of work done, and pressure and volume changes. Now, if heat would spontaneously flow from a cold body to a hotter one, we can check that this would result in a decrease of entropy.
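To spell that last check out (just plugging numbers into the Clausius picture above): if a small amount of heat δQ > 0 were to flow out of a cold body at temperature T_c and into a hot body at T_h > T_c, the total entropy change would be ΔS = -δQ/T_c + δQ/T_h = δQ(1/T_h - 1/T_c) < 0, since 1/T_h < 1/T_c. That decrease is exactly what the second law forbids, so spontaneous heat flow from cold to hot is ruled out.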
Perhaps there IS a way to reverse entropy, but we haven’t figured it out yet. This is the entire crux of Isaac Asimov’s short story The Last Question, which you can read in full here.
(I came SOOO close to posting the last chapter here, but I couldn’t bring myself to spoil it for y’all. Asimov fans, if you haven’t read this one yet, don’t deny yourself the pleasure.)