Entropy? Siri is no help

We had the second law before we even knew the universe was more than a single galaxy of stars.

Everything leads to maximum entropy, eventually.

As to how the accelerating expansion of the universe and entropy are linked… well, that is a very good question.

I don’t want to get myself into trouble here by going into too much detail on things I don’t fully grok, but the expansion of the universe is an arrow of time, as is entropy. Back when we thought it was possible that the universe would end up contracting to a point (not all that long ago, barely over two decades), there were hypotheses about what would happen if those two arrows of time pointed in opposite directions. Someone mentioned Hawking’s book, “A Brief History of Time”; the original edition was published before we knew that the universe’s expansion was accelerating, and it talks about this a bit.

Without expansion, you get a bunch of Boltzmann brains and other statistical anomalies, even whole new universes. Essentially, if you put a bunch of particles in the corner of a box, they will not want to stay there; they will want to spread out. But wait long enough (a very, very, very long time) and eventually they will all be back in the corner of the box. They will form complex structures purely by chance. It won’t happen often, but it will happen occasionally over an infinite period of time.

With expansion, most of that goes away. If the box is always growing, the particles will almost certainly never all be back in one place again, and they will almost certainly never spontaneously form complex structures at random. They might, but it is extremely unlikely, even over an infinite period of time.

In a way, I see it as if you have a big grid of quarters. You start them all heads, and that’s pretty low entropy. Anything you can do will increase the entropy. But, given enough time, you will see any pattern you look for, including all heads and all tails.

OTOH, if every time you iterate your grid you also add a new row or column to it, then even over infinite time you are not guaranteed to get complex patterns; in fact, it’s rather unlikely.
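
To put rough numbers on that, here’s a toy Python sketch (my own back-of-the-envelope, assuming every coin gets re-flipped independently each step, which isn’t real thermodynamics but captures the counting): for a fixed grid, the expected number of “all heads” sightings grows without bound as you keep flipping, while for a grid that gains one cell per step it stays bounded, so even infinite patience doesn’t guarantee you ever see the pattern.

    from math import isclose

    def expected_all_heads_fixed(cells: int, steps: int) -> float:
        """Expected 'all heads' sightings for a fixed grid of `cells` coins
        over `steps` independent re-flips: steps * (1/2)**cells."""
        return steps * 0.5 ** cells

    def expected_all_heads_growing(start_cells: int, steps: int) -> float:
        """Same, but the grid gains one cell per step, so the target
        pattern gets exponentially harder to hit as time goes on."""
        return sum(0.5 ** (start_cells + t) for t in range(steps))

    # Fixed 4x4 grid: wait long enough and "all heads" is effectively certain.
    print(expected_all_heads_fixed(16, 10**6))    # ~15.3 expected sightings

    # Growing grid starting at 16 cells: no matter how many steps you take,
    # the expected count converges to 2 * (1/2)**16, about 0.00003, so you
    # will almost certainly never see it.
    print(expected_all_heads_growing(16, 10**6))  # ~3.05e-05
    print(isclose(expected_all_heads_growing(16, 10**6),
                  2 * 0.5 ** 16, rel_tol=1e-9))   # True: the sum has converged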

A suffusion of yellow.

There are three known arrows of time in physics: the thermodynamic one (the future is when entropy is higher), the cosmological one (the future is when the Universe is bigger), and a very subtle, slight one from particle physics that is hard to even put into words. All of the other arrows of time that people describe end up being consequences of the thermodynamic one. It’s occasionally speculated that there might be some connection between those three, but every indication we have is that they’re entirely unrelated.

Have you tried putting all the hot things on one side and the cold things on the other side?

The McDLT approach to entropy reduction!

I assume you are joking, but – theoretically, that WOULD reduce entropy, and COULD happen by chance; the problem is that the more hot things you have on one side, the less likely they are to REMAIN on that side.

But hypothetically, if you took a big room filled with air molecules, you could end up with all of the particles on one side of the room by random chance alone. The odds are just astronomically tiny, and the more particles you get on that side, the harder it is for another particle to join them.
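
Just to put a number on “astronomically tiny” (my own rough figures, assuming each molecule independently has a 50/50 chance of being in either half of the room, which ignores interactions but is fine for the order of magnitude):

    from math import log10

    # Rough molecule count for an ordinary ~30 m^3 room: roughly 2.5e25
    # air molecules per cubic metre at room temperature and pressure.
    n_molecules = 30 * 2.5e25

    # P(all in one half) = (1/2)^N, far too small to print directly,
    # so report its base-10 logarithm instead.
    log10_p = n_molecules * log10(0.5)
    print(f"P(all on one side) ~ 10^({log10_p:.3e})")
    # -> P(all on one side) ~ 10^(-2.258e+26)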

If you waited around for many times the length of the universe’s lifetime, it would eventually happen, though… right?

That’s what Boltzmann believed.

An expanding universe, or an expanding room, kinda ruins that.

I am. Also, the universe doesn’t really have ‘sides’.

That’s literally the same example my physics professor used (well, he said it could all end up in one corner, but still).