Thermodynamics and the Big Crunch

One of the Very Big Laws governing the universe is a concept in Thermodynamics:
Order decays into entropy.

One of the possible Ends of the universe is the Big Crunch: gravity counteracts the outward expansion of matter from the Big Bang, and the universe collapses into a quantum singularity.

Wouldn’t that violate the entropy principle?

All space, time, matter, and energy in one neat little point is much too orderly.

In what follows, I’m trusting Wikipedia articles with cites from Physics textbooks.

Descriptions of entropy in terms of “order” and “disorder” are misleading. Entropy is a measure of differences in temperature, pressure, and density within a closed physical system.* The greater these differences within a system, the lower that system’s entropy.
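To put a number on that (my own illustration, not something from the article): if a small amount of heat Q leaks from a hot region at temperature T_hot to a cold region at T_cold, the total entropy change is

\Delta S = \frac{Q}{T_{\text{cold}}} - \frac{Q}{T_{\text{hot}}} > 0,

which is positive precisely because the two temperatures differ. Once everything sits at one uniform temperature, no further transfer can raise the entropy, and that uniform state is the maximum-entropy state.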

Given that, we can see that the size of a system does not, by itself, determine its entropy. Even a very, very dense physical system (such as would exist in the “Big Crunch”) can be as entropic as it is possible for something to be, if its density (and temperature and pressure) is uniform throughout. (Note that black holes have the highest possible entropy for their mass and volume.)
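(If you want the formula, and assuming I’m remembering the Bekenstein-Hawking result correctly, a black hole’s entropy is

S = \frac{k_B c^3 A}{4 G \hbar},

where A is the surface area of the event horizon. So the entropy is set by the hole’s size, not by any visible “disorder” inside it.)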

A couple of caveats, gleaned from well-sourced Wikipedia articles. (I failed to log them while I was reading, but just doing a search for “entropy” should get you to them.)

Some people argue that the laws of thermodynamics do not apply to the universe as a whole. I don’t know why this is, but I can guess that at least some of these arguments involve the notion that the universe is not a closed system. Some of the latest theories (perhaps better labeled “speculations”) postulate that the universe, the thing that underwent our Big Bang, is not alone, and that there are other universes (and stranger things like “branes”) external to it and able to exert influences on it. So the universe may not be a closed system. (I reiterate: I do not know whether that is the basis on which some people argue that the laws of thermodynamics don’t apply directly to the universe as a whole; I’m just guessing it might be one of the bases for such an argument.)

Another caveat: the laws of thermodynamics apparently apply only to relatively macroscopic phenomena. Past a certain scale (unsurprisingly, the Planck scale) the concepts involved in those laws no longer have a clear meaning.
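(For a sense of what “the Planck scale” means quantitatively, the Planck length is

\ell_P = \sqrt{\frac{\hbar G}{c^3}} \approx 1.6 \times 10^{-35} \text{ m},

if I have the expression right, which is far smaller than anything for which concepts like temperature and pressure are ordinarily defined.)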

I’m gonna get so clobbered for posting the above. Well, the experts will be along soon.

-FrL-
*(There is also a conception of entropy which uses the notion of “information” (less information equals more entropy), but the jury is still out as to whether this kind of entropy is related to, identical with, or completely different from thermodynamic entropy.)
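(For what it’s worth, the information-flavored version is usually written, if I’m recalling the convention correctly, as

S = -k_B \sum_i p_i \ln p_i,

where p_i is the probability of the system being in microstate i. Drop the k_B and use base-2 logarithms and you get Shannon’s entropy in bits, which is where the “information” reading comes from.)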

It’s not especially neat. It’d be pretty much impossible to have all the matter in the universe stay matter at that point, so much of it would have decayed to energy. Energy is much more entropic than orderly matter. Matter does not like being matter and would much rather be energy.

Also, according to string theory, there really is no such thing as a “singularity,” just a very, very, very, VERY tiny universe. So it’s not a point, which would make it less orderly. But I’m no physics major, so I couldn’t explain it in any way that would either make sense or be guaranteed to be 100% correct.

The information definition is the only one actually used in current physics, and is the definition for which the Second Law applies. The older definition involving temperatures and such is still true, but that’s largely because temperature (along with all the other thermodynamic quantities) is also now redefined in information-theory terms.
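The usual bridge between the two, roughly speaking: in the statistical formulation, temperature is defined from entropy by

\frac{1}{T} = \frac{\partial S}{\partial E}

(holding volume and particle number fixed), so statements about temperature differences become consequences of the information-style definition rather than a separate starting point.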

I’d heard entropy was the measure of how much work can be done by a closed system. How is the ability to do work regarded in terms of information?

“Doing work” is an abstraction (in the information sense) of organizing or otherwise modifying information. For instance (speaking somewhat metaphorically for the purpose of illustration), if all of your books are in a box on the floor, you have to do work upon them to arrange them on a shelf. In doing so, however, you’ve created further disorder by turning the complex chemicals that store energy in your body (adenosine triphosphate, or ATP) into heat and mechanical energy, which raises the overall temperature of the room, assuming that the room is adiabatic (perfectly insulated). So work creates order in one area of a system by compromising it in another area. In sum total, you never end up with as much useful energy as you started out with, and you always have more entropy (Rule #2). And you can’t even stop if you want to; the harder you try not to move, the worse you’re going to make it (Rule #3); it’s like herding cats.
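To put rough, made-up numbers on the book example: dissipate about 30 kJ of ATP-derived energy as heat into a room at roughly 295 K and the room’s entropy rises by about

\Delta S \approx \frac{3 \times 10^4 \text{ J}}{295 \text{ K}} \approx 100 \text{ J/K},

which swamps the tiny entropy decrease you bought by putting a few dozen books in order, so the total still goes up.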

Frylock, the laws of thermodynamics don’t apply only to macroscopic phenomena; indeed, thermodynamic principles are key to understanding why quantum mechanical phenomena aren’t exhibited on the everyday scale; see the thermodynamic limit and quantum decoherence. (Neither of these is the best-written treatment of its topic, but I’m not going to dig through better texts and sum up their statements today.) One has to restate the laws of thermodynamics to use them in quantum systems, because the classical quasi-equilibrium thermodynamics (what Chronos is referring to as the “older definition”) and statistical mechanics approaches don’t apply to individual quantum particles, but the laws are equally valid on the very tiny scale of fundamental particles, too.

Note that the classical thermodynamics approach is still approximately valid for most practical uses, like designing a steam cycle, making chemicals, and building meteorological models, and for the most part the only people really concerned about the statistical approach are physicists and engineers working in fields where they’re dealing with phenomena that can’t be approximated via continuum mechanics, like energetic plasmas or spectral emissivity. Quantum mechanical thermodynamics is of interest to quantum physicists (obviously) and to information theorists working with quantized information (information theory being a field of discrete mathematics).

Enola Straight, all matter and energy crammed into one single point (or homogeneously into a very, very small universe) isn’t necessarily orderly at all, particularly if it “has no free choice” (so to speak) about being there, i.e. it can’t escape. Gravitational singularities and the event horizons that envelop them (black holes) are volumes of maximal local entropy; the only way they can “gain” more entropy is to become larger, hence the entropy of a nonrotating black hole is directly proportional to the surface area of its event horizon. All of the information that goes into them (that is, all the matter and energy that falls past the event horizon) is gone forever from the universe, even in image or effect, leaving the black hole with only three fundamental properties (mass, spin, and charge), a state otherwise described by John Wheeler’s famous observation that “black holes have no hair.”

Orderly (in a thermodynamic sense) is all about keeping the hot side hot and the cool side cool, just like that McDLT sandwich that McDonald’s was trying to promote way, way back. Unfortunately, the geniuses at Hamburger University neglected to cover basic thermodynamic principles and stuck both parts of the sandwich in the same box, and it always came out in a limp, lukewarm equilibrium state.
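To make that concrete with made-up numbers: take a hot half at 330 K and a cool half at 280 K, each with the same heat capacity C. Sealed in one box, they settle at the average, 305 K, and the total entropy change is

\Delta S = C \ln\frac{305}{330} + C \ln\frac{305}{280} = C \ln\frac{305^2}{330 \times 280} \approx 0.0067\, C > 0,

so the limp, lukewarm sandwich really is the higher-entropy state.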

Stranger

Oh, and the current muttering by cosmologists is that the Universe won’t end in a Big Crunch, but rather in a Big Rip or a Big Freeze, where everything either becomes too widely separated or ends up with too low a temperature differential for information to be conveyed from one locality to another. Think of living in North Dakota, surrounded by other copies of North Dakota, which are in turn surrounded by further copies of North Dakota, ad infinitum. There’s just no point in going anywhere if it’s all just the same as where you’re at.

Stranger

Stranger, I said nothing of the sort. :mad: :rolleyes:

-FrL-

For the record, I got the idea that the 2nd law doesn’t apply past a certain scale here, which article in turn claims to have gotten that notion from the following work: Landau, L.D.; Lifshitz, E.M. (1996). Statistical Physics, Part 1. Butterworth-Heinemann.

Can anyone educate me as to how the author of that Wikipedia article has probably misunderstood the Landau text, or, alternatively, how I have misunderstood the Wikipedia article?

-FrL-

It might be good, then, if you ever get a chance, to edit the Wikipedia article on Entropy, which says:

Do you know why someone might have said the above? Is it just wrong, or is it misleading, or have I simply misunderstood it somehow?

-FrL-

Entropy increase is not an absolute law; it’s a statistical phenomenon. The number of high-entropy states is larger than the number of low-entropy states, so they are statistically far more likely.
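(The standard way to make that precise, as I understand it, is Boltzmann’s relation

S = k_B \ln \Omega,

where \Omega is the number of microstates compatible with what you can observe macroscopically. Toy example: with 100 coins there is exactly one way to be “all heads” but roughly 10^{29} ways to be “50 heads, 50 tails,” so the evenly mixed macrostate is overwhelmingly more probable.)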

My understanding is that once the universe stops expanding and starts collapsing, time will run backwards, so the law regarding entropy will still hold, since you will have a negative value for time.

Well, I can say that I’ve never heard any debate on the topic. The closest I’ve heard to such a debate is the question of whether and how black hole entropy relates to the more familiar sort, and even that debate is much more about “how” than “whether”. All I can figure is that the author of that Wikipedia article must have been working from outdated information: There must have been a time when the information-theory model was still new and therefore debated, and if that’s when the author was educated on the subject, and e isn’t up on current work, e might pass on that information.

On the other hand, I don’t work much directly with thermodynamics (except for some with black hole thermodynamics; see above), so it’s possible that there is still such a debate, just in other spheres than the ones I move in.

Back to the OP: One needn’t worry about the entropy of the Universe at the Big Crunch singularity, nor about much of anything else there, since at a singularity, things break down such that the questions can’t even be asked. When asked a question about a singularity, the best answers are generally “I dunno”, “Huh?”, or “So, how about them Yankees?”. One can ask such questions about times which approach the singularity, but for such times, there is still space for things to move about (albeit not comfortably), and the answer is that there’s plenty of entropy.

kanicbird, the notion of entropy reversing itself at the moment of maximum expansion is a fun one to play around with, but there’s no evidence for it whatsoever, and it would lead to some mighty strange conclusions. If that were the way the Universe worked, it would be exactly equivalent to the Universe simply ceasing to exist abruptly at the moment of maximum expansion, with no warning whatsoever.

I’m sure I read somewhere that the laws of physics work the same with time running forwards or backwards. Since the direction of time is defined by the direction of entropy, if time were running backwards entropy would be too, and we wouldn’t notice a damn difference.

Though that was a long time ago, possibly in another universe.

Right, which is why such a reversal would be indistinguishable from the Universe coming to an abrupt end. The folks on either side of the middle time would notice exactly the same things, remember exactly the same things, and do exactly the same things, so it’d be just as if they’re the same people, simultaneously approaching the middle time from both directions. But nobody would ever notice any time after the middle time.