Explain Entropy

Entropy (at least in the Shannon sense, and thus I assume similarly for the Boltzmann sense as well) isn’t logarithmically related to probability as a mere matter of convenience for handling probabilities spanning many orders of magnitude; entropy is specifically logarithmically related to probability because independent events multiply probability under conjunction, and we’d like to speak of this as adding entropy.
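To make that concrete, here's a minimal sketch (my own numbers, nothing from the thread) showing that when independent events multiply their probabilities, the corresponding surprisals/entropies add:

```python
import math

# Two independent events: probabilities multiply under conjunction,
# while the corresponding -log(probability) values simply add.
p_a, p_b = 0.5, 0.25                     # illustrative probabilities
s_a = -math.log2(p_a)                    # 1 bit
s_b = -math.log2(p_b)                    # 2 bits
s_joint = -math.log2(p_a * p_b)          # 3 bits
assert math.isclose(s_joint, s_a + s_b)  # additivity is exactly what the log buys you
```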

So… I’m still curious: what is a macroscopic quantity? You’ve given examples, but not a definition.

For every possible way that atoms could move forward in time from a lowly mixed state to a highly mixed state, there’s a corresponding way (just reverse all the motions; particle kinetics are described by time-symmetric laws, right?) that atoms could move forward in time from a highly mixed state to a lowly mixed state. So it can’t be just automatic that the mixedness of the atoms will tend to rise over time. This can’t be explained as a probabilistic tautology; there must be some hidden assumption (I’m told it has to do with boundary conditions at the Big Bang, but what do I know? All I know is that the handwavy statistical argument people usually give is too glib).

In your case you should care because you are going to melt.

There simply are many more high entropy states than there are low entropy states, so any ‘random step’ in the evolution is far more likely to go towards higher entropy. Ultimately, the second law doesn’t mean anything but ‘more likely states obtain more often’. The boundary conditions of the universe only come into play to explain why the entropy was ever low to begin with; a randomly selected state for the universe would be expected to be high entropy.
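A toy count (my own illustration) of why "more likely states obtain more often": take N particles that can each sit in the left or right half of a box, and compare the evenly spread macrostate with the clumped one:

```python
from math import comb, log

# Toy model: N distinguishable particles, each in the left or right half of a
# box.  Treat "n particles on the left" as the macrostate; its microstate
# count is C(N, n).
N = 100
clumped = comb(N, 0)        # all on the left: exactly 1 microstate
spread  = comb(N, N // 2)   # 50/50 split: about 1e29 microstates

print(spread / clumped)           # the even split is ~1e29 times more likely
print(log(clumped), log(spread))  # entropies (in units of k_B): 0 vs ~66.8
```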

As for macroscopic states, there’s indeed some ambiguity. The crucial thing is perhaps that once you have defined a notion of macroscopic quantity you care about, then entropy is unambiguous.

I’m running out of time here, perhaps I can elaborate later on…

That reasoning would be equally true for a random step backwards in time as for a random step forwards in time. It would be just as supportive of “Entropy tends to decrease over time” as “Entropy tends to increase over time”.

"INSUFFICIENT DATA FOR MEANINGFUL ANSWER. "

– Isaac Asimov, “The Last Question”

:smiley:

The first part is right, but I don’t see how this implies that entropy would decrease over time – basically, one should expect the entropy to be higher towards both the past and the future, not lower towards the future. To then explain why the entropy was low in the past, one invokes the universe’s boundary conditions.

They’ve lost gravitational potential energy by clustering close together, but their kinetic energy has increased. That’s a reversible process, so entropy has not particularly increased.

The second law of thermodynamics says that in an isolated system, entropy can only increase or remain constant.

A jar of liquid water in intergalactic space will cool and freeze. Because this involves heat transfer between the water (at 273 K) and the surrounding intergalactic space (at about 3 K), the jar of water cannot be considered in isolation; the “system,” for second-law purposes, is the jar of water plus the surrounding intergalactic space to which heat is being pissed away. The jar of liquid water, when it freezes, experiences a decrease in entropy, but the surrounding intergalactic space, by receiving all that heat from the liquid water, experiences an increase in entropy which more than offsets the decrease in entropy of the jar of water. The net change in entropy for this heat transfer event is greater than zero.
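A rough back-of-the-envelope version of that bookkeeping (numbers are mine, just for illustration):

```python
# Freeze 1 kg of water at 273 K, dumping the latent heat into ~3 K surroundings.
Q       = 334e3      # J, latent heat of fusion for 1 kg of water (~334 kJ/kg)
T_water = 273.0      # K
T_space = 3.0        # K, rough temperature of intergalactic space

dS_water = -Q / T_water     # ~ -1.2e3 J/K  (the jar's entropy drops)
dS_space = +Q / T_space     # ~ +1.1e5 J/K  (space's entropy rises far more)
print(dS_water + dS_space)  # net change is positive, as the second law demands
```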

“Given time, everything will go to shit.”

Well, higher towards past = lower towards future…

But the counting argument doesn’t really show “Entropy tends to rise as one steps backwards/forwards in time”; it shows “Entropy tends to be high at any moment”, such that if entropy is in fact actually low at some moment, this tends to be an anomaly and entropy will tend to be high (and thus higher) at any other moment around it, before or after.

[This is all on the assumption that there are more ways to be high entropy than low entropy, which I suppose is dependent on what exactly macrostates are… for example, given a quadrillion macrostates each consisting of just one microstate (and thus low/zero entropy), and one macrostate consisting of a million microstates (and thus higher entropy), there are rather more ways to be low entropy than high entropy. But I gather this isn’t how macrostates work in practice; I still don’t have a good sense of what they amount to. I can only make flailing points about mathematical technicalities, because I still don’t understand the physics.]

Also I think that since the wearer was a small child, the implication is that the little one is quite helpful at creating disorder in the universe.

I’m not sure I see where your trouble with the concept lies, so on the danger of missing your point: macrostates are just defined by the variables you care about at the macroscopic level; for a gas, for instance, these would be pressure, temperature, volume, energy… I.e. the things you need to know if you want to describe a cycle of a machine doing work on the gas. Entropy then essentially is defined by state counting – there are many more ways for a gas to be spread more or less evenly than there are ways for it to clump, more ways to be at uniform temperature than not, and so on.

Entropy is a fun philosophical concept. Most of the responses so far have been from an information theory slant. I imagine most of my answer will be cosmological. Since entropy seems to be universal, the idea has worked its way into the observations of researchers working in many diverse fields - evolutionary biology, psychology, fractal universe structure, etc.

The short of it is - the universe started infinitely small and will end up infinitely large. The idea of entropy is one way we think of and measure this constant, never ending process. We assume the universe is itself a closed system, so while you will see spots of low entropy here and there like planets, galaxies and message boards - the universe as a whole is moving toward a high entropy level.

It’s important to remember that entropy is one of those constant universal forces that have been with us since very soon after the Big Bang. Gravity soon got itself going also, and will always be the natural enemy of entropy. The process we think of as “time” also came about in our universe at the same… well, time.

Now fast forward to a time when physicists and chemists have evolved enough in their studies of the natural world to be asking some practical questions that they just might be able to answer with their new technology. In the mid/late 1800s a common puzzler was:
“Why can’t I build a perpetual motion machine?” We noticed it may have something to do with friction and heat dissipation.

We soon found constants and equations that would work with what we were experiencing. Much of this work was with gas pressures. We could measure and predict the “spreading out” of gas in a closed environment, and do the same things with heat energy. Look into the Boltzmann constant (gas constant/Avogadro, or even energy/temperature) for other ways in which we were also trying to find equations that would work on a macroscopic level and on the newly discovered atomic level.
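For reference, the relation mentioned above, in numbers (standard values, not from the thread):

```python
R   = 8.314462618      # J/(mol*K), ideal gas constant
N_A = 6.02214076e23    # 1/mol, Avogadro's number
k_B = R / N_A          # Boltzmann constant, ~1.380649e-23 J/K
print(k_B)
```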

So that’s why the idea of entropy came about. You will observe the natural process of entropy going on around you all the time. Real physical objects will always get colder, and the clump of molecules that gives a thing its status as a physical object will spread out until it has eventually dispersed into elementary particles, or even smaller particles we haven’t discovered. Note that this may take a lot of time.

Thought experiment: If you take a small clear glass box and put a potted plant with a nice orange flower in it, perfectly sealed, no particles get in or out, what will happen to the object (plant) you put in there? The particles will “want” to find a state of equilibrium. The humidity level of the plant will equalize with the air, and the plant’s cells will break down and turn to sludge or, if the air is dry, to dust. To really see the effect of entropy you would need to take this box as far away from any concentrated energy as you can. Out in deep space the flower’s dust particles will not be influenced even by gravity, and you will see them spread out evenly in the box. Now if you open the box in deep space the particles will eventually spread out evenly across the entire universe. Everything else in the universe will also be at maximum entropy, and this will have taken, from our perspective, an infinite amount of time. After this it gets complicated.

So perhaps the most interesting philosophical aspect about entropy is that this process, as an important side effect of universal expansion, is key to our sensation of time.

Well, let me try to put the hole in my understanding this way:

Let Configs be the set of possible ways to choose positions and momenta for N particles.

There will be an equivalence relation on the set Configs saying when two configurations are macroscopically equivalent.

What is the definition of this equivalence relation, mathematically?

(For simplicity, let us take N = 2, if that makes sense. And if that does not make sense, why not?)

Misspelling intentional?

Entropy can be thought of as a tendency away from order toward chaos. Small children tend to accelerate this tendency. Ergo, children are “Entropy Elves”. They assist entropy.

:smack: no

Exactly.

(Although chaos has been co-opted to mean something besides disorder.)

If you’re implying that gravity somehow counteracts entropy, that’s not the case, even though under gravitational influence things tend to get lumpier, while ordinarily, when entropy increases, we think of everything getting spread out evenly. In fact, with regard to gravity, lumpy matter distributions have a higher entropy; a black hole, the lumpiest thing imaginable, has the maximum entropy possible within a (suitably defined) spacetime volume. So gravity, like every other process, acts towards increasing entropy (as it must).
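For anyone curious how big that maximum gets, here's a rough sketch using the standard Bekenstein–Hawking formula (my own example numbers), for a solar-mass black hole:

```python
import math

# Bekenstein-Hawking entropy S = k_B * c^3 * A / (4 * G * hbar),
# with horizon area A = 4 * pi * (2GM/c^2)^2.
G, c, hbar, k_B = 6.674e-11, 2.998e8, 1.055e-34, 1.381e-23
M = 2.0e30                                # kg, roughly one solar mass

r_s = 2 * G * M / c**2                    # Schwarzschild radius, ~3 km
A   = 4 * math.pi * r_s**2                # horizon area
S   = k_B * c**3 * A / (4 * G * hbar)     # ~1e54 J/K, i.e. ~1e77 in units of k_B
print(S, S / k_B)
```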

It does make sense, it’s just a bit of a small sample size, so that most of the thermodynamic statements you make about this system will carry rather huge error bars.

As for the equivalence relation on states in Configs (which physicists call ‘phase space’ for some odd reason), it’s basically something like ‘being at the same energy’. You’ll probably know that any system can be described by a quantity known as the Hamiltonian, which basically gives the system’s energy. For some definite value E, the constraint H = E picks out a surface in phase space, and the (micro-)states on that surface belong to the same macrostate. So in order to find the system’s entropy, you just have to count those states. Typically, if the system is in equilibrium, it will be in each of those microstates with equal probability; often, the set of probabilities {P[sub]r[/sub]} of being in microstate r is considered to be the system’s ‘macrostate’ in the stricter sense.
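A discretized toy version (entirely my own construction) of that 'same energy → same macrostate' equivalence relation, for the N = 2 case asked about earlier:

```python
from itertools import product
from math import log
from collections import Counter

# Discretize each particle's energy to an integer 0..3, and declare two
# configurations equivalent when their total energy E agrees.  Each value of
# E is then a macrostate; W(E) counts its microstates, and S = log W.
N = 2
microstates = list(product(range(4), repeat=N))
W = Counter(sum(m) for m in microstates)   # macrostate E -> microstate count

for E in sorted(W):
    print(f"E = {E}:  W = {W[E]}  S/k_B = {log(W[E]):.3f}")
```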

Ah, ok. Does that mean you can determine temperature, pressure, etc., just from the system’s energy?

No (temperature, for instance, is proportional to the average energy per particle, so you’d need to know at least the number of particles to make the conversion), but you can describe the same macrostate in terms of other variables (say, if it’s given in terms of energy and volume, you can go over to a description in terms of temperature and pressure, provided the ‘amount of stuff’, i.e. the number of particles, is constant).
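As a concrete instance of that conversion (monatomic ideal gas, my own numbers):

```python
k_B = 1.380649e-23      # J/K

def ideal_gas_temperature(E, N):
    """Monatomic ideal gas: E = (3/2) * N * k_B * T, so T = 2E / (3 N k_B)."""
    return 2.0 * E / (3.0 * N * k_B)

# One mole of particles with ~3.7 kJ of total kinetic energy sits near 300 K.
print(ideal_gas_temperature(3.7e3, 6.022e23))
```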

Sorry, let me rephrase the question (although you may have just answered it): can you determine the pressure, volume, etc., just from the system’s energy, supposing the number of particles (and their masses and whatnot, I suppose) are held fixed? That is, for a fixed collection of particles, all their macroscopic properties are functions of the Hamiltonian alone? To be a macroscopic property is to be a function of the Hamiltonian?

If so, great. I finally understand physical entropy.

The Hamiltonian fixes the complete behavior of a physical system, giving the equations of motion for all the particles. Say, in the ideal gas case, the Hamiltonian is just the sum of all the one-particle-in-a-box Hamiltonians (because we neglect interaction terms), so sure, you can derive all the macroscopic quantities from it.
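To spell out that ideal-gas case in code (a sketch under the stated no-interactions assumption; the numbers are made up):

```python
import numpy as np

# With interactions neglected, the Hamiltonian is just the sum of
# one-particle kinetic terms, H = sum_i p_i^2 / (2 m).
def hamiltonian(momenta, m):
    """momenta: array of shape (N, 3) in kg*m/s; returns total energy in J."""
    return np.sum(momenta**2) / (2.0 * m)

rng = np.random.default_rng(0)
p = rng.normal(scale=1e-23, size=(5, 3))   # 5 particles, thermal-ish momenta
print(hamiltonian(p, 4.65e-26))            # mass ~ one N2 molecule
```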