Explain Entropy

My IQ is 100. Explain entropy to me, and why I should care about it.

Edit: I’m looking for a philosophical slant rather than a purely technical view of the concept.

A philosophical slant to entropy?
I’m posting just to bookmark this thread because I’m curious myself just what that might be.

It is a technical concept of physics, though, invented by physicists, not a philosophical concept either in origin or main use. Any philosophical uses of it (unless they are detailed, technical analyses of exactly how the concept is used by physicists, and I very much doubt if that is what you are looking for) are merely metaphors, and usually pretty loose ones at that.

It is used as a metaphor for disorder and lack of motivation. For the real meaning, ask a physicist.

You want entropy explained with a philosophical slant?

I’ll take any kind of slant.

Well, there is no reason for you to be personally concerned about it, any more than you are personally concerned about, say, gravity. I would be amused to see a Troy McClure-style PSA now, though: “Come back, entropy!”

The universe started with a bang, and that bang is slowly getting bigger and colder. Eventually all the usable energy will be gone.

Entropy is a loss of organization and usefulness. Since energy is neither created nor destroyed, it doesn’t make sense to talk about a closed system having less energy at one point in time than another, but as the energy becomes less useful, we can talk about entropy going up.

For example, take burning a gallon of gas. You take highly concentrated and organized chemical energy in the gas and you spread it all over the place: heat in the engine and exhaust diffuses until it reaches room temperature, the chemicals in the gas are reduced to molecules with less potential energy (i.e. you can’t burn H2O and CO2 and get more energy out), and a pure liquid is now pinging around the atmosphere as a mixed gas. The energy from the gasoline has not disappeared, but it has become useless. (Or, at least, less useful.)
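
To put a rough number on that: when heat Q leaves something hot and ends up in cooler surroundings, total entropy rises by Q/T_cold - Q/T_hot. Here’s a minimal sketch; the heat and temperature values are made-up ballpark figures, not real engine data:

```python
# Entropy change when heat Q flows from a hot body into cooler
# surroundings: dS = Q/T_cold - Q/T_hot (temperatures in kelvin).
# The numbers below are illustrative, not real engine data.

Q = 120_000.0    # joules of heat released (made-up figure)
T_hot = 900.0    # kelvin, roughly combustion-gas temperature
T_cold = 300.0   # kelvin, roughly room temperature

dS = Q / T_cold - Q / T_hot
print(f"Entropy increase: {dS:.1f} J/K")  # positive, so entropy went up
```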

Is dissipated heat always less useful than something such as chemical energy or mechanical energy? Don’t we have to heat our houses in the winter?

Dissipated heat, in the relevant sense, is completely useless; so yes.

Well, ‘useful’ is a fuzzy term here, but ‘work’ has a precise meaning in physics that also relates to practical situations, I think.

Burning something to heat your home can be considered work. That heat staying in your home is not work, it’s just inertia. And you can’t take that heat and bundle it up into fuel to burn somewhere else - not without spending more energy in the bundling process than you’d get out of burning it.

The only work that heat in your house can do, at that point, is move from your house to heat somewhere that’s colder than your house is, and so on. You can’t use it to heat someplace that’s already warmer without fighting the thermodynamics (as an air conditioner does), and you use more energy in that process.
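
To put a number on “fighting the thermodynamics”: an ideal (Carnot-limit) heat pump must spend at least W = Q·(T_hot - T_cold)/T_hot of work for every Q of heat it pushes into the warmer place. A quick sketch, with made-up indoor/outdoor temperatures:

```python
# Minimum work to pump heat "uphill" from cold to hot, using the
# Carnot coefficient of performance: COP = T_hot / (T_hot - T_cold).
# Temperatures in kelvin; the values are illustrative.

T_hot = 293.0           # ~20 C indoors
T_cold = 273.0          # ~0 C outdoors
Q_delivered = 10_000.0  # joules of heat we want inside

cop = T_hot / (T_hot - T_cold)  # ideal heat-pump COP
W_min = Q_delivered / cop       # least work we must spend
print(f"Ideal COP: {cop:.1f}, minimum work: {W_min:.0f} J")
```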

I hope that shed some light and wasn’t too inaccurate. :smiley:

If I could hijack the thread, could someone also explain physical entropy in a technical sense? Specifically, in what sense is physical entropy an objective, non-describer-dependent, quantity? Can entropy be measured with a device?

I am aware of the Shannon entropy of a probability distribution, and how one might apply this to a thermodynamic situation with some business about probability distributions of microscopic states conditioned on macroscopic states, but what exactly distinguishes a state/property as macroscopic, and what exactly do probabilities mean in the context of deterministic particle collisions?

(I’ve probably asked this on these boards before, but I don’t recall it ever becoming clear to me. And it’s possible that, for all I know, my same qualms about entropy as it’s been explained to me would apply just as well to, say, temperature or such things. I don’t know very much.)
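
Not an answer to the macroscopic/microscopic question, but for anyone who wants the Shannon entropy part made concrete: for a probability distribution it’s just H = -Σ p·log2(p), measured in bits. A minimal sketch:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries a full bit of uncertainty;
# a heavily biased one carries less.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.469
```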

Well, dissipated heat has less potential energy. If you’re really interested, see heat engine.

Entropy should really be described as the loss of potential energy.

Philosophically it is a counterpoint to steady-state concepts. There’s always loss over time. The concept gets borrowed from thermodynamics and applied to information theory, so there’s some philosophy about how those two different meanings correlate. Otherwise it’s like magnets and siphons: nobody knows how those really work :).

As long as we’re looking for philosophy, can someone explain this part of it to me?

Why do we talk about entropy constantly going UP? Wouldn’t it be simpler and more intuitive to talk about entropy going DOWN, as in the phrase, “The universe is running down.” I don’t get it.

(Don’t misunderstand me. I’m not saying that those who talk about entropy going up are wrong. I’m asking why it was defined that way. When the concept of entropy was formulated, they could just as easily have defined it as a mirror image of how they chose, and I’m curious why they chose this instead of that, given that down is so much more intuitive. Well, intuitive to ME, at least — why not to them?)

Many concepts have an inverse. In electricity we talk of conductivity and its inverse, resistance. I may be using the word inverse incorrectly, but I hope you get where I’m going.

To my understanding entropy is the inverse of potential energy. It’s often described as “equilibrium/average/homogenization/dissipation”, and this makes sense in a thermodynamic sense, but it’s a little confusing in the overall picture. One thing I’ve heard is that matter tends to clump together in galaxies, and this proves that entropy as in “equilibrium/average/homogenization/dissipation” is false. However, if you think of entropy as the loss of potential energy, it makes sense.

When my children were very young, every mall had t-shirt booths where you could get almost anything put on a t- or similar shirt. I got each of them a baseball shirt with “ENRTOPY ELF” ironed onto the front. Only ONE person ever got the joke. So, to the student running the cryo demonstration for the chem department one Picnic Day at UC Davis: Thank You. Your chuckle was appreciated.

It’s sort of like the problem we’d have if our temperature scale went down instead of up. I.e., if higher heat meant a smaller number on the scale. At some point the scale would hit zero, and then you’d have to go negative for even hotter temps.

We already have a mess of this problem with Celsius and Fahrenheit. That’s why Kelvin is so useful in Physics. Good old PV = nRT in Kelvin is nice. In C or F it’s going to be a lot uglier. And that’s just one equation.
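
For instance, a quick sketch of that one equation, with illustrative gas quantities: in kelvin it’s a clean P = nRT/V, while in Celsius you’d have to drag the +273.15 offset through every formula:

```python
R = 8.314  # gas constant, J/(mol*K)

n = 1.0         # moles (illustrative)
V = 0.0224      # cubic metres (illustrative: ~22.4 L)
T_kelvin = 273.15

P = n * R * T_kelvin / V                     # clean in kelvin
T_celsius = 0.0
P_again = n * R * (T_celsius + 273.15) / V   # offset clutter in Celsius
print(f"{P:.0f} Pa vs {P_again:.0f} Pa")     # same answer, uglier formula
```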

You can have zero entropy (at least theoretically). So, that’s a good place to have a fixed point on a scale and move away from that. And then it also works well in keeping the equations simple.

My take on entropy. Imagine a box with a divider down the middle. Helium atoms on one side, Neon on the other. There’s some “order” to it having them split up like that. Remove the divider and the atoms mix. There is now less order. Entropy has increased.

But, it’s more about Thermo than this usually. If the two sides had differences in temps, you remove the divider and they now even out. That’s how entropy is usually thought of in Physics.
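
That box picture can even be made quantitative: for ideal gases, pulling the divider gives an entropy of mixing of ΔS = -R·(n_He·ln(x_He) + n_Ne·ln(x_Ne)), where the x’s are mole fractions. A sketch with made-up amounts (one mole of each):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

# One mole of helium on one side, one mole of neon on the other
# (illustrative amounts). After mixing, each has mole fraction 0.5.
n_he, n_ne = 1.0, 1.0
x_he = n_he / (n_he + n_ne)
x_ne = n_ne / (n_he + n_ne)

dS_mix = -R * (n_he * math.log(x_he) + n_ne * math.log(x_ne))
print(f"Entropy of mixing: {dS_mix:.2f} J/K")  # ~11.53 J/K, positive
```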

About “useless” heat. People think that having heat, any kind of heat means you can power an engine and do something. Wrong. You need a heat difference. Something cooler for the heat to flow into. That flow is what you tap to run your engine. If you found a planet that was 1000 degrees uniformly, you couldn’t put a steam engine on it and get it to run. But you can run an engine (using a gas with a low condensation point) on a freezing planet if there is a still colder sink close by the “merely” freezing place.
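
That “you need a heat difference” rule is exactly the Carnot efficiency limit: the best possible engine converts at most 1 - T_cold/T_hot of the heat into work, which is zero when the temperatures are equal. A sketch:

```python
def carnot_efficiency(t_hot, t_cold):
    """Maximum fraction of heat convertible to work (temps in kelvin)."""
    return 1.0 - t_cold / t_hot

print(carnot_efficiency(1000.0, 1000.0))  # 0.0: uniform heat, no engine
print(carnot_efficiency(250.0, 100.0))    # 0.6: cold, but a colder sink works
```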

Increasing entropy means you have less flow going on to do anything useful with. Note that the “useful” stuff means making entropy go the other way for a bit somewhere else. Like grow some food.

As a computer person, it is of interest since it also entails loss of information, which is another discussion.

Note that in Thermo, randomness happens. Things can become more ordered. It’s just so unlikely and fleeting at the macro level you can ignore it. Not so at the atomic level.

That makes it easier to understand. Thank you.

I’ve also seen entropy defined as randomness. An often presented example of a gain in entropy is the melting of ice into water. Ice: molecules are locked in a crystal lattice. Water: molecules are moving around freely. There is an increase in entropy because there is an increase in randomness of the (location of the) molecules …?
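
Pretty much, yes. And it’s one of the easiest entropy changes to actually compute, since melting happens at a constant temperature: ΔS = Q/T, with Q the latent heat. A sketch for one gram of ice, using the standard latent heat of fusion of about 334 J/g:

```python
# Entropy gained when ice melts at its melting point: dS = Q / T.
L_fusion = 334.0   # J per gram, latent heat of fusion of ice
T_melt = 273.15    # kelvin
mass = 1.0         # grams

dS = mass * L_fusion / T_melt
print(f"Entropy increase: {dS:.2f} J/K")  # ~1.22 J/K per gram
```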

This is very nice and helps a lot. But, they’re not equivalent, are they? What, if anything, is lost using such a much more ‘down to earth’ definition?