The Straight Dope

  #1  
Old 07-19-2012, 02:51 PM
IceQube
BANNED
 
Join Date: Feb 2012
Location: your 6
Posts: 759
Explain Entropy

My IQ is 100. Explain entropy to me, and why I should care about it.
  #2  
Old 07-19-2012, 02:56 PM
IceQube
BANNED
 
Join Date: Feb 2012
Location: your 6
Posts: 759
Edit: I'm looking for a philosophical slant rather than a purely technical view of the concept.
  #3  
Old 07-19-2012, 03:09 PM
zoid
Charter Member
 
Join Date: Sep 2001
Location: Chicago Il
Posts: 7,024
A philosophical slant to entropy?
I'm posting just to bookmark this thread because I'm curious myself just what that might be.
  #4  
Old 07-19-2012, 03:16 PM
njtt
Guest
 
Join Date: Jul 2004
It is a technical concept of physics, though, invented by physicists, not a philosophical concept either in origin or main use. Any philosophical uses of it (unless they are detailed, technical analyses of exactly how the concept is used by physicists, and I very much doubt if that is what you are looking for) are merely metaphors, and usually pretty loose ones at that.

It is used as a metaphor for disorder and lack of motivation. For the real meaning, ask a physicist.
  #5  
Old 07-19-2012, 03:19 PM
PacifistPorcupine
Guest
 
Join Date: Jun 2011
You want entropy explained with a philosophical slant?
  #6  
Old 07-19-2012, 03:20 PM
IceQube
BANNED
 
Join Date: Feb 2012
Location: your 6
Posts: 759
I'll take any kind of slant.
  #7  
Old 07-19-2012, 03:21 PM
grude
Guest
 
Join Date: Dec 2011
Well, there is no reason for you to be personally concerned about it, any more than you are personally concerned about, say, gravity. I would be amused to see a Troy McClure-style PSA now, though: "Come back, entropy!"

The universe started with a bang, and that bang is slowly getting bigger and colder. Eventually all the usable energy will be gone.
  #8  
Old 07-19-2012, 03:21 PM
dracoi
Guest
 
Join Date: Dec 2008
Entropy is a loss of organization and usefulness. Since energy is neither created nor destroyed, it doesn't make sense to talk about a system having less energy at one point in time than another, but as the energy becomes less useful, we can talk about entropy going up.

For example, consider burning a gallon of gas. You take highly concentrated and organized chemical energy in the gas and you spread it all over the place - the heat in the engine and exhaust diffuses until it reaches ambient temperature, the chemicals in the gas are reduced to molecules with less potential energy (i.e. you can't burn H2O and CO2 and get more energy out), and a pure liquid is now pinging around the atmosphere as a mixed gas. The energy from the gasoline has not disappeared, but it has become useless. (Or, at least, less useful.)
  #9  
Old 07-19-2012, 03:26 PM
IceQube
BANNED
 
Join Date: Feb 2012
Location: your 6
Posts: 759
Is dissipated heat always less useful than something such as chemical energy or mechanical energy? Don't we have to heat our houses in the winter?

  #10  
Old 07-19-2012, 03:29 PM
njtt
Guest
 
Join Date: Jul 2004
Dissipated heat, in the relevant sense, is completely useless; so yes.
  #11  
Old 07-19-2012, 03:35 PM
chrisk
Charter Member
 
Join Date: Nov 2003
Location: Southern ontario
Posts: 5,905
Quote:
Originally Posted by IceQube View Post
Is dissipated heat always less useful than something such as chemical energy or mechanical energy? Don't we have to heat our houses in the winter?
Well, 'useful' is a fuzzy term here, but 'work' is a term that has a precise meaning in physics and can also relate to practical situations, I think.

Burning something to heat your home can be considered work. The heat just staying in your home is not work. And you can't take that heat and bundle it back up into fuel to burn somewhere else - not without spending more energy on the bundling process than you'd get out of burning it.

The only work that the heat in your house can do, at that point, is move from your house to somewhere that's colder than your house, and so on. You can't use it to heat someplace that's already warmer without fighting the thermodynamics (as an air conditioner does), and you use more energy in that process.

I hope that shed some light and wasn't too inaccurate.
  #12  
Old 07-19-2012, 03:47 PM
Indistinguishable
Guest
 
Join Date: Apr 2007
If I could hijack the thread, could someone also explain physical entropy in a technical sense? Specifically, in what sense is physical entropy an objective, non-describer-dependent quantity? Can entropy be measured with a device?

I am aware of the Shannon entropy of a probability distribution, and how one might apply this to a thermodynamic situation with some business about probability distributions of microscopic states conditioned on macroscopic states, but what exactly distinguishes a state/property as macroscopic, and what exactly do probabilities mean in the context of deterministic particle collisions?

(I've probably asked this on these boards before, but I don't recall it ever becoming clear to me. And it's possible that, for all I know, my same qualms about entropy as it's been explained to me would apply just as well to, say, temperature or such things. I don't know very much.)

  #13  
Old 07-19-2012, 03:51 PM
zoid
Charter Member
 
Join Date: Sep 2001
Location: Chicago Il
Posts: 7,024
Well, dissipated heat has less potential energy. If you're really interested, see heat engine.

Entropy should really be described as the loss of potential energy.
  #14  
Old 07-19-2012, 03:53 PM
TriPolar
Member
 
Join Date: Oct 2007
Location: rhode island
Posts: 22,191
Philosophically it is a counterpoint to steady-state concepts: there's always loss over time. The concept gets borrowed from thermodynamics and applied to information theory, so there's some philosophy about how those two different meanings correlate. Otherwise it's like magnets and siphons; nobody knows how those really work.
  #15  
Old 07-19-2012, 04:21 PM
Keeve
Guest
 
Join Date: Aug 2000
As long as we're looking for philosophy, can someone explain this part of it to me?
Quote:
Originally Posted by dracoi View Post
... but as the energy becomes less useful, we can talk about entropy going up.
Quote:
Originally Posted by zoid View Post
Well dissipated heat has less potential energy. ... Entropy should really be described as the loss of potential energy.
Why do we talk about entropy constantly going UP? Wouldn't it be simpler and more intuitive to talk about entropy going DOWN, as in the phrase, "The universe is running down." I don't get it.

(Don't misunderstand me. I'm not saying that those who talk about entropy going up are wrong. I'm asking why it was defined that way. When the concept of entropy was formulated, they could just as easily have defined it as a mirror image of how they chose, and I'm curious why they chose this instead of that, given that down is so much more intuitive. Well, intuitive to ME, at least --- why not to them?)
  #16  
Old 07-19-2012, 04:40 PM
zoid
Charter Member
 
Join Date: Sep 2001
Location: Chicago Il
Posts: 7,024
Quote:
Originally Posted by Keeve View Post
As long as we're looking for philosophy, can someone explain this part of it to me? Why do we talk about entropy constantly going UP? Wouldn't it be simpler and more intuitive to talk about entropy going DOWN, as in the phrase, "The universe is running down." I don't get it.
Many concepts have an inverse. In electricity we talk of conductivity and its inverse, resistance. I may be using the word inverse incorrectly, but I hope you get where I'm going.

To my understanding entropy is the inverse of potential energy. It's often described as "equilibrium/average/homogenization/dissipation," and this makes sense in a thermodynamic context, but it's a little confusing in the overall picture. One objection I've heard is that matter tends to clump together into galaxies, which seems to contradict entropy as "equilibrium/average/homogenization/dissipation." If you think of entropy as the loss of potential energy, however, it makes sense.
  #17  
Old 07-19-2012, 04:42 PM
Yllaria
Charter Member
 
Join Date: Nov 2001
Location: Stockton
Posts: 7,659
Quote:
Originally Posted by IceQube View Post
I'll take any kind of slant.
When my children were very young, every mall had t-shirt booths where you could get almost anything put on a t- or similar shirt. I got each of them a baseball shirt with "ENRTOPY ELF" ironed onto the front. Only ONE person ever got the joke. So, to the student running the cryo demonstration for the chem department one Picnic Day at UC Davis: Thank You. Your chuckle was appreciated.
  #18  
Old 07-19-2012, 04:51 PM
ftg
Guest
 
Join Date: Feb 2001
Quote:
Originally Posted by Keeve View Post
As long as we're looking for philosophy, can someone explain this part of it to me? Why do we talk about entropy constantly going UP? Wouldn't it be simpler and more intuitive to talk about entropy going DOWN, as in the phrase, "The universe is running down." I don't get it.
It's sort of like the problem if we had a temperature scale that went down instead of up. I.e., if higher heat meant a smaller number on the scale. At some point the scale would hit zero. And then you'd have to go negative for even hotter temps.

We already have a mess of this problem with Celsius and Fahrenheit. That's why Kelvin is so useful in Physics. Good old PV = nRT in Kelvin is nice. In C or F it's going to be a lot uglier. And that's just one equation.

You can have zero entropy (at least theoretically). So, that's a good place to have a fixed point on a scale and move away from that. And then it also works well in keeping the equations simple.

My take on entropy. Imagine a box with a divider down the middle. Helium atoms on one side, Neon on the other. There's some "order" to it having them split up like that. Remove the divider and the atoms mix. There is now less order. Entropy has increased.

But, it's more about Thermo than this usually. If the two sides had differences in temps, you remove the divider and they now even out. That's how entropy is usually thought of in Physics.
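To put rough numbers on that, here is a minimal Python sketch (the lattice size and atom counts are made up for illustration) that counts arrangements of the two gases on a toy lattice and applies Boltzmann's formula S = k_B ln W:

Code:
import math

# Toy lattice version of the divided box: 2*M sites, M per side,
# with n "helium" atoms starting on the left and n "neon" on the right.
M, n = 50, 10

# Number of microstates W = ways to place the atoms on distinct sites.
W_divided = math.comb(M, n) * math.comb(M, n)            # each gas stuck on its own side
W_mixed = math.comb(2 * M, n) * math.comb(2 * M - n, n)  # both gases roam the whole box

# Boltzmann: S = k_B * ln(W); the values below are in units of k_B.
S_divided = math.log(W_divided)
S_mixed = math.log(W_mixed)
print(f"divided: {S_divided:.1f}  mixed: {S_mixed:.1f}  dS: {S_mixed - S_divided:+.1f}")

Removing the divider does nothing but multiply the number of available arrangements, yet that alone makes dS come out positive.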

About "useless" heat. People think that having heat, any kind of heat means you can power an engine and do something. Wrong. You need a heat difference. Something cooler for the heat to flow into. That flow is what you tap to run your engine. If you found a planet that was 1000 degrees uniformly, you couldn't put a steam engine on it and get it to run. But you can run an engine (using a gas with a low condensation point) on a freezing planet if there is a still colder sink close by the "merely" freezing place.

Increasing entropy means you have less flow going on to do anything useful with. Note that the "useful" stuff means making entropy go the other way for a bit somewhere else. Like grow some food.

As a computer person, it is of interest since it also entails loss of information, which is another discussion.

Note that in Thermo, randomness happens. Things can become more ordered. It's just so unlikely and fleeting at the macro level you can ignore it. Not so at the atomic level.
  #19  
Old 07-19-2012, 05:01 PM
IceQube
BANNED
 
Join Date: Feb 2012
Location: your 6
Posts: 759
Quote:
Originally Posted by zoid View Post

Entropy should really be described as the loss of potential energy.
That makes it easier to understand. Thank you.

I've also seen entropy defined as randomness. An often-presented example of a gain in entropy is the melting of ice into water. Ice: molecules are locked in a crystal lattice. Water: molecules are moving around freely. There is an increase in entropy because there is an increase in the randomness of the (location of the) molecules...?

  #20  
Old 07-19-2012, 05:12 PM
KarlGauss
Out of the slimy mud of words
Charter Member
 
Join Date: Mar 2000
Location: Between pole and tropic
Posts: 6,601
Quote:
Originally Posted by zoid View Post
Entropy should really be described as the loss of potential energy.
This is very nice and helps a lot. But they're not equivalent, are they? What, if anything, is lost by using such a much more 'down to earth' definition?
  #21  
Old 07-19-2012, 05:14 PM
PacifistPorcupine
Guest
 
Join Date: Jun 2011
It's an increase in the number of available states. In ice, the molecules are more constrained, thus the lower entropy.
  #22  
Old 07-19-2012, 05:22 PM
IceQube
BANNED
 
Join Date: Feb 2012
Location: your 6
Posts: 759
Quote:
Originally Posted by PacifistPorcupine View Post
It's an increase in the number of available states, In ice, the molecules are more constrained, thus the lower entropy.
States?

As in states of matter?
  #23  
Old 07-19-2012, 05:31 PM
newcomer
BANNED
 
Join Date: Mar 2006
Location: Toronto
Posts: 1,943
I'm familiar with entropy in information theory, where it is a measure of unpredictability (the inverse of information). In Computer Science we studied Shannon's theoretical work, which laid the foundation for practical uses such as data compression (the entropy of the English language) and the development of encryption methods.

You can find an entire chapter devoted to entropy in Hofstadter's book Gödel, Escher, Bach.
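For a concrete taste of Shannon's measure, here is a minimal Python sketch (the sample strings are arbitrary) that computes the entropy of a text's empirical character distribution - the per-character bit count that limits lossless compression:

Code:
import math
from collections import Counter

def shannon_entropy(text):
    """Shannon entropy, in bits per character, of the empirical
    character distribution of `text`."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Uniformly distributed letters would give log2(26) = 4.70 bits/char;
# real English comes out lower because characters repeat unevenly.
print(round(shannon_entropy("the quick brown fox jumps over the lazy dog"), 2))
print(round(shannon_entropy("abcdefghijklmnopqrstuvwxyz"), 2))  # 4.7: maximal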
  #24  
Old 07-19-2012, 05:43 PM
notsoheavyd3
Guest
 
Join Date: Sep 2009
Quote:
Originally Posted by IceQube View Post
That makes it easier to understand. Thank you.

I've also seen entropy defined as randomness. An often-presented example of a gain in entropy is the melting of ice into water. Ice: molecules are locked in a crystal lattice. Water: molecules are moving around freely. There is an increase in entropy because there is an increase in the randomness of the (location of the) molecules...?
I've been confused by this for a while, actually. I mean, suppose I had a closed jar of liquid water in intergalactic space, just sitting there. I'd expect the entropy of the jar of water to go up with time (since, you know, 2nd law). However, I'd also expect it to cool, and since there's no source of energy nearby I'd expect it to freeze, not melt, which would make me think the jar of liquid water has less entropy than the jar of ice.

Chronos any chance you could clear that up?
  #25  
Old 07-19-2012, 05:52 PM
chrisk
Charter Member
 
Join Date: Nov 2003
Location: Southern ontario
Posts: 5,905
I think that from a thermodynamic point of view, there's no way to say 'liquid water has more entropy than ice' without looking at the system of which it's a part - a hot summer day, a cold night in Antarctica, or interstellar space. The 'randomness' element is part of the information-theory definition of entropy, isn't it? Applying it to water molecules might be creating a dubious analogy.

  #26  
Old 07-19-2012, 06:03 PM
dracoi
Guest
 
Join Date: Dec 2008
Quote:
Originally Posted by Keeve View Post
As long as we're looking for philosophy, can someone explain this part of it to me? Why do we talk about entropy constantly going UP? Wouldn't it be simpler and more intuitive to talk about entropy going DOWN, as in the phrase, "The universe is running down." I don't get it.

(Don't misunderstand me. I'm not saying that those who talk about entropy going up are wrong. I'm asking why it was defined that way. When the concept of entropy was formulated, they could just as easily have defined it as a mirror image of how they chose, and I'm curious why they chose this instead of that, given that down is so much more intuitive. Well, intuitive to ME, at least --- why not to them?)
When I was studying this, I thought of it like this:
If we start with 100 units of energy in a closed system and we use 25 units of energy to do work, we now have 75 units of energy left to do additional work, and 25 units of entropy (the "useless" energy). Do another 10 units of work, and now you only have 65 left, with 35 units of entropy.

But this kind of thinking is not going to get you through the mid-term. In the end, entropy is a mathematical concept used by physicists. Thermodynamics is one of those subjects where you have to free yourself from intuition and trust the math.
  #27  
Old 07-19-2012, 06:05 PM
Bosstone
Guest
 
Join Date: Mar 2001
Philosophically: Everything breaks down.
  #28  
Old 07-19-2012, 06:06 PM
enipla
Member
 
Join Date: Jul 2001
Location: Colorado Rockies.
Posts: 8,023
Quote:
Originally Posted by IceQube View Post
I'll take any kind of slant.
Live with a Labrador Retriever for a few years. Everything on the coffee table will end up on the floor.

Everything seeks its lowest state. If you have animals, it happens a bit quicker.
  #29  
Old 07-19-2012, 06:07 PM
Little Nemo
Charter Member
 
Join Date: Dec 1999
Location: Western New York
Posts: 55,188
Okay, I'll take a shot at this.

All things exist in a state of change.

Some things exist in a system of order and some things do not.

Things can move from a system of order to a system of disorder and vice versa.

It takes much more effort to place things into a system of order than it does to remove them from a system of order.

The result of the above factors is that the amount of order in the universe is constantly decreasing. This loss of order is entropy.
  #30  
Old 07-19-2012, 06:28 PM
zoid
Charter Member
 
Join Date: Sep 2001
Location: Chicago Il
Posts: 7,024
Quote:
Originally Posted by Little Nemo View Post
Okay, I'll take a shot at this.

All things exist in a state of change.

Some things exist in a system of order and some things do not.

Things can move from a system of order to a system of disorder and vice versa.

It takes much more effort to place things into a system of order than it does to remove them from a system of order.

The result of the above factors is that the amount of order in the universe is constantly decreasing. This loss of order is entropy.
But here's the confusing thing about that explanation. When matter in interstellar clouds clumps together to make planets and stars and galaxies, it actually becomes more organized, but it increases entropy. The increase in entropy is due to the loss of potential energy.
  #31  
Old 07-19-2012, 07:09 PM
zoid
Charter Member
 
Join Date: Sep 2001
Location: Chicago Il
Posts: 7,024
Quote:
Originally Posted by KarlGauss View Post
This is very nice and helps a lot. But, they're not equivalent, are they? What, if anything, is lost using such a much more 'down to earth' definition?
Well long, long ago...


Scientist 1 = Bob
Scientist 2 = Terry
Scientist 3 = Alex

Bob: Ok Terry what was the loss of potential energy from that last one?

Terry: Looks like right at 35 kilojoules

Bob: You know, we’re always recording this “loss of potential energy.” We should really give it a name just to make it easier.

Terry: What do you want to call it?

Bob: I don’t give a shit, call it whatever you want.

Terry: OK, we’ll call it Entropy?

Bob: Entropy? What the fuck is “Entropy”?

Terry: It’s my Ex’s maiden name.

Bob: Get the fuck outta here, your Ex's maiden name is "Entropy?!"

Terry: Yeah I think they’re Swedish or Bavarian or some shit like that.

Bob: OK, but what does that have to do with anything?

Terry: Well she sucked all of the energy and potential out of me so I figured it was fitting.

Bob: Jesus, you really need to let that go man.

Terry: She’s STILL got my fucking dog!

Bob: REALLY?! AGAIN with the God Damn dog?!
.
.
.
Alex: Hey guys, whatcha doing?

Terry: We’re measuring Entropy.

Alex: What the fuck is “Entropy?”

Terry: See I have this Ex…

Alex: Mention that mother-fucking dog and I swear I’ll beat the SHIT out of you!



And so it came to be.
  #32  
Old 07-19-2012, 07:19 PM
colonial
BANNED
 
Join Date: Mar 2011
Posts: 1,709
We really need a professional scientist to make the best of this OP.

Entropy is the concept underlying the Second Law of Thermodynamics.

It is the word used by science for the inevitable increasing disorder which the 2nd Law observes must take place in any system closed to outside sources of energy.

The Universe is a closed system, meaning the energy it contains will gradually dissipate into a state called "heat death". The total amount of energy must always be the same, but its activity under heat death will be too disordered to allow the formation of galaxies, stars, or planets - perhaps even atoms.

  #33  
Old 07-19-2012, 07:54 PM
Daylate
Guest
 
Join Date: Dec 1999
Perhaps this will explain it better.

http://s564.photobucket.com/albums/s...nt=entropy.gif
  #34  
Old 07-19-2012, 08:03 PM
Lazlo
Guest
 
Join Date: Oct 2000
Try some MC Hawking for a phat rhyme slant. (NSFW - Language)
  #35  
Old 07-19-2012, 08:14 PM
Senegoid
Member
 
Join Date: Sep 2011
Location: Sunny California
Posts: 6,507
Quote:
Originally Posted by chrisk View Post
I hope that shed some light and wasn't too inaccurate.
Philosophical slant: By shedding some light, you've done some useful work of fighting ignorance, but in the process, increased the overall entropy of the universe.
  #36  
Old 07-19-2012, 08:15 PM
Bear_Nenno
Charter Member
 
Join Date: Jun 2000
Location: Ft Benning, GA
Posts: 6,909
Quote:
Originally Posted by Yllaria View Post
When my children were very young, every mall had t-shirt booths where you could get almost anything put on a t- or similar shirt. I got each of them a baseball shirt with "ENRTOPY ELF" ironed onto the front. Only ONE person ever got the joke. So, to the student running the cryo demonstration for the chem department one Picnic Day at UC Davis: Thank You. Your chuckle was appreciated.
little help?
  #37  
Old 07-19-2012, 08:43 PM
Indistinguishable
Guest
 
Join Date: Apr 2007
Well, elves are notoriously bad spellers, you see...
  #38  
Old 07-19-2012, 09:47 PM
septimus
Guest
 
Join Date: Dec 2009
When engineers concerned with data compression speak of the entropy of something (file, image, or dictionary, etc.), they refer to the minimum number of bits needed to represent it. This can also be considered to be the "information content" measured quantitatively. Opposite to this is redundancy -- a file which is mostly redundant can be compressed a lot.

But confusion arises. An image or text file with maximum entropy will not have maximum information in a practical sense: instead it will be pure noise! To be informative in a practical sense, an object needs to be some compromise between information and redundancy.

I think similar comments apply to thermodynamic entropy. A system with maximum entropy has no chance for further interesting evolution. Yet systems of very low entropy (i.e. high order) are too simple to be interesting. To be "interesting," phenomena require both entropy (information) and order (redundancy).
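One way to make the noise-versus-redundancy point concrete is to hand both kinds of data to a real compressor. A minimal Python sketch (the sizes are arbitrary):

Code:
import os
import zlib

redundant = b"spam" * 10000  # highly redundant: low entropy per byte
noise = os.urandom(40000)    # pure noise: entropy already near maximal

print(len(redundant), "->", len(zlib.compress(redundant)))  # shrinks enormously
print(len(noise), "->", len(zlib.compress(noise)))          # barely budges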
  #39  
Old 07-19-2012, 09:48 PM
JWT Kottekoe
Guest
 
Join Date: Apr 2003
Entropy is well defined mathematically, but hard to explain. It is a measure of how likely it is for a closed system to exist in a given macroscopic state. Since these probabilities range over so many orders of magnitude, entropy is defined on a logarithmic scale (like the Richter scale for earthquakes, or pH for proton concentration). If I start out with a very unlikely configuration of a system whose particles are in motion, it will very naturally evolve to a much more likely configuration. Fundamentally, that is all there is to the notion that the entropy is always increasing. It is not impossible for the reverse to occur, it is just incredibly unlikely.

What does all this mean? As I said, it is hard to explain. First we have to define what I mean by a macroscopic state. In statistical mechanics, we can understand how microscopic motion of atoms explains the macroscopic thermodynamic properties of heat and temperature. Entropy is another macroscopic quantity. As someone mentioned up thread, consider a box with a divider and hydrogen atoms on one side, helium atoms on the other. Assume these are perfectly non-interacting ideal gases, for the sake of simplicity. I remove the divider and the system is in a very unlikely arrangement of atoms with a given temperature and pressure. As the atoms move around they naturally mix. No change of energy occurs (potential or kinetic), but the entropy increases dramatically, because you go from a very improbable arrangement to an arrangement that is much more likely. Once the gases have mixed, it is astronomically improbable that random motion will restore the separation of atoms that we started with, while it is highly probable that we will continue to have a relatively uniform distribution of atoms.

So, to repeat, the second law of thermodynamics is a mathematically well-defined way of saying that closed systems tend to move away from highly improbable configurations to more probable configurations that have the same macroscopic state variables, like temperature and pressure.

In the example I gave, there is no change in potential energy. Entropy is decidedly not the loss of potential energy and does not even have the same units as energy.
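The counting behind "improbable to probable" can be seen with no physics at all. Take the exact sequence of N coin flips as the microstate and the total number of heads as the macrostate; a minimal Python sketch (N is arbitrary):

Code:
import math

N = 100  # flips; 2**N equally likely sequences (microstates)

def entropy(k):
    # ln of the multiplicity: how many sequences produce exactly k heads
    return math.log(math.comb(N, k))

for k in (0, 10, 25, 50):
    print(f"{k:3d} heads: ln W = {entropy(k):5.1f}")
# The evenly mixed macrostate (50 heads) has vastly more microstates,
# so a randomly evolving system is overwhelmingly found near it.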

For the poster who would prefer a measure that goes in the other direction, you would like Shannon's information theory, in which information (again logarithmically measured) is described as negentropy. Information can only be lost, not gained.

  #40  
Old 07-19-2012, 11:00 PM
Nava
Guest
 
Join Date: Nov 2004
Quote:
Originally Posted by Bear_Nenno View Post
little help?
Entropy is metaphorically equated to disorganization, mess, disorder. The letters are in the wrong order; the spelling is suffering from the ravages of entropy.
  #41  
Old 07-19-2012, 11:39 PM
Indistinguishable
Guest
 
Join Date: Apr 2007
Quote:
Originally Posted by JWT Kottekoe View Post
Entropy is well defined mathematically, but hard to explain. It is a measure of how likely it is for a closed system to exist in a given macroscopic state. Since these probabilities range over so many orders of magnitude, entropy is defined on a logarithmic scale (like the Richter scale for earthquakes, or pH for proton concentration).
Entropy (at least in the Shannon sense, and thus I assume similarly for the Boltzmann sense as well) isn't logarithmically related to probability as a mere matter of convenience for handling probabilities spanning many orders of magnitude; entropy is specifically logarithmically related to probability because independent events multiply probability under conjunction, and we'd like to speak of this as adding entropy.
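A minimal Python check of this point (the two toy distributions are arbitrary): for independent variables the joint probabilities multiply, and the logarithm turns that product into an exact sum of entropies.

Code:
import math

def H(dist):
    # Shannon entropy, in bits, of a discrete probability distribution
    return -sum(p * math.log2(p) for p in dist if p > 0)

X = [0.5, 0.5]  # fair coin
Y = [0.9, 0.1]  # biased coin, independent of X
joint = [px * py for px in X for py in Y]  # independence: probabilities multiply

print(H(X) + H(Y))  # 1.4690...
print(H(joint))     # the same number: entropies add because of the log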

Quote:
First we have to define what I mean by a macroscopic state. In statistical mechanics, we can understand how microscopic motion of atoms explains the macroscopic thermodynamic properties of heat and temperature. Entropy is another macroscopic quantity.
So... I'm still curious: what is a macroscopic quantity? You've given examples, but not a definition.

Quote:
As someone mentioned up thread, consider a box with a divider and hydrogen atoms on one side, helium atoms on the other. Assume these are perfectly non-interacting ideal gases, for the sake of simplicity. I remove the divider and the system is in a very unlikely arrangement of atoms with a given temperature and pressure. As the atoms move around they naturally mix. No change of energy occurs (potential or kinetic), but the entropy increases dramatically, because you go from a very improbable arrangement to an arrangement that is much more likely. Once the gases have mixed, it is astronomically improbable that random motion will restore the separation of atoms that we started with, while it is highly probable that we will continue to have a relatively uniform distribution of atoms.
For every possible way that atoms could move forward in time from a lowly mixed state to a highly mixed state, there's a corresponding way (just reverse all the motions; particle kinetics are described by time-symmetric laws, right?) that atoms could move forward in time from a highly mixed state to a lowly mixed state. So it can't be just automatic that the mixedness of the atoms will tend to rise over time. This can't be explained as a probabilistic tautology; there must be some hidden assumption (I'm told it has to do with boundary conditions at the Big Bang, but what do I know? All I know is that the handwavy statistical argument people usually give is too glib).

  #42  
Old 07-20-2012, 12:35 AM
fumster
Member
 
Join Date: Feb 2007
Posts: 2,890
Quote:
Originally Posted by IceQube View Post
My IQ is 100. Explain entropy to me, and why I should care about it.
In your case you should care because you are going to melt.
  #43  
Old 07-20-2012, 12:37 AM
Half Man Half Wit
Guest
 
Join Date: Jun 2007
There simply are many more high entropy states than there are low entropy states, so any 'random step' in the evolution is far more likely to go towards higher entropy. Ultimately, the second law doesn't mean anything but 'more likely states obtain more often'. The boundary conditions of the universe only come into play to explain why the entropy was ever low to begin with; a randomly selected state for the universe would be expected to be high entropy.

As for macroscopic states, there's indeed some ambiguity. The crucial thing is perhaps that once you have defined a notion of macroscopic quantity you care about, then entropy is unambiguous.

I'm running out of time here, perhaps I can elaborate later on...
  #44  
Old 07-20-2012, 12:55 AM
Indistinguishable
Guest
 
Join Date: Apr 2007
Quote:
Originally Posted by Half Man Half Wit View Post
There simply are many more high entropy states than there are low entropy states, so any 'random step' in the evolution is far more likely to go towards higher entropy.
That reasoning would be equally true for a random step backwards in time as for a random step forwards in time. It would be just as supportive of "Entropy tends to decrease over time" as "Entropy tends to increase over time".

  #45  
Old 07-20-2012, 04:49 AM
Polycarp
Member
 
Join Date: Aug 1999
Location: A better place to be
Posts: 26,718
"INSUFFICIENT DATA FOR MEANINGFUL ANSWER. "

-- Isaac Asimov, "The Last Question"


  #46  
Old 07-20-2012, 08:43 AM
Half Man Half Wit
Guest
 
Join Date: Jun 2007
Quote:
Originally Posted by Indistinguishable View Post
That reasoning would be equally true for a random step backwards in time as for a random step forwards in time. It would be just as supportive of "Entropy tends to decrease over time" as "Entropy tends to increase over time".
The first part is right, but I don't see how this implies that entropy would decrease over time -- basically, one should expect that the entropy is higher towards both the past and the future, not that it should be lower towards the future. To then get the entropy in the past low, one invokes the universal boundary conditions.
  #47  
Old 07-20-2012, 09:01 AM
Machine Elf
Guest
 
Join Date: Feb 2007
Quote:
Originally Posted by zoid View Post
But here's the confusing thing about that explanation. When matter in interstellar clouds clumps together to make planets and stars and galaxies, it actually becomes more organized, but it increases entropy. The increase in entropy is due to the loss of potential energy.
They've lost gravitational potential energy by clustering close together, but their kinetic energy has increased. That's a reversible process, so entropy has not particularly increased.

Quote:
Originally Posted by notsoheavyd3
I've been confused by this for a while, actually. I mean, suppose I had a closed jar of liquid water in intergalactic space, just sitting there. I'd expect the entropy of the jar of water to go up with time (since, you know, 2nd law). However, I'd also expect it to cool, and since there's no source of energy nearby I'd expect it to freeze, not melt, which would make me think the jar of liquid water has less entropy than the jar of ice.
The second law of thermodynamics says that in a closed system, entropy can only increase or remain constant.

A jar of liquid water in intergalactic space will cool and freeze. Because this involves heat transfer between the water (at 273 K) and the surrounding intergalactic space (approximately 4 K), the jar of water cannot be considered in isolation; the "system," for second-law purposes, is the jar of water plus the surrounding intergalactic space to which heat is being pissed away. The jar of liquid water, when it freezes, experiences a decrease in entropy - but the surrounding intergalactic space, by receiving all that heat from the liquid water, experiences an increase in entropy which more than offsets the decrease in entropy of the jar of water. The net change in entropy for this heat-transfer event is greater than zero.
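Rough numbers for that bookkeeping, using the idealized constant-temperature reservoir formula dS = Q/T (the amount of heat below is invented; the temperatures are the ones above):

Code:
Q = 1000.0       # J of heat leaving the water (made-up amount)
T_water = 273.0  # K: water at its freezing point
T_space = 4.0    # K: the surrounding intergalactic space

dS_water = -Q / T_water  # the freezing water loses entropy
dS_space = +Q / T_space  # the cold surroundings gain far more
print(dS_water + dS_space)  # about +246.3 J/K: the total still rises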
  #48  
Old 07-20-2012, 10:43 AM
Colophon
Guest
 
Join Date: Sep 2002
Quote:
Originally Posted by IceQube View Post
Edit: I'm looking for a philosophical slant rather than a purely technical view of the concept.
"Given time, everything will go to shit."
  #49  
Old 07-20-2012, 11:29 AM
Indistinguishable
Guest
 
Join Date: Apr 2007
Quote:
Originally Posted by Half Man Half Wit View Post
The first part is right, but I don't see how this implies that entropy would decrease over time -- basically, one should expect that the entropy is higher towards the past and future, but not that it should be lower towards the future. To then get the entropy in the past low one invokes the universal boundary conditions.
Well, higher towards past = lower towards future...

But the counting argument doesn't really show "Entropy tends to rise as one steps backwards/forwards in time"; it shows "Entropy tends to be high at any moment", such that if entropy is in fact actually low at some moment, this tends to be an anomaly and entropy will tend to be high (and thus higher) at any other moment around it, before or after.

[This is all on the assumption that there are more ways to be high entropy than low entropy, which I suppose is dependent on what exactly macrostates are... for example, given a quadrillion macrostates each consisting of just one microstate (and thus low/zero entropy), and one macrostate consisting of a million microstates (and thus higher entropy), there are rather more ways to be low entropy than high entropy. But I gather this isn't how macrostates work in practice, only I don't have a good sense of what they amount to still. I can only make flailing points about mathematical technicalities, because I still don't understand the physics.]

  #50  
Old 07-20-2012, 12:04 PM
OldGuy
Charter Member
 
Join Date: Dec 2002
Location: Very east of Foggybog, WI
Posts: 2,570
Quote:
Originally Posted by Nava View Post
Entropy is metaphorically equated to disorganization, mess, disorder. The letters are in the wrong order; the spelling is suffering from the ravages of entropy.
Also I think that since the wearer was a small child, the implication is that the little one is quite helpful at creating disorder in the universe.