It’s an increase in the number of available states. In ice, the molecules are more constrained, thus the lower entropy.
States?
As in states of matter?
I’m familiar with entropy in information theory, where it is a measure of unpredictability (the inverse of information). In Computer Science we studied Shannon’s theoretical work, which laid the foundation for practical uses such as data compression (the entropy of the English language) and the development of encryption methods.
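To make that information-theory sense concrete, here’s a minimal Python sketch of Shannon entropy in bits per character; the sample string and the single-character frequency model are just assumptions for illustration (real compressors model much longer contexts):

```python
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Shannon entropy in bits per character, from single-character frequencies."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Hypothetical sample text, purely for illustration.
sample = "the quick brown fox jumps over the lazy dog " * 20
print(f"{shannon_entropy(sample):.2f} bits/char")  # well under 8 bits/char,
                                                   # which is why English text compresses
```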
You can find an entire chapter devoted to entropy in Hofstadter’s Gödel, Escher, Bach.
I’ve been confused by this for a while, actually. I mean, suppose I had a closed jar of liquid water in intergalactic space and it was just sitting there. I’d expect the entropy of the jar of water to go up with time (since, you know, the 2nd law). However, I’d also expect it to cool, and since there’s no source of energy nearby I’d expect it to freeze, not melt, which would make me think the jar of liquid water has less entropy than the jar of ice.
Chronos, any chance you could clear that up?
I think that from a thermodynamic point of view, there’s no way to say ‘liquid water has more entropy than ice’ without looking at the system of which it’s a part - a hot summer day, a cold night in Antarctica, or interstellar space. The ‘randomness’ element is part of the information-theory definition of entropy, isn’t it? Applying it to water molecules might be creating a dubious analogy.
When I was studying this, I thought of it like this:
If we start with 100 units of energy in a closed system and we use 25 units of energy to do work, we now have 75 units of energy left to do additional work, and 25 units of entropy (the “useless” energy). Do another 10 units of work, and now you only have 65 left, with 35 of entropy.
But this kind of thinking is not going to get you through the mid-term. In the end, entropy is a mathematical concept used by physicists. Thermodynamics is one of those subjects where you have to free yourself from intuition and trust the math.
Philosophically: Everything breaks down.
Live with a Labrador Retriever for a few years. Everything on the coffee table will end up on the floor.
Everything seeks its lowest state. If you have animals, it happens a bit quicker.
Okay, I’ll take a shot at this.
All things exist in a state of change.
Some things exist in a system of order and some things do not.
Things can move from a system of order to a system of disorder and vice versa.
It takes much more effort to place things into a system of order than it does to remove them from a system of order.
The result of the above factors is that the amount of order in the universe is constantly decreasing. This loss of order is entropy.
But here’s the confusing thing about that explanation. When matter in interstellar clouds clumps together to make planets, stars, and galaxies, it actually becomes more organized, yet entropy increases. The increase in entropy is due to the loss of potential energy.
Well long, long ago…
Scientist 1 = Bob
Scientist 2 = Terry
Scientist 3 = Alex
Bob: Ok Terry what was the loss of potential energy from that last one?
Terry: Looks like right at 35 kilojoules
Bob: You know, we’re always recording this “loss of potential energy.” We should really give it a name just to make it easier.
Terry: What do you want to call it?
Bob: I don’t give a shit, call it whatever you want.
Terry: OK, we’ll call it Entropy?
Bob: Entropy? What the fuck is “Entropy”?
Terry: It’s my Ex’s maiden name.
Bob: Get the fuck outta here, your Ex’s maiden name is “Entropy?!”
Terry: Yeah I think they’re Swedish or Bavarian or some shit like that.
Bob: OK, but what does that have to do with anything?
Terry: Well she sucked all of the energy and potential out of me so I figured it was fitting.
Bob: Jesus, you really need to let that go man.
Terry: She’s STILL got my fucking dog!
Bob: REALLY?! AGAIN with the God Damn dog?!
.
.
.
Alex: Hey guys, whatcha doing?
Terry: We’re measuring Entropy.
Alex: What the fuck is “Entropy?”
Terry: See I have this Ex…
Alex: Mention that mother-fucking dog and I swear I’ll beat the SHIT out of you!
And so it came to be.
We really need a professional scientist to make the best of this OP.
Entropy is the concept underlying the Second Law of Thermodynamics.
It is the word science uses for the inevitable increase in disorder that the 2nd Law says must take place in any system closed to outside sources of energy.
The Universe is a closed system, meaning the energy it contains will gradually dissipate toward a state called “heat death”. The total amount of energy must always stay the same, but under heat death its activity will be too disordered to allow the formation of galaxies, stars, or planets, perhaps even atoms.
Perhaps this will explain it better.
http://s564.photobucket.com/albums/ss86/Trygolyte/?action=view&current=entropy.gif
Philosophical slant: By shedding some light, you’ve done some useful work of fighting ignorance, but in the process, increased the overall entropy of the universe.
little help?
Well, elves are notoriously bad spellers, you see…
When engineers concerned with data compression speak of the entropy of something (a file, an image, a dictionary, etc.), they refer to the minimum number of bits needed to represent it. This can also be considered the “information content”, measured quantitatively. Opposite to this is redundancy: a file which is mostly redundant can be compressed a lot.
But confusion arises. An image or text file with maximum entropy will not have maximum information in a practical sense: instead it will be pure noise! To be informative in a practical sense, an object needs to be some compromise between information and redundancy.
I think similar comments apply to thermodynamic entropy. A system with maximum entropy has no chance for further interesting evolution. Yet systems of very low entropy (i.e. high order) are too simple to be interesting. To be “interesting,” phenomena require both entropy (information) and order (redundancy).
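A quick way to see the redundancy side of that is a minimal sketch using Python’s standard zlib; the byte strings below are made-up examples, with random bytes standing in for a “maximum entropy” file:

```python
import os
import zlib

# A highly redundant message vs. the same number of near-maximum-entropy random bytes.
redundant = b"spam " * 2000           # 10,000 bytes of repetition
noisy     = os.urandom(10_000)        # 10,000 bytes of noise (hypothetical stand-in for pure noise)

print(len(zlib.compress(redundant)))  # tiny: the redundancy squeezes out
print(len(zlib.compress(noisy)))      # roughly 10,000 or a bit more: nothing left to remove
```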
Entropy is well defined mathematically, but hard to explain. It is a measure of how likely it is for a closed system to exist in a given macroscopic state. Since these probabilities range over so many orders of magnitude, entropy is defined on a logarithmic scale (like the Richter scale for earthquakes, or pH for proton concentration). If I start out with a very unlikely configuration of a system whose particles are in motion, it will very naturally evolve to a much more likely configuration. Fundamentally, that is all there is to the notion that the entropy is always increasing. It is not impossible for the reverse to occur, it is just incredibly unlikely.
What does all this mean? As I said, it is hard to explain. First we have to define what I mean by a macroscopic state. In statistical mechanics, we can understand how microscopic motion of atoms explains the macroscopic thermodynamic properties of heat and temperature. Entropy is another macroscopic quantity. As someone mentioned up thread, consider a box with a divider and hydrogen atoms on one side, helium atoms on the other. Assume these are perfectly non-interacting ideal gases, for the sake of simplicity. I remove the divider and the system is in a very unlikely arrangement of atoms with a given temperature and pressure. As the atoms move around they naturally mix. No change of energy occurs (potential or kinetic), but the entropy increases dramatically, because you go from a very improbable arrangement to an arrangement that is much more likely. Once the gases have mixed, it is astronomically improbable that random motion will restore the separation of atoms that we started with, while it is highly probable that we will continue to have a relatively uniform distribution of atoms.
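To put rough numbers on that “improbable vs. probable” point, here is a minimal sketch; the 100-particle box and the left/right split are assumptions made purely for illustration, and I’m using the dimensionless log of the number of arrangements rather than the physical Boltzmann formula S = k·ln W:

```python
from math import comb, log

N = 100  # toy model: particles free to sit in the left or right half of the box

def log_multiplicity(n_left: int) -> float:
    """Natural log of the number of microstates with n_left particles on the left."""
    return log(comb(N, n_left))

# All particles crowded on the left: exactly one arrangement.
print(log_multiplicity(N))        # ln(1) = 0.0  -> the "unmixed" macrostate
# Evenly mixed: astronomically many arrangements.
print(log_multiplicity(N // 2))   # ~66.8, i.e. about 10^29 arrangements
```

On a real thermodynamic scale you would multiply by Boltzmann’s constant, but the point is just how steeply the count of microstates favors the mixed arrangement, which is why random motion essentially never restores the separated state.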
So, to repeat, the second law of thermodynamics is a mathematically well-defined way of saying that closed systems tend to move away from highly improbable configurations to more probable configurations that have the same macroscopic state variables, like temperature and pressure.
In the example I gave, there is no change in potential energy. Entropy is decidedly not the loss of potential energy and does not even have the same units as energy.
For the poster who would prefer a measure that goes in the other direction, you would like Shannon’s information theory, in which information (again logarithmically measured) is described as negentropy. Information can only be lost, not gained.
Entropy is metaphorically equated to disorganization, mess, disorder. The letters are in the wrong order; the spelling is suffering from the ravages of entropy.