To put an example to that, consider a heat pump. Heat-energy, that is, the random kinetic motion of atoms or molecules, is absorbed from a low temperature reservoir and “pumped” via a compression and expansion cycle into a higher temperature reservoir. In this way, you can, seemingly paradoxically, make use of the 30°F outside ground temperature to help warm the air in your house to a comfortable 68°F. You can even do this using less energy than it would take to directly heat your house to the same temperature; hence, a heat pump can have an over-unity efficiency (usually quantified as a “coefficient of performance” rather than an efficiency, so that people don’t get wigged out) compared to a direct heating cycle. (The reason a ground source or water source heat pump works so well is that while the temperature might be low, the energy is more concentrated, because soil or water is far denser than air.)
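To put rough numbers on that, here is a minimal sketch (in Python, using the temperatures from the example above) of the ideal Carnot-limit coefficient of performance; the function names are mine, and real machines fall well short of this ceiling:

```python
def f_to_kelvin(temp_f):
    """Convert degrees Fahrenheit to kelvins."""
    return (temp_f - 32.0) * 5.0 / 9.0 + 273.15

def carnot_heating_cop(t_cold_f, t_hot_f):
    """Ideal (Carnot-limit) coefficient of performance for heating:
    heat delivered per unit of work input, computed from the
    absolute temperatures of the two reservoirs."""
    t_cold = f_to_kelvin(t_cold_f)
    t_hot = f_to_kelvin(t_hot_f)
    return t_hot / (t_hot - t_cold)

# 30 F ground, 68 F indoor air: the thermodynamic ceiling is about 14,
# i.e. up to ~14 units of heat moved per unit of work. Real heat pumps
# manage roughly 3-5, but that is still "over-unity" compared to direct
# resistive heating, whose coefficient of performance is exactly 1.
print(carnot_heating_cop(30.0, 68.0))  # → about 13.9
```

The point of using absolute (kelvin) temperatures is that the ceiling depends only on the ratio of reservoir temperatures: the smaller the gradient you have to pump across, the more heat you move per unit of work.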
Of course, you are not, despite occasional claims by foolishly misguided free energy enthusiasts, getting something for nothing; the extra heat-energy is simply reorganized in a less random fashion, i.e. concentrated in your house rather than distributed outside. And the trade-off is that global net entropy still increases; the work done on the working fluid in the compression cycle by the compressor motor destroys more exergy (available work) than is gained by reorganizing environmental heat. So the Second Law still holds, James Clerk Maxwell slumbers peacefully in his grave, and nobody needs to pull the daemons away from their drinks. Turn a heat pump around and you get the common refrigeration cycle, which cools off your car in summer and keeps your beer cold; in this case, you pump heat-energy out of the container and convect/radiate it into the environment via the coils on the back.
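That entropy bookkeeping can be checked directly. A sketch, assuming a hypothetical heat pump with a realistic coefficient of performance of 3 (the numbers are illustrative, not a spec): the heat dumped indoors is the heat drawn from outside plus the compressor work, and the entropy gained by the warm reservoir always exceeds the entropy lost by the cold one.

```python
def net_entropy_change(work_in, cop, t_cold_k, t_hot_k):
    """Net entropy generated by one heat pump cycle.

    work_in : compressor work input (J)
    cop     : heating coefficient of performance (q_hot / work_in)

    q_hot is deposited at t_hot_k; q_cold = q_hot - work_in is drawn
    from the reservoir at t_cold_k (energy conservation).
    """
    q_hot = cop * work_in
    q_cold = q_hot - work_in
    # Entropy gained indoors minus entropy removed from outdoors.
    return q_hot / t_hot_k - q_cold / t_cold_k

# COP of 3 between ~272 K ground and ~293 K indoor air: the result
# is positive, so global entropy still goes up and the Second Law
# survives intact.
print(net_entropy_change(1000.0, 3.0, 272.0, 293.0))
```

Only an ideal Carnot cycle drives this difference to zero; any real machine, with friction and finite temperature differences, generates strictly positive entropy.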
Now, we keep throwing around this term, “entropy” (or exergy), but what does this actually mean to you, the consumer? While most people like to focus on Law #2, and it is a pretty impressive statement which, correctly phrased, has something useful and mathematically explicit to say about any type of work cycle, the third law is a lot more interesting to me. In general concept, entropy is the amount of energy that exists in a system but isn’t available to do useful work. Why isn’t it available? Because you have to have a gradient, or a differential between one area of your system and another, in order to get energy to move or flow and thus do something. If I put a (hypothetically weightless) balloon in a room in which the temperature of the air inside the balloon is the same as the ambient temperature, it just sits there, going nowhere. However, if I heat up the balloon (or cool the room), it will expand and rise, because the temperature differential causes the air inside the balloon to expand and do work on the air outside, pushing it away and feeling the reaction force from the denser air outside. (I’m assuming you’re doing this someplace with gravity and the air pressure gradients resulting therefrom.) Similarly, heating the gas in a Stirling reciprocating piston engine causes it to expand and push the piston because the pressure on the other side is lower.
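The “no gradient, no work” idea has a compact standard form: the Carnot efficiency, the maximum fraction of heat convertible to work between two reservoirs. A quick sketch (function name mine):

```python
def carnot_efficiency(t_cold_k, t_hot_k):
    """Maximum fraction of heat that can be converted to work when
    operating between reservoirs at absolute temperatures
    t_cold_k <= t_hot_k."""
    return 1.0 - t_cold_k / t_hot_k

# No gradient, no work: a reservoir at ambient temperature can't
# drive anything, no matter how much heat-energy it contains.
print(carnot_efficiency(300.0, 300.0))  # → 0.0

# A modest gradient buys a modest ceiling on efficiency.
print(carnot_efficiency(300.0, 400.0))  # → 0.25
```

That zero in the first case is exactly the balloon sitting motionless in the room: plenty of energy present, none of it available.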
So entropy isn’t just about the amount of energy you have in one spot, but how it compares to the amount of energy you have around it, and how that is organized. This in turn creates gradients which are not just useful, but in fact (in any real world situation) will automatically cause work to be done whether you like it or not. Hence, we have hurricanes and tornadoes, which take a low-entropy distribution of energy and run it down into a higher-entropy state (once they’ve exhausted themselves). If you have an insulated box where the temperature is exactly the same everywhere, nothing happens on the overall level, no matter how hot it is inside the box, because there is no gradient. Never mind that the individual molecules are bouncing around like children on a sugar high; the distribution is completely random (or perhaps I should more appropriately say “normal”) and you can’t get enough of them to go in any one direction for long enough to do anything useful. And you thought herding cats was difficult?
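The uniform box can be made quantitative with a textbook sketch: bring two equal bodies with constant heat capacity to their common final temperature, and the entropy change is C·ln(Tf²/(T₁T₂)), which is positive whenever the temperatures differ and exactly zero when they don’t (the numbers below are illustrative):

```python
import math

def equilibration_entropy(t1_k, t2_k, heat_capacity=1.0):
    """Entropy change when two equal bodies of constant heat
    capacity, at absolute temperatures t1_k and t2_k, are allowed
    to reach their common final temperature (t1_k + t2_k) / 2."""
    t_final = (t1_k + t2_k) / 2.0
    return heat_capacity * math.log(t_final * t_final / (t1_k * t2_k))

# A gradient spontaneously erases itself, and entropy rises...
print(equilibration_entropy(250.0, 350.0))

# ...but the uniform box does nothing: equal temperatures, zero change.
print(equilibration_entropy(300.0, 300.0))  # → 0.0
```

Note the asymmetry: equilibration happens on its own, while recreating the gradient costs work, which is the ratchet the whole essay keeps circling back to.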
Local entropy can be decreased, but only by making a bigger mess somewhere else, resulting in a net global increase in entropy. And you can’t stop it or get off the ride, ever. Well, except inside a black hole, where local entropy is theoretically maximized, but nobody wants to go there, at least no one you really want to be associated with.
People frequently and ignorantly invoke the laws of thermodynamics in common discourse, both as an analogy and to “explain” why something is or is not possible. The most egregious, in my mind, is the misuse of the concept of entropy to “refute” evolution and Darwinian selection by asserting that living systems somehow function in violation of the statistical mechanics principles of thermodynamics. Never mind that not only does every biological mechanism or cycle that we’ve ever examined adhere precisely to thermodynamic principles (indeed, the discovery of respiration was due to the application of the laws of conservation and thermodynamics), but that life itself functions like a very complex heat pump cycle, moderating the flow of energy in order to do useful work. A proper understanding of thermodynamics reinforces the comprehension of biological processes as fundamentally mechanical/chemical interactions rather than contradicting them.
Unfortunately, it is a topic only properly taught at the collegiate level, and then only in the natural sciences and engineering disciplines, even though the underlying principles can be readily grasped by anyone with an understanding of basic algebra. And look as I might, I can’t find a single pop-science or rudimentary level book on thermodynamics. The closest I can get is Van Ness’s Understanding Thermodynamics, and even that is too mathematical for a nontechnical audience.
Stranger