Recognizing that the concept of entropy has been stretched to cover all kinds of systems (including business applications), I am specifically talking about entropy in the thermodynamic sense. This is perhaps the hardest thing for me to conceptualize. Can the SD give me examples of when entropy manifests itself? The typical examples I’ve heard (processes that cannot be directly reversed, apples don’t fall back onto trees, etc.) can ALL be explained by the behavior of energy alone! So, why the need for this concept of entropy?
Another common example is allowing ice to melt into liquid water in a room at constant temperature. It seems the liquid state is assumed to be more “chaotic” simply because the energy in the room flows into the ice until no more energy can flow? Or because the liquid’s molecules can move about more freely than those of a solid. But what about water at the triple point? The three states readily change from one to another, all by an exchange of latent heats only. As I understand it, there is no sensible heat change at all. Now, what is said about the entropy in that instance? A delta of zero in entropy for both the system and its surroundings?
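To put numbers on what’s bugging me, here is a rough back-of-the-envelope sketch (Python, using approximate textbook values for water) of the Q/T bookkeeping for ice melting in a warmer room, versus the triple-point case:

```python
# Entropy change when ice melts, computed purely from latent heat: dS = Q / T.
# Values are approximate textbook numbers, for illustration only.

L_fusion = 334e3      # latent heat of fusion of water, J/kg (approx.)
mass = 0.010          # a 10 g ice cube, in kg
T_melt = 273.15       # melting point, K
T_room = 293.15       # room temperature, K (20 C)

Q = mass * L_fusion                  # heat absorbed by the ice, J

dS_ice = Q / T_melt                  # entropy gained by the melting ice
dS_room = -Q / T_room                # entropy lost by the warmer surroundings
dS_total = dS_ice + dS_room

print(f"dS_ice   = {dS_ice:+.2f} J/K")
print(f"dS_room  = {dS_room:+.2f} J/K")
print(f"dS_total = {dS_total:+.2f} J/K   (positive, because T_room > T_melt)")

# At the triple point the heat is exchanged at one and the same temperature,
# so the +Q/T of the phase that absorbs heat exactly cancels the -Q/T of the
# phase that releases it: each part's entropy changes, but the total is zero.
```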
I WAG someone (in Carnot’s era) realized there’s only so much “unbalanced” energy in the universe which they defined as “disorder”. This disorder allows energy to flow until it is all “organized”. At that point, no more energy can flow in the entire universe. So, I guess the next logical step was to attempt to measure this disorder and the incremental changes moving us toward more and more order?
An ideal gas is confined to half of a chamber by a barrier. It comes to fill the whole chamber after the barrier is removed, or by flowing through a hole in it. The entropy increases. It’s a matter of a factor of two per molecule, but after taking the logarithm we somewhat perversely multiply by R so that it looks nothing like 2 and looks like it has something to do with energy, which it doesn’t.
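The arithmetic, as a quick Python sketch (one mole of ideal gas, volume doubling):

```python
import math

# Free expansion of an ideal gas into double the volume.
# Per molecule the available volume doubles (the factor of 2); the entropy
# change is k*ln(2) per molecule, or R*ln(2) per mole.

k_B = 1.380649e-23    # Boltzmann constant, J/K
R = 8.314462618       # gas constant, J/(mol*K)
N_A = 6.02214076e23   # Avogadro's number, 1/mol

dS_per_molecule = k_B * math.log(2)     # ~9.6e-24 J/K
dS_per_mole = R * math.log(2)           # ~5.76 J/(mol*K)

print(dS_per_molecule, dS_per_mole)
print(dS_per_molecule * N_A)            # same as R*ln(2), as it must be
```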
Flow of solute in solution from high to low concentration is pretty much the same thing.
One thing I can tell you from memory is that the concept of entropy in thermodynamics predates the modern theory of statistical thermodynamics. The same theoretical revolution that determined that a substance’s temperature is just a measure of the average kinetic energy of its molecules also determined that a system’s entropy is a measure of the randomness of the distribution of kinetic energy among its molecules. But I don’t know what entropy was believed to be prior to that revolution …
Entropy exists in the same way that the number 45 exists. It’s a well-defined concept. From memory, it’s Boltzmann’s constant times the natural log of the number of possible microstates of a system. The thermodynamic temperature scale, the thing that the kelvin is the SI unit for measuring, is based on entropy.
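Spelled out (from the same memory, so take it as a sketch rather than gospel), that’s Boltzmann’s formula, and the temperature connection comes from defining temperature through how entropy responds to added energy:

```latex
% Boltzmann's entropy: k_B times the natural log of the number of
% microstates Omega consistent with the macrostate.
S = k_B \ln \Omega

% Thermodynamic temperature is then defined via entropy:
\frac{1}{T} = \left( \frac{\partial S}{\partial U} \right)_{V,N}
```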
I think stretching it to cover things like business systems is silly, but I also think giving 110% is silly, so maybe that disqualifies me from characterizing business communications.
It’s used in protein folding as well. A random blob of amino acids has much more entropy than a properly folded protein, where each amino acid is locked into a specific spatial relationship to all the others. This is vitally important to life, because it helps determine proteins’ shapes, which in turn dictate their functions. You get a dynamic balance between electrostatic forces gluing the protein into shape and entropy trying to pull it apart. Riding that line gives proteins the flexibility they need to perform useful functions.
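Loosely, that tug-of-war is the usual ΔG = ΔH − TΔS bookkeeping. Here’s a toy Python sketch with made-up, purely illustrative numbers, just to show how the energetic and entropic terms fight and how temperature tips the balance:

```python
# Toy illustration of the enthalpy-vs-entropy tug-of-war in folding.
# The numbers below are hypothetical, chosen only to show the competition;
# real folding free energies are only marginally negative.

dH = -200e3        # J/mol: favorable contacts "gluing" the fold (hypothetical)
dS = -600.0        # J/(mol*K): folding costs configurational entropy (hypothetical)

for T in (280.0, 310.0, 340.0):            # temperature in kelvin
    dG = dH - T * dS                       # Gibbs free energy of folding
    state = "folded" if dG < 0 else "unfolded"
    print(f"T = {T:5.1f} K  dG = {dG / 1000:+7.1f} kJ/mol  -> {state}")
```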
The thermodynamic temperature scale, though, is connected to the thermodynamic definition of entropy, which I blame for creating the false impression that entropy is all about energy. The definition you refer to is the superior statistical-mechanical definition.
Probably, but the notion of information-theoretic entropy is more general than physical systems.
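For instance, the information-theoretic version is just a property of any probability distribution, nothing physical required. A minimal Python sketch of Shannon entropy (the coin-flip example is my own illustration):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))     # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))     # biased coin: ~0.47 bits
print(shannon_entropy([1.0, 0.0]))     # certain outcome: 0 bits -- no "disorder"
```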
The two definitions are related, though. There was a fascinating article in Scientific American that covered this a few years ago. Both care about the really intriguing parts. The two things ought to have names that are a little different, though.
As far as blame goes, physicist Rudolph Clausius developed the idea of entropy for dealing with heat and energy in physical systems. I blame the other definition for creating the false impression that entropy is not all about energy.
In each example it is possible to imagine that no energy needs to be expended if the wires or cards were infinitely light. Yet the system gets more disordered.
This sounds probably correct to me, and I assume this amounts to saying “The entropy of a system is the logarithm of the number of microstates compatible with that system’s macrostate”, or some such thing. But this leads me to the question: What exactly is a microstate? Or, rather, what makes some property of a system a property of its macrostate, as opposed to merely a property of one of its (more informative) microstates? What is the dividing line?
(Alternatively, if necessary, tell me that I’ve misconceived the concepts altogether)
The two definitions of temperature? Of course one type of temperature more or less reduces to the other, but the definitions themselves talk about completely different sorts of things. The same goes for the two notions of entropy.
Historical side-note: you probably know this, but many are unaware that this is the Latinized name of the youngest son of Santa and Mrs. Claus (named, of course, after their favorite reindeer).
It is unfortunate that the impression that you deride is not more widespread. The contrary impression leads to all sorts of misunderstandings and obfuscation.
Consider the entropy increase upon mixing of two ideal (perfect) gases at constant temperature and pressure. What does that have to do with energy?
I suggest trying to calculate that entropy using the thermodynamic definition. It involves devising a fairly complicated, but reversible, path to the mixed state. When you’re done, you will have arrived at an answer that is obvious from a different perspective that has nothing to do with energy.
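If it helps, here’s a rough Python sketch of both routes for an equimolar mix, assuming ideal-gas behavior; the thermodynamic formula (the end result of that reversible path) and the simple counting argument give the same number:

```python
import math

R = 8.314462618    # gas constant, J/(mol*K)

# Mixing two ideal gases at constant T and P.
# Thermodynamic route (via a reversible path) ends up at:
#   dS_mix = -n_total * R * sum(x_i * ln(x_i))
n1, n2 = 1.0, 1.0                      # moles of gas A and gas B
n = n1 + n2
x1, x2 = n1 / n, n2 / n
dS_mix = -n * R * (x1 * math.log(x1) + x2 * math.log(x2))
print(dS_mix)                          # ~11.5 J/K for an equimolar mix

# Counting route: each gas simply doubles its available volume, so each
# contributes n_i * R * ln(2) -- the same number, with no energy involved.
print(n1 * R * math.log(2) + n2 * R * math.log(2))
```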
A macrostate is a state of the system that can be described entirely in macroscopic quantities referring to the material as a whole. A microstate refers to a specific arrangement of the material’s most fundamental components. Another way of thinking about it: suppose, as is usually the case, that your material is made of atoms. If you can switch two atoms and it doesn’t change the state, that’s a macrostate. If it does, that’s a microstate.
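A toy illustration of the counting (Python; the “particles on the left or right half of a box” setup is my own example): the macrostate is just how many particles are on the left, and the microstates are which particular ones they are — swapping two particles changes the microstate but not the macrostate.

```python
import math

N = 10                                  # toy system: 10 distinguishable particles

# Macrostate: "k particles in the left half."  Microstate: which particular
# particles those are.  Omega counts the microstates compatible with each
# macrostate, and the entropy is Boltzmann's S = k_B * ln(Omega).
for k in range(N + 1):
    omega = math.comb(N, k)             # number of compatible microstates
    S = math.log(omega)                 # entropy in units of k_B
    print(f"{k:2d} on the left: Omega = {omega:4d}, S/k_B = {S:.2f}")

# The evenly split macrostate (k = 5) has the most microstates, i.e. the
# highest entropy -- which is why the gas ends up filling the whole chamber.
```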
>Can the SD give me examples of when entropy manifests itself?
Quantum phenomena produce entropy. For instance, when your computer booted up it needed to generate a lot of random numbers so you could even log in and authenticate. Your computer has a random number generator which samples thermal noise to create random numbers. Entropy doesn’t manifest itself; it’s everywhere and you just sample it.
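If you want to see the sampling in action, this is roughly what asking the operating system’s entropy pool looks like (a minimal Python sketch; os.urandom draws on whatever noise sources the kernel has collected):

```python
import os

# Ask the OS for 16 bytes from its entropy pool.  On most systems the kernel
# mixes in physical noise sources (interrupt timing, and on some chips a
# dedicated hardware random number generator) before handing the bytes back.
random_bytes = os.urandom(16)
print(random_bytes.hex())
```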
Has that actually made it into any production chipset? I’ve heard a lot of talk about how it would be a good thing to do, but I didn’t think it had actually happened yet.
The two definitions I was thinking about were those of entropy.
I don’t think I’d heard about the relation to Santa, but it would explain why Clausius gave us so much basic thermodynamics.
It’s also unfortunate that I derided it, perhaps. Well, I deride the historical misunderstanding, not the preference for what really is the more general concept. The informational and thermal sides of the entropy coin really ought to have different names. I think the ideas behind them, and their applications in the 2nd law of thermodynamics and in information theory, are subtle enough without mixing the two. Historically, the energy version led to the other, but understanding of the theory would develop better by building up the informational version first and then examining its application in special cases, including energy.