Straight Dope Message Board

#1
04-27-2019, 07:15 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 591

## Is the second law of thermodynamics routinely violated?

In the thread on Dualism, it was claimed that we have observed violations of the second law of thermodynamics; it was even suggested that large-scale violations occur with relative frequency. My response is that we have not observed violations, and that a single violation of the second law would result in a paradigm shift for many fields of science.

There are multiple formulations of the second law of thermodynamics. Here are two formulations of the same law, which I was taught early on in secondary school[1][2]:

Quote:
 Originally Posted by Rudolf Clausius Heat can never pass from a colder to a warmer body without some other change, connected therewith, occurring at the same time.
Quote:
 Originally Posted by Lord Kelvin It is impossible, by means of inanimate material agency, to derive mechanical effect from any portion of matter by cooling it below the temperature of the coldest of the surrounding objects.
There is also an axiomatic formulation by Constantin Carathéodory[3][4]:

Quote:
 Originally Posted by Constantin Carathéodory In jeder beliebigen Umgebung eines willkürlich vorgeschriebenen Anfangszustandes gibt es Zustände, die durch adiabatische Zustandsänderungen nicht beliebig approximiert werden können.
Quote:
 Originally Posted by Translation by D. H. Delphenich In any arbitrary neighborhood of an arbitrarily given initial point there is a state that cannot be arbitrarily approximated by adiabatic changes of state.
Wikipedia says Carathéodory's formulation is incomplete without adding the following, although I cannot access the source or a translation[5][6]:

Quote:
 Originally Posted by Max Planck The internal energy of a closed system is increased by an adiabatic process, throughout the duration of which, the volume of the system remains constant.
[1] Clausius, M. R. (1856, August). On a modified Form of the second Fundamental Theorem in the Mechanical Theory of Heat. London, Edinburgh and Dublin Philosophical Magazine and Journal of Science, Series 4, 12(77), 86. Retrieved from https://archive.org/stream/londonedi...ge/86/mode/2up
[2] Thomson, W. (1853). On the Dynamical Theory of Heat, with numerical results deduced from Mr Joule's equivalent of a Thermal Unit, and M. Regnault's Observations on Steam. Transactions of the Royal Society of Edinburgh, 20(2), 265. Retrieved from https://digital.nls.uk/scientists/archive/74629508
[3] Carathéodory, C. (1909, September). Untersuchungen über die Grundlagen der Thermodynamik. Mathematische Annalen, 67(3), 363. https://doi.org/10.1007/BF01450409
[4] Carathéodory, C. (n.d.). Examination of the foundations of thermodynamics [PDF file]. (D. H. Delphenich, Trans.) Retrieved from http://neo-classical-physics.info/up...modynamics.pdf (Original work published 1909).
[5] Second law of thermodynamics. (n.d.). In Wikipedia. Retrieved April 27, 2019, from https://en.wikipedia.org/wiki/Second...thermodynamics
[6] Planck, M. (1926). Über die Begründung des zweiten Hauptsatzes der Thermodynamik. Sitzungsberichte der Preussischen Akademie der Wissenschaften, Physikalisch-Mathematische Klasse, 1926, 453-463.
#2
04-27-2019, 07:18 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 591
Quote:
Originally Posted by eschereal
Quote:
Originally Posted by Max S.
Quote:
 Originally Posted by eschereal In fact, I suspect large-scale violations occur with relative frequency. If you have a 5000K star orbiting an 8000K star, the colder one will emit radiant heat that will be absorbed by the hotter one. The net heat flow will be toward the colder one, but the hotter still receives some heat from it.
Simultaneous heat flow from the warm star to the cold star or even radiation to outer space would qualify as "some other change, connected therewith, occurring at the same time".

Even if you are using Carathéodory's principle with Planck's principle, there is still no violation. A (closed) system consisting of two cold stars in space is not adiabatically accessible from a system consisting of a warm star and a cold star in space. A warm star cannot turn into a cold star without suffering some transfer of heat or energy, either to the cold star or to space.

~Max
Adiabasis is a system mechanism. It pertains to fluid movement. Stars do not primarily emit heat through fluid transfer, outside the internal convection zone within the star. Most heat escaping a star is radiative, in the form of IR and other photons. The radiated heat goes out in all directions (though possibly less at the poles), so a star will gradually cool as it sheds heat into space.

Stars come in many different flavors, some hotter than others, so it is not unusual to have a hotter/colder binary pair. A small fraction of one star's radiative heat output will impinge on its partner, and vice versa. Hence, the colder star will move some of its heat to the hotter star, albeit less than the hotter star is moving to the colder star. A bunch of photons moving on a path are not deterred by a bigger bunch of photons coming the other way.
I've only heard of "adiabatic" in reference to a system where heat does not leave the system, and that was the meaning I used.

If you are describing a system consisting of a 5000K star orbiting an 8000K star, and you allow radiation to "leave" the system, then the second law of thermodynamics as formulated by Carathéodory (and Planck) does not apply. It is not violated; it does not apply to the system you have described.

If however, you include the surrounding space in the system, extending to infinity and encompassing all radiation emitted, then the second law of thermodynamics holds true.

~Max
#3
04-27-2019, 08:39 PM
 Charter Member Join Date: Aug 2001 Posts: 16,121
I thought if a law gets violated in physics, then it is no longer a law. Or in fact, never was one.
#4
04-27-2019, 10:13 PM
 Guest Join Date: May 2014 Location: South Bay, SoCal Posts: 205
Quote:
 Originally Posted by Grrr! I thought if a law gets violated in physics, then it is no longer a law. Or in fact, never was one.
At any reasonable macroscopic level, the 2nd law of thermodynamics has never been demonstrated to be violated, experimentally or theoretically. At the microscopic level, where statistical probability is potentially, locally, over very short times, superseded by quantum mechanical effects, it is possible to view some systems as violating the second law (though it is not clear to me that experimentally it is possible to create a completely isolated system as required by the second law, given the extent of the quantum field).

You should never take your scientific knowledge from blithe pronouncements in a thread on philosophy. Philosophers, apparently, can't be bothered to add in all the caveats and conditions that go along with the "violation" of a scientific principle.
#5
04-28-2019, 12:44 AM
 Member Join Date: Nov 2004 Location: Hey! I'm located! WOOOOW! Posts: 41,197
Quote:
 Originally Posted by Max S. I've only heard of "adiabatic" in reference to a system where heat does not leave the system, and that was the meaning I used.
Not heat: energy. Heat is merely one of many forms of energy.
__________________
Evidence gathered through the use of science is easily dismissed through the use of idiocy. - Czarcasm.
#6
04-28-2019, 05:02 AM
 Guest Join Date: Jun 2007 Posts: 6,688
Quote:
 Originally Posted by peccavi At any reasonable macroscopic level, the 2nd law of thermodynamics has never been demonstrated to be violated, experimentally or theoretically. At the microscopic level, where statistical probability is potentially, locally, over very short times, superseded by quantum mechanical effects, it is possible to view some systems as violating the second law (though it it not clear to me that experimentally it is possible to create a completely isolated system as required by the second law, given the extent of the quantum field).
Quantum theory doesn't enter into it. For any system of N particles, there's a probability scaling with 1/sqrt(N) that violations of the second law will be observed. For any macroscopic system, that probability is so low as to be completely negligible, while for systems of few constituents (which we can take to be classical), violations can and do occur.
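That scaling is easy to see numerically. Here is a minimal sketch (my own illustration; the function name and sample sizes are arbitrary): place N particles independently in the left or right half of a box, and measure how far the left-hand fraction strays from an even 50/50 split.

```python
import math
import random

def spread_of_left_fraction(n_particles, n_samples=1000, seed=0):
    """Place n_particles independently in the left or right half of a box
    and return the standard deviation, across many samples, of the
    fraction that ended up on the left."""
    rng = random.Random(seed)
    fracs = [sum(rng.random() < 0.5 for _ in range(n_particles)) / n_particles
             for _ in range(n_samples)]
    mean = sum(fracs) / n_samples
    return math.sqrt(sum((f - mean) ** 2 for f in fracs) / n_samples)

# The spread around the even split shrinks like 1/(2*sqrt(N)):
for n in (100, 10_000):
    print(n, round(spread_of_left_fraction(n), 4))
```

For 100 particles the fraction fluctuates by about 5%; for 10,000 particles, by about 0.5%. A macroscopic mole of gas leaves essentially nothing observable.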

Entropy is fundamentally a measure for how many microstates (unobservable individual particle arrangements, say) lead to the same macrostate (the gross, macroscopic properties of a system). So, there are many more microstates corresponding to all of the gas in a room being evenly distributed, than there are corresponding to all the gas bunching up in the middle; hence, the former is a high-entropy state, while the latter is a low-entropy state.

If you start out with a low-entropy configuration, any given change is likely to lead to a higher-entropy configuration, just by virtue of how many more ways there are to increase entropy, than to decrease it. But that doesn't mean that spontaneous decreases of entropy are impossible; it merely means that they're unlikely.

Take a toy system of three 'particles', 1, 2, and 3, carrying one unit of energy each, which can be in each of three boxes A, B, and C. This gives us the following ten distinguishable macrostates:
• (A1B1C1) Even distribution: each box contains one particle
• (A2B1C0) Box A contains two particles, B one, C none
• (A2B0C1) Box A contains two particles, B none, C one
• (A1B2C0) Box A contains one particle, B two, C none
• (A0B2C1) Box A contains no particle, B two, C one
• (A1B0C2) Box A contains one particle, B none, C two
• (A0B1C2) Box A contains no particle, B one, C two
• (A3B0C0) Box A contains all three
• (A0B3C0) Box B contains all three
• (A0B0C3) Box C contains all three

We can compute the probability of each macrostate by counting the number of microstates that realize it. The first macrostate can be realized by every permutation of the particles distributed over the boxes; thus, there are 3! = 6 possible ways to realize it. The last three macrostates can be realized in exactly one way. The six remaining cases can each be realized in three different ways (e. g. for (A2B1C0), particles 1 and 2, 1 and 3, or 2 and 3 could be in box A, the remaining one in box B).

In total, we thus have 6 + 6 * 3 + 3 * 1 = 27 possible microstates (which of course we knew, since there are 3 * 3 * 3 possibilities to distribute 3 objects among 3 boxes). Now, we can calculate the probabilities for each macrostate, as well as their entropy (as the natural logarithm of the number of ways in which it can be realized, i. e. the microstates):
• P(A1B1C1): 6 / 27 = 2 / 9 = 0.2222...; S(A1B1C1) = ln(6) ~ 1.8
• P(A2B1C0): 3 / 27 = 1 / 9 = 0.1111...; S(A2B1C0) = ln(3) ~ 1.1
• P(A2B0C1): 3 / 27 = 1 / 9 = 0.1111...; S(A2B0C1) = ln(3) ~ 1.1
• P(A1B2C0): 3 / 27 = 1 / 9 = 0.1111...; S(A1B2C0) = ln(3) ~ 1.1
• P(A0B2C1): 3 / 27 = 1 / 9 = 0.1111...; S(A0B2C1) = ln(3) ~ 1.1
• P(A1B0C2): 3 / 27 = 1 / 9 = 0.1111...; S(A1B0C2) = ln(3) ~ 1.1
• P(A0B1C2): 3 / 27 = 1 / 9 = 0.1111...; S(A0B1C2) = ln(3) ~ 1.1
• P(A3B0C0): 1 / 27 = 0.037...; S(A3B0C0) = ln(1) = 0
• P(A0B3C0): 1 / 27 = 0.037...; S(A0B3C0) = ln(1) = 0
• P(A0B0C3): 1 / 27 = 0.037...; S(A0B0C3) = ln(1) = 0
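All of this counting can be verified in a few lines of Python (a minimal sketch; the variable names are my own):

```python
from collections import Counter
from itertools import product
from math import log

# Enumerate all 3^3 = 27 microstates: each of the three labelled
# particles sits in one of the boxes A, B, or C.
microstates = list(product('ABC', repeat=3))

# A macrostate only records how many particles each box holds.
macrostates = Counter((m.count('A'), m.count('B'), m.count('C'))
                      for m in microstates)

# Probability and entropy (ln of the number of realizations) per macrostate:
for occ, ways in sorted(macrostates.items(), key=lambda kv: -kv[1]):
    print(occ, f"{ways} microstates, P = {ways}/27, S = ln({ways}) = {log(ways):.2f}")
```

Running it reproduces the table above: 6 microstates for the even distribution, 3 each for the six mixed macrostates, and 1 each for the three all-in-one-box macrostates.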

Assuming a simple dynamics of particles just hopping to a random box at each time step, we see that, while it's more likely for entropy to increase with each step, it's by no means impossible for it to decrease, as well. Starting out in one of the lowest entropy states, the probability that the next timestep will lead to a higher entropy state is 24 / 27 ~ 89%, with an 11% chance of staying the same. An intermediate-entropy state has an 11% chance of decreasing entropy (and thus, violating the second law, as a colder box will transfer energy to an already hotter one, say in the transition (A0B1C2) --> (A0B0C3)), a 22% chance of increasing, and a 67% chance of remaining equal.

As you increase the size of the system, these probabilities will favor higher entropy states ever more strongly. Still, if you wait long enough, you will observe violations of the second law even there---it's just that the wait time will quickly exceed even cosmological timescales.
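The hopping dynamics itself is just as easy to simulate (a minimal sketch of the toy model above; under the random-hop rule, each step lands on a uniformly random microstate):

```python
import random
from math import log

# Microstate counts per macrostate of 3 particles in 3 boxes,
# keyed by the sorted occupation numbers.
WAYS = {(0, 0, 3): 1, (0, 1, 2): 3, (1, 1, 1): 6}

def entropy(state):
    """ln W for the macrostate of `state`, where state[i] is particle i's box."""
    return log(WAYS[tuple(sorted(state.count(b) for b in range(3)))])

rng = random.Random(1)
state = (0, 0, 0)                  # all three particles in box A: S = ln 1 = 0
up = down = 0
STEPS = 20_000
for _ in range(STEPS):
    # every particle hops to a random box (possibly the one it is in)
    new = tuple(rng.randrange(3) for _ in range(3))
    dS = entropy(new) - entropy(state)
    up += dS > 0
    down += dS < 0
    state = new

# Entropy decreases on a sizeable fraction of steps in this tiny system.
print(f"entropy fell on {down / STEPS:.0%} of steps")
```

In this three-particle system, entropy actually falls on roughly a quarter of all steps; scale the particle count up and such down-steps rapidly become astronomically rare, which is the whole point.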
#7
04-28-2019, 11:59 AM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 591
Quote:
 Originally Posted by Half Man Half Wit Assuming a simple dynamics of particles just hopping to a random box at each time step...
By assuming the particles are "hopping to a random box" you have assumed the conclusion. The second law of thermodynamics specifically contradicts such a process.

~Max
#8
04-28-2019, 12:18 PM
 Guest Join Date: May 2016 Posts: 3,048
Quote:
 Originally Posted by Max S. By assuming the particles are "hopping to a random box" you have assumed the conclusion. The second law of thermodynamics specifically contradicts such a process. ~Max
The idea (postulate?) is that an isolated system is in thermal equilibrium if and only if every microstate is equally probable.
#9
04-28-2019, 12:47 PM
 Guest Join Date: Jun 2007 Posts: 6,688
Quote:
 Originally Posted by Max S. By assuming the particles are "hopping to a random box" you have assumed the conclusion. The second law of thermodynamics specifically contradicts such a process. ~Max
No. Any other dynamics would work just as well, as long as all microstates are equally probable.

Really, none of this is controversial in the slightest. Every physics student learns it in a course on statistical mechanics. People have even used this to try and argue that the universe itself might be due to a spontaneous fluctuation from a high-entropy state into a low-entropy state:
Quote:
 Originally Posted by Sean Carroll The idea is that, since the tendency of entropy to increase is statistical rather than absolute, starting from a state of maximal entropy we would (given world enough and time) witness downward fluctuations into lower-entropy states.

That idea doesn't work, but the general reasoning is sound.

Last edited by Half Man Half Wit; 04-28-2019 at 12:48 PM.
#10
04-28-2019, 02:55 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 591
Quote:
 Originally Posted by DPRK The idea (postulate?) is that an isolated system is in thermal equilibrium if and only if every microstate is equally probable.
I always thought thermodynamic equilibrium was when two systems are in contact with no net flow of energy (heat). By allowing for microstatic particles to randomly reconfigure into distinct macrostates, you have also defied thermodynamic equilibrium.

Even the kinetic theory, to my knowledge, only allows for random movements so long as they express the same macrostates. But I might be wrong on that.

~Max
#11
04-28-2019, 02:59 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 591
Quote:
 Originally Posted by Half Man Half Wit No. Any other dynamics would work just as well, as long as all microstates are equally probable.
No, it is only highly likely that all other dynamics continue to work. You have reduced the laws of thermodynamics from laws to rules of thumb.

~Max
#12
04-28-2019, 03:29 PM
 Guest Join Date: Jun 2007 Posts: 6,688
Quote:
 Originally Posted by Max S. No, it is only highly likely that all other dynamics continue to work. You have reduced the laws of thermodynamics from laws to rules of thumb. ~Max
By 'any other dynamics would work' I meant that the dynamics I gave, the 'random hopping' you (misguidedly) took issue with, isn't essential to demonstrating my point. All that's needed is the equiprobability of microstates. If that's given, any dynamics will lead to violations of the second law, whether you want to accept that or not.
#13
04-28-2019, 03:38 PM
 Guest Join Date: Jun 2007 Posts: 6,688
Quote:
 Originally Posted by Max S. Even the kinetic theory, to my knowledge, only allows for random movements so long as they express the same macrostates. But I might be wrong on that. ~Max
How do you suppose that could possibly work? How does the molecule know not to drift over into the other half of the room, for fear of changing the macrostate?

All that's happening is that there are a lot fewer ways to change to a lower entropy macrostate than to one with higher entropy. Hence, the latter sort of thing happens more often.
#14
04-28-2019, 05:05 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 591
Quote:
 Originally Posted by Half Man Half Wit By 'any other dynamics would work' I meant that the dynamics I gave, the 'random hopping' you (misguidedly) took issue with, isn't essential to demonstrating my point. All that's needed is the euiprobability of microstates. If that's given, any dynamics will lead to violations of the second law, whether you want to accept that or not.
Has science proved the equiprobability of all possible microstates in a system? This isn't a given at all, since it violates the second law. What reason do I have to think it is so?

I apologize in advance for my ignorance of modern physics, which I suspect factors into your answer. All I've got to go by is grade school physical science, the few books I have read on the subject, and the internet.

~Max
#15
04-28-2019, 06:28 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 591
Quote:
 Originally Posted by Half Man Half Wit How do you suppose that could possibly work? How does the molecule know not to drift over into the other half of the room, for fear of changing the macrostate? All that's happening is that there are a lot fewer ways to change to a lower entropy macrostate than to one with higher entropy. Hence, the latter sort of thing happens more often.
I can't even pretend to explain the mechanisms of molecular physics. But I thought the kinetic theory was compatible with the laws of thermodynamics - if so, somehow the molecules of a system in equilibrium will have to cancel each other out.

~Max
#16
04-28-2019, 11:49 PM
 Guest Join Date: May 2016 Posts: 3,048
Quote:
 Originally Posted by Max S. I always thought thermodynamic equilibrium was when two systems are in contact with no net flow of energy (heat). By allowing for microstatic particles to randomly reconfigure into distinct macrostates, you have also defied thermodynamic equilibrium. Even the kinetic theory, to my knowledge, only allows for random movements so long as they express the same macrostates. But I might be wrong on that. ~Max
It is valid that two systems are in mutual (thermal) equilibrium if there is no net flow of heat when they are in diathermal contact. However, imagine just one isolated system: if it is not in equilibrium, then you can imagine that under some extreme conditions it might not even have a well-defined temperature. On the other hand, once the system is in equilibrium all the "random movements" make no difference to the macrostate, assuming it is sufficiently "macro".

If you are suggesting that certain transitions are allowed but others are not then it is important to note that that is not the case: these transitions are reversible microscopic fluctuations and each transition is as likely to occur as its converse. Again, nothing is defied: in the macroscopic limit these fluctuations will be too small to detect.
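That each fluctuation is exactly as likely as its converse can be checked in the three-box toy model from earlier in the thread (a minimal sketch, assuming the random-hopping dynamics, under which every microstate-to-microstate transition has probability 1/27):

```python
from fractions import Fraction
from itertools import product

# All 27 microstates of three particles in three boxes (0, 1, 2).
states = list(product(range(3), repeat=3))
p = Fraction(1, 27)   # each microstate is equally probable in equilibrium

def macro(s):
    """Sorted occupation numbers: the macrostate a microstate belongs to."""
    return tuple(sorted(s.count(b) for b in range(3)))

def flow(X, Y):
    """Equilibrium probability of observing macrostate X followed by Y in
    one step (each of the 27 x 27 microstate transitions has probability
    1/27 from a 1/27-probable starting microstate)."""
    return sum(p * p for s in states if macro(s) == X
                     for t in states if macro(t) == Y)

# The entropy-lowering transition and its reverse are exactly as probable:
print(flow((0, 1, 2), (0, 0, 3)) == flow((0, 0, 3), (0, 1, 2)))  # True
```

The entropy-lowering flow into an all-in-one-box macrostate exactly balances the entropy-raising flow out of it, so nothing about equilibrium forbids the downward fluctuation.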
#17
04-29-2019, 12:12 AM
 Guest Join Date: Jun 2007 Posts: 6,688
Quote:
 Originally Posted by Max S. Has science proved the equiprobability of all possible microstates in a system? This isn't a given at all, since it violates the second law. What reason do I have to think it is so?
Quote:
 Originally Posted by Max S. I can't even pretend to explain the mechanisms of molecular physics. But I thought the kinetic theory was compatible with the laws of thermodynamics - if so, somehow the molecules of a system in equilibrium will have to cancel each other out. ~Max
This isn't really the sort of thing one can prove, but denying it would lead to violation of several key tenets of modern science. For instance, reversibility: for a molecule to enter a certain region, leading to a higher-entropy macrostate, reversibility holds that the opposite must also be possible, which it wouldn't be, if for some reason the fact that the macrostate has a high entropy should prohibit this sort of development. Likewise, local causality: the options for a molecule in a sufficiently big macroscopic system would suddenly depend on the configuration of the system all the way over there, without any interaction between the parts.

Furthermore, it would introduce a sort of 'downward causation' incompatible with the fundamentally reductionistic nature of physics, i. e. with the idea that fixing the microscopic dynamics suffices to fix everything, because suddenly, the motion of molecules no longer depends on their interactions with their surroundings, but also, on whether they are considered to be part of a larger system in some macrostate. Whatever mathematics one uses to describe the motion of the molecules then would not only depend on their position and momenta, but also on the macrostate they're part of, which really basically throws the whole edifice of classical physics as based on Hamiltonian mechanics overboard.

Finally, the assumption of equiprobability is in fact a key ingredient in deriving macroscopic thermodynamics from microscopic statistical physics, as carried out chiefly by Boltzmann in the 1870s (so not that terribly modern). The implication of the possibility of violations of the second law was realized pretty much immediately, leading Boltzmann to propose his idea that the universe could be a spontaneous low-entropy fluctuation (however, the possibility of Boltzmann brains seems to doom this scenario).
#18
04-29-2019, 02:54 AM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 591
Quote:
 Originally Posted by DPRK It is valid that two systems are in mutual (thermal) equilibrium if there is no net flow of heat when they are in diathermal contact. However, imagine just one isolated system: if it is not in equilibrium, then you can imagine that under some extreme conditions it might not even have a well-defined temperature. On the other hand, once the system is in equilibrium all the "random movements" make no difference to the macrostate, assuming it is sufficiently "macro". If you are suggesting that certain transitions are allowed but others are not then it is important to note that that is not the case: these transitions are reversible microscopic fluctuations and each transition is as likely to occur as its converse. Again, nothing is defied: in the macroscopic limit these fluctuations will be too small to detect.
There's just... something that I'm missing here. Maybe I'm still hung up on the "random hopping". I'll sleep on it and maybe it will make sense tomorrow.

~Max
#19
04-29-2019, 03:44 AM
 Guest Join Date: Dec 2009 Location: The Land of Smiles Posts: 19,139
I was just re-reading Cycles of Time by Roger Penrose and find his discussion on pages 39-43 pertinent to this thread — he even shows apparent counterexamples to the Second Law.

Linking to Google Books is a big pain, but the excerpt shows up (strangely and without attribution?) here, though only the discussion at the end of that page is most relevant.

Quote:
 Originally Posted by Roger Penrose All this having been said, it must however be pointed out that there are various particularly subtle situations where such crude notions of 'macroscopic indistinguishability' would appear to be inadequate, and even seem to give us quite wrong answers for the entropy! One such situation occurs with the phenomenon of spin echo (first noticed by Erwin Hahn in 1950) that is made use of in connection with nuclear magnetic resonance (NMR). ... ... The fact is that even though the spin states would appear to be very disordered in the intermediate situation, there is actually a very precise 'hidden order' in the apparently higgledy-piggledy arrangement of spins, this order being revealed only when the pattern of external magnetic field movements is reversed. Something analogous occurs with a CD or DVD, where any ordinary crude 'macroscopic measurement' would be likely not to reveal the very considerable stored information on such a disc ... In my opinion, there is a 'can of worms' here, if one demands that there should be a precise objective definition of physical entropy, applicable in all circumstances, with respect to which the Second Law is to be universally valid. I do not see why one should demand that there always be a well-defined, physically precise notion of 'entropy', that is entirely objective and consequently 'out there' in Nature, in some absolute sense, ... Entropy is clearly an extremely useful physical concept, but I do not see why it need be assigned a truly fundamental and objective role in physics. Indeed, it seems reasonable to me that the usefulness of the physical notion of entropy has its origin largely in the fact that, for systems that we tend to encounter in the actual universe, it turns out that the normal measures of 'macroscopic' quantities give rise to coarse-graining volumes that do in fact differ from one another by stupendously large factors. 
There is a profound issue, however, as to why, in the universe that we know, they should differ by such enormous factors. These enormous factors reveal a remarkable fact about our universe that does seem to be clearly objective and 'out there'—and we shall be coming to this shortly—despite the admittedly confusing issues of subjectivity that are involved in our concept of 'entropy', these serving merely to cloud the central mystery that underlies the profound usefulness of this remarkable physical notion.
#20
04-29-2019, 11:56 AM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 591
Quote:
 Originally Posted by Half Man Half Wit This isn't really the sort of thing one can prove, but denying it would lead to violation of several key tenets of modern science. For instance, reversibility: for a molecule to enter a certain region, leading to a higher-entropy macrostate, reversibility holds that the opposite must also be possible, which it wouldn't be, if for some reason the fact that the macrostate has a high entropy should prohibit this sort of development.
Now here's something I don't understand. If a molecule enters a thermodynamic system from without, thereby imparting kinetic energy upon the system, that system is beyond the scope of the second law of thermodynamics. The process of a molecule entering a system is not adiabatic and the system is not isolated. The ensuing temperature change would be reversible at the system level by simply ejecting an equivalent amount of kinetic energy back into the environment, possibly the same molecule with the same energy coming out the opposite side of the system.

~Max
#21
04-29-2019, 12:01 PM
 Member Join Date: Jul 1999 Location: Chicagoland Posts: 1,918
In science, a "law" has no exceptions. Occasionally, one hears a claim that some "law" or other is being violated but, invariably, the true explanation comes to light that negates that claim.
#22
04-29-2019, 12:13 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 591
Quote:
 Originally Posted by Half Man Half Wit Likewise, local causality: the options for a molecule in a sufficiently big macroscopic system would suddenly depend on the configuration of the system all the way over there, without any interaction between the parts.
The possible internal states of a system should be restricted by the properties of the system. This is a correlation made at every instant. The actual cause for the molecule's behavior should be determined by previous interactions.

~Max
#23
04-29-2019, 12:20 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 591
Quote:
 Originally Posted by septimus I was just re-reading Cycles of Time by Roger Penrose and find his discussion on pages 39-43 pertinent to this thread — he even shows apparent counterexamples to the Second Law. Linking to Google Books is a big pain, but the excerpt shows up (strangely and without attribution?) here, though only the discussion at the end of that page is most relevant.
I'm sorry, but that flew right over my head. How is he defining "spins" and "entropy", and what does that have to do with the second law of thermodynamics? And I don't understand the CD or DVD example at all.

~Max

Last edited by Max S.; 04-29-2019 at 12:23 PM.
#24
04-29-2019, 01:23 PM
 Guest Join Date: Dec 2009 Location: The Land of Smiles Posts: 19,139
Note that TWO very unrelated sorts of Second Law violations (or apparent violations) are being discussed here.

Further adding confusion is the use of two different definitions of entropy. One formula is dS = δq/T which relates entropy to heat and temperature. Another formula, used by Penrose and other theoretical physicists, is S = k log V where V is the volume of a macrostate in phase space. The first formula is a special case of the second.

Quote:
Originally Posted by Max S.
Quote:
 Originally Posted by Half Man Half Wit ... For instance, reversibility: for a molecule to enter a certain region, leading to a higher-entropy macrostate, reversibility holds that the opposite must also be possible, which it wouldn't be, if for some reason the fact that the macrostate has a high entropy should prohibit this sort of development.
Now here's something I don't understand. If a molecule enters a thermodynamic system from without, thereby imparting kinetic energy upon the system, that system is beyond the scope of the second law of thermodynamics ...
I think that in Mr. Wit's example the molecules are all inside a closed system, but one molecule moves from one region to another within that system.

Quote:
 Originally Posted by Max S. I'm sorry, but that flew right over my head. How is he defining "spins" and "entropy" and what does that have to do with the second law of thermodynamics? And I don't understand the CD or dye example at all.
Penrose gives examples of systems which appear to be in large boring "macrostates" but which in fact have useful information. In the CD example, apparently random imperfections can be transformed into a Hollywood movie, so disorder would seem to decrease when the disk is played. But this is just because macrostates are too fuzzy a notion to define easily.

Disclaimer: I am unqualified. If there are no major errors in the preceding please give me six brownie points!

Last edited by septimus; 04-29-2019 at 01:25 PM.
#25
04-29-2019, 03:12 PM
 Guest Join Date: Jun 2007 Posts: 6,688
Quote:
 Originally Posted by Max S. Now here's something I don't understand. If a molecule enters a thermodynamic system from without, thereby imparting kinetic energy upon the system, that system is beyond the scope of the second law of thermodynamics. The process of a molecule entering a system is not adiabatic and the system is not isolated. The ensuing temperature change would be reversible at the system level by simply ejecting an equivalent amount of kinetic energy back into the environment, possibly the same molecule with the same energy coming out the opposite side of the system. ~Max
Septimus has it right: I was talking about a single system, where, in order to realize this sort of thing, somehow the movements of individual molecules would have to be restricted by the overall macrostate. Picture a gas, homogeneously distributed in a box; in order to ensure that it stays that way, and doesn't, however rarely, bunch up in one corner, something would have to direct the movements of individual molecules so as to prohibit such a thing from occurring. That's basically magic: the molecule itself might not interact with anything, but freely drift through space, where all of a sudden, it is being turned away, lest the corner get too full.

And again, I would like to point out that basically everybody agrees on this. Boltzmann himself, who pretty much came up with this picture, considered the possibility of random fluctuations to lower entropy states. Why are you so invested in that not occurring?

Picture again the gas, trillions and trillions of molecules zipping around. What should prevent their velocities, after who knows how many collisions, from aligning such that they point in more or less the same direction? All that happens is little billiard balls caroming off one another. Unlikely? Of course. But calling it impossible would entail something entirely mysterious occurring, some invisible guiding hand turning away excess molecules.
#26
04-29-2019, 03:34 PM
 Guest Join Date: Jun 2007 Posts: 6,688
Quote:
 Originally Posted by Max S. The possible internal states of a system should be restricted by the properties of the system. This is a correlation made at every instant. The actual cause for the molecule's behavior should be determined by previous interactions. ~Max
The two sentences are actually in direct contradiction. If only previous interactions determine the molecule's behavior at any given time, then there's nothing stopping it from moving such that (in concert with others) the entropy is lowered.

If, however, you insist on macroscopic properties restricting microscopic motion, then there needs to be some determining factor of the motion of a particle that goes beyond just its past interactions.

Picture a billiard table with a single ball (and we're doing physics here, so the billiard ball is a point mass with no friction). Set it in motion. If you haven't taken care to set up some simple periodic path, it'll eventually visit every point on the table, with none being particularly distinguished.

Now, add a second ball. The two balls can interact by careening off one another.

Divide the table into two halves. Suppose you can only distinguish which half of the table a ball is in. There are more ways for both balls to be in separate halves than there are for both to be in the same half. Just mentally chop the table into ball-sized areas: if there are four such areas (two in each half), there are two ways for both balls to be in a given half, and eight for them to be in separate halves. Increasing the number of areas increases the disparity.
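(This counting is easy to verify by brute force; the following is a minimal sketch, not from the thread, with the areas labeled 0-3 and areas 0 and 1 forming the left half.)

```python
from collections import Counter
from itertools import permutations

# Four ball-sized areas: 0 and 1 make up the left half, 2 and 3 the right.
LEFT = {0, 1}

# A microstate places the two distinguishable balls (A, B) in distinct areas.
microstates = list(permutations(range(4), 2))  # (area of A, area of B)

def macrostate(ms):
    """Coarse-grained view: only the number of balls in each half."""
    n_left = sum(1 for area in ms if area in LEFT)
    return (n_left, 2 - n_left)

counts = Counter(macrostate(ms) for ms in microstates)
print(counts[(2, 0)], counts[(1, 1)])  # prints: 2 8
```

Swapping in more areas or more balls makes the same disparity grow combinatorially.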

What you're saying, now, amounts to saying that, if a ball already occupies one half, the other can't enter that half anymore. That is, whenever it threatens to cross the boundary, it must somehow get turned away, without interacting with the other ball or the walls.

Now, just increase the number of balls. Just set a mighty lot of them adrift on the table. They'll bump into each other, and simply by virtue of there being many more ways to be roughly evenly distributed, they'll spend most of their time that way.

But there's nothing about the way they're bouncing off of each other that forbids them from, occasionally, bunching up in one half of the table---and occasionally, that's what they will do. That's all there is to it. Anything else would require intervention by some mysterious agency we have no reason at all to believe should exist.
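(To make "occasionally, that's what they will do" concrete, here is a toy sketch. It replaces the actual billiard dynamics with random hops between halves, an assumption that keeps the counting argument while discarding the mechanics.)

```python
import random

random.seed(0)
N, STEPS = 4, 100_000
halves = [random.randint(0, 1) for _ in range(N)]  # which half each ball occupies

bunched = 0  # steps on which every ball sat in the same half
for _ in range(STEPS):
    i = random.randrange(N)
    halves[i] ^= 1  # one ball drifts across the midline
    if len(set(halves)) == 1:
        bunched += 1

# With N = 4 the long-run fraction sits near 2/2**N = 12.5%; each extra
# ball roughly halves it, so bunching becomes absurdly rare for large N.
print(f"all {N} balls in one half on {bunched / STEPS:.1%} of steps")
```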

Last edited by Half Man Half Wit; 04-29-2019 at 03:34 PM.
#27
04-29-2019, 08:32 PM
 Guest Join Date: May 2016 Posts: 3,048
Quote:
 Originally Posted by Half Man Half Wit Anything else would require intervention by some mysterious agency we have no reason at all to believe should exist.
Maxwell's demon? I saw one on the subway last week, I swear! Or was that a Tasmanian devil? Norwegian something?
#28
04-29-2019, 11:58 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 591
Quote:
 Originally Posted by Half Man Half Wit The two sentences are actually in direct contradiction. If only previous interactions determine the molecule's behavior at any given time, then there's nothing stopping it from moving such that (in concert with others) the entropy is lowered. If, however, you insist on macroscopic properties restricting microscopic motion, then there needs to be some determinating factor of the motion of a particle that goes beyond just its past interactions. Picture a billard table with a single ball (and we're doing physics here, so the billiard ball is a point mass with no friction). Set it in motion. If you haven't taken care to set up some simple periodic path, it'll eventually visit every point on the table, with none being particularly distinguished. Now, add a second ball. The two balls can interact by careening off one another. Divide the table into two halves. Suppose you can only distinguish which half of the table a ball is in. There are more ways for both balls to be in separate halves, than there are for both to be in the same half. Just mentally chop the table into ball-sized areas: if there are four such areas (two in each half), so there are two ways for both balls to be in one half, and eight for them to be in separate halves. Increasing the number of areas increases the disparity What you're saying, now, amounts to saying that, if a ball already occupies one half, the other can't enter that half anymore. That is, whenever it threatens to cross the boundary, it must somehow get turned away, without interacting with the other ball or the walls. Now, just increase the number of balls. Just set a mighty lot of them adrift on the table. They'll bump into each other, end simply by virtue of there being many more ways to be roughly evenly distributed, they'll spend most of their time that way. 
But there's nothing about the way they're bouncing off of each other that forbids them from, occasionally, bunching up in one half of the table---and occasionally, that's what they will do. That's all there's to it. Anything else would require intervention by some mysterious agency we have no reason at all to believe should exist.
This example is much easier for me to understand.

I assume the walls of the table are perfect and reflect 100% of energy, and there is no friction from air or the tabletop. The billiard table itself, by virtue of being a perfect box, is an isolated system. Therefore the table expresses but one macrostate, as you defined it. There are always N balls in the billiard table, and the sum kinetic energy of all the billiard balls remains constant no matter how many times they collide. No energy flows into or out of the system - the table remains in equilibrium with its surroundings the entire time. That is consistent with the first and second laws of thermodynamics.

Then you divide the table into two halves and call each half its own thermodynamic system. You observe the table and find the majority of the billiard balls to be on one half, and then the other, and then the first half again, etcetera. You could say each half of the table represents a thermodynamic system, although the two halves are clearly not in equilibrium.

Let us freeze time to examine the billiard table. At this moment there are exactly N/3 billiard balls on the near half of the table and 2N/3 billiard balls on the far half. Now let us unfreeze time for just a moment, and we see that there are 0 billiard balls on the near half and N billiard balls on the far half. Has the second law of thermodynamics been violated?

Under Clausius's form of the law, heat cannot pass from a colder body to a warmer body without some other change occurring. If we redefine heat as a function of the billiard balls in a system, heat has been transferred from the near half to the far half. But at the same time, there has been a change - namely the gross translational momentum of billiard balls in the far half (accounting for perfect bouncing against the three closed walls of the table). Because we have done away with friction and other such things, every collision of billiard balls is a perfectly elastic collision - kinetic energy is always conserved. It follows that, if at least one billiard ball crosses from one half to the other, the two systems will never reach equilibrium and heat will forever fluctuate from one half to the other.

Lord Kelvin's formulation is equivalent to Clausius's.

Carathéodory's formulation is never invoked because the billiard ball moving from the near side to the far side is not an adiabatic process, as far as the near or far side thermodynamic system is concerned.

Also it is possible to imagine a billiard table with N billiard balls constantly moving while the number of billiard balls on each half of the table remains constant. For example, each billiard ball could bounce perpendicular to the long edges of the table, therefore never entering the other half. Another situation would be one where each billiard ball moves parallel to the long edges of the table and collides with an opposite ball at the exact midpoint of the table. Another situation would be where each ball moves parallel to the long edges of the table, but none ever collide with each other and each ball is "paired" with another which, on a different parallel, is always equidistant from the midline of the table. There are myriad other ways to make it work without invoking magic, besides the perfect initial state and suspension of friction, etc.
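(The first of these configurations is straightforward to check numerically; a sketch follows, with made-up illustrative positions and velocities on a unit-square table.)

```python
# Each ball bounces only perpendicular to the long rails: its x position is
# fixed, so the number of balls in each half of the table can never change.
xs = [0.1, 0.3, 0.7, 0.9]               # fixed x positions, two per half
ys = [0.2, 0.5, 0.4, 0.8]               # y positions, bouncing between 0 and 1
vys = [0.013, -0.027, 0.031, 0.009]     # perpendicular velocities

def left_count():
    return sum(1 for x in xs if x < 0.5)

start = left_count()
for _ in range(10_000):
    for i in range(len(ys)):
        ys[i] += vys[i]
        if ys[i] < 0.0 or ys[i] > 1.0:  # elastic bounce off a rail
            vys[i] = -vys[i]
            ys[i] = min(1.0, max(0.0, ys[i]))

print(left_count() == start)  # prints: True
```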

~Max

Last edited by Max S.; 04-30-2019 at 12:00 AM. Reason: remove double quote
#29
04-30-2019, 12:48 AM
 Guest Join Date: May 2016 Posts: 3,048
I'm not sure what you're saying, but if you start with a bunch of billiard balls or ideal gas molecules on a table/in a box, they fly across the middle of the table/box all the time and don't start bouncing perpendicular to the sides. And if the system is in equilibrium then the temperature is everywhere the same and not "forever fluctuating" from one (arbitrary, imaginary?) half of the box to another. Moreover, even if you start with the gas confined to one half of the box and let it freely expand, equilibrium will be achieved pretty quickly (not "never"), the temperature won't even change, but the entropy will increase, completely in accordance with the second law of thermodynamics.
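(For the free-expansion case described here, the standard ideal-gas result is ΔS = nR ln(V₂/V₁); a quick check for one mole doubling its volume:)

```python
import math

R = 8.314        # J/(mol K), molar gas constant
n = 1.0          # moles of ideal gas
V_ratio = 2.0    # the gas expands freely into double the volume

# Isothermal free expansion: internal energy and temperature are unchanged,
# but the entropy rises by n * R * ln(V_final / V_initial) > 0.
dS = n * R * math.log(V_ratio)
print(f"entropy increase: {dS:.2f} J/K")  # prints: entropy increase: 5.76 J/K
```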
#30
04-30-2019, 01:05 AM
 Guest Join Date: May 2016 Posts: 3,048
Also, to be clear, in these "macroscopic" considerations we are dealing with lots of very small particles, on large timescales.
#31
04-30-2019, 01:13 AM
 Guest Join Date: Mar 2001 Location: Cape Town, South Africa & Posts: 25,166
Quote:
 Originally Posted by DPRK Norwegian something?
Blue. Lovely plumage.
#32
04-30-2019, 01:15 AM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 591
Quote:
 Originally Posted by DPRK I'm not sure what you're saying, but if you start with a bunch of billiard balls or ideal gas molecules on a table/in a box, they fly across the middle of the table/box all the time and don't start bouncing perpendicular to the sides. And if the system is in equilibrium then the temperature is everywhere the same and not "forever fluctuating" from one (arbitrary, imaginary?) half of the box to another. Moreover, even if you start with the gas confined to one half of the box and let it freely expand, equilibrium will be achieved pretty quickly (not "never"), the temperature won't even change, but the entropy will increase, completely in accordance with the second law of thermodynamics.
The point was, such a system can exist with billiard balls bouncing perpendicular to the walls. Ideal gas molecules assume random behavior and distributions by definition. Billiard balls do not, but the simulation you linked to starts off with pseudorandom positions and momenta anyway.

Code (Java, from the applet's source; line numbers stripped):

```
...
Molecule m = new Molecule();
...
m.x = getrand(winSize.width*10)*.1;
m.y = getrand(areaHeight*10)*.1+upperBound;
m.dy = java.lang.Math.sqrt(1-m.dx*m.dx);
if (getrand(10) > 4)
    m.dy = -m.dy;
...
```
Also I thought a system could be in equilibrium (with its surroundings) without all possible internal subsystems being in equilibrium.

~Max
#33
04-30-2019, 01:31 AM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 591
Quote:
 Originally Posted by DPRK I'm not sure what you're saying, but if you start with a bunch of billiard balls or ideal gas molecules on a table/in a box, they fly across the middle of the table/box all the time and don't start bouncing perpendicular to the sides. And if the system is in equilibrium then the temperature is everywhere the same and not "forever fluctuating" from one (arbitrary, imaginary?) half of the box to another. Moreover, even if you start with the gas confined to one half of the box and let it freely expand, equilibrium will be achieved pretty quickly (not "never"), the temperature won't even change, but the entropy will increase, completely in accordance with the second law of thermodynamics.
Rereading this and your previous post, I was using equilibrium in the sense of mutual thermodynamic equilibrium between at least two systems. Not internal equilibrium.

Although that last part about billiard balls bouncing perpendicular to the walls would count as the billiard table being in a state of internal and external thermodynamic equilibrium.

~Max
#34
04-30-2019, 01:40 AM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 591
Quote:
 Originally Posted by DPRK Also, to be clear, in these "macroscopic" considerations we are dealing with lots of very small particles, on large timescales.
It seems to me that no matter how many billiard balls you put on the table, if all collisions are perfectly elastic and at least one ball crosses the geometric midline of the table just one time, that thermodynamic system will never fully reach internal thermodynamic equilibrium between the two halves of the table. This is assuming perfect bouncy walls and no other forces such as friction, gravity, electromagnetic forces, etcetera.

~Max
#35
04-30-2019, 03:03 AM
 Guest Join Date: May 2016 Posts: 3,048
There is a reason they call it "statistical" mechanics. You can talk about a thermodynamic limit in which the number of particles grows effectively infinite. But in your example of billiard balls and the second law of thermodynamics this isn't even relevant. Consider even a single billiard ball (or 2, or 3, or whatever you prefer) bouncing around on half of, or an entire, billiard table. The (Gibbs) entropy of this system does not fluctuate. And when you remove the barrier across the middle of the table, it gets bigger. As for your observation that there might be some special or periodic trajectories with all the billiards bouncing in some concerted way, these will have measure zero and can be neglected; nor will such fine structure survive in realistic physical systems.

Anyway, in short, no, the second law of thermodynamics is not routinely violated, especially not by (ideal or real) gases expanding in containers.
#36
04-30-2019, 10:21 AM
 Guest Join Date: Dec 1999 Location: Denton, TX, USA Posts: 12,479
Quote:
 Originally Posted by Grrr! I thought if a law gets violated in physics, then it is no longer a law. Or in fact, never was one.
Quote:
 Originally Posted by Jasmine In science, a "law" has no exceptions. Occasionally, one hears a claim that some "law" or other is being violated but, invariably, the true explanation comes to light that negates that claim.
A law in physics is a description of behavior - under these circumstances, the system will behave in this manner. Someone can state a proposed law of behavior, with that law being incorrect, or incomplete. That doesn't make it not a law, it makes it not a valid law.

Quote:
 Originally Posted by DPRK Anyway, in short, no, the second law of thermodynamics is not routinely violated, especially not by (ideal or real) gases expanding in containers.
That is the position that Max S. is trying to defend.
#37
04-30-2019, 12:46 PM
 Guest Join Date: Jun 2007 Posts: 6,688
Quote:
 Originally Posted by Max S. This example is much easier for me to understand. I assume the walls of the table are perfect and reflect 100% energy and there is no friction from air or the tabletop. The billiard table itself, by virtue of being a perfect box, is an isolated system. Therefore the table expresses but one macrostate, as you defined it.
No. The number of macrostates the table can be in is the number of distinguishable configurations (balls in different areas). Assume you have two, very crude, detectors---say, scales weighing each half of the table. They can tell you how many balls are on each side, but not their precise configuration. Then, a tuple (n balls left, k balls right) is a macrostate.

Suppose now that there are in fact four places each ball can be in. A microstate then is a quadruple (n,k,l,m) denoting the number of balls in the top left, bottom left, top right, and bottom right areas of the table.

Assume there are two balls on the table, balls A and B. The macrostate (2,0) (both balls in the left half) could come about in two ways, corresponding to the microstates (A,B,0,0) or (B,A,0,0). The macrostate (1,1), on the other hand, has eight realizations: (A,0,B,0), (A,0,0,B), (0,A,B,0), (0,A,0,B), and four more with A and B switched. This is a higher-entropy state.

But, there's nothing---absolutely nothing---that keeps the state (A,0,0,B) (say) from evolving into (A,B,0,0). A just lies there, and B transitions from bottom right to bottom left. That's all.

Quote:
 Then you divide the table into two halves and call each half its own thermodynamic system.
Again, no: I'm considering the table to be a single system.

Quote:
 Also it is possible to imagine a billiard table with N billiard balls constantly moving while the number of billiard balls on each half of the table remains constant.
Which is why I added this:
Quote:
 Originally Posted by Half Man Half Wit If you haven't taken care to set up some simple periodic path
(In fact, this is just the thing about the equiprobability of microstates again.)

And I'd really like your opinion on these two quotations from Ludwig Boltzmann:

Quote:
 Originally Posted by Boltzmann One may speculate that the universe as a whole is in thermal equilibrium and therefore dead, but there will be local deviations from equilibrium which may last for the relatively short time of a few eons.
Quote:
 Originally Posted by Boltzmann Hence, if one takes a smaller system of bodies in the state in which he actually finds them, and suddenly isolates this system from the rest of the world, then the system will initially be in an improbable state, and as long as the system remains isolated it will always proceed toward more probable states. On the other hand, there is a very small probability that the enclosed system is initially in thermal equilibrium, and that while it remains enclosed it moves far enough away from equilibrium that its entropy decrease is noticeable.
(Both are from Boltzmann's paper 'On Zermelo's Paper "On the Mechanical Explanation of Irreversible Processes"'.)

Do you think he's wrong here?

Last edited by Half Man Half Wit; 04-30-2019 at 12:47 PM.
#38
04-30-2019, 01:45 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 591
Quote:
Originally Posted by Half Man Half Wit
And I'd really like your opinion on these two quotations from Ludwig Boltzmann:

Quote:
 Originally Posted by Ludwig Boltzmann One may speculate that the universe as a whole is in thermal equilibrium and therefore dead, but there will be local deviations from equilibrium which may last for the relatively short time of a few eons.
Quote:
 Originally Posted by Ludwig Boltzmann Hence, if one takes a smaller system of bodies in the state in which he actually finds them, and suddenly isolates this system from the rest of the world, then the system will initially be in an improbable state, and as long as the system remains isolated it will always proceed toward more probable states. On the other hand, there is a very small probability that the enclosed system is initially in thermal equilibrium, and that while it remains enclosed it moves far enough away from equilibrium that its entropy decrease is noticeable.
(Both are from Boltzmann's paper 'On Zermelo's Paper "On the Mechanical Explanation of Irreversible Processes"'.)

Do you think he's wrong here?
I admit, in my profound ignorance, I had never heard about Boltzmann's physics before yesterday. I've heard of the Boltzmann brain, though. I spent part of last night trying to read his [I]Vorlesungen Ueber Die Principe Der Mechanik[/I][1]. It's been slow going since I don't read German.

From what I've read of your quotes, Boltzmann is also assuming a random distribution of possible internal states, and redefining both thermodynamic equilibrium and the second law of thermodynamics to work on such a basis.

[1]. Boltzmann, L. (1904). Vorlesungen Ueber Die Principe Der Mechanik (II Thiell). Leipzig: Johannes Ambrosius Barth. https://archive.org/details/bub_gb_fJD9u5tl4NYC/page/n3

Last edited by Max S.; 04-30-2019 at 01:50 PM. Reason: removed Deutsch keyboard mappings
#39
04-30-2019, 02:05 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 591

## Re: Boltzmann

Quote:
 Originally Posted by Ludwig Boltzmann One may speculate that the universe as a whole is in thermal equilibrium and therefore dead, but there will be local deviations from equilibrium which may last for the relatively short time of a few eons.
Specifically, I don't think he is right to say a system possessing absolute internal thermodynamic equilibrium can fluctuate. He would need to redefine thermodynamic equilibrium to mean "almost thermodynamic equilibrium" or "to the best of our measurements, thermodynamic equilibrium, but not actually so".

Quote:
 Originally Posted by Ludwig Boltzmann Hence, if one takes a smaller system of bodies in the state in which he actually finds them, and suddenly isolates this system from the rest of the world, then the system will initially be in an improbable state, and as long as the system remains isolated it will always proceed toward more probable states. On the other hand, there is a very small probability that the enclosed system is initially in thermal equilibrium, and that while it remains enclosed it moves far enough away from equilibrium that its entropy decrease is noticeable.
Neither am I convinced that the entropy of an isolated system will decrease over time. I cannot derive this without assuming, as a premise, that microscopic reality is determined by stochastic rather than corporeal processes.

~Max

Last edited by Max S.; 04-30-2019 at 02:06 PM. Reason: formatting
#40
04-30-2019, 02:25 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 591
Quote:
 Originally Posted by Half Man Half Wit No. The number of macrostates the table can be in is the number of distinguishable configurations (balls in different areas). Assume you have two, very crude, detectors---say, scales weighing each half of the table. They can tell you how many balls are on each side, but not their precise configuration. ... Again, no: I'm considering the table to be a single system.
I am of the opinion that, by measuring subdivisions within a system, you have divided that system into two sub-systems. The laws of thermodynamics hold at whatever level of detail you want, so long as it is consistent. If you observe heat flow within a system but none across its boundary, it is affirming the consequent to say the second law is violated. The second law of thermodynamics only concerns net heat flow to and from a system, as if a Carnot engine existed along the border. It does not apply to the internal state of a system until you define the internal state as two or more sub-systems.

~Max

Last edited by Max S.; 04-30-2019 at 02:26 PM.
#41
04-30-2019, 02:49 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 591
Quote:
 Originally Posted by Half Man Half Wit Entropy is fundamentally a measure for how many microstates (unobservable individual particle arrangements, say) lead to the same macrostate (the gross, macroscopic properties of a system)
Quote:
 Originally Posted by Half Man Half Wit No. The number of macrostates the table can be in is the number of distinguishable configurations (balls in different areas). Assume you have two, very crude, detectors---say, scales weighing each half of the table. They can tell you how many balls are on each side, but not their precise configuration. Then, a tuple (n balls left, k balls right) is a macrostate. Suppose now that there are in fact four places each ball can be in. A microstate then is a quadruple (n,k,l,m) denoting the number of balls in the top left, bottom left, top right, and bottom right areas of the table.
Either you have redefined macro- and micro-states, or I misunderstand you. Going by the definition in post #6 (top quote), I thought the macrostate of the billiard table would be, for example, the number of billiard balls on the table. So we could have a state of 5 billiard balls, or a state of 3 billiard balls, etcetera. Microstate would be the arrangement of individual particles (billiard balls), for example, giving each ball x and y coordinates and momentum. Entropy is then the number of microstates.

If we are to measure the number of billiard balls on the left side of the table, that number is neither a property of the table nor a property of an individual billiard ball. Thus, I conclude you have designated the left and right sides of the table as their own systems, and the two of them combined comprise the billiard table as a whole. Here is the hierarchy of systems and states, with their properties in brackets:
• Top level. Billiard table { 3 balls, equilibrium with surroundings }
• Mid level. Left side of the table { 2 balls, not in equilibrium with surroundings }, Right side of the table { 1 ball, not in equilibrium with surroundings }
• Fundamental level. Billiard ball A { x position < 0.5, y position ?, momentum ? }, Billiard ball B { x position < 0.5, y position ?, momentum ? }, Billiard ball C { x position > 0.5, y position ?, momentum ? }

~Max
#42
04-30-2019, 03:58 PM
 Guest Join Date: May 2016 Posts: 3,048

@Max could you please rephrase what is your question about the billiard balls? Entropy is not "the number of microstates", if that's the issue. (It will be proportional to the integral over the phase space of -p(a) log p(a), where p(a) is the probability of state a.)
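(In its discrete form that integral becomes S = -Σ p(a) ln p(a); a short sketch, with Boltzmann's constant set to 1 and the function name my own, shows why a uniform spread over more states means more entropy:)

```python
import math

def gibbs_entropy(probs):
    """Discrete Gibbs/Shannon entropy, -sum p ln p, with k_B = 1."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# A uniform distribution over 4 microstates: S = ln 4, the maximum possible.
print(round(gibbs_entropy([0.25] * 4), 3))                # prints: 1.386
# A sharply peaked distribution over the same 4 states carries far less;
# "log of the number of microstates" only matches when all are equiprobable.
print(round(gibbs_entropy([0.97, 0.01, 0.01, 0.01]), 3))  # prints: 0.168
```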
#43
04-30-2019, 05:32 PM
 Guest Join Date: Aug 2017 Location: Florida, USA Posts: 591
Quote:
 Originally Posted by DPRK @Max could you please rephrase what is your question about the billiard balls? Entropy is not "the number of microstates" if that's the issue. (It will be proportional to the integral over the phase space of -p(a) log p(a), where p(a) is the probability of state a).
Half Man Half Wit's post #17 and the threaded conversation preceding it imply that the second law of thermodynamics is contradicted by a theoretical billiard table. My responses are to the effect of, "no, it doesn't".

~Max
#44
04-30-2019, 06:03 PM
 Guest Join Date: May 2016 Posts: 3,048
Then somebody is misunderstanding something, because the fluctuations aka random caroming of billiard balls on a table he or she is talking about do not contradict the second law of thermodynamics.

Last edited by DPRK; 04-30-2019 at 06:05 PM.
#45
04-30-2019, 07:53 PM
 Member Join Date: Mar 2000 Location: Between pole and tropic Posts: 7,997
Quote:
 Originally Posted by DPRK Maxwell's demon? I saw one on the subway last week, I swear! Or was that a Tasmanian devil? Norwegian something?
But doesn't Maxwell's Demon turn information into energy, coming to the relief of entropy?
#46
04-30-2019, 11:18 PM
 Guest Join Date: May 2016 Posts: 3,048
Quote:
 Originally Posted by KarlGauss But doesn't Maxwell's Demon turn information into energy, coming to the relief of entropy?
Basically, yes. The energy stays the same or even increases, but the information (entropy) goes down. You can think of a genuine, bona fide Maxwell's Demon as a combination air conditioner and perpetuum mobile.
#47
05-01-2019, 02:33 AM
 Guest Join Date: Jun 2007 Posts: 6,688
OK, so you actually think Boltzmann misunderstood his own theory in a naive way you're able to spot immediately, but that no scientist in the century since has managed to. That's confidence, I like it!

So, what do you think about Feynman's example from his Lectures on Physics, typically considered to be one of the best resources for teaching physics available?
Quote:
 Originally Posted by Richard Feynman Look again at our box of mixed white and black molecules. Now it is possible, if we wait long enough, by sheer, grossly improbable, but possible, accident, that the distribution of molecules gets to be mostly white on one side and mostly black on the other. After that, as time goes on and accidents continue, they get more mixed up again.
Or, what do you make of the following quote of Maxwell (from his Theory of Heat)?
Quote:
 Originally Posted by James Clerk Maxwell the second law is drawn from our experience of bodies consisting of an immense number of molecules. [...] it is continually being violated, [...], in any sufficiently small group of molecules [...] . As the number [...] is increased [...] the probability of a measurable variation [...] may be regarded as practically an impossibility.
And what do you make of Eddington's words in his entertainingly titled "The End of the World: from the Standpoint of Mathematical Physics"?
Quote:
 Originally Posted by Arthur Eddington So, after the world has reached thermodynamical equilibrium the entropy remains steady at its maximum value, except that 'once in a blue moon' the absurdly small chance comes off and the entropy drops appreciably below its maximum value.
I don't intend to bludgeon you over the head with these quotes from the big shots (well, maybe a little, but gently). But I do want to know if this doesn't at least hint to you that maybe, you've got it wrong somehow---that it's not everybody else (and I could dig up innumerably more similar quotes) who misunderstands, but rather, you? Or, failing that, do you at least recognize that I'm not saying something wildly out there, but merely, what's been the consensus ever since the statistical mechanics foundation of thermodynamics was proposed?

Quote:
 Originally Posted by Max S. I am of the opinion that, by measuring subdivisions within a system you have divided that system into two sub-systems.
Well, you haven't. All you've done is introduce what's often called a coarse-grained set of observables, analogous to thermodynamic quantities such as temperature, volume, and pressure---quantities that take account of the state of the system in aggregated form, without being sensitive to microscopic details.

Quote:
 The second law of thermodynamics only concerns net heat flow to and from a system, as if a Carnot engine existed along the border. It does not apply to the internal state of a system until you define the internal state as two or more sub-systems. ~Max
That's simply wrong. As the first sentence of the relevant wiki entry puts it:
Quote:
 Originally Posted by wikipedia The second law of thermodynamics states that the total entropy of an isolated system can never decrease over time.
Nothing about net heat flow, and it applies to the internal state without dividing anything up. (Of course, you can state the second law in terms of heat flow, but that's not the only formulation.)

Quote:
 Originally Posted by Max S. Either you have redefined macro- and micro-states, or I misunderstand you. Going by the definition in post #6 (top quote), I thought the macrostate of the billiard table would be, for example, the number of billiard balls on the table. So we could have a state of 5 billiard balls, or a state of 3 billiard balls, etcetera. Microstate would be the arrangement of individual particles (billiard balls), for example, giving each ball x and y coordinates and momentum. Entropy is then the number of microstates.
Sorry if I wasn't clear regarding my definitions. A macrostate takes account of the system in terms of coarse-grained observables, like temperature, pressure, and volume, without being sensitive to the microscopic details.

The coarse-grained observables I have introduced in the billiard ball example are 'number of billiard balls in the left half' and 'number of billiard balls in the right half'. The microstate is given by the arrangement of billiard balls in the subdivided areas.

Quote:
 If we are to measure the number of billiard balls on the left side of the table, that number is neither a property of the table nor a property of an individual billiard ball.
It's a property of the system of billiard balls on the table. Temperature, likewise, isn't a property of the room a gas is in, nor of any individual molecule.

Quote:
 Thus, I conclude you have designated the left and right sides of the table as their own systems, and the two of them combined comprise of the billiard table as a whole.
No. I'm just describing the table at different levels of abstraction. The total number of billiard balls gives me relatively little information about the state of the system; the number of billiard balls per half, a little more; and the configuration of billiard balls in the subdivided areas, yet more. You can view the entropy of a given macrostate as the information we lack about the microstate, if that helps.

Quote:
 Originally Posted by DPRK Then somebody is misunderstanding something, because the fluctuations aka random caroming of billiard balls on a table he or she is talking about do not contradict the second law of thermodynamics.
That's not what I'm saying. Rather, if we view the table in terms of the coarse-grained observables I've introduced ('balls per left half', 'balls per right half'), this macrostate has a certain entropy depending on the number of microstates that can realize it (the phase-space volume it occupies).

By using a model of the billiard table that restricts the available microstates to at most one ball per quarter of the table, we see that the state ('one ball per half') has a higher entropy than the state ('both balls in one half'). The transition from the former to the latter, which is perfectly possible, then decreases the entropy of the system.
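To make the counting concrete, here's a quick Python sketch of that restricted model (my own illustration, not anything from the thread): two balls on a table divided into four quarters, at most one ball per quarter, with quarters 0 and 1 forming the left half.

```python
from itertools import combinations
from math import log
from collections import Counter

# Microstate: which two of the four quarters hold a ball
# (at most one ball per quarter).
microstates = list(combinations(range(4), 2))  # 6 microstates in total

def macrostate(ms):
    """Coarse-grained observables: (balls in left half, balls in right half)."""
    left = sum(1 for q in ms if q < 2)
    return (left, 2 - left)

# Multiplicity W of each macrostate, and Boltzmann entropy S = ln W
multiplicity = Counter(macrostate(ms) for ms in microstates)
for macro, W in sorted(multiplicity.items()):
    print(macro, W, round(log(W), 3))
# (0, 2) and (2, 0) each have W = 1 (S = 0);
# (1, 1) has W = 4 (S = ln 4 ≈ 1.386)
```

So the macrostate 'one ball per half' occupies four of the six microstates, and a transition to 'both balls in one half' drops the entropy from ln 4 to 0.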

Last edited by Half Man Half Wit; 05-01-2019 at 02:35 AM.
#48
05-01-2019, 04:50 AM
 Guest Join Date: Dec 2009 Location: The Land of Smiles Posts: 19,139
I'm surprised none of the experts gave an answer related to the one I gave in a recent thread. Is the following in error? @ OP — would this have addressed your question?
Unlike the Laws of Newton, Maxwell and Einstein, the Second Law isn't a dynamical law; it's just a statistical fact, closely akin to the Law of Large Numbers.

A ten-gallon tank of gas at standard temperature and pressure contains about a septillion molecules — easy to remember and large enough to qualify for the Large Number Law — but deviations from an "entropy maximization" can still occur with non-zero probability just as a matter of simple statistics. (For example, consider 52 — less than a septillion but still largish compared with some integers. If you shuffle and deal four poker hands from a deck of 52, after a septillion trials you can expect a dozen or so occasions where there was a four-way tie for best hand, all royal flushes!)
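For the curious, that back-of-the-envelope claim checks out. Here's a short Python sketch of my own, counting ordered deals of four distinguishable 5-card hands:

```python
from math import comb

# Total ways to deal four distinguishable 5-card hands from 52 cards
total = comb(52, 5) * comb(47, 5) * comb(42, 5) * comb(37, 5)

# A four-way royal-flush tie: there is exactly one royal flush per suit,
# and the four of them can be assigned to the four hands in 4! ways.
favourable = 24

p = favourable / total
expected = p * 10**24  # expected occurrences in a septillion deals
print(p, expected)     # roughly 1.6e-23, i.e. about 16 occurrences
```

About sixteen four-way royal-flush ties per septillion deals, consistent with "a dozen or so".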
#49
05-01-2019, 05:45 AM
 Guest Join Date: May 2016 Posts: 3,048
Are you querying me or Max S.? Of course there are thermal fluctuations, therefore the Boltzmann entropy of the system is also fluctuating, and from one particular moment to the next may decrease.

My only claim is that this does not contradict the second law of thermodynamics, unless of course one formulates it in a way that does not apply to mesoscopic or microscopic systems and then tries to apply it to such a system.

The ball system is a good concrete example of a model where one can calculate everything, which is why I asked Max S. if there were any outstanding questions about that model.
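Those fluctuations are easy to see in a toy model (my own sketch, which samples configurations at random rather than evolving any actual dynamics): N particles each land in the left or right half of a box with equal probability, and the Boltzmann entropy of the macrostate 'n particles in the left half' is S = ln C(N, n).

```python
import random
from math import comb, log

random.seed(1)
N = 20  # a small "mesoscopic" system

# Sample many configurations; record the entropy of each macrostate.
entropies = []
for _ in range(1000):
    n_left = sum(random.random() < 0.5 for _ in range(N))
    entropies.append(log(comb(N, n_left)))

# The equilibrium (maximum-entropy) macrostate is the even split.
S_max = log(comb(N, N // 2))
dips = sum(1 for s in entropies if s < S_max)
print(f"max entropy {S_max:.2f}; {dips} of 1000 samples below it")
```

Most sampled configurations sit below the maximum-entropy macrostate, and for small N the dips are substantial; for a septillion molecules the same fluctuations become utterly negligible in relative terms, which is the sense in which the second law holds.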

Last edited by DPRK; 05-01-2019 at 05:48 AM.
#50
05-01-2019, 09:26 AM
 Guest Join Date: Jun 2007 Posts: 6,688
Quote:
 Originally Posted by DPRK Are you querying me or Max S. ? Of course there are thermal fluctuations, therefore the Boltzmann entropy of the system is also fluctuating, and from one particular moment to the next may decrease. My only claim is that this does not contradict the second law of thermodynamics, unless of course one formulates it in a way that does not apply to mesoscopic or microscopic systems and then tries to apply it to such a system. The ball system is a good concrete example of a model where one can calculate everything, which is why I asked Max S. if there were any outstanding questions about that model.
What formulation of the second law are you referring to? Because I think it's entirely reasonable, as e.g. Maxwell also does in the quote above, to speak of a violation of the second law when entropy fluctuates downward.
