

#1




Is the second law of thermodynamics routinely violated?
In the thread on Dualism, it was claimed that we have observed violations of the second law of thermodynamics; it was even suggested that large-scale violations occur with relative frequency. My response is that we have not observed violations, and that a single violation of the second law would result in a paradigm shift for many fields of science.
There are multiple formulations of the second law of thermodynamics. These formulations express the same law I was taught early in secondary school[1][2]: Quote:
[2] Thomson, W. (1853). On the Dynamical Theory of Heat, with numerical results deduced from Mr Joule's equivalent of a Thermal Unit, and M. Regnault's Observations on Steam. Transactions of the Royal Society of Edinburgh, 20(2), 265. Retrieved from https://digital.nls.uk/scientists/archive/74629508
[3] Carathéodory, C. (1909, September). Untersuchungen über die Grundlagen der Thermodynamik. Mathematische Annalen, 67(3), 363. https://doi.org/10.1007/BF01450409
[4] Caratheodory, C. (n.d.). Examination of the foundations of thermodynamics [PDF file]. (D. H. Delphenich, Trans.) Retrieved from http://neoclassicalphysics.info/up...modynamics.pdf (Original work published 1909).
[5] Second law of thermodynamics. (n.d.). In Wikipedia. Retrieved April 27, 2019, from https://en.wikipedia.org/wiki/Second...thermodynamics
[6] Planck, M. (1926). Über die Begründung des zweiten Hauptsatzes der Thermodynamik. Sitzungsberichte der Preussischen Akademie der Wissenschaften, Physikalisch-Mathematische Klasse, 1926, 453-463.
#2




Quote:
If you are describing a system consisting of a 5000K star orbiting an 8000K star, and you allow radiation to "leave" the system, then the second law of thermodynamics as formulated by Carathéodory (and Planck) does not apply. It is not violated; it does not apply to the system you have described. If however, you include the surrounding space in the system, extending to infinity and encompassing all radiation emitted, then the second law of thermodynamics holds true. ~Max 
#3




I thought if a law gets violated in physics, then it is no longer a law. Or in fact, never was one.

#4




Quote:
You should never take your scientific knowledge from blithe pronouncements in a thread on philosophy. Philosophers, apparently, can't be bothered to add in all the caveats and conditions that go along with the "violation" of a scientific principle. 


#5




Not heat: energy. Heat is merely one of many forms of energy.
__________________
Evidence gathered through the use of science is easily dismissed through the use of idiocy. -- Czarcasm. 
#6




Quote:
Entropy is fundamentally a measure of how many microstates (unobservable individual particle arrangements, say) lead to the same macrostate (the gross, macroscopic properties of a system). So, there are many more microstates corresponding to all of the gas in a room being evenly distributed, than there are corresponding to all the gas bunching up in the middle; hence, the former is a high-entropy state, while the latter is a low-entropy state. If you start out with a low-entropy configuration, any given change is likely to lead to a higher-entropy configuration, just by virtue of how many more ways there are to increase entropy than to decrease it. But that doesn't mean that spontaneous decreases of entropy are impossible; it merely means that they're unlikely. Take a toy system of three 'particles', 1, 2, and 3, carrying one unit of energy each, each of which can be in one of three boxes A, B, and C. This gives us the following ten distinguishable macrostates:
We can compute the probability of each macrostate by counting the number of microstates that realize it. The first macrostate can be realized by every permutation of particles distributed over the boxes; thus, there are 3! = 6 possible ways to realize it. The last three macrostates can be realized in exactly one way. The six remaining cases can each be realized in three different ways (e. g. for (A2B1C0), particles 1 and 2, 1 and 3, or 2 and 3 could be in box A, the remaining one in box B). In total, we thus have 6 + 6 * 3 + 3 * 1 = 27 possible microstates (which of course we knew, since there are 3 * 3 * 3 possibilities to distribute 3 objects among 3 boxes). Now, we can calculate the probabilities for each macrostate, as well as their entropy (as the natural logarithm of the number of ways in which it can be realized, i. e. the microstates):
Assuming a simple dynamics of particles just hopping to a random box at each time step, we see that, while it's more likely for entropy to increase with each step, it's by no means impossible for it to decrease, as well. Starting out in one of the lowest-entropy states, the probability that the next time step will lead to a higher-entropy state is 24 / 27 ~ 89%, with an 11% chance of staying the same. An intermediate-entropy state has an 11% chance of decreasing entropy (and thus violating the second law, as a colder box will transfer energy to an already hotter one, say in the transition (A0B1C2) -> (A0B0C3)), a 22% chance of increasing, and a 67% chance of remaining equal. As you increase the size of the system, these probabilities will favor higher-entropy states ever more strongly. Still, if you wait long enough, you will observe violations of the second law even there; it's just that the wait time will quickly exceed even cosmological timescales. 
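The counting in this toy model is small enough to verify by brute force. Here is a short Python sketch (the code and variable names are mine, not from the thread) that enumerates all 27 microstates and reproduces the percentages quoted above:

```python
from collections import Counter
from itertools import product
from math import isclose, log

# Three particles, each sitting in one of three boxes (0, 1, 2): 27 microstates.
microstates = list(product(range(3), repeat=3))

def macrostate(ms):
    """Occupation numbers (n_A, n_B, n_C) of the three boxes."""
    return tuple(ms.count(box) for box in range(3))

# Multiplicity W of each macrostate, and its entropy S = ln W.
multiplicity = Counter(macrostate(ms) for ms in microstates)
entropy = {m: log(w) for m, w in multiplicity.items()}
assert len(microstates) == 27 and len(multiplicity) == 10

# 'Random hopping' dynamics: each particle jumps to a uniformly random box,
# so the next microstate is uniform over all 27 possibilities.
# From a lowest-entropy state (all particles in one box, S = ln 1 = 0):
p_up_from_bottom = sum(w for m, w in multiplicity.items() if entropy[m] > 0) / 27
assert isclose(p_up_from_bottom, 24 / 27)  # ~89% chance entropy increases

# From an intermediate-entropy state (S = ln 3), e.g. (A0B1C2):
s_mid = log(3)
p_down = sum(w for m, w in multiplicity.items() if entropy[m] < s_mid) / 27
p_same = sum(w for m, w in multiplicity.items() if isclose(entropy[m], s_mid)) / 27
p_up = sum(w for m, w in multiplicity.items() if entropy[m] > s_mid) / 27
assert isclose(p_down, 3 / 27)   # ~11%: the second-law-'violating' step
assert isclose(p_same, 18 / 27)  # ~67%
assert isclose(p_up, 6 / 27)     # ~22%
```

The assertions pass, confirming the 89/11 and 11/22/67 splits given in the post.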
#7




Quote:
~Max 
#8




The idea (postulate?) is that an isolated system is in thermal equilibrium if and only if every microstate is equally probable.

#9




Quote:
Really, none of this is controversial in the slightest. Every physics student learns it in a course on statistical mechanics. People have even used this to try to argue that the universe itself might be due to a spontaneous fluctuation from a high-entropy state into a low-entropy state: Quote:
That idea doesn't work, but the general reasoning is sound. Last edited by Half Man Half Wit; 04-28-2019 at 12:48 PM. 


#10




Quote:
Even the kinetic theory, to my knowledge, only allows for random movements so long as they express the same macrostates. But I might be wrong on that. ~Max 
#11




Quote:
~Max 
#12




By 'any other dynamics would work' I meant that the dynamics I gave, the 'random hopping' you (misguidedly) took issue with, isn't essential to demonstrating my point. All that's needed is the equiprobability of microstates. If that's given, any dynamics will lead to violations of the second law, whether you want to accept that or not.

#13




Quote:
All that's happening is that there are a lot fewer ways to change to a lower entropy macrostate than to one with higher entropy. Hence, the latter sort of thing happens more often. 
#14




Quote:
I apologize in advance for my ignorance of modern physics, which I suspect factors into your answer. All I've got to go by is grade school physical science, the few books I have read on the subject, and the internet. ~Max 


#15




Quote:
~Max 
#16




Quote:
If you are suggesting that certain transitions are allowed but others are not, then it is important to note that that is not the case: these transitions are reversible microscopic fluctuations, and each transition is as likely to occur as its converse. Again, nothing is defied: in the macroscopic limit these fluctuations will be too small to detect. 
#17




Quote:
Quote:
Furthermore, it would introduce a sort of 'downward causation' incompatible with the fundamentally reductionistic nature of physics, i. e. with the idea that fixing the microscopic dynamics suffices to fix everything, because suddenly, the motion of molecules no longer depends only on their interactions with their surroundings, but also on whether they are considered to be part of a larger system in some macrostate. Whatever mathematics one uses to describe the motion of the molecules then would not only depend on their positions and momenta, but also on the macrostate they're part of, which basically throws the whole edifice of classical physics as based on Hamiltonian mechanics overboard. Finally, the assumption of equiprobability is in fact a key ingredient in deriving macroscopic thermodynamics from microscopic statistical physics, as carried out chiefly by Boltzmann in the 1870s (so not that terribly modern). The implication of the possibility of violations of the second law was realized pretty much immediately, leading Boltzmann to propose his idea that the universe could be a spontaneous low-entropy fluctuation (however, the possibility of Boltzmann brains seems to doom this scenario). 
#18




Quote:
~Max 
#19




I was just rereading Cycles of Time by Roger Penrose and find his discussion on pages 39-43 pertinent to this thread — he even shows apparent counterexamples to the Second Law.
Linking to Google Books is a big pain, but the excerpt shows up (strangely, and without attribution?) here, though the discussion at the end of that page is the most relevant part. Quote:



#20




Quote:
~Max 
#21




In science, a "law" has no exceptions. Occasionally, one hears a claim that some "law" or other is being violated but, invariably, the true explanation comes to light that negates that claim.

#22




Quote:
~Max 
#23




Quote:
~Max Last edited by Max S.; 04292019 at 12:23 PM. 
#24




Note that TWO very unrelated sorts of Second Law violations (or apparent violations) are being discussed here.
Further adding confusion is the use of two different definitions of entropy. One formula is dS = δq/T, which relates entropy to heat and temperature. Another formula, used by Penrose and other theoretical physicists, is S = k log V, where V is the volume of a macrostate in phase space. The first formula is a special case of the second. Quote:
Quote:
Disclaimer: I am unqualified. If there are no major errors in the preceding please give me six brownie points! Last edited by septimus; 04-29-2019 at 01:25 PM. 
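septimus's claim that dS = δq/T is a special case of S = k log V can be illustrated with a standard textbook computation (my sketch, not from the post). For an ideal gas of N molecules, the accessible phase-space volume grows with the container volume, and the two definitions give the same entropy change for an isothermal expansion from V_1 to V_2:

```latex
% Statistical definition (phase-space volume scales as V^N for an ideal gas):
S = k \ln W, \qquad W \propto V^N
\;\Rightarrow\; \Delta S = N k \ln\frac{V_2}{V_1}.
% Clausius definition, along a reversible isothermal path
% (dU = 0, so \delta q = p\,dV, and pV = NkT):
\Delta S = \int_1^2 \frac{\delta q}{T}
         = \int_{V_1}^{V_2} \frac{p\,dV}{T}
         = \int_{V_1}^{V_2} \frac{N k\,dV}{V}
         = N k \ln\frac{V_2}{V_1}.
```

Here V is the container volume, not septimus's phase-space volume; the momentum part of phase space contributes terms independent of V at fixed temperature, which cancel in the difference.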


#25




Quote:
And again, I would like to point out that basically everybody agrees on this. Boltzmann himself, who pretty much came up with this picture, considered the possibility of random fluctuations to lower-entropy states. Why are you so invested in that not occurring? Picture again the gas, trillions and trillions of molecules zipping around. What should prevent their velocities, after who knows how many collisions, from aligning such that they point in more or less the same direction? All that happens is little billiard balls caroming off one another. Unlikely? Of course. But calling it impossible would entail something entirely mysterious occurring, some invisible guiding hand turning away excess molecules. 
#26




Quote:
If, however, you insist on macroscopic properties restricting microscopic motion, then there needs to be some determining factor of the motion of a particle that goes beyond just its past interactions. Picture a billiard table with a single ball (and we're doing physics here, so the billiard ball is a point mass with no friction). Set it in motion. If you haven't taken care to set up some simple periodic path, it'll eventually visit every point on the table, with none being particularly distinguished. Now, add a second ball. The two balls can interact by careening off one another. Divide the table into two halves. Suppose you can only distinguish which half of the table a ball is in. There are more ways for both balls to be in separate halves than there are for both to be in the same half. Just mentally chop the table into ball-sized areas: if there are four such areas (two in each half), there are two ways for both balls to be in a given half, and eight for them to be in separate halves. Increasing the number of areas increases the disparity. What you're saying, now, amounts to saying that, if a ball already occupies one half, the other can't enter that half anymore. That is, whenever it threatens to cross the boundary, it must somehow get turned away, without interacting with the other ball or the walls. Now, just increase the number of balls. Just set a mighty lot of them adrift on the table. They'll bump into each other, and simply by virtue of there being many more ways to be roughly evenly distributed, they'll spend most of their time that way. But there's nothing about the way they're bouncing off of each other that forbids them from, occasionally, bunching up in one half of the table, and occasionally, that's what they will do. That's all there is to it. Anything else would require intervention by some mysterious agency we have no reason at all to believe should exist. Last edited by Half Man Half Wit; 04-29-2019 at 03:34 PM. 
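The "just increase the number of balls" point can be made quantitative with a short Python sketch (mine, not from the thread): treat each ball as equally likely to sit in either half and count configurations by macrostate.

```python
from math import comb

# Each of n distinguishable balls is equally likely to be in either half of
# the table; the macrostate is just the number of balls in the left half.
def prob_all_in_one_half(n):
    """Probability that every ball sits in the same (either) half."""
    return 2 / 2 ** n

def prob_even_split(n):
    """Probability of an exact 50/50 split (n even)."""
    return comb(n, n // 2) / 2 ** n

for n in (2, 10, 50):
    print(n, prob_all_in_one_half(n), prob_even_split(n))
# Bunching up stays possible at every n, but its probability dies off as
# 2**(1 - n), while roughly even splits dominate: unlikely, never forbidden.
```

For 50 balls the all-in-one-half probability is already below 2 in 10^15, while a dead-even split alone carries about 11% of the weight, illustrating why large systems look like they obey an exceptionless law.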
#27




Maxwell's demon? I saw one on the subway last week, I swear! Or was that a Tasmanian devil? Norwegian something?

#28




Quote:
I assume the walls of the table are perfect and reflect 100% energy and there is no friction from air or the tabletop. The billiard table itself, by virtue of being a perfect box, is an isolated system. Therefore the table expresses but one macrostate, as you defined it. There are always N balls in the billiard table, and the total kinetic energy of all the billiard balls remains a constant no matter how many times they collide. No energy flows into or out of the system; the table remains in equilibrium with its surroundings the entire time. That is consistent with the first and second laws of thermodynamics. Then you divide the table into two halves and call each half its own thermodynamic system. You observe the table and find the majority of the billiard balls to be on one half, and then the other, and then the first half again, etcetera. You could say each half of the table represents a thermodynamic system, although the two halves are clearly not in equilibrium. Let us freeze time to examine the billiard table. At this moment there are exactly N/3 billiard balls on the near half of the table and 2N/3 billiard balls on the far half. Now let us unfreeze time for just a moment, and we see that there are 0 billiard balls on the near half and N billiard balls on the far half. Has the second law of thermodynamics been violated? Under Clausius's form of the law, heat cannot pass from a colder body to a warmer body without some other change occurring. If we redefine heat as a function of the billiard balls in a system, heat has been transferred from the near half to the far half. But at the same time, there has been a change, namely the gross translational momentum of billiard balls in the far half (accounting for perfect bouncing against the three closed walls of the table). Because we have done away with friction and other such things, every collision of billiard balls is a perfectly elastic collision; kinetic energy is always conserved. 
It follows that, if at least one billiard ball crosses from one half to the other, the two systems will never reach equilibrium and heat will forever fluctuate from one half to the other. Lord Kelvin's formulation is equivalent to Clausius's. Carathéodory's formulation is never invoked because the billiard ball moving from the near side to the far side is not an adiabatic process, as far as the near or far side thermodynamic system is concerned. Also it is possible to imagine a billiard table with N billiard balls constantly moving while the number of billiard balls on each half of the table remains constant. For example, each billiard ball could bounce perpendicular to the long edges of the table, therefore never entering the other half. Another situation would be one where each billiard ball moves parallel to the long edges of the table and collides with an opposite ball at the exact midpoint of the table. Another situation would be where each ball moves parallel to the long edges of the table, but none ever collide with each other and each ball is "paired" with another which, on a different parallel, is always equidistant from the midline of the table. There are myriad other ways to make it work without invoking magic, besides the perfect initial state and suspension of friction, etc. ~Max Last edited by Max S.; 04-30-2019 at 12:00 AM. Reason: remove double quote 
#29




I'm not sure what you're saying, but if you start with a bunch of billiard balls or ideal gas molecules on a table/in a box, they fly across the middle of the table/box all the time and don't start bouncing perpendicular to the sides. And if the system is in equilibrium then the temperature is everywhere the same and not "forever fluctuating" from one (arbitrary, imaginary?) half of the box to another. Moreover, even if you start with the gas confined to one half of the box and let it freely expand, equilibrium will be achieved pretty quickly (not "never"), the temperature won't even change, but the entropy will increase, completely in accordance with the second law of thermodynamics.
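DPRK's free-expansion example can be made quantitative; the following is the standard textbook result (my sketch, not computed in the thread itself). For n moles of ideal gas expanding freely into an evacuated equal volume:

```latex
\Delta U = 0 \;\Rightarrow\; T_f = T_i \quad\text{(ideal gas)},
\qquad
\Delta S = \int_{V}^{2V} \frac{nR\,dV'}{V'} = nR\ln 2 \;>\; 0.
```

Since entropy is a state function, ΔS is evaluated along a reversible isothermal path between the same end states, even though the actual free expansion is irreversible; the temperature is unchanged while the entropy rises, exactly as stated above.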



#30




Also, to be clear, in these "macroscopic" considerations we are dealing with lots of very small particles, on large timescales.

#31




#32




Quote:
~Max 
#33




Quote:
Although that last part about billiard balls bouncing perpendicular to the walls would count as the billiard table being in a state of internal and external thermodynamic equilibrium. ~Max 
#34




Quote:
~Max 


#35




There is a reason they call it "statistical" mechanics. You can talk about a thermodynamic limit in which the number of particles grows effectively infinite. But in your example of billiard balls and the second law of thermodynamics this isn't even relevant. Consider even a single billiard ball (or 2, or 3, or whatever you prefer) bouncing around on half of, or an entire, billiard table. The (Gibbs) entropy of this system does not fluctuate. And when you remove the barrier across the middle of the table, it gets bigger. As for your observation that there might be some special or periodic trajectories with all the billiards bouncing in concert some way, these will have measure zero and can be neglected; nor will such fine structure survive in realistic physical systems.
Anyway, in short, no, the second law of thermodynamics is not routinely violated, especially not by (ideal or real) gases expanding in containers. 
#36




Quote:
Quote:
That is the position that Max S. is trying to defend. 
#37




Quote:
Suppose now that there are in fact four places each ball can be in. A microstate then is a quadruple listing the contents of the top left, bottom left, top right, and bottom right areas of the table. Assume there are two balls on the table, balls A and B. The macrostate (2,0) (both balls in the left half) could come about in two ways, corresponding to the microstates (A,B,0,0) or (B,A,0,0). The macrostate (1,1), on the other hand, has eight realizations: (A,0,B,0), (A,0,0,B), (0,A,B,0), (0,A,0,B), and four more with A and B switched. This is a higher-entropy state. But, there's nothing, absolutely nothing, that keeps the state (A,0,0,B) (say) from evolving into (A,B,0,0). A just lies there, and B transitions from bottom right to bottom left. That's all. Quote:
Quote:
Quote:
And I'd really like your opinion on these two quotations from Ludwig Boltzmann: Quote:
Quote:
Do you think he's wrong here? Last edited by Half Man Half Wit; 04-30-2019 at 12:47 PM. 
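The two-balls-in-four-areas count in this post is small enough to verify exhaustively; here is a Python sketch (mine, not HMHW's) doing so:

```python
from collections import Counter
from itertools import permutations
from math import log

# HMHW's restricted model: two labelled balls, A and B, on a table cut into
# four ball-sized areas (0 = top left, 1 = bottom left, 2 = top right,
# 3 = bottom right), with at most one ball per area.
placements = list(permutations(range(4), 2))   # (area of A, area of B)
assert len(placements) == 12

def macro(p):
    """Macrostate = (balls in left half, balls in right half)."""
    left = sum(1 for area in p if area < 2)
    return (left, 2 - left)

mult = Counter(macro(p) for p in placements)
assert mult[(2, 0)] == 2 and mult[(0, 2)] == 2 and mult[(1, 1)] == 8

# (1,1) has more realizations, hence higher entropy S = ln W:
assert log(mult[(1, 1)]) > log(mult[(2, 0)])

# The entropy-lowering transition from the post, (A,0,0,B) -> (A,B,0,0),
# is just ball B moving from area 3 to area 1 while A stays put: a single,
# perfectly legal one-ball move.
before, after = (0, 3), (0, 1)   # (area of A, area of B)
assert macro(before) == (1, 1) and macro(after) == (2, 0)
```

All assertions pass: the 2-versus-8 multiplicity claim and the legality of the entropy-lowering move both check out.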
#38




Quote:
From what I've read of your quotes, Boltzmann is also assuming a random distribution of possible internal states and redefining both thermodynamic equilibrium and the second law of thermodynamics to work upon such a basis. [1] Boltzmann, L. (1904). Vorlesungen über die Principe der Mechanik (II. Theil). Leipzig: Johannes Ambrosius Barth. https://archive.org/details/bub_gb_fJD9u5tl4NYC/page/n3 Last edited by Max S.; 04-30-2019 at 01:50 PM. Reason: removed Deutsch keyboard mappings 
#39




Re: Boltzmann
Quote:
Quote:
~Max Last edited by Max S.; 04-30-2019 at 02:06 PM. Reason: formatting 


#40




Quote:
~Max Last edited by Max S.; 04-30-2019 at 02:26 PM. 
#41




Quote:
Quote:
If we are to measure the number of billiard balls on the left side of the table, that number is neither a property of the table nor a property of an individual billiard ball. Thus, I conclude you have designated the left and right sides of the table as their own systems, and the two of them combined comprise the billiard table as a whole. Here is the hierarchy of systems and states and their properties in brackets:
~Max 
#42




@Max, could you please rephrase your question about the billiard balls?
Entropy is not "the number of microstates", if that's the issue. (It will be proportional to the integral over phase space of -p(a) log p(a), where p(a) is the probability of state a.) 
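For a finite state space, DPRK's integral reduces to a sum, and the uniform case recovers Boltzmann's count. A minimal Python sketch (my illustration, with the Boltzmann constant set to 1):

```python
from math import isclose, log

def gibbs_entropy(probs):
    """Discrete analogue of the formula above: S = -sum p ln p, with k_B = 1."""
    return -sum(p * log(p) for p in probs if p > 0)

# A uniform distribution over W microstates recovers Boltzmann's S = ln W:
W = 27
assert isclose(gibbs_entropy([1 / W] * W), log(W))

# Any non-uniform distribution on the same support has strictly lower entropy:
assert gibbs_entropy([0.5, 0.25, 0.25]) < log(3)
```

So "the number of microstates" enters only through its logarithm, and only when every microstate is equally probable, which is exactly the equilibrium postulate mentioned earlier in the thread.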
#43




Quote:
~Max 
#44




Then somebody is misunderstanding something, because the fluctuations aka random caroming of billiard balls on a table he or she is talking about do not contradict the second law of thermodynamics.
Last edited by DPRK; 04-30-2019 at 06:05 PM. 


#45




Quote:

#46




Quote:

#47




OK, so you actually think Boltzmann misunderstood his own theory in a naive way you're able to spot immediately, but no scientist in the century since has managed to. That's confidence, I like it!
So, what do you think about Feynman's example from his Lectures on Physics, typically considered to be one of the best resources for teaching physics available? Quote:
Quote:
Quote:
Quote:
Quote:
Quote:
Quote:
The coarse-grained observables I have introduced in the billiard ball example are 'number of billiard balls in the left half' and 'number of billiard balls in the right half'. The microstate is given by the arrangement of billiard balls in the subdivided areas. Quote:
Quote:
Quote:
By using a model of the billiard table that restricts the available microstates to exactly one ball per quarter of the table, we see that the state ('one ball per half') has a higher entropy than the state ('both balls in one half'). The transition from the former to the latter, which is perfectly well possible, then decreases the entropy of the system. Last edited by Half Man Half Wit; 05-01-2019 at 02:35 AM. 
#48




I'm surprised none of the experts gave an answer related to the one I gave in a recent thread. Is the following in error? @ OP — would this have addressed your question?
Unlike the Laws of Newton, Maxwell and Einstein, the Second Law isn't a dynamical law; it's just a statistical fact, closely akin to the Law of Large Numbers. 
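The analogy with the Law of Large Numbers can be made concrete with fair coin flips (my sketch, not from the post): the probability of a sizable fluctuation away from 50/50 falls off rapidly with the number of flips, but never reaches zero.

```python
from math import comb

def p_large_deviation(n):
    """Exact probability that n fair coin flips give a heads fraction at
    least 0.1 away from 1/2. The cutoff |k/n - 1/2| >= 0.1 is rewritten as
    5*|2k - n| >= n to keep the arithmetic in integers (no float rounding)."""
    hits = sum(comb(n, k) for k in range(n + 1) if 5 * abs(2 * k - n) >= n)
    return hits / 2 ** n

probs = [p_large_deviation(n) for n in (10, 100, 1000)]
# Large fluctuations get rapidly rarer as n grows, yet never become
# impossible, which is exactly the status of second-law 'violations'.
assert probs[0] > probs[1] > probs[2] > 0
```

A 10-flip run strays that far from 50/50 most of the time; a 1000-flip run almost never does, mirroring how second-law violations go from commonplace in tiny systems to unobservable in macroscopic ones.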
#49




Are you querying me or Max S. ? Of course there are thermal fluctuations, therefore the Boltzmann entropy of the system is also fluctuating, and from one particular moment to the next may decrease.
My only claim is that this does not contradict the second law of thermodynamics, unless of course one formulates it in a way that does not apply to mesoscopic or microscopic systems and then tries to apply it to such a system. The ball system is a good concrete example of a model where one can calculate everything, which is why I asked Max S. if there were any outstanding questions about that model. Last edited by DPRK; 05-01-2019 at 05:48 AM. 


#50




Quote:
