

#101




Quote:
Clausius went on to say that there were such things as irreversible processes, which is in my opinion wrong, and refined the formula to an effective ∫ dQ/T > 0. Even so, both formulas are true. Clausius said the entropy of a closed system can only stay the same or increase. I say it never changes at all. You seem to be saying that it can increase or decrease. Somebody has to be wrong about something. I've reproduced the relevant section of his paper in the spoiler below[1]. SPOILER:
~Max [1] Clausius, R. (1856, August). On a modified Form of the second Fundamental Theorem in the Mechanical Theory of Heat. London, Edinburgh and Dublin Philosophical Magazine and Journal of Science, Series 4, 12(77), 92-98. Retrieved from https://archive.org/stream/londonedi...ge/92/mode/2up Last edited by Max S.; 05-03-2019 at 03:31 PM. 
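Clausius's dS = δQ/T is easy to turn into arithmetic. Here is a minimal Python sketch of the bookkeeping for heat flowing between two reservoirs; the values of Q, T_h, and T_c are invented purely for illustration:

```python
# Entropy bookkeeping per Clausius: dS = delta-Q / T for each reservoir.
# All numbers below are made up for illustration.
Q = 100.0    # joules of heat transferred from hot to cold
T_h = 400.0  # hot reservoir temperature, kelvin
T_c = 300.0  # cold reservoir temperature, kelvin

dS_hot = -Q / T_h   # the hot reservoir loses heat at high temperature
dS_cold = +Q / T_c  # the cold reservoir gains the same heat at low temperature
dS_total = dS_hot + dS_cold

print(dS_total)  # positive whenever T_h > T_c: total entropy increases
```

Running the sum for any T_h > T_c gives a positive total, which is the irreversible heat-flow case the quoted passage is arguing about.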
#102




Quote:
And I thought we said microscopic particles still had to obey the classical laws of physics? The movement of particles would not be fundamentally random; it would be absolutely determined by the microstate at any previous instant. You can theoretically create such a lopsided box of cold and hot gases, but only by the demon expending energy outside the system to make it so orderly, and expending energy again to remove the barrier. ~Max 
#103




Quote:
~Max 
#104




Quote:
And it could very well be that the fundamental reality is random, but it was my impression from the other thread and your posts here that you do not subscribe to that idea, or at least do not present that argument now. ~Max 


#105




Quote:
Quote:
Quote:
Quote:
What I meant is that thermodynamics gives you an effective, ultimately heuristic picture. Statistical mechanics connects this heuristic picture with the exact fundamental dynamics, and tells us how it comes about. At the microphysical level, there are no quantities like temperature or pressure; each is just an effective quantity that derives from more fundamental quantities of the underlying system, such as the kinetic energy of the individual particles. Entropy, likewise, emerges as an effective accounting of the number of states of the more fundamental system that realize the same state of the approximate, coarse-grained thermodynamic picture. 
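That "effective accounting" can be made concrete in a toy model (not from the post itself: a box of N particles, coarse-grained only by how many sit in the left half):

```python
from math import comb, log

# Coarse-graining toy: the macrostate records only how many of N particles
# are in the left half of the box. comb(N, n) counts the microstates
# realizing each macrostate; Boltzmann's entropy is S = ln(omega), k_B = 1.
N = 10
entropy = [log(comb(N, n)) for n in range(N + 1)]

print(entropy.index(max(entropy)))  # 5: the evenly spread macrostate has the most microstates
```

The extreme macrostates (all particles on one side) have a single microstate each, hence zero entropy; the evenly spread one is realized by the most microstates, which is exactly the counting the quoted paragraph describes.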
#106




Quote:
~Max Last edited by Max S.; 05-03-2019 at 05:06 PM. Reason: Hi, Opal! 
#107




The definition of entropy
Quote:
Quote:
~Max 
#108




We should get together for a game of craps sometime.

#109




Quote:
Quote:
That something can be expressed in terms of state variables doesn't preclude that same thing being expressed on the basis of microscopic variables, just as how temperature can be expressed as the average molecular energy, as well as in terms of internal energy and entropy. Those aren't two different things, but merely two different ways of talking about the same thing. Like the morning star and the evening star both refer to Venus, even though, quite clearly, morning and evening are different things. You can't logically pry them apart; and you can't logically pry thermodynamic entropy and the entropy of statistical mechanics apart. Last edited by Half Man Half Wit; 05-03-2019 at 05:34 PM. 


#110




Re: Fluctuation theorem experiment
Quote:
Quote:
I mean, of course you can physically move particles to states of lesser (statistical) entropy. It takes work to do so, but I'm not sure if they accounted for the powering of the laser and adjusted their heat sinks accordingly. They didn't give the temperature of the solution or heat sinks, but it would be near impossible to keep internal equilibrium after imposing an artificial temperature gradient. Not that internal equilibrium is necessary for the second law of thermodynamics...

Unless you are using the statistical definition of entropy and the entropy formulation of the second law of thermodynamics. In that case the law only holds absolutely true in an isolated system when the internal motions of atomic particles are exactly periodic, or when the temperature of the system is zero kelvin. In all other cases the statistical entropy of a system will fluctuate, and this is the extent of any conclusions I can support from that paper.

As I keep saying, this is all fine and good within the theory of statistical mechanics, but it would seem that statistical mechanics assumes a priori that every microstate has equal probability at every moment, regardless of previous microstates. If it were not so, there would be no probability distributions to base fluctuation theorems upon. That means every law of dynamics is reduced from an absolute law to "very probable". The same argument about the second law of thermodynamics in that paper can be made to show individual particles are merely unlikely to jump across my desk faster than the speed of light in a vacuum. Or rather, the laws of physics are violated all the time, but the chances of it having any noticeable effect are so low that it may as well be impossible. But I don't believe this is the position you take. ~Max 
#111




1. What are you saying is the problem with the experiment in that paper, or the theorem they proved?
2. It's hard for me to parse what you are saying here, but on one hand, yes, as I mentioned, there are certain "assumptions" about systems (like your gas in a box) tending towards thermal equilibrium under certain conditions, and on the other hand, at the same time, there is research on setting up systems involving e.g. ultracold atoms that resist thermalization and exhibit quantum scarring and similar phenomena. It goes to show you that there are still interesting things to do in condensed matter physics. 
#112




Quote:
Quote:
Quote:
~Max 
#113




Re: Fluctuation theorem experiment
Quote:
[1] "... every such quantity received may be brought into calculation as if it had been generated by work, and every quantity lost by a reservoir of heat, as if it had been converted into work." Clausius, R., supra at #101, in spoiler.
Quote:
~Max 
#114




Quote:
Quote:
Quote:
Quote:
Quote:
Quote:
Quote:
However, relativity makes additional predictions in the case where the Newtonian theory no longer applies. In particular, relativistic speed has a maximum value, which it can't exceed. The Newtonian theory knows no such limit; but it would be spurious to say that, therefore, there are really two notions of speed, one Newtonian and one relativistic. You can't say, "but my spaceship flies with Newtonian speed, and so can go as fast as it likes!" But that's exactly what you're saying regarding entropy. The relativistic notion of speed has revised and extended the Newtonian one, which is now understood to be only (approximately) valid in the regime of the very small (absolute value); the statistical notion of entropy has revised and extended the classical thermodynamical one, which is now understood to be only (approximately) valid in the regime of the very large (number of particles). Quote:



#115




Quote:
Quote:
If one assumes that the past is (for whatever reason) in a low-entropy state, there is no dilemma here: we're still overwhelmingly likely to observe a steady increase of entropy. But it does point out that there are perfectly physically allowable evolutions of a system that decrease entropy. Furthermore, it also shows that by considering the microscopic dynamics, you can infer how, e.g., heat gets transferred to a hotter body: just take the molecules involved in, e.g., some convective process, and invert their momenta. And lo and behold, suddenly heat flows in the 'wrong' direction. But of course, we knew already that this must, on occasion, be possible. 
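The momentum-inversion argument can be illustrated with a toy that is exactly reversible: free streaming on a ring with integer arithmetic, so there is no rounding error. This is not a model of a real gas, just a sketch of Loschmidt's reversal; the positions and velocities below are arbitrary:

```python
# Loschmidt-style reversal on an exactly reversible toy dynamics:
# particles free-streaming on a ring of L sites, integer arithmetic throughout.
L = 100
pos = [3, 17, 42, 99]   # arbitrary starting positions
vel = [1, -2, 5, -7]    # arbitrary integer velocities
start = list(pos)

def stream(pos, vel, n):
    # advance every particle n steps around the ring
    return [(x + n * v) % L for x, v in zip(pos, vel)]

pos = stream(pos, vel, 1000)   # run forward 1000 steps
vel = [-v for v in vel]        # Loschmidt's move: invert every momentum
pos = stream(pos, vel, 1000)   # run forward again for the same duration

print(pos == start)  # True: the system retraces its history exactly
```

Because the microscopic rule is time-reversal invariant, the inverted-momentum state is just as lawful as the original, and it undoes the whole evolution, exactly the point being made about heat flowing the 'wrong' way.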
#116




Quote:
There is no "different definition" of entropy, just the usual nuances in defining state variables for a nonequilibrium system. There are also other experiments studying entropy in complicated nonequilibrium conditions if you are interested. 
#117




Quote:
Quote:
~Max 
#118




Quote:
~Max 
#119




Quote:
~Max Last edited by Max S.; 05-07-2019 at 01:54 PM. 


#120




Quote:
The point is, I don't see how that experiment contradicts the second law of thermodynamics as cited in the original post here. The "law" the paper purports to contradict is not the second law of thermodynamics, in my opinion. ~Max 
#121




When natural gas is burned, there are intermediate processes where molecules devolve randomly into states with high free energy. For example, the reaction CO + H₂O → CO₂ + H₂ requires that a water molecule be torn apart. Yet it happens, and methane couldn't burn properly without such high free-energy intermediate states.
These states aren't considered to violate the Second Law because they affect only a tiny portion of the burning gas at any one time, and only very briefly. Quote:

#122




Quote:
I disagree with Clausius that irreversible processes exist on the grand scale of things, and he himself gave Maxwell's demon as an example of such an irreversible process, some ten years before Maxwell thought of the idea and decades before the thought experiment was refuted. So as far as I am concerned, in classical thermodynamics there is no dilemma to begin with. Heat doesn't flow in the wrong direction: either there is zero net heat flow or the system is not isolated. ~Max 
#123




Quote:
Quote:
~Max Last edited by Max S.; 05-07-2019 at 02:51 PM. Reason: fixed quote 
#124




Quote:
Your definition of equilibrium makes sense but it does not describe the same property. ~Max 


#125




Quote:
Quote:
Quote:
Then, you start, either theoretically or experimentally, to study smaller numbers of coin throws. And you see, well, your 'law' doesn't hold anymore: you throw ten coins, and it's not always the case that five come up heads and five tails. Rather, the whole thing turns out to be statistical: the chance of each coin toss coming up heads is 50%, but that doesn't mean all ten can't come up heads. So, you've traded in the 'exactness' of your law, which has turned out to be only approximate, for a statistical law describing more fundamental dynamics. You can use this newfound knowledge to precisely calculate how likely it is that a given ensemble of N coins contains half heads and half tails. This is how you can derive the laws of the ensemble from those of the more fundamental theory.

So it is with statistical mechanics. You can calculate the expected entropy increase for any given system, then measure it, and find agreement. This is also how you know that the entropy from statistical mechanics is nothing different from that of thermodynamics; it just yields a more complete picture. Quote:
Quote:
Perhaps it helps if you think of it as a betting matter. Given the macrostate, which microstate would you bet the system's in? And there's only one way to bet: if all the information you have is the macrostate, then it could equally well be in either microstate consistent with that. This isn't an assumption of all microstates being equally likely at any moment; it's merely a statement of ignorance. Quote:
If so, and if the gas is completely described in terms of its constituent atoms, and it's in a state where each atom has a certain velocity in a certain direction, then what does prevent the state in which each atom has the exactly opposite velocity in the exactly opposite direction from being possible? Clausius' irreversible systems exist only in an approximate sense: if we start out in a low entropy state, then it will almost never happen that entropy further decreases. Hence, macroscopic systems are almost always irreversible. But, just as it's not in principle impossible that the shards of a broken cup, if shaken in a box, spontaneously reform into a cup, so it is not impossible for the velocities of the atoms in a gas to be aligned in such a way as to lower the entropy, say, by all collecting in the lower left corner of a box. Quote:
But, in the case of heat flowing from body A to body B (hence, neither being isolated), say, by transfer of a mass of hot gas (convection), there is no contradiction in thinking that all the atoms in the gas could have the opposite velocity, heat thus flowing to the colder system. How could an isolated system be in equilibrium with its surroundings? If the system is isolated, there is no heat flow, and thus, no means by which to achieve equilibrium. I mean, that's why we keep our hot beverages in specially designed flasks that limit the interaction with the surroundings, to keep the coffee hot, and slow down equilibration as much as possible! 
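The coin-throw statistics discussed above can be checked directly. A small Python simulation (ensemble sizes, trial counts, and the seed are arbitrary) showing that fluctuations shrink as the ensemble grows:

```python
import random

random.seed(0)

def mean_deviation(n, trials=2000):
    # Average |fraction of heads - 1/2| over many ensembles of n fair coins.
    dev = 0.0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(n))
        dev += abs(heads / n - 0.5)
    return dev / trials

small, large = mean_deviation(10), mean_deviation(1000)
print(small, large)  # the 1000-coin ensembles sit much closer to 50/50
```

Ten coins routinely wander far from half-and-half, while a thousand coins rarely do: the 'law' of equal heads and tails is only an approximation that sharpens as N grows, exactly like the thermodynamic limit.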
#126




Quote:
~Max 
#127




Loschmidt's paradox
Quote:
Let δQ = 0
That is the resolution to Loschmidt's paradox, as related by DPRK. Apparently the resolution is different using your definition of entropy rather than Clausius's. ~Max Last edited by Max S.; 05-07-2019 at 04:22 PM. Reason: indent 
#128




Quote:
~Max 
#129




Quote:
~Max 


#130




Quote:
Are you to say a fully defined theoretical system has no entropy? How is this consistent with Clausius's definition of entropy as a function of heat flow and temperature? Is it impossible to fully define a theoretical system? Are you denying local reality? How is that compatible with monistic physicalism? ~Max Last edited by Max S.; 05-07-2019 at 04:48 PM. 
#131




Quote:
If you must say there is no difference between classical thermodynamics and statistical mechanics, just replace my usage of "classical thermodynamics" with "Max S.'s personal science of thermodynamics". When there's a law in my personal science of thermodynamics, that means absolutely no violations. The definition of entropy in Max S.'s thermodynamics is given by the equation: ΔS = ∫ δQ/T, where δQ is net heat flow and T is temperature. The second law of thermodynamics is given in three different, equally inviolable forms in the original post. There are two valid corollaries to the second law of Max S.'s thermodynamics: "The entropy of an isolated system shall not change". "The entropy of an isolated system shall not decrease". Where "isolated system" means zero net heat flow external to the system. This is my plain interpretation of documents linked in the original post. Why is the physics wrong? We are back to square one. ~Max Last edited by Max S.; 05-07-2019 at 05:03 PM. 
#132




Quote:
~Max 
#133




So what if there's no contradiction? Classical thermodynamics is false. It doesn't describe the universe. Temperature, heat, entropy, etc. aren't some goo that permeates the universe; they're emergent properties of an ensemble of particles. If there were no granularity to the universe, if it were smooth all the way down and temperature were a real property of things, then classical thermodynamics would be fine. But it's not, and so statistical mechanics is the real description, and with it comes the fluctuation theorem and the possibility that the 2LoT will only be true on average.

#134




Quote:
Quote:
Again, one way to think about it is that if you have a particle subjected to Brownian motion in a fluid, and you pull on it by moving a laser with constant velocity, then some of the time the force does positive work on the particle, and some of the time it does negative work, but the former is exponentially more probable than the latter. Far from purporting to contradict this (or any related) law, the experiment exactly confirms it. 
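A crude caricature of that dragged-particle statistics can be simulated (this is not the actual fluctuation theorem, nor the experiment's numbers: per-step work increments are simply drawn i.i.d. Gaussian with an invented positive mean, standing in for drag dominating over thermal kicks):

```python
import random

random.seed(2)

def negative_work_fraction(steps, trajectories=5000):
    # Toy model: per-step work on the particle is Gaussian with a small
    # positive mean (the pulling force usually does positive work).
    neg = 0
    for _ in range(trajectories):
        W = sum(random.gauss(0.1, 1.0) for _ in range(steps))
        if W < 0:
            neg += 1
    return neg / trajectories

short, long_ = negative_work_fraction(10), negative_work_fraction(200)
print(short, long_)  # negative-work trajectories get rarer on longer timescales
```

Over short stretches a fair fraction of trajectories absorb net negative work, but over long stretches almost none do, mirroring the "exponentially more probable" asymmetry described above.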


#135




Quote:
Quote:
Quote:
Quote:
The Wikipedia article shows you how to derive it from statistical physics. The fact that this equation is derivable, rather than having to be postulated, means that statistical mechanics is the more fundamental science. Quote:
Quote:
But this isn't going to lead anywhere, is it. I've shown you quotes, arguments, and experiments, all of which explicitly disagree with you, and yet, to you, it still 'seems like' the second law should hold inviolable. You don't give any reason for that, beyond your own belief. So let's try something else. I'm going to try breaking things down as much as possible, so we can figure out where you actually don't understand what I'm saying. Let's start with a simple toy example.
If you disagree with any of the above, please tell me exactly with what, and why. 
#136




Quote:
We can have different scientific models for the same phenomenon that are at different levels of abstraction or precision. If I am calculating where a cannonball will land, I'm not going to go to the level of electron orbitals, because it would take forever to calculate and the chance of a quantum phenomenon happening on a macro scale (e.g. the whole ball quantum tunnels) is absurdly small. For this kind of calculation Newtonian mechanics is probably the right choice. Meanwhile the laws of thermodynamics are the right fit for almost all experiments and real-world applications. But not all. Last edited by Mijin; 05-08-2019 at 02:34 AM. 
#137




Quote:
Small Systems and Short Timescales" and Half Man Half Wit clearly cited it as an experiment demonstrating violations of the second law. From the paper: Quote:

#138




Quote:
But a single molecule of gas in the smallest of boxes could be a thermodynamic system. ~Max Last edited by Max S.; 05-08-2019 at 09:16 AM. 
#139




Quote:
Quote:
Quote:
~Max 


#140




Quote:
~Max 
#141




Re: Flipping coins
Quote:
~Max 
#142




IANAPhysicist, but it seems to me the Laws of Physics don't have any legal standing. If any of them were "routinely" violated, wouldn't the physics community simply stop calling it a Law?

#143




It's historical verbiage.

#144




Quote:



#145




Quote:
Anyway, I presume that, if that was your only objection, you accept the conclusion: that, observing the system (talking just about the coins here), one might formulate a law to the effect that 'greyness only increases', and further, that this law can be violated? 
#146




Quote:
I don't think the second law of thermodynamics is statistical in nature, nor that it applies only to large systems. So long as a system has heat, temperature, and pressure, it should be a thermodynamic system subject to the laws of thermodynamics. ~Max 
#147




Quote:

#148




Quote:
Actually I was wondering how entropy can increase (or decrease!) in an isolated system as Half Man Half Wit seems to imply it does. ~Max 
#149




Quote:
I will concede that one might formulate a law to the effect that 'greyness only increases', and further, that law can be violated. What is your point? ~Max 


#150




Dice logic
Quote:
I think the minor premise should read: "A six-sided die is a die with a 1/6 probability of landing on any particular side". Then we have another blank here:
~Max 
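The proposed minor premise is easy to sanity-check numerically; a small sketch (roll count and seed arbitrary):

```python
import random

random.seed(3)

# Roll a fair six-sided die many times and tally each face.
rolls = 60000
counts = [0] * 6
for _ in range(rolls):
    counts[random.randrange(6)] += 1

# Each side should come up close to 1/6 of the time.
print([round(c / rolls, 3) for c in counts])
```

Every face's frequency lands within a percent or so of 1/6, which is all the premise "a 1/6 probability of landing on any particular side" asserts: a statement about long-run frequencies, not about any single roll.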

