When natural gas is burned, there are intermediate processes in which molecules randomly pass through states of high free energy. For example, the reaction CO + H[sub]2[/sub]O → CO[sub]2[/sub] + H[sub]2[/sub] requires that a water molecule be torn apart. Yet it happens, and methane couldn’t burn properly without such high free-energy intermediate states.
These states aren’t considered to violate the Second Law because they affect only a tiny portion of the burning gas at any one time, and only very briefly.
I Googled a bit, mainly hitting paywalls, but did find a video lecture based on that paper. It may be slightly interesting; it includes motion pictures of DNA molecules (which were chosen for their size, shape, and flexibility rather than any special biological properties) traversing a “tilted washboard.” However, I do not see any sense of “2nd Law violation” beyond obvious fluctuations similar to those that arise trivially, as in the burning gas mentioned above.
Clausius offered the entropic formulation of the second law, where entropy must always increase, only for situations involving irreversible processes. It would therefore be impossible to prove a contradiction by literally reversing the process; if you can reverse the process, the change in entropy should be exactly zero.
I disagree with Clausius that irreversible processes exist in the grand scheme of things, and he himself gave Maxwell’s demon as an example of such an irreversible process, some ten years before Maxwell thought of the idea and decades before the thought experiment was refuted.
So as far as I am concerned, in classical thermodynamics there is no dilemma to begin with. Heat doesn’t flow in the wrong direction, either there is zero net heat flow or the system is not isolated.
I don’t see why they would violate the second law to begin with. What does free energy have to do with the second law? Changes in the distribution of heat within a system do not contradict the second law of thermodynamics as stated by Clausius, Kelvin, or Carathéodory.
From DPRK’s link you can click on “Download Accepted Manuscript” on the right side.
I’m not sure if I ever responded to this. An isolated system would, by my definition, be considered to be in a state of thermal equilibrium with its surroundings. Self-equilibrium only makes sense when you divide an isolated system into two or more sub-systems, or when the temperature of the isolated system is absolute zero (0 K).
Your definition of equilibrium makes sense but it does not describe the same property.
They are moving the bead, and the system has a certain entropy production along its trajectory. According to the second law, that entropy production ought to always be positive; according to the fluctuation theorem, it occasionally won’t be. They measure, and do find negative entropy production.
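If it helps, the transient fluctuation theorem at issue can be stated compactly (here Σ[sub]t[/sub] is the entropy production accumulated along a trajectory of duration t, expressed in units of k[sub]B[/sub]):

P(Σ[sub]t[/sub] = A) / P(Σ[sub]t[/sub] = −A) = e[sup]A[/sup]

So negative entropy production isn’t forbidden, just exponentially less probable than positive production of the same magnitude; the suppression becomes overwhelming for large systems and long times, which is why we never see it macroscopically.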
Well, now you at least know the theory and experiment regarding violations of the second law. Acceptance, usually, comes eventually (it might be a ‘going through the stages’ thing), although you’re showing some enviably strong convictions on a matter I had thought unlikely to really inspire controversy.
This gets things the wrong way around. Thermodynamics is like the science of many coin throws: for a large enough number, half of them will be heads, half of them tails. If all you ever observe are large numbers of coin throws, you might think it’s a natural law that there are always equal amounts of heads and tails (say, you only know the rough weight of the coins that came up heads vs. those that came up tails).
Then, you start, either theoretically or experimentally, to study smaller numbers of coin throws. And you see, well, your ‘law’ doesn’t hold anymore: you throw ten coins, and well, it’s not always the case that five come up heads, and five tails. Rather, the whole thing turns out to be statistical: the chance of each coin toss coming up heads is 50%, but that doesn’t mean all ten can’t come up heads.
So, you’ve traded in the ‘exactness’ of your law—which has turned out to be only approximate—for a statistical law describing more fundamental dynamics.
You can use this new-found knowledge to precisely calculate how likely it is that a given ensemble of N coins contains half heads and half tails. This is how you can derive the laws of the ensemble from those of the more fundamental theory.
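As a quick sketch of that sort of calculation (Python; the coin counts are just arbitrary examples):

[code]
from math import comb

def p_half_heads(n):
    """Probability that exactly n/2 of n fair coins come up heads."""
    return comb(n, n // 2) / 2 ** n

print(p_half_heads(10))    # ~0.246: even for 10 coins, an exact split is the exception
print(p_half_heads(1000))  # ~0.025: an *exact* split gets rarer as n grows,
                           # even as the *fraction* of heads concentrates near 1/2
[/code]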
So it is with statistical mechanics. You can calculate the expected entropy increase for any given system, then measure it, and find agreement. This is also how you know that the entropy from statistical mechanics is nothing different from that of thermodynamics; it just yields a more complete picture.
You can’t measure ‘statistical’ entropy, you can only measure entropy. You can calculate the entropy based on statistical methods, then make a measurement, and see that it agrees.
This is confused. A system has entropy because we don’t have complete knowledge of the microstate. Given that particular macrostate, it can equally well be in either microstate compatible with that description. It isn’t any more likely to be in one of those states than the other, because if we could say that it is, we would have more knowledge about the microstate than the macrostate allows, and hence, the system would not actually be in that macrostate.
Perhaps it helps if you think of it as a betting matter. Given the macrostate, which microstate would you bet the system’s in? And there’s only one way to bet: if all the information you have is the macrostate, then it could equally well be in either microstate consistent with that.
This isn’t an assumption that all microstates are equally likely at any moment; it’s merely a statement of ignorance.
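If it helps, here’s the counting spelled out for a toy system (Python; the four-particle system is just an invented illustration):

[code]
from itertools import product

# Toy system: 4 two-state particles. The macrostate is the number of
# particles 'up'; the microstate is the full configuration.
microstates = list(product('ud', repeat=4))
for n_up in range(5):
    compatible = [m for m in microstates if m.count('u') == n_up]
    print(n_up, len(compatible))
# Given only the macrostate, each compatible microstate is an equally good
# bet: knowing n_up = 2 leaves 6 possibilities, and nothing singles one out.
[/code]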
But you do accept that a gas, say, is ultimately made out of smaller particles, atoms? And do you accept that their motions are completely reversible? That is, that if an atom can move to the right at velocity v, then it can as well move to the left at velocity v?
If so, and if the gas is completely described in terms of its constituent atoms, and it’s in a state where each atom has a certain velocity in a certain direction, then what does prevent the state in which each atom has the exactly opposite velocity in the exactly opposite direction from being possible?
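To make the reversibility point concrete: if x(t) solves Newton’s equation m d[sup]2[/sup]x/dt[sup]2[/sup] = F(x) for a velocity-independent force, then y(t) = x(−t) solves it too, because the two sign flips from differentiating with respect to −t cancel in the second derivative. The reversed trajectory retraces the same positions with every velocity flipped, so it is just as valid a solution.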
Clausius’ irreversible systems exist only in an approximate sense: if we start out in a low entropy state, then it will almost never happen that entropy further decreases. Hence, macroscopic systems are almost always irreversible. But, just as it’s not in principle impossible that the shards of a broken cup, if shaken in a box, spontaneously reform into a cup, so it is not impossible for the velocities of the atoms in a gas to be aligned in such a way as to lower the entropy—say, by all collecting in the lower left corner of a box.
This doesn’t make sense. If the system is isolated, then obviously there’s zero net heat flow, since there’s no place where the heat could flow.
But, in the case of heat flowing from body A to body B (hence, neither being isolated), say, by transfer of a mass of hot gas (convection), there is no contradiction in thinking that all the atoms in the gas could have the opposite velocity, heat thus flowing to the colder system.
How could an isolated system be in equilibrium with its surroundings? If the system is isolated, there is no heat flow, and thus, no means by which to achieve equilibrium. I mean, that’s why we keep our hot beverages in specially-designed flasks that limit the interaction with the surroundings—to keep the coffee hot, and slow down equilibration as much as possible!
If there is zero net heat flow then by definition:
Let δQ = 0
ΔS = ∫ δQ/T
= ∫ 0/T
= 0; Q.E.D.
That is the resolution to Loschmidt’s paradox, as related by DPRK. Apparently the resolution is different using your definition of entropy rather than Clausius’s.
When I used “measure” I meant calculate. I am not aware of any physical method to measure the entropy of a given system, nor of any analog to the thermometer or pressure gauge.
This would be the law of conservation of momentum, which follows from Newton’s laws of motion and which, to my knowledge, is preserved in relativity so long as we are talking about one inertial frame of reference.
We can’t know every detail about real experiments, but theoretical thermodynamic systems can be fully defined by the laws of physics. This is the central premise of strong physicalism, as we discussed in the Dualism thread. There’s no reason to bet; I can look at the description of a theoretical system and tell you exactly how it is.
Are you saying a fully defined theoretical system has no entropy? How is this consistent with Clausius’s definition of entropy as a function of heat flow and temperature?
Is it impossible to fully define a theoretical system? Are you denying local reality? How is that compatible with monistic physicalism?
This is not my understanding of classical thermodynamics, it is my understanding of statistical mechanics as relayed to me by you. My interpretation of classical thermodynamics is based on reading papers at face value. I linked the papers in the original post. Your saying I have it backwards is truly perplexing.
If you must say there is no difference between classical thermodynamics and statistical mechanics, just replace my usage of “classical thermodynamics” with “Max S.'s personal science of thermodynamics”. When there’s a law in my personal science of thermodynamics, that means absolutely no violations. The definition of entropy in Max S.'s thermodynamics is given by the equation:
ΔS = ∫ δQ/T
where δQ is an increment of net heat flow into the system and T is temperature.
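For a concrete (made-up) example of how I would apply it: if a system reversibly absorbs Q = 300 J of heat at a constant temperature of T = 300 K, then ΔS = ∫ δQ/T = Q/T = 1 J/K.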
The second law of thermodynamics is given in three different, equally inviolable forms in the original post. There are two valid corollaries to the second law of Max S.'s thermodynamics:
“The entropy of an isolated system shall not change”.
“The entropy of an isolated system shall not decrease”.
Where “isolated system” means zero net heat flow external to the system. This is my plain interpretation of documents linked in the original post. Why is the physics wrong?
To me it seems everybody is redefining entropy and the second law of thermodynamics. I take the laws as defined by Clausius, Kelvin, and Carathéodory and find no contradiction. Neither do I yet understand why everybody redefined entropy to begin with, or why the second law of thermodynamics as expressed in a corollary at the end of Clausius’s paper should still hold true after changing the definition of entropy.
So what if there’s no contradiction? Classical thermodynamics is false. It doesn’t describe the universe. Temperature, heat, entropy, etc. aren’t some goo that permeates the universe; they’re emergent properties of an ensemble of particles. If there were no granularity to the universe (if it were smooth all the way down and temperature were a real property of things), then classical thermodynamics would be fine. But it’s not, and so statistical mechanics is the real description, and with it comes the fluctuation theorem and the possibility that the 2LoT will only be true on average.
It’s the same thing. One can examine the microscopic origin of properties of matter, which is where statistical methods come in.
I never said any such thing, and neither does the paper!? So there may be a misunderstanding.
Again, one way to think about it is that if you have a particle subjected to Brownian motion in a fluid, and you pull on it by moving a laser with constant velocity, then some of the time the force does positive work on the particle, and some of the time it does negative work, but the former is exponentially more probable than the latter. Far from purporting to contradict this (or any related) law, the experiment exactly confirms it.
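To illustrate, here is a toy overdamped Langevin sketch in reduced units with made-up parameters (not the actual experimental analysis; “entropy production” here is just the work done by the moving trap in units of k[sub]B[/sub]T):

[code]
import numpy as np

rng = np.random.default_rng(0)

# Reduced units: k_B*T = 1, drag coefficient = 1. All parameters are invented.
k, v = 1.0, 0.5            # trap stiffness and trap speed
dt, n_steps, n_traj = 1e-3, 2000, 10_000

x = np.zeros(n_traj)       # each particle starts at its trap centre
sigma = np.zeros(n_traj)   # entropy production per trajectory (units of k_B)

for step in range(n_steps):
    t = step * dt
    f = -k * (x - v * t)                    # force the trap exerts on the particle
    sigma += v * f * dt                     # dSigma = v * F * dt / (k_B * T)
    x += f * dt + np.sqrt(2 * dt) * rng.standard_normal(n_traj)

print("mean entropy production:", sigma.mean())    # comes out positive
print("fraction of trajectories with sigma < 0:",
      (sigma < 0).mean())                          # small but clearly nonzero
[/code]

On average the particle lags behind the trap, so the trap does positive work on it; but over short trajectories a fair fraction of runs come out negative, exactly as the fluctuation theorem predicts.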
That’s indeed the case if a system is in equilibrium. However, it’s a necessary, not sufficient condition for equilibrium: my coffee in a perfect thermos has no net heat flow to the environment; yet, it’s not in equilibrium with it.
I didn’t say that the momentum spontaneously changes to pointing in the other direction. I said that it’s a valid state for the atom to move to the left at v, and a valid state for it to move to the right at v. Both will be solutions of the equations of motion.
I have no idea what any of that is supposed to mean. A thermodynamic system is defined (theoretically or experimentally) as one in which we use macroscopic, averaged quantities in order to describe what would otherwise be an impossible amount of data with a few variables. If you want to describe it in terms of microscopic physics, you’re leaving the thermodynamic level.
Which is the same as in statistical mechanics. It’s just that statistical mechanics shows us that dS >= 0 doesn’t always hold any more.
The wikipedia article shows you how to derive it from statistical physics. The fact that this equation is derivable, rather than having to be postulated, means that statistical mechanics is the more fundamental science.
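For what it’s worth, the gist of the derivation is short (this is the standard textbook route, sketched from memory rather than from that specific article): start from the Gibbs entropy

S = −k[sub]B[/sub] Σ[sub]i[/sub] p[sub]i[/sub] ln p[sub]i[/sub]

with canonical probabilities p[sub]i[/sub] = e[sup]−E[sub]i[/sub]/(k[sub]B[/sub]T)[/sup]/Z. For a small change of the p[sub]i[/sub] at fixed energy levels, Σ[sub]i[/sub] dp[sub]i[/sub] = 0, and since ln p[sub]i[/sub] = −E[sub]i[/sub]/(k[sub]B[/sub]T) − ln Z,

dS = −k[sub]B[/sub] Σ[sub]i[/sub] dp[sub]i[/sub] ln p[sub]i[/sub] = (1/T) Σ[sub]i[/sub] E[sub]i[/sub] dp[sub]i[/sub] = δQ/T,

identifying δQ = Σ[sub]i[/sub] E[sub]i[/sub] dp[sub]i[/sub], the energy change due to reshuffling the occupation of the levels, i.e. heat.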
Sure. You may also take the laws of Newton, and find no contradiction. You can write down all manner of consistent theories, it’s just that nature has no need to oblige you. And nature doesn’t: the laws of thermodynamics, just as the laws of Newton, are valid exactly only in their proper domain.
There is no change, merely an explanation. Entropy was introduced as a phenomenological quantity; a more accurate, more inclusive theory was found that explains what that quantity is.
But this isn’t going to lead anywhere, is it. I’ve shown you quotes, arguments, and experiments, all of which explicitly disagree with you, and yet, to you, it still ‘seems like’ the second law should hold inviolable. You don’t give any reason for that, beyond your own belief.
So let’s try something else. I’m going to try breaking things down as much as possible, so we can figure out where you actually don’t understand what I’m saying. Let’s start with a simple toy example.
[ol]
[li]Suppose you have a million coins.[/li]
[li]At every time-step t, a subset of k of those coins is flipped.[/li]
[li]Which coins are flipped depends on which coins were flipped in the previous step, and how they landed.[/li]
[li]Coin-flipping is a perfectly deterministic process: given perfect knowledge of the initial state, the outcome would be exactly predictable.[/li]
[li]It is nevertheless meaningful to talk about a coin-flip coming up heads with 50% probability, since we never have that perfect knowledge.[/li]
[li]At any time-step, roughly half of the k coins are going to come up heads.[/li]
[li]Nevertheless, it is possible for all of them to come up tails.[/li]
[li]Suppose we start with all of the coins tails up.[/li]
[li]Each time-step is going to drive the system closer to 50% heads, 50% tails, on average.[/li]
[li]Suppose, for instance, that heads and tails are colored black and white, respectively, and you can only observe the aggregate color; then, the system is going to get closer to ‘grey’ over time.[/li]
[li]There is exactly one way to realize the initial state.[/li]
[li]There are many more ways to realize the state ‘50% heads, 50% tails’.[/li]
[li]Virtually all the time, we’re going to see a state that’s away from ‘50% heads, 50% tails’ evolve towards that state.[/li]
[li]Nevertheless, occasionally, at some time-step, all of the coins (or a sizable majority of them) are going to come up heads.[/li]
[li]Occasionally, if we wait long enough, we’re going to see the system evolve away from ‘50% heads, 50% tails’.[/li]
[li]These departures can take on an arbitrary length: two time-steps in which there is an evolution in the same ‘wrong’ direction may occur one after the other.[/li]
[li]Waiting long enough, we’re going to see the system get as far from ‘50% heads, 50% tails’ as we want.[/li]
[li]Suppose we had, after watching the system for some time, formulated a law: ‘the system will always become more grey over time’.[/li]
[li]Alternatively, we could say that the quantity G = 1 - (1 - 2K)^2, where K is the fraction of coins showing heads, always tends to its maximum. This quantity assumes its maximum at K = 1/2, and is zero both for K = 0 (all white) and K = 1 (all black), and thus captures what we mean by ‘the system will always become more grey over time’, at least in that respect.[/li]
[li]This law, even though we had thought it to be exact, in fact only holds on average.[/li]
[li]If we wait long enough, we will observe violations of this law. Sometimes, the system will become more white, or more black; equivalently, sometimes a time-step will lead to a decrease of G (see the simulation sketch just below).[/li]
[/ol]
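And, to make the toy model concrete, a minimal simulation sketch (Python; I’m papering over the deterministic bookkeeping of points 3 and 4 by letting pseudorandomness stand in for our ignorance, and the sizes are arbitrary):

[code]
import numpy as np

rng = np.random.default_rng(1)

n, k, steps = 1_000_000, 1000, 10_000    # coins, flips per step, time-steps
coins = np.zeros(n, dtype=bool)          # start all tails ('all white')

def greyness(c):
    # G = 1 - (1 - 2K)^2 with K the fraction of heads; maximal (1) at K = 1/2
    K = c.mean()
    return 1 - (1 - 2 * K) ** 2

g_prev, decreases = greyness(coins), 0
for _ in range(steps):
    flipped = rng.integers(0, n, size=k)  # coins to flip (with replacement, for
                                          # speed; duplicate picks are negligible)
    coins[flipped] = rng.random(k) < 0.5  # each flipped coin lands heads with p = 1/2
    g = greyness(coins)
    decreases += g < g_prev
    g_prev = g

print("final fraction of heads:", coins.mean())    # drifts towards 0.5
print("time-steps where G decreased:", decreases)  # individual steps do violate the 'law'
[/code]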
If you disagree with any of the above, please tell me exactly with what, and why.
Agreed, but I would say we don’t need to say it’s false per se:
We can have different scientific models for the same phenomenon that are at different levels of abstraction or precision.
If I am calculating where a cannonball will land, I’m not going to go to the level of electron orbitals, because it would take forever to calculate and the chance of a quantum phenomenon manifesting on a macro scale (e.g., the whole ball quantum tunneling) is absurdly small. For this kind of calculation, Newtonian mechanics is probably the right choice.
Meanwhile, the laws of thermodynamics are the right fit for almost all experiments and real-world applications. But not all.
The paper in question is titled “Experimental Demonstration of Violations of the Second Law of Thermodynamics for Small Systems and Short Timescales”, and Half Man Half Wit clearly [POST=21620578]cited[/POST] it as an experiment demonstrating violations of the second law. From the paper:
Sorry, I meant to say I found no contradiction between those laws and the thought experiments or real experiments in this thread. If a thermodynamic system is so small that it has no thermodynamic properties, such that you can’t even theoretically assign temperature or heat, it doesn’t make sense to apply the laws of thermodynamic systems at that level, does it? If such a small system exists without those properties, it isn’t really a thermodynamic system.
But a single molecule of gas in the smallest of boxes could be a thermodynamic system.
I don’t understand, why is your coffee not in equilibrium with its surroundings?
Loschmidt’s paradox as you described it involves negating the velocity of particles. I thought you provided this paradox to show a violation of the second law of thermodynamics. But there is no violation, because an isolated system subject only to reversible processes has zero entropy change, by my definition. The point is, Loschmidt’s paradox does not contradict [DEL]the classical[/DEL] Max S.'s second law of thermodynamics. How can you say my definition of entropy is the same as yours when using your definition of entropy invites a paradox? No matter which definition is correct, you must admit that they are different.
I don’t understand why the laws of thermodynamics cease to apply at microscopic levels. So long as you can define heat and temperature, you can define entropy, and the second law of thermodynamics should hold for a system consisting of one atom (closed but not isolated) or two atoms (isolated).
I’m having trouble understanding how Wikipedia justifies its definition of temperature (step 1), since the article it links to doesn’t show a similar form. Neither do I know what an eigenstate is. I can keep studying these, but it will take me some time to make any sense of it.