Ah, I think I am beginning to understand. It must have been “random hopping” that threw me off, because all this time I had thought you meant equiprobability of microstates at every instant, independent of the microstate at the former instant. Forgive me for the misunderstanding.
I am perfectly fine allowing for equiprobability of the initial microstate of a toy thermodynamic system. Now we can return to the main question: is the second law of thermodynamics routinely violated? Also, which version is violated and why?
So we shouldn’t judge a book by its cover? Because balloons sound pretty thermodynamic. Automatically mistrust any text that doesn’t come in plain buckram?
Boltzmann introduced a “molecular chaos” hypothesis to deal with entropy production in time-reversible systems. You need some ergodic condition like that to prove his H-theorem, or a fluctuation theorem.
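For reference, a standard statement of such a theorem (my paraphrase of the usual Evans–Searles form, not a quote from anyone here): if Σ[SUB]t[/SUB] denotes the entropy production averaged over a time t, then

P(Σ[SUB]t[/SUB] = A) / P(Σ[SUB]t[/SUB] = −A) = e[SUP]At[/SUP]

so entropy-decreasing trajectories aren’t forbidden, just exponentially suppressed as the averaging time grows.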
I guess you’re right. I remember knowing things like how to read a thermometer, temperature conversion, the three states of matter, melting and boiling points and pressure, conduction vs convection vs radiation, and somehow the parts and process of a four-stroke engine. But I don’t remember if that came from class or a video game called Genius Physics.
Maybe, but L&L there basically say that they’ll ignore downward fluctuations because they’re impossible to observe, and formulate the second law on that basis; this isn’t in tension with the claim that, when such rare fluctuations do occur, the second law is violated.
Then tell me how it could be. Tell me how, for a system that’s fully described as ‘five differently colored balls in an urn’ (that is, without adding any ad-hoc explanations, such as ‘maybe the red ball is attracted to my hand’), it could be that the probability of drawing any given color is different from 1/5.
So, are you just choosing Nava’s route of ignoring experimental results you don’t want to accept?
The different formulations are equivalent. If the entropy of a closed system can fluctuate downwards, then a colder body can heat up a hotter one, for example (again, see the experiment I quoted above).
Yes, it’s violated (albeit, for macroscopic systems, ‘routinely’ means, as has been stressed, about once in several lifetimes of the universe); all of its versions are; and the reason is that the second law only applies on a statistical basis.
So, going back a little, is Boltzmann’s own brain an instance of a Boltzmann brain? And everyone else’s brains along with it? That seems a bit too high a probability. So maybe very improbable states are much too common.
Now, of course, we have evolution, powered by an inflow of energy and acting as an invisible hand directing matter towards preferred states. However, this has always seemed to me a weaselly explanation: the inflow of energy is necessary, but by far not sufficient. So, my question is a reframing of the Fermi paradox: is the start of an evolutionary chain a low-entropy random thermodynamic fluctuation? And, taking into account the self-sustaining and self-amplifying character of this low-entropy fluctuation, and supposing we could look (really) far into the future, should this have any effect on the homogeneity of the (very) future universe?
Maybe we can start simpler. Let’s say that a quirky random fluctuation somewhere near an energy source (a star?) creates something much simpler, like a Maxwell’s demon, or just an air conditioner. What happens then to the probability distributions over time in that portion of space?
There are many interesting ideas around thermodynamics and the origin of life; but, for the most part, they actually center on life being a very efficient way to increase (overall) entropy, and thus being thermodynamically favored; see e.g. here.
I sense the sticking point here is the notion of ‘random’. These downward fluctuations are random with respect to the macrostate, in the sense that knowing the macrostate does not allow you to predict that a downward fluctuation will occur; but they’re completely deterministic on the level of the microstate. Picture something like a gas evenly spread out in a box, with all the gas molecules’ velocities aimed towards the center: on the macroscopic level, taking the gas to be described by its volume, temperature, and pressure, you see nothing but an equilibrium, with no indication that anything’s going to happen other than its remaining in equilibrium; but in fact, the gas molecules will bunch up at the center, thus decreasing entropy.
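To make that concrete, here’s a minimal simulation sketch of exactly this setup (my own illustration, not anyone’s quoted code; the particle count, box size, and time step are arbitrary): non-interacting point particles, uniformly spread in a 2D box, every velocity aimed at the center. A coarse-grained entropy, computed from the occupancy of spatial cells, dips as the particles bunch up and then climbs back as they pass through and spread out again:

[code]
import numpy as np

rng = np.random.default_rng(0)
N, L, steps, dt = 10_000, 1.0, 400, 0.002   # arbitrary illustration parameters

pos = rng.uniform(0, L, size=(N, 2))        # evenly spread out: looks like equilibrium
vel = (L / 2) - pos                         # every velocity points at the center
vel /= np.linalg.norm(vel, axis=1, keepdims=True)   # give every particle unit speed

def coarse_entropy(pos, bins=10):
    # Shannon entropy of the coarse-grained occupancy of a bins x bins grid of cells
    counts, _, _ = np.histogram2d(pos[:, 0], pos[:, 1], bins=bins,
                                  range=[[0, L], [0, L]])
    prob = counts.ravel() / N
    prob = prob[prob > 0]
    return -(prob * np.log(prob)).sum()

for t in range(steps + 1):
    if t % 50 == 0:
        print(f"t = {t * dt:.2f}  S = {coarse_entropy(pos):.3f}")
    pos += vel * dt
    # specular reflection at the walls
    low, high = pos < 0, pos > L
    pos[low] *= -1
    pos[high] = 2 * L - pos[high]
    vel[low | high] *= -1
[/code]

(With only 10,000 particles the dip is large; for a mole of gas the same alignment is just as possible, but absurdly improbable to arise by chance.)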
The whole is the whole table, because the thermodynamic system under discussion is the whole table. Both the legs of the table and their individual constituent atoms are “parts” of the whole table. Individual atoms could also be “parts” of the legs of the table, where the leg of the table is a sub-system of the whole table. I’m willing to drop this and go on using your definitions now that you have clarified them.
That is confusing - I know what I meant but let me rephrase:
When dealing with a thermodynamic system, I thought properties of the system must represent the system as a whole, dependent on the microscopic state but not necessarily on a specific microscopic state.
You cannot measure half of a thermodynamic system without defining a boundary between the one half and the other. The boundary need not have impermeable walls and it need not be isolated. It could be an imaginary line through the middle of the table. As soon as you define the boundary, you have defined a fully functional “mid-level” thermodynamic sub-system with its own set of thermodynamic state variables, including temperature and entropy and yes, the number of particles.
I say mid-level because our frame of reference is the table as a whole system. The fundamental level would be atomic*, or at least however specific we wish to go. Surely you can understand my description of systems between the frame of reference and the fundamental level as mid-level abstractions, or sub-systems?
*atomic as in indivisible, not necessarily in the sense of atomic theory
By this definition, every macroscopic property belongs to a thermodynamic system. “Macroscopic property” is the same as saying “state variable”, as I learned thermodynamics.
Therefore, the number of particles on the left side of the table is not a macroscopic property of the table in aggregate, but a macroscopic property of the left side of the table in aggregate. The reason is that determining which particles are on the left side of the table depends on the microscopic positions of those particles. If you can only measure the system as a whole, you cannot observe which particles are on the left side, and therefore that count is not a coarse-grained observable.
If you fail to admit the left side of the table as a thermodynamic system, then by your own definitions there is no such property as “the number of particles on the left side of the table”.
The point is that a macrostate is a set of quantities—however defined—that give you information about the aggregate properties of a system. (number of particles in the left half, number of particles in the right half) is a perfectly good description of the system.
If two urns contain five balls each, and each urn has exactly one red ball, the probability of randomly picking red from the second urn is 20%.
The probability of randomly picking red from the second urn every time for n trials is 20%[SUP]n[/SUP]; with 10 trials that’s 1.024x10[SUP]-5[/SUP]%. The probability of picking red exactly half of the time follows the binomial distribution: C(n, n/2)·20%[SUP]n/2[/SUP]·80%[SUP]n/2[/SUP]. The probability of picking red from the second urn in at least 500 out of 1,000 trials works out to about 4x10[SUP]-97[/SUP]%. Improbable. Unlikely. But by definition, not impossible.
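For anyone who wants to check those figures, here’s a quick sketch of mine (assumes Python 3.8+ for math.comb; it uses exact Fraction arithmetic because 20%[SUP]500[/SUP] underflows ordinary floats):

[code]
from math import comb
from fractions import Fraction

p = Fraction(1, 5)      # one red ball among five

# red on all 10 draws (with replacement): 1.024e-07, i.e. 1.024x10^-5 %
print(float(p ** 10))

# red on at least 500 of 1,000 draws: exact binomial tail, converted at the end
tail = sum(comb(1000, k) * p**k * (1 - p)**(1000 - k) for k in range(500, 1001))
print(float(tail))      # roughly 4e-99, i.e. about 4x10^-97 %
[/code]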
Improbable things happen all the time, just not on purpose. I’m not sure if that answers your question, or where you are going with this.
But you must understand that changes in the number of particles on the left half of the table do not constitute a change in the aggregate temperature, heat, volume, pressure, number of (aggregate) particles, or classical entropy of the table.
Therefore the only form of the second law of thermodynamics that could possibly be violated when a particle moves within an isolated system is the entropy formulation using a statistical definition of entropy. I’m not sure how anyone justified that formulation to begin with.
That makes sense, but only when using the statistical definition of entropy. None of the versions of the second law of thermodynamics in the original post are contradicted by such a gas. Those laws are absolute and apply at every level of detail.
It seems that statistical mechanics is a useful but imprecise abstraction of the underlying classical reality.
The formulation is justified because what’s postulated based on empirical observations in thermodynamics can be derived as theorems valid in the statistical limit from statistical mechanics. Additionally, statistical mechanics predicts entropy fluctuating downwards, and heat flowing from a colder to a hotter body, with a certain probability. These predictions are empirically confirmed. Hence, the earlier formulations are merely approximations of the more fundamental statistical notions.
Statistical mechanics is the more fundamental theory, and explains the empirical findings of thermodynamics, completing the theory at the microscopic level.
The equivalence of the classical thermodynamics form of the second law and the statistical mechanics form really isn’t hard to see. Consider a gas that’s a mixture of hot and cold (fast and slow) molecules (in reality, of course, it will contain particles with a continuous Maxwell–Boltzmann distribution of velocities, but let’s idealize here). Now, there are more microstates realizing the configuration where both sorts of particles are evenly distributed than there are microstates realizing the configuration where all the ‘hot’ particles are on the left side and all the ‘cold’ particles are on the right side. All the cold particles moving right, and all the hot particles moving left, is therefore a reduction in entropy—both statistical (since we’re going from a macrostate with many microscopic realizations to one with fewer) and in terms of heat transfer (since heat flows from a colder system to a hotter one).
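To put rough numbers on “more microstates”, here’s a toy count of my own (not from anyone’s post): N hot and N cold molecules, each independently on the left or right side; a macrostate (h, c) = (hot on the left, cold on the left) is realized by C(N, h)·C(N, c) microstates:

[code]
from math import comb

N = 50                                   # 50 'hot' and 50 'cold' molecules
fully_sorted = comb(N, N) * comb(N, 0)   # all hot on the left, all cold on the right
evenly_mixed = comb(N, N // 2) ** 2      # half of each kind on each side

print(fully_sorted)                      # exactly 1 microstate
print(float(evenly_mixed))               # ~1.6e28 microstates
[/code]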
As you may recall, this is the setup of Maxwell’s demon: he sits at the boundary between both sides, sorting hot particles to one side and cold particles to the other. The trick is now simply that we don’t need the demon at all: the whole thing can happen purely by chance, should all the molecules’ velocities align in the right way; which they will, for generic initial conditions, after you’ve waited long enough.
That’s flat wrong. Take a room of gas and pile up all the molecules in the left corner; this is an isolated system, and what’s gonna happen is that the gas will soon spread out until it fills the whole room, which is the state of maximum entropy for the system.
Simple: there are more ways to increase the entropy than to decrease it; thus, any given change is more likely to increase it, and hence, on average, the entropy will increase.
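Since we’ve been drawing balls from urns anyway, the Ehrenfest urn model makes this counting argument runnable (a sketch of my own, with arbitrary parameters): 2N balls in two urns, and each step one uniformly chosen ball switches urns. Starting from the all-in-one-urn macrostate, the Boltzmann-style entropy S = ln C(2N, n[SUB]left[/SUB]) almost always climbs toward its maximum and then just jitters around it:

[code]
import math, random

random.seed(1)
TWO_N = 100                # total number of balls
n_left = TWO_N             # start from the least probable macrostate: all balls left

def entropy(n):
    # Boltzmann-style entropy of the macrostate 'n balls in the left urn'
    return math.log(math.comb(TWO_N, n))

for step in range(1, 501):
    # pick one ball uniformly at random and move it to the other urn;
    # the picked ball is in the left urn with probability n_left / TWO_N
    if random.random() < n_left / TWO_N:
        n_left -= 1
    else:
        n_left += 1
    if step % 50 == 0:
        print(f"step {step:3d}: n_left = {n_left:3d}, S = {entropy(n_left):.2f}")
[/code]

Run it with a few different seeds: at this small scale the entropy wiggles downward all the time, which is exactly the “routine violation” under discussion.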
How is it derived, and from what? And what is the difference between deriving and justifying? It seems to me you must either uphold the fundamental axiom of equiprobability of the microstate at every instant regardless of the previous state, thereby severing causality and reducing all physical laws to “very likely”, or you must admit that statistical mechanics is but an approximation of an underlying reality.
And if statistical mechanics predicts a violation of the classical second law of thermodynamics, I would like to hear an explanation.
Neither. The statistical part comes in because we have incomplete information about the underlying microstate, and thus, can’t make certain predictions. It is, in the end, not in any way different or more complicated than assigning a probability of 1/6 to a thrown die showing any given number.
Perhaps as an analogy: thermodynamics tells you, after having observed lots of dice throws, that the die shows each number 1/6 of the time; statistical mechanics models the way the die gets thrown and its possible trajectories, and derives that it’s gonna come up with each number once in six throws, on average.
See my first post in this thread. Also, the post I made before this one. And in fact pretty much every post in between.