

#51




Quote:
So far I can accept that fundamental premise as a heuristic, but not as an absolute physical law. The things you said would follow if the premise were false don't seem so certain to me; see posts #13 and #14 and the related discussions, which are ongoing. I will add all of those cites to my reading list, but I can't guarantee when I will get around to reading them. And as always, I will admit my ignorance of modern or even older sciences in an instant. ~Max 
#52




Quote:
Specifically, you should be able to measure a system property without ascertaining the internal state. Would you agree? Quote:
~Max Last edited by Max S.; 05-01-2019 at 10:29 AM. Reason: fixed links 
#53




Quote:
I propose redefining macroscopic to mean "of the whole" and microscopic to mean "of the constituent parts". That way the answers are clear-cut. ~Max 
#54




Quote:
You can use the statistical definition of entropy and claim the entropy formulation of the second law of thermodynamics is a law. But this is setting yourself up for failure: the statistical definition of entropy depends on a fundamental premise of equiprobable microstates. That same premise is strictly incompatible with the second law of thermodynamics in any form, including the entropy form. ~Max 


#55




Quote:
We can consider this as a physical system with five states available to it. These states are equiprobable, not because that's a physical law, but because there's no law, nor any other cause, that would make them not be. That's what I'm trying to get at with the billiard balls. Barring special initial conditions, such as carefully arranged periodic paths, which would in themselves be something in need of explanation, whenever you look at the table, for a generic starting configuration (that is, balls set in motion with arbitrary initial velocities), you're equally likely to find it in any of the possible microstates available to it. If you want to set it up differently, you'd have to add some further element: either carefully set up initial conditions, or further forces acting on the balls to prohibit certain states or change their likelihood, and so on. But with just billiard balls bouncing off of each other, you can be certain you'll observe a decrease in entropy eventually. Quote:
Thermodynamic quantities are aggregate observables, which tell you something about the state of the system, but not all you could know. It's a statistical description if you have limited knowledge about the system. Quote:
Quote:
Quote:
I propose redefining macroscopic to mean "of the whole" and microscopic to mean "of the constituent parts". That way the answers are clear-cut. ~Max
Neither 'whole' nor 'parts' has any objective meaning, though. Is the whole the whole table? If so, are its parts the legs and the top, or the atoms that constitute them? If it's the atoms, is the whole they form the leg they're part of, or the table the leg is a part of? Quote:
Quote:

#56




Quote:
It is fine in philosophy to say reality is fundamentally based on heuristics rather than absolute physics. You can build consistent physics on top of heuristics, but the tradeoff is predictive value at the lowest levels. Causality and scientific laws (including laws of conservation) are reduced to heuristics, because at the fundamental level changes in state are random. On the other hand, you can no longer support strong physicalism, the view that all phenomena can be explained by physics. To do so you need to contort the word "explained" to include pointing to this axiom that says it's fundamentally random. You can do that, but it's counterintuitive and leads to... interesting philosophical conclusions. Is that consistent with your position? With the positions of all those scientists? I thought you advocated a form of strong physicalism in the dualism thread. ~Max 
#57




@ Max — I've been trying to follow the conversation, but with growing confusion.
Assuming you thought I knew what I was talking about, would that sentence change your viewpoint? 
#58




Quote:
But anyway. What's the law that makes it such that the probability of five colored marbles in an urn comes out 1/5 for each color? Nothing beyond their mere number. Likewise with the microstates. You essentially want to say that there needs to be a separate law dictating these probabilities, and that they could just as well come out another way. But anyway. I've actually shown you how the equiprobability can be derived. For an ensemble of billiard balls, for generic initial conditions (i.e. initial conditions not set up to lead to special, periodic motions, which form a measure-zero subset of all possible initial conditions), it will be the case that the time spent in any possible microstate will be the same. Hence, there's an even chance of finding the system in any given microstate. And that means that entropy will tend to increase if you start in some initial state, but may also decrease, with very low probability. That probability can be calculated via the fluctuation theorem: Quote:
Quote:
Quote:
By the way, is your username just a happy accident, or are you 'Max S.' exactly because of your belief that entropy always only tends to the max? Last edited by Half Man Half Wit; 05-01-2019 at 04:11 PM. 
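The "equal time in every microstate" claim can be illustrated with a deliberately simple deterministic system. This is only a toy sketch, not a billiard simulation: it uses rotation of the unit interval by an irrational angle, whose orbits (by Weyl's equidistribution theorem) spend equal time in equal-sized cells for generic starting points, with no probabilistic law put in by hand.

```python
# Deterministic toy dynamics: rotation of the unit interval by an
# irrational angle, a stand-in for "generic" billiard motion.
# Weyl's equidistribution theorem guarantees the orbit spends equal
# time in each of the five equal cells; no probabilistic law is
# imposed anywhere.
PHI = (5 ** 0.5 - 1) / 2   # irrational step size
x = 0.123                  # arbitrary (generic) starting point
steps = 100_000
counts = [0] * 5
for _ in range(steps):
    x = (x + PHI) % 1.0
    counts[int(x * 5)] += 1

fractions = [c / steps for c in counts]  # each tends to 1/5
```

Nothing in the dynamics refers to probability; the 1/5 occupation frequencies emerge from the dynamics alone.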
#59




Quote:
At any rate, I think it is quantitatively clear what is going on, especially for the classical systems statistically studied by Maxwell, Boltzmann, Gibbs, et al. 


#60




Quote:
I can't wrap my head around the equiprobability of microstates, plus I was using other definitions of the second law of thermodynamics. So I haven't reached that conclusion yet. ~Max 
#61




Even if 2nd law reversals happen, it's still not possible to use these to do work, because even if you had some mechanism that could supposedly exploit such reversals, it's equally possible that random fluctuations could undo any temporary gains. It's sort of like blackjack: you can get a small temporary reversal against the dealer but in the long run you always lose.
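The blackjack analogy can be sketched with a toy simulation (the 49% win probability here is a made-up illustrative number, not a real blackjack edge): short runs often end in profit, but the long-run drift is negative.

```python
import random

# Toy sketch of the blackjack analogy: each round wins +1 with
# probability 0.49 and loses -1 otherwise (made-up numbers).
# Short runs can show temporary gains; long runs drift downward.
random.seed(1)

def play(rounds):
    wealth = 0
    for _ in range(rounds):
        wealth += 1 if random.random() < 0.49 else -1
    return wealth

short_runs = [play(10) for _ in range(1000)]
long_runs = [play(10_000) for _ in range(50)]

some_short_gain = any(w > 0 for w in short_runs)   # reversals do happen
mean_long = sum(long_runs) / len(long_runs)        # drift ~ -0.02 per round
```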

#62




Quote:
~Max 
#63




Quote:
The laws of probability are only useful for prediction up to a point: if the only constraint is probability, it is theoretically possible for every coin toss to come up tails forever. ~Max 
#64




Quote:
If so, let's say the billiard table starts out with an initial microstate that would not classically lead to periodic motion. The distribution of billiard balls is somewhat homogeneous. Step forward in time one moment. Surely there is a chance that the arrangement of billiard balls is now a special microstate which would classically lead to periodic motion. Even if this isn't the state now, at some point you would expect the arrangement of billiard balls to reach such a special microstate. The reverse holds true: one moment the system can have a special microstate and the next moment it might not.

This is where I find the disconnect between statistical mechanics and causality. The effect is that all the "dynamic" laws of physics mentioned by septimus are reduced to heuristics. There is a chance, however minuscule, that the eight ball will randomly jump to the opposite side of the table faster than the classical speed of light in a vacuum.

If not, then only the initial microstate of the table is random, and then only for the purposes of the hypothetical. From there on out everything follows classical laws, and the second law of thermodynamics as classically defined is never violated. We would still use statistical mechanics to describe systems where the internal state cannot be determined, but that is just a heuristic. The underlying reality would be classical. ~Max 


#65




Yes, the classical billiards studied by Boltzmann exhibit Poincaré recurrence, which may seem surprising but goes to show one needs to be careful how to formulate and interpret the second law of thermodynamics for a finite number of particles, as discussed above, and how it technically arises, as in Boltzmann's H-theorem.

#66




Quote:
Quote:
The probabilities in statistical mechanics stem from ignorance. We don't have full knowledge of the microstate, and thus, we don't have full control over it. Setting a large enough system up always means drawing a certain microstate from a distribution consistent with the macroscopic variables we can control. And, for virtually all (all but an impossibly tiny subset) of these initial states, if we wait long enough, we're gonna see entropy dipping down. As has in fact happened in the laboratory. If you throw a die, it'll end up showing either number randomly with probability 1/6. If you trust that's what's going to happen, you should not have any trouble accepting the equiprobability of microstates, because it's exactly that. If the die fails to behave that way, at least in the limit of large numbers of throws, you'll call it unfair; you'll assume it has been tampered with in some way. That you're justified in concluding this is exactly because anything other than equiprobability would require some special intervention. Quote:
Quote:

#67




Quote:
The reason we can't use entropy reversals to do useful work is because the chance of a thimbleful of water spontaneously developing a 1°C temperature gradient is probably such that you'd need to wait many times the universe's current age to get even odds (I haven't done the math though). But were such a fluctuation to happen, there's no reason it needs to behave any differently to any other heterogeneous material at that point. No reason why the system should "know" it needs to leap back to some former state, and so no likely reason you couldn't do work. Sent from my Redmi 5A using Tapatalk Last edited by Mijin; 05-02-2019 at 02:01 AM. 
#68




And this sort of thing, of course, routinely happens in the right circumstances. As per the above-cited wiki article on the fluctuation theorem:
Quote:

#69




Quote:



#70




Quote:
On another note: Max S., what do you make of papers like this one? Experimental Demonstration of Violations of the Second Law of Thermodynamics for Small Systems and Short Time Scales Do you think the experimental methodology was suspect? Are they misinterpreting their results? 
#71




I haven't looked at the paper, but my 20€ are on "improperly isolated systems".

#72




So how do I collect?

#73




First, you prove that the systems are properly defined and truly separated from any contact with the rest of the universe; then we talk ISBNs.
__________________
Evidence gathered through the use of science is easily dismissed through the use of idiocy.  Czarcasm. 
#74




That paper purports to demonstrate experimentally the applicability of the Fluctuation Theorem, which, like Boltzmann's H-theorem, is a mathematical theorem relying on certain assumptions. It is relevant to the entropy production rate of certain classical dynamical systems, which is actually interesting to put to experimental test because, we should not forget, all this mathematical discussion of idealized billiard balls and similar is supposed to have measurable and testable implications for real physical systems, like the Second Law of thermodynamics, the subject of this thread.
Back to the Fluctuation Theorem: it quantifies, for a system (satisfying certain assumptions...) that is not at equilibrium, so that its entropy could either increase or decrease, the ratio between the probabilities of positive and negative entropy production over any interval of time. In particular, it could happen (though this is only at all likely in a small volume and over a short interval) that the observed entropy production is negative. However, note that the average entropy production, no matter how small the volume or how short the observation time, is always positive. In the authors' own words, "the [Fluctuation Theorem]... is completely consistent with the Second Law of Thermodynamics, and it can be considered a generalization of this Law to small systems observed for short periods of time." So, while one should of course critically examine the experimental setup (which is described in the paper, involving 100 latex particles in a 1.0 ml glass cell full of water), the reported results are completely consistent with the mathematical model, precisely the same type of mathematical model considered and used by Boltzmann and others to derive the Second Law of thermodynamics (which, again, is in fact observed in real life; if it is "routinely violated" then I have overlooked something in this discussion). 
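For concreteness, the quantitative content of the steady-state form of the theorem is often written as P(Σ = +A)/P(Σ = −A) = e^A, with entropy production A measured in units of Boltzmann's constant. This is a hedged sketch of that ratio, not the paper's exact statement (conventions differ between versions of the theorem):

```python
import math

# Sketch of the steady-state fluctuation theorem's ratio,
#   P(sigma = +A) / P(sigma = -A) = exp(A),
# with entropy production A in units of Boltzmann's constant.
def ft_ratio(A):
    return math.exp(A)

small = ft_ratio(0.1)    # near 1: negative fluctuations nearly as likely
large = ft_ratio(100.0)  # astronomically large: decreases effectively never seen
```

For tiny A (small systems, short times) the ratio is close to 1, so negative entropy production is observed routinely; for macroscopic A it is hopelessly suppressed, which is exactly why the Second Law looks inviolable at everyday scales.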


#75




Before I cough up the dough: the system consists of latex powder in water in a glass cell being zapped with a laser and observed through a microscope. In their computer simulation, particles along the walls act as a momentum and heat source/sink. So it depends what you mean by "isolated"... (cf. the different possible "ensembles" in statistical mechanics)

#76




Quote:
Besides, it's 'just' an experimental confirmation of a well-studied theoretical result, which came out just as theory predicted. To have either the theory wrong, and the experiment as well, just in the right way to 'confirm' the theoretical prediction, or to have the experiment mistakenly show the predicted theoretical effect, would both be considerably less parsimonious explanations than just accepting that the experiment simply observed what it was set up to observe (and again, to assign validity to such rebuttals against scientific work would likewise require assigning validity to all manner of creationist, climate-skeptic, and other quackery).

DPRK, I think we have an issue of terminology here, so let's try to sort that out. The story I was taught is as follows. In studying the theoretical capacity of heat engines, people like Carnot and Clausius came up with certain laws bounding their efficiency, like the second law of thermodynamics. This can be stated in the form 'for an isolated system, entropy can never decrease'. This was mainly based on empirical evidence. Later on, Boltzmann, Gibbs, Maxwell and so on formulated kinetic theory, which was able to derive the earlier laws as statistical predictions, valid on average, but not, as was thought previously, inviolable. Entropy can, in fact, spontaneously decrease.

This, to me, is all that is meant by a violation of the second law: it was thought to be an exact, universally applicable law, but turned out to be merely statistical. (Which, perhaps counterintuitively, is exactly what makes it so virtually inviolable: I can imagine universes where things move faster than light, where there's no quantum theory, no weak or strong force, or what have you; but I can't imagine one in which there's no second law, because fundamentally, it just boils down to the fact that more likely things happen more often.)
I think you agree with me that entropy, even in a closed system, may spontaneously decrease, solely by virtue of the effectively random microdynamics. However, you don't want to call that a violation of the second law. Hence, I repeat my earlier question: do you appeal to some different formulation of the second law in that case, one that refers perhaps to the average entropy production? 
#77




Quote:

#78




Quote:
Quote:
Last edited by DPRK; 05-02-2019 at 03:52 PM. 
#79




Quote:
~Max 


#80




Quote:
You can't build absolute laws on probability, and at least three formulations of the second law of thermodynamics are absolute laws: the three in the original post, plus the entropy corollary using the classical definition of entropy. I have yet to be convinced that any of these laws are contradicted by theory or observation. The only law that might be violated is the entropy version, using the statistical definition of entropy. But it would seem the original proof of that formulation relied on both Clausius's version of the second law and a classical definition of entropy. So I'm not sure the statistical-entropy version of the second law was ever a "law" to begin with. ~Max 
#81




Quote:
I am perfectly fine allowing for equiprobability of the initial microstate of a toy thermodynamic system. Now we can return to the main question: is the second law of thermodynamics routinely violated? Also, which version is violated and why? ~Max 
#82




So we shouldn't judge a book by its cover? Because balloons sound pretty thermodynamic. Automatically mistrust any text that doesn't come in plain buckram?

#83




Quote:

#84




Quote:
~Max 


#85




Quote:
Then tell me how it could be. Tell me how, for a system that's fully described as 'five differently colored balls in an urn' (that is, without adding any ad hoc explanations, such as 'maybe the red ball is attracted to my hand'), it could be that the probability of drawing any given color is different from 1/5. Quote:
Quote:
Quote:
Last edited by Half Man Half Wit; 05-02-2019 at 11:59 PM. 
#86




So, going back a little, is Boltzmann's own brain an instance of a Boltzmann brain? And together with everyone else's brains? Seems a bit too high a probability. So maybe very improbable states are much too common.
Now, of course, we have evolution, powered by an inflow of energy and acting as an invisible hand directing matter towards preferred states. However, this always seemed to me a weaselly explanation: the inflow of energy is necessary, but by far not sufficient. So, my question is a reframing of the Fermi paradox: is the start of an evolutionary chain a thermodynamic low-entropy random fluctuation? And taking into account the self-sustaining and self-amplifying character of this low-entropy fluctuation, and supposing we could look (really) far away into the future, should this have any effect on the homogeneity of the (very) future universe? Maybe we can start simpler. Let's say that a quirky random fluctuation somewhere near an energy source (a star?) creates something much simpler, like a Maxwell's demon, or just an air conditioner. What happens then with probability distributions over time in that portion of space? 
#87




Does such a thing exist without assuming equiprobability of microstates at every instant, without regard to previous microstates?
~Max 
#88




Quote:
I sense the sticking point here is the notion of 'random'. These downward fluctuations are random with respect to the macrostate, in the sense that knowing the macrostate does not allow you to predict that a downward fluctuation will occur; but they're completely deterministic on the level of the microstate. Picture something like a gas evenly spread out in a box, with all the gas molecules' velocities aimed towards the center: from the macroscopic level, taking the gas as being described by its volume, temperature, and pressure, you see nothing but an equilibrium, with no indication that anything's going to happen but it remaining in equilibrium; but in fact, the gas molecules will bunch up at the center, thus decreasing entropy. 
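A minimal one-dimensional sketch of that picture (assuming ballistic, non-interacting particles, which is a drastic simplification): the positions look spread out, but the hand-picked inward velocities deterministically shrink the spread.

```python
import random

# 1-D toy of the "gas aimed at the center" example: positions are
# spread out (an equilibrium-looking macrostate), but every velocity
# points at x = 0, so the spatial spread (a crude proxy for entropy)
# shrinks deterministically, with no randomness in the dynamics.
random.seed(0)
n = 1000
pos = [random.uniform(-1.0, 1.0) for _ in range(n)]
vel = [-0.5 if x > 0 else 0.5 for x in pos]   # everyone heads inward

def spread(points):
    mean = sum(points) / len(points)
    return sum((p - mean) ** 2 for p in points) / len(points)

before = spread(pos)                               # ~1/3 for uniform on [-1, 1]
after = spread([x + v for x, v in zip(pos, vel)])  # ~1/12 one time unit later
```

Knowing only the macroscopic spread, nothing signals the coming contraction; it is encoded entirely in the microstate.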
#89




Macroscopic vs. microscopic
Quote:
~Max 


#90




Is the count of particles on the left half of the table a macroscopic property of the whole table?
Quote:
When dealing with a thermodynamic system, I thought properties of the system must represent the system as a whole, dependent on the microscopic state but not necessarily on a specific microscopic state. Quote:
I say mid-level because our frame of reference is the table as a whole system. The fundamental level would be atomic*, or at least however specific we wish to go. Surely you can understand my description of systems between the frame of reference and the fundamental level as mid-level abstractions, or subsystems? * atomic as in indivisible, not necessarily atomic theory Quote:
Therefore, the number of particles on the left side of the table is not a macroscopic property of the table in aggregate, but a macroscopic property of the left side of the table in aggregate. The reason for this is that determining which particles are on the left side of the table depends on the microscopic positions of those particles. If you can only measure the system as a whole, you cannot observe which particles are on the left side, and therefore that count is not a coarse-grained observable. If you fail to admit the left side of the table as a thermodynamic system, then by your own definitions there is no such property as "the number of particles on the left side of the table". ~Max 
#91




The point is that a macrostate is a set of quantities, however defined, that give you information about the aggregate properties of a system. (Number of particles in the left half, number of particles in the right half) is a perfectly good description of the system.
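For a toy system of four labeled particles, that macrostate description can be enumerated exhaustively; each macrostate (n_left, n_right) collects a binomial number of microstates:

```python
from itertools import product
from math import comb

# Exhaustive toy enumeration: macrostate = (particles in the left
# half, particles in the right half); microstate = which labeled
# particle is where. For N = 4, the 2**4 = 16 microstates fall into
# 5 macrostates with binomial multiplicities C(N, k).
N = 4
multiplicity = {}
for micro in product("LR", repeat=N):
    k = micro.count("L")
    macro = (k, N - k)
    multiplicity[macro] = multiplicity.get(macro, 0) + 1
```

The even split (2, 2) has the most microscopic realizations (six of sixteen), which is the counting fact the entropy argument rests on.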

#92




Improbable vs. Impossible
Quote:
The probability of randomly picking red from the second urn every time for n trials is 20%^{n}; with 10 trials, that's 1.024x10^{-5}%. The probability of picking red from the second urn exactly half of the time in n trials is C(n, n/2) x 20%^{n/2} x 80%^{n/2}, which for 10 trials comes to about 2.6%. The probability of picking red from the second urn in at least 500 out of 1,000 trials is on the order of 10^{-97}%. Improbable. Unlikely. But by definition, not impossible. Improbable things happen all the time, just not on purpose. I'm not sure if that answers your question, or where you are going with this. ~Max 
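These figures can be checked exactly. Note that the "half of the time" case needs the binomial coefficient and the 0.8 factors for the non-red draws; 20%^{n/2} alone is not enough. A quick sketch using exact rational arithmetic:

```python
from fractions import Fraction
from math import comb

# Exact urn arithmetic with P(red) = 1/5 on each independent draw.
p = Fraction(1, 5)

# Red on every one of 10 draws:
p_all = p ** 10   # 1/9765625, about 1.024e-7, i.e. 1.024e-5 percent

# Red on exactly 5 of 10 draws: binomial coefficient plus the
# (4/5) factors for the five non-red draws.
p_half = comb(10, 5) * p ** 5 * (1 - p) ** 5   # about 0.0264, i.e. 2.6 percent

# Red on at least 500 of 1000 draws: tiny, but strictly positive.
n = 1000
p_tail = sum(comb(n, i) * p ** i * (1 - p) ** (n - i)
             for i in range(500, n + 1))
```

Using `Fraction` keeps every value exact, so the tail probability comes out as a genuine positive rational rather than underflowing to zero: improbable, not impossible.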
#93




Quote:
Therefore the only form of the second law of thermodynamics that could possibly be violated when a particle moves within an isolated system is the entropy formulation using a statistical definition of entropy. I'm not sure how anyone justified that formulation to begin with. ~Max 
#94




Quote:
It seems that statistical mechanics is a useful but imprecise abstraction of the underlying classical reality. ~Max Last edited by Max S.; 05-03-2019 at 12:54 PM. 


#95




Quote:
Quote:
Statistical mechanics is the more fundamental theory, and explains the empirical findings of thermodynamics, completing the theory at the microscopic level. Last edited by Half Man Half Wit; 05-03-2019 at 01:08 PM. 
#96




Quote:
I still don't see how you can get to "in an isolated system, entropy tends to increase over time" from the statistical definition of entropy. ~Max 
#97




The equivalence of the classical thermodynamics form of the second law and the statistical mechanics form really isn't hard to see. Consider a gas that's a mixture of hot and cold, fast and slow, molecules (it will, of course, in reality contain particles with a continuous Maxwell-Boltzmann distribution of velocities, but let's idealize here). Now, there are more microstates realizing the configuration where both sorts of particles are evenly distributed than there are microstates realizing the configuration where all the 'hot' particles are on the left side, and all the 'cold' particles are on the right side. All the cold particles moving right, and the hot particles moving left, is therefore a reduction in entropy, both statistical (since we're going from a macrostate with many microscopic realizations to one with fewer) and in terms of heat transfer (since heat flows from a colder system to a hotter one).
As you may recall, this is the setup of Maxwell's demon: he sits at the boundary between both sides, sorting hot particles to one, and cold particles to the other side. The trick is now simply that we don't need the demon at all: the whole thing can happen purely by chance, should all the molecules' velocities align in the right way; which they will, for generic initial conditions, after you've waited long enough. 
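The counting behind that argument can be made concrete with a toy model (my own simplification, not a full kinetic-theory calculation): N hot and N cold molecules, with exactly N particles in each half of the box. The number of microstates with k hot particles on the left is Omega(k) = C(N, k) x C(N, N-k), and the Boltzmann entropy S = ln(Omega), in units of Boltzmann's constant, is far lower for the fully sorted state than for the even mixture.

```python
from math import comb, log

# Toy count for the hot/cold gas: N hot and N cold molecules, with
# exactly N particles in each half of the box. Omega(k) = number of
# microstates with k hot particles (hence N - k cold ones) on the left.
N = 50

def omega(k):
    return comb(N, k) * comb(N, N - k)

# Boltzmann entropy S = ln(Omega), in units of Boltzmann's constant:
S_mixed = log(omega(N // 2))   # evenly mixed: very many realizations
S_sorted = log(omega(N))       # all hot left, all cold right: exactly one
```

With N = 50 the sorted state has a single realization (S = 0), while the even mixture has entropy around 65; the "demon-free" fluctuation HMHW describes is a spontaneous jump from the huge class to the singleton one.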
#98




Quote:
Quote:

#99




Quote:
And if statistical mechanics predicts a violation of the classical second law of thermodynamics, I would like to hear an explanation. ~Max 


#100




Quote:
Perhaps as an analogy: thermodynamics tells you, after having observed lots of dice throws, that the die shows each number 1/6 of the time; statistical mechanics models the way the die gets thrown, its possible trajectories, and derives that it's gonna come up with every number once in six throws. Quote:


