Life reverses local entropy; does gravity also do this, and is there a loss of entropy for falling into a gravity well?
It’s in the inherent nature of life that it must create suitably organized structures in order to exist and to thrive. In that sense it can be said to “reverse local entropy” (while also increasing universal entropy) although I don’t think introducing the concept of entropy in this sense is particularly useful, any more than saying that I “reverse local entropy” by completing a jigsaw puzzle or assembling a model kit.
Gravity in general does no such thing. In certain isolated cases gravity can create more organized local structures, such as by separating liquids according to density or precipitating suspended solids. In general, however, the tendency is for gravity to increase entropy by creating more homogeneous structures, the ultimate example being black holes, thus destroying all previous organization.
I think it would also be appropriate to look at the question in terms of the special meaning of entropy in thermodynamics. Since you asked about life, you seemed to be addressing entropy in the general sense of more ordered vs. less ordered states in macroscopic structures.
In thermodynamics, however, entropy has the more specific meaning of “the measure of a system’s thermal energy per unit temperature that is unavailable for doing useful work”. It’s essentially a measure of the degree of disorder in the molecular organization of a system, from a thermodynamic perspective. Loosely speaking, to do thermodynamic work, heat must transfer from a hotter body to a colder one. The gradient between hot and cold is a state of molecular organization that provides the energy to do work. The second law of thermodynamics basically says that some of this energy, instead of doing work that may reduce entropy, will bleed off into the surroundings, and ultimately any such energy transfer increases the entropy of the universe (temperatures gradually become more uniform).
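To make the bookkeeping concrete (a standard textbook case, not specific to anything above): when a quantity of heat $Q$ flows from a hot reservoir at temperature $T_h$ to a cold one at $T_c$, the total entropy change is

$$
\Delta S \;=\; \frac{Q}{T_c} \;-\; \frac{Q}{T_h} \;>\; 0 \quad \text{whenever } T_h > T_c,
$$

so heat flowing straight down a temperature gradient always increases the entropy of the universe. A heat engine can divert some of that energy into work, but at best (the Carnot limit) it only breaks even on total entropy.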
From a thermodynamic perspective, I think it can be said that gravity can reduce local entropy by aggregating mass and increasing the energy density in that region of space (simply, the mass gets hotter due to compression, impact, etc.). Turbulence of infalling gases can also generate heat. A spectacular example of both is the tremendous energy radiated by accretion disks around black holes. For the same reason, black holes are regarded as potentially great sources of energy if we’re able to figure out a way to harness it.
So I think you actually get quite different answers about gravity depending on whether one defines entropy in the general sense of macroscopic organization, or in the more specific thermodynamic sense.
First, define “local”. Living things are continually seeking out and consuming (relatively) low-entropy food, and putting out high-entropy waste. We’re only decreasing entropy if you draw the boundaries of your system with the waste outside, which requires continually re-drawing the boundaries, and you can make a case that the apparent decrease of entropy comes from that boundary-drawing, not from anything life itself is doing.
Second, a living thing is a system, while gravity is a force. You could certainly have a system that includes gravity that decreases entropy in some places and increases it elsewhere. But that’s not a property of gravity; that’s a property of the entire system.
I’ll let the OP define what he meant by “local entropy” in the context of life, but I took it to mean the entropy within the organism itself.
Not even that, really. When you learn and experience new things, that’s the entropy of your brain increasing.
I don’t think we know nearly enough about how human memory works to be able to say that. For a digital memory device, sure, an empty one containing all zeroes would have a Shannon entropy of zero, which would progressively increase as you added data. I doubt that anything like this happens in the human brain.
The Kolmogorov complexity of a system is directly related to its entropy. So in this case entropy does increase as we gain knowledge. But of course that doesn’t contain any useful commentary on the physical mechanisms involved.
But that’s the crux of the matter at hand. For a digital memory device that contains no information, we know that it is (or equivalently, can be) initialized to all zeroes. In information theory, this gives it a Shannon entropy of zero (perfect predictability). But the brain doesn’t work in terms of a bank of empty cells waiting to be filled with information. Indeed, it’s not even strictly digital, but in many ways more akin to an analog system. As we learn new things, the brain seems to diffuse older memories and push them farther into its back rooms. It’s not at all clear what is happening, overall, to whatever we may mean by the “information content” of our minds.
In any case, I think this is getting a bit away from the point, which was not about entropy in information theory, but seemed more to be about the physical organization of the biological structures of life, and their ability to self-organize into the necessary cellular structures and organs.
Even a simple, classical digital memory device is not going to have an entropy that depends on what is being stored in the memory cells. If there are 64 bits, then the information content is 64 bits. The “reversal of local entropy” is going to come into the picture via the mechanism that keeps the bits safely stored in the face of thermal noise in the environment that naturally tends to randomize all the bits (maximum entropy state). This device will use energy and output waste heat, of course.
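As an aside, there’s a hard thermodynamic floor on that waste heat: Landauer’s principle says that erasing (resetting) one bit at temperature $T$ dissipates at least

$$
E \;\geq\; k_B T \ln 2 \;\approx\; 3 \times 10^{-21}\ \text{J}
$$

at room temperature, which is what connects the “keeping bits stored against thermal noise” picture to thermodynamic entropy in the first place.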
It certainly will in terms of the Shannon entropy of information theory. It may not in terms of physical entropy like thermodynamic entropy.
Thinking about life in terms of entropy has been fairly standard at least since Schrödinger published What is Life? in 1944, and that book is probably the source of the idea that life ‘reverses local entropy’—it assimilates free energy in order to reduce its own entropy, thereby avoiding decay, while increasing entropy in its surroundings. In fact, the resulting entropy increase is a key biosignature used in the search for life on other planets.
Gravitational entropy isn’t increased by greater homogenization, at least in the obvious sense—after all, gravity typically leads to matter clustering up, which is how we get planets, stars, solar systems, galaxies, and galaxy clusters, rather than a uniform matter distribution in the universe. The gravitational potential energy is lowered by greater clumping, so energy must have been released from the system, which leads to a total entropy increase (think about the heat radiated by a collapsing cloud of gas, and the resulting entropy).
No. The memory device contains a single, definite bit string. It would be at a high entropy state if its macrostate was something like ‘as many zeros as ones’, which corresponds to a high number of microstates, but the idea of a memory is essentially to have no ignorance of its microstate at all.
There is no Shannon entropy of a single bit string on its own; a Shannon entropy is associated with a source of bits that produces a zero or one with a given probability. So if you have a coin that comes up heads or tails with 50% probability, you’d have the maximum Shannon entropy (of 1 bit). If the coin isn’t fair, then that entropy decreases; for a deterministic process, it’s 0.
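If it helps to see that computed, here’s a quick sketch (the function name is just mine):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a source with the given outcome probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.47 bits
print(shannon_entropy([1.0]))       # deterministic source: 0.0 bits
```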
So in response to the OP, I suppose one could say that gravity lowers ‘local’ entropy: in the gravitational collapse of a volume of matter, the entropy of that matter will indeed be lowered, if only at the expense of an overall entropy increase due to the radiated energy.
That’s true on certain scales, such as galactic scales and beyond. But gravity also homogenizes on smaller scales simply through aggregation and the reduction of structural order. And when matter collapses into a black hole, the entirety of its structure and properties has been reduced to a homogeneous event horizon.
I see no distinction, but I welcome any correction. From the cite below, I would conclude that an empty flash drive, logically equivalent to one containing all zeroes, has a Shannon entropy of zero because the probability of the next bit being 0 is certain. Its Shannon entropy increases as soon as information is introduced.
Shannon entropy, also known as information entropy or the Shannon entropy index, is a measure of the degree of randomness in a set of data.
It is used to calculate the uncertainty that comes with a certain character appearing next in a string of text. The more characters there are, or the more proportional are the frequencies of occurrence, the harder it will be to predict what will come next - resulting in an increased entropy. When the outcome is certain, the entropy is zero.
Shannon Entropy Calculator | Information Theory
It’s true on all scales where gravity is dominant; after all, gravity doesn’t know on what scales it is currently acting. Even in the accretion of matter on small scales, such as asteroids and planetoids, the same principle obtains. Even Jupiter’s rings are lumpy, not smooth, despite the high level of thermalizing interaction.
Black hole entropy is a subtle matter. At first, from the perspective of classical general relativity, it would seem that black holes should have rather minimal entropy: after all, by the no-hair theorem, a black hole’s exact microstate is completely determined by its mass, charge, and spin—just three numbers. Compare that with the data needed to characterize the state of all the matter that went into forming the black hole. Indeed, with entropy comes a temperature, and with temperature comes radiation, and in classical GR, black holes don’t radiate.
It’s only some subtle reasoning involving thought experiments that forces one to ascribe any entropy to BHs at all, and even then, the microscopic origin of this entropy is left completely opaque. And even that was contentious until detailed calculations by Hawking, using quantum field theory on a curved background, showed that yes, there must be a temperature and radiation associated with a BH horizon.
But even so: without gravity, the expectation for a ball of gas in a certain volume would be to spread out to occupy the full volume as a way of increasing entropy; but with gravity, that’s not what happens—instead, a maximally lumpy configuration, with all the matter confined to the smallest possible volume, is the result.
Sure, but for a memory that’s not all zeros, the next bit is just as certain. A Shannon entropy is associated with a process producing each of a possible set of responses with a certain probability. So a ‘memory’ with non-vanishing Shannon entropy would produce different bit-strings whenever it is queried, which would usually not lead us to call it a ‘memory’ as much as a ‘stupid broken piece of shit’.
I have to wonder whether I’m misunderstanding you or whether you’re just pulling my leg and being disingenuous. The whole point of the Shannon entropy metric is to specify the amount of a priori knowledge necessary to ensure predictability of a series of bits or characters. This is directly related to the Kolmogorov complexity metric previously cited by @Francis_Vaughan. Clearly, a series of memory cells known to contain all zeroes does not have the same complexity as one containing a large series of arbitrary sequences, and can be trivially specified.

Black hole entropy is a subtle matter. At first, from the perspective of classical general relativity, it would seem that black holes should have rather minimal entropy: after all, by the no-hair theorem, a black hole’s exact microstate is completely determined by its mass, charge, and spin—just three numbers.
Right. That’s what I’m saying, and all that really needs to be said in this context. As far as anyone knows, Hawking radiation doesn’t change that – not even subtly.

But even so: without gravity, the expectation for a ball of gas in a certain volume would be to spread out to occupy the full volume as a way of increasing entropy; but with gravity, that’s not what happens—instead, a maximally lumpy configuration, with all the matter confined to the smallest possible volume, is the result.
It seems rather bizarre to consider a singularity to be “lumpy” – a term which implies multiple random aggregations. The unique feature of a singularity is that there is only one, and it is the ultimate gravitationally induced homogeneity. All black holes are the same, differing only in the three parameters you mention.

I have to wonder whether I’m misunderstanding you or whether you’re just pulling my leg and being disingenuous. The whole point of the Shannon entropy metric is to specify the amount of a priori knowledge necessary to ensure predictability of a series of bits or characters
No. The point of Shannon entropy is to quantify the average amount of new information gained upon receiving a new element from a message. Thus, there needs to be uncertainty about that new element. There is none in the case of a deterministic source.
A memory with the content 11001001 will produce the message 11001001 every time it is queried, i.e. with probability 1. Therefore, its Shannon entropy is 0. Something that produces, say, (11001001, 11001000, 01001001) each with probability 1/3, would have a Shannon entropy of ~1.5 bits, and be a shitty memory.
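(For the record, the ~1.5 bits is just

$$
H \;=\; -\sum_i p_i \log_2 p_i \;=\; -3 \cdot \tfrac{1}{3}\log_2\tfrac{1}{3} \;=\; \log_2 3 \;\approx\; 1.585 \text{ bits},
$$

i.e. the entropy of a source with three equally likely outputs.)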
Shannon entropy is not applicable to individual sequences; it applies to random processes (think of a source producing a message). Kolmogorov complexity is a different beast, invented exactly in order to quantify the randomness (or incompressibility) of a single sequence. Colloquially, it’s true that an intuitively ‘ordered’ single sequence will have a smaller complexity than a ‘disordered’ one, but strictly speaking, this holds only in the limit, or relative to an appropriately specified description language (Turing machine).

The relationship to Shannon entropy is roughly that the individual messages produced by a random source will have, on average, the same Kolmogorov complexity as that source’s Shannon entropy. So if you had a source that produces random bits with a certain probability, then looking at each sequence produced by that source (in a sufficiently long-term limit), and computing (well, approximating, since KC isn’t computable) its Kolmogorov complexity, that value would on average agree with the Shannon entropy of the source.
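A rough way to see the point about single sequences, using compressed length as a crude, computable upper-bound stand-in for Kolmogorov complexity (my own illustration, not a definition):

```python
import os
import zlib

# Compressed length upper-bounds Kolmogorov complexity (up to a constant):
# an ordered sequence compresses to almost nothing, while a random one of
# the same length barely compresses at all.
ordered = b"\x00" * 4096      # the 'all zeroes' memory
scrambled = os.urandom(4096)  # incompressible with overwhelming probability

print(len(zlib.compress(ordered)))    # ~20 bytes
print(len(zlib.compress(scrambled)))  # ~4100 bytes, i.e. no real compression
```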
You might be thinking about every bit of the memory as separate, its own ‘signal’, so to speak, and thus view the memory as producing the message ‘1,1,0,0,1,0,…’. That’s again not a random process, and if you want to invoke your own ignorance of the next bit of the message as a reason for viewing it as a random process, then the same would hold for the message ‘0,0,0,0,…’, because no number of 0s tells you whether the next bit is 0, too.

As far as anyone knows, Hawking radiation doesn’t change that – not even subtly.
The existence of Hawking radiation means that rather than an extremely low-entropy object, as in classical general relativity, a BH is an extremely high-entropy object (maximal, in fact, for something of its size).
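For reference, the formula behind “maximal, in fact, for something of its size” is the Bekenstein–Hawking entropy, which scales with the horizon area $A$ rather than the enclosed volume:

$$
S_{\mathrm{BH}} \;=\; \frac{k_B c^3 A}{4 G \hbar}.
$$

For a solar-mass black hole this works out to roughly twenty orders of magnitude more entropy than the star it formed from.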

It seems rather bizarre to consider a singularity to be “lumpy” – a term which implies multiple random aggregations.
Well, a singularity is the ultimate lump—the expectation would be for such a lump to rapidly disperse to maximize its entropy, which is the opposite of what happens if you include gravity. The entropy of a system is determined by the phase-space volume it occupies, and the phase-space volume of a singularity is 0. (Roughly, the lumpier—the more concentrated at certain points—the smaller the phase-space volume.) So this is a minimal entropy configuration, and yet, the system evolves spontaneously to that point, which seems surprising from a second-law perspective.
I think I was a bit sloppy in my formulations above when talking about ‘gravitational entropy’ leading to increased lumping. A more accurate statement, I think, would be something like this: in the presence of gravity, an overall entropy-increasing (and thus, spontaneous) process, rather than leading to a homogeneous matter distribution, will typically lead to a matter distribution that’s highly inhomogeneous, up to the point where all matter is concentrated at a single point. This is contrary to the expectation from usual thermodynamic processes, which lead a highly concentrated matter distribution—e.g. a ball of gas in a larger volume, or all of the air in one corner of the room—to rapidly disperse.
In the presence of gravity, this leads to gravitational potential energy being transformed into heat, which increases the entropy of the surroundings away from the matter distribution, while the entropy of the original matter distribution is decreased (though not by more than the entropy carried away by the emitted radiation).
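In ledger form (my paraphrase of the above, with the radiated heat doing the compensating):

$$
\Delta S_{\text{total}} \;=\; \underbrace{\Delta S_{\text{matter}}}_{<\,0} \;+\; \underbrace{\Delta S_{\text{radiation}}}_{>\,0} \;>\; 0.
$$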
Does this help?

A memory with the content 11001001 will produce the message 11001001 every time it is queried, i.e. with probability 1. Therefore, its Shannon entropy is 0. Something that produces, say, (11001001, 11001000, 01001001) each with probability 1/3, would have a Shannon entropy of ~1.5 bits, and be a shitty memory.
And then you need to steal the Enterprise to recover from your planet-wide computer failure.