How massive is organized information? (entropy, E=mc^2)

I have a 1GB hard drive. All else being equal, what is the difference in mass between the HD with completely random 0s and 1s and the HD with all 0s or all 1s?

The difference between the two HDs is measured in entropy, which is a form of energy. (Need some stat mech here, I think.) Einstein's theory of special relativity gives us the relationship E=mc^2. This means the two HDs should have different energies and masses. If ΔG = ΔH - TΔS holds, the HD with the least entropy/most organization should have more energy and be more massive than the other HD. Is there such a thing as a completely (or maximally) random state for this system? The states with all 0s and all 1s have the same energy, don't they? Give answers for 1 GB = 2^30 bytes, i.e. the real gigabyte. I know there's a lot more to special relativity than E=mc^2, but I think that's the only part that plays a role here. Pretend these HDs are at rest with respect to your inertial reference frame. :)
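
Taking the premise at face value, the biggest possible entropy gap between the two drives is N·k_B·ln 2 for N bits (the Landauer figure), which you can cash out as energy via TΔS and as mass via E=mc^2. A back-of-envelope sketch in Python; the 300 K temperature is an assumption:

```python
# Sketch: upper bound on the mass difference between a maximally random
# and a fully ordered 1 GB drive, IF you treat the Shannon entropy of the
# bit pattern as thermodynamic entropy (a premise disputed later in the thread).
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T   = 300.0          # assumed temperature, K
c   = 2.99792458e8   # speed of light, m/s

bits    = 2**30 * 8                  # 1 "real" GB = 2^30 bytes = 2^33 bits
delta_S = bits * k_B * math.log(2)   # k_B ln 2 per bit (Landauer)
delta_E = T * delta_S                # energy via T*dS
delta_m = delta_E / c**2             # mass via E = m c^2

print(f"delta S = {delta_S:.2e} J/K")   # ~8.2e-14 J/K
print(f"delta E = {delta_E:.2e} J")     # ~2.5e-11 J
print(f"delta m = {delta_m:.2e} kg")    # ~2.7e-28 kg
```

That comes out to roughly 3 × 10^-28 kg, less than a proton's mass: amoeba-fart territory, as promised.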

I think you are misapplying entropy. For the sake of argument, assume that it takes the same energy to encode a 1 as a 0. If that's the case, then it doesn't matter whether the disk contains the human genome, completely random numbers, all 1s, or all 0s.

The flaw is that there is no natural “disordered” state of 1s and 0s that is different from organized information. True entropy would lead the disk to a state where neither 1s nor 0s are encoded.

If you had a hard disk platter that was not formatted at all and you formatted it with anything, you would be changing the entropy of the hard disk. You could certainly derive the relativistic mass from the stored energy, but we're talking amoeba farts.

Aye, you're confusing two different uses of the term 'entropy'. There's a thermodynamic meaning and an information theory meaning. (Actually, several information theory meanings; you can get some info on the most popular by Googling 'Shannon entropy'.) They have nothing to do with each other, to the dismay of creationists trying to claim that the theory of evolution violates the laws of thermodynamics.

I got some vague impressions from a Scientific American article of a year or so ago. First, information has a mass of something like 1e-65 kg/bit. Second, the level of organization is in the eye of the beholder; that is, a seemingly random pattern of bits may be highly organized, and you might say there's no way to distinguish between disorganized and highly organized bits. That's why a telephone modem makes a sound like rushing air. Third, I didn't think relativity was at the heart of it. Fourth, the thermodynamic and informational entropies were identical at their roots.

But these were only vague impressions.

I do remember that the information you could possibly fit into a volume was equal to either 4 times or 1/4 of (I don't remember which) the volume's bounding area, measured in Planck areas. The reason it's proportional to the bounding area had to do with relativistic warping of the space because of the mass of the information. It struck me that this warping making the information proportional to the bounding area was a nifty twist, but that the big story was that mass is made of information. I also wondered why the nifty twist got so much air play and the big story didn't. Was it because I misunderstood?
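
For scale against the 1e-65 kg/bit figure quoted above (unverified here), the whole drive's information would weigh:

```python
# Back-of-envelope: 1 GB of information at the quoted (unverified) 1e-65 kg/bit.
bits = 2**30 * 8                 # 2^33 bits in a "real" gigabyte
print(f"{bits * 1e-65:.1e} kg")  # ~8.6e-56 kg
```

That's roughly 27 orders of magnitude smaller than the TΔS estimate earlier in the thread.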
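
On the bound itself: the usual statement of the holographic (Bekenstein-Hawking) bound is A/4 in Planck areas, not 4A. A sketch assuming that version, with a sphere just big enough to hold a 3.5-inch platter (the radius is an assumption):

```python
# Sketch: holographic bound, assuming the A/4-Planck-areas version.
# (Strictly the bound is in nats; the ln 2 factor for bits is ignored here.)
import math

l_P = 1.616255e-35           # Planck length, m
r   = 0.045                  # assumed radius, m: fits a 3.5-inch platter
A   = 4 * math.pi * r**2     # area of the bounding sphere, m^2
print(f"{A / (4 * l_P**2):.1e} bits")   # ~2.4e67
```

So about 10^67 bits for a palm-sized sphere; the bound is nowhere near binding for a real ~10^10-bit drive.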

I should have said at the beginning that I’m aware this amount of energy/mass is equal to an “amoeba fart.” I’d like someone to calculate it anyway.

That's more what I'm looking for. 1e-65 kg/bit qualifies as an amoeba fart to me. (New thread topic: how much energy is stored in an average amoeba fart?)

Say wha?

Can anyone else comment on these last two?

Thermodynamic entropy and information-theoretic entropy are really only related insofar as they have the same name. And where did you get the idea that entropy is a form of energy?

Well, there is a sort of deeper connection when you move to statistical mechanics as an “underlying theory” of thermodynamics. Remember that thermodynamics is to statistical mechanics as Newtonian gravitation is to Einsteinian.

Anyhow, given a configuration space and a Heyting lattice (I don't think it needs to be Boolean) of measurable subsets, you can ask "what is the probability that the system is in this subset of the configuration space?" and work up a statistical theory that implies thermodynamics. Now, information theory (in the Shannon sense) is really about statistics, and Shannon's entropy is about the "lack of information" before an observation is made. The two aren't the same, but they're not totally unrelated.

But, yes: entropy isn’t energy, and that’s where it goes downhill.
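
To make the formal parallel concrete: Gibbs entropy S = -k_B Σ p ln p and Shannon entropy H = -Σ p log2 p are the same functional of a probability distribution, differing only in the log base and the factor of k_B. A minimal sketch of the Shannon side:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

uniform = [1/256] * 256           # maximally uncertain byte value
certain = [1.0]                   # byte value known in advance
print(shannon_entropy(uniform))   # 8.0 bits: maximal lack of information
print(shannon_entropy(certain))   # 0.0 bits: nothing left to learn
```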

Sorry, TΔS is energy; temperature (K) times entropy (J/K) comes out in joules.

Did they ever satisfactorily get around the "entropy is in the eye of the beholder" problem? I thought that one definition used in information theory was the "data compression" standard: a low-entropy state is a state that can be exactly specified by a very short (highly compressed) description. So for a hard drive, for example, "all zeros" or "all ones" is a heck of a lot shorter than spelling out a string of billions of digits. Or am I misunderstanding?
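
The compression intuition is easy to demonstrate. An off-the-shelf compressor (a stand-in for the uncomputable "shortest program" of Kolmogorov complexity) crushes the all-zeros string but gains nothing on random bytes:

```python
import os, zlib

n = 10**6
ordered = b"\x00" * n               # "all zeros": admits a very short description
scrambled = os.urandom(n)           # incompressible with high probability

print(len(zlib.compress(ordered)))    # ~1 kB
print(len(zlib.compress(scrambled)))  # ~1 MB, slightly larger than the input
```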