I am trying to grok the information paradox, but I don’t understand what information they’re talking about.
Thanks,
Rob
In current physics, a black hole is described by its mass, charge, and rotation vector; it doesn't have any characteristics but those three. Two black holes of the same mass, charge, and rotation behave exactly the same to an outside observer. The matter that went into the black hole used to have a lot of other information describing its state; see Quantum state - Wikipedia to get an idea of what other information there is. Generally, scientists believe that you should be able to work backwards from the present state of a system to its previous states, but that's not possible with a black hole as we understand it.
Just to use a trivial example, you could toss an encyclopedia into a black hole. An encyclopedia contains oodles of information, but it’d all be lost. Now consider just how many encyclopedias it would take to make up the mass of a star.
Which is still a huge underestimate, since encyclopedias contain a lot more information than just the words printed on the pages, but that should get the idea across.
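For scale, here's a minimal back-of-the-envelope (the encyclopedia mass is just an assumed round figure):

```python
# Back-of-the-envelope: encyclopedias per solar mass.
# Both numbers are assumed round figures, not measured values.
SOLAR_MASS_KG = 1.989e30   # mass of the Sun, kg
ENCYCLOPEDIA_KG = 60.0     # one full multi-volume set, kg

print(f"~{SOLAR_MASS_KG / ENCYCLOPEDIA_KG:.0e}")  # ~3e+28 encyclopedias
```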
Yeah, but if you burned the encyclopedia, you wouldn't be able to retrieve the information written in its pages. If you dropped one into a neutron star, what information would not be lost? (Although I guess the crust of a neutron star is ordinary matter.)
Thanks,
Rob
The funny thing is that, in principle, you could. Classically, you could just find the position and momentum of each particle in the ash and smoke, trace their positions backwards in time, and you'll recover the encyclopedia. In QM, you can just find the current state vector, invert the time evolution operator back to a moment before the burning, and get the state vector of the original encyclopedia.
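Schematically (standard QM notation, nothing specific to encyclopedias): if $U$ is the operator that evolves the state from before the burning to now, unitarity means $U^\dagger U = 1$, so the evolution can be run in reverse:

$$
\lvert\psi_{\text{now}}\rangle = U\,\lvert\psi_{\text{book}}\rangle
\quad\Longrightarrow\quad
\lvert\psi_{\text{book}}\rangle = U^{\dagger}\,\lvert\psi_{\text{now}}\rangle .
$$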
To put it another way, the particular configuration of smoke and ash that results from an encyclopedia burning could only be caused by an encyclopedia burning, so given enough time and computing power, you can look at the smoke and ash and infer the contents of the encyclopedia.
Of course, this is massively impractical, but with an encyclopedia thrown into a black hole, it is actually impossible. Any black hole is consistent with any type of matter having been thrown into it, so long as that matter has the same mass, charge, and angular momentum. It is impossible, even with infinite time and infinite computing power, to determine that it was an encyclopedia thrown in, and not an equivalent mass of pool noodles.
I don’t get this. Even given unlimited computing power, surely Heisenberg’s uncertainty principle would prevent the classical approach? As for inverting the time evolution operator, I don’t understand much, but it sounds a great deal like this would allow doing something equivalent to time-travel, which is another big no-no of physics.
In quantum mechanics the wavefunction is axiomatically the most that can be known about a system, even if it doesn’t allow you to perfectly predict the result of all measurements. In between measurements though, the time evolution of the wave function is unitary. This means that given the present form of the wavefunction you can exactly know all past and future forms of the wavefunction (as long as there are no interceding measurements).
Of course, in practice knowing the exact form of the wavefunction for a complicated system is impossible, but the unitary nature of its time evolution is still a very important underlying property.
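As a toy numerical sketch of that reversibility (a random 4-level "Hamiltonian", nothing physical about it):

```python
# Toy demo: unitary time evolution is exactly reversible.
# The "Hamiltonian" is just a random Hermitian matrix; purely illustrative.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = (A + A.conj().T) / 2        # Hermitian "Hamiltonian"
U = expm(-1j * H)               # one time step of unitary evolution

psi0 = rng.standard_normal(4) + 1j * rng.standard_normal(4)
psi0 /= np.linalg.norm(psi0)    # normalized initial state

psi1 = U @ psi0                 # evolve forward in time
recovered = U.conj().T @ psi1   # apply U^dagger to run time backwards

print(np.allclose(recovered, psi0))  # True: past state recovered exactly
```

The moment any non-unitary step gets in the way - a measurement, or (if the paradox is real) an event horizon - that trick stops working.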
The problem is that the event horizon of the black hole appears to lead to non-unitary time evolution of the wavefunction as you cannot always reconstruct the past form of the wavefunction of a system whilst part of that system is hidden behind an event horizon. This is because the state of the black hole is not sensitive enough to the state of the matter that went into making it.
The black hole information paradox is the apparent conflict between black holes and the fundamental principle in quantum mechanics of unitary wavefunction evolution.
Yes it would, but there is a quantum analogue which takes the wave function and traces back how it evolved to determine what it was like previously.
It’s not time travel, just “figuring out the state of something in the past by inferring from its current state”.
The only requirement for “no loss of information” is that there are no two “past” states that evolve into the same “future state”. Classical mechanics has that property, as does quantum mechanics. Except in the presence of black holes – then a whole bunch of different “past” states all evolve into exactly the same black hole state.
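In symbols, that requirement is just that unitary evolution preserves inner products, so distinct states can never merge:

$$
\langle\phi(t)\vert\psi(t)\rangle
= \langle\phi_0\vert\, U^{\dagger}U \,\vert\psi_0\rangle
= \langle\phi_0\vert\psi_0\rangle ,
$$

so if two initial states differ, their evolved versions still differ. A black hole appears to break exactly this: many distinct initial states end up as the same $(M, Q, J)$.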
With no theoretical limits on accuracy? That's not how I imagined things worked! I have always been under the impression that the destruction of information is a fundamental trait of the universe!
Well, if it could be done with infinite precision and along a continuous timeline, it would certainly allow for a virtual trip to ancient Rome!
Actually, given that this would require some God-like powers in the first place (namely, to assess the current state of every particle in a large piece of the universe), it’s not too much of a stretch to imagine that it would also allow you to rearrange matter and energy, down to atom-scale, to recreate the physical world at an earlier stage…
Another way to look at it: the moment the book catches fire, some information will immediately be dispatched as EM-radiation, traveling with the speed of light. If you later decide that you want to recreate the total information of the book, wouldn’t that require FTL-travel?
Or cooperation from aliens who are already out there. And even if the aliens aren’t out there, the information still exists; it’s just inaccessible.
Again, recovering the information from a burned book is really really difficult. We know that already. It’s just not impossible.
Well, what’s the big difference between “inaccessible” and “non-existing” information? Couldn’t information on its way to the event horizon of a black hole be considered “inaccessible” much the same way as some information heading for the edges of the universe, towards the far galaxies in *Coma Berenices*?
“Inaccessible” information, in the sense of information travelling away from us at c, could have some effect on us in the future. E.g. it could cause aliens to notice and report back to us.
The current state of Proxima Centauri is similarly inaccessible, but it will still have a potential effect on our telescopes at some future date.
Contrast with information from the other side of an event horizon. Even if there are helpful aliens recording our information and trying to transmit it back to us, they will never succeed.
Ultimately, the reason why it matters has nothing to do with whether we can actually measure/calculate it or not. The real problem is more subtle and more profound: Quantum mechanics asserts that the information must not ever be lost, and if it is even possible for information to be lost, most of quantum mechanics crumbles to the ground. General relativity, meanwhile, predicts the existence of these objects which do indeed destroy information. It’s like the unstoppable force and the immovable object: Both cannot exist in the same universe. Ultimately, it’s just one more proof that either quantum mechanics or general relativity, or both, is wrong in some way, and it’s a lot more fundamental than most of the conflicts between those two theories.
Even if the supposed aliens did relay the information, how could you tell whether it was true, false, or simply garbled?
I get that this is way over my head, but I must still ask you: what theorem of quantum mech. asserts that no info must ever be lost?
Does a black hole ever fully exist in the observable universe? As space distorts towards becoming an event horizon, time slows, so that from the standpoint of an outside observer it only approaches the needed curvature asymptotically. Just a thought - it has always bugged me.
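For reference, the standard Schwarzschild time-dilation factor is what produces that asymptotic freezing:

$$
\frac{d\tau}{dt} = \sqrt{1 - \frac{r_s}{r}}\,, \qquad r_s = \frac{2GM}{c^2},
$$

which goes to zero as $r \to r_s$, so from outside, clocks on infalling matter appear to stop at the horizon.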
One could perhaps then say that the information has not become truly inaccessible, just very, very difficult to access. Perhaps if the universe were to return to a big crunch, the books would actually balance.
We know the aliens are truthful because we are holding their spouses and children hostage and will execute them if we sense even the slightest treachery. We know it is not garbled because they have proactively gone out and cleared out all of the matter between Earth and them to make sure the burning encyclopedia photons have a straight shot.
At some point, further elaboration on the specifics of how to burn an encyclopedia and get it back stops being worth the effort. Suffice it to say that unitarity guarantees that it is in principle possible, using some combination of aliens and torture methods.
Within the mathematical machinery of QM it simply doesn’t make sense for the time evolution of the wavefunction described by a wave equation to be anything other than information-conserving. It’s really the assumption that if the probabilities of all possible outcomes of some experiment sum to one at the present time, then they also would’ve summed to one if the experiment were to be performed at some previous time.
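That conservation of total probability is a one-line consequence of the Schrödinger equation with a Hermitian Hamiltonian:

$$
i\hbar\,\frac{d}{dt}\lvert\psi\rangle = H\lvert\psi\rangle
\quad\Longrightarrow\quad
\frac{d}{dt}\langle\psi\vert\psi\rangle
= \frac{i}{\hbar}\,\langle\psi\vert\,(H^{\dagger} - H)\,\vert\psi\rangle = 0
\quad\text{when } H^{\dagger} = H .
$$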
So how does this wave function handle stuff like radioactive decay or Hawking radiation? I was under the impression, perhaps wrongly, that these were truly random.
Stephen Hawking recently offered a solution: apparently, the information is still encoded holographically on the event horizon. When the black hole evaporates, the information slowly goes away with it, albeit in a chaotic and unusable form. In releasing this proposed solution, he conceded a long-running bet that he and Kip Thorne had made with John Preskill.
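The quantitative peg for that holographic picture is the Bekenstein-Hawking entropy, which scales with the horizon's area rather than the hole's volume:

$$
S_{\text{BH}} = \frac{k_B\, c^{3}\, A}{4\, G\, \hbar}\,,
$$

where $A$ is the area of the event horizon - as if the horizon had just enough surface "room" to encode what fell in.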