What is the blackhole firewall paradox?

I’m going to give this my best shot, but I have no doubt that I’m oversimplifying and glossing over quite a bit. Still, I hope it can be a decent starting point. Here is the source article.

It all started with Stephen Hawking and the idea that came to be known as Hawking radiation. He realized that according to standard quantum field theory, empty space consists of a roiling sea of virtual particle–antiparticle pairs constantly popping into existence and then annihilating.

So he wondered: what if such a pair appeared right at the event horizon of a black hole? One member of the pair would be absorbed and the other would escape. Since the absorbed partner effectively carries negative energy, it cancels some small amount of the black hole’s mass, so over time this would cause the black hole to evaporate.

It was a brilliant insight, but it had one problem, which I won’t claim to fully understand: conservation of information. If the radiation escaping the black hole was truly random, as QM suggested, then all of the information contained in the black hole would eventually be lost as it evaporated.

Every particle that fell into the black hole to create it carried certain information, and QM requires that information be conserved. But if black holes evaporate the way Hawking proposed, and there was no real arguing that they do, and if information is lost in the process, you have a paradox.

It was eventually argued, by linking quantum theory to relativity, that the information HAD to be preserved and somehow encoded in the Hawking radiation. The only problem was that no one had any idea whatsoever how the information might be encoded, since the radiation appeared to be perfectly random.

It’s times like this you step back and have to wonder just how smart these people really are. I’m sorry to be disrespectful, but this doesn’t seem to be what you’d call an insignificant detail. Nevertheless, because someone showed that you could link QM with relativity, well, everybody had better just shut the fuck up. Even Hawking conceded defeat in 2004. I mean, what the fuck? It may SEEM that you’ve shown information can’t be lost, but until you can also show how it’s being encoded, it looks like little more than a Jedi hand wave. Personally, I think relativity is in for some rough times, but I guess we’ll see.

Anyway, it turns out that FINALLY, in 2012, someone decided that maybe this should get cleaned up. Prior to that, if I’m reading the article correctly, the assumption was that each radiated particle was entangled with ALL of the particles that had previously fallen into the black hole, so that the radiation, taken as a whole, allowed you to reconstruct the quantum-level information. I swear to god this sounds like bullshit, but hey, it’s quantum mechanics, so that probably means it’s true.

The problem that was discovered, though, was that if the radiated particle is entangled with all of the previously swallowed particles, then it can’t also be entangled with the anti-particle partner currently being sacrificed to the black hole. QM doesn’t let you have simultaneous maximal entanglements like that between independent systems, so one of the entanglements had to go. The thing is, if you get rid of the one between the particle and anti-particle, it apparently releases a lot of energy.

There have been many proposed solutions to this problem, but none have been generally accepted. It looks like another battle to the death between QM and relativity, and I’m afraid the odds-on fan favorite is QM.


The basic source of the information loss paradox is really the unitary evolution of quantum mechanics. Unitary evolution is essentially information-conserving, which is just a fancy way of saying that all probabilities must add up to 100%—something happens with certainty. So when Hawking showed that applying quantum field theory in curved spacetime leads to the production of thermally distributed radiation, which is perfectly random and does not conserve information, people were faced with a problem: what does it mean to violate unitarity? Is there any sense in considering probabilities that no longer add up?
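For the mathematically inclined, here’s a minimal numerical sketch (my own illustration, not from the article) of what “unitary evolution conserves probability and information” means: a unitary matrix preserves the norm of the state vector (probabilities keep summing to 1), and it can always be undone, so no information about the initial state is destroyed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random Hermitian "Hamiltonian" H, then the unitary U = exp(-iH)
# via its eigendecomposition (numpy has no built-in matrix exponential).
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = (A + A.conj().T) / 2                       # Hermitian matrix
evals, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(-1j * evals)) @ V.conj().T

psi = rng.standard_normal(4) + 1j * rng.standard_normal(4)
psi /= np.linalg.norm(psi)                     # normalized: probabilities sum to 1

phi = U @ psi                                  # evolve the state

print(np.linalg.norm(phi))                     # still 1: total probability conserved
print(np.linalg.norm(U.conj().T @ phi - psi))  # ~0: applying U† recovers psi exactly
```

A thermal, perfectly random output has no such inverse: many different inputs map to the same statistics, which is exactly the tension Hawking’s calculation created.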

Thus, many physicists didn’t believe that information is actually lost in black hole evaporation. One proposed solution came in the form of Leonard Susskind’s black hole complementarity: there are two stories you can tell about the evolution of information in the presence of a black hole, and they are mutually inconsistent, but no observer ever has access to both stories, so every observer sees a consistent story.

This works roughly as follows: an external observer sees information fall toward the black hole and essentially get ‘reflected’ at the horizon—or rather, at a point about a Planck length away from the horizon (the ‘stretched horizon’). This derives from the ‘membrane paradigm’ description of black holes: since time slows down for infalling matter as seen by an external observer, he effectively sees everything accumulating at the horizon, heating it up; the heated horizon then emits the information back out in the form of radiation.

However, for an infalling observer, the story is very different: he sees everything simply passing through the horizon, with nothing observably strange at that point—in fact, he generally does not even know that he has crossed it. This picture is only consistent if no observer could ever verify both stories, i.e. if the descriptions are complementary in Bohr’s sense: consistent, but mutually exclusive, descriptions of the same reality.

The firewall argument now challenges this conclusion by using a clever construction in which measurements on entangled systems are used to accumulate more information than ‘ought to’ be simultaneously possible—roughly, you measure some radiation outside the black hole, then jump into it and measure internal degrees of freedom that are highly entangled (for an old black hole) with external degrees. But then, you can basically verify the maximal entanglement of one system with more than one other—something which is impossible in QM (that is referred to as ‘monogamy of entanglement’).
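The ‘monogamy of entanglement’ can be made concrete in a toy computation (my own sketch, not part of the firewall paper). Using the Wootters concurrence as a pairwise entanglement measure: if qubit A is maximally entangled with qubit B, their concurrence is 1; but in a GHZ-type state where A is also entangled with a third qubit C, the pairwise A–B concurrence drops to zero. A cannot be maximally entangled with both.

```python
import numpy as np

# Wootters concurrence of a two-qubit density matrix:
# 1 = maximally entangled, 0 = no pairwise entanglement.
def concurrence(rho):
    sy = np.array([[0, -1j], [1j, 0]])
    Y = np.kron(sy, sy)
    rho_tilde = Y @ rho.conj() @ Y
    lam = np.sqrt(np.clip(np.linalg.eigvals(rho @ rho_tilde).real, 0, None))
    lam = np.sort(lam)[::-1]
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

def reduced_AB(psi):                 # trace out the third qubit C
    rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2, 2, 2)
    return np.einsum('abkcdk->abcd', rho).reshape(4, 4)

# (|000> + |110>)/sqrt(2): A-B Bell pair, C a spectator.
bell = np.zeros(8); bell[[0, 6]] = 1 / np.sqrt(2)
# (|000> + |111>)/sqrt(2): GHZ state, A entangled with B *and* C.
ghz = np.zeros(8); ghz[[0, 7]] = 1 / np.sqrt(2)

print(concurrence(reduced_AB(bell)))  # 1.0: A maximally entangled with B
print(concurrence(reduced_AB(ghz)))   # 0.0: sharing with C kills the A-B pair entanglement
```

The firewall argument turns exactly this bookkeeping against black hole complementarity: the outgoing mode would need maximal entanglement with two different systems at once.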

One possible solution is to consider that the entanglement is, in fact, broken at the horizon, which leads to the firewall. However, this would be a gross violation of the equivalence principle of general relativity: at any given point, it must be possible to ‘transform away’ the effects from the curvature of spacetime. But if there’s a firewall, that isn’t possible.

So for the moment, everybody’s just kinda scratching their heads—it seems that several principles, which have all been thought to be very robust individually, may be in conflict when considered together. What ultimately wins out, what will be thrown out, if anything, is anybody’s guess.

It would be more accurate to say that, when you apply quantum field theory to the curved space around the event horizon of a black hole, the strong gravitational field leads to the production of pairs of particles: a particle and its antiparticle, if the particle is not its own antiparticle. From the point of view of a faraway observer, one of these particles has positive energy and one negative; the negative-energy particle necessarily falls into the black hole, whereas the positive-energy particle can escape to infinity. The negative energy flux over time will tend to reduce the mass of the black hole, until it evaporates completely.

There are a few notes worth making at this point. Firstly, where particle-antiparticle pairs are being produced, there is no preference for which member of the pair has positive energy and which has negative, so equal numbers of particles and antiparticles appear in Hawking radiation. Secondly, the claim that the negative energy flux across the event horizon reduces the mass of the black hole and causes evaporation is really an assumption, motivated by the fact that, classically, asymptotically flat spacetime conserves energy at infinity; there is no actual mechanism in the theoretical background of Hawking radiation that allows the black hole to lose mass. Thirdly, viewing this process purely in terms of particles has pitfalls, as the whole process relies on an ambiguity in the definition of particles (which, by the way, is distinct from the divide between real and virtual particles in perturbative quantum field theory); the particles that have a detectable existence are the ones detected by an observer far away from the black hole.

Hawking radiation comes from applying quantum field theory to curved spacetime, but presumably the curved spacetime itself is ultimately governed by some quantum theory too. One aspect of Hawking radiation that is worrying from this point of view is that it appears to predict (again, with an element of reasonable assumption in parts of the prediction) that a black hole is not time-reversible, since the radiation does not depend on the exact states of the particles that fell in to create the hole. The reason this is worrying is that the evolution of a quantum state is time-reversible, so it would imply something very odd about the evolution of the quantum state of the whole spacetime. This is the black hole information paradox.

As basic quantum theory is assumed to be more fundamental than general relativity, after a lot of head-scratching it was generally agreed that the information must be preserved in some way (i.e. the evolution of the quantum state is in fact time-reversible). Again, though, this is more a reasonable assumption than something with a hard theoretical underpinning, and some very well-respected physicists dispute the reasonableness of that assumption. Roger Penrose, for example, contends that the information really is lost: for many years he has linked the non-reversibility of gravitational collapse plus black hole evaporation to the non-reversibility of the collapse of the wavefunction under measurement.

As I have tried to show, quite a number of what are thought of as reasonable assumptions are made about the quantum physics of a black hole (and I have only mentioned some of them). Last year it was shown that if you take all of these reasonable assumptions together, they suggest that after a certain time something known as a black hole firewall will develop. The firewall sits just behind the event horizon and consists of quanta with ludicrously high energy, which will destroy anything they come into contact with. Until there is a strong theoretical framework provided by something like a quantum theory of gravity, though, it is quite possible that the firewall is simply an artifact of a faulty assumption.

If the mind thinks “the Universe should have a conservation of information rule”, the mind has a far too overactive imagination.
On the same idea: if particles can be created, e.g. matter and antimatter, then that’s really two units of information being created, not one unit of information and one unit of anti-information. So it was a destroyed idea before it even got thought.

Only if you assume that the particles are created ex nihilo. But that already runs up against the conservation of mass and of energy, which I hope you would agree are on sound footing. In all known processes that produce particle-antiparticle pairs, you have to start with some other particles coming into the interaction, carrying mass and energy and information with them.

There is a possible solution to the black hole information paradox in superstring theory, in which the black hole is actually a “fuzzball” composed of superstrings rather than an infinitely-dense singularity. The idea is that information that falls into the hole remains stored in the superstrings that compose it instead of being crushed out of existence in a singularity.

Chronos, I seem to recall you explaining Hawking radiation in a rather elegant fashion once. Rather than considering particle antiparticle pairs at the event horizon, you considered the black hole on a macro scale as a thermodynamic system and made some deductions based on its change in entropy. The details of your explanation elude me at the moment. Care to reiterate?

First, thanks for the explanations. I think I got a good chunk of that.

I’m especially curious about this part, though, and offhand my guess is that completely different mechanisms are involved, but I’ll toss it out there. I vaguely recall something about micro black holes possibly being created at the LHC. It doesn’t matter whether that’s true or not; let’s assume it is, since at some energy level it presumably does happen. I recall the researchers there saying that such holes would evaporate almost instantly. Is this a different mechanism?

Also, the whole wave-particle distinction when it comes to the zero-point field can be a little overdone, don’t you think? I mean, from the Casimir effect to sea quarks to things that last only femtoseconds and only in the heart of a collider’s detector, it’s some pretty slippery stuff.

No, this is exactly Hawking radiation. The temperature of a black hole goes inversely with its mass, so low-mass black holes are extremely hot and thus radiate away almost instantly, while high-mass black holes are correspondingly cool.
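To put rough numbers on that inverse relation, here’s a back-of-the-envelope sketch using the standard Hawking temperature and lifetime formulas (the “micro hole” mass below is just an illustrative guess, not an actual LHC figure):

```python
import math

# Hawking temperature T = hbar c^3 / (8 pi G M k_B) and the standard
# evaporation-time estimate t ~ 5120 pi G^2 M^3 / (hbar c^4).
hbar, c, G, k_B = 1.0546e-34, 2.998e8, 6.674e-11, 1.381e-23

def hawking_temperature(M):            # kelvin, M in kg
    return hbar * c**3 / (8 * math.pi * G * M * k_B)

def evaporation_time(M):               # seconds, M in kg
    return 5120 * math.pi * G**2 * M**3 / (hbar * c**4)

M_sun = 1.989e30
print(hawking_temperature(M_sun))        # ~6e-8 K: far colder than the CMB
print(evaporation_time(M_sun) / 3.15e7)  # ~2e67 years to evaporate

M_micro = 1e-8                           # hypothetical micro hole, kg
print(evaporation_time(M_micro))         # ~1e-40 s: gone essentially instantly
```

So a solar-mass hole is absurdly cold and long-lived, while anything tiny flashes away at once, exactly the T ∝ 1/M behavior described above.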

Ah. OK, thanks. I’ll try to read a little on this before trying to pursue it further. It’s one of those things that was always tangentially interesting but that I never quite got to.

I seem to remember the holographic principle being tossed around as a possible solution to the information paradox. As an aside, I vaguely remember from an article I read some months back that the hp supposedly eases the problems between gen. rel. and QM, because gravity is no longer present when the normal “3d” world is converted to its “2d” holographic equivalent.

Has any progress been made here? Has anyone managed to discredit the idea conclusively? Is the failure to find “Hogan’s Effect” conclusive evidence that hp is impossible? And to tie this all up with the OP, would hp help at all in unraveling the firewall paradox?

The article in the OP goes into it but I sort of gave it the bum’s rush. The article is a little on the long side.

Thank you for pointing that out. I hadn’t looked at the original article before posting; I guess I should have. I suppose that answers what I was getting at for the purposes of the OP.

As HMHW says, black hole thermodynamics predicts that a very small black hole will evaporate very quickly.

I think the problem with things like virtual particles is that they’re not really explained properly. From reading many pop-sci books you’d get the impression they’re just like real particles, but with certain magical properties, which isn’t the case.

It’s worth pointing out, though, that there are several ambiguities in the definition of ‘real’ particles in QFT, and pair production by gravitational fields is directly related to this. In a region of flat (or almost flat) spacetime, assuming no strong fields which might do funny things to the particles, you can impose a very sensible definition on the total number of particles in that region; however, in a region of spacetime that is strongly curved, there will be considerable ambiguity in the number of particles. When you have a region of the latter type surrounded by a region of the former type (such as a black hole surrounded by empty space), the ambiguity in the number of particles in the strongly curved region manifests itself (from the POV of someone in the flat region) as particle pairs appearing near the curved region and radiating out into the flat one.
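That observer-dependence of particle number can be illustrated in a single-mode toy model (my own sketch, not specific to black holes): a Bogoliubov transformation, here a squeezing transformation, mixes creation and annihilation operators, and the state that one set of modes calls ‘vacuum’ is full of particles according to the other. In a truncated Fock space, with the analytic answer for squeezing parameter r being sinh²(r):

```python
import numpy as np

dim, r = 40, 0.5                       # Fock-space cutoff, squeezing parameter

# Annihilation operator a|n> = sqrt(n)|n-1> in the truncated basis.
a = np.diag(np.sqrt(np.arange(1, dim)), k=1)
adag = a.T

# Bogoliubov ("squeezing") generator G = (r/2)(a^2 - adag^2); S = exp(G).
# iG is Hermitian, so exponentiate via its eigendecomposition.
G = 0.5 * r * (a @ a - adag @ adag)
evals, V = np.linalg.eigh(1j * G)
S = V @ np.diag(np.exp(-1j * evals)) @ V.conj().T

vac = np.zeros(dim); vac[0] = 1.0      # vacuum of the original modes
psi = S @ vac                          # same physical state, transformed-mode vacuum

n_expect = (psi.conj() @ adag @ a @ psi).real
print(n_expect)                        # ~sinh(0.5)^2 = 0.2715: the 'empty' state holds particles
```

The faraway observer’s mode decomposition plays the role of the untransformed basis here, which is why their particle count is the physically detectable one.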

Bad track warning - possible derailment ahead.

I’m not sure what you mean by ‘magical properties’ and while I’m not very well versed in the math that basically IS QM, I hope you give me some credit for being more familiar than most with the concepts. So I do understand that technically, any elementary “particle” can’t really be described any more precisely than as a smear of probabilities in some vector space. Even so, individual phenomena will generally lend themselves to being described better by one vocabulary or the other - wave-like or particle-like. Blob-like does usually go over very well. :cool:

But I’m not going to argue the point despite appearances to the contrary. The real purpose for posting is to ask about curved space and for that I’m going to ask you to indulge me.

There is a school of thought that regards the very concept of spacetime as nonsense but nevertheless recognizes that space can be curved or flat. So, for the purposes of my question, if possible, can we please assume that we are simply talking about curved or flat space that does NOT have a time component? If you want to bolt on 10 or 12 other dimensions, that’s cool, as long as they are spatial dimensions.

My question is basically: what does it mean for space to be curved? What I imagine is that in areas of intense gravity, space becomes compressed. I imagine it being like the universe at the big bang, where it wasn’t just matter and energy that came into existence, but space itself. Areas of intense gravitation are like a small regression to that primordial state.

If that’s accurate, then is this how Hawking radiation is related to the zero point field? If the Casimir effect is an expression of the zpf’s normal quantum fluctuations and such are an inseparable quality of the very fabric of space, then as space is compressed, don’t they have to become amplified?

Thanks. Hope those questions weren’t abysmally stupid.

According to what was said in the thread, one virtual particle has negative energy and the other positive energy, meaning that their net energy is 0, hence no violation. Are you asserting this is not the case, or am I just confused?

Not if the particle pair are of necessity entangled in a Tweedle-Dee/Tweedle-Dum way, so that each has to be the opposite of the other. If they aren’t free to have independent properties, then in a sense there is only one piece of information there.

Actually, evaporation of extremely small black holes is a different process than evaporation of large ones. All of the derivations of black hole evaporation assume that the hole is basically unchanged by the emission of a particle, and this assumption works well enough for an astronomical-mass black hole, but if you try to extrapolate to extremely low masses, you’d find that the emitted particle carries a significant fraction of the energy of the entire hole, and the approximation breaks down. Presumably, extremely small black holes still decay in some way, but we lack the capability as yet to calculate just how.
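You can put a rough number on where that approximation gives out (my own back-of-the-envelope estimate): set the typical energy of one emitted quantum, of order k_B times the Hawking temperature, equal to the hole’s entire rest energy Mc², and solve for M. You land at the Planck mass, up to a numerical factor:

```python
import math

hbar, c, G = 1.0546e-34, 2.998e8, 6.674e-11

# Typical emitted-quantum energy ~ k_B * T_Hawking = hbar c^3 / (8 pi G M).
# Setting that equal to the hole's rest energy M c^2 and solving for M:
M_breakdown = math.sqrt(hbar * c / (8 * math.pi * G))

M_planck = math.sqrt(hbar * c / G)     # Planck mass, ~2.18e-8 kg

print(M_breakdown)                     # ~4.3e-9 kg
print(M_breakdown / M_planck)          # = 1/sqrt(8 pi), about 0.2: Planck scale
```

Which fits the usual expectation that the semiclassical picture fails, and full quantum gravity takes over, right around the Planck mass.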