Having searched without finding what I’m wondering about, I’m asking this question: Hawking Radiation…huh? As I understand it, there’s quantum activity in a vacuum, in the form of particle-antiparticle pairs forming and quickly annihilating each other. Right at the edge of a black hole’s event horizon, these pairs sometimes send one member into the black hole and the other away, and that’s Hawking Radiation. My problem with that is that the black hole gets yet another particle added to its mass, so the evaporation doesn’t make sense. Enter Wikipedia, which tells me that “to fill the energy ‘hole’ left by the pair’s spontaneous creation, energy tunnels out of the black hole and across the event horizon. By this process the black hole loses mass, and to an outside observer it would appear that the black hole has just emitted a particle.”
Please help me understand this. The particle-antiparticle pair shoots one member into the black hole and one away, thus preventing them from meeting back up for their orgasmic mutual suicide. Now the universe is off kilter, because mass/energy has been created, which is not kosher at all, not by a long shot. But the black hole is there, and so some energy, in an amount identical to that represented by the original matter/antimatter pair, quantum-tunnels out of the black hole to fill the “hole” the pair had created. Thus the black hole gains one particle but loses the energy equivalent of two, for a net loss, and Bob’s your uncle, the thing is evaporating.
Is this basically a correct statement of the theory?
If so, what the heck is the energy “hole” the particle pair created? Doesn’t the energy “hole” imply that there was a reduction in energy to match the increase in mass, which means the universe isn’t off balance after all?
The sum of the total energy of the two particles (including the energy in their mass) must be zero, because they came from nowhere. But the energy also includes their gravitational potential energy, which as you can imagine, is not to be ignored in the vicinity of a black hole. And gravitational potential energy is always negative. So it’s not completely implausible to suppose that a particle in the vicinity of a black hole might have a negative energy. And because of the way that gravitational potential energy works, a particle with a negative energy is guaranteed to fall into the black hole. So the black hole eats a particle, but that particle’s energy is negative, so it’s a net loss for the hole. Meanwhile, if one particle had a negative energy, the other one must have had a positive energy. Most of the time, that one’ll end up in the hole, too, in which case nothing happens to the black hole, on net. But it’s possible for a particle with positive enough energy to escape from the vicinity of the hole, in which case the black hole doesn’t make up for the negative particle it ate. The escaping particle must have at least enough energy to make its rest mass, so massless particles like photons are overwhelmingly more likely to be produced, but in principle, anything can be.
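If it helps to see that bookkeeping laid out, here’s a toy sketch in Python. The energies are made-up illustrative numbers; the only real physics in it is that the pair’s total energy is zero and that the hole’s mass changes by E/c[sup]2[/sup] for whatever it swallows.

[code]
# Toy bookkeeping for Hawking radiation.  The energies here are made up;
# the only physics is that the pair's total energy is zero, and that the
# hole's mass changes by E/c^2 for whatever it swallows.

c = 2.998e8            # speed of light, m/s
E_pair = 0.0           # the pair came from nowhere: total energy is zero
E_infall = -1.0e-13    # infalling partner's total energy (negative), joules
E_escape = E_pair - E_infall   # escaping partner carries +1.0e-13 J away

# The hole swallows the negative-energy partner, so its mass changes by
# E_infall / c^2 -- a decrease.  The positive-energy partner escapes to
# infinity as Hawking radiation, carrying off exactly that much energy.
delta_m = E_infall / c**2
print(delta_m)   # about -1.1e-30 kg: the hole just got (very slightly) lighter
[/code]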
Personally, though, I prefer to think of Hawking radiation in terms of thermodynamics. You probably already know that all physical processes increase entropy, and that final states have higher entropy than initial states. Well, a black hole is pretty much the ultimate, as far as final states go, so one would expect a black hole to have a whopping great entropy (in fact, this entropy is proportional to the surface area of the event horizon, and it is, indeed, very large). But anything that has an entropy must have a temperature (which works out to a few tens of billionths of a kelvin, for a typical stellar-mass black hole), and anything that has a nonzero temperature must radiate (though of course something with a temperature that low radiates very little, and in fact would absorb much more from the cosmic microwave background radiation than it could emit).
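If you want to check those numbers, both come from the standard formulas: T = ħc[sup]3[/sup]/(8πGMk) for the temperature and S = kAc[sup]3[/sup]/(4Għ) for the entropy, with A the horizon area. A quick Python back-of-the-envelope (constants rounded to four digits):

[code]
# Hawking temperature and Bekenstein-Hawking entropy (SI units).
import math

G     = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c     = 2.998e8     # speed of light, m/s
hbar  = 1.055e-34   # reduced Planck constant, J s
k_B   = 1.381e-23   # Boltzmann constant, J/K
M_sun = 1.989e30    # solar mass, kg

def hawking_temperature(M):
    """T = hbar c^3 / (8 pi G M k_B): bigger holes are colder."""
    return hbar * c**3 / (8 * math.pi * G * M * k_B)

def bh_entropy(M):
    """S = k_B A c^3 / (4 G hbar), with A = 16 pi (G M / c^2)^2 the horizon area."""
    A = 16 * math.pi * (G * M / c**2)**2
    return k_B * A * c**3 / (4 * G * hbar)

M = 3 * M_sun
print(hawking_temperature(M))   # ~2e-8 K, far colder than the 2.7 K microwave background
print(bh_entropy(M))            # ~1e55 J/K -- a whopping great entropy indeed
[/code]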
So…that would mean that large-mass black holes will NEVER evaporate through Hawking radiation? I mean, never, until the temperature of the universe falls below the temperature of the black hole? Meaning a number of years with more zeros than you “scientists” can count?
I knew small black holes would evaporate pretty quickly, while large ones would evaporate very slowly…but from this it seems that the large ones will be gaining mass no matter what for the foreseeable future.
Yes, but the timescale for the Universe to cool that much is significantly shorter than the lifespan of a black hole. The lifespan of a 3-solar-mass black hole in a zero-temperature environment would be somewhere between 5.83*10[sup]67[/sup] and 3.12*10[sup]68[/sup] years, depending on a few as-yet-unknown details of neutrino physics, but my money’s on the larger of those numbers. And for the largest holes (in the vicinity of 10 billion solar masses), we’re talking 1.16*10[sup]97[/sup] years. So waiting for the Universe to cool off is a blink of the eye, as far as a black hole is concerned.
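Those figures scale as the cube of the mass. The crudest lifetime estimate, t = 5120πG[sup]2[/sup]M[sup]3[/sup]/(ħc[sup]4[/sup]), ignores which particle species get emitted (that’s where the neutrino uncertainty comes in), but it lands within an order of magnitude of the careful numbers and gets the M[sup]3[/sup] scaling right:

[code]
# Crudest evaporation-time estimate, t = 5120 pi G^2 M^3 / (hbar c^4),
# for a hole radiating into a zero-temperature environment.  It ignores
# which particle species get emitted (hence the neutrino-physics spread
# above), but the M^3 scaling and the order of magnitude are right.
import math

G     = 6.674e-11   # m^3 kg^-1 s^-2
c     = 2.998e8     # m/s
hbar  = 1.055e-34   # J s
M_sun = 1.989e30    # kg
YEAR  = 3.156e7     # seconds per year

def evaporation_time_years(M):
    return 5120 * math.pi * G**2 * M**3 / (hbar * c**4) / YEAR

print(evaporation_time_years(3 * M_sun))      # ~6e68 years
print(evaporation_time_years(1e10 * M_sun))   # ~2e97 years
[/code]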
Assuming a constant rate of expansion, it’d be only a few times 10[sup]18[/sup] years. But the expansion isn’t actually constant; it appears to be accelerating, so the time scale would actually be shorter than that, perhaps as low as a few times 10[sup]11[/sup] years. Either way, it’s negligible compared to a milligoogol years.
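Here’s the arithmetic behind both of those, as a sketch: the microwave background cools as 1/a(t), so the question is how long the scale factor a takes to grow by the factor T_now/T_hole. For the accelerating case I’ve assumed an e-folding time of roughly 1/H0 ≈ 1.4*10[sup]10[/sup] years, standing in for the details of the expansion history.

[code]
# When does the microwave background get colder than a ~2e-8 K black hole?
# The CMB temperature falls as 1/a(t), so we ask how long the scale factor
# a takes to grow by a factor of T_now / T_hole.
import math

T_now  = 2.7      # CMB temperature today, K
T_hole = 2e-8     # Hawking temperature of a 3-solar-mass hole, K
age    = 1.4e10   # rough current age of the Universe, years

growth = T_now / T_hole   # required growth factor, about 1.4e8

# Constant expansion rate: a grows linearly, so the wait is ~ age * growth.
print(age * growth)             # ~2e18 years

# Accelerating (roughly exponential) expansion, e-folding time ~ age:
print(age * math.log(growth))   # ~3e11 years
[/code]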
Chronos is of course correct, but here’s an old post of mine that gives, in my opinion, a more intuitive (and only slightly wrong) way of thinking about it.