Center of the Earth: Nuclear Reactor???

Okay, you guys are all gonna think I’m a total idiot for having any doubt about this, but I HAD to be sure because I can’t stand ever being wrong about anything. :stuck_out_tongue:

A “friend” of mine has some pretty odd ideas about the planet and how it works. He cites a guy named Thomas Chalko, who, near as I can tell, is a professional philosopher of sorts, peddling books with his own ideas gleaned from “40 years of research and 5 years of careful meditation” as he puts it.

Anyway, one of the stranger ideas he has tried to foist upon me is the idea that the center of the earth is actually some kind of Nuclear Reactor, and it’s in danger of exploding due to something humans have been doing to the earth for a long time. (He said global warming.)

My nuclear physics is a bit rusty, but I was under the impression that for nuclear fusion you need stuff like hydrogen, helium, and other gases, and for fission you need radioactive elements that have been treated or altered in some major way.

Is there any truth to his claims at all? Or was I right to tell him he should’ve laid off on the paint chips as a kid?

Any info you guys can give me would be much appreciated.

There is a minority belief that the very inner core of the earth is not iron, but rather uranium. See here for an overview and links:

http://geology.about.com/library/weekly/aa072102a.htm

It’s not a widely held belief, and though there’s no compelling reason to declare it untrue, there’s also no reason to declare it true. It doesn’t explain anything that conventional theories cannot.

But, even if true, there’s precious little we could do to it to alter its activities, for good or ill. So fear not an explosion due to global warming.

There is a small percentage of radioactive isotopes inside the Earth. These naturally decay and produce a good deal of the Earth’s interior heat.

These are not sustained nuclear chain reactions, however; just isolated atoms spontaneously decaying.

Here’s a nice little article about some new results regarding radioactive Potassium in the Earth’s core.

Nothing humans can do, even setting off all nuclear weapons at a selected spot, is going to affect anything anywhere close to the Earth’s core.

The “40 years of research” consisting of reading Superman comics, apparently.

If you want to have nuclear fission, you need a higher concentration of fissionable isotopes than is now available in uranium ores. Such concentrations have existed in the past, due to the difference in half-life between (fissionable) [sup]235[/sup]U and (non-fissionable) [sup]238[/sup]U. When they existed, there were natural nuclear reactors. However:
[list=1]
[li]The planet notably failed to explode[/li]
[li]The differentiation processes that produce ores (as distinct from regolith) do not and never have existed at the Earth’s core[/li]
[li]The differential half-lives mean that uranium ores have not been fissionable for a couple of eons (1 eon = 10[sup]9[/sup] years) now[/li][/list=1]
As for fusion, suffice to say that, even assuming that there is primordial deuterium in the core, conditions there are not and never have been remotely conducive to fusion.
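As a quick check on the differential-half-life point above, you can run the two decay clocks backwards from today’s 0.72% natural abundance of U-235. The half-lives are standard values; the rest is just the exponential decay law (this is an illustrative sketch, not anyone’s published calculation):

```python
import math

# Half-lives in years (standard values)
T_HALF_U235 = 7.04e8
T_HALF_U238 = 4.468e9

# Present-day fraction of natural uranium that is U-235
FRAC_U235_NOW = 0.0072

def u235_fraction(years_ago):
    """Fraction of uranium that was U-235 at a given time in the past.

    Both isotopes decay exponentially; running the clock backwards,
    U-235 (shorter half-life) grows back faster than U-238.
    """
    lam235 = math.log(2) / T_HALF_U235
    lam238 = math.log(2) / T_HALF_U238
    n235 = FRAC_U235_NOW * math.exp(lam235 * years_ago)
    n238 = (1 - FRAC_U235_NOW) * math.exp(lam238 * years_ago)
    return n235 / (n235 + n238)

for t in (0, 1e9, 2e9, 4.5e9):
    print(f"{t/1e9:.1f} Gyr ago: U-235 was {100 * u235_fraction(t):.1f}% of uranium")
```

Around two billion years ago the natural ratio works out to roughly 3–4%, comparable to the enrichment of modern light-water reactor fuel, which is why a natural reactor could go critical then but not today.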

The paint chips, definitely.

From what we know of natural radioactivity and compressed matter, it’s quite clear that the combination of intense pressure and heat is caused by two developments: (1) the mass of the Earth compressing itself, or more specifically its interior; and (2) the results of 4.6 billion years of natural radioactive breakdown, with only limited convective/radiative expulsion of heat. There appear to be four main contributing nuclides: Al-26, which has a sufficiently “short” half-life (~10[sup]6[/sup] years) to be completely gone now, but which was apparently a major constituent of the primordial mix, and the three long-lived (on the order of a billion-year half-life) nuclides familiar to most students of radioactivity: Th-232, U-238, and U-235.

It’s important to note that the last three are only minor constituents of the Earth, whether you’re talking crust, mantle, or core, but they have been consistently breaking down for 4.6 billion years, and save for the surface deposits, their heat-of-breakdown has had nowhere to go. Further, distinguish between the natural breakdown of single atoms taken as a totality, which is what we’re talking about here, and the potential for a chain reaction, of which only U-235 is capable among naturally occurring nuclides (AFAIK, ignoring the trace nuclides U-233 and Pu-239 formed from the other two), and which appears to have happened naturally only once in Earth’s history, and that as a near-surface occurrence (Oklo, Gabon, ~2 billion years ago).
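For a sense of the magnitudes, here’s a back-of-the-envelope tally of the heat those long-lived nuclides put out today. The specific heat productions and the bulk-silicate-Earth abundances (roughly 20 ppb U, 80 ppb Th, 240 ppm K) are approximate literature values, so treat the output as order-of-magnitude only:

```python
# Back-of-the-envelope radiogenic heat budget for the silicate Earth.
# Heat-production rates and abundances are rough literature values,
# used here purely for illustration.

BSE_MASS = 4.0e24  # kg, mantle + crust (core assumed to hold little U/Th/K)

# isotope: (heat production in W per kg of isotope, kg of isotope per kg of rock)
SOURCES = {
    "U-238":  (9.46e-5, 20e-9 * 0.9928),    # ~20 ppb U, 99.28% of it U-238
    "U-235":  (5.69e-4, 20e-9 * 0.0072),
    "Th-232": (2.64e-5, 80e-9),             # ~80 ppb Th
    "K-40":   (2.92e-5, 240e-6 * 1.17e-4),  # ~240 ppm K, 0.0117% of it K-40
}

total = 0.0
for name, (power, abundance) in SOURCES.items():
    watts = power * abundance * BSE_MASS
    total += watts
    print(f"{name:>6}: {watts / 1e12:5.1f} TW")
print(f" total: {total / 1e12:5.1f} TW")
```

The total comes out around 20 terawatts, with K-40 contributing very roughly a fifth, consistent with the potassium article linked earlier in the thread.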

The Sun is one giant nuclear (fusion) reactor, not the Earth. The Earth is warmed by radioactive decay, but only to a slight degree. Cite
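To put “only to a slight degree” in numbers, compare the Earth’s total internal heat flow (about 47 TW, of which radioactive decay supplies roughly half) with the sunlight the planet intercepts. These are round published figures, used here just for the comparison:

```python
import math

SOLAR_CONSTANT = 1361.0   # W/m^2 at Earth's distance from the Sun
EARTH_RADIUS = 6.371e6    # m
GEOTHERMAL_TW = 47.0      # total heat flow out of Earth's interior, TW

# Earth intercepts sunlight over its cross-sectional disc, pi * R^2
solar_watts = SOLAR_CONSTANT * math.pi * EARTH_RADIUS**2
ratio = (GEOTHERMAL_TW * 1e12) / solar_watts

print(f"Solar input:   {solar_watts / 1e12:.0f} TW")
print(f"Internal heat: {GEOTHERMAL_TW:.0f} TW")
print(f"Ratio:         {ratio:.5f}  (~{100 * ratio:.3f}% of sunlight)")
```

Internal heat works out to a few hundredths of a percent of the solar input, which is why it matters enormously for the interior but barely registers at the surface.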

It is interesting to note that the radioactive decay mentioned above is sufficient to increase the calculated age of the earth from maybe 100 million years to the actual 4+ billion. Lord Kelvin had calculated the maximum possible age of the earth on the basis of how much internal heat could have been generated by its gravitational collapse, and concluded that not enough time could possibly have passed for Darwin’s evolution to have taken place; the earth would already have cooled too much. So the radioactive decay is quite important, although I don’t think it has much effect on the surface temperature, only the temperature underground.
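Kelvin’s argument can be sketched with the standard conductive half-space model: a body cooling from an initial temperature shows a surface temperature gradient that shrinks as the square root of time, so the observed gradient implies an age. The input values below are illustrative round numbers, not Kelvin’s exact figures:

```python
import math

# Kelvin-style conductive cooling estimate (illustrative numbers only).
# For a half-space cooling from initial temperature T_m, the surface
# geothermal gradient after time t is G = T_m / sqrt(pi * kappa * t),
# so the apparent age is t = T_m**2 / (pi * kappa * G**2).

KAPPA = 1e-6        # thermal diffusivity of rock, m^2/s
T_INITIAL = 2000.0  # assumed initial interior temperature, K
GRADIENT = 0.025    # observed near-surface gradient, K/m (~25 K per km)

age_seconds = T_INITIAL**2 / (math.pi * KAPPA * GRADIENT**2)
age_years = age_seconds / 3.156e7  # seconds per year

print(f"Kelvin-style age: ~{age_years / 1e6:.0f} million years")
```

With plausible inputs this lands in the tens of millions of years, far short of 4.6 billion; the missing ingredient was radiogenic heating, which keeps the surface gradient high long after conduction alone would have flattened it.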

Here’s a bit (with pix) on the natural reactors in the Oklo uranium deposits of Gabon. IIRC, solution chemistry (water) brought about the concentration of [sup]235[/sup]U necessary for fission of the ore. You’d need an entirely different mechanism to get enrichment at the earth’s core.

I’m wondering if some of this comes from the fact that just about everyone learns in grade school or high school that Jupiter is “almost” a star and generates more heat than it receives from the sun. Some people are probably thinking, well, if Jupiter can, then so can the Earth, despite the fact that scientifically it’s all kinda silly.

If you want to have some fun with your friend, tell him that instead of exploding, the heat from global warming is going to cause the continents to melt and sink into the molten core of the earth, killing us all in a fiery inferno.

I see a Hollywood movie in this somewhere.

FYI, you don’t need hydrogen or helium for fusion. Stars do start fusing heavier elements towards the end of their lives. I found this site with a decent description of the process: http://www.herts.ac.uk/astro_ub/a40_ub.html

Maybe, but that doesn’t have anything to do with nuclear reactions either. It’s just primordial heat from Jupiter’s formation; i.e., it’s still cooling.

Uh, no. This was thought for a substantial time to be true, but the current belief among experts is that the reason is much the same as for Earth: breakdown of naturally occurring radioactive constituents, combined with gravitational contraction, which is a very minor element in Earth’s heat engine but substantial in Jupiter’s.

Sorry, I meant gravitational contraction, but I was under the impression that the heat generated by that was mostly during Jupiter’s initial formation, and wasn’t an ongoing process. I stand corrected.

Polycarp, note the article I linked to. They suggest that K-40 might account for a fifth of the heat produced by the Earth.


And, BTW, another separate issue: Jupiter is in no way special in producing more heat than it receives from the Sun; so does the Earth. I don’t believe it is suggested that a sustained reaction is even taking place inside Jupiter (until the monoliths arrive). So even if humans carved up Jupiter and added all its mass to the Earth, we still wouldn’t get a sustained nuclear reaction. Screwing up the climate is clearly not on a comparable scale to that.