I was reading about the Demon Core, a sub-critical plutonium core used in criticality experiments during and after the Manhattan Project. In the second of two accidents, a physicist named Louis Slotin was holding apart two beryllium hemispheres around the core with a screwdriver. His hand slipped and the core went supercritical. Slotin quickly knocked off the top hemisphere, saving the lives of the other men in the room, but not himself. What would have happened if he had not taken the action he did? I am guessing that the plutonium would have heated up, melted, and destroyed the beryllium reflector and become subcritical again, after, perhaps, setting the building on fire and releasing a shitload of radiation. Is that about right? Or would you get an explosion?
There’s no way that you could have gotten an actual nuclear explosion. At worst, you’d get a nuclear “fizzle” that melted the core and released a lot of radiation and radioactive contamination. It would have made a real mess.
As you surmise, I expect that the melting core would have quickly destroyed the reflector before much plutonium fissioned. Alternatively, the fissioning would have stopped when the core melted and spread all over the place. Again, a real mess.
Slotin did exactly the right thing, but it was pretty idiotic to set up the experiment in this manner in the first place. The experiment should have been set up so that if anything slipped, the reactivity *decreased* instead of increased (raising a reflector from the bottom instead of lowering it from the top, for example).
I figured you wouldn’t get a full-fledged explosion or making nuclear weapons would be pretty trivial, not to mention pretty unsafe, but how big would a fizzle be? Are we talking about a firecracker or a truck bomb?
Remember that the time required to move your hand and knock off the reflector is practically an eternity in nuclear fission. In a fission bomb, each generation (a “shake”) is around 10 nanoseconds. In a working bomb, it takes around 56 shakes (just over half a microsecond) to involve all the atoms. So my guess is that the core wouldn’t even have melted - the experiment was designed to explore the edges of bare criticality.
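To put numbers on that “eternity,” here’s a trivial back-of-envelope sketch using the figures above (10 ns per generation, ~56 generations; the ~0.1 s figure for a hand movement is my own assumption):

```python
# Back-of-envelope from the figures above: ~10 ns per generation
# ("shake"), ~56 generations to involve all the atoms in a working bomb.
shake_s = 10e-9                      # one generation time, in seconds
generations = 56
total_time = generations * shake_s   # time for the full chain
print(total_time)                    # 5.6e-07 s, just over half a microsecond

# An assumed ~0.1 s hand movement spans about ten million generations:
hand_move_s = 0.1
print(hand_move_s / shake_s)         # 1e7 generation times
```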
Criticality simply means that a nuclear reaction is self-sustaining: the average number of neutrons produced per fission event equals the average number lost (by leaving the system or being absorbed in a non-fission event). The “demon core,” on the other hand, went supercritical.
In any event, despite the fact that nuclear generations indeed progress rapidly, if neutrons are being lost just as rapidly, there is no rapid increase in power. What you have to look at is the neutron multiplication factor (k), a numerical value indicating whether a reaction is subcritical (k < 1), critical (k = 1), or supercritical (k > 1). The mere fact that nuclear generations progress rapidly doesn’t tell you anything about how rapidly the power is increasing.
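A minimal sketch of that point, assuming the simplest possible model, N(n) = N₀ · kⁿ (the k values below are illustrative, not measured):

```python
# Neutron population after n generations in the simplest growth model.
# Shows that k, not the generation time alone, sets the growth rate.
def population(k: float, n_generations: int, n0: float = 1.0) -> float:
    """Neutron count after n generations: N = n0 * k**n."""
    return n0 * k ** n_generations

print(population(1.0, 56))    # exactly critical: stays at 1.0 forever
print(population(1.02, 56))   # slightly supercritical: only ~3x growth
print(population(2.0, 56))    # bomb-like k: ~7.2e16, "all the atoms"
```

At k = 1 the population never grows, no matter how fast the generations tick over; the same 56 generations give wildly different outcomes depending on k.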
For example, nuclear reactors run on the same kind of fissile fuel as nuclear bombs, with the same rapid generation time, yet the rate of power increase is far lower. A nuclear reactor does not blow up when started up, even though, to increase power, a reactor must by definition be slightly supercritical.
What complicates this discussion is the distinction between so-called “prompt neutrons” and “delayed neutrons.” In the case of the demon core, not only did it go supercritical, its reactivity exceeded that of prompt criticality. So it was in fact rapidly increasing in power.
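A rough sketch of why delayed neutrons make reactors controllable, with assumed illustrative numbers (prompt generation time ~10 ns as above; a delayed-neutron fraction of roughly 0.2% for Pu-239, emitted seconds after fission):

```python
beta = 0.0021          # delayed-neutron fraction for Pu-239 (approximate)
prompt_gen = 1e-8      # prompt generation time, seconds (illustrative)
delayed_lag = 10.0     # assumed mean emission delay of delayed neutrons, s

# The mean generation time is a weighted average; the tiny delayed
# fraction dominates because its lag is ~10^9 times longer:
mean_gen = (1 - beta) * prompt_gen + beta * delayed_lag
print(mean_gen)        # ~0.021 s -- slow enough for control systems to act
```

Below prompt critical (excess reactivity less than beta), power changes on this slow delayed-neutron timescale; above prompt critical, as with the demon core, the prompt 10-nanosecond timescale takes over.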
As it turns out, what shut the reaction down was rapid heating of the core: thermal expansion reduced the reactivity back below critical. With enough heat being produced to actually expand the core, it seems not too far a stretch for the core to have also melted in the process.