Nuclear-fusion power plants

That is very true, but it was pointless. He banned reprocessing within the United States, while reprocessing was going on in Russia, China, France, the UK, Japan and India. Carter foolishly thought that the ‘example’ set by the US would cause those other countries to abandon reprocessing, but all he managed to do was cripple the US nuclear industry.

I’m not sure that’s 100% true. Yes, if the coolant is a/the moderator, the absence of coolant very quickly (within seconds) stops the nuclear chain reaction.

That just leaves all the stored thermal energy in the reactor parts, plus whatever decay heat is generated as the various isotopes continue to decay at their various rates. But absent coolant carrying away both sorts of heat, the temperature of the no-longer-critical fuel and cladding mass will rise and rise. Perhaps to the point of melting, deforming, slumping, cracking, etc.
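
For a rough sense of how fast that rise can be, here’s a back-of-the-envelope sketch (all values are assumed round figures for a large power reactor, not from any particular design):

```python
# Rough adiabatic heat-up rate of an uncooled core.
# Assumed illustrative values, not from any specific reactor design.
decay_power_w = 20e6        # ~1% of a 2 GWt core, some hours after shutdown
fuel_mass_kg = 100_000      # ~100 t of UO2 fuel
specific_heat = 300         # J/(kg*K), approximate for UO2 at high temperature

heatup_rate = decay_power_w / (fuel_mass_kg * specific_heat)  # K per second
print(f"Heat-up rate: {heatup_rate:.2f} K/s "
      f"(~{heatup_rate * 3600:.0f} K per hour)")
# -> roughly 0.67 K/s, on the order of 2400 K per hour if nothing carries
#    the heat away, which is why melting is a real possibility.
```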

As well, some of these materials are chemically reactive at high temperature; notably, the zirconium cladding reacts with steam, with hydrogen gas as a reaction product. So after a while a conventional chemical explosion of that evolved hydrogen occurs and splatters the molten, or at least softened, used-to-be fuel and other reactor components all around the immediate area. Now that is a right mess to clean up.
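
For scale, the cladding reaction is Zr + 2 H2O -> ZrO2 + 2 H2, and the gas volumes involved are substantial; a quick sketch with standard molar values:

```python
# Hydrogen evolved by the zirconium-steam reaction: Zr + 2 H2O -> ZrO2 + 2 H2
MOLAR_MASS_ZR = 91.22   # g/mol
MOLAR_VOLUME = 22.4     # L/mol for an ideal gas at 0 degC, 1 atm

def h2_volume_per_kg_zr():
    moles_zr = 1000 / MOLAR_MASS_ZR      # moles of Zr in 1 kg
    moles_h2 = 2 * moles_zr              # 2 mol H2 per mol Zr
    return moles_h2 * MOLAR_VOLUME       # litres of H2 at STP

print(f"{h2_volume_per_kg_zr():.0f} L of H2 per kg of zirconium oxidized")
# -> roughly 490 L/kg, so tonnes of hot cladding can fill a building
#    with an explosive hydrogen-air mixture.
```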

It might be possible to still get useful power from a reactor designed so that residual thermal energy and radioactive decay heat could not melt or deform any of the components with nothing but radiative cooling available. But I don’t know whether any actual or proposed reactor is designed that way.

Bottom line: coolant moderation has a lot of safety benefit. Hooray for that. But it’s not (necessarily) a cure-all for meltdown or other catastrophic reactor failure modes.

Well, I think it’s protection against a ‘meltdown’ in the way it can happen in a graphite-moderated reactor. But as you say, there are other accident modes, including the hydrogen explosion that happened at Fukushima.

Still, a meltdown is a disaster of a higher order, and modern reactors are designed to prevent it.

What do you mean? There have definitely been meltdowns of reactors that use the coolant fluid as a moderator, as in complete melt of the core with container escape.

Granted these events weren’t as bad as they might have been with a different design, but I’m not sure how you arrive at suggesting that they cannot melt down.

Former nuclear submarine officer here…

Immediately after shutting down the [fission] nuclear reaction, decay heat is roughly 6-7% of the pre-shutdown power level (assuming the reactor was operating at that power for an extended period of time). So if the reactor was operating at 100% power prior to the shutdown, residual decay heat would initially be several percent of the full thermal output of the reactor. It then decreases from there over a period of days.
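
For anyone who wants numbers, the Way-Wigner correlation is a common textbook approximation for this curve (the constants are approximate; treat this as a sketch, not a safety analysis):

```python
# Way-Wigner approximation for decay heat after shutdown:
#   P(t)/P0 ~= 0.066 * (t**-0.2 - (t + T)**-0.2)
# where t is seconds since shutdown and T is seconds of prior operation.

def decay_heat_fraction(t_s, operating_time_s=1.0e8):
    """Decay power as a fraction of pre-shutdown power (long prior operation)."""
    return 0.066 * (t_s**-0.2 - (t_s + operating_time_s)**-0.2)

for label, t in [("1 second", 1), ("1 minute", 60), ("1 hour", 3600),
                 ("1 day", 86_400), ("1 week", 604_800)]:
    print(f"{label:>9}: {decay_heat_fraction(t) * 100:.2f}% of full power")
# -> ~6.5% at shutdown, ~1% after an hour, still a fraction of a percent
#    after days. Small percentages, but of a very large number.
```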

In the absence of cooling measures, this amount of heat is generally sufficient to cause a meltdown, which is why reactors need some method of removing this decay heat even after shutdown.

A reactor can be damaged by residual heat, but the traditional ‘meltdown’ scenario I was thinking of is one where the reactor continues to run out of control without coolant because the moderator is still inside the pile keeping it reacting. The ‘China Syndrome’ was a hysterical depiction of that. It was this type of meltdown I was talking about.

That said, ‘meltdown’ appears to be a much more expansive term referring to any reactor core damage through excessive heat. In which case, liquid moderators don’t guarantee that you can’t have a ‘meltdown’.

Though it is possible to design reactors that either remove the residual decay heat with passive systems or simply aren’t affected by it. New reactor designs tend to eschew systems that require generators, electric pumps, etc. (for good reason).

Technically, the ultimate source of every pile of nuclear waste is in fact a really big fusion reactor some lightyears away that went kablooey billions of years ago.

If our fusion reactors end up going nova, we might have a bigger problem than nuclear waste.

Question about this timeline to meltdown: is this based on the type of fuel used in a nuclear submarine, which (from what I have heard) is closer to weapons grade than the fuel used in civilian reactors, in a much smaller containment vessel? Or does that apply across the board regardless of fuel strength and containment vessel size?

If you’re going to be that technical, it was actually more than one such kablooey. More like a few million. And don’t forget to pronounce “billions” as beel-yuns.

It is literally impossible for a traditional light-water reactor (the type commonly used in the West, including the pressurized design at Three Mile Island and the boiling design at Fukushima) to melt down in the way you describe in your first paragraph, because the coolant is the moderator (which is necessary for the fission reaction to continue). So if you lose your coolant, you lose the moderator, which stops the fission reaction. Pretty ingenious, IMHO.
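
A toy way to see why losing the moderator kills the chain reaction (a deliberately crude sketch; the multiplication factors are made-up illustrative numbers, not real reactor physics):

```python
# Toy neutron-generation model: each generation the neutron population is
# multiplied by k, the effective multiplication factor. In a light-water
# reactor, losing the water means losing the moderator, so fewer neutrons
# are slowed to the thermal energies where U-235 fission is likely, and k
# drops below 1. The k values here are invented for illustration.
def population_after(generations, k, n0=1.0):
    return n0 * k**generations

for k in (1.00, 0.90):   # critical vs. illustrative no-moderator value
    print(f"k={k:.2f}: after 100 generations, "
          f"{population_after(100, k):.2e} of the original population")
# Generations are a fraction of a millisecond apart, so with k < 1 the
# fission rate collapses in well under a second. The decay heat, of
# course, remains.
```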

(Which makes it all the more remarkable that the design of the Soviet-era RBMK reactors used at Chernobyl instead used graphite as the moderator and water as the coolant. This eliminated the fail-safe feature that is inherent to Western light-water reactor designs.)

With that said, once a reactor has already melted down, there were fears that the resulting molten mess could potentially restart the fission reaction (because the geometry is completely different from the original design) and continue melting to “China.” But this seems exceedingly unlikely. In any event, it’s better to design the reactor so that it doesn’t melt down in the first place.

The safest fission design may be the EBR-II at INL: a sodium-cooled fast reactor with metallic fuel. If the cooling fails (stops or goes dry), the fuel and coolant heat up and thermally expand, which reduces reactivity until the core can no longer maintain the critical state and the reaction shuts itself down.

This design never got support outside the lab (EBR-II famously demonstrated this passive shutdown in loss-of-cooling tests in 1986) for unspecified reasons (which I imagine probably had to do with an inability to generate the most desirable plutonium isotopes for weapon building).
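
A minimal sketch of that feedback loop, with invented coefficients purely to show why the power settles rather than runs away (not a model of the real plant):

```python
# Toy model of negative temperature feedback: power raises temperature,
# temperature above nominal pushes reactivity negative, which cuts power.
# All coefficients are made up for illustration.
ALPHA = -1e-4     # reactivity change per degree above nominal (negative!)
HEAT_LOSS = 0.02  # fraction of excess temperature shed per step
T_NOMINAL = 500.0 # degC

power, temp = 1.0, T_NOMINAL
for step in range(2000):
    reactivity = ALPHA * (temp - T_NOMINAL)
    power *= (1.0 + reactivity)               # reactivity scales power
    temp += power * 0.05                      # power heats the core
    temp -= HEAT_LOSS * (temp - T_NOMINAL)    # some heat leaks away
    if step % 500 == 0:
        print(f"step {step:4d}: power={power:.3f}, temp={temp:.1f}")
# Power cannot run away: any temperature rise drives reactivity negative,
# so the system settles instead of escalating. That is the essence of the
# "walk-away safe" behavior demonstrated in the 1986 EBR-II tests.
```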

Going way back to 9th grade Earth Science, my lab partner and I were tasked with debating Nuclear Fusion against two other students. We got the “con”/against-Fusion side of things. Yet when I did my research on Fusion (this was circa 1979: Three Mile Island and “no nukes”) I thought “This is amazing shit! Using the same processes that stars, our Sun included, use! Nearly free power for everyone!”

Our counter-argument wasn’t that this tech will be forever 20 years away (dream on, Fusion believers) but that this is the stuff of H-bombs: it’ll be Bikini Atoll in your backyard when it inevitably goes wrong, and even then it’ll somehow be weaponised (or whatever phrase was in use for that sort of thing at the time). Things like rail-guns and such weren’t part of our debate yet, but the conclusion was that Fusion is terrible & bad.

Our team won, so Fusion was banned for the day. I know Fission went down miserably. Dunno about the pro/con debate over Solar (that’s 20+ years away!). Wind power? It will break the environment and ruin the aesthetic appeal of Scottish golf courses.

I remain “all for” fusion power plants. Not so much the rail-guns. If they go nova, that will be spectacular!

DT magnetic confinement fusion power systems (I’ve been told “reactor” is out) require some amount of tritium inventory, and neutron capture will cause structural components to become radioactive. So if hit by a meteor, that stuff gets tossed about. But there’s no mechanism for a self-sustaining reaction that could get out of control.
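
For reference, standard two-body kinematics for the D-T reaction shows why the neutron, and hence activation of the structure, carries most of the energy (the masses here are rounded mass numbers):

```python
# D + T -> He-4 + n releases ~17.6 MeV. For a two-body breakup, momentum
# conservation splits the energy inversely to the masses: the lighter
# neutron carries the larger share.
Q_MEV = 17.6
M_ALPHA, M_NEUTRON = 4.0, 1.0   # mass numbers, good enough here

e_neutron = Q_MEV * M_ALPHA / (M_ALPHA + M_NEUTRON)   # ~14.1 MeV
e_alpha = Q_MEV * M_NEUTRON / (M_ALPHA + M_NEUTRON)   # ~3.5 MeV
print(f"neutron: {e_neutron:.1f} MeV, alpha: {e_alpha:.1f} MeV")
# ~80% of the energy leaves as a 14.1 MeV neutron that the magnetic field
# cannot confine; it buries itself in the first wall and blanket, which is
# what makes those components radioactive over time.
```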

It’s not really a timeline to meltdown; instead it’s a time range in which a meltdown due to decay heat is possible without some means of removing the residual decay heat. It is an artifact of the many decay chains of the fission byproducts from fissioning U-235 and would be a concern regardless of the type of reactor. That said, how the decay heat is removed does depend on the reactor design, and it’s even possible to design a reactor such that the decay heat is removed passively (using natural convection instead of pumps). Of course you still need a heat sink. This could be a large water body or the atmosphere (as in the iconic cooling towers frequently used for power plants). I’m not going to address the specifics of nuclear submarine design other than to state the obvious: a submarine in the ocean is surrounded by a very large heat sink (i.e. the ocean itself).
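
To put a rough number on the heat-sink requirement, one can integrate the same Way-Wigner approximation from the earlier sketch over the first day (assumed round figures for a large civilian plant):

```python
# Integrate Way-Wigner decay heat over the first 24 hours for a 3 GWt core
# and express the result as boiled-off water. Illustrative numbers only.
P0 = 3.0e9            # pre-shutdown thermal power, W
LATENT_HEAT = 2.26e6  # J/kg to boil water already at 100 degC

def decay_heat_fraction(t_s, operating_time_s=1.0e8):
    return 0.066 * (t_s**-0.2 - (t_s + operating_time_s)**-0.2)

energy_j, dt = 0.0, 10.0
t = 1.0
while t < 86_400:
    energy_j += P0 * decay_heat_fraction(t) * dt
    t += dt

print(f"decay energy, first day: {energy_j / 1e12:.1f} TJ "
      f"(~{energy_j / LATENT_HEAT / 1000:.0f} tonnes of water boiled off)")
# -> a terajoule-scale heat load in the first day alone, hence large water
#    pools, cooling towers, or (for a submarine) the ocean itself.
```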

Thanks!

That is definitely good news. :slight_smile:

It is kind of a mixed bag, though.

With fission, we refine the uranium and put it in the core. The fuel naturally burns itself, producing the heat that we draw off for power. The control rods harmlessly absorb the neutrons when they are inserted into the core and allow the fuel to burn when they are pulled out. The useful heat we extract prevents the fuel from melting itself, and the control rods allow us to get the fuel to stop burning.

Fusion fuel, by contrast, does not naturally burn. We have to squeeze really, really hard on it, in very precise ways, to get ignition. Like most material fluids, the fuel is not so keen on getting squeezed that hard, so it fights back. All we have to do is stop squeezing and it settles down.

The point is, the only outside energy we need to put into a fission reactor is making the fuel (and the reactor), which is kind of difficult but not all that energy intensive, from a net output perspective. We just set it up and it burns on its own. Fusion requires an immense run-time energy input to produce energy output, which is why it does not pose any real system failure hazards.
The reason we do not have fusion power plants is that we have not figured out how to do the big squeeze in a way that yields significantly more energy out than the amount we need for the squeeze. The run-time yield for fission is just about 100% (no energy goes into the pile), while the net yield for fusion has never topped zero (the NIF test yielded more energy out than the laser energy delivered to the target, but not more than the energy the lasers drew from the grid to make that input).
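
The NIF shot makes the distinction concrete (figures as widely reported for the December 2022 experiment; the grid-draw number is an approximate press figure):

```python
# Target gain vs. wall-plug gain for the December 2022 NIF ignition shot.
# Publicly reported figures; the grid draw is an approximate press number.
laser_on_target_mj = 2.05   # laser energy delivered to the hohlraum
fusion_yield_mj = 3.15      # fusion energy released
grid_draw_mj = 300.0        # rough energy the laser system pulled from the grid

target_gain = fusion_yield_mj / laser_on_target_mj     # ~1.5: "ignition"
wall_plug_gain = fusion_yield_mj / grid_draw_mj        # ~0.01: far from net power
print(f"target gain: {target_gain:.2f}, wall-plug gain: {wall_plug_gain:.3f}")
# A power plant needs the second number comfortably above 1, after also
# converting fusion heat back to electricity at maybe 40% efficiency.
```
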
Fusion does have the theoretical potential to produce a lot of power from a small amount of fuel, if done right. But that potential remains highly theoretical. The NIF test put the fuel in an elaborate hohlraum in order to generate a brief microscopic thermonuclear blast – which, naturally, meant that the hohlraum was destroyed.
As we have been trying to do it, fusion requires carefully constructed hardware that will be subjected to an absolutely brutal environment – mostly heavy neutron radiation, but other types of radiation as well. The tolerances within which the hardware has to perform are very demanding: it is not altogether obvious that we could build fusion reactors (assuming the possibility), which will not be inexpensive, that will retain structural/functional integrity for long enough to produce meaningful output before they fall apart.

Maybe fusion could save us. I am guessing not. It seems like what we need it for is to produce ever increasing amounts of sloth, garbage and confusion, which are things that we should be solving, not accelerating. And, of course, the experts focused on getting us to fusion are not exploring other possibilities (I have no idea what else might be possible because no one has been looking over there).