Freshly spent fuel has had about 3% of its uranium fissioned, Pu and curium have been produced, and the lot is very hot in terms of radioactivity. But why does it need to be removed from the reactor? Isn't it still very hot thermally? Can it no longer sustain a critical mass?
You are correct - once the fuel is spent, you no longer have a critical mass, and therefore you can no longer make enough heat to run at 100% power.
Most civilian reactors start with 4% or 5% enriched uranium (natural uranium is about 0.7% U235). After the reactor has been in operation long enough, and your fuel is (for example) half-spent, the remaining fissile material (U235) is more dilute than it initially was. The neutrons created by one fission event are now less likely to hit another U235 atom (necessary in order to sustain the chain reaction), so power output decreases. When you can no longer sustain 100% rated power, you need to refuel.
Add to that, many of the fission products in the fuel rod are neutron absorbers, so as the fuel burns up, the overall neutron density in the core falls, reducing the reaction rate.
Finally, the fuel rods distort over time (gaseous fission products, thermal- and neutron-induced creep in the cladding). These changes interfere with the very precise environment required to sustain and manage a controlled nuclear reaction. Again, replacing rods is necessary to keep the thing working.
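The interplay between fuel depletion and poison buildup can be sketched with a toy calculation. Everything here is illustrative - the multiplication-factor formula, burn rate, and poison weighting are made-up stand-ins, not a real core model:

```python
# Toy burnup model (illustrative numbers only, not a real core calculation).
# k stands in for the effective multiplication factor: it scales with the
# remaining fissile fraction and is suppressed by accumulated
# neutron-absorbing fission products ("poisons").

def k_effective(fissile_frac, poison_frac, k_fresh=1.25, poison_weight=2.0):
    """Crude multiplication factor for a core in the given fuel state."""
    return k_fresh * fissile_frac / (1.0 + poison_weight * poison_frac)

fissile = 1.0   # fraction of the initial U-235 still present
poison = 0.0    # accumulated absorbing fission products (arbitrary units)
month = 0

while True:
    k = k_effective(fissile, poison)
    if k < 1.0:              # can no longer sustain the chain reaction: refuel
        break
    burned = 0.005 * fissile  # burn ~0.5% of the remaining fissile per month
    fissile -= burned
    poison += burned          # every fission leaves products behind
    month += 1

print(f"refuel after ~{month} months; U-235 remaining: {fissile:.2f}")
```

Note that refuelling is forced while most of the fissile load is still intact - the core stops being critical long before the fuel is used up, which is the point made above.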
zut:
Fuel is reprocessed. The rods are allowed to “cool off” so that the highly radioactive short half-life material can decay. The cladding is stripped off and discarded. The remaining fuel is dissolved in acid, chemically separated, and then re-enriched into new fuel. The problem is that there is plutonium in the mix, which has to be extracted and is not suitable for feeding back into the system. The chemicals used are toxic, corrosive, and radioactive. Materials cannot be allowed to accumulate, to prevent “criticality” incidents. All in all, a fairly risky, dangerous, and often poorly managed process. In general, not popular with anybody.
Other modern reactor types are more tolerant of fuel types, use a much higher proportion of the available fuel, and produce less long-term waste. On the downside, they require liquid sodium metal for cooling, so the potential for a catastrophic chemical/radiological accident is somewhat increased.
si_blakely:
It’s very hot, often near the melting point of the fuel, and because you can’t just shut the reactions off, they continue to generate heat. So you put it in a pool of water, separating the elements as far as necessary to prevent them from reacting with each other and wait for them to cool off. It’s just the cheapest and safest thing to do.
Reprocessing is every bit as nasty an operation as si_blakely indicates (nice summary, by the way) and because of the hazards and contamination risk there are fewer facilities for doing reprocessing than there were a couple of decades ago. It’s often not fiscally viable to reprocess waste, even when it can be safely recovered, so instead we plan to stick it in the ground in Nevada and hope it doesn’t leak into the water supply. :rolleyes: Other types of reactors produce less waste and are more efficient, but because we’re not authorizing new reactors in this country all of our reactor designs are at least three decades old and don’t reflect current advances in nuclear engineering.
They are talking about new reactors in the UK - but looking at the French Light Water Reactors. Seems a real shame not to look at the new technologies that have been developed.
I see it like this - we must reduce carbon emissions. To do that, we have to pay more for our power. That's it. The free lunch supplied by cheap oil is over. Now, we gotta pay.
Fusion is currently a nonstarter. I like renewables, but only tidal energy is consistent enough to form base supply, and requires massive engineering and infrastructure. Wind and solar require big intrusive infrastructure too, but are not consistent. You need almost double the required capacity because it is not all going to be generating, and you need some large-scale way of storing excess power when you do have it (pumping water up a hill, probably).
So nuclear is the remaining no-carbon option. It has risks, and long term waste issues, but modern techniques reduce that, and at least the waste is managed in some way, unlike radioactive waste from coal plants which is scattered to the winds. And even the environmentalists are beginning to see.
How much juice could you get by passing helium through the spent fuel and running it through a turbine? Also, why do you have to separate the elements to keep them from reacting if they are spent? Or do you mean reacting chemically? And if you do, don’t they react before they are removed from the core?
Fission reactors use two loops: a closed loop of “hot” coolant, and then an open loop that goes to the turbine. The two are coupled by a heat exchanger (and probably a regenerator) in order to keep the radioactive contamination isolated from the outside world. Helium is a pretty poor working fluid for a turbine system with a low or progressively reducing thermal differential; because of its low molecular weight it has little inertia unless highly compressed and heated. (For high temperature differentials, however, it’s excellent, as its near-ideal gas behavior lets you get close to ideal Carnot cycle efficiencies, and its lack of reactivity even at high temperatures gives little concern about chemical oxidation or erosion.)
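To put numbers on why the temperature differential matters so much, here is the ideal Carnot limit for two hypothetical hot-side temperatures. The temperatures are illustrative guesses, not figures for any particular plant:

```python
# Ideal Carnot efficiency: eta = 1 - T_cold / T_hot (temperatures in kelvin).
# Illustrative temperatures only: a PWR steam cycle runs somewhere near
# ~320 C, while a high-temperature gas-cooled design might reach ~900 C.

def carnot_efficiency(t_hot_c, t_cold_c):
    """Upper bound on heat-engine efficiency between two temperatures (Celsius in)."""
    t_hot, t_cold = t_hot_c + 273.15, t_cold_c + 273.15
    return 1.0 - t_cold / t_hot

print(f"PWR-like  (320 C hot, 30 C sink): {carnot_efficiency(320, 30):.0%}")
print(f"HTGR-like (900 C hot, 30 C sink): {carnot_efficiency(900, 30):.0%}")
```

These are theoretical ceilings, not achieved efficiencies, but they show why a hot helium loop is attractive and why a small or shrinking differential is not.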
You separate the elements because they’re still decaying and producing neutrons, which in turn cause fission chain reactions (i.e. neutrons are absorbed, causing more reactions, which release more neutrons, et cetera). By keeping the rods far enough apart that the neutrons from one rod don’t interact with the fuel elements of another rod, you starve the reactions of neutrons and they gradually decrease in incidence. The quickest way to do this would be to remove them from water entirely (the water acts as a moderator to slow the neutrons down and increase the likelihood of neutron capture), but this would also cause the elements to heat up and melt, making a pool of radioactive fuel and waste that could continue reacting and producing more non-useful isotopes. Since the fuel is so dangerous and difficult to handle, you put it in a pool and let it cool off, requiring minimal supervision and handling operations.
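The generation-by-generation bookkeeping above can be shown with a toy multiplication model. The k values below are arbitrary examples, not numbers for any real pool geometry:

```python
# Toy multiplication model: each "generation", the neutron population is
# multiplied by k, the effective multiplication factor. Keeping spent rods
# far apart lowers k; k < 1 means the population dies away geometrically,
# k = 1 is steady state, and k > 1 is runaway growth.

def population_after(n0, k, generations):
    """Neutron population after the given number of generations."""
    return n0 * k ** generations

n0 = 1_000_000
for k in (0.90, 0.99, 1.01):
    n = population_after(n0, k, 100)
    print(f"k = {k}: {n0} neutrons -> {n:.0f} after 100 generations")
```

The gap between k = 0.99 and k = 1.01 after only 100 generations is why spacing and moderation are managed so carefully: tiny changes in k swing the outcome between die-off and runaway.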
Running a fission reactor just isn’t like turning your car engine on and off; it’s more like filling a jar full of gasoline, lighting it with a match held out as far as you can, and using the resulting barely controlled burning plume to run a fan you’ve suspended above it, hoping that it doesn’t melt or catch on fire. You have to just let it keep going until the fuel burns out. It’s nasty, dangerous stuff, to be handled very carefully and only with as much redundancy and mitigation of hazards as possible. Don’t try this at home.
Q: What happened to the British AGR (Advanced Gas-cooled Reactor)? They use high-velocity helium or carbon dioxide gas to transfer the core heat. Expensive to build, but impossible to melt down. Has this technology been abandoned?
You could extract power from the decay heat, but it is not a significant level compared with an actual nuclear reaction.
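For a sense of scale, one common textbook approximation (Way-Wigner) estimates the decay-heat fraction remaining after shutdown. The two-year operating history below is an assumed figure, and the formula itself is only good to within a factor of two or so:

```python
# Way-Wigner approximation for decay-heat fraction after shutdown:
#   P(t)/P0 ~= 0.0622 * (t**-0.2 - (t0 + t)**-0.2)
# where t is seconds since shutdown and t0 is seconds of prior
# full-power operation. A rough textbook formula, not a precise model.

def decay_heat_fraction(t_s, t0_s):
    """Decay heat as a fraction of prior full power."""
    return 0.0622 * (t_s ** -0.2 - (t0_s + t_s) ** -0.2)

YEAR = 365.25 * 24 * 3600
t0 = 2 * YEAR  # assumed: two years at full power before shutdown

for label, t in [("1 hour", 3600.0), ("1 day", 86400.0), ("1 year", YEAR)]:
    print(f"{label:>7} after shutdown: {decay_heat_fraction(t, t0):.4%} of full power")
```

An hour after shutdown you are already down to roughly 1% of rated power, and after a year to hundredths of a percent - real heat that must be removed, but nothing like the output of the sustained chain reaction.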
Freshly spent fuel has had about 3% of its uranium fissioned and Pu and Curium have been produced
So your fuel rod still contains 97% of its uranium, including unburned fissile U235 - this is useful and you want to recover it. The Pu is also useful, but not in a typical power generation reactor - it has to be removed from the reprocessed fuel. The curium and other fission products have to be removed because they absorb neutrons and stop the reaction, or make the reaction unpredictable.
This is the chemical part of the process. During this phase, you have solutions with high concentrations of fissile uranium that may precipitate out. If too much uranium falls out of solution at the wrong time, you get a critical mass of fissile uranium - neutron density increases and you get a self-sustaining nuclear reaction. This generates heat, neutrons, and gamma rays - lethal to anybody nearby. Then you have to try to dilute the mass again to stop the reaction.
on preview - what Stranger said.
At Los Alamos, when they experimented with criticality they called it tickling the Dragon’s tail.
Sometimes the Dragon blows a little flame to let you know he does not like it, and you can make toasted marshmallows.
Sometimes he turns the entire building into an incandescent ball of plasma.
That is nuclear energy.
The most modern reactor in the UK is a Pressurised Water Reactor - an enriched-uranium light water reactor with a negative power coefficient and a closed primary loop.
One of the barriers to more esoteric reactor designs is the use of plutonium - no-one wants large amounts of plutonium in circulation.
There are a number of concerns with AGRs, the primary one being that using graphite as a moderator could be an extreme hazard should the reactor ever catch fire or be damaged. While a coolant leak from a pressurized water reactor (PWR) would be moderately dangerous, the radioactive steam would disperse quickly and ambient radiation levels would not be permanently affected. (See Three Mile Island.) In contrast, a core fire in a graphite-moderated reactor would result in release of radioactive [sup]14[/sup]C. This would combine with free oxygen to make CO[sub]2[/sub] and be absorbed by plants and animals, resulting in secondary production of hazardous, short- and moderate-life radioactive isotopes of calcium, strontium, and other elements that are readily absorbed into the body. (See the Windscale incident and the Chernobyl disaster.) Graphite is also considered undesirable as a moderator because of its propensity to change in material homogeneity due to the Wigner effect and annealing from heat. Since this is hard to model and predict over long operational lifetimes, current reactor designs have eschewed graphite as a moderator.
AGRs are far more efficient (but physically larger) because of the high temperature they operate at in comparison to PWRs. It isn’t true, however, that they can’t undergo meltdown; unlike the CANDU-type reactor, which is self-regulating in a potential overload case, an AGR could get to melting temperatures without active control and potentially go supercritical. Of course, the moderator would catch fire and burn up, but as noted above, this is scarcely an acceptable failsafe option. Also, they require highly enriched material, and because of the limited amount of configuration you can perform, they don’t tolerate variations in fuel composition well.
Current “advanced” reactor designs include various types of boiling water reactors, the High Temperature Gas-cooled Reactor, steam-moderated reactors, and the Pebble Bed Modular Reactor. Each has its benefits and disadvantages (the PBMR, for instance, is very scalable and easy to refuel, but uses a graphite moderator), but aside from developing nations with limited natural fuel resources, most of the world is pretty leery (and perhaps rightfully so) regarding nuclear fission for power generation.
After about 5 years or so in a cooling pond, the spent fuel is cool enough to be put into ‘dry cask’ storage. This is basically a large, reinforced steel & concrete ‘barrel’ of spent fuel rods. Here’s a photo of some casks. http://library.thinkquest.org/17940/texts/images/drycaskstorage.jpg
It is my understanding that PBMR reactors use a single loop system. It just occurred to me that the spent fuel will be giving off radon. Is that why you couldn’t operate it in a single loop system?
Radon - specifically [sup]222[/sup]Rn, the daughter product of [sup]226[/sup]Ra in the uranium decay series, and [sup]220[/sup]Rn from the thorium series - is a natural decay product. It’s a noble gas and basically nonreactive, so you can basically let the small amount that would be produced vent to atmosphere with little risk. (Radon is only hazardous when it accumulates in a contained environment like a cave or basement.)
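The half-lives make the point: Rn-220 is gone almost immediately, and even Rn-222 decays away within days. A quick sketch (the half-life values are the standard ones; the time points are arbitrary):

```python
# Exponential decay of the two radon isotopes: fraction remaining after
# time t is 0.5 ** (t / half_life).
# Half-lives: Rn-222 ~3.82 days (uranium series), Rn-220 ~55.6 s (thorium series).

def fraction_remaining(t_s, half_life_s):
    """Fraction of the original atoms still undecayed after t_s seconds."""
    return 0.5 ** (t_s / half_life_s)

RN222_HL = 3.82 * 86400   # seconds
RN220_HL = 55.6           # seconds

print(f"Rn-222 left after 1 day:  {fraction_remaining(86400, RN222_HL):.1%}")
print(f"Rn-220 left after 1 hour: {fraction_remaining(3600, RN220_HL):.2e}")
```

So any Rn-220 is effectively gone before it could travel anywhere, and Rn-222 only matters if it is trapped somewhere enclosed long enough to be breathed.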
The reason most reactors (including, most likely, an operational PBMR) use a closed coolant loop and an open turbine loop is because the contaminants in the water become activated by intense exposure to neutron flux. (It also makes it easier to keep the working fluid free of caustic substances which might degrade systems inside the reactor that are difficult and hazardous to service and inspect.)
Sure, there’s plenty of useful material. The fuel elements are pulled not because all of the enriched fuel is expended, but because the reaction products which are generated interfere with the efficiency of the process by absorbing neutrons, as si_blakely has already noted. But extracting that material is a very difficult and tedious task.
I understand why you need a two loop system with water, but why do you need it with helium?
What is the magic price point where it becomes economically viable to reprocess spent fuel rods? What is the price needed to make thorium attractive? BTW, are there any advantages to the thorium fuel cycle?
With regards to the earlier comments about graphite moderators, is there a way to make them safe? Since temperatures above 250 degrees C make the Wigner effect go away (according to the article cited above), and PBMRs are designed to run at temps about 1600 degrees IIRC, if the coolant were lost wouldn’t the temperature keep the pebbles from igniting?
This is a bit of a hijack, but it comes up all the time when discussing nuclear power: Why is plutonium so much more distasteful than uranium? You can make nuclear weapons out of uranium, too.
To isolate the reactor from the environment. Whether this is technically necessary or not is questionable, but it would probably be necessary from a political point of view. I’m not sure how this would impact the efficiency of such a device, but I can’t imagine that a heat exchanger using gaseous helium would be very efficient.
Beats me, beats me, and the advantage of thorium is its relative abundance. I don’t know much about thorium-based reactors (except for breeding cycles involving thorium) as the concept isn’t really being researched in the US, though I see according to the Wikipedia article that it’s considered a future goal for the Indian nuclear industry.
Graphite is “safe” as long as it doesn’t build up energy or catch on fire. However, the perceived danger of a burning graphite reactor (highlighted by the Windscale and Chernobyl incidents) will probably make the public suspicious of any graphite-moderated reactor regardless of its technical merit.
Weapons-grade uranium is in excess of 90% [sup]235[/sup]U; this level of refinement is horribly difficult to achieve given the problems in separating [sup]238[/sup]U and [sup]235[/sup]U. This is well and beyond any effort short of a large industrial power. (Normal enriched uranium suitable for power production is 3%-5% [sup]235[/sup]U.) Plutonium, however, can be of much lower purity owing to its greater reactivity, and so would be easier to weaponize. The disadvantage of [sup]239[/sup]Pu is that you have to use an implosion-type weapon rather than the gun-type weapon that the slower-reacting uranium allows; in practice, a nuclear weapon sufficiently compact to be launched on an IRBM or carried by a modern fighter/bomber aircraft (much less smuggled through borders) would have to be an implosion type, and the skill and computational power to do so is now readily available for an insignificant fraction of what it cost to do this work fifty years ago.
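The difficulty gap between reactor-grade and weapons-grade enrichment can be quantified in separative work units (SWU), using the standard value function for an ideal cascade. The feed and tails assays below are typical textbook assumptions, not plant-specific figures:

```python
# Separative work (SWU) per kilogram of enriched product, ideal cascade.
# Standard value function V(x) = (2x - 1) * ln(x / (1 - x)); mass balance
# gives feed and tails per kg of product. Assumed assays: natural-uranium
# feed 0.711% U-235, tails 0.3%.

import math

def value(x):
    """Value function for an enrichment assay x (fraction of U-235)."""
    return (2 * x - 1) * math.log(x / (1 - x))

def swu_per_kg_product(x_product, x_feed=0.00711, x_tails=0.003):
    feed = (x_product - x_tails) / (x_feed - x_tails)  # kg feed per kg product
    tails = feed - 1.0                                 # kg tails per kg product
    return value(x_product) + tails * value(x_tails) - feed * value(x_feed)

print(f"5% (reactor) fuel: {swu_per_kg_product(0.05):.1f} SWU/kg")
print(f"90% (weapons) HEU: {swu_per_kg_product(0.90):.1f} SWU/kg")
```

Per kilogram of product, 90% material costs well over twenty times the separative work of 5% fuel (and consumes nearly twenty times the natural-uranium feed), which is one concrete way of seeing why the high-enrichment route stays out of reach of anything short of an industrial-scale program.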