I was excited to read about what appears to be a new development in earth-bound nuclear fusion power: the success of the Wendelstein 7-X “star in a jar” plasma containment unit. But when I read further (e.g., on Wikipedia), it seems that both this “stellarator” and the other type of fusion reactor, the tokamak, have been known technology since the 1950s! So my questions are:
a) Why haven’t we had fusion power for decades already, if these technologies are that old, and
b) What’s new about this Wendelstein stellarator that should be a cause for excitement?
a) As noted, it is very, very, very, very, very hard to contain plasma at a high enough temperature and pressure for any fusion to occur, let alone enough to produce positive net energy, let alone to get to the point where the cost per kilowatt-hour is less than current methods. (This stuff is never going to be free.)
But it is quite worthwhile to keep trying out new methods hoping that some significant new information will be found that will help nudge the problem forward.
b) All these research institutes and such have PR departments that love to hype anything (usually resulting in completely misleading press handouts). Yeah, the current project has reached an intermediate milestone on their checklist. A lot more things to check off still.
Combine these overhyped PR announcements with clueless media and stuff happens - e.g., headlines about “infinite energy”.
Special note: fusion is not radiation-free. There are going to be a lot of free neutrons and other crap spat out, which will hit the walls of the containment vessel and make the material radioactive over time. The resulting radioactivity isn’t as bad as what a fission reactor produces, but care will still have to be taken with building, maintaining and decommissioning any such device if we ever get to a production model.
The basic answer to a) is that the engineering is bloody difficult! Hot plasma does not want to be confined; it is inherently unstable. The dialogue PastTense has quoted gives some of the problems, and it is probably worth noting that in a tokamak stability improves with size, which is why Iter - the world machine being built in the south of France - is several times larger than the current largest tokamak, the European device JET. Iter is the machine referred to that should achieve 500 MW of fusion power.
On question b), as the previous answer says, a stellarator can, theoretically, operate continuously, whereas a tokamak pulses - the pulses can be long, up to 50+ minutes, but still not continuous. The trouble is that a large stellarator is incredibly complex to build. The electromagnetic coils that confine the plasma are all complex shapes weaving around the vacuum vessel - an engineering nightmare - which is why W7X has taken so long to bring to fruition, although it is not that large compared to Iter or even JET. See the picture here!
Totally agree with **ftg** - do not trust the PR hype! They are all competing for funding. That’s not to say fusion is not worth pursuing. If it can be made to work, it really is a brilliant power source for the late 21st century and beyond. The fuel is readily available, it does not produce CO2, it is inherently safe in that the reaction cannot run away, and, yes, it does produce radioactive waste, but waste that will decay in tens or hundreds of years rather than tens of thousands - a much easier engineering challenge.
Should have added: fusion may never become a viable power source - the engineering challenges may make it uneconomic or there may be physics problems that cannot be solved - but given the possible benefits it is worth the punt. The cost of Iter, although a lot in terms of science budget, is tiny compared to the amount spent on subsidising renewables or searching for oil and gas and it is spread across half the world’s population.
Something to think about: fusion isn’t “easy”, even in the core of a star. For a star the size of the Sun, it takes several billion years to fuse all the hydrogen in its core - at any given moment, only an infinitesimally tiny percentage of the hydrogen is fusing. (The reason a star is so hot is that stars are so massive that even that infinitesimally tiny percentage is still a hell of a lot of hydrogen.) If fusion were easy, a star would burn up its fuel very quickly, and stars would be gigantic bombs instead of gigantic furnaces.
In light of that, it is very impressive that there is even a possibility of humans harnessing fusion for controlled energy production on Earth.
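To put rough numbers on how gentle the Sun’s burn actually is, here’s a back-of-envelope sketch - the figures are round textbook values of my own, not anything from the article:

```python
# Back-of-envelope: how slowly does the Sun actually burn its hydrogen?
# All values are round textbook numbers.

L_SUN = 3.8e26        # solar luminosity, watts
M_SUN = 2.0e30        # solar mass, kg
CORE_FRACTION = 0.1   # rough mass fraction that gets hot enough to fuse
E_PER_KG = 6.4e14     # joules released fusing 1 kg of H to He (~0.7% of mc^2)

fuel_kg = CORE_FRACTION * M_SUN
burn_rate = L_SUN / E_PER_KG                       # kg of hydrogen fused per second
fraction_per_year = burn_rate * 3.15e7 / fuel_kg
lifetime_years = fuel_kg / burn_rate / 3.15e7

print(f"hydrogen fused: {burn_rate:.1e} kg/s")
print(f"fraction of the core fuel burned per year: {fraction_per_year:.1e}")
print(f"core-hydrogen lifetime: {lifetime_years:.1e} years")
# ~6e11 kg/s sounds enormous, but it's only ~1e-10 of the core's fuel per year,
# giving a lifetime of order ten billion years.
```

Which is the point: the Sun gets away with an absurdly low fusion rate only because it has an absurd amount of fuel.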
One tweet said the device was switched on last December. Meanwhile… just started operation?
“It aims to show that earlier weaknesses in the design have been addressed” - a polite way of saying “let’s see if this one works”.
Also “plasma parameters approaching those of a future fusion power plant” - in other words, this still isn’t a functional power plant design.
Giant superconducting coils, cooled to close to absolute zero (liquid hydrogen? helium??), with a plasma at stellar temperatures nearby. What could happen if superconducting coils carrying a large current lose their superconductivity and start to exhibit resistance?
As others point out, “no radioactive waste” really means “very little compared to a uranium nuclear fission plant”. In fact the lack of neutrons was one of the discrepancies that made the cold fusion claims difficult to believe.
The problem is that a continuous fusion reactor that can keep producing more energy than it takes in, for an arbitrarily long period, has not yet been designed. It always seems to be 10 or 20 years away. Then comes the engineering challenge of taking that reactor and making it actually generate power from the energy coming out.
It’s an engineering problem. You need to create the conditions that will produce a fusion reaction while containing that reaction at an acceptable level. So far, attempts to create those conditions have consumed more energy than the resulting reaction produced.
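To put a number on that: the usual yardstick is the gain factor Q = fusion power out / heating power in, and even Q = 1 (“scientific breakeven”) is nowhere near a working power plant once you account for how inefficiently wall-plug electricity becomes plasma heating, and fusion heat becomes electricity again. A rough sketch - the efficiency figures here are assumptions for illustration, not real machine numbers:

```python
# Illustrative only: why "scientific breakeven" (Q = 1) is not net electricity.
# The efficiency numbers below are assumptions for the sake of the arithmetic.

def net_electric_mw(p_heating_mw, q, eta_heating=0.5, eta_thermal=0.4):
    """p_heating_mw: heating power delivered to the plasma (MW)
    q:             fusion gain, P_fusion / P_heating
    eta_heating:   wall-plug electricity -> plasma heating (assumed)
    eta_thermal:   fusion heat -> electricity (assumed)
    Returns net electrical output in MW, ignoring all other plant loads."""
    p_fusion = q * p_heating_mw
    electricity_out = eta_thermal * p_fusion
    electricity_in = p_heating_mw / eta_heating
    return electricity_out - electricity_in

for q in (1, 5, 10, 30):
    print(f"Q = {q:>2}: net {net_electric_mw(50, q):+.0f} MW electrical")
# With these assumed efficiencies, Q = 1 is deeply negative and Q = 5 only breaks
# even; something like Iter's target (Q = 10: 500 MW of fusion from 50 MW of
# heating) is where the numbers start to resemble a power plant.
```

(Iter itself won’t turn that heat into electricity - it’s a research machine - but the ratio is the point.)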
Actually, I think it also means ‘very little compared to a coal plant’, though I can’t find a cite offhand. Burning coal releases a significant amount of radiation into the environment, and I’m pretty sure fusion waste is much less than a typical coal plant.
Burning coal doesn’t release a significant amount of radiation into the environment. However, radioactive materials occur naturally, and burning coal basically burns off most of the other stuff leaving a significantly higher concentration of radioactive elements in the ash. If this ash isn’t properly contained, it leaks out into the environment, and coal ash is hard to contain. As a result, the areas around coal-fired plants do tend to have higher levels of radioactive contamination than the area around a typical nuke plant (with obvious exceptions like Chernobyl and Fukushima).
EPA regulations regarding fly ash are a lot more strict than they used to be.
Other than the dozens of tons of the actual reactor containment. Neutron activation can be a bitch when the entire containment vessel becomes actively radioactive (in many cases, with fairly frisky gamma output).
That’s another engineering challenge: a safely disposable containment arrangement so that the entire power plant building doesn’t become a radiologic hazard for decades to come.
Tritium is indeed used for a number of glow-in-the-dark applications. If you have a watch with glowing hands, and the hands keep glowing all night, and it was made any time in the past several decades, it uses tritium.
One of my favourite fun facts is that the peak power generation density of the solar core is about the same as that of an active compost heap. There’s just soooooo much compost.
Which puts fusion plants in some context. They are trying to make something with a power density several orders of magnitude larger than that of the solar core.
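For anyone who wants to sanity-check that compost comparison, here’s a rough back-of-the-envelope with round numbers of my own (not a real solar model):

```python
# Rough check of the "solar core ~ compost heap" power-density claim.
# Round numbers, not a precise solar model.

L_SUN = 3.8e26              # total solar output, watts
R_SUN = 7.0e8               # solar radius, metres
CORE_RADIUS = 0.2 * R_SUN   # most fusion happens within roughly the inner 20%

core_volume = (4.0 / 3.0) * 3.14159 * CORE_RADIUS ** 3
avg_power_density = L_SUN / core_volume   # watts per cubic metre, core average

print(f"core volume: {core_volume:.1e} m^3")
print(f"average core power density: {avg_power_density:.0f} W/m^3")
# A few tens of W/m^3 averaged over the core (a few hundred right at the centre) -
# the same ballpark as an actively composting heap.
```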
We have been 20 years away from fusion power for 40 years. But it is an engineering problem, so it is solvable - eventually… and who knows whether, by the time it gets solved, we’ll have come up with something else altogether.
Last I checked, the current best estimates are more like 35 years away, and that’s only if funding holds steady for that whole time, which it probably won’t.