Well, don’t pop the champagne cork yet. First of all, a yield gain factor of Qp ~ 1.2 (indicating a 20% increase in fusion yield relative to the input energy), even if real (we’ll get to that in a moment), is not enough to sustain a nuclear fusion reaction. Any fusion process will have substantial losses just in how much of the energy can be recovered, even before you get into converting it into electricity or usable thermal energy. A Qp > 10 is generally assumed to be the minimum gain for a self-sustaining fusion reactor of any kind, so this is still almost an order of magnitude away from that threshold.
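To make concrete why something like Qp > 10 gets quoted as the floor, here is a back-of-the-envelope sketch; the efficiency numbers are purely illustrative assumptions, not measured values. The plasma gain has to overcome both the inefficiency of the driver that delivers energy to the fuel and the inefficiency of turning fusion yield back into electricity:

```python
# Rough engineering-gain estimate (illustrative assumptions, not measured values).
# q_plasma:   fusion yield / energy delivered to the fuel
# eta_driver: efficiency of the heating/driver system, wall plug to fuel (assumed, optimistic)
# eta_conv:   efficiency of converting fusion yield into electricity (assumed)

def engineering_gain(q_plasma, eta_driver=0.3, eta_conv=0.4):
    """Electricity out per unit of electricity drawn from the grid."""
    return q_plasma * eta_driver * eta_conv

for q in (1.2, 10, 50):
    print(f"Q_plasma = {q:5.1f} -> Q_engineering = {engineering_gain(q):.2f}")

# With these assumed (and generous) efficiencies, Q_plasma = 1.2 returns only
# ~14% of the electricity spent, and Q_plasma = 10 barely reaches break-even,
# which is why reactor studies look for gains well into the tens.
```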
Second, I can’t read the Financial Times article, but I’m assuming that this is either the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory (LLNL) or the Z Pulsed Power Facility (the “Z machine”) at Sandia National Laboratories (SNL). Although both are designed and used to research fusion conditions, neither is intended for, or capable of, producing sustained power. The primary mission of the NIF is actually nuclear weapon research and stockpile stewardship, although most of the papers you see coming from it are on stellar astrophysics, and the Z machine is really designed to study high-energy plasma dynamics. Both produce interesting physics and may be of some value to nuclear fusion power researchers, but neither is a concept leading to net usable energy production. In both cases, the yield produced is so transient that it is often difficult to get an accurate measurement of it, and if these are preliminary results I could easily expect a ~25% uncertainty on the estimated yield.
Assuming this is the NIF, this would be deuterium-tritium fusion (D-T). The products of this reaction are an alpha particle (ionized helium) with a kinetic energy of 3.52 MeV and a ‘fast’ neutron with a kinetic energy of 14.06 MeV. Recovering much of the energy of the alpha particle is pretty simple since it is a charged particle; you can just run it through an electrostatic grid, or through an induction loop, or whatever, and use it to directly produce electricity. The neutron, however, is not only electrically neutral but, unlike the moderated ‘thermal’ neutrons in a boiling water fission reactor, it is moving very fast and will damage the materials that form the vacuum chamber needed to contain the fusing plasma. There are proposed schemes to line the walls of such a chamber with materials that the neutrons could irradiate, either to produce fissile material for use as fuel in fission reactors or to breed tritium from 6Li (which you would actually want to do, because natural tritium is vanishingly rare). But those materials would also have to withstand the temperatures and radiation conditions in the vacuum chamber without interfering with the plasma, which is something the ITER project is actually intended to develop and test.
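For reference, the split of the roughly 17.6 MeV released by each D-T reaction between those two products follows directly from conservation of momentum (the lighter particle carries the larger share). A quick sketch:

```python
# Kinematic split of the D-T reaction energy between the alpha and the neutron.
# Treating the reactants as approximately at rest, momentum conservation gives
# each product a share of the ~17.6 MeV inversely proportional to its mass.

Q_DT = 17.59        # MeV released per D + T -> He-4 + n reaction
m_alpha = 4.0015    # He-4 nuclear mass in atomic mass units (u)
m_neutron = 1.0087  # neutron mass in u

E_neutron = Q_DT * m_alpha / (m_alpha + m_neutron)
E_alpha = Q_DT * m_neutron / (m_alpha + m_neutron)

print(f"neutron: {E_neutron:.2f} MeV, alpha: {E_alpha:.2f} MeV")
# -> roughly 14.1 MeV for the neutron and 3.5 MeV for the alpha,
#    matching the figures quoted above.
```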
Setting aside whether this is really an over-unity event, and the difficulty of converting energetic fusion yields into useful electrical or thermal energy, we often see claims that a successful nuclear fusion power generator will solve all energy scarcity and climate problems once a supposed breakthrough occurs. In practice, this is unlikely. For one, most plausible nuclear fusion power production systems are gigantic, multi-billion-dollar construction projects that take years (in the case of ITER, decades) to construct and bring online, and they end up being very prone to operational problems such as loss of vacuum, misalignment of magnetic coils (for magnetic confinement), difficulty in handling materials, et cetera. If a proposed solution costs US$10B per operational facility and requires thousands of highly trained engineers and technicians to operate and maintain it, it isn’t very scalable or practical for any but the most advanced nations.
Even if some method of nuclear fusion were developed that was highly scalable, e.g. Tony Stark’s “ARC reactor” (referred to in fusion circles as “fusion in a tuna can”), it doesn’t really solve the distribution and conversion problems in general. The general assumption is that nuclear fusion would be used to produce electricity, which is great for nations that have expandable and robust energy distribution infrastructure; however, few if any nations could switch their heating, manufacturing, and transportation systems to be fully electrified, even setting aside the costs of retrofitting buildings, developing new manufacturing methods, and deploying enormous fleets of battery electric vehicles or producing synthetic fuels for existing vehicles from that electricity. And fuel for fusion is not free; deuterium is readily available in ordinary seawater by fractional distillation, but at some cost, while tritium would have to be bred from 6Li as noted above, and that isotope makes up only a small fraction (about 7.5%) of natural lithium, which itself would require extraction from clay deposits or seawater at enormous expense.
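To give a sense of the scale of that fuel problem, here is a rough estimate (round numbers assumed for plant size; this is a sketch, not a design figure) of how much tritium a single D-T plant would burn each year, which is why breeding it in the reactor blanket is considered essential:

```python
# Rough annual tritium consumption for a D-T fusion plant
# (illustrative round numbers; 1 GW of *thermal* fusion power assumed).

MEV_TO_J = 1.602e-13                # joules per MeV
E_PER_REACTION = 17.6 * MEV_TO_J    # energy released per D-T reaction, J
TRITIUM_MASS = 3.016 * 1.6605e-27   # mass of one tritium atom, kg

power_thermal = 1.0e9               # assumed fusion power, watts
seconds_per_year = 3.156e7

reactions_per_year = power_thermal * seconds_per_year / E_PER_REACTION
tritium_kg_per_year = reactions_per_year * TRITIUM_MASS

print(f"{tritium_kg_per_year:.0f} kg of tritium per year")
# -> on the order of 55 kg/yr, against a worldwide civilian tritium
#    inventory generally estimated at only tens of kilograms; hence the
#    push to breed tritium from lithium in the blanket itself.
```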
In short, even if the Financial Times article is completely accurate (unlikely; if this were a verified event one would expect notes in Physical Review Letters, Journal of Fusion Energy, Nature, Science, et cetera), it doesn’t mean that practical nuclear fusion power generation is around the corner. At best it represents a technical milestone, one of many gates necessary to get to the point that fusion power production is a practical means of offsetting current power generation methods, and it is certainly not something we should be relying upon to ‘fix’ climate change or provide surplus global energy.
And you don’t have to take my word on it; physicist and skeptical science communicator Sabine Hossenfelder has a very salient short lecture on the topic:
And yet, it never goes out of style. Which tells you something about how we perennially underestimate the difficulty of producing sustained nuclear fusion in terrestrial conditions.
Stranger