Explain the fission energy percentage in an H-bomb to me

The article Nuclear weapon design - Wikipedia explains that a large percentage of an H-bomb's yield comes from fission rather than fusion. Does this mean that if, let's say, we have a 4 Mt bomb (or whatever the normal yield is for a modern anti-city weapon) where 50% of the energy comes from fission, we could dismantle it, build a pure A-bomb from the constituent fission material, and get that implied 2 Mt yield? Or is the percentage of the fission material that actually fissions significantly higher in an H-bomb because of the extra neutrons, so that in essence an H-bomb burns its fission material much more efficiently?

How does the percentage of energy extracted from fission material in an H-bomb compare to the percentage extracted through the breeder-reactor / reprocessing technological chain?

Much of the fission energy in a “dirty” H-bomb comes from the fast fission of the U-238 tamper. In an A-bomb, very little U-238 will undergo fission. However, in an H-bomb, there are plenty of fast neutrons zipping around, and they will cause U-238 to undergo fission.
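To put rough numbers on that, here's a back-of-the-envelope sketch in Python. It uses only two well-known figures, roughly 17.6 MeV per D-T fusion and roughly 200 MeV per U-238 fast fission, and treats the fraction of 14 MeV fusion neutrons that actually fission the tamper as a free, purely illustrative parameter; it ignores secondary fission neutrons and the energy from Li-6 tritium breeding, so take it as a sketch of the bookkeeping, not a weapon model.

```python
# Back-of-the-envelope sketch (not a weapon model): why fast fission of a
# U-238 tamper can supply a large share of a thermonuclear yield.

E_FUSION_DT = 17.6      # MeV released per D-T fusion reaction (well-known value)
E_FISSION_U238 = 200.0  # MeV released per U-238 fast fission (approximate)

def fission_share(capture_fraction):
    """Fraction of total energy from fission, assuming each D-T fusion emits one
    14 MeV neutron and `capture_fraction` of those neutrons fast-fission the
    tamper.  Illustrative only: ignores secondary fission neutrons and the
    energy of Li-6 tritium breeding."""
    e_fission = capture_fraction * E_FISSION_U238
    return e_fission / (e_fission + E_FUSION_DT)

for f in (0.1, 0.3, 0.5):
    print(f"{f:.0%} of fusion neutrons fission the tamper -> "
          f"{fission_share(f):.0%} of the yield is fission")
```

The point is simply that, because each fast fission releases roughly ten times the energy of a fusion reaction, even a modest capture fraction pushes the fission share to half or more of the total, consistent with the figures quoted in the OP.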

Some information here: Thermonuclear weapon - Wikipedia

Note that the "Tsar Bomba" had a U-238 stage which would have boosted its output from 50 MT to 100 MT, at the expense of producing more fallout than all the previous atomic tests combined. It wasn't used.

We'll use the B61 gravity bomb (tactical) as our example (it's a very common one in the US arsenal). If a nuclear weapon is used by the US, it'll most likely be one of these, because it's one of the smaller air-droppable bombs still in the inventory.

Unboosted fission primary: 0.3 kilotons. This is roughly the same as 300 tons of TNT, which is about 1/3 the power of industrial disasters like the Texas City Disaster in 1947 or the Port Chicago disaster in 1945.

Highest yield: 170 kt. This is almost certainly the entire fusion secondary firing. By way of comparison, this is 170 times the power of Texas City or Port Chicago, and about 567 times the unboosted primary yield, giving us a fusion percentage of roughly 99.8%.

This doesn’t take into account the possibility of the fusion secondary having a uranium tamper, or the small yield of the fission “sparkplug” in the secondary, but even assuming the tamper is 50% of the yield, we’re still talking about 50% coming from fusion.
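As a sanity check, here's the same arithmetic in Python, using the 0.3 kt and 170 kt figures quoted above; the 50% tamper share is the same what-if assumption as in the previous paragraph, not a known design figure.

```python
# Re-running the B61 arithmetic: 0.3 kt unboosted primary, 170 kt maximum yield.
primary_kt = 0.3
max_yield_kt = 170.0

print(f"max yield / unboosted primary: {max_yield_kt / primary_kt:.0f}x")  # ~567x

# Upper bound: credit everything beyond the unboosted primary to fusion.
fusion_upper = (max_yield_kt - primary_kt) / max_yield_kt
print(f"fusion share, ignoring tamper/sparkplug fission: {fusion_upper:.1%}")  # ~99.8%

# What-if from the post above: a uranium tamper supplies half the total yield.
tamper_share = 0.5
fusion_with_tamper = 1.0 - tamper_share - primary_kt / max_yield_kt
print(f"fusion share if the tamper is 50% of yield: {fusion_with_tamper:.1%}")  # ~49.8%
```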

Basically, in bigger bombs the percentage from fusion is higher; the Tsar Bomba was about 97% fusion and, counterintuitively, the cleanest bomb ever made relative to its yield.
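A quick calculation shows what "97% fusion" means in absolute fission terms, using the roughly 50 Mt and 97% figures from this thread and the commonly cited ~15 kt Hiroshima yield for scale:

```python
# What "97% fusion" means in absolute fission terms for the Tsar Bomba
# figures quoted in this thread (~50 Mt total, ~97% fusion).
total_mt = 50.0
fusion_fraction = 0.97

fission_mt = total_mt * (1.0 - fusion_fraction)
print(f"implied fission yield: {fission_mt:.1f} Mt")  # ~1.5 Mt

# "Clean" is relative to yield: ~1.5 Mt of fission is still roughly a hundred
# Hiroshima-scale fission yields (~15 kt each).
hiroshima_kt = 15.0
print(f"Hiroshima-scale equivalents: {fission_mt * 1000.0 / hiroshima_kt:.0f}")
```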

Minor but necessary nitpick to beowulff's post: the Tsar Bomba, the largest thermonuclear weapon ever made, was tested, at its rated 50-megaton level. (I believe it actually worked out to about 57 MT equivalent.) It was not tested, and perhaps never built, with the U-238 stage that would have increased its destructive equivalence to 100 MT.

And in the version tested, almost all of the bomb's power came from fusion, as bump notes, with the fission stage serving solely as a 'trigger' or 'pilot light' for the inherently more powerful and more efficient fusion reaction. What's being discussed above are the so-called fission-fusion-fission bombs, which 'piggyback' the fission of U-238 (otherwise a byproduct of uranium enrichment) on the fusion reaction at effectively no cost to the fusion output.