Anyone know how to work this out?
We’ve got several power plant engineers on the board, including Anthracite; I’ll await what they have to say.
I can answer the last one, though: unless Canadians are doing something I don’t know about, the amount of plutonium saved was zero, because we’ve decided not to use Pu-239 in power plants, sticking to U-235. There are good arguments both for that decision and for reversing it.
I know how to work it out. What I don’t know offhand are the right numbers to put into the calc, but I can come close. The numbers used here are rounded off to save effort. (Check my numbers. It’s very late at night where I am.)
The total average generation level in the US is about 300 gigawatts (300 x 10^9 watts). Multiplying by 3.412 BTU/hr per watt gives us about 1 x 10^12 BTU/hr of electrical output. However, we need to consider efficiency. Assume the coal plants are about 40% efficient; that requires about 2.5 x 10^12 BTU/hr of heat input. Coal produces, on average, about 20 x 10^6 BTU/ton. Therefore, we would burn a bit over 10^5 (100,000) tons of coal per hour if all the electricity were generated by coal.
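To make the arithmetic easy to check, here is the same calculation as a short Python sketch. The inputs are just the rounded figures above (300 GW average generation, 40% assumed plant efficiency, 20 x 10^6 BTU/ton coal), not precise data:

```python
# Rough coal-burn rate using the rounded numbers above.
avg_generation_w = 300e9           # average US generation, watts (rounded)
watts_to_btu_per_hr = 3.412        # 1 W = 3.412 BTU/hr

electric_output_btu_hr = avg_generation_w * watts_to_btu_per_hr   # ~1.0e12 BTU/hr
plant_efficiency = 0.40                                           # assumed coal-plant efficiency
heat_input_btu_hr = electric_output_btu_hr / plant_efficiency     # ~2.5e12 BTU/hr

coal_heat_content_btu_per_ton = 20e6                              # rounded average
coal_tons_per_hr = heat_input_btu_hr / coal_heat_content_btu_per_ton

print(f"{coal_tons_per_hr:.2e} tons of coal per hour")            # a bit over 1e5 tons/hr
```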
If we consider that 15% of the population was without power for an average of 48 hours, that gives us the equivalent of about 7 hours of the whole country being dark. So let’s say we didn’t use 750,000 tons of coal.
Of course, not all electricity is generated by coal. Only about half is. So 300,000 to 400,000 tons of coal, some oil, some gas, and some uranium. Oh, yes, and some water didn’t go over the dam.
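Carrying the same rounded numbers through the outage estimate (again just a sketch with the assumptions stated above, nothing official):

```python
# Rough estimate of coal not burned during the blackout.
coal_tons_per_hr = 1.0e5         # from the calculation above (rounded)

fraction_without_power = 0.15    # assumed share of population affected
outage_hours = 48                # assumed average outage length

equivalent_full_outage_hours = fraction_without_power * outage_hours   # ~7 hours
coal_saved_if_all_coal = equivalent_full_outage_hours * coal_tons_per_hr  # ~700,000-750,000 tons

coal_share_of_generation = 0.5   # roughly half of US electricity is coal-fired
coal_saved_tons = coal_saved_if_all_coal * coal_share_of_generation       # ~350,000 tons

print(f"~{coal_saved_tons:,.0f} tons of coal not burned")
```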
No one in the US or Canada is using plutonium fuel. Of course, uranium fuel does produce plutonium as a byproduct of the fission process.
KenGr’s estimate is likely as good as any, I say. There are just too many variables to possibly get any closer, IMO (although I would have personally used 34.2% for Eastern coal plant efficiency, and about 11,813 Btu/lbm for coal HHV for Eastern plants, I don’t see that it matters for a guesstimate like this).
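For what it’s worth, plugging in those Eastern-plant numbers barely moves the answer, which backs up the “doesn’t matter for a guesstimate” point. A quick sketch using the 34.2% efficiency and 11,813 Btu/lbm HHV quoted above:

```python
# Re-run the burn-rate calculation with the Eastern-plant numbers quoted above.
electric_output_btu_hr = 300e9 * 3.412        # ~1.0e12 BTU/hr, same as before

plant_efficiency = 0.342                      # 34.2% Eastern coal plant efficiency
coal_hhv_btu_per_lbm = 11_813                 # Eastern coal HHV
coal_heat_content_btu_per_ton = coal_hhv_btu_per_lbm * 2000   # ~23.6e6 BTU/ton

heat_input_btu_hr = electric_output_btu_hr / plant_efficiency
coal_tons_per_hr = heat_input_btu_hr / coal_heat_content_btu_per_ton

print(f"{coal_tons_per_hr:.2e} tons/hr")      # ~1.3e5 -- essentially the same answer
```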
Oh, you coal fanatics. So precise!! I know that newer supercritical units are getting into the low 40’s for efficiency and I thought backfits would have raised the average a bit.
As a bit more information has become available, I found a few errors which seem to have balanced out. The year-round average generation is now nearer 500 gigawatts than 300. (You know you’re getting old when you remember numbers from 20 years ago.) However, the latest numbers indicate fewer people were affected than the 15% I assumed. It appears 62 gigawatts of demand were lost. That was at a peak time, so the 24-hour average was probably more like 40 gigawatts or a little more. So my implied figure of about 45 gigawatts (15% of 300) was pretty close, thanks to two counteracting errors. Isn’t probability wonderful!!!
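If it helps, here is the “two errors cancelling” comparison as numbers. This is only a rough sketch of the figures in this thread, not official data, and the 40 GW 24-hour average is my own rough discount of the 62 GW peak:

```python
# Comparing the original assumptions with the later reported numbers.
original_avg_generation_gw = 300      # the 20-year-old figure used above
assumed_fraction_affected = 0.15
implied_demand_lost_gw = original_avg_generation_gw * assumed_fraction_affected  # 45 GW

reported_peak_loss_gw = 62            # later reported peak demand lost
estimated_24hr_avg_loss_gw = 40       # rough 24-hour average of that peak loss

print(implied_demand_lost_gw, estimated_24hr_avg_loss_gw)   # 45 vs ~40: close, by luck
```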