Nuclear power getting more efficient each year?

Some years ago, Una posted a link proving that nuclear power plants are getting more efficient all the time, so that even though we’re not building more of them, they are producing more energy. Coffeekitten wants the evidence for a paper he’s writing, and I can’t find the post for the life of me. Help?

The efficiency of any power plant (nuclear, coal, or oil fired) is determined by the maximum temperature of the working fluid (steam) and the minimum temperature of the cooling sink.
For nuclear plants, there are three types:
- boiling water reactors (lowest)
- pressurized water reactors (higher)
- gas cooled (highest)
As far as I know, nobody has been building gas cooled reactors recently.
But I could be wrong.
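
For reference, the textbook relation behind that ranking is the Carnot limit. Here's a minimal sketch with illustrative temperatures (placeholders, not figures for any specific reactor), just to show the shape of the argument:

```python
# Carnot limit: the theoretical ceiling on thermal efficiency set by the
# hot-side (steam) and cold-side (cooling sink) temperatures, in kelvin.
# The temperatures below are rough illustrative values, not data for any
# specific plant.

def carnot_efficiency(t_hot_k, t_cold_k):
    """Upper bound on efficiency for a heat engine between two reservoirs."""
    return 1.0 - t_cold_k / t_hot_k

examples = {
    "boiling water reactor (~285 C steam)": (285 + 273.15, 25 + 273.15),
    "pressurized water reactor (~315 C hot leg)": (315 + 273.15, 25 + 273.15),
    "gas-cooled reactor (~650 C outlet)": (650 + 273.15, 25 + 273.15),
}

for name, (t_hot, t_cold) in examples.items():
    print(f"{name}: Carnot limit ~{carnot_efficiency(t_hot, t_cold):.0%}")
```

These are upper bounds only; real plants come in well below them, which is where the rest of this thread comes in.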

Ralph: At the level of high school physics and spherical cows, that's correct. In the real world it's just one of dozens of terms in the chain of (in)efficiencies in a real power plant. Meaningful gains *could* be made in those other areas without needing to completely replace the working fluid cycle.

I do NOT know the details of the real answer to the OP's question. But better electrical generators, better cooling towers, or more ability to use fuel later into its life cycle could all have contributed.

There have been improvements in both the turbines and generators used by nuclear power plants. There has also been better management, so plants have less downtime per year.

As you can see, the amount of nuclear power generated per year went up steadily from 1990 to 2007 and has been fairly flat since then.

http://www.eia.gov/totalenergy/data/monthly/pdf/sec7_5.pdf

Many of the efficiency related components of plants in operation today are 30-40 years old and are candidates for replacement.

I can give you two examples of how plants are becoming more efficient.

The first has been going on for a long time and is now almost universal. Plants change only one third of their fuel during a refueling outage yet get nearly the same MW thermal output. This shortens refueling outages and (obviously) consumes less fuel, costing less and sending less waste to the spent fuel pool.

A specific example I can give you of increased production is the power uprate project at Seabrook Station about 5 years ago. The plant changed the windings in its generator, installed better designed turbine blades and changed the turbine speed control from electro-hydraulic to digital. This resulted in an increase in net output from 1100 MW to just shy of 1200 MW for a one-time cost, producing more from the same amount of fuel. Maintenance costs were reduced for components that were nearing the need for replacement anyway, and unexpected plant outages due to electronics failure were also reduced, as the new turbine control gear was more redundant than the old one.
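
A quick back-of-the-envelope check of what that uprate amounts to (the MW figures come from the description above; the ~3650 MWt thermal figure is just an assumed round number for illustration, since the point is "more output from the same fuel"):

```python
# Rough arithmetic for the Seabrook-style uprate described above.
# The 1100 / "just shy of 1200" MW figures are from the post; the thermal
# power value is an assumed round number, not a cited figure.

old_net_mw = 1100.0
new_net_mw = 1190.0  # "just shy of 1200 MW"

gain_pct = (new_net_mw - old_net_mw) / old_net_mw * 100
print(f"Net electrical output gain: ~{gain_pct:.1f}%")

# If reactor thermal power stayed fixed at, say, ~3650 MWt (assumed),
# the implied net thermal efficiency moves roughly like this:
assumed_thermal_mw = 3650.0
print(f"Old efficiency: ~{old_net_mw / assumed_thermal_mw:.1%}")
print(f"New efficiency: ~{new_net_mw / assumed_thermal_mw:.1%}")
```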

This is almost all of it. Turbine upgrades and generator upgrades can really help out a lot, and increased net capacity factors are also a good thing.

I'm not sure, however, that I said the main thing was that they were getting more efficient, but rather that increased net capacity factors were what was getting more power out of them.

Here is some recent and historical data from the EIA which graphically shows the increase in capacity factor and generation, while the number of plants in operation has actually decreased since its peak of 112 in 1990.
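
If anyone wants to redo the capacity-factor arithmetic themselves from EIA-style data, the calculation is straightforward (the fleet capacity and generation figures below are made-up round numbers for illustration, not actual EIA values):

```python
# Capacity factor: actual generation over a period divided by what the fleet
# would have produced running flat out the whole time.
# The numbers below are illustrative placeholders, not actual EIA data.

def capacity_factor(net_generation_mwh, nameplate_capacity_mw, hours_in_period):
    return net_generation_mwh / (nameplate_capacity_mw * hours_in_period)

fleet_capacity_mw = 100_000          # assumed round number for a nuclear fleet
annual_generation_mwh = 790_000_000  # assumed round annual total

cf = capacity_factor(annual_generation_mwh, fleet_capacity_mw, 8760)
print(f"Capacity factor: {cf:.1%}")  # ~90% with these placeholder inputs
```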

I would think so, if the internal combustion engine became more efficient as time went on. But does increased nuclear power efficiency equate with decreased chance of Fukushima-esque disasters? Is there even a word in the Japanese language to describe that kind of horror?

I’d like to argue against your point, but I don’t want to derail this thread, so I’m just reporting your post.

Because you didn’t like it?

Considering that some of the problem with those reactors is that they are an older, less safe design, then yes. The more important problem is that when nature throws a bigger disaster at your country than you thought it would, bad things tend to happen; there were plenty of other industrial disasters in Japan thanks to that earthquake and tsunami. The biggest known earthquake in Japanese history can be expected to overwhelm safety precautions designed according to experience with lesser quakes.

Some variation on “what an unfortunate mess”, I expect. Seriously; “horror”? It’s a serious industrial disaster caused by a major natural disaster, but that’s all. Godzilla has not arisen, nor has Cthulhu. Other industrial disasters have killed a lot more people and done a lot more damage, with a lot less excuse on the part of the industries in question.

MODERATOR NOTE

No, because you are derailing the question. The OP only has to do with efficiency.

If you have a complaint about the safety of nuclear reactors, start a thread in the proper forum.

No warning, but don’t do this again.

samclem, Moderator

I'm not disputing your statement, but in the engineering analysis of thermal power plants I haven't seen this used on a total plant basis. Usually a plant's efficiency is expressed as its net heat rate, which is determined by taking the total heat output of its fuel and dividing that by its net output in kilowatts.

So a less efficient plant may have a rating of 16000 Btu/kWh while a typical coal plant may be rated at 9500 Btu/kWh.
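
For readers who want those heat rates as plain efficiency percentages, the conversion is just the energy content of a kWh (about 3,412 Btu) divided by the heat rate. A quick sketch using the two example figures above:

```python
# Convert heat rate (Btu of fuel heat per kWh of electricity) to efficiency.
# 1 kWh = 3,412.14 Btu, so efficiency = 3,412.14 / heat_rate.

BTU_PER_KWH = 3412.14

def efficiency_from_heat_rate(heat_rate_btu_per_kwh):
    return BTU_PER_KWH / heat_rate_btu_per_kwh

for heat_rate in (16000, 9500):
    print(f"{heat_rate:>6} Btu/kWh  ->  {efficiency_from_heat_rate(heat_rate):.1%}")
```

A lower heat rate means a more efficient plant, which is why the numbers run "backwards" compared to an efficiency percentage.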

Allow me then to dispute what ralph124c said because it’s simply not true. I have 2 decades of experience in God knows how many countries working at or with more than 600 power plant units, including teaching more than 60 courses on the subject, as well as teaching at University, and heat rate is not measured in the real world by “the maximum temperature of the working fluid (steam) and the minimum temperature of the cooling sink.” It sounds like it’s a mangling of Carnot’s efficiency theorem. I suppose if you are looking just at a turbine…no, you can’t draw the boundary condition there and ignore the rest of the plant.

BubbaDog, the only dispute I have with what you posted is that for a typical coal plant the NPHR is closer to 10,500 Btu/kWh. 9,500 would be in the right range if you were talking about GPHR, or else a brand-new supercritical or ultrasupercritical coal plant on an NPHR basis. A greenfield coal plant I'm the lead analyst on is predicting about 9,333 Btu/kWh, but then they also have very little in the way of emissions controls (which add auxiliary load, which makes the NPHR higher, i.e. worse).
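
To make the GPHR/NPHR distinction concrete, here's a minimal sketch with invented numbers (the heat input, gross output, and auxiliary load are placeholders, not figures for any real plant):

```python
# Gross plant heat rate (GPHR) vs net plant heat rate (NPHR).
# Auxiliary loads (FGD, SCR, fans, pumps, ...) come out of gross generation,
# so NPHR is always higher (worse) than GPHR.
# All numbers below are invented placeholders.

heat_input_mmbtu_per_hr = 5_500.0   # fuel heat input
gross_output_mw = 600.0             # generation at the generator terminals
aux_load_mw = 45.0                  # in-plant loads (emissions controls, pumps, fans)

net_output_mw = gross_output_mw - aux_load_mw

gphr = heat_input_mmbtu_per_hr * 1_000_000 / (gross_output_mw * 1000)  # Btu/kWh
nphr = heat_input_mmbtu_per_hr * 1_000_000 / (net_output_mw * 1000)    # Btu/kWh

print(f"GPHR: {gphr:,.0f} Btu/kWh")
print(f"NPHR: {nphr:,.0f} Btu/kWh")
```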

I was going off the top of my head based on the last coal power plant project that I had been involved with (2005). The proposals I had read were around 9,500 for NPHR, but we considered them optimistic. Come to think of it, all of the numbers from that project were optimistic. I left the project in 2006 and what I know of it comes from news articles. The final installed cost of the plant turned out to be more than twice the original estimate. Glad that I'm not the guy trying to explain that to the regulators.

But those aren’t related to the ‘nuclear’ part of the plants, which just produces the steam. Couldn’t those same improved turbines & generators be driven by steam from a coal plant, for example? So that would imply that non-nuclear power plants would be showing the same type of efficiency gains.

I'll make this answer short, because I've done a 60-page PowerPoint on this subject and I really do not want to bore people with it.

Yes, turbine upgrades have been applied to steam power plants over the years to increase efficiency. One of the most popular in the 1990s was the "dense pack" upgrade, which improved the efficiency of the low-pressure side of the turbine. Over the last few years the HP and IP sides have started to see efficiency retrofits as well. But there are problems associated with this: coal plants must obey NSR, and in many cases they cannot increase their net generation to the grid without triggering a recertification of their emissions. So what happens is efficiency increases, but power output stays the same as they purposefully derate the unit.

Another thing coincident with the 1990s to the present is the installation of emissions controls such as FGD, SCR, FF, etc., which take auxiliary power from the plant and lower the plant efficiency. So you often have cases where a turbine upgrade is coincident with adding an FGD, such that the net effect is to produce about the same net power as before.
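
Putting those two effects together, here's a toy illustration of the "efficiency up, net output flat" outcome (all numbers invented):

```python
# Toy illustration of a turbine upgrade whose efficiency gain is paired with
# a deliberate derate (to avoid triggering NSR review) plus new
# emissions-control auxiliary load, so net output to the grid stays about
# where it was. All numbers are invented.

# Before: older turbine, no FGD
net_output_before_mw = 500.0
heat_rate_before = 10_500.0          # Btu/kWh, net basis
fuel_before_mmbtu_hr = net_output_before_mw * 1000 * heat_rate_before / 1e6

# After: turbine upgrade improves the net heat rate ~3%, but net output is
# held at the pre-upgrade level.
net_output_after_mw = 500.0
heat_rate_after = 10_200.0           # Btu/kWh, net basis, post-upgrade
fuel_after_mmbtu_hr = net_output_after_mw * 1000 * heat_rate_after / 1e6

print(f"Fuel burn before: {fuel_before_mmbtu_hr:,.0f} MMBtu/hr")
print(f"Fuel burn after:  {fuel_after_mmbtu_hr:,.0f} MMBtu/hr")
print("Same net MW to the grid, somewhat less coal burned per hour.")
```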

In many cases they could, but that wasn’t the question the OP asked.

In any case, turbines are generally designed for a specific temperature range. Old nuclear plants tend to run at lower temperature ranges and use different turbines than a coal plant, and natural gas plants are a completely different beast.

Breeder reactors tend to run at higher temperatures than conventional reactors, and there have been suggestions that they could be used as replacements for coal boilers. I don't think that is practical myself.

Not sure if this is what you’re looking for, but I found an archive of a discussion that sounds familiar. A poster named The Second Stone pointed out that nuclear plants hadn’t added any new capacity while other sources such as wind and solar were constantly adding capacity. He claimed nuke hadn’t added a single new watt of power.

http://boards.straightdope.com/sdmb/showpost.php?p=12329162&postcount=47

**Una** refuted this by pointing out that while no new plants were being built, nuclear power plants have added additional watts to the energy grid through efficiency improvements. I think that's what her link showed, but the link is dead now.