Nuclear power and water usage

Today an editorial writer in the AJC claimed that Georgia’s two nuclear reactors use as much water per day as the entire city of Atlanta. Not having the figures myself, I have to ask, is this even in the ballpark of reality?

Grrrr… I don’t mind newspapers printing stupid reader opinions, but you’d think they’d try to put a stopper on claims that are demonstrably untrue or obviously fallacious.

Probably need to define “use”. Water from a lake/river is used for making steam for the turbines and for cooling, but it’s cooled back down and returned to the lake/river (that’s what those cooling towers are for). Some water is no doubt “lost” in the process, but it doesn’t sound right that it would be as much as the op-ed author claims. I don’t think nuke plants use any more water than any other major factory or power plant. Having said that, thermal pollution is a concern around some nuke plants, because they discharge water that’s somewhat warmer than the surrounding water. This attracts fish, but the warmer temps can inhibit spawning.

The essence of steam generation is to get water very hot to make steam, use the steam to spin a turbine and generator, and, once you’ve gotten all the work you can out of the steam, cool it down and condense it back to water to send through the cycle again. Nuclear plants use fission heat to make the steam, while fossil fuel plants burn dead dinosaurs.
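
To put rough numbers on that loop, here’s a quick Python back-of-envelope. The 1100 MWe output and ~33% thermal efficiency are just typical round numbers for a light-water reactor, not anything specific to the Georgia plants:

```python
# Rough steam-cycle heat balance: whatever heat doesn't come out as
# electricity has to be dumped somewhere (the condenser / cooling water).

def heat_balance(electric_mw, thermal_efficiency):
    """Return (heat input, heat rejected) in MW for a simple steam cycle."""
    heat_in = electric_mw / thermal_efficiency
    return heat_in, heat_in - electric_mw

# Illustrative round numbers: an 1,100 MWe unit at ~33% thermal efficiency
heat_in, heat_rejected = heat_balance(1100, 0.33)
print(f"Heat from the reactor:        {heat_in:.0f} MW thermal")
print(f"Heat dumped to cooling water: {heat_rejected:.0f} MW")
# -> roughly 3,300 MW in and 2,200 MW rejected: about two watts of
#    waste heat for every watt of electricity.
```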

Whatever you are using to heat the water to generate steam, you still have to cool the steam/water at the end of the cycle. My question is whether the cooling water requirements of a nuclear plant are materially greater than those of a comparable coal or other fossil fuel plant per kWh of electricity generated.

Is there some other way to turn nuclear reactions into electrical power besides steam?

There are thermoelectric generators that convert heat directly to electricity with no moving parts, somewhat the way photovoltaic cells convert sunlight; the RTGs on deep-space probes work that way (the Soviet TOPAZ space reactors used a related thermionic scheme). AFAIK these have only been used for space missions, as they are not very efficient and are hard to scale up. Also, if used on Earth, they would still need to be water cooled in the same way that a steam cycle plant is.

Note: the river/sea/whatever water at a power plant is only there to cool the distilled water that circulates in the closed steam cycle.

There are some advanced reactor designs like the pebble bed reactor that can use helium as the working fluid.

FWIW,
Rob

The helium doesn’t run the turbines, though. The coolant (which gets irradiated, and can be liquid sodium, He or water) runs through heat exchangers to raise steam, and the steam runs the turbines. You really don’t want radioactive turbines.

The steam needs cooling, and environmental water is used for that. I worked IT at a thermal generation plant in NZ (gas or coal fired), and the rule of thumb was that whatever the plant’s electrical output was (up to 1000 megawatts), roughly that much heat was going into the river. During the summer, when the water was warm and the upstream hydro dams were holding on to water, river temperature was the limiting factor for generation.

Of course, any heat we put in effectively replaced energy that the 1000 megawatts of hydro generation further upstream had already extracted from the river.
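
For a sense of scale on the river heating, here’s a rough back-of-envelope; the heat load and river flows are just assumed round numbers, not figures from that plant:

```python
# How much does ~1,000 MW of waste heat warm a river?
# Temperature rise = heat load / (mass flow * specific heat of water).

WATER_SPECIFIC_HEAT = 4186.0  # J/(kg*K)

def river_temp_rise_c(heat_mw, river_flow_m3_per_s):
    """Approximate bulk temperature rise of the river in degrees C."""
    mass_flow_kg_s = river_flow_m3_per_s * 1000.0  # ~1,000 kg per m^3 of water
    return heat_mw * 1e6 / (mass_flow_kg_s * WATER_SPECIFIC_HEAT)

# Assumed flows, just to show the shape of the problem:
for flow in (400, 200, 100):  # m^3/s, with summer low flow on the right
    print(f"{flow:>4} m^3/s -> {river_temp_rise_c(1000, flow):.1f} C rise")
# Halve the river flow and the temperature rise doubles, which is why a
# hot, low-flow summer is exactly when river temperature starts to limit output.
```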

Si

From what I have read about this (and it was in Wired, so take it for what it’s worth), the helium IS used to run the turbines. Because helium has a very low neutron cross-section it is hard to activate, and a coolant leak would be no big deal because the helium would simply disperse into the atmosphere. IIRC the Wikipedia article on pebble bed reactors says the same thing.

Rob

Fair enough. I would have thought that the low mass of the He would make driving a turbine quite hard, but it seems that it can be done. You still need to dump some heat from the He before recirculating, though.

Si

The method of heating the water is different between nuke and fossil plants, but once the water is heated up, steam is steam. The same amount of electricity generated is going to take the same amount of steam, and therefore the same amount of cooling.

I do not believe that is totally true. Nuclear turbines operate at lower steam pressures (http://www.ne.doe.gov/np2010/pdfs/ABWROverview.pdf , see pg. 3), and because of that they tend to have poorer turbine heat rates, i.e. they take in more heat per kWh generated and therefore reject more heat to the condenser. See for example Section 5.7.1.2 here, which shows typical turbine heat rates versus pressure:

http://www.wbdg.org/ccb/DOD/UFC/ufc_3_540_02n.pdf

AFAIK nuclear power plants will use somewhat more water for cooling than a coal plant of equivalent output, since their turbine heat rate (net or gross) is typically poorer, i.e. numerically higher. HOWEVER, even if they require more water for the condenser, it’s only a marginally greater amount, and not something that should cause much additional grief to a local community relative to what a coal plant would require.
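
To make that concrete, here’s a rough sketch of how a turbine heat rate translates into condenser duty and relative cooling water flow. The heat rates in it are purely illustrative placeholders, not data for any real unit; the curves in the UFC document above are where to get real values:

```python
# Turbine heat rate -> condenser duty -> relative cooling water flow.
# 3,412 Btu/kWh is the thermal equivalent of the electricity itself;
# to a first approximation, everything above that goes to the condenser.

BTU_PER_KWH = 3412

def condenser_duty(turbine_heat_rate):
    """Btu rejected to the condenser per kWh generated (steam cycle only)."""
    return turbine_heat_rate - BTU_PER_KWH

# Purely illustrative heat rates in Btu/kWh, not data for any real plant:
nuclear_hr = 10_400  # lower-pressure, saturated-steam turbine
fossil_hr = 9_500    # higher-pressure, superheated-steam turbine

ratio = condenser_duty(nuclear_hr) / condenser_duty(fossil_hr)
print(f"Nuclear condenser duty: {condenser_duty(nuclear_hr)} Btu/kWh")
print(f"Fossil condenser duty:  {condenser_duty(fossil_hr)} Btu/kWh")
print(f"Relative cooling water flow (same temperature rise): {ratio:.2f}x")
# With these made-up numbers the nuclear unit needs roughly 15% more
# cooling water per kWh; plug in real heat rates to get a real answer.
```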

At the power plant I work at, we only need to make up for evaporative losses and blowdown from the cooling towers, which comes to about 8000 gpm at 100% power. We lose about 3-5% efficiency because we have to run the cooling towers.
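
For anyone who wants to sanity-check a makeup number like that, the latent heat of vaporization gets you most of the way. A rough sketch with assumed round numbers (the 80/20 evaporation/blowdown split is a guess, not plant data):

```python
# Sanity check: how much heat does a given cooling-tower makeup flow imply?
# Each kg of water evaporated carries away ~2.26 MJ (latent heat of
# vaporization); 1 L of water is about 1 kg.

LATENT_HEAT_MJ_PER_KG = 2.26
LITERS_PER_GALLON = 3.785

def evaporative_heat_mw(makeup_gpm, evaporation_fraction=0.8):
    """Heat rejected by evaporation, assuming the given fraction of the
    makeup is evaporation and the rest is blowdown and drift."""
    evap_kg_per_s = makeup_gpm * evaporation_fraction * LITERS_PER_GALLON / 60.0
    return evap_kg_per_s * LATENT_HEAT_MJ_PER_KG

# 8,000 gpm of makeup with an assumed 80/20 evaporation/blowdown split:
print(f"~{evaporative_heat_mw(8000):.0f} MW rejected as latent heat")
# -> on the order of 900 MW, i.e. the waste heat of a steam unit in the
#    few-hundred-MWe class, so 8,000 gpm is the right order of magnitude.
```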