I’m taking an industrial electricity class and the subject of how power companies know when to increase mechanical power going to a generator came up. What is it that they monitor to know when to adjust the power input to a generator?
My instructor thinks they monitor the voltage. I think that, since they are using induction motors to generate the power, they would monitor the frequency.
Also, I know that when you run an induction motor it runs at about 3450 rpm even though it is rated at a 3600 rpm no-load speed; the difference is because of the magnetic slip. Do they have to run the generators at a faster speed to compensate for the slip? Do they run them at the speed of a loaded motor, at the unloaded speed, or do they have to spin them faster than they would run unloaded to compensate?
Synchronous generators are used far more often than induction generators. A two-pole synchronous generator on a 60 Hz system operates at a (pretty) constant 3,600 rpm.
Most of the generators on the system will be set at a fixed (but controllable) power output. A much smaller number will be controlling the instantaneous power generation/power load balance.
Frequency is the thing that is measured, with the control system maintaining frequency within a pretty narrow band.
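To make that frequency-following idea concrete, here is a minimal sketch of one common scheme, a speed-droop characteristic, where a regulating unit raises its output in proportion to how far the frequency has sagged below nominal. The 5% droop, 500 MW rating and 400 MW schedule are my own illustrative assumptions, not anything stated above.

```python
# Minimal droop-governor sketch.  All numbers are assumptions for illustration.
def droop_output_mw(p_scheduled_mw, rating_mw, f_hz, f_nom=60.0, droop=0.05):
    """Steady-state output of a unit with a speed-droop governor.

    A 5 % droop means a 5 % frequency drop would swing the unit across its
    full rating, so a small frequency error produces a proportional MW change.
    """
    delta_f_pu = (f_nom - f_hz) / f_nom
    return p_scheduled_mw + (delta_f_pu / droop) * rating_mw

# Example: a 500 MW unit scheduled at 400 MW when the grid sags to 59.95 Hz.
print(f"{droop_output_mw(400.0, 500.0, 59.95):.1f} MW")   # roughly 408 MW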
I only worked in a generating station while going to college, but in that case we monitored the current and the line voltage. Assume there are two generators connected to the transmission line. The voltage of the two can only differ by the current that each is generating times the impedance of the line between each generator and the point where they are connected together. The frequencies can’t differ at all, except transiently for a short period. So, if the line voltage begins to go down because the load has increased, the power input to the generators is increased to bring it back up. If you increase the power input to one generator, the current from that generator goes up while the frequency stays the same. What happens is that, transiently, the frequency of that generator rises until its phase leads the line phase by enough to supply the extra current.
In our lab, when our load increased, we raised the current of first one generator and then the other in increments until both were generating equal currents and the line voltage was restored to the correct value. I would assume that utility power managers do something along those lines, but with more skill and better equipment than we had.
Power companies do not drive their generators with induction motors. They use: 1) Steam turbines, 2) Gas turbines, 3) Water driven turbines, 4) Diesel engines, or other prime mover that does not get its power from the electrical grid.
Synchronous speed for an induction motor on a 60 cycle line is 3600 rpm divided by the number of pairs of poles. A two pole motor runs at 3600, 4 pole at 1800 rpm, 6 pole at 1200 rpm etc. The nameplate speed is the rpm at rated output power. Usually about 3550, 1750, 1150 etc. with the slip between synchronous and full load speed providing the torque needed to drive the load.
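For anyone who wants to check those numbers, here is a quick sketch of the standard arithmetic; the full-load speeds are the typical nameplate values quoted above, not measurements.

```python
# Synchronous speed is 120*f/poles (equivalently 3600 rpm divided by the
# number of pole pairs at 60 Hz); slip is the fractional gap between
# synchronous speed and the actual shaft speed.
def synchronous_rpm(freq_hz, poles):
    return 120 * freq_hz / poles

def slip(freq_hz, poles, shaft_rpm):
    n_sync = synchronous_rpm(freq_hz, poles)
    return (n_sync - shaft_rpm) / n_sync

for poles, full_load_rpm in [(2, 3550), (4, 1750), (6, 1150)]:
    n_s = synchronous_rpm(60, poles)
    s = slip(60, poles, full_load_rpm)
    print(f"{poles}-pole: {n_s:.0f} rpm synchronous, {full_load_rpm} rpm "
          f"full load, slip {s:.1%}")
```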
Sorry, David Simmons, your first two paragraphs are almost completely incorrect.
Those things get monitored, and lots more besides. The governor, the thing that controls the amount of mechanical power that the prime mover produces, can be set to control either electrical power output, or speed (frequency).
If it’s controlling electrical power output, it needs to derive what the electrical power output is via the generator’s terminal voltages and currents. If it’s controlling speed, then that speed can be derived from either the voltage waveform, or the actual shaft speed.
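As a sketch of the “derive the electrical power output from the terminal voltages and currents” step: three-phase real power follows from line-to-line voltage, line current and power factor. The 13.8 kV / 4,184 A / 0.9 pf readings below are made-up round numbers; a real control system gets these from instrument transformers.

```python
import math

# Three-phase real power: P = sqrt(3) * V_LL * I * cos(phi).
# The example readings are invented for illustration only.
def three_phase_mw(v_line_line_volts, i_line_amps, power_factor):
    return math.sqrt(3) * v_line_line_volts * i_line_amps * power_factor / 1e6

print(f"{three_phase_mw(13_800, 4_184, 0.9):.1f} MW")   # roughly 90 MW
```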
No. The line voltage can go up and down independently of the load power. All that’s required is a change in the load’s power factor. The generator’s terminal voltage is fed independently into its excitation system, which effectively changes the generator’s power factor to compensate for changes in voltage.
You can’t assume that frequency magically stays the same. Something has to measure and control it.
I would suggest that you’re misremembering the lab setup, and what was actually being demonstrated.
“Magically” is your word. If two or more AC generators are connected in parallel, they must run at the same speed. If one of them tries to speed up, it is retarded by a strong magnetic force, since it is now trying to run the others in the system as motors. If it starts to slow down, it is sped up by a strong magnetic force, since the system is now trying to run it as a motor. If that weren’t true and they ran at different speeds, the generators would be in and out of phase at the beat frequency and there would be huge surge currents between them. And sure, the frequency has to be monitored. Although the machines have to run at the same frequency, it doesn’t have to be the right frequency unless someone sees to it that it is.
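To put a rough number on those surge currents, here is a small sketch: treat the two machines as equal ideal sources tied through a small reactance, with the phase difference growing at the beat frequency. The machine voltage, tie reactance and frequency mismatch are my own assumed values, not anything from the thread.

```python
import math

# Two ideal sources of equal magnitude V with phase difference delta, tied
# through reactance X: circulating current magnitude |I| = 2*V*sin(delta/2)/X,
# which becomes enormous as the machines drift apart.
V = 13_800 / math.sqrt(3)   # per-phase volts (assumed)
X = 0.5                     # ohms of tie reactance (assumed)
f_beat = 0.5                # Hz of frequency mismatch (assumed)

for t in (0.0, 0.25, 0.5, 1.0):               # seconds after they start to drift
    delta = 2 * math.pi * f_beat * t          # accumulated phase difference, rad
    i_circ = 2 * V * math.sin(delta / 2) / X  # circulating current, amps
    print(f"t = {t:4.2f} s   delta = {math.degrees(delta):6.1f} deg   "
          f"I = {i_circ:8.0f} A")
```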
I suspect it is true that excitation is also used to control phase and output, since the generators are true synchronous machines. In our simple setup we left the excitation alone.
And no, I’m not misremembering the setup. As far as what is measured, of course much more is measured than voltage and current. So I guess my explanation wasn’t so much wrong as incomplete. You can only get so much into a short post.
I don’t want this to turn into a debate. At this time in life I have no need to be reeducated on all of the minutiae of the management of power generating grids, so I’ll step out and let you run things from here on.
I work with control systems for power generating facilities. In response to the OP’s original question: the power producer doesn’t actually “adjust the power input” per se. As was already pointed out, the power producer has the ability to either
1) set a desired machine speed/frequency that needs to be maintained, or
2) set a desired power output that needs to be maintained.
(There is also the ability to set voltage, VARs, excitation, etc., but let’s keep this simple.)
In either case, the control system is designed to make the necessary adjustments to the prime mover to achieve the desired results.
In case 1) the control system will monitor bus frequency/shaft speed and will make adjustments accordingly. If electrical load increases, the machine will at first tend to slow down. The control system, sensing this, will adjust the prime mover (add steam/fuel) to increase speed back to the setpoint. This type of situation is seen most often in an isolated power system - such as on a ship at sea. There is no connection to a vast power grid.
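Here is a toy version of case 1) in code. The gains and the one-line rotor model are assumptions purely for illustration: the governor watches speed/frequency and trims fuel until the machine is back at the setpoint after a load step.

```python
# Toy isochronous speed-control loop.  All numbers are per-unit and invented.
f_set, f = 60.0, 60.0    # Hz setpoint and actual frequency
fuel = 0.50              # fuel/steam valve position, per unit
load = 0.50              # electrical load, per unit
kp, ki = 0.8, 0.4        # assumed PI gains
integral, dt = 0.0, 0.1  # integrator state and time step, seconds

for step in range(300):
    if step == 50:
        load = 0.60                      # load increase -> machine starts to slow
    err = f_set - f
    integral += err * dt
    fuel = 0.50 + kp * err + ki * integral
    f += 2.0 * (fuel - load) * dt        # crude rotor model: imbalance changes speed

print(f"frequency {f:.3f} Hz, fuel {fuel:.3f} pu for a {load:.2f} pu load")
```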
Case 2) assumes connection to a power grid and sharing of power generation among several generators. Large power plants, referred to as “base load” stations, generally have their generators at full load output. Other smaller generating stations on the grid may have partial load output, determined by the system control office. Most, if not all, generators will be in a “load control” mode. A Megawatt output is determined for all generators and each generator produces the MW requested. If the MW output requested is increased, the control system responds by adding more steam/fuel to the prime mover to increase the power output to the new setpoint. While bus frequency is monitored and maintained in this scenario, the primary source of output control is from the measurement of actual output MW compared to the requested MW.
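And a similarly toy version of case 2), again with an invented gain and a one-line plant model: the controller compares actual MW against the dispatched MW setpoint and nudges the steam/fuel valve until they match, while frequency is taken care of by the rest of the grid.

```python
# Toy MW (load) control loop.  The 500 MW plant gain and the integral gain
# are invented for illustration.
mw_setpoint = 400.0   # MW requested by the system control office
mw_output = 350.0     # MW currently being produced
valve = 0.70          # steam/fuel valve position, per unit
ki, dt = 0.0005, 1.0  # assumed integral gain (pu valve per MW*s) and time step

for _ in range(120):                 # two minutes of one-second steps
    err = mw_setpoint - mw_output
    valve += ki * err * dt           # open the valve while output is below setpoint
    mw_output = 500.0 * valve        # crude steady-state plant: 500 MW at full valve

print(f"output {mw_output:.1f} MW at valve position {valve:.3f} pu")
```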