Say I lived 100 km from an electrical energy production plant. What percent of the electrical energy going into the wire is lost on its exit from the wire?
DOE says the overall losses are around 9.5%: http://www.energetics.com/gridworks/grid.html
I would think that 100km would be a longer-than-average transmission distance, so your case might be greater.
It depends not only on distance but on the voltage, the type of system, and the efficiency of the step-up/step-down transformers at the substations. Line losses come from Ohmic or Joule heating (loss due to resistance in the conductor) and corona discharge (loss due to ionization of the air around the conductor where the electric field is strongest). For high voltage alternating current (HVAC, not to be confused with heating, ventilation, and air conditioning) transmission efficiency peaks at a certain voltage, whereas with high voltage direct current (HVDC) transmission the higher the voltage the more efficient it tends to get, up to the point where you cause electrical breakdown of the insulators or the surrounding air. The losses with HVDC tend to be more on the conversion end, where you need to turn it back into AC for the end user.

AC has been used traditionally for large regional or national grids because transformers make it easy to change voltages and to bring additional generators onto a synchronized network, which is much easier than trying to load balance DC sources; but if you were to design a power network today from scratch, without worrying about existing systems and appliances, you would almost certainly select DC.
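To see why stepping up the voltage matters so much, here is a minimal sketch of the Joule-heating arithmetic. The 200 MW load, 5 Ω total line resistance, and the voltage levels are assumed purely for illustration (not figures for any real line), and corona and transformer losses are ignored:

```python
def ohmic_loss_fraction(power_w, voltage_v, line_resistance_ohm):
    """Fraction of transmitted power dissipated as Joule heat in the line.

    For power P delivered at voltage V, the line current is I = P / V,
    so the Ohmic loss is I**2 * R = (P / V)**2 * R.
    """
    current_a = power_w / voltage_v
    loss_w = current_a ** 2 * line_resistance_ohm
    return loss_w / power_w

# Assumed operating point: 200 MW over a line with 5 ohms total resistance.
for kv in (115, 345, 765):
    frac = ohmic_loss_fraction(200e6, kv * 1e3, 5.0)
    print(f"{kv:3d} kV: ~{frac:.1%} lost to Joule heating")
```

Since the loss scales as 1/V² for a fixed power, doubling the voltage cuts the Ohmic loss by a factor of four, which is the whole argument for very high transmission voltages.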
To answer your specific question, electrical grid losses from generator to end user are somewhere between 7% and ~15%, depending largely on how modern the system is, how decentralized and distributed energy production is, and how well energy demand is estimated. (If demand exceeds local supply, power is pulled from other areas of the grid, resulting in longer effective transmission distances and greater strain on the system.)
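For the specific 100 km case, building on the sketch above: with assumed values of 0.05 Ω per km, 200 MW delivered, and a 345 kV line (all illustrative, not data for any real system), the Ohmic piece over 100 km works out to well under one percent, which suggests, under these assumptions, that most of the 7-15% end-to-end figure comes from transformers and the lower-voltage distribution side rather than the long-distance line itself:

```python
ASSUMED_R_PER_KM = 0.05    # ohms per km of conductor (illustrative assumption)
ASSUMED_POWER_W = 200e6    # 200 MW delivered (illustrative assumption)
ASSUMED_VOLTAGE_V = 345e3  # 345 kV transmission (illustrative assumption)

def line_loss_fraction(distance_km):
    """Ohmic-only loss fraction at the assumed operating point.

    Ignores corona, transformer, and distribution losses.
    """
    resistance_ohm = ASSUMED_R_PER_KM * distance_km
    current_a = ASSUMED_POWER_W / ASSUMED_VOLTAGE_V
    return current_a ** 2 * resistance_ohm / ASSUMED_POWER_W

for d in (50, 100, 500):
    print(f"{d:4d} km: ~{line_loss_fraction(d):.2%} Ohmic line loss")
```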
Stranger
Thanks.
It seems to me that in this billion dollar industry, a few bucks could go into increasing efficiencies. Wouldn’t that be cheaper than building new power plants?
If there were value in increasing efficiency (i.e. if it were cheaper to upgrade the grid than to build additional power plants), the industry would do it. The problem, however, is twofold: one is that power plants have an operational lifetime, and while incrementally upgrading them to extend their operational duration or capacity is often done because it is fiscally viable (or because the utility doesn’t have the capital to build an entirely new plant), at some point it just makes sense to build a new plant.

Nuclear fission facilities are particularly sensitive to this point; owing to the damage to the core mechanisms from neutron radiation, nuclear facilities have a limited operational lifetime (~30-40 years), and pushing beyond that carries increasing risk. And unlike coal or oil fired plants, you can’t just shut one down for a year and refurbish all the internals; while you can upgrade things like regenerators and turbines on the outer (open) loop, it is far more hazardous and expensive to get into the inner loop and the main core. Things that have to be done regularly, like fuel replacement, are handled by remote systems that are designed in. Once the plant is past safe and economic use, you pretty much have to seal it in. There are alternate designs (in particular the modular pebble bed reactor) that are by design easier to break down and remediate, but they are not in current use.
Upgrading the grid for marginal improvements in efficiency would be a good idea; the problem is that (in the United States) no single entity owns “the grid”; it is a quasi-shared infrastructure, so there is little incentive for a single party or small group to invest a passel of money into improving it. “The grid”, as it exists today, is largely legacy infrastructure from the WPA era that has been incrementally improved and expanded without a cohesive plan. While HVDC would be ideal for long distance transmission (>100 km), integrating it into the existing HVAC grid is difficult and expensive, hence it is generally only used where HVAC is impractical or where it branches off to an isolated subgrid.
I suspect that Una Persson can speak in more detail and with far more authority than I on this issue, electrical power generation being her metier.
Stranger