If all rooftops were used, how much solar power could be generated?

Back in post 11 I listed my actual installation based on a real, American suburban peaked roof house. Someone doing a WAG could PROBABLY take that data point and extrapolate…

Ontario and Quebec Hydro are probably the two largest generators of hydroelectric power in North America, if not the world as a whole.

In other related comments, I confess to being skeptical of the practical success of solar-cell-style power generation from pavement – I suspect it’s one of those bright ideas that encounter unforeseen difficulties in actual implementation.

But I wonder if thermal-difference power generation akin to deep ocean thermal power might be a viable possibility. The general idea in OTPG is to take the temperature differential across the thermocline and turn it into electrical power, and it works quite well in tropical oceans where the temperature spread is adequate and sufficient insolation to heat the surface water is available. My thinking is that most of the country/world has underground layers that are relatively cool (cf. the temperature in the typical cave). Taking the potential between asphalt heated by the Sun and such a soil layer, though not much for any one pair of contacts, would cumulatively produce quite a lot of power at minimal investment, though I don’t have the physics to calculate even ballpark estimates of how much.
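
For what it’s worth, here’s a very rough sketch of the thermodynamic ceiling on such a scheme. The temperatures and flux are my own guesses (hot asphalt around 50 °C, cool subsoil around 12 °C, ~600 W/m² absorbed), and real low-grade-heat devices capture only a small fraction of the Carnot limit:

```python
# Very rough Carnot-limit sketch for asphalt-to-subsoil thermal generation.
# All numbers are assumptions for illustration, not measurements.
T_HOT_C = 50.0    # sun-heated asphalt surface temperature, deg C (assumed)
T_COLD_C = 12.0   # cool soil layer, roughly "cave temperature", deg C (assumed)

t_hot_k = T_HOT_C + 273.15
t_cold_k = T_COLD_C + 273.15
carnot_limit = 1.0 - t_cold_k / t_hot_k           # ideal upper bound on efficiency
print(f"Carnot limit: {carnot_limit:.1%}")        # about 12%

practical_fraction = 0.15   # assumed: real low-grade-heat devices get ~15% of Carnot
absorbed_flux_w_m2 = 600    # assumed average absorbed solar flux on hot pavement
power_per_m2 = absorbed_flux_w_m2 * carnot_limit * practical_fraction
print(f"Rough output: {power_per_m2:.0f} W per square metre of pavement")  # ~10-11 W/m^2
```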

Right. But right now, all the energy from the car and the weight of the car is just going into the pavement and being wasted.

Is this a whoosh? The “energy” of the weight of the car is only potential energy. The only way to harvest that energy is to let the car press down the pavement. Which means the car will have to expend at least the same amount of energy to raise itself. And that energy will come from burning more fuel. I have no idea what you mean by “the energy from the car”, unless you mean that it heats up the road a bit.
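
A rough sketch of just how little energy is involved, using made-up but plausible numbers (a 1,500 kg car and 1 mm of pavement give per pass, both my own assumptions):

```python
# How much energy does a car actually "give" to the pavement by pressing it down?
# The mass and deflection below are made-up but plausible values.
G = 9.81                 # m/s^2
car_mass_kg = 1500.0     # assumed typical car
deflection_m = 0.001     # assumed 1 mm of pavement give per pass

energy_per_pass_j = car_mass_kg * G * deflection_m
print(f"{energy_per_pass_j:.1f} J per pass")       # ~15 J

# For scale: a litre of gasoline holds roughly 34 MJ, so each pass deposits
# on the order of a millionth of a litre's worth of fuel energy -- energy the
# car had to burn extra fuel to supply in the first place.
gasoline_j_per_litre = 34e6
print(f"{energy_per_pass_j / gasoline_j_per_litre:.1e} litre-equivalents per pass")
```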

OK, I was too tired last night to break all this down. Too much bad math to correct, too many incorrect claims. Sam, at this point it’s just a given that you will claim things that aren’t in your own cites. Now it seems that you don’t understand math or power. Let me try and help you and Delayed Reflex out.

A megawatt (MW) is one million watts (10^6). A terawatt (TW) is one trillion watts (10^12). A petawatt (PW) is one quadrillion watts (10^15). cite

Even the number you cite for electrical generation is only 4.156 TW, not 4,156 TW. Your ludicrous 29000 TW number would be 29 PW, which would be equal to approx. 16.66% of the total energy flow from the sun that strikes the earth (174 PW).

The entire freaking human race only uses about 15 TW of electricity a year.

The US uses just under 4 TW of electricity a year. 710,000 MW is .71 TW, or close to 20% of the total electrical consumption of the US per year.

Here’s a handy conversion website so you don’t overload your calculator.

Here’s a primer on orders of magnitude re: power.

The numbers I cited, Delayed, are indeed not made-up fantasy bullshit.

My cites are solid, and actually say what I claim them to say, unlike Sam’s claims and cites.

I found that about 400,000 houses in Germany have solar collectors on their roofs (though there are also a bunch of large solar plants, but I’m also guessing most of those 400,000 houses are not fully packed). The total number of households is about 38 million. Wildly extrapolating: if the total roof-space of every building (not just houses) in Germany is more than about 1.2 average house-roofs per household (which seems fairly reasonable), then it is possible to produce all the currently needed electricity in Germany just from solar.

I’m not saying it would be cheap, reliable or useful, though.

Sorry Bo, you seem to be mixing up Watts (a measure of power) with Watt-hours (a measure of energy) - power being energy over time, and energy being power × time. As there are 8,760 hours per year, to get TWh from (average) TW you multiply by that amount. Thus, if the US averages 4 TW output (as you posit), you would be generating 35,040 TWh over the course of the year (of course the US actually averages 3.31 TW output). The fact that you suggest that the United States Department of Energy is an unreliable cite is astounding.
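
For anyone following along, a minimal sketch of that conversion (average power times hours):

```python
# Power (TW) is a rate; energy (TWh) is average power times time.
HOURS_PER_YEAR = 8760

def tw_average_to_twh_per_year(average_tw: float) -> float:
    """Annual energy (TWh) from a year-round average power (TW)."""
    return average_tw * HOURS_PER_YEAR

print(f"{tw_average_to_twh_per_year(4.0):,.0f} TWh")   # 35,040 TWh - the figure above
print(f"{tw_average_to_twh_per_year(3.31):,.0f} TWh")  # ~29,000 TWh - roughly US total energy use
```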

Please feel free to re-check my math and let me know if you think it is unreasonable - I was simply trying to show the scale of what the solar roadways project proposed.

Many-ton vehicles driving constantly over expensive, delicate electro-optical devices?

Yeah, no UNFORESEEN difficulties there…

Rounding your numbers to more favorable values, let’s say your system cost roughly $30k for a roughly 10 kW nameplate system - so let’s say that ends up being $3/Watt of installed nameplate capacity (you, the consumer, pay roughly half that - but somebody has to pick up the rest of the cost). At that rate, it would cost $2.13 trillion to install the 710 GW of solar panels that would cover every roof and supply 9% of the nation’s electricity.

Of course, you often see cites that say that manufacturers can make cells for $1/Watt - I imagine that is the marginal cost of making a panel, but let’s assume that it is the full cost including capitalization of the manufacturing facilities, etc. - so that is 3x cheaper than the installed figure above. Then it would cost a mere $710 billion to outfit all the roofs in the nation - assuming all transportation and installation was free. Actually $710 billion seems like a quite doable number if rolled out over, say, 40 years - around $18 billion/year.
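
A quick sketch of that cost arithmetic, using the two per-watt figures above (rough thread numbers, not vendor quotes):

```python
# Price tag for 710 GW of nameplate rooftop solar at the per-watt costs
# discussed above (rough thread numbers, not vendor quotes).
NAMEPLATE_WATTS = 710e9   # 710 GW

for dollars_per_watt in (3.0, 1.0):
    total_dollars = NAMEPLATE_WATTS * dollars_per_watt
    per_year_40 = total_dollars / 40
    print(f"${dollars_per_watt:.2f}/W -> ${total_dollars / 1e12:.2f} trillion total, "
          f"or ${per_year_40 / 1e9:.0f} billion/year over 40 years")
# $3.00/W -> $2.13 trillion; $1.00/W -> $0.71 trillion (~$18 billion/year)
```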

Once solar power reaches grid parity, it will likely take off (much as wind power has in areas with significant wind). It still suffers from intermittency, but at least solar power tends to generate energy during peak (day) hours, especially in hot sunny areas where air conditioning is used a lot.

Gosh, that 35,000 number sounds awfully close to that 29,000 number some noob here came up with :slight_smile:

Note, this jibe is not aimed at said “noob” or Delayed Reflex.

Thanks, but I’m in no need of math or engineering lessons from you. You’re wrong, I’m right, you’re being unnecessarily confrontational and verging into personal attack, and frankly it’s kind of embarrassing for you.

The only mistake I made was to use the word production instead of consumption for that number. My concluding sentence clearly shows I was talking about consumption. When talking about how much of our energy needs we can replace with solar, we’re clearly talking about consumption.

As for your confusion of Watts with Watt-hours, this is a critical distinction to make, for a couple of extra reasons when it comes to solar. But the fact that you don’t know the difference makes it especially hilarious for you to be trying to school me in the subject of power engineering.

Let’s look at some reasons why this is critical, and what factors work against solar power:

Let’s say I have a fuel cell capable of generating 100W of power. I also have a solar panel capable of generating 100W of power in direct sunlight. These are not even remotely the same in terms of their ability to deliver a total amount of energy over a day, a month, or a year.

My 100W fuel cell, at 100% duty cycle, can produce 2.4 kWh of energy per day. My solar panel at best will have maybe four or five hours of peak sunlight, maybe 8 hours of off-peak sunlight where I can generate on average half the rated power, and then 12 hours of no generation. So my solar panel will only be able to generate about .9 kWh of energy in a 24-hour day.

But wait, it gets worse. Because I still need power at night. So now I have to store my electrical energy. That means batteries. Putting energy into a battery and taking it out involves energy conversion losses. Including the charging system, inverters, and other stuff, I may lose half of whatever energy goes through the battery, and only part of the day’s output has to be stored for night use. So now my .9 kWh is really about .7 kWh.

Now if we look at a full year, it gets worse. My fuel cell can generate 876 kWh of energy in a year. But for solar, cloudy days suck. Here’s a nice table of cloudy days in the U.S. Let’s say we want to put our system in a house in New York City. New York gets about 150 cloudy days a year, and about 115 partly cloudy days. On typical cloudy days, solar panels generate about 20% of rated power.

So, our annual output of our 100W panel is now:

(0.7 kWh × 150 × 0.2) + (0.7 kWh × 115 × 0.5) + (0.7 kWh × 100) = 21 + 40.25 + 70 = 131.25 kWh

So our 100W fuel cell generates 876 kWh of energy per year, but my 100W solar panel only manages 131.25 kWh. So we need SEVEN 100W panels to give us as much annual energy as one 100W fuel cell.
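
Here’s the same back-of-envelope as a quick script - the sun-hours, cloud counts, and battery losses are the rough guesses above, not measured data:

```python
# A 100 W solar panel vs a 100 W fuel cell over a year, using the rough
# guesses from the post above (not measured data).
usable_kwh_clear_day = 0.7       # ~0.9 kWh generated, minus battery/inverter losses
cloudy_days = 150                # New York-ish
partly_cloudy_days = 115
clear_days = 365 - cloudy_days - partly_cloudy_days   # 100

annual_solar_kwh = (usable_kwh_clear_day * cloudy_days * 0.2
                    + usable_kwh_clear_day * partly_cloudy_days * 0.5
                    + usable_kwh_clear_day * clear_days)
annual_fuel_cell_kwh = 100 * 24 * 365 / 1000          # runs flat out all year

print(f"solar panel: {annual_solar_kwh:.2f} kWh/year")              # 131.25
print(f"fuel cell:   {annual_fuel_cell_kwh:.0f} kWh/year")           # 876
print(f"panels to match: {annual_fuel_cell_kwh / annual_solar_kwh:.1f}")  # ~6.7
```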

Then if we want to really drill down further, we can note that on a peaked roof, half your solar cells are going to be at a bad angle to the sun. We can add in the fact that solar cells get dirty and have leaves and snow on them from time to time, and thus will rarely be putting out their max rated power. And we could go on and on. With solar, the deeper you look into the actual logistics of installation, maintenance and operation, the worse it gets.

Now maybe you can see why my first estimate was very much a rough approximation, ‘ballpark-within-an-order-of-magnitude’ type of calculation, and why I was also erring on the side of giving solar power the benefit of the doubt so no one could accuse me of bias.

Not that that stopped you…

Just to be clear - we should actually be looking at electricity consumption, not energy consumption. There is a difference.

Taking your fuel cell example - suppose it converts fuel to electricity at 50% efficiency (reasonable I think). Thus, it consumes 2 kWh of fuel to generate 1 kWh of electricity - it is the former number that is reported in the “energy consumed” - the 29,000 TWh. In the EIA statistics, 11,750 TWh of energy is consumed to produce electricity (mostly fuel) - as it appears power plants average ~35% efficiency, this results in 4,147 TWh of electricity being generated. For the purposes of this discussion, electricity generated ~= electricity consumed. Thus you can see that electricity from solar generation more relevantly applies to the 4,147 TWh than the 29,000 TWh. If we were able to replace 100% of our electricity needs with renewables, you would see that total energy consumed would go DOWN while electricity consumed remains constant, as we don’t have the waste energy lost due to the inefficiencies of thermal power plants.

It is interesting to see how this applies to the 60% of energy that is used for purposes other than electricity - 8,147 TWh (27.8 quadrillion Btu) is used for transportation, for example. Let’s assume that most of that energy is fuel burned in internal combustion engines and that on average they are 30% efficient. That means we are getting 2,444 TWh of transportation work done. If we were to completely electrify the transportation sector, it wouldn’t be unreasonable to expect it to be at least 75% efficient (electric motors are far more efficient than IC engines) - thus we would only need to consume 3,259 TWh of electricity to meet our same transportation needs. Of course, if this electricity is generated from 35% efficient power plants, then you would end up expending 9,311 TWh of energy - more than just burning the fuel directly in your cars! This should emphasize how important it is to transition our power plants to renewables before or at the same time as electrifying our transportation system - otherwise we will be looking at minimal energy benefit (not that net energy consumption is everything).

I assume industrial/residential/commercial energy usage is primarily heat, in which case burning fuels is already very efficient, so electrifying that with renewable energy would probably result in roughly the same total energy consumption (~9202 TWh). Still, on the whole, if everything was 100% renewables, you could expect to see total energy consumption in the States drop from 29,000 TWh to 16,600 TWh thanks to no longer wasting so much energy as heat.
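
Here’s a sketch that re-runs that accounting with the stated efficiencies (35% thermal plants, 30% IC engines, 75% electric drive); the small differences from the figures above come from rounding in the 29,000 TWh total:

```python
# Re-running the energy accounting above with the stated efficiencies.
# Small differences from the post's figures are rounding effects.
total_primary_twh = 29_000
electricity_fuel_twh = 11_750        # primary energy consumed to make electricity
transport_fuel_twh = 8_147           # primary energy consumed for transportation
other_twh = total_primary_twh - electricity_fuel_twh - transport_fuel_twh  # mostly heat

electricity_generated_twh = electricity_fuel_twh * 0.35   # ~4,100 TWh at 35% plants
transport_work_twh = transport_fuel_twh * 0.30            # ~2,440 TWh of useful work
transport_electric_twh = transport_work_twh / 0.75        # ~3,260 TWh if electrified

all_renewable_total = electricity_generated_twh + transport_electric_twh + other_twh
print(f"other (mostly heat): {other_twh:,} TWh")                # ~9,100
print(f"all-renewable total: {all_renewable_total:,.0f} TWh")   # roughly 16,500
```
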
The rest of your post is a good breakdown of the concepts of nameplate rating and capacity factor, which are crucial to understanding the actual contribution of power plants. People almost always refer to power plants by nameplate power rating, when it would often be more useful to describe them by how much energy they can be expected to produce over a given period. Even an average annual power rating would be nice - “500MW peak, 100MW annual average” - but I guess that wouldn’t be good for marketing.

Sorry if I wasn’t clear, but I meant that there is more rooftop area than just residential rooftops.

What do solar panels “cost” in terms of energy to make vs their expected energy return over the lifetime of use?

This is true in general, but not for this particular example. I was trying to show how the rated power output of a fuel cell and the rated power output of a solar panel can be the same, yet one can still deliver only a fraction of the energy over a year that the other can. The point was that you can’t simply say, “Hey, my power consumption is only 900W, so I just need to buy nine 100W solar panels, and I’m good to go.” I picked a fuel cell as a comparison out of the air, because for the purpose of the comparison I just needed another source of energy that’s capable of producing its rated power 24/7.
One of the reasons I started this whole thing with a very simplified model is that once you get away from WAGs, the real picture gets hellishly complex. I only listed a tiny number of factors that go into determining the overall efficiency of a rooftop solar power system. And of course, you have issues with other power generation methods as well. We haven’t even talked about the energy requirements for building solar panels, installing them on rooftops, and maintaining them. We haven’t talked about how many would be down for repairs at any given time, and how many would not be maintained correctly at all. We haven’t discussed the problem of baseload energy, and the fact that the power company would still need to maintain enough power generation for extended cloudy periods, which means when solar is running full blast the other power plants would be running at lower duty cycles and in some cases lower efficiency.

This is one reason why the power companies resist allowing people to feed power back into the grid. It makes their life a lot harder. If people started doing it en masse, we’d lose some efficiency from the baseload infrastructure.

This Study, which could admittedly be biased in favor of solar power, says the energy payback happens on average in 2 years for the types of panels they analyzed, and that solar cells generate 9 to 17 times the amount of energy required to create them.
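
One way to sanity-check those two claims against each other: the lifetime energy return is roughly the panel lifetime divided by the payback time. The lifetimes below are my own assumptions, not figures from the study:

```python
# Relating energy payback time to lifetime energy return.
# The panel lifetimes are assumptions, not figures from the study.
payback_years = 2.0                  # the study's average payback claim

for lifetime_years in (20, 25, 30):  # assumed panel lifetimes
    energy_return = lifetime_years / payback_years
    print(f"{lifetime_years}-year lifetime -> ~{energy_return:.0f}x energy return")
# 10x, 12-13x, 15x -- consistent with the quoted 9x to 17x range.
```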

Of course, you also have to factor in the energy required to ship them, install them, maintain them, and dispose of them. But on the other hand, a fair evaluation of alternatives would include the energy required to build power plants, mine the fuels, refine them, ship them, convert the fuel into electricity, etc.

Sorry, that capacity factor is going to be way off if applied to all rooftop solar. Those estimates are for PV solar in the most efficient circumstances - ideal siting, places where the sun shines, etc. - not just ordinary rooftops. Rooftop PV will be much less efficient across the board than this.

True - as I was directing the point to Bo I picked pro-solar cites so as to, like Sam, avoid any accusation of anti-solar bias, but it is true that the average capacity factor for solar panels on roofs across the nation would be much lower (probably in the range of 5-15% is my guess).

Turns out though that I made several errors in my calculations originally anyways - looking back I am not sure how I got them :smack:. 710 GW (nameplate) of solar power @ 25% capacity actually generates 1,555 TWh/year, not 1.06 TWh (that appears to be a daily value, but with the 25% applied twice :o) - so if we take a still more reasonable average value of 10% for capacity factor (average daily energy output = 2.4 Wh/W capacity, yearly output = 876 Wh/W capacity) then we are looking at 622 TWh per year, around 15% of annual power consumption. If you think 5% capacity factor is more likely then we’re still looking at 311 TWh/year, or 7.5% of annual consumption.
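
For anyone who wants to re-run these numbers, the arithmetic is just nameplate × hours × capacity factor (the 4,147 TWh consumption figure is the one used earlier in the thread):

```python
# Annual energy from 710 GW nameplate at several capacity factors, as a share
# of the ~4,147 TWh/year of US electricity consumption cited earlier.
NAMEPLATE_GW = 710
US_CONSUMPTION_TWH = 4_147
HOURS_PER_YEAR = 8760

for capacity_factor in (0.25, 0.10, 0.05):
    annual_twh = NAMEPLATE_GW * HOURS_PER_YEAR * capacity_factor / 1000
    share = annual_twh / US_CONSUMPTION_TWH
    print(f"CF {capacity_factor:.0%}: {annual_twh:,.0f} TWh/year "
          f"({share:.1%} of consumption)")
# CF 25%: ~1,555 TWh; CF 10%: ~622 TWh (15%); CF 5%: ~311 TWh (7.5%)
```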

I will throw in a data point. I work for a large supermarket chain in the northeast United States. We put solar panels on the roof of a few of our stores. The maximum power we ever were able to draw was 6% of the store’s requirements. On a bright sunny day in the middle of the afternoon. We could have gotten it to 20% by plastering the entire roof with solar panels, but it would have cost hundreds of thousands more.

It was a 100% PR move. We will never make up anything like the cost of the panels in energy savings. In fact the cost came out of the “Corporate Responsibility” department, not construction, maintenance or energy management. The engineers had to be dragged kicking and screaming into the project, because by simpler, cheaper energy conservation efforts we could have saved more energy and money. Like installing alarms that alert the staff or a central monitoring station that someone has jammed open a freezer door.

Or replacing lights with more efficient ones.

This might work a lot better in Arizona, but as I understand it the cost of electricity is so much lower there, that the payoff is even longer.

Snowboarder Bo:

I just got around to reading This Study, which you seem to think makes a slam-dunk case for solar power. It doesn’t.

First, they make it clear right from the beginning that the study was commissioned by an industry group with the specific intent of promoting solar power. They also point out that the group specifically requested the study consider installed costs of $2/Wp (that’s Watts peak, calculated as being power output at noon on a sunny day with the panel directly facing the sun). Furthermore, all the numbers in the study are based on significant government assistance, including lower interest rates for solar installation borrowing, renewable energy credits, reverse metering (selling excess power back to the grid at retail prices), a 15% federal tax credit for homeowners, AND carbon pricing on the alternatives through cap and trade or a carbon tax.

Those are the assumptions - all very much the best possible case for solar. For example, current prices for solar panels alone run from about $3 to $5 per Wp. Then there’s the cost of the installation, the mounting hardware, inverters, etc.

We have an example in this thread. Algher posted real-world costs for an installation:

4.510 kW DC(STC)
SolarCity Power System $28,811

That’s $6.39 / Wpdc.

According to the study, at that price the demand for solar power is nil. It remains nil even at 2/3 that price.

In fact, you don’t get to a payback of < 10 years (the point at which you get widespread adoption) until you get the price down to $1.25/Wpdc, AND you have all the other government incentives in place.

Oh, and by the way… remember my estimate of 100 billion sq feet of available roof space? You know, the number you called “Wrong. Laughably wrong”? The study’s estimate of available roof space for solar is… 84 billion square feet. My number was about 15% off, and I estimated on the high side, which is what you want to do with a back of the envelope calculation like this.

Finally, I said that real-world numbers indicated that we’d be unlikely to get to 5% of our power through solar, and more likely 1-2% in the near future.

According to the study, IF solar were $2/Wp today (it’s not), then by 2025 you could have an installed capacity of 47 GWp. At a capacity factor of 20%, that’s about .8% of US generating capacity.

Based on this report, all the conclusions of my original WAG message are completely true, within the level of accuracy I was shooting for.