whole-house generator and fuel efficiency — need answer kinda fast

I’m not explaining this well, am I? Look at this curve:
http://www.ecoult.com/applications/microgrids-telecommunications-and-remote-diesel

Let me think of a rough analogy… hmm… this is going to sound really bizarre, but let’s say you’re a filmmaker who made a movie and you want to show it to 200 of your friends, but your home theater only fits 50 (you’re not THAT rich). Ideally, to save on electricity costs, your time chaperoning them, the number of times you’d have to run the minibar dishwasher, etc., you’d want them to come in groups of 50 at a time so you only have to do it 4 times.

But the reality is that they can’t all make it at once, so you end up needing 15 different showings to get all 200 people to finally see the movie, with the extra capacity just being wasted. There are empty seats, the power for the projector room is still being used, you’re still using your time, but it’s just not operating at capacity. The wasted energy isn’t doing anything useful, just being turned to heat.

Generators are similar in that to do ANY work, they require X amount of fuel. But it’s not a linear relationship such that 100% load = 100% fuel consumption rate and 50% load = 50% fuel. It might actually be the case that 50% load uses 70% of the fuel (made-up numbers), so that 20% is just being wasted as heat. But if you had a battery there instead, that extra 20% could’ve gone to useful charging instead. You’ve turned wasted inefficiency into useful power.
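
If it helps to see it in code, here's a toy Python sketch of that same made-up curve. The 40% "just to keep the engine spinning" figure is an assumption for illustration, not a spec for any real generator:

```python
# Fuel draw does not scale linearly with load, so at partial load a chunk of
# the fuel is doing nothing useful. Made-up numbers, matching the post above.

def fuel_fraction(load_fraction):
    """Hypothetical fuel curve: even at zero load the engine burns ~40% of
    its full-load fuel rate just to keep itself spinning."""
    idle_fraction = 0.4
    return idle_fraction + (1.0 - idle_fraction) * load_fraction

for load in (0.25, 0.50, 0.75, 1.00):
    print(f"{load:>4.0%} load -> {fuel_fraction(load):.0%} of full-load fuel rate")
```

With that made-up curve, 50% load comes out to 70% of the full-load fuel rate, which is the gap the battery could be soaking up instead.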

In the real world, it doesn’t make any sense to waste time on this if it’s just a once-in-a-blue-moon emergency (like a hurricane). But if it’s something predictable, like a winter backup system for an off-grid cabin, it might be worth the consideration.

The idea is that you can run the generator at max efficiency a few hours a week, charge the battery bank, and let that handle the small loads. Batteries aren't perfectly efficient either, but they're a lot better than generators. In the extreme case, consider running one single LED light bulb: for a battery bank, it would be almost effortless. For a generator, it would be a hell of a lot of work and a lot of fuel, but only a tiny percentage of that power (maybe 20%? not sure) would actually go towards powering the light bulb; the rest would just be used to keep the generator's internals running and then burn off as waste heat.
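
To put rough numbers on the generator-plus-battery idea, here's a hedged Python comparison using the same made-up fuel curve as before. The generator size, burn rate, house load, and battery round-trip efficiency are all assumptions for illustration, not measurements:

```python
# Strategy 1: run the generator 24/7 at a light load.
# Strategy 2: run it near full load just long enough to charge a battery bank
# that carries the light load for the rest of the day.

GEN_RATED_KW = 5.0          # assumed generator size
FULL_LOAD_GPH = 0.75        # assumed fuel burn at 100% load (gal/hr)
HOUSE_LOAD_KW = 0.5         # assumed small continuous load (lights, fridge)
BATTERY_EFFICIENCY = 0.85   # assumed round-trip battery efficiency

def gph(load_kw):
    """Same hypothetical curve as before: ~40% of full-load fuel at idle."""
    load_fraction = load_kw / GEN_RATED_KW
    return FULL_LOAD_GPH * (0.4 + 0.6 * load_fraction)

fuel_continuous = 24 * gph(HOUSE_LOAD_KW)

energy_needed_kwh = 24 * HOUSE_LOAD_KW / BATTERY_EFFICIENCY
charge_hours = energy_needed_kwh / GEN_RATED_KW
fuel_battery = charge_hours * gph(GEN_RATED_KW)

print(f"Run 24/7:         {fuel_continuous:.1f} gal/day")
print(f"Charge batteries: {fuel_battery:.1f} gal/day ({charge_hours:.1f} h of generator runtime)")
```

With these assumed numbers the battery strategy burns a fraction of the fuel, because the generator only runs when it's near its sweet spot.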

(2nd and last attempt at an analogy)

Water is another often-used one. Imagine a water wheel in an artificial canal that's used to generate power for your off-grid cabin. It takes a certain amount of water flow/pressure to start the wheel and keep it going. Let's say you had a small dam of water behind your house, maybe 5000 gallons' worth. If you took just an eyedropper of water at a time and put it on your wheel, it would do absolutely nothing for power. It would just make a tiny wet spot and then evaporate a few minutes later. Likewise, you could get a fleet of firefighting helicopters to come, suction all your water out, and dump ALL of it on your wheel at once. It might turn through a cycle or two, or it might just completely break. In between those two extremes, your water wheel can handle a range of flow rates depending on your needs, maybe the optimal being 5 gallons/minute. At 4.5 gallons per minute, it will still sort of work, but it'll start and stop and stutter, and some of the water will just leak off the side, wasted.

My suggestion for conserving a limited fuel supply is to not run the generator full-time.

I also have one of the whole-home generators, a Briggs & Stratton 40346.

Yeah, mostly. These days you wouldn't want to just needlessly waste that power by diverting it to heat; you'd just give the motor less power to start with. But I guess back then they didn't have better ways to do that, so resistors just sat there and wasted it, burning golf course grass and maybe a few rich asses.

But the principle of efficiency is similar: when you use electricity (or any fuel, really), you can either apply it to do something useful, or you can just spin a motor / turn a wheel / move a piston / heat a resistor that does nothing useful and just burns off as heat. That’s what inefficiency is, essentially… the % of input fuel that doesn’t get turned into useful work output.
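
Put as a number (toy figures, just restating the paragraph above):

```python
# Efficiency is simply the fraction of input energy that comes out as useful
# work; everything else ends up as heat. Example figures are illustrative only.

def efficiency(useful_output, total_input):
    return useful_output / total_input

fuel_energy_in_kwh = 10.0   # assumed energy content of the fuel burned
electricity_out_kwh = 2.5   # assumed useful electrical energy delivered

eff = efficiency(electricity_out_kwh, fuel_energy_in_kwh)
print(f"Efficiency: {eff:.0%}")
print(f"Waste heat: {1 - eff:.0%}")
```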

Modern systems are designed to recapture some of that lost efficiency. Hybrid cars, for example, will use regenerative braking to recapture part of the energy they used to accelerate the car in the first place. Some solar power systems will use excess power to heat a home water heater instead of selling it to a utility for less than it’s worth, thereby saving on the gas bill.

Thanks, everyone, for the help. And the interesting sidebars.

The answers you’ve received are pretty solid. I have a slightly simpler analogy that might make things a little more self-evident:

Most non-Diesel home generators use Otto-cycle engines, which are the same sort of engine found in cars, so the things that affect efficiency for car engines also apply to generators. (Some cars such as the Prius use Atkinson-cycle engines, but that's neither here nor there.)

The OP asked if the fuel consumed was constant regardless of the load. Well, cars are a perfect analogy in this case. Let's say you're rolling down the interstate with the cruise control set to 65 mph. You're in central Illinois, so it's dead flat, and it's a windless day. Your fuel consumption is basically constant under those conditions. But you're getting a little warm, so you turn on the air conditioning. Does your fuel consumption rise? Of course it does. In fact, even turning on your headlights increases fuel consumption marginally. Adding more load to a generator does the same thing for the same reason.
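
For a rough sense of scale, here's a back-of-the-envelope Python version of the headlight example. The alternator and engine efficiency figures are assumptions, not measurements:

```python
# How much extra fuel does a small electrical load cost at cruise?
# Electrical watts -> mechanical watts at the alternator -> fuel energy.

GASOLINE_KWH_PER_GAL = 33.7   # approximate energy content of gasoline
ALTERNATOR_EFFICIENCY = 0.55  # assumed
ENGINE_EFFICIENCY = 0.30      # assumed engine efficiency at steady cruise

def extra_gph(electrical_watts):
    mechanical_kw = electrical_watts / 1000 / ALTERNATOR_EFFICIENCY
    fuel_kw = mechanical_kw / ENGINE_EFFICIENCY
    return fuel_kw / GASOLINE_KWH_PER_GAL

print(f"Headlights (~110 W): {extra_gph(110):.4f} gal/hr extra")
print(f"1 kW of extra load:  {extra_gph(1000):.3f} gal/hr extra")
```

Tiny for headlights, very noticeable once you start hanging kilowatts of load on the engine.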

I think some of the non-obviousness stems from the common misconception that the alternator's resistance to turning varies only with engine RPM. In fact, you can vary the alternator's output (throttle it) at a constant rotational speed by changing its field current. Alternators make electricity by moving wires through a magnetic field. If you change the strength of the magnetic field, you change the output of the alternator.

Let’s say you have a 10 kW gasoline generator. It turns at a constant RPM to produce AC at 60 Hz. If you’re only drawing 5 kW on the electrical side, the field current is decreased to match the load and the throttle is partly closed to limit the amount of fuel and air, thereby reducing consumption.

Now you plug in several window air conditioners, which requires your generator to increase its output from 5 kW to 10 kW. The RPMs have to remain constant, of course. So the field current increases, making the alternator harder to turn. In response, the generator opens the throttle on the engine, burning more fuel even though the RPMs haven’t changed.
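
Here's a deliberately oversimplified Python sketch of that interaction. Real voltage regulators and governors are closed feedback loops; the no-load throttle setting and wide-open fuel rate below are made up:

```python
# Field current tracks the electrical load; the governor opens the throttle to
# hold RPM against the resulting mechanical drag on the alternator.

RATED_KW = 10.0
TARGET_RPM = 3600   # a 2-pole alternator spinning for 60 Hz

def run_step(load_kw):
    # Field current scales (roughly) with how much output the load demands.
    field_fraction = load_kw / RATED_KW
    # More field current -> more torque needed to spin the rotor -> the
    # governor opens the throttle to keep RPM constant.
    throttle_fraction = 0.2 + 0.8 * field_fraction   # 0.2 = assumed no-load setting
    fuel_gph = 1.0 * throttle_fraction               # assumed 1.0 gph wide open
    return field_fraction, throttle_fraction, fuel_gph

for load in (5.0, 10.0):
    field, throttle, fuel = run_step(load)
    print(f"{load:4.0f} kW load @ {TARGET_RPM} RPM: "
          f"field {field:.0%}, throttle {throttle:.0%}, ~{fuel:.2f} gph")
```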

There’s something interesting that falls out of this: for a given output, Otto-cycle engines are generally more efficient at low RPM with an open throttle than at higher RPM with a partly closed throttle. That’s why driving down the highway at 2000 RPM in 6th gear saves gas compared to 5000 RPM in 4th gear.

The throttle itself is inefficient because it restricts airflow and requires that the engine work harder to pump air and fuel past the restriction. The classic analogy here is that it takes more effort to breathe through a straw than through an open mouth.

On a generator, RPMs are constant, but power output isn’t. The throttle is most open (and thus the engine most efficient) when the load is near the top of the rated output. If your load is 1 kW and you buy a 10 kW generator, you’ll burn fuel at a faster rate than if you bought a generator that maxes out at 1.2 kW, because the bigger generator’s throttle will be mostly closed. It’s not a coincidence that the word “throttle” is a synonym for “choke.”
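
To put rough numbers on the sizing point, here's a toy comparison using the same kind of hypothetical part-load curve as earlier. The burn rates are assumptions, not specs for real machines:

```python
# The same 1 kW load costs more fuel on an oversized generator because the
# fixed "keep the engine spinning" overhead scales with the size of the engine.

def gph(load_kw, rated_kw, full_load_gph):
    """Hypothetical part-load curve: ~40% of full-load fuel just to idle."""
    return full_load_gph * (0.4 + 0.6 * load_kw / rated_kw)

load = 1.0  # kW
big   = gph(load, rated_kw=10.0, full_load_gph=1.4)   # assumed big-genset burn rate
small = gph(load, rated_kw=1.2,  full_load_gph=0.2)   # assumed small-genset burn rate

print(f"10 kW genset at 1 kW:  {big:.2f} gph")
print(f"1.2 kW genset at 1 kW: {small:.2f} gph")
```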

This is all greatly simplified. For example, smaller generators tend to be more cheaply constructed and therefore less efficient than large ones. But you get the idea.
N.B.: I’m a mechanical engineer, not an electrical one. If I’ve made mistakes on how field current and alternator throttling work, I’d welcome corrections.

This is why inverter generators are hugely more efficient. With an inverter, the engine in the generator can run at basically any RPM and produce AC or DC; it doesn’t matter. No matter what it produces, as long as the total power is greater than the load power (RMS current times voltage), the inverter electronics can just produce the output voltage and frequency desired. More or less: there are practical design restrictions (the components chosen for inverters will have a maximum voltage they can work with), so it’s not quite that open-ended.

Anyways, this means that the generator’s governor, which will be controlled by a microprocessor running an algorithm, can choose the engine RPM and throttle setting that is most efficient for the load the generator is experiencing. Inside the microprocessor’s memory will be a lookup table of efficiencies; it can determine the load from how much current is flowing through shunt resistors somewhere in the inverter, and it can determine the voltage using ADCs.
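
A sketch of how that firmware logic might be organized. The table values, RPMs, and function names here are assumptions for illustration, not anybody's actual inverter code:

```python
# Hypothetical table: for each load level (watts), the RPM at which this
# particular engine burns the least fuel while still covering that load.
EFFICIENCY_TABLE = [
    (500,  2000),
    (1000, 2400),
    (2000, 2900),
    (3000, 3600),
]

def measured_load_watts(shunt_amps, bus_volts):
    """Load as the inverter would see it: shunt current times bus voltage."""
    return shunt_amps * bus_volts

def choose_rpm(load_watts):
    """Pick the lowest table entry that still covers the load."""
    for max_watts, rpm in EFFICIENCY_TABLE:
        if load_watts <= max_watts:
            return rpm
    return EFFICIENCY_TABLE[-1][1]   # pegged at full output

load = measured_load_watts(shunt_amps=6.0, bus_volts=120.0)   # 720 W
print(f"Load {load:.0f} W -> run the engine at {choose_rpm(load)} RPM")
```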

So in effect, this is how inverter generator manufacturers can report amazing specs like 1 gallon of fuel consumed over 12 hours at 25% generator load: they stay plenty efficient at low load.

Unfortunately, the problem is that they are expensive. A 3 kW Champion-brand dual-fuel inverter generator is $1000, and it only outputs 120 V, so you can’t run a house.

Champion’s 7 kW ‘dirty and loud’ generator, which is also capable of 240-volt output (so it can run a house), is $600-$800.

But you need to store a ton less fuel for an inverter generator. Probably one gas-grill propane tank for every 2 days on generator, or 2 gallons of fuel you could siphon from the car per day. You would just need to set your loads up to work within 3 kW at 120 volts. The main thing is that central air conditioning wants more power at higher voltage - this is why having a 120-volt window unit or mini-split somewhere in your house would be handy.
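
Rough sanity check on those storage numbers, leaning on the "1 gallon per 12 hours at ~25% load" style of spec mentioned above. The tank size is an approximation, and this deliberately ignores that a gallon of propane has less energy than a gallon of gasoline:

```python
SPEC_GAL_PER_12H = 1.0     # quoted-style inverter spec: ~1 gal per 12 h at ~25% load
PROPANE_TANK_GAL = 4.7     # a standard 20 lb grill tank holds roughly 4.7 gallons

gal_per_day = SPEC_GAL_PER_12H * 2              # running around the clock
days_per_tank = PROPANE_TANK_GAL / gal_per_day  # ignores propane's lower energy per gallon

print(f"Gasoline: roughly {gal_per_day:.0f} gal siphoned from the car per day")
print(f"Propane:  one grill tank lasts roughly {days_per_tank:.1f} days")
```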

If it matters, my 5 kW Kohler (gasoline) lists the following for fuel usage per load %.

0.76 gph @ 100%
0.67 gph @ 75%
0.58 gph @ 50%
0.48 gph @ 25%

I know this won’t scale linearly to your whole-house set, but it’s a place to start. If it takes mine three-quarters of a gallon per hour to produce 5000 W, you would probably be OK multiplying that as needed to estimate your usage. It will probably be conservative, since your larger genset isn’t working at max to produce 5 kW like mine is.
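
If you want to interpolate between those points and do the "multiply as needed" scaling in code, something like this works as a ballpark. The scaling to a bigger genset is the rough part; a larger unit will have its own curve:

```python
# Linear interpolation over the Kohler figures quoted above, plus a rough
# scale-up to another generator size.

KOHLER_5KW_CURVE = [          # (load fraction, gallons per hour)
    (0.25, 0.48),
    (0.50, 0.58),
    (0.75, 0.67),
    (1.00, 0.76),
]

def gph_5kw(load_fraction):
    pts = KOHLER_5KW_CURVE
    load_fraction = max(pts[0][0], min(pts[-1][0], load_fraction))
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= load_fraction <= x1:
            return y0 + (y1 - y0) * (load_fraction - x0) / (x1 - x0)

def rough_gph(load_kw, rated_kw):
    """Scale the 5 kW curve to another genset size, as suggested above."""
    return gph_5kw(load_kw / rated_kw) * (rated_kw / 5.0)

print(f"5 kW unit at 3 kW load:    {gph_5kw(3 / 5):.2f} gph")
print(f"~20 kW unit at 10 kW load: {rough_gph(10, 20):.2f} gph (very rough)")
```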

My generator is 20 kW and runs off natural gas (the manual gives numbers for both).

You’ll definitely notice an increase in your natural gas or propane bill.

A late spring storm knocked out my power for three days. My NG bill that time of year usually drops under $25. IIRC it rose to $40 that month. Not a big hit, but imagine living in Houston now. Suppose you ran the generator for six weeks; your gas bill would be several hundred dollars. There’s also the maintenance cost on the generator. They aren’t intended to be run for months at a time.
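
The arithmetic behind that guess, using my bill as the only data point:

```python
# Extrapolating one three-day outage to a six-week one. A generator carrying
# air conditioning in a Houston summer would draw harder than mine did, so
# the real number would likely be higher.

extra_cost_3_days = 40 - 25          # dollars: bill rose from ~$25 to ~$40
extra_per_day = extra_cost_3_days / 3
days = 6 * 7                          # six weeks
print(f"~${extra_per_day:.2f}/day of generator gas, so six weeks ≈ ${extra_per_day * days:.0f}")
```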

Power from the utility is cheaper than anything we can generate ourselves.

Unless you’re in the desert and can supplement power with solar panels.

You don’t have to be in the desert anymore. Prices have come down a lot. You can check insolation maps if unsure.

(used to work in the industry)

Maybe you should calculate the whole house’s wattage first

You can use the wattage calculator tool at this link:

Then analyze how much electricity is used.
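
At its simplest, that calculation is just adding up running watts. The figures below are generic placeholders, so check your own appliance nameplates:

```python
# Sum the running watts of everything you want powered at the same time.
loads_watts = {
    "refrigerator":         700,
    "furnace blower":       800,
    "well pump":           1000,
    "lights + electronics": 500,
    "window AC":           1200,
}

total = sum(loads_watts.values())
print(f"Estimated running load: {total} W ({total / 1000:.1f} kW)")
```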