Lead-acid batteries are pretty efficient at returning the energy used to charge them, better than most affordable alternatives (nickel-iron is also decent, but not very common). Considering charger and inverter losses as well, you can probably exceed 50% overall efficiency, but not by a lot. So the night rate would have to be half the peak rate or less to make it feasible at all, and 1/3 or 1/4 of it to make the payback time reasonable.
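To make the arithmetic concrete, here is a minimal sketch of that break-even logic: at ~50% round-trip efficiency you buy about 2 kWh off-peak for every 1 kWh you get back at peak, so the night rate has to be well under half the peak rate before the hardware cost even starts to pay back. All the rates, hardware cost, and daily shifted load below are illustrative assumptions, not measurements.

```python
# Rough break-even arithmetic for off-peak battery storage.
# All numbers below are illustrative assumptions, not measurements.

round_trip_eff = 0.50   # charger * battery * inverter, per the post above
peak_rate = 0.30        # $/kWh drawn from the grid at peak (assumed)
night_rate = 0.10       # $/kWh paid to charge overnight (assumed)

# Cost of delivering 1 kWh at peak time from the battery:
# you must buy 1 / round_trip_eff kWh at the night rate.
cost_per_delivered_kwh = night_rate / round_trip_eff
savings_per_kwh = peak_rate - cost_per_delivered_kwh
print(f"Effective cost per peak kWh from storage: ${cost_per_delivered_kwh:.2f}")
print(f"Savings per shifted kWh:                  ${savings_per_kwh:.2f}")

# Simple payback on the hardware, ignoring battery wear and interest.
hardware_cost = 2000.0      # batteries + charger + inverter (assumed)
kwh_shifted_per_day = 10.0  # daily load moved to off-peak (assumed)
if savings_per_kwh > 0:
    payback_years = hardware_cost / (savings_per_kwh * kwh_shifted_per_day * 365)
    print(f"Simple payback: {payback_years:.1f} years")
else:
    print("Night rate too high relative to peak rate; storage never pays back.")
```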
One big problem with lead-acid is that you seriously reduce the lifespan if you try to use all the capacity. The battery will be badly degraded after just a handful of 0-100% cycles. For reasonable life, you need to oversize the battery bank by 2X, and 4X is better if you can manage it. A battery cycled between 100% and 75% charge will last perhaps 10X as long as one cycled between 100% and 25%, so a battery 3X bigger ends up cheaper in the long run, probably even when you consider the time value of the initial purchase.
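A quick sketch of that oversizing trade-off, in cost per kWh actually delivered over the bank's life: the cycle-life and price figures are illustrative assumptions loosely based on the ~10X lifespan claim above, and real numbers depend on the battery line.

```python
# Rough cost-per-kWh comparison of a small bank cycled deeply vs. a larger
# bank cycled shallowly. Cycle-life figures are illustrative, loosely based
# on the ~10x lifespan claim above; real numbers depend on the battery line.

def lifetime_cost_per_kwh(bank_kwh, price_per_kwh_capacity, depth_of_discharge, cycles):
    """Capital cost divided by total energy delivered over the bank's life."""
    capital = bank_kwh * price_per_kwh_capacity
    delivered = bank_kwh * depth_of_discharge * cycles
    return capital / delivered

price = 150.0  # $ per kWh of lead-acid capacity (assumed)

# Small bank, cycled 100% -> 25% (75% depth of discharge), short life.
small = lifetime_cost_per_kwh(bank_kwh=10, price_per_kwh_capacity=price,
                              depth_of_discharge=0.75, cycles=300)

# 3x larger bank, cycled 100% -> 75% (25% depth of discharge), ~10x the cycles.
large = lifetime_cost_per_kwh(bank_kwh=30, price_per_kwh_capacity=price,
                              depth_of_discharge=0.25, cycles=3000)

print(f"Deeply cycled small bank:    ${small:.3f} per delivered kWh")
print(f"Shallowly cycled large bank: ${large:.3f} per delivered kWh")
```

Both banks deliver the same 7.5 kWh per cycle in this example; the larger one just does it far more times before it wears out, which is why it comes out cheaper per kWh despite the bigger up-front purchase.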
There is a town in south Texas, Presidio, that is doing this on a huge scale. The battery takes up a large building and is a sodium-sulfur design that must be maintained at high temperature. In this case the motivation is that the town is served by a single grid feed that is not overly reliable. The battery is intended to provide time to repair the grid tie, and is less expensive than running a second feed.
Just want to point out that charging efficiency varies with state of charge. It is more efficient to get a lead-acid battery from 60% capacity to 80% than from 80% to 100%. Efficiency is usually higher at a lower state of charge, but the lifespan of lead-acid is reduced the more time the battery spends at a low state of charge.
Yes, but you’re comparing a multi-million-dollar, purpose-built, high-voltage direct current transmission system with a home system - two very different beasts indeed.
In your example the gains in transmission efficiency (from about 7% losses per 1000 km down to around 3% per 1000 km) no doubt more than cover the inversion/rectification losses, especially given the enormous amount of power being transmitted. And there are no storage losses to consider.
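A back-of-the-envelope comparison, using the per-1000-km loss figures above: line losses grow with distance while the conversion overhead is roughly fixed, which is why HVDC wins on long links. The converter-station loss below is an assumed round number, not a quoted spec.

```python
# Back-of-the-envelope AC vs. HVDC line losses, using the per-1000-km
# figures from the post above. The converter-station loss is an assumed
# round number; real stations vary.

ac_loss_per_1000km = 0.07   # from the post
dc_loss_per_1000km = 0.03   # from the post
converter_loss = 0.015      # both converter stations combined (assumed)

for km in (250, 500, 1000, 2000):
    ac_loss = ac_loss_per_1000km * km / 1000
    dc_loss = converter_loss + dc_loss_per_1000km * km / 1000
    better = "DC" if dc_loss < ac_loss else "AC"
    print(f"{km:5d} km: AC {ac_loss:5.1%}  DC {dc_loss:5.1%}  -> {better} wins")
```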
In the home example we’re dealing with much smaller amounts of power, being rectified, used to charge a bank of batteries, drawn back from the batteries, inverted and then used in the house.
In fact my initial estimate of a 30-40% loss seems to have been optimistic, given Kevbo’s point above: “Considering charger and inverter losses as well, you can probably exceed 50% overall efficiency but not by a lot.” Note that this is an estimate of the total loss over the whole system, not just the losses due to inversion/rectification alone (which, on reflection, I might not have made clear in my earlier post).
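To show how the stages in the home chain compound into that overall figure, here is a minimal sketch; the per-stage efficiencies are assumed round numbers chosen so the product lands near the ~50-60% quoted above, and actual hardware varies.

```python
# How per-stage losses in the home setup compound. Stage efficiencies are
# assumed round numbers, not measured values; actual hardware varies.

stages = {
    "charger/rectifier": 0.85,
    "battery (charge/discharge)": 0.80,
    "inverter": 0.85,
}

overall = 1.0
for name, eff in stages.items():
    overall *= eff
    print(f"{name:30s} {eff:.0%}")

print(f"{'overall round trip':30s} {overall:.0%}")
print(f"{'total loss':30s} {1 - overall:.0%}")
```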
Oh, and to put this in context, this is still pretty good compared to internal combustion engines, which do well to exceed 35% efficiency (the rest is lost as heat, noise and friction).