I am trying to help someone who wishes to design a hand-powered generator for charging small batteries, cell phones, etc., during a power outage. But I think I'm doing something wrong with the math.
Energy will come from someone pulling a handle on a cord attached to a generating device, much like the pull-cord on a lawnmower or other small gas engine. As a starting point, I am guessing (optimistically) that they'll exert a force of 100 lbs over a distance of one foot. So the basic energy input of each pull is 100 ft-lbs.
A typical AA rechargeable battery can deliver 2500 mAh at about 1.5 V, which is about 3.75 watt-hours of energy (2.5 Ah × 1.5 V).
According to the conversion calculator at the site, one kilowatt-hour of energy is equivalent to about 2,650,000 foot-lbs.
So 3.75 watt-hours is 0.00375 kWh, and 0.00375 × 2,650,000 ≈ 10,000 foot-lbs.
That means, even at 100% efficiency, my friend would have to pull the cord 100 times to get sufficient energy to fully charge one AA cell.
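In case it helps anyone spot a slip, here is the same chain redone in SI units (joules) as a quick Python sketch. The 100 lb pull and the 2500 mAh / 1.5 V cell are just my optimistic guesses from above, not measured values:

```python
# Sanity check of the calculation above, done in joules so the
# foot-pound conversion factor isn't taken on faith.
# Input figures are my own optimistic assumptions, not measurements.

LBF_TO_N = 4.448   # 1 pound-force in newtons
FT_TO_M = 0.3048   # 1 foot in meters

# Energy per pull: force x distance
pull_energy_j = (100 * LBF_TO_N) * (1 * FT_TO_M)   # ~135.6 J per pull

# Energy stored in one AA cell: capacity x voltage, converted to joules
cell_energy_j = (2.5 * 1.5) * 3600                 # 2.5 Ah * 1.5 V = 3.75 Wh = 13,500 J

pulls_needed = cell_energy_j / pull_energy_j       # assumes 100% efficiency

print(f"Energy per pull: {pull_energy_j:.1f} J")
print(f"Energy in one AA cell: {cell_energy_j:.0f} J")
print(f"Pulls needed at 100% efficiency: {pulls_needed:.0f}")
```

Done that way I get roughly 135 J per pull and roughly 13,500 J in the cell, which again comes out to about 100 pulls, so if there is an error it doesn't seem to be in the foot-pound conversion itself.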
That's a surprising result. On the one hand, I find it hard to believe he'd have to do that much work to recharge a battery. Looking at it another way, it's hard to believe a little AA battery could in theory power a machine to do as much work as raising 100 lbs 100 feet.
Math is math and I can’t argue with the results, but I’m wondering if I’ve made a mistake somewhere. I know errors easily creep into calculations for food energy due to confusion about calories and kilocalories, and I wonder if I’m overlooking something like that here.
So could someone verify, or correct, my calculations?
Thanks.