electrical question - voltage regulation and batteries

OK, I know there’s some EE types lurking in here… I’m doing a little personal project and trying to figure out the best way to regulate voltage coming out of a dual cell battery pack using a voltage regulator chip (LM138, if you are interested).

The battery pack can either be wired up as two 12V cells in parallel or as one 24V cell. I want to regulate the voltage to various levels under 12V. Would there be a difference in battery life due to stepping down the voltage from a higher level if I regulated from 24V instead of 12V?

I think that another difference would be halving the available amperage if I hooked the batteries up in series, correct?

Thanks in advance!


Battery life will be better if you step down the minimum amount.

For a given load your regulator will produce a certain current. This current will need to be supplied by the battery. If you set this up as 24V then both cells will need to produce that level of current. If instead you set this up as 12V then each cell will need to produce 1/2 the current. A battery only has a certain amount of charge in it so if you use less current then the charge is depleted slower and the battery lasts longer.
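To put rough numbers on that (cell capacity and load current here are assumed values, just for illustration):

```python
# Rough sketch: compare per-cell current for series vs. parallel packs
# feeding a regulator that delivers a fixed load current.

CELL_CAPACITY_AH = 7.0  # assumed capacity of each 12V cell
LOAD_CURRENT_A = 1.0    # assumed regulator output current (for a linear
                        # regulator, input current ~= output current)

# Series (24V): the full load current flows through both cells.
series_cell_current = LOAD_CURRENT_A
# Parallel (12V): the two cells share the load current equally.
parallel_cell_current = LOAD_CURRENT_A / 2

series_runtime_h = CELL_CAPACITY_AH / series_cell_current
parallel_runtime_h = CELL_CAPACITY_AH / parallel_cell_current

print(series_runtime_h)    # 7.0 hours
print(parallel_runtime_h)  # 14.0 hours
```

Same charge per cell, but the parallel pack drains each cell half as fast.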

Bad idea trying to get more life out of them by wiring in series for 24V. You will get more time above 12 volts, but the batteries will be dead and likely damaged from over-discharge before the voltage drops to that point.

IIRC the LM138 probably won’t even work in this application, as it will likely burn up from trying to dissipate so much power. You could use a switching power supply to regulate the voltage down this far; in fact, it might be pretty easy to salvage a PC power supply to do just that. Still, you are better off running the batteries in parallel and making sure to turn everything off when the battery voltage drops too low.

A linear power regulator is essentially an automatic variable resistor. Like any resistor, it dissipates power according to Ohm’s Law. For example, if you input 24V and output 10V at 100mA (1W), there’s 14V across the regulator and 100mA flowing through it, so that’s 1.4W being dissipated by the regulator. Whereas if you use 12V input and 10V 100mA output, that’s only 2V across the regulator, and 0.2W power dissipation.
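The arithmetic above generalizes to a one-line formula, since the load current flows straight through the pass element:

```python
# Power dissipated inside a linear regulator: (Vin - Vout) * Iout.

def linear_reg_dissipation(v_in, v_out, i_out):
    """Watts burned in the regulator itself (not delivered to the load)."""
    return (v_in - v_out) * i_out

print(round(linear_reg_dissipation(24, 10, 0.1), 2))  # 1.4 W
print(round(linear_reg_dissipation(12, 10, 0.1), 2))  # 0.2 W
```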

But the configuration also depends on the exact output voltage needed. If you need 12V output, then the input to the regulator must be larger than 12V. (How much larger depends on the regulator - look up the data sheet.) So you may be stuck having to use 24V. Whether this would burn up the regulator depends on how much power the regulator is forced to dissipate vs. how much power it’s designed to handle. Again, look up the data sheet. Or it may be worth the trouble to find a 13V or 14V battery pack.
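Here's the headroom check in sketch form. The dropout figure below is an assumption for illustration; the real number varies with load current and temperature, so check your part's data sheet:

```python
# A linear regulator needs Vin >= Vout + dropout voltage.

DROPOUT_V = 2.5  # assumed dropout; look up the actual value for your part

def input_ok(v_in, v_out, dropout=DROPOUT_V):
    """True if the input voltage leaves enough headroom for regulation."""
    return v_in >= v_out + dropout

print(input_ok(12, 12))  # False: can't make regulated 12V from a 12V pack
print(input_ok(24, 12))  # True: plenty of headroom (but lots of heat)
```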

Thanks for the info gazpacho.

I’m not quite sure I understand what you are saying, Padeye - I am using non rechargeable batteries. Why would they be damaged from over discharge? Would a really big heatsink help for regulating from 24V to say 6V? Thanks,

Hmm… I will look into some switching regulators then if I need a big drop in voltage. Thanks

If you don’t care about damaging the batteries then it is less of an issue, but it is still wasteful of the capacity of the batteries — unless, of course, your goal is to heat your house with a voltage regulator.

What are you doing? What are the voltage and current requirements? With a little more information we could give you a more useful answer. If you are using non-rechargeable batteries my guess is you have several 1.5V cells in series. There are no 12V cells that I am aware of. If you are concerned about the linear voltage fade of alkaline or similar batteries, why not switch to nickel metal hydride, which have a more constant voltage during discharge until they get close to being spent?

Switching regulators are more complicated. My experience with them is that they involve a fair amount of calculation for the various inductors and capacitors.

You really need to determine the load current you expect so you can find out if the linear regulator will overheat.

If you really want 6V then using 24V instead of 12V is a complete waste if you have a linear regulator. If you need 11.5V then 12V may not be good enough, especially since the batteries are probably 12 ± 1 or 2 volts.
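To see just how much of a waste: a linear regulator draws roughly the same current at its input as it delivers at its output, so its efficiency is about Vout/Vin no matter the load:

```python
# Linear regulator efficiency is approximately Vout / Vin,
# since input current ~= output current.

def linear_efficiency(v_in, v_out):
    return v_out / v_in

print(linear_efficiency(24, 6))  # 0.25 -> 75% of the energy becomes heat
print(linear_efficiency(12, 6))  # 0.5  -> 50% becomes heat
```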

A heat sink on the regulator will help dissipate power, but only up to the point allowed in the data sheet. The data sheet probably assumes such a heat sink.

What type of non-rechargeable battery? You need to worry about how much its voltage falls before it’s depleted completely. You might also need to worry about internal resistance of the battery.

What’s the current and voltage requirement of your device? And how crucial is battery life? There are all sorts of trade-offs you can (and need to) make depending on these factors.

A lot of the newer regulators aren’t too complicated to figure out. The one thing about switching regulators you have to watch out for is that a lot of them don’t regulate properly if they don’t have a minimum load on them, so if your load varies too much they may go out of regulation on the bottom end.

If you have unregulated 12 volts and you need regulated 12 volts, plus you need some lower regulated voltage like 5 volts, you can use a 5 volt regulator and then use a 5 volt to 12 volt DC-DC converter. In the old days this was very costly and inefficient, but with the newer regulators and converters this is getting to be fairly cheap and easy.

Another possible consideration is that switching regulators are electrically “noisy” where linear regulators have a much smoother output. Sometimes this makes linear regulators desirable in spite of their much worse efficiency.

Ever wonder why you never (or seldom) see devices designed with batteries in parallel? There’s a really good reason…

Batteries in parallel discharge each other during storage quite rapidly. One will be a bit higher in voltage than the other and attempt to charge the other. This goes back and forth until both are discharged. Rechargables can actually discharge each other to the point they are ruined; they may never again hold a full charge.

This cycling is only significant when the batteries are stored in parallel. There’s a couple of ways to address this but it basically comes down to separating the parallel batteries. A switch or other method of disconnecting the terminals will work.

The circuit will be driving several different devices (not at the same time), so I don’t know if I can pin down a specific load current. Plus, each device will variably draw anywhere from 0.2 to 2 amps. How much wiggle room would I have with respect to component selection (caps, etc.) if I just want to have one circuit with a variable resistor to set the output voltage for different devices?
The devices are all designed to be battery operated, so I think they should have a little allowance for voltage variation, right? Any other issues besides cost if I use a switching regulator instead of a linear one?

Thanks for the input!

I’m still not getting the whole picture. You have multiple devices, all of which are designed for battery operation? What’s the nominal battery voltage each one is designed for? Do they all run on the same voltage?

You said your devices do not have constant current draw, so you can’t use a resistor to set voltage. You need some type of regulator.

If you have multiple devices with different voltage requirements, you need a separate regulator for each voltage. Except devices designed to run on 8 batteries (12V nominal) - those should be connected directly to a 12V battery pack. No point in using 12Vx2->regulator.

For devices designed to run on less than 12V, you can find off-the-shelf switching regulators for 12V->6V and/or whatever you need. You can use linear regulators too - this will be a simpler setup, but less efficient.

They typically have more electrical noise, but that can be cleaned up with big capacitors. Many switching regulators have a minimum output current that needs to be maintained, so if your device uses too little power at times, it may not work. Depends on which regulator you choose.

Oops, sorry for being somewhat vague and unclear before - the goal changes a little every time I learn something new from you guys.

I’d like to build an adapter of sorts that will power various battery-powered devices like camcorders and CD players, and just carry along different connectors instead of a bunch of different adapters. I want to use it in the car or with a 12V cell. Each device runs on a different voltage, so I would like to use an adjustable regulator and just selectively connect resistors to properly bias the regulator. My main concern now is whether a switching regulator would work for this application, as it seems from the product sheets that I need specific inductors and capacitors for each voltage/device output level. [http://www.national.com/pf/LM/LM2678.html#Datasheet] This would force me to build a separate circuit for each device I want to power, rather than just switching resistors like I had intended.

My other option is to go with the linear regulator and suffer the 40-50% decrease in battery life…

I’m actually looking at the adjustable regulators where you can bias the output voltage with a resistor. On an adjustable linear regulator, it’s very simple (the adapters from Radio Shack use this design), but on an adjustable switching regulator, it looks like I may need to select other components for the specific voltage/load in addition to setting the voltage with the resistor.
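For the LM138-family adjustable linear regulators, the resistor biasing is a simple formula: Vout ≈ Vref × (1 + R2/R1), with Vref about 1.25V (the small Iadj×R2 correction is usually negligible). The R2 values below are illustrative picks for a selector switch, not recommendations:

```python
# Output voltage of an LM138-style adjustable linear regulator
# set by a two-resistor divider: Vout ~= Vref * (1 + R2/R1).

VREF = 1.25  # volts, nominal reference for the LM138 family
R1 = 240.0   # ohms, typical value from the datasheet application circuit

def lm138_vout(r2, r1=R1, vref=VREF):
    """Approximate output voltage for a given R2 (ignores Iadj*R2 term)."""
    return vref * (1 + r2 / r1)

# A selector switch could pick a different R2 for each device:
for r2 in (720, 1200, 1680):
    print(round(lm138_vout(r2), 2))  # 5.0, 7.5, 10.0 volts
```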

Keep in mind that the 12 volts in a car isn’t really 12 volts. It’s usually a bit higher, varies a lot with engine speed (up to 14 or 15 volts isn’t all that uncommon), and is electrically very noisy (i.e. riddled with alternator and spark plug noise).

That’s a pretty wide range of current. Switching regulators often aren’t happy if they don’t have a minimum load of about 10 percent of their rated max on them. 0.2 happens to be 10 percent of 2 amps, but if you have a device that draws 2 amps you want your regulator to be able to handle that comfortably, so you’ll want your regulator to be sized for at least 3 or 4 amps. Some devices that use say a 200 mA wall wart don’t actually draw a constant 200 mA either. You may find it’s more like 20 mA with occasional peaks up to 200 mA.
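Here's that sizing logic spelled out (the 10 percent figure is the rule of thumb from above, not a guarantee for any particular part):

```python
# Rule-of-thumb check: many switchers want a minimum load of roughly
# 10% of their rated maximum, plus you want headroom over peak draw.

MIN_LOAD_FRACTION = 0.10
load_min_a = 0.2   # lightest device
load_max_a = 2.0   # heaviest device

rated_a = 4.0      # candidate regulator rating, with headroom over 2 A
min_required_load = rated_a * MIN_LOAD_FRACTION  # 0.4 A

# With a 4 A part, the 0.2 A device falls below the minimum load:
print(load_min_a >= min_required_load)  # False -> may lose regulation
```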

Linear regulators do a lot better at handling minimum loads, but are much less efficient than switching regulators. You can always do the old trick of sticking a 10 percent load resistor on the output of a switcher. It’s not quite as efficient for lower current devices, but it’s going to be more efficient than a linear regulator for higher current devices.
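The load-resistor trick is easy to size (numbers below are assumed for illustration): the bleeder just has to draw the minimum load at the output voltage, and you pay for it in wasted watts.

```python
# Sizing a "10 percent load" bleeder resistor across a switcher's output.

v_out = 6.0                  # regulated output voltage
rated_a = 2.0                # regulator's rated current
min_load_a = 0.1 * rated_a   # ~10% minimum load to keep it in regulation

bleeder_ohms = v_out / min_load_a   # R = V / I
bleeder_watts = v_out * min_load_a  # power wasted in the bleeder

print(round(bleeder_ohms, 1))   # 30.0 ohms
print(round(bleeder_watts, 2))  # 1.2 watts (use a suitably rated resistor)
```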

A lot of load devices don’t need a particularly regulated voltage, as they have their own internal regulators, and some circuits just aren’t all that picky about their supply voltage. Other devices are very picky about voltage regulation, and some are also quite picky about noise as well. The more generic you make this, the more difficult you make it, since it has to output a much cleaner and better regulated supply voltage to handle all types of devices.

I think he can get around this problem if he can place a diode from each battery to the input of the regulator. This will lower the maximum voltage you can regulate to the load, by about 0.7 volts.

One caution I’m not sure was mentioned: the heat involved here is not trivial. If worst case he uses the 24V and delivers 12V (about) load at 2 amps, we are talking 20+ watts in the regulator. (Even 12V regulated down to 6V at 2 amps is 12W.) Wastefulness aside, that’s going to get hot, and if not properly dissipated can damage adjacent surroundings or give you a nasty burn if touched.
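Those worst-case numbers come straight from the (Vin − Vout) × Iout dissipation rule:

```python
# Worst-case regulator heat for the scenarios above.

print((24 - 12) * 2)  # 24 W: 24V in, ~12V out, 2 A
print((12 - 6) * 2)   # 12 W: 12V in, 6V out, 2 A
```

For comparison, a typical soldering iron is around 25W, so a bare TO-3 package dissipating 24W without a serious heat sink is a burn hazard.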

Yep, I spent some time with the spec sheet and I believe I can size the components (inductor and output capacitor) for my particular switching regulator so that it will handle the range of voltages and currents I need. Thanks for all the help… I learned some good things.

That’s a new one to me! Why don’t they just come to the same potential and sit there when the higher voltage cell has discharged slightly?

What’s the functional difference between say, 2 C cells in parallel and a larger cell, say a D cell? Basically, putting cells in parallel is the same as making the anode and cathode areas larger. Although their electrolytes aren’t coupled, which may make a difference I guess.

I’m not doubting you, just trying to get a handle on the mechanism here.