Current limiting resistor for LED strip

I have a 5 meter LED strip that is powered from a car battery. It draws 3 A, which drains the battery a bit faster than I would like.

I tested a 0.5 meter piece of that same strip on my bench. Without a resistor it draws 0.3 A. I put a 30 Ω resistor in series and I am happy with the result, because there’s not much difference in light output and the current is more than halved, at 0.13 A.

Now, what resistance should I put on the 5 meter strip to get the same result? Also, all the resistors I have handy right now are rated at a quarter watt. Do I need higher-wattage resistors?

Ohm’s law.

For your 0.5 m strip, the voltage drop across the 30 Ω resistor is V = IR = 0.13 A × 30 Ω = 3.9 V.

Assuming you want the same voltage drop for your 5 m strip, and assuming that the 5 m strip takes 10× the current of the 0.5 m piece, your resistor needs to be 3.9 V / 1.3 A = 3 Ω (which makes sense, since everything is 10× as big).

The power dissipated in the resistor is I²R = 1.3 × 1.3 × 3 ≈ 5 W, so get a beefy resistor!
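
If you want to sanity-check the arithmetic, here’s a rough sketch in Python (it just repeats the scaling above, and assumes the 5 m strip really does draw 10× the 0.5 m piece):

    # Scale the 0.5 m test result up to the 5 m strip (assumes current scales with length)
    i_test = 0.13                   # A, measured through the 0.5 m piece with the 30 ohm resistor
    r_test = 30.0                   # ohms
    v_drop = i_test * r_test        # 3.9 V across the test resistor
    i_full = i_test * 10            # 1.3 A expected for the 5 m strip
    r_full = v_drop / i_full        # 3 ohms gives the same voltage drop
    p_full = i_full ** 2 * r_full   # ~5 W dissipated in that resistor
    print(v_drop, r_full, round(p_full, 1))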

I have a question inspired by this one. Would a voltage regulator or a DC-DC converter in the circuit use less power than adding a resistor to the circuit?

I think you’d be better off with an active driver circuit that modifies the duty cycle of the lights. Since LEDs go on and off instantaneously (as far as the human eye is concerned), if they are on for 5ms and off for 5ms, you will perceive about half-brightness and get about 50% power drain.

The circuit need not be complex. A 555 chip wired to provide a variable duty-cycle square-wave output (I think it’s a basic astable with a pot for control and a diode - it’s been a few years since I 555’ed), driving a suitable power transistor… and presto, you can adjust the lighting to whatever combination of light level and power draw you like, with very little of the power wasted that a resistor would burn.
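
For the curious, here’s a rough sketch of the timing math behind that sort of astable, assuming the usual trick of a diode bypassing R2 during the charge phase so the duty cycle can go below 50% (component values here are only illustrative):

    # Approximate 555 astable timing with a diode across R2, ignoring the diode drop
    def astable_555(r1_ohm, r2_ohm, c_farad):
        t_high = 0.693 * r1_ohm * c_farad   # cap charges through R1 only (diode bypasses R2)
        t_low = 0.693 * r2_ohm * c_farad    # cap discharges through R2
        period = t_high + t_low
        return t_high / period, 1.0 / period

    # a 10k pot split 50/50 by its wiper and a 1 uF timing cap -> 50% duty at about 144 Hz
    duty, freq = astable_555(5_000, 5_000, 1e-6)
    print(f"{duty:.0%} duty at {freq:.0f} Hz")

    # average power into the 12 V, 3 A strip scales with the duty cycle
    print(f"average draw: {duty * 12 * 3:.0f} W")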

Actually, you can get better brightness at equivalent power drain using this method instead of simple current limiting.

PM me if you’re not up to finding a circuit, and I will pull something off the shelf.

Absolutely.
DC-DC converters can be upwards of 95% efficient.
So, instead of wasting 5 W in the resistor, you’d only be wasting a few tenths of a watt in the converter.

The downside is expense.
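
To put rough numbers on that, here’s the comparison at the operating point worked out above (9 V and 1.3 A delivered to the 5 m strip; the efficiency figures are assumptions, so treat this as a ballpark):

    # Resistor dropper vs. DC-DC converter for the same 9 V / 1.3 A into the strip
    led_power = 9.0 * 1.3              # ~11.7 W into the LEDs
    resistor_loss = 1.3 ** 2 * 3       # ~5 W burned in the 3 ohm dropper
    for eff in (0.95, 0.98):
        converter_loss = led_power * (1 / eff - 1)
        print(f"{eff:.0%} efficient converter loses {converter_loss:.2f} W vs {resistor_loss:.1f} W in the resistor")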

How about this one?

The 15W output is troubling me. Should I put two of these in parallel?

That should be fine.
You are dropping almost 4 volts right now (with your external resistor), leaving 9v across the LED string. You measured 1.3 A, so 9V @ 3A should be plenty.

15 watts and 9 volts at 3 amps? Something is wrong there; that is 27 W (9 × 3), although the 15 W output should still be enough by itself.

Also, you don’t want to parallel two DC/DC converters (or linear regulators): they will have slightly different output voltages, so one will take all of the load until it gets overloaded (which may happen at considerably more than its rated output). This is even more true for DC/DC converters using synchronous rectification (a safe assumption if the efficiency is well over 90%), since they can sink current, so the converter with the lower output voltage will try to drag down the output of the other one.
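
A toy model of that sharing problem, treating each converter as an ideal source with a small output resistance (real converters actively regulate, so this is only a sketch of the mechanism, not a simulation of any particular part):

    # Two "identical" converters in parallel: a small setpoint mismatch is enough
    # for one unit to hog the load while the other sinks current.
    def share(v1, v2, r_out, i_load):
        # common bus voltage from KCL: (v1 - vb)/r_out + (v2 - vb)/r_out = i_load
        vb = (v1 + v2 - i_load * r_out) / 2
        return (v1 - vb) / r_out, (v2 - vb) / r_out

    i1, i2 = share(9.02, 9.00, 0.01, 1.3)   # 20 mV mismatch, 10 milliohm output resistance
    print(round(i1, 2), "A and", round(i2, 2), "A")   # roughly 1.65 A and -0.35 A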

Also of note: pulsing the LEDs at 12 volts with a variable duty cycle would draw more power than reducing the voltage to get the same average current (12 V × 0.13 A = 1.56 W, plus some losses in the control circuit; 9 V × 0.13 A = 1.17 W, which the claimed 96% efficiency of the DC/DC converter brings to about 1.22 W at the input). If I were doing this myself, I would build my own DC/DC converter (just because I wouldn’t want to have to buy one or wait for it to arrive).
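
The same comparison as a two-line sketch, using the numbers from the 0.5 m test piece and the claimed converter efficiency:

    # same 0.13 A average LED current, two ways
    pwm_power = 12.0 * 0.13            # pulsed at the full 12 V -> 1.56 W, plus control losses
    buck_power = (9.0 * 0.13) / 0.96   # steady 9 V from a 96% efficient converter -> ~1.22 W
    print(round(pwm_power, 2), round(buck_power, 2))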

If you want to be hard core, a number of chip manufacturers make power controllers that are specifically designed to power LED arrays.

Using a series dropper resistor to power an LED from a voltage source is convenient, but very inefficient. While this doesn’t matter too much for an LED biased at, say, 2 mA and driven from 5 V (the LED would use about 4 mW and the resistor about 6 mW - big deal), these losses are unacceptable at higher powers. The usual neophyte error is to “fill up” the available voltage rail with LED voltage drops: for instance, if an LED drops 2 V at the required current, then 6 LEDs in series fit nicely across 12 V with only a small series resistor. In reality this is a very bad idea, as the LED current then becomes a very strong function of supply voltage, temperature, ageing and batch spread. Look at the voltage-versus-current graph for this Cree LED: changing the bias voltage only a little, from 2.6 V to 2.8 V, increases the current from 100 mA to 400 mA.
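
Here’s a quick sketch of why the small-resistor version is so touchy, treating the LEDs as fixed voltage drops (the exponential curve in that Cree graph makes the real situation even worse; the 1.9 V figure is just an illustrative nominal drop):

    # Current through a string of LEDs plus a series resistor across a fixed rail
    def string_current(v_supply, v_forward_each, n_leds, r_series):
        return (v_supply - n_leds * v_forward_each) / r_series

    # 6 LEDs nominally dropping 1.9 V each leave 0.6 V across a 20 ohm resistor -> 30 mA
    print(round(string_current(12.0, 1.9, 6, 20), 3))
    # let Vf fall by just 0.1 V per LED (warm-up, batch spread) and the current doubles
    print(round(string_current(12.0, 1.8, 6, 20), 3))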

The best way to bias the string of LEDs in the OP would be to use a dedicated constant current LED driver that includes 12 V within its working range. This will bias the LEDs to the set current, which will remain invariant as long as the voltage supplied is within spec. Here’s one that will work with 12 V DC. Others have adjustable current outputs, or can be driven from a domestic AC supply. Go google “constant current LED driver” - there are many available. They are invariably “switched mode” designs, so are very efficient, though price and quality/reliability are strongly related…

Exactly. I’d check the datasheet on that part before using it. It would not surprise me if the package is rated to dissipate 15 W, and the maximum current rating is 3 A, or something like that. (Actually, it wouldn’t surprise me overmuch if the specified power performance is worse than you’d think, but without seeing it I’ll assume it’s correct.) You’d probably still be okay, but check it before you use it.

And I kept the part about not putting DC-DC converters in parallel because it bears repeating.

I just now realized that the LED strip is actually split in two. Essentially it is two 2.5 meter strips in parallel, fed by the same battery.

Something like this, only substitute the resistors with LEDs.

Can I now put a DC-DC converter in series with each piece of LED strip, or am I going to have the same problems?

Assuming that each converter has its own branch, that should work. Though it does mean that your original question & answer might not be correct.

You can use one DC/DC converter if the total current draw is within its ratings, just as with two light bulbs in parallel. The one you linked to can supply 1.67 A if the 15 watt rating is accurate; assuming two 2.5 meter strips draw as much current as one 5 meter strip, as you originally calculated, 1.3 A should be safe for a well-designed converter.
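
A rough headroom check for the single-converter option, assuming the 15 W / 9 V figures and the scaling from the 0.5 m test hold:

    # one converter feeding both 2.5 m halves
    i_available = 15.0 / 9.0     # ~1.67 A if the 15 W rating at 9 V is accurate
    i_strips = 2 * (0.13 * 5)    # two 2.5 m halves, scaled from the 0.5 m piece at 0.13 A
    print(round(i_available, 2), "A available,", round(i_strips, 2), "A needed")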

Thanks, I ordered two of them, just in case.