Quickie: power / bulb / dimmer related question -

If I have a lamp stand with a 300W bulb in it, but use a dimmer switch to lower the output, does it still draw the same power? I have a friend who insists it will cost the same to run the bulb on any setting of the dimmer - that dimming doesn’t affect the use of electricity, only the comparative output.

So, is he right?

Thanks.

http://www.smarthome.com/solution41.html

The way I see it, it works like this: If for example you set the dimmer so that the 300W lightbulb consumes 100W (1/3 as bright), the dimmer itself will consume about 2W. So the total power usage will be 102W.

So, effectively a light at 1/2 dimmer setting will cost 1/2 as much to run as one on full? So, effectively, my mate is talking toss?

Thanks, and welcome, GreyWanderer.

I should say (in case it makes a difference) the bulb is in a standard lamp plugged into a normal socket, and the dimmer is a sliding switch on the lamp’s lead (rather than a fixed wall switch).

Correct.

It shouldn’t make a difference if the dimmer is on the lamp. If you want to be absolutely sure, put the dimmer on 1/2, wait a while, then touch the lamp and the dimmer itself and check how hot the dimmer is compared to the light bulb. If I’m right the lightbulb should be too hot to touch while the dimmer is barely warm.

To expand on what GreyWanderer said, I suspect your mate simply has an incorrect impression of how dimmer switches work. He’s probably thinking that the dimmer switch absorbs a portion of the electrical power that would normally go through the bulb. However (I just found out), dimmers actually work by breaking the circuit to the bulb for a short period of time. HowStuffWorks has an informative explanation of how the switch works. It also explains why my lights buzz when the dimmer is turned down low, which is something I’ve wondered about.

Most dimmer units actually do lose a few volts through them, and they consume a small amount of power.

The result is that a lamp run on a dimmer will not be quite as bright as one without.

This was something that surprised me, but having seen it for myself (whilst being sceptical at the time) I found this was the case.

Modern dimmer switches don’t lose much power, but some older ones did.

Note that light bulbs produce heat and light. At lower voltages, more of the energy is going into heat rather than light. The bulb looks orangier (oranger? more orangier? fweep it). OTOH, this extends the lifetime of the bulb so you save money there.

In general, dimmer is better. Off works best of all.

The reason is not so much any voltage drop in the dimmer (which would be negligible). The reason is that the dimmer is not conducting for the full 180° of the half-cycle. The dimmer works by adjusting the percentage of the half-cycle during which it is on, but even at full tilt it cannot conduct starting from t=0, because it needs a certain voltage to trigger it, so it loses a bit of the beginning. This is easily seen on an oscilloscope.
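To put rough numbers on that phase-angle idea: for a resistive bulb, the rms voltage (and hence the power) depends only on how far into each half-cycle the triac fires. A minimal sketch, assuming a 120 V rms line and an ideal resistive load (the sample firing angles are just illustrative):

```python
import math

def rms_fraction(alpha_deg):
    """Fraction of full rms voltage for a phase-cut sine wave.
    The triac blocks the first alpha_deg of each half-cycle."""
    a = math.radians(alpha_deg)
    # integral of sin^2 over [a, pi], normalised to a full half-cycle
    return math.sqrt(((math.pi - a) / 2 + math.sin(2 * a) / 4) / (math.pi / 2))

for alpha in (0, 10, 90, 135):
    frac = rms_fraction(alpha)
    # resistive bulb: power scales as V^2 (ignoring the filament's
    # resistance change with temperature)
    print(f"firing angle {alpha:3d} deg: {120 * frac:6.2f} V rms, "
          f"{100 * frac**2:5.1f} % power")
```

At a 90° firing angle the bulb gets about 85 V rms, i.e. half the power, which is the sense in which a mid-setting dimmer roughly halves the running cost.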

You mean, if I take the light bulb out, and turn on the switch, the electricity won’t leak out on the floor?

Not true, at least not for halogen lamps. They need to heat up properly to be able to regenerate, and running them on a dimmer for an extended period of time is not advisable. I think it’s enough to run them at full blast for a couple of minutes every now and then, though.

(Sorry, no cite, but it was discussed here a couple of weeks (months?) ago, and I’m sure you’d find it with a diligent search.)

Ah well Sailor, I did know the whole reason why it was so, but I was trying to keep the explanation as simple as possible. And yes, the average voltage is reduced due to the phase angle of the trigger point, which is not exactly the same as saying what I did, though it can be interpreted to mean the same when one looks further into the way a dimmer actually works.

Well I’m not sure if it’s negligible or not; I guess that’s a matter of opinion. A typical triac has a V[sub]TM[/sub] between 1 and 1.5 V. So as an example, if we assume:

• The circuit is 120 VAC
• The light can dissipate 100 watts at 119 VAC
• The dimmer is adjusted for maximum brightness (knob fully CW)
• The dimmer is designed so that the triac is always in the circuit
• The triac has a V[sub]TM[/sub] of 1 V at 0.84 A

then the triac would be dissipating an average of 0.84 watts. If we include the rest of the dimmer’s circuitry (phase fired proportional controller, series resistance of inductor, etc.) then the dimmer is probably dissipating an average power of around 1 watt or so for a 100 watt light. Not much, I agree. But I believe some dimmer manufacturers have reduced the dimmer’s power consumption to near zero (at full brightness only) by engaging a mechanical bypass switch when the knob is turned fully clockwise.
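The arithmetic in that estimate is simple to check (the 1 V on-state drop and the 100 W at 119 VAC load are the assumptions listed above):

```python
# Rough triac dissipation at full brightness: on-state drop times load current.
v_tm = 1.0            # assumed on-state voltage across the triac, volts
i_load = 100 / 119    # ~0.84 A for a bulb dissipating 100 W at 119 VAC
p_triac = v_tm * i_load
print(f"triac dissipation ~ {p_triac:.2f} W")  # ~0.84 W, as estimated above
```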

This was the core philosophy of the power company that served a fancy Virgin Islands hotel I once stayed at.

With a standard diac/triac setup, the triac won’t turn on until the diac’s forward breakover voltage (V[sub]BO[/sub]) is reached. V[sub]BO[/sub] for an ordinary BR100 diac is 30 V.

On a 120 V system, 30 V is reached at about 10[sup]o[/sup], so no current will flow in the light for the first 10[sup]o[/sup] of each half-cycle.

However, the effect of delaying the turn-on to 10[sup]o[/sup] is to reduce the rms voltage by 0.1 V, which is obviously much smaller than the 1 V reduction caused by the forward voltage drop across the triac.

You need to delay the turn-on to 25[sup]o[/sup] to achieve an effect that matches that of the forward voltage drop of the triac.
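Those two figures can be checked numerically. A quick sketch, again assuming a 120 V rms line and an ideal resistive load (the exact 10° reduction works out nearer 0.07 V, i.e. the same order as the 0.1 V quoted, and the 25° reduction comes out very close to 1 V):

```python
import math

V = 120.0  # assumed line rms voltage

def rms_after_delay(alpha_deg):
    """rms voltage when conduction starts alpha_deg into each half-cycle."""
    a = math.radians(alpha_deg)
    return V * math.sqrt(((math.pi - a) / 2 + math.sin(2 * a) / 4) / (math.pi / 2))

for alpha in (10, 25):
    print(f"{alpha} deg delay: rms drops by {V - rms_after_delay(alpha):.2f} V")
```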

The voltage drop in the triac is less than the natural voltage oscillation on the line due to varying loads, etc. I do not think you could detect a voltage change of a couple of volts by the brightness of a lightbulb but, hey, be my guest and get a voltage stabiliser for your entire house.

FWIW, since the original question had to do with wattage consumed when using a dimmer, check out:

How A Dimmer Switch Works