We have a radio/alarm clock in our living room that I hate. Why? Well, the switch that cycles it from ‘off’ to ‘on’ to ‘alarm’ to ‘radio’ seems to have a mind of its own, so sometimes I’ll come into the room and hear a hissing noise because, apparently, the radio is on (but not tuned to any particular station, hence the hissing) with the volume turned low.
I’ll switch the dang thing to what appears to be ‘off’, and sure enough, a few days later it’s hissing again. I think the ‘off’ and ‘alarm’ positions are too close together or something, and while I may think it’s off, it’s really set to alarm.
But that’s not my question.
My question is:
A radio alarm clock runs as long as it’s plugged in. Does it use more electricity when it’s doing more than one function (i.e., if the alarm is sounding, the radio is on, etc.)? (Not that I think I’m going broke paying for whatever the additional electricity might be over the course of the year, but I just wondered.)
To elaborate on Gary T’s short and sweet answer, while it may not be the case for every single electronic device (and certainly one could be designed that would unnecessarily use electricity), you can think of most functions as being controlled by a switch on the device. So, when the alarm goes off, something switched it on (pardon the idiom). Until that time, it was not connected and therefore consuming no energy, just as a light that is not switched on will not consume energy.
In fact, in a very broad sense it should be clear that the device only uses electricity when converting it into some other form of energy (be it sound, visible light[sup]*[/sup], heat, or whatever), though it isn’t always easy to measure how much heat is produced versus the other forms of energy that come out of it.
[sup]*[/sup]I suppose one could reasonably argue that visible light and electricity are the same, though there is a definite conversion from 60 Hz AC out of the wall to the lighted numbers on a clock display.
I read in a magazine a few years ago (the title escapes me now) that if everyone unplugged everything that was drawing electricity but hadn’t been used in the past 30 days, the average home would save up to 10% on its electric bill. I have an old clock radio in my garage that has been on continuously for over 10 years; I don’t use it for the time or as a radio. Maybe I should unplug it. Nah.
It costs about $0.50 to run a 1-watt device for a year. That’s 8760 hours per year, and at a $0.00006 per watt-hour energy cost (i.e., $0.06 per kWh), 8760 × $0.00006 = $0.53.
A digital clock gets warm, so it’s probably using 2 or 3 watts and costing you one or two dollars per year. Note that in the winter, any electrical load will warm your house and make your furnace run that much less.
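To make the arithmetic above concrete, here’s a quick sketch of the rule of thumb. It assumes the $0.06-per-kWh rate implied by the post (actual rates vary by region):

```python
# Annual cost of a small, always-on electrical load.
# Assumes $0.06/kWh, the rate implied by the thread; real rates vary.
HOURS_PER_YEAR = 24 * 365   # 8760 hours
RATE_PER_KWH = 0.06         # dollars per kilowatt-hour (assumed)

def annual_cost(watts: float) -> float:
    """Dollars per year to run a constant load of `watts` watts."""
    kwh_per_year = watts * HOURS_PER_YEAR / 1000.0
    return kwh_per_year * RATE_PER_KWH

print(round(annual_cost(1), 2))  # 1 W device: about $0.53/year
print(round(annual_cost(3), 2))  # 3 W clock:  about $1.58/year
```

This matches the figures in the posts: roughly fifty cents per watt per year, so a warm 2–3 watt clock lands in the one-to-two-dollar range.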
The big energy costs aren’t “vampire loads” such as wall-wart transformers. They are from bad home insulation in winter environments… and maybe from people like me who leave their 300 watt PC and monitor on when not in use.
You are absolutely correct. Just dropping your thermostat temperature by 1 degree F (assuming no hysteresis, and that it actually does change the setting to be 1 F colder) will save you much more than unplugging various clocks and transformers around your house.
That having been said, I still think people should unplug any transformer or device that is not needed. The effect is small, even minuscule, but it’s still a savings of energy. And you lose nothing by having unused devices unplugged.
I also worry about the “hot” ends of 5–16V transformers just lying around. I have seen an unused cell phone charger short out when its loose “hot end” touched metal. Scary.
You lose time bending down and crawling under that table to plug and unplug the various little things. That is a hassle most people will put up with only for so long, especially if the savings are minuscule.
And the opposite also applies. In the summer, any electrical load will make your air conditioner work harder, and the effect is even greater, because air conditioners are not very efficient. So each one watt unused appliance actually wastes about 2.5 watts.
>> So, when the alarm goes off, something switched it on
Funny you should mention this, as just a few days ago I had a misunderstanding with my Chinese girlfriend. We were talking about the camera’s flash, and she had me completely confused: I asked her “Did the flash go off?” and she said “no,” but the context indicated she meant “yes.” The situation was finally clarified when I explained to her that in English, when you say “the flash went off,” you really mean it went “on.” Makes sense, doesn’t it?
>> In the summer, any electrical load will make your air conditioner work harder, and the effect is even greater, because air conditioners are not very efficient. So each one watt unused appliance actually wastes about 2.5 watts.
Hmmm… sorry, but it’s kind of the other way around. An a/c with a COP of 2.5 will use one Wh of electricity to remove 2.5 Wh of heat from the room, so pumping out the 1 Wh of heat from a one-watt appliance costs only 0.4 Wh; the total waste is about 1.4 W, not 2.5 W. I’m not sure what you mean by “inefficient.” I suppose it would be inefficient when compared to an a/c with a COP of 10… the problem is they haven’t been invented yet… in fact they are probably impossible in this universe. But they’d be more efficient if they existed.
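The COP arithmetic is easy to get backwards, so here’s a minimal sketch of it, assuming the COP of 2.5 used in the thread:

```python
# Total electricity wasted by an unneeded appliance in summer:
# its own draw, plus the a/c electricity needed to pump its waste
# heat back outside. Assumes COP = 2.5, as in the thread.
def total_waste_watts(appliance_watts: float, cop: float = 2.5) -> float:
    """Appliance draw plus the a/c energy spent removing its heat."""
    ac_watts = appliance_watts / cop  # electricity to remove that heat
    return appliance_watts + ac_watts

print(total_waste_watts(1.0))  # about 1.4 W, not 2.5 W
```

The key point: the COP divides the appliance’s heat output to get the a/c’s extra draw, rather than multiplying the appliance’s draw.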
OK, it seems the consensus is much as I expected: “yes,” it’s using more energy; “no,” I won’t be able to go to Palm Springs (or even across town) on the potential annual savings; but “yes,” I probably should get Snookie, the electronics repair genius, to fix the damn thing so it can actually be reliably set to “off” (yes, we use the clock aspect of it).