During the winter in cold climates, the outdoor temperature is always well below room temperature so the furnace is responsible for keeping the house warm. It seems that for electrical appliances, whatever their function, virtually all of the input energy is converted to heat eventually (where else could it go?). So, ignoring the difference in the cost of electricity vs. natural gas, the energy used by appliances will reduce the load on the furnace by an equivalent amount. Therefore leaving TVs, computers, lights, etc. turned on unnecessarily does not actually “waste energy.” Do you agree? Can you prove me wrong?
Electricity isn’t free, even if you spend less on gas. You still have to pay for that electricity.
I would postulate that these devices aren’t very good at dispersing their heat to the places it’s needed, and that they aren’t all that efficient at generating heat with the power they consume, since that isn’t their primary function.
Actually, they are nearly 100% efficient at generating heat. If you have a refrigerator that consumes 1000 watts, every single one of those watts is going to heat your house. A TV might be slightly less efficient, since some of its radiation is light, and some of that might escape. Still, better than 99.99% of a TV’s power consumption will end up as heat in your house.
So, yes, you may be able to offset some of your heating expense with power used for appliances.
Obviously it isn’t free in an absolute sense, but it’s “free” given that you would have spent the same amount on gas.
Uneven heating would probably increase local losses to some extent, but for the sake of the argument I think it’s ok to ignore these effects. But the efficiency of heat production is the crux of my argument - I think it’s close to 100%. It doesn’t matter what the primary function is, the energy has to go somewhere. For example, a speaker produces sound waves, but if the room is well insulated most of the sound energy will be absorbed by the walls, floor and ceiling, causing them to heat up. The only sources of loss I can think of are:
- light (e.g. from TVs) radiating out the windows (can be made negligible with curtains)
- sound escaping from the house (insulation should keep this small)
- electrons leaving the house through phone/cable lines (negligible, also offset by a similar amount of electrons entering)
- plus obviously warm air escaping and energy radiated outside by the physical house, but this would happen regardless of whether appliances are in use, as long as the temperature is the same
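The loss channels above can be put into a back-of-the-envelope energy balance. A minimal sketch, where every wattage and loss fraction is invented for illustration, not measured:

```python
# Back-of-the-envelope energy balance for appliance "waste" heat.
# All wattages and loss fractions below are invented for illustration.

appliance_watts = {"TV": 150, "computer": 200, "lights": 100}

# Rough fractions of input energy that leave the building envelope
# instead of heating the house (light out the windows, sound, etc.).
loss_fraction = {"TV": 0.01, "computer": 0.001, "lights": 0.02}

total_in = sum(appliance_watts.values())
heat_in_house = sum(p * (1 - loss_fraction[name])
                    for name, p in appliance_watts.items())

print(f"Input power:       {total_in:.0f} W")
print(f"Heat to the house: {heat_in_house:.1f} W")
print(f"Furnace offset:    {heat_in_house / total_in:.1%}")
```

Even with generous loss fractions, well over 99% of the input power ends up as heat inside the house, which is the point being argued.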
The “waste” heat produced by appliances is still heat. Without it, you would have to supply more heat to maintain the same temperature. In the absence of smart heat-distribution systems, that heat is as effective as any resistance electric heating. It is less efficient, per dollar, than gas, oil, or heat-pump sources.
Design details might make it less efficient, such as electric cables running outside the insulated envelope, or automatic venting to areas not needing heat.
Tris
Sure, the spot next to your computer will be hot, but not the spot on your couch. Guess what you’re going to do in that situation? Turn on the furnace so the hot air system blows closer to the couch. Local hotspots don’t usually help humans feel warm (which is a different problem than just adding energy to the house). That’s why most space heaters have fans or are built to direct heat.
Arguably, you’re wasting energy if you’re leaving stuff on just for the heat. You’re just generating local hotspots that you won’t feel and that won’t make you feel warmer. I guess you could put your computer on top of your coffee table and point the rear end toward the couch so you can get some of the warm air. Then you might turn the thermostat down, but at that point you might as well buy a real space heater and see if it saves you any money. Depending on your energy costs, it might be more expensive to use electric heat than gas heat.
The critical point is that a calorie of heat obtained from your electrical system costs more than a calorie of heat obtained from your furnace. Only about half of the energy produced by burning fuel to run a generator winds up as electricity for you to consume, due to various inefficiencies. In addition, maintaining the infrastructure that produces and distributes electricity costs money too, and is paid for by consumers of electricity. So the savings in gas or oil only partially offsets the cost of the electricity.
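The price gap is easy to quantify. A sketch comparing the cost of one kWh of delivered heat from resistance electricity vs. a gas furnace, with the retail prices and furnace efficiency assumed for illustration (check your own utility bill):

```python
# Cost of one kWh of delivered heat: resistance electricity vs. gas furnace.
# The prices and furnace efficiency below are assumed, not quoted rates.

elec_price = 0.12          # $/kWh at the meter (assumed)
gas_price = 1.20           # $/therm (assumed); 1 therm = 29.3 kWh
furnace_efficiency = 0.90  # plausible modern-furnace value (assumed)

# Resistance heat is ~100% efficient at the point of use.
cost_electric_heat = elec_price
cost_gas_heat = gas_price / (29.3 * furnace_efficiency)

print(f"Electric heat: ${cost_electric_heat:.3f} per kWh of heat")
print(f"Gas heat:      ${cost_gas_heat:.3f} per kWh of heat")
print(f"Ratio:         {cost_electric_heat / cost_gas_heat:.1f}x")
```

With these assumed prices the ratio comes out around 2.6x, in the same ballpark as the roughly 3X figure cited later in the thread.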
But the heat won’t just sit beside the computer and chill out (pun not originally intended, but intentionally left in place). The computer is producing a steady flow of heat, and it will spread throughout the house. The furnace will still be coming on (just not as much), so there should be decent air flow.
I assure you I’m not actually interested in saving money. All I care about is the theory!
For more practical considerations (elaborating on Uncertain’s post), this document indicates that electrical energy costs about 3X as much as natural gas (per unit of energy). Wakefield Community College isn’t that well known in the energy sector, but I’ll give them the benefit of the doubt. That combined with the local heating issues makes this idea a pretty strong non-starter in the real world.
>The computer is producing a steady flow of heat, it will spread throughout the house.
Right, over time all energy spreads, but there’s a critical amount of heat that must come out in whatever space you are in before you, as a human, say “Whoa, it’s warm in here, I’d better turn down the furnace.” Typical appliances in typical spaces won’t really do that. So yes, that fridge and computer will raise the average temp 0.05 °F, which is great if you weren’t a human, as you can’t even feel that difference. You’re better off just putting on an extra undershirt or opening the blinds.
>The furnace will still be coming on (just not as much)
The thermostat in your home isn’t that accurate. It’ll come on as much as before, unless you’re sitting next to a server rack burning some serious wattage. Unless you live in a datacenter, I doubt you’re really going to see a real difference in the heating bill.
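How much an appliance actually moves the whole-house temperature depends on the house’s overall heat-loss coefficient (UA). A sketch, with both the UA value and the appliance power assumed for illustration:

```python
# Steady-state whole-house temperature offset from one appliance's
# waste heat, assuming the furnace output stays fixed: dT = P / UA.
# Both numbers below are assumed for illustration, not measured.

ua = 300.0        # W per degC: overall heat-loss coefficient (assumed)
p_fridge = 150.0  # W: average draw of an older refrigerator (assumed)

delta_c = p_fridge / ua
delta_f = delta_c * 9 / 5

print(f"Whole-house offset: {delta_c:.2f} degC ({delta_f:.2f} degF)")
```

With these numbers the offset is about half a degree Celsius; the result is very sensitive to the assumed UA, and whether a thermostat ever responds to an offset that size depends on its deadband, which is exactly the accuracy point above.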
The problem with undirected heat energy is that it’s not all raising your house’s temperature.
Some of it is lost to converting things chemically. If the wall and floor behind the fridge are getting hard and yellow, it’s consuming energy to do so. You are cooking your walls, and like regular cooking that’s energy being trapped in new chemical bonds.
You can’t be serious.
You simply can’t.
Well, sure, but you’re not going to be running your computer just to heat up the coffee table. But if you have some other reason for wanting to leave your computer on (for instance, you don’t like waiting for it to boot up when you want to use it), the fact that the energy isn’t being completely wasted might be enough to tip the scales on whether running the computer is worthwhile.
Is it just me, or is this an unusual number of posters essentially falling over themselves to tell an OP who’s basically right that he’s wrong?
Yes, that happens.
I recently had to move a refrigerator away from a kitchen cubbyhole wall. It (and predecessors) had sat in that location for at least 40 years, venting heat against the wall. The (1950’s era) wallpaper behind that was dried out and almost flaking off the wall. Much worse shape than the same wallpaper just a couple yards away.
That’s not the point. What he was saying is that some of the thermal energy is going into creating chemical bonds. Sure - but how much? One part in a hundred quadrillion?
It’s simply absurd to claim that that is a significant error in the evaluation of the heat being produced.
Yeah, all your appliances and computers that are in the area you want heated probably reduce your heating bill to some degree. As others have noted, essentially all the energy they consume turns into heat.
I used to work in a small office where we had four or five PCs, three small IBM midrange boxes (one AS400 and two System 36s), a small refrigerator, and miscellaneous other stuff like lights, desk lamps, etc.
We hardly used the heat at all during the winter. And this is Utah.
On the other hand, we ran the hell out of the AC in the summer. So for us, I doubt we saved much on the heating/AC bill; I think it was probably about a wash.
But if you’re in a climate where you principally need to heat your living space, you’re quite likely getting some break on the electricity in the form of a reduced heating expense.
For every watt of electricity that warms your home, another one or two warms some wires and a power plant somewhere far away.
Now, if you had a home heater that was generating electricity while generating heat, then you’d really be getting something for free.
Honestly, at first I thought the OP was proposing some sort of generator between the cold outside and the heat of the house.
Electric heat is extremely inefficient overall. Of course the conversion of electricity to heat is near 100% efficient, but first you have to generate electricity from heat back at the power plant. You are much better off just burning the fuel at your house. Of course that is what people up north do. Leaving your computer on to heat the house is just dumb.
Doesn’t the energy get used to move stuff? If you run a vacuum cleaner, some of the energy is wasted as heat, and some of it is used to turn the motor, right? If a refrigerator were 100% efficient, all of the energy would go toward cooling your food and there would be zero waste heat, right?