As long as I’m willing to buy more power strips, I can add as many appliances as I want, no? What?
Our home office is served by a single 15 Amp circuit. Besides the overhead light, there are a few outlets with a passel of equipment plugged in. I’d like to avoid a fire hazard or tripping the breaker, but don’t know how to figure out what I can add (or if I have too much already).
Also, if I do eventually call in an electrician to add a line, any idea what that’s going to run? I’ve added circuits before in easy-to-reach locations—a fairly simple job—but this is two floors up from the circuit panel (detached house, basement panel, 2nd floor office).
You can add more loads as long as their total current drain does not exceed 15 amps. Any more and your breaker will pop or your fuse will blow. Hopefully.
This can be a problem if you have a bunch of stuff running and you plug in one more heavy load, like a big motor (vacuum cleaner) or resistance heater (toaster). (I recall a situation where we could use the toaster or the hair dryer but not both…)
Every appliance/computer/wall wart/battery charger lists its maximum current drain, or at least power usage. (You can figure out the current drain by dividing the power usage by 120.)
You would need to add up the maximum amperage draws of everything on the circuit. You should be able to find the power rating marked on each unit. If you exceed the circuit’s rating you will start tripping the breaker. (Note that nameplate ratings are maximums, so the labels might add up to, say, 20 amps while the actual draw stays under the 15 amp limit.)
Power = Amps × Volts
Amps = Power / Volts
So, a 100W light bulb draws about 0.8 amps (100 ÷ 120).
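If it helps, here’s the same arithmetic as a throwaway Python sketch (the 120 volts is just the nominal North American line voltage; plug in your own wattages):

```python
VOLTS = 120  # nominal North American line voltage

def amps(watts):
    """Current drawn by a load of the given wattage: Amps = Power / Volts."""
    return watts / VOLTS

print(round(amps(100), 2))  # 100 W bulb -> 0.83 amps
```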
As far as running a separate line, I do this kind of stuff myself for the cost of materials. I’m guessing about $50 to $100 in stuff, and probably another $100 for labour. But that’s just a WAG.
ETA: Yeah, what **Sunspace **said. (My fellow electronics technologist.)
Thanks. I figured I’d be doing math, but am still a little unclear on the concept.
I have a 500 Watt power supply in my desktop, but I’m fairly sure I don’t use the full load—do I need to look up the board, CPU, and video card individually? I imagine other things in there (RAM, sound, drives) take up a trivial amount of power, but should I account for all the other gewgaws in there?
What do I divide the 500 Watts by?
For everything else, do I just look at maximum Watts (e.g., my monitor says “57 Watts (typical); 110 Watts (maximum)”)?
We also have a couple UPSs in the office. Do we use their rating or ignore that and just look at the items plugged in?
My big unknown in adding a line is the labor/PITA of going up two floors. For me (and I’ve seen lots of threads here expressing frustration), going up through the floors is likely to lead to enormous headaches and ruin a lot of sheetrock. But I figure an electrician would have done hundreds or thousands of these, so will be much more adept—adept to the point of trivial labor (i.e., down in the $100 range) or is it a bear of a job regardless?
It really depends upon how accessible the walls are. Usually by sawing a small access port, drilling into the floor, and using a fish tape you should be able to find a way to run a cable through the wall. Patching a bit of drywall isn’t that big a deal, but painting to match might be. Of course you might be able to just put a plastic access cover over the hole and be done with it.
Everything that plugs in consumes power. Add up all the Watts and divide by 115 Volts. I would use the maximum ratings to see how close you come to 15 amps. If it comes out to, say, 16 or 17 using maximums you should be OK. If it comes out to, say, 25 you might be screwed. If it comes out to 10 you are fine, etc.
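A quick sketch of that budget check, if you want to tinker (the device names and wattages below are made-up examples, not your actual gear):

```python
VOLTS = 115        # the divisor used above
BREAKER_AMPS = 15

# Hypothetical nameplate maximums, in watts -- substitute your own
loads = {
    "desktop PSU": 500,
    "monitor": 110,
    "laser printer": 600,
    "desk lamp": 60,
}

total_amps = sum(loads.values()) / VOLTS
print(f"{total_amps:.1f} A of a {BREAKER_AMPS} A breaker")
if total_amps > BREAKER_AMPS:
    print("Over budget on paper -- but those were maximums, so you may still be OK.")
```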
Power, strictly speaking, is the rate at which something uses energy. The 500 watts* is the maximum rate at which the computer will draw electricity.
The power supply provides energy to all the other things inside the computer–it’s ‘upstream’ of the other parts of the computer–so when considering the computer as a whole, you only need to consider its rating.
Power/voltage=amperage, so divide 500 watts by 120 volts to yield 4.167 amps.
Yes. Most of the time any given device will not be drawing at its maximum power, but assuming that they are ensures greatest safety.
Depends. I’m not that familiar with UPSes, but I have a laptop, which is like a computer with its own built-in UPS.
If the UPS is just passing electricity from the wall through to the connected equipment, it’ll just draw a little on the side to maintain itself. If there’s no external power, the UPS will draw down its internal battery to power itself and the connected equipment. The maximum power usage of a UPS occurs when the power comes back on, and the UPS resumes passing current to the connected equipment and powering its own operation… and it also charges its internal battery.
The UPS should give a maximum rating for the equipment it can power off its battery, and another maximum rating for the full load it will draw: connected equipment, charging battery, and everything.
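In other words, for budgeting purposes you’d use the UPS’s full-load figure. A toy sketch under that assumption (all three wattages below are hypothetical; the UPS spec sheet is the real source):

```python
VOLTS = 120

# Hypothetical numbers -- check the UPS spec sheet for real ones
connected_watts = 300   # equipment plugged into the UPS
charge_watts = 80       # extra draw while recharging the battery
overhead_watts = 10     # the UPS's own electronics

# Worst case: power just came back, so the UPS feeds the connected
# gear, recharges its battery, and runs itself, all at once.
worst_case_amps = (connected_watts + charge_watts + overhead_watts) / VOLTS
print(f"budget {worst_case_amps:.2f} A for the UPS")
```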
[sub]*Note: the full name of the unit ‘watt’ does not take a capital W; that’s only for the symbol W.[/sub]
I would like to add that your room is most likely not an isolated circuit. Outlets that are installed when a building is constructed will be wired in a convenient pattern. An outlet in one room often shares a circuit with an outlet in an adjacent room because of a common wall.
I had a circuit in my basement that started with a single light, moved upstairs to another light and socket, moved to the 2nd floor to a light and a socket and then went outside to the garage. Before I rewired it you couldn’t begin to run heavy equipment in the garage.
I have so many electrical items running on my office circuit that I had to plug my laser printer into another room/circuit. Otherwise it would dim the ceiling light slightly as the heating element cycled. Very annoying.