Am I about to burn my house down if I use my new microwave?

About a month ago, I bought a new microwave. However, the guy who installed it noticed something strange: the old microwave’s cord had its plug removed and was spliced to another cord inside an electrical box in my cabinets, and that second cord was plugged into the same outlet as my refrigerator, while an outlet directly above the microwave sat unused. So he plugged the new microwave into the unused outlet and left.

It never occurred to either of us that the two outlets were on two different circuits: the one the old microwave (and the refrigerator) was using is on a 20A breaker with nothing else on it, while the one he plugged the new microwave into is on a 15A breaker that also feeds all of the kitchen and living room lights plus the TV, DVR, and stereo. This is Not A Good Thing when the microwave by itself draws 14.75A while it is running.
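
To put rough numbers on it (the 14.75 A figure is from the microwave’s label; the wattages for the other stuff are just illustrative guesses, not anything I measured), the shared 15A circuit adds up something like this:

```python
# Rough load check for the shared 15 A circuit. The microwave draw comes from
# its nameplate; the other wattages are assumed, purely for illustration.
VOLTAGE = 120          # nominal US line voltage, volts
BREAKER_AMPS = 15      # rating of the breaker the new microwave ended up on

loads_watts = {
    "microwave": 14.75 * VOLTAGE,   # ~1770 W from the nameplate current
    "lights":    300,               # assumed: kitchen + living room lights
    "tv":        150,               # assumed
    "dvr":       30,                # assumed
    "stereo":    50,                # assumed
}

total_amps = sum(loads_watts.values()) / VOLTAGE
print(f"Estimated total draw: {total_amps:.1f} A on a {BREAKER_AMPS} A breaker")
print("Over the breaker rating!" if total_amps > BREAKER_AMPS else "Within the rating")
```

Even with modest guesses for the lights and electronics, the total lands well past the breaker’s rating.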

I called in an electrician, and he did something similar to the old setup (the only difference being that, rather than removing the plug from the microwave’s cord, he installed an outlet in the electrical box and plugged the cord into that). However, he noticed that the outlet the microwave and refrigerator are plugged into is not a 20A outlet but a 15A outlet (the slots lack the sideways “T” of a 20A receptacle), even though the breaker is marked 20A, so I assume the circuit has AWG 12 wiring.

Am I about to burn my house down by having what appears to be a 15A outlet hooked up to what had better be a 20A circuit if the microwave and refrigerator are running at the same time? Would replacing the 15A outlet with a 20A one improve the situation? And how do I even tell what kind of wiring is there without tearing my house apart? (It’s on the first floor of a two-story house, but the garage is directly underneath the first floor; I know the cable TV wiring runs in through the garage to the first floor.)

There’s nothing wrong with having a 15A receptacle on a 20A circuit. Using a 20A receptacle on a 15A circuit is a different matter and is not allowed.

Nah. You’ll be fine.

Also, a 20A circuit will have (at least) 12 gauge wiring, as opposed to 14 gauge, along with a 20A circuit breaker. In general, the only difference between a 20A and a 15A receptacle is that you can use both a 20A and a 15A (120V) plug with the 20A receptacle, whereas you cannot use a 20A plug with a 15A receptacle. Since your microwave has a 15A plug, it won’t make any difference if you use a 20A receptacle with it.
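
If it helps to see that compatibility rule spelled out, here’s a tiny sketch of which 120V NEMA plugs fit which receptacles (the T-shaped neutral slot on a 5-20R is what lets it accept both plug styles):

```python
# Which 120 V NEMA plugs physically fit which receptacles, per the rule above.
ACCEPTS = {
    "5-15R": {"5-15P"},           # standard 15 A receptacle
    "5-20R": {"5-15P", "5-20P"},  # 20 A receptacle with T-shaped neutral slot
}

def plug_fits(plug: str, receptacle: str) -> bool:
    return plug in ACCEPTS.get(receptacle, set())

print(plug_fits("5-15P", "5-20R"))  # True  - a 15 A plug fits a 20 A receptacle
print(plug_fits("5-20P", "5-15R"))  # False - a 20 A plug won't fit a 15 A receptacle
```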

To tell what gauge wiring is there, you could check at the circuit breaker and at the receptacle. It should be at least 12 gauge at both ends for a 20A circuit. If you’re not familiar with telling 12 and 14 gauge apart by sight, the cable jacket should be marked with the gauge number. A 12 gauge jacket is commonly yellow and a 14 gauge jacket is commonly white, in the US at least.
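
For quick reference, here’s the usual pairing of gauge, breaker size, and jacket color for US residential copper NM-B cable (the colors are only the common convention for cable made since roughly 2001, so treat them as a hint rather than a guarantee on older wiring):

```python
# Common US residential copper NM-B pairings. Jacket colors are the usual
# convention for newer cable; older cable may be white regardless of gauge.
WIRE_TABLE = {
    # gauge: (typical breaker amps, common jacket color)
    14: (15, "white"),
    12: (20, "yellow"),
    10: (30, "orange"),
}

def check_circuit(breaker_amps: int, gauge: int) -> str:
    max_amps, color = WIRE_TABLE[gauge]
    ok = breaker_amps <= max_amps
    return (f"{gauge} AWG ({color} jacket) on a {breaker_amps} A breaker: "
            f"{'OK' if ok else 'breaker too large for this wire'}")

print(check_circuit(20, 12))  # what the circuit should be
print(check_circuit(20, 14))  # the bad combination to watch for
```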

Don’t let your microwave get you in a Stranglehold.

Hmm. I find it interesting that the oven draws this much current yet has a standard 15 A plug (NEMA 5-15P). Even though it isn’t true in your case, that means it can be plugged into a 15 A circuit. I’m not an electrician, but I seem to recall that a branch circuit shouldn’t be loaded to more than 80% of the circuit rating, which means a 15 A circuit shouldn’t be loaded with more than 12 A. If that’s true, I wonder how the oven manufacturer gets away with putting a 15 A plug on the power cord? Shouldn’t they install a 20 A plug (NEMA 5-20P)?

Would it matter if it is a dedicated circuit?

Not sure. I did a quick Google search and saw a couple of references saying that the 80% rule only applies to loads that are continuously powered for more than 3 hours. Not sure what the SD is on this.
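
If that’s the way the rule works, the arithmetic looks something like this (just my reading of the 80% rule, with a 3-hour cutoff for “continuous”):

```python
# Sketch of the 80% continuous-load rule discussed above: loads that run for
# 3 hours or more are limited to 80% of the breaker rating; shorter-running
# loads can use the full rating. (My reading of the rule, not gospel.)
def allowed_amps(breaker_amps: float, runs_3h_or_more: bool) -> float:
    return breaker_amps * (0.8 if runs_3h_or_more else 1.0)

for rating in (15, 20):
    print(f"{rating} A breaker: "
          f"{allowed_amps(rating, True):.0f} A continuous, "
          f"{allowed_amps(rating, False):.0f} A non-continuous")

# A microwave run for a few minutes isn't a continuous load, which may be
# why a 14.75 A appliance can ship with a NEMA 5-15P plug.
```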