Alternating Current

I’m trying to understand the concept of AC power in contrast to DC. When you receive AC power, the charge is constantly switching from positive to negative, so how exactly does a household appliance use this power? For example, a light bulb needs the positive charges to go to one part of the bulb and negative to the other part, so how does the charge get from one place to the other? Maybe I have this completely wrong. Could someone dumb it down for me? Thanks.

The real confusion is that you are asking about power, not current or voltage.

In AC systems (alternating current), the current and the voltage go from positive to negative and back to positive sinusoidally like a wave. But that is not what your house sees, because it uses power.

Power is the product of voltage times current (P = VI) which ends up always being positive since a negative number times a negative number is positive.

So the end result is power that goes from zero up to some higher value (120W for example) and back down to zero. But this happens really really fast. The end answer is that your appliances see an average (RMS) value.

*this assumes the voltage and current are in phase which they should be.
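
If it helps to see the numbers, here’s a rough Python sketch of the in-phase case (the 120 V / 1 A RMS figures are just assumed for illustration): the instantaneous power v(t) × i(t) never goes negative, and its average over a cycle works out to Vrms × Irms.

    # Rough sketch of the in-phase case; the 120 V RMS / 1 A RMS values are assumed.
    import math

    V_PEAK = 120 * math.sqrt(2)   # peak voltage of a 120 V RMS supply
    I_PEAK = 1.0 * math.sqrt(2)   # peak current of an assumed 1 A RMS resistive load
    FREQ = 60.0                   # line frequency in Hz

    N = 10000
    power = []
    for k in range(N):
        t = k / (N * FREQ)                              # one full cycle
        v = V_PEAK * math.sin(2 * math.pi * FREQ * t)
        i = I_PEAK * math.sin(2 * math.pi * FREQ * t)   # in phase with v
        power.append(v * i)                             # instantaneous power

    print(min(power))          # 0.0 -- never negative
    print(sum(power) / N)      # ~120 W, i.e. Vrms * Irms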

Take your lightbulb on a 60 Hz line. For 1/30th of a second, a lot of electrons flow one way thru a filament, heating it up. For the next 1/30th of a second they flow the opposite way, still heating it up. Lather, rinse, repeat.

Many electric motors take advantage of the cycling to change the polarity of the electromagnets in the motor to get it to spin. Many standard electronics (TV sets, computers, etc.) have power supplies inside that convert the AC to DC. Google “bridge rectifiers” if you want to learn more.
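
For the curious, a bridge rectifier effectively flips the negative half of the wave so the load always sees the same polarity; ignoring diode drops and the smoothing capacitor, that’s just the absolute value of the input. A minimal sketch (the 170 V peak is roughly a 120 V RMS line, assumed here):

    # What a full-wave bridge rectifier does, ignoring diode drops and smoothing:
    # the four diodes steer current so the load always sees one polarity,
    # which amounts to taking the absolute value of the input voltage.
    import math

    FREQ = 60.0
    V_PEAK = 170.0   # roughly the peak of a 120 V RMS line (assumed)

    for k in range(8):
        t = k / (8 * FREQ)                                 # samples across one cycle
        v_in = V_PEAK * math.sin(2 * math.pi * FREQ * t)   # AC input
        v_out = abs(v_in)                                  # rectified output
        print(f"t={t * 1000:5.2f} ms  in={v_in:8.1f} V  out={v_out:7.1f} V")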

Let’s talk about a regular incandescent bulb here. It makes light by resisting the current flowing through its filament, which heats it up to the point that it glows white hot. (Regular light bulbs are actually much better at giving off heat than light, hence their use as the cooking element in the Easy-Bake Oven.) It will work whether the current always flows in one direction (DC) or flows one way for 1/120th of a second, then the other way (AC).

The usefulness of AC comes in when you take into account that the wires leading up to the bulb also resist the current in a small way. With DC, they might eventually heat up to the point of burning. With AC, the average power delivered to any point of the wire over time is zero, so you can send power farther with AC.

My favorite aspect of AC is that the current obviously has to pause briefly when changing directions, which means there is no current at that time. Therefore, your light bulb running on AC is blinking on and off 120 times per second.

Nitpick: RMS is the effective value, not the average.

RMS = peak × 0.707
Average (of the rectified waveform) = peak × 0.637
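
Both factors fall out of the math for a sine wave. A quick numeric check (peak of 1.0 assumed), where “average” means the average of the rectified waveform:

    # Numeric check of the two factors for a sine wave with a peak of 1.0.
    import math

    N = 100000
    vals = [math.sin(2 * math.pi * k / N) for k in range(N)]   # one full cycle

    rms = math.sqrt(sum(v * v for v in vals) / N)   # ~0.707, i.e. 1/sqrt(2)
    avg = sum(abs(v) for v in vals) / N             # ~0.637, i.e. 2/pi (rectified average)
    print(rms, avg)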

Where current and voltage are not in phase (any circuit besides a purely resistive one), something called power factor comes into play. There are inefficiencies in circuits where the power factor is less than unity, and therefore it is the goal of engineers to design circuitry so that it appears as resistive as possible to the power source.

A detailed discussion of power factor is not in order since it does not really address the OP, but here is a link.
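
The back-of-envelope version: real power is Vrms × Irms × cos φ, where φ is the phase angle between voltage and current, and the power factor is cos φ. A quick sketch with assumed numbers:

    # Power-factor arithmetic; the 120 V, 5 A, and 30-degree figures are assumed.
    import math

    V_RMS = 120.0
    I_RMS = 5.0
    PHASE = math.radians(30)            # current lags voltage by 30 degrees

    apparent = V_RMS * I_RMS            # volt-amperes the source has to supply
    real = apparent * math.cos(PHASE)   # watts actually delivered to the load
    pf = real / apparent                # power factor; 1.0 for a purely resistive load

    print(apparent, real, pf)           # 600 VA, ~519.6 W, ~0.866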

Oops, bad math. 1/120th.

Let’s start from the top…

Only partially true. The voltage at your AC outlet is virtually guaranteed to be an alternating & sinusoidal wave. But while the current is usually alternating and often resembles a sine wave, it does not need to be alternating nor does it need to be sinusoidal.

In a reactive circuit you can indeed have instantaneous “negative” power.
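
To see that, sample v(t) × i(t) with the current lagging the voltage (a 60° lag is assumed here just to make the effect obvious); for part of each cycle the product is negative, meaning the load is momentarily pushing energy back toward the source.

    # Instantaneous power in a reactive circuit; the 60-degree lag is assumed.
    import math

    FREQ = 60.0
    LAG = math.radians(60)   # current lags voltage by 60 degrees

    for k in range(12):
        t = k / (12 * FREQ)
        v = math.sin(2 * math.pi * FREQ * t)
        i = math.sin(2 * math.pi * FREQ * t - LAG)
        print(f"t={t * 1000:5.2f} ms  p={v * i:+.3f}")   # note the negative entries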

There’s no such thing as RMS power. And average voltage (or current) is not the same as RMS voltage (or current).

Where did you come up with that? The primary usefulness of AC is that, unlike DC, transformers can be used with AC.

Not really. The thermal time constant for a light bulb’s filament is much longer than the period (or half period) of the AC excitation voltage. As a result the intensity will vary sinusoidally, but the intensity won’t go to zero.
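
A crude first-order thermal model shows the idea (the 30 ms time constant and 2700 K figure are assumed, and real filaments cool by radiation rather than this simple linear law): the heat input ripples at 120 Hz, but the temperature only wobbles a little around its average instead of dropping toward zero.

    # Crude first-order thermal model of a filament; all constants are assumed.
    import math

    TAU = 0.030     # assumed thermal time constant, seconds (much longer than 8.3 ms)
    T_EQ = 2700.0   # assumed equilibrium temperature, kelvin, at average power
    FREQ = 60.0
    DT = 1e-5       # integration step, seconds

    temp = T_EQ
    temps = []
    t = 0.0
    while t < 0.1:                                                 # simulate 100 ms
        drive = 2 * T_EQ * math.sin(2 * math.pi * FREQ * t) ** 2   # 120 Hz power ripple
        temp += (drive - temp) / TAU * DT                          # first-order lag
        temps.append(temp)
        t += DT

    print(min(temps[-1000:]), max(temps[-1000:]))   # small ripple around ~2700 K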

I’m starting to understand, although, for example, if I put a battery in a CD player backwards it will not work because it is not receiving the correct charge, right? So why is it that other appliances work if the charge switches 120 times a second (or runs through its cycle 60 times a second)? Does it not matter what the charge is in appliances, then? If it doesn’t matter, why does the charge need to switch?

The Hat’s Rabbit: Some things run on AC (e.g. many motors, heaters, etc.). Some things run on DC, such as anything with solid state or active electronic components.

An AC source can (obviously) power an AC appliance.
A DC source can (obviously) power a DC appliance.
An AC source can power a DC appliance if there’s a rectification stage.
A DC source can power an AC appliance if there’s an oscillator and/or inverter stage.

Motors typically use AC electricity directly. Light bulbs don’t much care if the electricity is AC or DC. Most other things in your house require DC electricity. Your CD player, for example, has a DC motor in it, and the amplifier circuits and such inside of it need a constant DC power source. Most devices take the incoming AC and convert it to DC.

Since most things in your house use DC then the obvious question is why do we use AC in the power system? The answer is that if you transfer the same amount of power at a higher voltage you lose a lot less power to heat in the wires. AC can be stepped up and down in voltage with a simple transformer (just 2 coils of wire wrapped around a common iron core). There is no simple component that can step DC voltages up and down.
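
Rough numbers make the point (10 kW of load and 1 ohm of wire resistance are assumed): at ten times the voltage you need a tenth of the current, and since wire heating goes as I²R, the loss drops by a factor of a hundred.

    # Why higher transmission voltage helps; the 10 kW load and 1 ohm of wire
    # resistance are assumed just to make the comparison concrete.
    P_LOAD = 10_000.0   # watts to deliver
    R_WIRE = 1.0        # ohms of total wire resistance

    for volts in (120.0, 1_200.0, 12_000.0):
        amps = P_LOAD / volts          # current needed at this voltage
        loss = amps ** 2 * R_WIRE      # I^2 * R heating in the wires
        print(f"{volts:8.0f} V -> {amps:7.2f} A, {loss:10.2f} W lost in the wires")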