kVA vs. Watts

What is the difference between kVA and Watts? I know that it has something to do with the square root of three (or something like that) and I probably learned this at one time in my University education. But now that I am in the real world, I don’t understand why kVA is used sometimes and Watts (or kW) is used other times.

Is there anyone out there that can help me to understand this?

kVA: Kilo-Volt-Amperes: Voltage x Current (sort of). This is the power being transmitted by the line.

Watts: A Watt has the same units as a Volt-Ampere, but is generally taken to mean absorbed (used up) power only.

kVA is the total power that the power company sends down the line, and kW is the power that is wasted by the client or used to do work for the client.

kW is used for billing, so that they bill you for what you use. kVA is used for rating the capacity of components in a transmission system, because even if the power does no work, it is still energy being moved around.

Note: I am not a professional in the field. I have only taken one course on power systems, and it is entirely possible I screwed something up. Reactive power (the power that moves around but isn’t actually used up by anything) is a weird concept to understand.

Not exactly, though you’re on the right track. VA is apparent power, and is volts times amperes, as you say. Watts is true power, and is volts times amperes times power factor, where the power factor is the cosine of the phase angle between voltage and current; it only differs from 1 in reactive circuits. Reactive circuits are those which have inductive or capacitive elements, or both. Your power meter measures true watt-hours.
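To make that concrete, here’s a minimal Python sketch. The 230 V, 10 A, and 0.8 power factor are made-up illustrative values, not anything from the thread:

```python
import math

volts = 230.0        # RMS voltage (illustrative value)
amps = 10.0          # RMS current (illustrative value)
power_factor = 0.8   # cosine of the phase angle between voltage and current

apparent_power_va = volts * amps                  # VA: what the line must carry
true_power_w = apparent_power_va * power_factor   # W: what actually does work
phase_angle_deg = math.degrees(math.acos(power_factor))

print(f"Apparent power: {apparent_power_va:.0f} VA")     # 2300 VA
print(f"True power:     {true_power_w:.0f} W")           # 1840 W
print(f"Phase angle:    {phase_angle_deg:.1f} degrees")  # 36.9 degrees
```

So the line carries 2300 VA, but only 1840 W of it is doing work.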

I was going to say VA is real power, while watts includes both the real and the complex part (of course, this would only matter if there were reactive components). Is this the same as your definition, Q.E.D.? Not sure how it relates to the power factor.

Q.E.D. is right.

W (watts): real power

VA (volt-amperes): apparent power (the magnitude of the complex power)

VAr (volt-amperes reactive): reactive power

W/VA: power factor
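Those four definitions map neatly onto Python’s built-in complex numbers. A minimal sketch (the 1840 W and 1380 VAr figures are invented for illustration):

```python
# Complex power S = P + jQ: P is real power (W), Q is reactive power (VAr),
# and the magnitude |S| is the apparent power (VA).
P = 1840.0            # real power, in watts (invented value)
Q = 1380.0            # reactive power, in VAr (invented value)

S = complex(P, Q)     # complex power
apparent_va = abs(S)  # apparent power: magnitude of S
power_factor = P / apparent_va

print(f"|S| = {apparent_va:.0f} VA, PF = {power_factor:.2f}")  # 2300 VA, PF 0.80
```

Note that W/VA comes out as 0.80, matching Q.E.D.’s definition of power factor.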

Okay, so I had it backwards.

Here’s how I think about this:

Ideally, an inductor uses ZERO energy. A capacitor uses ZERO energy. Only heaters, lighting, motors, etc., use energy. Yet if you connect a capacitor or an inductor to your AC outlet, a significant current appears, which seems to imply that lots of electrical energy is flowing in the line. But this is impossible, since the coil or capacitor remains ice cold. Where is the energy going?

It’s going back and forth. When you hook a coil or capacitor to an AC generator, first the generator transmits energy to the coil/capacitor, but then the coil/capacitor sends all the energy back again. Ideally, none is used.

Volt-amps does not tell you the true energy flow. The real energy flow is measured in watts. If you hook a wattmeter to a big capacitive load, the wattmeter will indicate zero. But if you measure the VA in the capacitor circuit, the volts and amps will be large, and multiplying them together gives you the VA reading.

So how are watts measured? Not difficult in theory. You just measure the volts and amps over a tiny instant, then multiply them to get the wattage at that instant. That way the wattage at a particular instant will be positive if the generator powers the load, or negative if the load is dumping some energy back to the generator. Then you take the average wattage over a few seconds to determine “the” wattage used by the load. See the trick? If the energy was ONLY flowing back and forth, then the average energy-flow over a couple of seconds will average out to zero.
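That averaging trick is easy to demonstrate numerically. Here’s a quick Python sketch; the 50 Hz supply frequency and unit amplitudes are arbitrary choices:

```python
import math

f = 50.0         # supply frequency in Hz (arbitrary choice)
steps = 100000   # samples over one second
dt = 1.0 / steps

def average_power(phase_shift_rad):
    """Average v(t)*i(t) over one second for a given current phase shift."""
    total = 0.0
    for n in range(steps):
        t = n * dt
        v = math.sin(2 * math.pi * f * t)                    # voltage
        i = math.sin(2 * math.pi * f * t - phase_shift_rad)  # current
        total += v * i * dt                                  # instantaneous watts
    return total

print(average_power(0.0))          # resistive load: ~0.5 (Vpeak*Ipeak/2)
print(average_power(math.pi / 2))  # pure inductor: ~0.0, energy just sloshes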

When you mention sqrt(3) in power calculations, you are talking about 3-phase circuits.

As you will know, r.m.s. is the root mean square value of an alternating quantity, but that on its own only applies to a single sinewave.

When you have 3 sinewaves displaced by 120 degrees (that is, 2π/3 radians) then the single-sinewave value cannot apply on its own, hence root 3 is used.
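If you want to see where the root 3 actually comes from, here’s a quick numerical check in Python: take two equal sinewaves displaced by 120 degrees and compare the r.m.s. of their difference (a line-to-line voltage) with the r.m.s. of either one alone. Amplitudes and sample count are arbitrary:

```python
import math

steps = 100000
sum_sq_phase = 0.0
sum_sq_line = 0.0
for n in range(steps):
    theta = 2 * math.pi * n / steps
    v_a = math.sin(theta)                    # phase A voltage
    v_b = math.sin(theta - 2 * math.pi / 3)  # phase B, displaced 120 degrees
    sum_sq_phase += v_a ** 2
    sum_sq_line += (v_a - v_b) ** 2          # line-to-line voltage A-B

rms_phase = math.sqrt(sum_sq_phase / steps)
rms_line = math.sqrt(sum_sq_line / steps)
print(rms_line / rms_phase)  # ~1.732, i.e. sqrt(3)
```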
Why use kVA and Watts?

Power is fed down transmission lines as a product of current and voltage; however, due to various circuit components, the two become displaced relative to each other, especially at the consumer end, where loads can have a big impact on that displacement.

The cosine of this displacement angle is called the power factor, and the displacement tends to reduce the amount of useful power that can be transmitted.

Work can only be carried out by the in-phase components of current and voltage, but we need to take account of the time displacement, hence

Watts (true power) = Volts × Amps × power factor (the latter is the cosine of the displacement angle between them)

The load presented to the supply by household consumers is neither large enough nor reactive enough for anyone to worry too much about separate charges for power factor.

The load presented by industry to the power supply is a different matter; it can be highly reactive and can drag the current and voltage out of phase by plenty.

The result of this is that although a normal wattmeter would show the real power consumed, the cables, feeders, breakers and all the transmission lines have to be able to cope with the apparent power: the true power combined (using Pythagoras) with the reactive power, and this latter can be considerably more than the true power.
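In other words, the equipment has to be rated for the apparent power, which you get from the power triangle. A small Python sketch with invented numbers (400 kW and 300 kVAr):

```python
import math

true_kw = 400.0        # real power the plant actually consumes (invented)
reactive_kvar = 300.0  # reactive power sloshing back and forth (invented)

# Pythagoras on the power triangle: kVA = sqrt(kW^2 + kVAr^2)
apparent_kva = math.sqrt(true_kw ** 2 + reactive_kvar ** 2)
print(f"Equipment must handle {apparent_kva:.0f} kVA "
      f"to deliver {true_kw:.0f} kW")  # 500 kVA for 400 kW
```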

One way a utility supplier bills its heavy-duty consumers is to take account of this: there will be a charge for total electrical energy consumed (true power), and then an additional charge which takes the power factor into account.

There is often a maximum demand charge too: if you exceed a certain agreed instantaneous demand within, say, one billing period, you pay a premium, because you are effectively using up reserve generating capacity which has to be maintained. This can work in your favour, as the rate per unit on this kind of tariff is lower provided you stay within your agreed limit.
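Putting the pieces of such a tariff together, here’s a purely hypothetical sketch in Python; every rate and threshold below is invented for illustration, not taken from any real utility:

```python
# Hypothetical industrial tariff: energy charge + power-factor penalty
# + maximum-demand premium. All numbers are made up.
energy_kwh = 100000.0      # true energy consumed this billing period
peak_demand_kva = 520.0    # highest demand recorded this period
agreed_demand_kva = 500.0  # demand ceiling agreed with the utility
power_factor = 0.8

bill = energy_kwh * 0.10                          # energy charge at $0.10/kWh
if power_factor < 0.9:                            # power-factor penalty
    bill += energy_kwh * 0.10 * (0.9 - power_factor)
if peak_demand_kva > agreed_demand_kva:           # maximum-demand premium
    bill += (peak_demand_kva - agreed_demand_kva) * 15.0

print(f"Total bill: ${bill:,.2f}")  # $11,300.00 with these invented rates
```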

You may also pay for security of supply; this is where very large consumers will pay the power company to guarantee that, if there is a network problem, they will be least likely to be shut down when the company has to cut users off temporarily.
