For the electrical engineers: a question about resistors.

Ok, I know that this is a very basic question, and I should be ashamed of not knowing, but…

Does a resistor block current or use it up? If I attach a resistor across the leads of a battery is it only allowing x amount of current to go through because no more current can get through? Or is it only letting x amount of current through because it is using up the rest (i.e., giving it off as heat)?

Or have I completely screwed up this question?

One way to look at it is to say that a resistor causes electrical friction.

Like mechanical friction, if you put enough energy into a friction point it will warm up; the greater the force acting through the resistor, the hotter it gets.
That heat has to come from the energy source, so the friction point is taking one form of energy and converting it to another - heat.

The force pushing current through a resistor is the potential, or voltage difference, across it.
The resistor acts like a restricting device and limits the amount of current that can pass through it.
Some folk have difficulty understanding that it is not the amount of resistance that makes it get hot; it is the amount of current passing through it. A low-value resistor will allow more current to pass at any given voltage and is therefore more likely to get hot.

So to answer your question, in restricting current flow a resistor uses up power.
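
To put some rough numbers on that, here is a quick Python sketch (the 9 V battery and the two resistor values are just made-up examples): at the same voltage, the lower-value resistor passes more current and turns more power into heat.

  # Made-up example: the same 9 V battery across two different resistors.
  voltage = 9.0                          # volts
  for resistance in (100.0, 10.0):       # ohms
      current = voltage / resistance     # Ohm's law: I = V / R
      heat = voltage * current           # power converted to heat: P = V * I
      print(f"{resistance:5.0f} ohm: {current:.2f} A, {heat:.2f} W of heat")

The 100 ohm resistor draws 0.09 A and gives off 0.81 W; the 10 ohm resistor draws 0.90 A and gives off 8.10 W.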

I guess the most basic answer is that the resistor uses up the energy (gives it off as heat). It really doesn't use up the current; current is a function of the resistance and voltage, V = IR, or in this case I = V/R, where I is the current. So there is only so much current. However much current goes into a resistor, the same amount comes out, so no current is lost in the resistor. The energy is converted to heat. This is kind of the principle behind toasters and hairdryer heating elements.
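
As a hypothetical illustration (the supply voltage and element resistance below are just assumed round numbers), here is the same idea in Python for a heating element: the current that flows in one lead flows out the other, and all the electrical energy shows up as heat.

  # Hypothetical heating element on a 120 V supply (values chosen for round numbers).
  V = 120.0            # volts across the element
  R = 14.4             # ohms of element resistance
  I = V / R            # about 8.3 A; the same current enters and leaves the element
  P = V * I            # about 1000 W, all of it given off as heat
  E = P * 120          # run it for 120 seconds: 120,000 joules of heat
  print(I, P, E)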

Current is not used up. In fact, that is one of the two basic laws about how electric circuits work: the current going in equals the current going out. It comes from the idea of conservation of charge; you cannot create or destroy charge.

Before you connect the resistor across the battery, no current flows. When you connect the resistor across the battery, a specific amount of current will flow.

How much current will flow? Well, it depends on two things:

  1. The battery voltage, which is given in Volts. All else being equal, the higher the battery voltage, the higher the current.
  2. The value of resistance, which is given in Ohms. All else being equal, the lower the resistor’s resistance value, the higher the current.

This can be tidied up with Ohm’s law:

I = V / R

where V is in Volts, R is in Ohms, and I is in Amps.

In other words, if you keep the resistance constant and vary the voltage, the current will be proportional to the voltage. And if you keep the voltage constant and vary the resistance, the current will be inversely proportional to the resistance.

Of course, if the voltage is fixed, and the resistance is fixed, the current is fixed.
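
Here is a little Python sketch of those two relationships (the voltage and resistance values are arbitrary examples):

  # Ohm's law, I = V / R, with arbitrary example values.
  def current(volts, ohms):
      return volts / ohms

  # Fixed resistance, varying voltage: current is proportional to voltage.
  print([current(v, 100.0) for v in (1.0, 2.0, 4.0)])       # [0.01, 0.02, 0.04]

  # Fixed voltage, varying resistance: current falls off as 1 / R.
  print([current(10.0, r) for r in (10.0, 100.0, 1000.0)])  # [1.0, 0.1, 0.01]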

To sum up, a specific voltage value and a specific resistance value will give you a specific current value. The actual value of current (in Amps) depends on the actual voltage (in Volts) and resistance (in Ohms).

And as someone else has pointed out,

  • the current entering the resistor, and
  • the current exiting the resistor, and
  • the current entering the battery, and
  • the current exiting the battery

are all the same. The current is the same everywhere in the circuit, and (of course) is calculated using Ohm's Law.

Hope this helps.
P.S.: This is really a first-order / ideal explanation. While Ohm's law is always true, in reality there are many other variables involved, e.g. the dependence of resistance on temperature and humidity, voltage coefficient of resistance (VCR), lead resistance, the battery's source impedance, tolerances, leakage currents, etc. etc. etc.

Compare an electrical circuit to plumbing. Current correlates to water flow rate, and voltage correlates to water pressure. If you put water through a restriction of some kind, the amount of water going in is the same as the amount of water going out, but its pressure is reduced. In the same way, if you put electrical current through a resistor, you’ll get as much current out as you put in, but its voltage will be reduced.

Like CrafterMan said. And for DC power, the power dissipated by the resistor (as heat) is equal to the current squared times the resistance. Energy is equal to the power times the time. For AC it can be a little more complex. I still find resistors used in control circuits on large DC motors in industrial applications. Big waste of energy in most cases. Modern power electronics can save these folks big bucks and keep the lights on in CA.
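
For a rough sense of scale, here is a quick Python sketch of that I-squared-R loss (every number below is invented purely for illustration: the current, the resistance, the running hours, and the electricity rate):

  # Invented numbers: a series resistor in an old DC motor control circuit.
  I = 20.0                    # amps through the resistor (assumed)
  R = 2.0                     # ohms (assumed)
  P = I**2 * R                # heat dissipated: P = I^2 * R = 800 W
  hours = 2000.0              # running hours per year (assumed)
  kwh = P / 1000.0 * hours    # energy wasted as heat: 1600 kWh per year
  rate = 0.15                 # assumed electricity price in $/kWh
  print(f"{P:.0f} W wasted, {kwh:.0f} kWh/yr, roughly ${kwh * rate:.0f}/yr")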

Actually the current is reduced. The voltage does not vary when you vary the resistance.

DrMatrix, you're confusing the behavior of a resistor by itself with an entire circuit which happens to contain a resistor. The current coming out of a resistor is most certainly the same as the current going in, and the voltage is certainly lower coming out than it was going in. Ref. Helmholtz's compatibility and continuity laws, referred to indirectly by gazpacho.

In the case of an entire DC circuit, if you impose a higher or lower voltage across a resistor, the current will be higher or lower according to Ohm’s Law, as you state. But the above paragraph about the resistor itself is still true.
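
A small series-circuit example in Python makes both points at once (the 9 V battery and the two resistor values are arbitrary): the same current flows through everything, while each resistor drops part of the voltage.

  # Two resistors in series across a battery (arbitrary example values).
  V_batt = 9.0
  R1, R2 = 1000.0, 2000.0       # ohms
  I = V_batt / (R1 + R2)        # 0.003 A; the same current flows through both resistors
  V1, V2 = I * R1, I * R2       # 3 V and 6 V dropped across R1 and R2
  print(I, V1, V2, V1 + V2)     # the drops add back up to the 9 V of the battery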

Er, make that Kirchhoff, not Helmholtz. All those 19th c. German scientists looked alike, hehehe.