Is there a difference between a voltmeter and a multimeter?

Are they basically the same thing, or is there some fundamental difference?

You can buy a dedicated panel meter that only measures voltage. The ones I have seen are AC-only or DC-only. A multimeter usually measures voltage and resistance. I’ve got a Fluke 12B which also measures capacitance and can check diodes. Some can measure amps (current flow), but only small values, as the meter has to be placed in series, whereas voltage is checked in parallel. Ammeters for higher than the milliamp range are usually special purpose.

Some automotive multimeters have functions to measure RPM when connected to the ignition system. Fluke also makes temperature probes that can be used in conjunction with their meters.

As the name implies, a voltmeter only measures volts, while a multimeter measures other quantities as well, like amps, resistance, etc.

Oh, some analog voltmeters can work with the power being provided by the voltage being measured itself. IIRC the old Simpson 260 meters I used in the Navy would work for voltage with no batteries. Measuring resistance requires that the meter provide voltage so the current flow can be measured to derive resistance.
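To make that ohmmeter idea concrete, here’s a minimal Python sketch of the arithmetic the meter is doing: apply a known voltage from its internal battery, read the resulting current, and divide. The 1.5 V battery and the 0.3 mA reading are numbers I made up purely for illustration, not the specs of any real meter.

```python
# Minimal sketch of what an ohmmeter does internally: apply a known voltage,
# measure the current that flows, and use Ohm's law. Values are illustrative.
battery_volts = 1.5       # voltage the meter's internal battery applies
measured_amps = 0.0003    # current that flows through the unknown resistor

resistance_ohms = battery_volts / measured_amps   # R = V / I
print(f"Unknown resistance is about {resistance_ohms:.0f} Ohms")   # ~5000 Ohms
```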

One thing to point out is that people may use the term voltmeter for a Digital Multimeter (DMM).

If it has a digital readout, test leads, and a large indexed or notched knob, the chances are that it’s more than just a voltmeter, even though someone may have referred to it as that.

In fact, I can’t remember the last time I saw a pure voltmeter sold as a tool. All of them are actually multimeters nowadays.

I only see voltmeters as components (the thing with a needle and some sort of scale) that can be used as part of something else (like a multimeter, to state one example).

I see. So compared to a voltmeter, a multimeter can do all that and more.

Many thanks, muchachos.


Actually, this gets into the ‘figure of merit’ for a good voltmeter. Drawing current from the circuit under test inherently changes the value measured from what it would be without the meter present. (One of the few everyday demonstrations of the Heisenberg Uncertainty Principle, to my way of thinking.)

‘Goodness’ in a voltmeter is expressed by its ‘impedance’ - implying how little current it will draw - in ohms/volt (the ‘volt’ meaning the full-scale voltage of the meter).

Self-powered meters are much better in this regard than the old circuit-powered ones.
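To put some rough numbers on that loading effect, here’s a quick Python sketch of reading the midpoint of a 1 MOhm / 1 MOhm divider across a 10 V source. The 20 kOhm/volt analog figure and the 10 MOhm DMM input resistance are common textbook values I’m assuming for the comparison, not specs of any particular meter.

```python
# Sketch of meter loading on a 1 MOhm / 1 MOhm divider across a 10 V source.
# The meter's input resistance sits in parallel with the lower resistor.

def reading(v_source, r_top, r_bottom, r_meter):
    """Voltage seen across r_bottom with the meter's resistance in parallel."""
    r_parallel = (r_bottom * r_meter) / (r_bottom + r_meter)
    return v_source * r_parallel / (r_top + r_parallel)

v, r = 10.0, 1e6
true_value = v * r / (r + r)                  # 5.00 V with no meter attached
analog_vom = reading(v, r, r, 20_000 * 10)    # 20 kOhm/volt on a 10 V range = 200 kOhm
typical_dmm = reading(v, r, r, 10e6)          # assumed 10 MOhm DMM input resistance

print(f"no meter:   {true_value:.2f} V")
print(f"analog VOM: {analog_vom:.2f} V")      # about 1.43 V -- badly loaded down
print(f"DMM:        {typical_dmm:.2f} V")     # about 4.76 V -- much closer
```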

How about the difference between a volt-ohm meter (VOM) and a multimeter of 30 years ago? IIRC, they both measured volts and resistance on multiple scales, but VOMs were more accurate for voltage because they had much higher impedances. Or no?

I have an ancient Westinghouse voltmeter next to my DVD player. It once belonged to Mrs. Nott’s grandpa, who was a radio engineer, and who, reportedly, invented the electric scoreboard.

My more modern multimeter measures everything. Subtle variations can tell if the chipmunks in my woodpile are laughing at my cat. I never show Freckles the results; he’s a proud guy. :smiley:

Nothing but terminology. The impedance of an analog meter, like the Simpson VOMs or multimeters, depends upon the sensitivity of the galvanometer that is the guts of the meter.

The most common value was a galvanometer that deflected full scale with one milliampere of current. These were often referred to as “1000 Ohms/volt.” So if you were on the 10 V scale, the impedance was 10 kOhms. On the 100 V scale, 100 kOhms, etc.

Meters were also available with galvanometers having a full-scale deflection at 100 microamps. These were 10,000 Ohms/volt.
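Just to restate that arithmetic as a quick Python sketch (the sensitivities and ranges are the ones from this post, nothing more): the ohms/volt figure times the range in use gives the resistance the meter presents on that range.

```python
# Ohms/volt rating times the selected range = total resistance on that range.
def input_resistance(ohms_per_volt, full_scale_volts):
    return ohms_per_volt * full_scale_volts

for sensitivity in (1_000, 10_000):   # 1 mA and 100 uA movements
    for fs_volts in (10, 100):        # selected meter range
        r = input_resistance(sensitivity, fs_volts)
        print(f"{sensitivity} Ohms/volt on the {fs_volts} V range -> {r / 1000:.0f} kOhms")
```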

Galvanometers were made with even more sensitivity, but I never heard of any of them being used in portable meters.