What meter types and specifications are appropriate for testing instrumentation and measurement cables to determine their shunt resistance to ground or between conductors? The problem is that in very sensitive, accurate temperature measurements with resistance-based detectors (RTDs), slight conductance between cable conductors can add significant errors. As a minimum, the books say, make sure no cable has less than 100 megohms between conductors. But in analyzing errors and evaluating the reliability and trustworthiness of different cable types and installation practices, one might reasonably want to test to 1 or 10 gigohms, or perhaps even more.
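To put rough numbers on why the 100 megohm floor matters, here is a small back-of-the-envelope sketch. The values are assumptions for illustration: a Pt100-style sensor (nominal 100 ohms, roughly 0.385 ohms per degree C) with the cable's shunt resistance appearing in parallel with the element.

```python
# Illustrative calculation (assumed values, not from any standard):
# temperature error caused by cable shunt resistance appearing in
# parallel with a nominal 100-ohm RTD element.
r_rtd = 100.0   # ohms, Pt100-style element at 0 degC (assumed)
sens = 0.385    # ohms per degC, approximate Pt100 sensitivity (assumed)

for r_shunt in (1e8, 1e9, 1e10):  # 100 Mohm, 1 Gohm, 10 Gohm
    r_meas = r_rtd * r_shunt / (r_rtd + r_shunt)  # parallel combination
    err_ohm = r_rtd - r_meas                      # resistance error
    err_degc = err_ohm / sens                     # equivalent temp error
    print(f"{r_shunt:.0e} ohm shunt -> {err_ohm:.1e} ohm low, "
          f"~{err_degc:.1e} degC error")
```

At 100 megohms the error works out to roughly a ten-thousandth of an ohm, a few hundred microkelvin equivalent, which shows why only quite demanding work would push the requirement toward the gigohm range.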
Choosing a meter for this isn't trivial because there seem to be two distinct purposes that meter descriptions don't distinguish clearly. One is what I describe above. The other is testing whether insulation is safe at high voltage, or starts to break down.
Certainly, some meters advertise high test voltages, 1000 VDC or more, applied to the component (cable) under test. That is very different from an ordinary multimeter in resistance mode, where you generally assume the voltage the meter produces won't do anything ugly to the thing being tested (though even that sometimes requires thought and care). Some meters have an insulation-testing mode that uses these high voltages, and if the meter also has a DMM-style resistance mode, the two modes are separate, even though the insulation mode gives a reading in ohms. Some insulation testers use a hand-cranked generator to create the high voltage, though electronic supplies that step up from internal batteries are increasingly common and seem to be displacing the crank types.
Even more potentially confusing, the term "megger," which seems somewhat colloquial, appears to be used for the high-voltage devices that verify insulation safety, yet it derives from "megohmmeter," which emphasizes the resistance measurement.
Measuring very high resistances does require higher voltage and lower current than measuring more common resistor values. The fact that a higher voltage is used doesn't necessarily mean the meter can't tell me what I want to know. But insulation breakdown and some other phenomena that high voltage can provoke may be nonlinear, may not obey Ohm's law, and therefore may not be relevant to the tiny errors I care about in measurement circuits.
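The voltage-versus-current tradeoff above can be made concrete with Ohm's law alone. This sketch assumes purely linear (ohmic) leakage, which is exactly the assumption that breakdown phenomena would violate; the test voltages are arbitrary round numbers, not specs of any particular meter.

```python
# Rough illustration (assumed voltages, linear leakage assumed):
# the current a meter's front end must resolve when measuring
# large resistances at different test voltages.
for v in (1.0, 100.0, 1000.0):    # volts applied by the meter (assumed)
    for r in (1e8, 1e9, 1e10):    # 100 Mohm to 10 Gohm
        i = v / r                 # Ohm's law, assuming no breakdown
        print(f"{v:>6.0f} V across {r:.0e} ohm -> {i:.1e} A")
```

At the low end (1 V across 10 gigohms) the meter must resolve on the order of 100 picoamps, which is one plausible reason insulation testers reach for hundreds of volts: it brings the measured current up into an easier range, not only because they want to stress the insulation.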
So: what's the straight dope on specifying a meter that does what I want, rather than something aimed at an entirely different and potentially contradictory purpose?
Thanks!