Calibrating measuring equipment?

In this video, this guy tests an older CRT oscilloscope and finds that some of the measurements are a bit off: Tektronix 2225 Analog Oscilloscope - EEVblog #196 - YouTube

And in this video the same guy calibrates the oscilloscope to fix it: Tektronix 2225 Oscilloscope Teardown and Calibration - EEVblog #208 - YouTube

In both videos he uses an Agilent digital waveform generator as a reference. But how does he know that the waveform generator is true and the oscilloscope is off, and not the other way around?

Typically, reference equipment is calibrated to a reliable standard once per year, or at the specified re-cal period. The calibration house will have standards that are traceable to NIST primary standards. I didn’t watch the video, but my guess is the guy knows his generator is in calibration.

Some measuring devices, because of the complex circuitry involved, are prone to drift in their measuring ability. A more stable signal source, perhaps because it has fewer or different components, can be used to calibrate the measuring device.

I didn’t watch the videos.

‘But how does he know that the waveform generator is true and the oscilloscope is off…’

Because his calibration source, the Agilent digital waveform generator, will have a calibration certificate that states its calibration accuracy and the last date on which the certificate is valid. The Agilent digital waveform generator will have been calibrated by its manufacturer using in-house calibration standards that are traceable back to national standards.

I think the question is: how does he know that it’s the scope that’s off and not the generator?

The answer to that is more complicated. A piece of equipment is calibrated at regular intervals, typically once a year or once every two years. The manufacturer knows roughly how long a properly functioning item can stay in cal, and the interval is set to be shorter than that. This assumes nothing goes haywire during that period. If something happens during the cal-valid interval that causes the generator to drift, it’s up to someone to notice it. The technician will probably notice that all the scopes he is seeing are off, and will then suspect the generator.

For more critical applications, calibration houses offer a service whereby, if a piece of gear is found to be out of tolerance, they notify anyone whose equipment was calibrated with that piece of gear and offer a re-calibration.

So to answer the question in the OP’s scenario: the generator would be within its cal period, but the old scope was likely well past due.

An easy way is to measure a known source, for example a crystal oscillator, which will have a frequency tolerance of around 100 ppm (+/- 100 Hz at 1 MHz, near “room” temperature); that should be close enough for most uses. That is how I know my oscilloscope, a 30-year-old Tektronix 2213 that hasn’t been calibrated in the 10 years I’ve had it and thousands of hours of run time, is still close to its specs: a 1 MHz waveform fits in the proper divisions (1 cycle per division at 1 us/div and 1 cycle per screen at 100 ns/div). Similarly, a 5 volt regulator measures near 5 volts, although a multimeter is better for DC or 50/60 Hz sine wave AC voltages, and a frequency counter is better if you need an exact frequency. The scope’s specs allow for tolerances of +/- several percent, which is why using a 100 ppm oscillator is valid; voltage regulators may be +/- 5%, but a multimeter can confirm the actual voltage.
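
To put rough numbers on why a 100 ppm crystal is good enough to check a scope whose timebase spec is only a few percent, here is a quick back-of-the-envelope sketch in Python. The values are illustrative (the 3% timebase spec is an assumption, not a figure from the scope’s manual or the videos):

# Sanity-check arithmetic for using a ~100 ppm crystal oscillator
# as a rough reference for an analog scope (illustrative values).
nominal_hz = 1_000_000              # 1 MHz crystal
tolerance_ppm = 100                 # typical crystal tolerance near room temperature

error_hz = nominal_hz * tolerance_ppm / 1e6
print(f"Crystal error: +/- {error_hz:.0f} Hz")                     # +/- 100 Hz

period_s = 1 / nominal_hz           # 1 us period
print(f"Divisions/cycle at 1 us/div:   {period_s / 1e-6:.1f}")     # 1 division
print(f"Divisions/cycle at 100 ns/div: {period_s / 100e-9:.1f}")   # 10 divisions = full screen

scope_tol_ppm = 0.03 * 1e6          # assumed +/- 3% timebase spec
print(f"Reference is ~{scope_tol_ppm / tolerance_ppm:.0f}x tighter than the scope spec")

If the 1 MHz trace lands noticeably off those division counts, the error is far larger than anything the crystal could plausibly contribute, so the scope is the suspect, not the reference.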