The equivalent of sample rate and bit length for the LP format

I can’t seem to find the equivalent of sample rate and bit length (or whatever the appropriate measures are) for LPs in the wealth of “Vinyl versus CD” articles out there. Is there such a thing?

I used to think that the RPM speeds (33, 45, etc…) had something to do with the “sample rate”, but after looking through a few audio books, I’m not so sure.

Would it be more appropriate to look at the specs of the master tape format from which LPs are made?

Vinyl is analog, CDs are digital. There isn’t any direct correlation.

To elaborate, a sound wave can’t be processed digitally unless it is converted into information a computer can understand. Similar to how a camcorder records “frames” of motion and plays them back quickly to recreate the appearance of motion, the sound wave is sampled in slices, each representing a specific portion of the waveform. The more samples per second, the truer to the original waveform the computer-modeled representation will be. Bit depth refers to the number of amplitude values that can be assigned to a digitally sampled waveform. Google around and you’ll find hundreds of sites with information on the subject, ranging from the basic to the ridiculous.

I understand that this is, to a large extent, an “apples and oranges” question, but there must be some kind of standard of resolution or audio fidelity that can be applied to LPs that I can roughly convert (in however convoluted a way) into the straightforward sample size/sample frequency attributes of CD audio.

In other words, CD audio has a certain set amount of precision that is determined by the format. How is this precision measured for analog recordings – or is it completely arbitrary?

There is no sample rate in analog. OTOH, the bit length (8-bit, 16-bit, etc.) could be correlated to the accuracy of the turntable. For instance, to take it to an extreme, the stylus can only resolve groove variations down to the height of one atom.

If you want a notional sample rate for comparison purposes, you could take twice the maximum frequency (i.e. the frequency where the amplitude drops by an appreciable factor, e.g. 3 or 6 dB). According to Shannon’s sampling theorem, you need to sample at at least twice the maximum frequency in the signal to be able to reconstruct the original waveform (using a mathematically ideal low-pass filter, which cannot be built because it would violate causality).
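
To make that concrete, here’s a minimal Python sketch of the idea; the 20 kHz cutoff is just an assumed figure for a hypothetical playback chain, not a measured LP spec:

```python
# Notional sample rate for an analog medium, per the Shannon sampling criterion.

# Assumed -3 dB cutoff of a hypothetical LP playback chain, in Hz (illustrative).
f_max_hz = 20_000

# You must sample at more than twice the highest frequency present
# to be able to reconstruct the original waveform.
notional_sample_rate_hz = 2 * f_max_hz
print(f"Notional sample rate: {notional_sample_rate_hz} Hz")  # 40000 Hz
```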

For constructing a notional word length, you could compare the noise of an analog recording to the quantization noise of a digital one. E.g. a signal-to-noise ratio of 60 dB (just to name a figure) is an amplitude ratio of 1000:1, which you could compare to the quantization error of the lowest bit in a 10-bit value (2^10 = 1024).
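
And the word-length half of the comparison, with the same caveat that 60 dB is only an illustrative figure:

```python
import math

# Assumed signal-to-noise ratio of the analog recording, in dB (illustrative).
snr_db = 60

# Convert dB to an amplitude ratio: 60 dB -> 1000:1.
amplitude_ratio = 10 ** (snr_db / 20)

# A quantizer whose lowest bit sits at the noise floor needs about
# log2(1000) ~= 9.97 bits, i.e. a 10-bit word (2^10 = 1024).
notional_bits = math.ceil(math.log2(amplitude_ratio))
print(f"{snr_db} dB SNR ~ {notional_bits}-bit word length")  # 10-bit
```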

These are just notions that appear mathematically plausible to me as a first approximation; I have no idea if these comparisons are any guide to the relative quality of the recording to people with refined hearing.

How about bandwidth? Not a measure of fidelity so much as a measure of the available frequencies. CDs and the like are what, 20 Hz – 20 kHz? Notice how that sounds different from your telephone at 300 Hz – 3 kHz?

Not really related to your question, but very interesting nonetheless.

You may have already seen it as it was on slashdot at one time…

http://www.cs.huji.ac.il/~springer/

The simplest way I can think of to characterize the “quality” of a turntable + LP setup would be to measure the signal-to-noise ratio. This has the advantage of not being an “apples to oranges” comparison, as the signal-to-noise ratio can also be measured or estimated for a CD deck + CD.

The signal-to-noise ratio is, as the name implies, the ratio of signal strength to noise strength. I’m sure someone can come along and give a few examples where measuring the signal-to-noise ratio would not provide an adequate comparison to CDs, but I don’t know enough to do that.
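
For the sake of illustration, here’s a rough Python/NumPy sketch of what measuring it could look like, with a made-up test tone and noise floor standing in for real recordings:

```python
import numpy as np

def snr_db(signal: np.ndarray, noise: np.ndarray) -> float:
    """Signal-to-noise ratio in dB, from the RMS level of each recording."""
    rms_signal = np.sqrt(np.mean(signal ** 2))
    rms_noise = np.sqrt(np.mean(noise ** 2))
    return 20 * np.log10(rms_signal / rms_noise)

# Hypothetical example: a full-scale 440 Hz tone vs. a noise floor 1000x smaller.
t = np.linspace(0, 1, 48_000, endpoint=False)
signal = np.sin(2 * np.pi * 440 * t)
noise = np.random.default_rng(0).normal(0, 1e-3 / np.sqrt(2), t.size)
print(f"SNR ~ {snr_db(signal, noise):.1f} dB")  # roughly 60 dB
```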

Balthisar: I suppose you could plot the SNR as a function of frequency, and that would cover your comparison criteria as well.

Some of this has already been covered, but here are a few values that more or less directly compare.
Dynamic range is how loud the loudest sound can be relative to the softest. It’s what the bit depth tells you. The dynamic range of a CD player (16 bits) is about 96 dB. For an analog recording, the dynamic range is mostly determined by manufacturing, but the signal-to-noise ratio in playback comes in as well. For a phonograph record, you can’t make the groove too narrow (imperfections in manufacturing end up as noise) or too wide (limited by record size and presumably cartridge movement). The dynamic range ends up around 60 dB at the very best, more like 50 dB in practice.
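
The 96 dB figure falls straight out of the bit depth (each bit is worth about 6 dB), and the same rule maps the LP’s ~60 dB back to roughly 10 bits, as suggested earlier in the thread. A quick check:

```python
import math

def dynamic_range_db(bits: int) -> float:
    # Each bit doubles the number of amplitude steps,
    # adding 20 * log10(2) ~= 6.02 dB of range.
    return 20 * math.log10(2 ** bits)

print(f"16-bit CD: {dynamic_range_db(16):.1f} dB")             # ~96.3 dB
print(f"10-bit (~LP at best): {dynamic_range_db(10):.1f} dB")  # ~60.2 dB
```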

How much range is needed is another matter. The range of human hearing from threshold (0 dB SPL) to pain is about 120 dB. But since there’s some level of background noise when you’re listening (or possibly recording), you generally won’t set the volume too low, so a 96-dB range is sufficient to keep the loudest parts from being too loud. Many music recordings are compressed to be louder anyway, and don’t even use the full range. Older analog tape masters weren’t much better than 60 dB either (though the LP may have been compressed, whereas the CD possibly isn’t). However, having more range obviously can’t hurt. It just may not mean that much. It’s like shaving blades.

Frequency response (or ‘bandwidth’) is what the sample rate tells you. For a time-sampled signal like a CD, the theoretical maximum frequency is exactly half the sampling frequency (a CD uses 44.1 kHz, giving a 22.05 kHz maximum). You can be sure it’s impossible to record anything of a higher frequency on the CD.

For an analog signal, there is no theoretical maximum frequency, but there’s always a practical maximum. I’m not sure what the practical frequency limit of a record is, but the range is probably far more determined by the stylus. To produce a high-frequency signal the stylus must move at that frequency, and its finite mass means there’s going to be some limit. Not to mention that there’s a design tradeoff: a smaller-mass cartridge is more likely to bounce or skip. This cartridge, described as ‘audiophile’ and ‘low mass’, has a frequency response “Essentially flat from 20 – 22,000 Hz”.

When looking at frequency response, consider that humans can’t hear tones above 20 kHz (it’s possible that higher-frequency content has some effect on a heard sound, but any such effect is likely very slight). Most recording equipment isn’t designed to pick up sounds outside that range anyway, and most speakers aren’t able to reproduce them either.

A 10 kHz sine wave, digitally sampled at 40 kHz, is indistinguishable from a 10 kHz square wave digitally sampled at 40 kHz.
A 5 kHz sine wave, digitally sampled at 40 kHz, will appear different on an oscilloscope than a 5 kHz square wave, but will still look pretty chunky, composed as it is of only 8 different voltage samples per cycle.
At 2.5 kHz, which is starting to get into the range where people can hear the difference between a sine wave and a square wave, the sampled wave will be made up of only 16 distinct voltages per cycle, as compared to the smoothly varying voltage produced by an analog system. That’s distortion. I suppose you could quantify it by calculating the RMS deviation of the samples from the sine curve, but that’d take too much calculus to be any fun.

Great link, whatami. It’s good to see that someone tried that with their scanner. If they could work the bugs out, it’d be a much easier way of dumping LPs to CD than hooking up the old turntable.

Holy crap that is cool. I am awed.

If you took a 2.5 kHz sine wave and sampled it at 40 ksamples/sec, you would get 16 different voltage values per cycle, but it wouldn’t be distorted. If you put those samples into a D/A converter and filter the output properly, you’ll get a very nice, undistorted waveform. As a matter of fact, you can run the sine wave all the way up to nearly 20 kHz and still get an undistorted output when it’s regenerated.
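
For anyone who wants to verify that, here’s a small NumPy sketch: sample a 2.5 kHz sine at 40 kHz, reconstruct it between the sample points with (truncated) sinc interpolation, and compare against the ideal sine. The windowing is crude, so treat this as a demonstration rather than a proper DAC model, but the mid-span error comes out tiny, nothing like a 16-step staircase:

```python
import numpy as np

fs = 40_000          # sample rate, Hz
f = 2_500            # test tone, Hz (well below the 20 kHz Nyquist limit)
n = np.arange(400)   # 10 ms worth of sample indices
samples = np.sin(2 * np.pi * f * n / fs)

# Reconstruct on a 10x finer time grid via Whittaker-Shannon (sinc) interpolation:
#   x(t) = sum_n x[n] * sinc(fs*t - n)
t_fine = np.arange(0, n.size, 0.1) / fs
kernel = np.sinc(fs * t_fine[:, None] - n[None, :])
reconstructed = kernel @ samples

ideal = np.sin(2 * np.pi * f * t_fine)
# Skip the ends, where truncating the sinc sum causes edge error.
mid = slice(t_fine.size // 4, 3 * t_fine.size // 4)
err = np.max(np.abs(reconstructed[mid] - ideal[mid]))
print(f"max mid-span error: {err:.2e}")  # small; limited by truncation, not sampling
```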