A couple of years ago, there seemed to be a major push to produce CDs with 20-bit digital recording technology. Proponents hailed the achievement as a breakthrough in sound reproduction. Lately, I’ve been noticing 24-bit recordings hitting store shelves.
A few questions:
Is 24-bit recording substantially better than 16-bit?
Does producing a CD using “high-definition sound” cost more than 16-bit methods?
Is it that much more expensive?
What is the current maximum bit depth that could be produced with today’s technology?
My sincere apologies to pjen,philosopher, and CrankyAsAnOldMan for being a jackass in a couple recent threads. Thanks to JillGat for her patience.
Well, there is the HDCD format, which encodes 20-bit resolution. DVD-Audio supports up to 24-bit. And you can find normal DVDs that have a 24-bit PCM audio track. So it’s out there.
24-bit recordings are exactly 8 bits better than 16-bit recordings.
2, 3. Depends on what recording format you’re talking about. The most advanced formats should not be more expensive to produce than a DVD, in terms of the end-user product. However, studio production can be considerably more expensive when a studio has only recorded the performances at 16 bits, since a higher-resolution release means going back to the session. On the other hand, many studios record at 24 bits or even higher and downsample for 16-bit CDs, so for them it wouldn’t cost a penny extra.
Your question doesn’t really make sense to me. In a mathematical sense, it should be possible to produce a recording that has so many bits that it can’t be distinguished from the original performance. But how many bits are required? First they told us that 16 bits was enough. Now they say we need 24 bits. I betcha in a few years, they’ll start telling us we need 32 bits. It’s all a big scam to get us to buy expensive new equipment and charge even more for basically the same content. But to get back to answering your question, yes, there is one format that meets your criterion: direct-to-disc analog recording, pressed on vinyl records. Unfortunately, the latest digital technology isn’t even as good as what they used to do with analog in the 1950s.
Not much higher than 24 bits. The limiting factor you’re going to run into is the analog-to-digital converter. Everything after that can be handled by a fast enough general-purpose computer chipset, since it’s just zeroes and ones. But the initial converter is a specific hardware chip that needs to support your desired bit depth. Some quick checking shows that TI only makes up to 24-bit ADCs, so I’d say it would be pretty hard to find anything much better.
Not looking for a GD, and I am neither an engineer nor a high-end audiophile, but this claim on its face seems… well… highly questionable, to be honest with you, given the advances in recording technology over the past half-century. By what reasoning are you seriously maintaining that current digital recording/playback technology is inferior to that of a direct-to-disc analog recording of 50 years ago?
Again, I’m not an engineer, just an interested layman looking for a clue as to how you can hold this opinion, which would seem to be contrary to technological expectations.
It is well known amongst audiophiles that nothing comes close to a DTD analog recording on vinyl (well, as long as the records are in good condition and not scratched). The products of the late 1950s and 1960s were about the peak of the art; it became almost a lost art in the 1970s as people started playing with digital.
The reasoning is simple, but hard to explain without calculus. The short story is that a digital waveform is an approximation of the original waveform, while the analog waveform IS the original waveform. Every digital sampling method will introduce small errors, no matter how many bits or what sampling rate you use. It is a mathematical fact that the digital sampling does not reproduce the original waveform perfectly. But how much of a perfectionist are you?
Would you rather have a DAC (Digital to Analog Converter) with a resolution of 2[sup]16[/sup]-1 or one that has a resolution of 2[sup]24[/sup]-1? Hard numbers are a wonderful thing.
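Hard numbers indeed. Here’s a quick back-of-the-envelope sketch of what those resolutions buy you, assuming the standard rule of thumb for an ideal converter (the formula, not any particular product’s spec):

[code]
# Back-of-the-envelope dynamic range per bit depth, using the
# standard approximation SNR ~ 6.02 * bits + 1.76 dB for a
# full-scale sine through an ideal converter.
def quantization_snr_db(bits):
    return 6.02 * bits + 1.76

for bits in (16, 20, 24):
    print(f"{bits}-bit: {2 ** bits:,} levels, ~{quantization_snr_db(bits):.0f} dB")

# 16-bit: 65,536 levels, ~98 dB
# 20-bit: 1,048,576 levels, ~122 dB
# 24-bit: 16,777,216 levels, ~146 dB
[/code]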
[Trust me, the accurate reproduction of ultrasonic harmonics and extremely fast signal transients is very desirable. Especially if you are attempting to reproduce the intrinsically more accurate transcript of a DTD (Direct To Disc) recording.]
For the record, there are 32-bit DACs, but they are expensive and temperamental little critters to operate, and super-stable high-frequency clock drivers are not yet a common feature of many home stereos.
I’ve heard the “vinyl is better than CD” argument, and I’m curious about the level of financial investment necessary to hear the difference. (Okay, there will be those that will argue that nobody can really tell the difference, but let’s assume that we can.)
I mean, personally, I’ve listened to some of my dad’s records on an old Technics record player and it sounded okay. About as good as a cassette tape, I’d say, and definitely not as good as CD. The record seemed to be in good shape, so I’ll assume the loss(?) of quality is to be found in the hardware. So how good a record player would I need to be able to notice (if not necessarily appreciate) the difference between records and CDs?
And this just about weeds you out of the running for the title of “True Audiophile”.
Anyone who wishes to make the remotest claim of appreciating a vinyl (pronounced “vinnle”) recording must own an AR, Thorens, Rabco (straight arm tracking) or Benjamin Miracord (me) turntable or suffer the scorn of the true cognoscenti. Hah!
Damn, I hate getting into this, because trying to talk reality with people who have fallen for the “analog is real sound, and digital is an approximation” line is very tiresome. However, we must fight ignorance where we see it, I guess.
Direct-to-disc recordings can be very good. And they can be bad. I have a few that I recall as being among the best recordings I have. However, there are a couple of things wrong here. The technique definitely did not reach its peak in the 60s, and it is not better than current (or even 10-year-old) digital recording methods. Different, yes. Better (by any objective standard, anyway), no.
Sigh, so much wrong, so little space to explain. It is a fact, demonstrable by comparing waveforms on a two-channel scope, that a waveform can be reproduced essentially perfectly by digital sampling. You have to take some precautions, and not all ADCs and DACs are created equal, but I have seen (and heard) a complex musical waveform sampled, stored, copied, and then reconverted to analog and played back along with the original into a two-channel scope, with the results superimposed. The two signals were identical to the resolving power of the scope (a high-end model at the time, about 6 years ago). Listening told the same story: the digital copy was impossible to tell from the original.
One of the main things you hear from the analog crowd is the line that digital is just a sample, and you never recover the entire waveform. This is wrong. A waveform properly sampled and dithered will be entirely recovered. There is no lost information in the frequency range below the Nyquist frequency (just under half the sampling rate), down to the limit of the resolution (determined by the sample size). So the current widely used standard for CDs (approx. 44.1 kHz sampling rate) gives you a usable range up to approx. 20 kHz. And years of testing show that any signal above this range makes no difference to the perceived sound.
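For the skeptics, here’s a minimal sketch of that claim in code: a toy sinc-interpolation reconstruction, not how a real DAC filter is built, and the tone frequency and window size are arbitrary choices of mine:

[code]
# Toy demonstration of the Nyquist claim: a tone below half the
# sampling rate can be rebuilt between the samples by ideal (sinc)
# interpolation. Illustrative only, not a production DAC filter.
import numpy as np

fs = 44100                                 # sampling rate, Hz
f = 5000                                   # test tone, well below Nyquist (22050 Hz)
n = np.arange(4096)
samples = np.sin(2 * np.pi * f * n / fs)   # the stored samples

# Evaluate the reconstruction at fractional positions between samples,
# away from the edges so the truncated sinc kernel has room to work.
t = np.linspace(1000.25, 3000.75, 50)
recon = np.array([np.sum(samples * np.sinc(ti - n)) for ti in t])
truth = np.sin(2 * np.pi * f * t / fs)

print("max reconstruction error:", np.max(np.abs(recon - truth)))
# With a finite window the error is tiny and shrinks as the window
# grows; the ideal infinite sum recovers the waveform exactly.
[/code]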
The other line the analog guys like to throw around is that a digital sample will not produce a smooth waveform, and you may even hear the term “stair-step waveform” thrown in by the more uneducated ones. This is also wrong. Again assuming a properly sampled and dithered AD/DA process, the resulting waveform can be smoother than that analog signal you love on your vinyl. How can that be, you ask? Simple. High-frequency noise, commonly known as “hiss”. Noise causes small random variations in the signal that look like you are modulating the (mostly) lower frequency music. This shows up as a “wobble” in the originally “smooth” waveform. Guess what? Noise levels on the best vinyl are around 20 dB worse than digital. This means a smoother waveform output from the digital source, not from the vinyl.
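Since dithering keeps coming up, here’s a rough sketch of what it actually buys you (toy amplitudes and frequencies of my choosing, and plain TPDF dither rather than the noise-shaped dither a real mastering chain would use):

[code]
# Sketch: quantizing a very quiet tone without dither produces error
# that is correlated with the signal (audible harmonic distortion);
# adding TPDF dither before rounding turns that error into benign,
# signal-independent noise. Amplitudes are in quantization steps.
import numpy as np

rng = np.random.default_rng(0)
fs, f = 44100, 1000
t = np.arange(fs) / fs                       # one second of audio
signal = 1.5 * np.sin(2 * np.pi * f * t)     # only 1.5 steps of amplitude

plain = np.round(signal)                     # undithered quantization
tpdf = rng.uniform(-0.5, 0.5, t.size) + rng.uniform(-0.5, 0.5, t.size)
dithered = np.round(signal + tpdf)           # TPDF-dithered quantization

# Correlate each version's error against the 3rd harmonic: the
# undithered error carries obvious distortion, the dithered one doesn't.
h3 = np.sin(2 * np.pi * 3 * f * t)
print("undithered:", abs(np.mean((plain - signal) * h3)))     # ~0.1
print("dithered:  ", abs(np.mean((dithered - signal) * h3)))  # near zero, just noise
[/code]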
The bottom line is that every recording process introduces some distortion into the signal. The fact that most “high-end” audiophiles don’t want to admit is that analog recording methods (while they can be breathtakingly good) introduce more distortion than digital, and usually of a more random nature that is harder to deal with down the playback chain.
Both methods can produce amazing results, but by any objective standard, digital is more accurate.
Yes, I agree, so much wrong. Your understanding of many things is fundamentally flawed. Your weak understanding of the Nyquist sampling thresholds does not account for ultrasonic and high-speed transients (as mentioned by Zenster). A sampled waveform is NEVER exactly the same as the original; it is a mathematical impossibility. First you say the copy is identical, then you say the sampling errors are merely beyond the range of human hearing. So make up your mind, which is it?
There is much to audio beyond what can be represented on a scope, especially a “high-end” one from 6 years ago that is a digital device rather than a good old analog device. Go find me a nice mid-70s analog scope and I’ll teach you a few things about waveforms. But don’t waste my time with the new digital scopes; they have sampling errors of their own. Your convoluted story proves nothing. What was the origin of the waveform, a digital synth? A guitar? A harpsichord? A recording of an instrument? A digital recording of an instrument? How did you make the analog recording, and the digital recording? I can find so many flaws in your weakly described scenario that you could never possibly prove what you set out to prove.
Yes, you can make crappy DTD recordings, and analog players have their inherent flaws. But anyone can hear the difference between a good DTD disc and a CD. An audiophile friend of mine went to much trouble to locate an excellent DTD record, as well as a modern digital remastering on CD, and we compared them on his $80k stereo. The DTD disc sounds like the jazz quartet is in the room, has excellent presence, and even shows excellent back-wall reflective effects. The CD sounds like you’re playing a CD, a damn good CD, but still just a CD. I was skeptical myself, until I heard the difference. No, I don’t want the “smoother” waveform that you seem to think is superior; I want the REAL waveform without dithering or smoothing, ALL the waveform as the musician performed it, subsonics, transients, and all. Objective measurements with instruments are not the ultimate arbiter of human perception.
And yes, DTD peaked a long time ago. Not because of technology, but because nobody does live studio performances that can fill a whole album side anymore. Audiophile recordings have become a limited market and tend to be more of a demonstration record than a commercial product, unlike in the heyday of DTD.
I do remember in my telecommunications class last year that the professor asked if anybody knew what had been approved for CDs, and one annoying kid correctly answered that a new 24-bit standard was being introduced. (This in response to those who said they didn’t know WTF the OP was talking about.)
As for the analog/digital thing: it always seemed to me that the “audiophile” was just another quirky rich guy (or not-so-rich, occasionally) who wants to find another way to claim superiority. “Maybe you can’t tell the difference, but my wealthy ears certainly can. Now if you’ll excuse me, I must go eat some food that you will find disgusting because you don’t know anything about good taste.” I mean, your friend with an $80k stereo is a case in point. WHO THE FUCK SPENDS $80K ON A STEREO?! I don’t care how rich you are, there are better fucking things to do with $80K than buying a stereo.
It is very practical to spend such an amount of money if you are a record producer, and get paid for your “golden ears” like he was. He also had a record collection of over 10,000 pieces of vinyl. But I guess you’d think that was a frivolous expenditure too.
Well, this ain’t the first time I’ve stuck my foot in my mouth, and it won’t be the last. However, for every “golden-eared” record producer there are 100 yuppies who don’t know what the fuck they’re doing but certainly need to be cool, so they have a record player and insist it’s superior. My uncle lives in San Fran and everybody around him is a yuppie, and they all have record players but don’t know why.
The 16 bit, 44 kHz sampling used for CDs is something of a compromise. There is certainly some lost information compared with an analogue recording. Generally however, a CD gives pretty good reproduction at a low price. The low noise of digital recordings compared with analogue makes the digital sound superior on most systems.
I can accept that a really good analogue recording, lovingly cherished and played through a ridiculously expensive stereo, may be an audibly better reproduction than a CD. I suspect that 24-bit, 96 kHz recordings will be totally indistinguishable from the analogue, even for “golden ear” audiophiles, although it may take a double-blind test to convince them of that.
RJKUgly, what was the bit depth and sampling rate of your digital recording, and what was the resolution of your scopes? A 12-bit digital scope will clearly show a 16-bit digital recording and an analogue recording as identical!
Chas E.: “A sampled waveform is NEVER exactly the same as the original; it is a mathematical impossibility.”
Agreed, but then an analogue recording is never exactly the same as the original either. When the quantisation noise introduced by sampling is smaller than the random noise on an analogue recording, the bit depth becomes irrelevant. And granted, a 44 kHz sampling rate gives poor high frequency reproduction and you lose much of the ultrasound, but just how ultrasonic do you want to go before you admit it isn’t relevant? How high a frequency will an analogue recording reproduce? How high a frequency can you get out of the loudspeakers, even on an 80000 dollar stereo?
Are you saying it is possible to reproduce a sampled waveform exactly using analog techniques? Analog circuits have bandwidth limits, as do analog recording media. Circuit noise can be reduced but even theoretically, it can’t be completely eliminated. Given a real-world analog audio system, I don’t see why it is impossible to build a digital system that matches its performance. If you are not bound by standards for sampling rates and resolution currently in use, that is.
1) A digital sample cannot perfectly describe the original waveform, but for most people it is good enough. It will also sound the same after the CD has been played a hundred times.
2) The analogue recording may more perfectly describe the original waveform the first time the vinyl is played, but most people won’t know the difference. After playing the vinyl a hundred times, however, the hisses, clicks, and pops which typically accompany a heavily used LP recording will be obvious to anyone.
Bottom line: digital is a superior medium in the real world of imperfect turntables, dust, scratches, and kids.
This is getting very close to Great Debate territory, but I’ll try to proceed carefully. And I’m responding to several different people here, I hope this isn’t confusing.
Yes, it does. Ultrasonics and high-speed transients (which amount to the same thing when you’re talking about waveforms) are, by definition, above the Nyquist frequency when sampling at 44.1 kHz. The assumption (and it is backed up by decades of testing) is that neither of these affects the audible signal. Those of you who think you can hear ultrasonic signals have never, not even once, been able to prove it in a documented controlled listening test.
Actually, mathematically speaking, it is possible. Going by the math only, a Fourier transform gives you back exactly the original waveform. The real world isn’t so kind, however, and true perfection isn’t available to us. But you’ve answered your own question, and in the same way I did. I said essentially perfectly, and that no information was lost down to the level of resolution (or quantization, to use the more correct term). Essentially perfect means no detectable differences for our purposes.
We did several signals that day, and several more off and on over a period of months. The one that impressed me was the output of a TEAC reel-to-reel studio recorder. We also sampled a DTD disc, and several high-end vinyl pressings (Telarc, Sheffield, etc.).
And yet in blind listening tests, so much of the difference goes away. Not between vinyl and CD, of course; those are pretty easy to tell apart. The surface noise, wow and flutter, compressed dynamic range, etc., make the vinyl easy to spot as a rule. But comparing the vinyl to the original, or the CD to the original, blind listening tests show that the CD is much harder to distinguish from the original than the vinyl is.
Dithering and smoothing during the AD/DA cycle get you closer to the original waveform, and are not designed to (and don’t, if properly done) smooth transients out of the music. Subsonics are much easier to record on digital than on vinyl, BTW, because of the high stylus excursion they require. That leaves ultrasonics. If you think ultrasonics affect the hearable music, step up and take a double-blind test. You’ll be the first person in history to pass, I’m certain.
This has become, unfortunately, all too true much of the time. 30 years ago, it took some understanding of audio, and a dedication to the hobby to get truly good reproduction. Now all it takes is money, and not that much of that. $10,000 will get you some amazing sound in a reasonable sized room.
Yes, there is certainly some compromise, but much less than the high-enders would have you think. And if the analog recording you’re talking about is vinyl, then many double-blind listening tests have shown that even a 16/44 CD gives a copy that is harder to distinguish from the original than the vinyl is.
Very true. But I don’t think you’ll convince all of them. Many just won’t see the reality.
Both excellent points. Depending on the digital methods used, however, you can change “for most people it is good enough” to “all people”. Even using 16/44, it is much more difficult to tell a well recorded CD from the original signal than the high-enders would have you think.
And the digital recording is better in the real world of turntables, period. Dust, scratches, kids, etc., all just make the differences larger.
The OP’s asking about 24 bits rather than 16 bits, but what’s the difference?
The difference is noise. You sample an analog signal at a high enough rate (44.1 kHz) to capture the frequencies audible to the human ear, and you store those samples in a certain number of bits of information.
By sampling at 8 bits, for instance, you’re forced to round each sample up or down to the nearest integer (from 0 to 255). This error can be visualized as random noise added to the signal, noise that just happens to alter the signal enough to land exactly on an integer boundary.
Nobody purposely adds this noise while sampling the sound; that’s just how your ear interprets the rounding error.
Want an example? Try listening to a sound sample played at 1 bit. You can recognize what it is, but the noise level is overpowering.
So the difference between 16 bits and 24 bits is the noise introduced by rounding the analog sample to the nearest integer. Of course, 16 bits of data (65,536 different levels) makes that noise level so low that you can’t hear it given the acoustic properties of a reasonably priced stereo setup, making it good enough for government work!
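If you want to see that rounding noise in numbers, here’s a toy sketch (crude rounding with no dither, and the 997-cycle test tone is an arbitrary choice; the “1-bit” case is really a 3-level approximation):

[code]
# Toy sketch: quantize a full-scale sine at several bit depths and
# measure the SNR of the rounding error. Crude rounding, no dither.
import numpy as np

def snr_after_quantizing(bits, n=1 << 16):
    t = np.arange(n)
    x = np.sin(2 * np.pi * 997 * t / n)      # full-scale test tone
    steps = 2 ** (bits - 1)                  # quantization levels per polarity
    q = np.round(x * steps) / steps          # round to nearest level
    noise = q - x
    return 10 * np.log10(np.mean(x ** 2) / np.mean(noise ** 2))

for bits in (1, 8, 16, 24):
    print(f"{bits:2d} bits: ~{snr_after_quantizing(bits):.0f} dB SNR")

# Roughly 6 dB per bit: about 9 dB at 1 bit (overpowering noise),
# ~50 dB at 8 bits, ~98 dB at 16 bits, ~146 dB at 24 bits.
[/code]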
The whole 24-bit studio recording thing is because of the fact that when you mix two digitized signals, the noise trapped in each is added together in the final copy. So while 16 bits might be fine for two channels playing separately on a stereo, when you mix 20 channels together at 16 bits each, the noise starts to get in the way. So studios record and do all the noise-sensitive mixing at a higher bit depth, and downsample it for the CD after all the noise-introducing mixing is done.
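Here’s a toy version of that bookkeeping (hypothetical signal levels and channel counts, and plain rounding instead of the dithering a real console would use, but the noise-power arithmetic is the same):

[code]
# Toy sketch of noise accumulation when mixing independently
# quantized tracks: each 16-bit track carries its own rounding
# noise, and those noise powers add, so a 20-track mix has a
# noticeably higher noise floor than a single track.
import numpy as np

rng = np.random.default_rng(1)
n = 1 << 16
step = 1 / 2 ** 15                            # one 16-bit quantization step

def one_track():
    clean = 0.04 * rng.standard_normal(n)     # stand-in for one channel
    return clean, np.round(clean / step) * step

for channels in (1, 4, 20):
    clean_mix, quant_mix = np.zeros(n), np.zeros(n)
    for _ in range(channels):
        c, q = one_track()
        clean_mix += c
        quant_mix += q
    noise = quant_mix - clean_mix             # accumulated rounding noise
    print(f"{channels:2d} channels: noise floor "
          f"{10 * np.log10(np.mean(noise ** 2)):.0f} dB relative to full scale")

# Going from 1 to 20 tracks raises the floor by about
# 10*log10(20), i.e. ~13 dB, which is why the mixing happens
# at 24 bits and the requantization to 16 bits happens once, at the end.
[/code]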