Are (analog) records actually considered to sound better than (digital) CDs?

I call B.S. The link to the c’t article is pretty informative (and saved me the effort of finding it). Even 128 kbps MP3s were of equivalent quality to CDs in a blind test. Nowadays, almost everything is 192 VBR or up.

I won’t judge what sound you like, but there are no meaningful quality differences among CD, MP3, and vinyl. Each may carry its own inherent equalization bias, but to most human ears there are no real audible differences in their ability to reproduce sound.

I think it’s a lot like the “quarter inch club” on a shooting forum I read. Almost everyone had a rifle that could make a quarter-inch group on demand. But then the club was formed: to get in, all you had to do was send in a target to the judges. A lot of people suddenly had to sheepishly admit that when trying for repeatable and objectively measured results, it was not as easy as they had thought. I suspect that those who can “hear the difference” are right, but that in a blind test they would have a much more difficult time telling which format is higher quality than they would just saying “that’s a different source”.

PS: Just in case it isn’t clear, the compression mentioned above is not MP3-style data compression but dynamic range compression (like an automatic volume knob). Both CDs and vinyl can be dynamically compressed; the data rate on CDs doesn’t change, though.
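To make the distinction concrete, here’s a minimal sketch of that “automatic volume knob” (a static, hard-knee compressor with no attack/release smoothing; the threshold and ratio are arbitrary illustration values):

```python
# Dynamic range compression: samples above a threshold get their excess
# level scaled down by the ratio; quiet passages pass through untouched.
import numpy as np

def compress(x, threshold_db=-20.0, ratio=4.0):
    """Static hard-knee compressor for a float signal in [-1, 1]."""
    eps = 1e-12                                         # avoid log(0)
    level_db = 20 * np.log10(np.abs(x) + eps)           # per-sample level in dB
    over_db = np.maximum(level_db - threshold_db, 0.0)  # dB above threshold
    gain_db = -over_db * (1.0 - 1.0 / ratio)            # shrink the excess
    return x * 10 ** (gain_db / 20.0)

signal = np.array([0.9, 0.05, -0.7, 0.01])
print(compress(signal))  # loud samples pulled down, quiet ones unchanged
```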

mks57, man, you make a lot of qualifications. Of course LPs are more prone to degradation with improper handling. Variables in engineering, etc., can make anything sound better or worse. But the fact remains that UNDER IDEAL CIRCUMSTANCES, LPs have about 10 dB more dynamic range than CDs.

Again, I cite my experiences with a direct-to-disc LP of Dark Side of the Moon. Played with my Shure V15/5 cartridge on my Pioneer quartz-locked tangential-tracking turntable, you could walk into the room with the opening track playing and not even be aware that anything was playing, and by the end of the clock crescendo you’d be running out of the room from the sound!

I also echo the thoughts of those who equate pops and clicks with LPs. How about you HANDLE THE RECORD BY THE EDGES, CLOSE THE DUST COVER WHEN PLAYING, AND RETURN THE LP TO ITS SLEEVE WHEN YOU ARE DONE? Duh.

I can make CDs sound crappy too. In fact, a small scratch can make them unplayable, whereas an LP will just make a pop and play on.

As of this moment, my turntable is in storage, and I listen to all music in a digital form. It ain’t what you play it on, it’s what you play!

FWIW, at least ten years passed between my comparisons. I am still disappointed in CD sound, as a pure audiophile, but the convenience of digital recording almost makes up for it. Almost. :frowning:

And FWIW, I have done a lot of A/B testing and find that for MP3, 192 kb/s is where I can’t tell the difference any longer. Of course, that may just mean there’s nothing more for my ears to gain at that point. I will be interested in hearing some of the new high-def audio formats. When my rich uncle dies, that is. :wink:

I find it interesting that Neil Young is one of the biggest whiners about CD fidelity. When I saw him last year at the Nokia in LA, he sounded like he was down a well. Rush, normally audio perfectionists in concert, sounded like crap too, over-amplified and mushy, so it is probably the venue.

The emperor has no clothes. Despite the brain-dead positive reviews of the place, it is an ugly barn with crap acoustics. Never again!

Record players are mechanical devices with motors, drives, platters, bearings, shafts, magnets and needles; as such, my feeling is that they respond rather well to care and attention, especially during manufacture. That is to say, an expensive turntable will outperform a cheap one in most cases.

CD players have a laser, a DAC, and that’s about it, so a cheap one and a costly one will sound about the same; from there on it’s up to the signal processing and amplification. Back in the 80s everyone had a cheap turntable, so of course the new CDs sounded better. But with modern items of sufficient quality, the comparison becomes one of personal choice, as stated in the posts above.

I would love this laser turntable: no pops / crackles / hiss, doesn’t damage the vinyl and it even plays damaged records.

Just telling you what I read. IIRC Neil said his LP of “Harvest” was better than the CD of it. As the person who created it, arguably, he is in a position to know what subtleties they were striving for when recording it and how those translate or don’t into the final listening experience via different media. IOW, take it up with Neil. :wink:

I used to be an audio engineer, doing mostly live sound but some recording studio work for musicians I knew. I am sure that vinyl sounds better than CDs in theory, but unfortunately by the time CDs were ubiquitous my ears were too old to pick the difference. Few people understand just how limited their capacity to differentiate sounds is. Every year your top end drops out a bit.

I still attend a lot of live music, and I am sure that young people nowadays have relatively more damaged hearing than when I was young. It is commonplace for me and my friends to not bother trying to talk at music venues because of the level at which even the filler music is played.

Mind you the biggest crime against music nowadays is compression gone mad.

Is this a problem of the recording medium, or of the capture procedure? In other words, I’d think that these problems are in the sensitivity of the microphone, and would then apply to any recording medium.

I think you misunderstood me. When I said “infinite”, I did not mean to imply that there are no limits up and down. I meant to refer to the detail and fineness in the middle.
For example, if I pluck a stringed instrument, and then move my thumb to raise the pitch, then the note will change. For illustration, let’s say it went from a C to an F. In the real world, the sound waves from that instrument were of an infinite variety. It played every single wavelength in that range, an infinite number of them, each for an infinitesimal length of time. In theory, an analog device of sufficient quality would be able to record all of them. But a digital device - by design! - is incapable of doing such a thing.
A digital device samples the sound many thousands of times per second (the standard for audio CDs is 44,100 samples per second) and records the sound at that moment. It then does the same thing a fraction of a second later. The sounds between those two moments have been lost. It’s very much like a movie or video picture, which is actually a series of still shots. They come together so quickly that most people perceive it as continuous, but I can’t help suspecting that when people talk about the “warmth” of vinyl, they’re referring to the missing sounds, and when they talk about the “harshness” of digital, they’re referring to a staccato quality below the threshold of consciousness.

Except that they don’t, quite. First, the better CD player will have a better error-correction algorithm. More importantly, though, not all DACs are equal. Furthermore, good circuit layout will reduce channel crosstalk.

Now, is this difference as big as between a cheap and a good turntable? Probably not, but it is noticeable.

Actually both, but the recording medium is the predominant source of noise in analog systems. Any electronic circuit has thermal noise, which can be an issue for things like radio telescopes that are dealing with very weak signals. It isn’t normally a problem for audio equipment. Analog magnetic tape and vinyl records have noise due to irregularities at the microscopic level. Sort of like the visual noise produced by film grain.
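For scale, a back-of-the-envelope number: the Johnson-Nyquist thermal noise voltage of a resistor is

$$V_{\mathrm{rms}} = \sqrt{4 k_B T R \,\Delta f} \approx \sqrt{4 \times 1.38 \times 10^{-23} \times 300 \times 1000 \times 20{,}000} \approx 0.6\ \mu\mathrm{V}$$

for a 1 kΩ resistor at room temperature across the 20 kHz audio band, which is tiny next to line-level signals of around a volt.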

A digital device (an analog-to-digital converter, or ADC) is perfectly capable of capturing every frequency within its specified operating range. If you want to get picky, no device with a bandwidth limit, either digital or analog, is going to perfectly reproduce your stringed instrument. By shifting the pitch, you are in effect frequency modulating a carrier signal. A mathematical analysis will show that this produces an infinite number of sidebands in the frequency domain. In the real world, most of them can be safely ignored because they contain an infinitesimal amount of energy and information. If we just consider signals with constant amplitude and frequency (sine waves), they can be accurately captured by a properly designed digital system. In fact, digital systems excel at this because they use very accurate timing sources (quartz crystal oscillators) that are far more stable and precise than the speed regulation of an electromechanical device like a tape transport or turntable.
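For the mathematically inclined, that sideband claim is the standard Bessel-function expansion of a frequency-modulated tone, with carrier $f_c$, modulation $f_m$, and modulation index $\beta$:

$$\sin\!\big(2\pi f_c t + \beta \sin 2\pi f_m t\big) = \sum_{n=-\infty}^{\infty} J_n(\beta)\,\sin\!\big(2\pi (f_c + n f_m)\, t\big)$$

The coefficients $J_n(\beta)$ fall off rapidly once $|n|$ exceeds $\beta$, which is why all but a handful of sidebands can safely be ignored.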

People often ask, “What about the gaps between the digital samples?” The answer is that there is no information in those gaps. If there was anything there, it was eliminated by the anti-alias low-pass filter that precedes the analog-to-digital converter.

When you play back a CD, those digital samples are fed into a reconstruction filter. The output of the reconstruction filter is an almost exact match for the signal that entered the analog-to-digital converter.
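If you want to check this yourself, here’s a minimal sketch (assuming NumPy; the tone and sample counts are arbitrary illustration choices) that samples a band-limited tone at the CD rate and then recovers values between the samples by Whittaker-Shannon (sinc) interpolation:

```python
# The "gaps between samples" of a band-limited signal can be recovered
# almost exactly by sinc interpolation of the stored samples.
import numpy as np

fs = 44100.0                         # CD sampling rate (Hz)
f = 1000.0                           # 1 kHz test tone, well below fs/2
T = 1.0 / fs
n = np.arange(2048)                  # sample indices
x = np.sin(2 * np.pi * f * n * T)    # the stored samples

# Evaluate the reconstruction halfway between samples, away from the edges
t = (np.arange(500) + 512 + 0.5) * T
x_rec = np.array([np.sum(x * np.sinc(ti / T - n)) for ti in t])
x_true = np.sin(2 * np.pi * f * t)

print("max reconstruction error:", np.max(np.abs(x_rec - x_true)))
# Prints a very small number (limited only by truncating the infinite
# sinc sum): the in-between values were never lost.
```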

You seem to concede that it is not a perfectly exact match. Could you elaborate on the difference?

The signal is quantized with a finite number of bits, 16 bits for CDs, which limits the signal-to-noise ratio. The more bits, the higher the signal-to-noise ratio. So very weak signals get lost in the noise.
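The standard rule of thumb for an ideal $N$-bit quantizer with a full-scale sine-wave input puts a number on this:

$$\mathrm{SNR} \approx 6.02\,N + 1.76\ \mathrm{dB}, \qquad N = 16:\quad 6.02 \times 16 + 1.76 \approx 98\ \mathrm{dB}$$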

Phase noise from the system clock, which is very short-term instability in frequency, can produce distortion in the analog signal.

Filters are not perfect. High-frequency (> 20 kHz) components in the input signal can leak through the anti-alias filter and produce aliasing distortion (see the sketch below). Filters can also introduce phase distortion and irregularities in frequency response.

ADCs and DACs can introduce noise and distortion via quantization error. Real-world devices don’t perform as well as ideal devices.

Noise can be coupled into the circuits from the power supply and power distribution networks.
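One of those failure modes is easy to demonstrate. Here’s a minimal sketch (assuming NumPy; frequencies chosen purely for illustration) of the aliasing mentioned above: a 25 kHz tone that leaks past the anti-alias filter becomes, once sampled at 44.1 kHz, numerically identical to a 19.1 kHz tone:

```python
# Aliasing: fs - 25000 = 19100, so the sampled 25 kHz tone matches a
# (sign-flipped) 19.1 kHz tone exactly, down to floating-point error.
import numpy as np

fs = 44100.0
n = np.arange(64)
tone = np.sin(2 * np.pi * 25000 * n / fs)    # 25 kHz, above fs/2
alias = -np.sin(2 * np.pi * 19100 * n / fs)  # its 19.1 kHz alias
print(np.max(np.abs(tone - alias)))          # essentially zero
```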

I have the same issue, but it’s got more to do with DJs talking over the music–which is an anomaly even vinyl doesn’t saddle us with.

I think the problem with the OP is that the question is aimed at the laboratory setting of sound reproduction, but the average Bruce doesn’t listen to music in the lab. He listens to his music against a backdrop of a humming engine and road noise, or at the beach, or in the gym, etc. Almost NOBODY listens to music in a laboratory setting. I can say with certainty that my record player’s quality would be poor in the car or at the beach, and that vinyl vs. CD becomes a silly argument when the fridge and the dishwasher are running in the next room.

Analog does sound better than digital, because it is a better representation of the sound wave.

But here’s the thing: the difference is only valid for the first few times the record is played. After that, the needle wears the vinyl down, and the sound becomes worse than a CD, which gives the same playback over and over.

Also, most people can’t tell. That doesn’t mean there aren’t SOME people who can tell, but most people can’t.

A lot of things are marketing hype, but it works. I mean, people still pay three times as much for Tylenol or Advil instead of generics ’cause they are convinced the generics don’t work as well.

It’s not black and white. The average person can hear the difference between a 128k MP3 and a CD under the following circumstances:

- They can A/B the two sources
- The material contains sounds that reveal the limitations of MPEG compression
Take a sample from a CD that features applause and play it, then encode the same sample to 128k MP3 and play that: most folks will hear the difference. The CD will sound like applause, and the MP3 will sound like bacon frying.
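If you want to run that test on yourself, here’s a hedged sketch of a self-administered ABX trial (it assumes the `lame` encoder is on your PATH and that you have a CD-quality `sample.wav`, e.g. applause; the file names are placeholders):

```python
# ABX sketch: round-trip a WAV through 128 kbps MP3, then test whether
# you can reliably tell the lossy copy from the original.
import random
import shutil
import subprocess

# Encode to 128 kbps MP3, then decode back to WAV so both files play the same way
subprocess.run(["lame", "-b", "128", "sample.wav", "lossy.mp3"], check=True)
subprocess.run(["lame", "--decode", "lossy.mp3", "lossy.wav"], check=True)

files = {"A": "sample.wav", "B": "lossy.wav"}
score, trials = 0, 10
for i in range(trials):
    x = random.choice(["A", "B"])   # X is secretly A or B each round
    shutil.copy(files[x], "X.wav")  # listen to X.wav without knowing which it is
    print(f"Trial {i + 1}: listen to sample.wav (A), lossy.wav (B), then X.wav")
    guess = input("Is X the same as A or B? ").strip().upper()
    score += (guess == x)
print(f"{score}/{trials} correct; around {trials // 2}/{trials} is coin-flip territory")
```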

Here’s something I wrote in another thread that got a bit lost (the thread was about HDMI cables):

Ethan Winer published an amazing and eye-opening article about this that I saw linked from Mix magazine, a trade journal for sound professionals. He shows that a 4-inch difference in listening position can cause dips and peaks of 3 and even 6 dB at various frequencies.

This is why audiophile tweaks don’t show up in double-blind “A/B/X” tests, but do show up in longer, subjective listening tests where the listener gets up, changes the cable or piece of electronics, and sits back down. Unless their head is strapped down like Alex in “A Clockwork Orange”, it is going to be in a different position. It’s not that the audiophiles are fooling themselves: they do hear a difference, but it’s not caused by what they think it’s caused by.

…and…

The same issue of Mix had an article where they took some super-high-bitrate recordings and ran them through an A/B/X comparator. Source A was the high-resolution source’s analog out; source B was the same signal run through a very good CD recorder in monitor mode. The first came from DVD-Audio and Super Audio CD discs, while the second was limited to 16 bits at 44.1 kHz, the CD standard. The two sources were level-matched to 0.01 dB. The trials were done in a wide range of locations with a large number of listeners, including recording and mastering engineers, males and females, for a total of 554 trials.

The results?

The number of times out of 554 that the listeners correctly identified which system was which was 276, or 49.82 percent — exactly the same thing that would have happened if they had based their responses on flipping a coin. Audiophiles and working engineers did slightly better, or 52.7-percent correct, while those who could hear above 15 kHz actually did worse, or 45.3 percent. Women, who were involved in less than 10 percent of the trials, did relatively poorly, getting just 37.5-percent right.

Most of these tests were conducted at professional audio events like the AES convention.

As for the claim that LPs have greater dynamic range than CDs? Cite, please! Because my link states that CDs can have more than 90 dB of dynamic range, while a virgin pressing starts at about 60 dB and rapidly drops from there as the noise floor rises. Coupled, of course, with the fact that this theoretical 60 dB depends on a short recording to accommodate the wide groove pitch.

I disagree. The amount of error (THD, quantization error, etc.) in modern digital audio systems is so low that I doubt anyone can reliably hear it.

This is easily proven, but the method of proof won’t be accepted by the analog partisans. If you feed an original sound source simultaneously to a record lathe and a CD recorder, you can easily compare the two waveforms and see that the CD captures the original sound accurately. Where this all falls apart, of course, is that the only way to view the waveform is by feeding it into one of those evil digital machines. Audiophile claims are subjective (disdained by engineers as “golden ears”) while the opposition is objective (disdained by audiophiles as “meter readers”). There is no possibility that the two camps will ever see eye to eye.

It’s pretty much moot at this point. Virtually all recordings are digital these days. The studios that still have analog tape machines use them as an effect, running the drums on a rock track to a 24-track to get tape saturation. Even if they track to analog (an expensive proposition), they’ll still digitize at some point to edit. Anyone who has cut analog tape with a razor blade does not miss it at all.

Actually I do, but just a little bit. I miss the tactile aspect of shuffling the tape. However, this is 100% subjective. Where actual work is concerned, digital editing is far, far superior to tape and razor blade. Still, I’m both glad I had the chance to actually splice tape and glad I don’t have to do it anymore.

To add a bit to the elaboration on digital reproduction: the Shannon sampling theorem applies to time-sampled signals. It proves that any signal of finite bandwidth can in theory be reconstructed perfectly from its samples, given a sample rate of at least twice the bandwidth. The distinction here is that those samples can take any level (i.e., they are analog). Digital samples must take one of a finite set of values (that’s the quantization), and hence cannot reproduce the analog signal identically. In practice, the error is usually designed to be below the level where it would be possible to notice.
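For completeness, the reconstruction the theorem guarantees is the Whittaker-Shannon interpolation formula, with sample period $T = 1/f_s$ and signal bandwidth below $f_s/2$:

$$x(t) = \sum_{n=-\infty}^{\infty} x[n]\,\operatorname{sinc}\!\left(\frac{t - nT}{T}\right), \qquad \operatorname{sinc}(u) = \frac{\sin(\pi u)}{\pi u}$$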

The only way this might possibly be true is if you’re using an optical record player and ignoring frequencies below 100 Hz, and even then I seriously doubt the number. A CD has an ideal dynamic range of 96 dB (20 × log₁₀(65536), to be precise) and it’s possible on any given CD even if they aren’t all mastered that way. I’ve never heard of a vinyl record that had 106 dB of dynamic range overall. The vast majority of vinyl LPs in use barely have 60 dB even fresh. Granted, it’s again largely at the low end (there’s noise down to subsonic from the turntable moving).

There is one thing of importance to note: much of the noise on an analog playback is very easy for the brain to filter out (not clicks, but things like tape hiss). Although it takes more noise to do the damage, errors can truly mess up a digital signal, and the brain can’t recover it nearly as well, if at all. Most of us have listened to radio broadcasts or watched television with severe static at some point, but we probably don’t recall the noise. The brain has some remarkable signal processing ability. Consider the success of perceptual lossy compression like MP3, which can still sound tolerably decent while throwing out 80-90% of the information in a CD-quality signal.
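That 80-90% figure checks out arithmetically. CD audio carries

$$44{,}100\ \tfrac{\text{samples}}{\text{s}} \times 16\ \tfrac{\text{bits}}{\text{sample}} \times 2\ \text{channels} = 1{,}411{,}200\ \text{bits/s},$$

so a 128 kbps MP3 keeps roughly $128/1411 \approx 9\%$ of the raw bits, and 192 kbps roughly 14%.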

(Anecdotally, though: I even have tinnitus in one ear and can’t hear much above 16 kHz, but using ABX tests on my own material I need at least a 256 kbps MP3 (or 192 kbps AAC) to fool my ears reliably.)

A problem with these sorts of tests, though, is that they don’t take the hypersonic effect into account. This is a phenomenon that is not well understood, but people appear to be unconsciously sensitive to frequencies above the threshold of hearing. Oohashi et al., who conducted the hypersound experiments, noted that the usual A/B tests are flawed because the samples, and the pauses between them, are too short. If you do such tests, you should listen to a whole piece (theirs was 200 seconds long) and allow a few minutes for your brain to rest between listenings. Even then, the difference will be described in rather subjective terms, like “rich”, “smooth” or “warm”. However, EEGs show there is a real response to sounds that are too high for you to consciously perceive.

Where I feel this may be relevant to the LP vs. CD discussion is that the CD will, by design, almost completely lack any hypersonic material. While the size of the needle tip limits the range of frequencies that can be accurately reproduced, a turntable should nevertheless reproduce some high-frequency energy, which may play a role in explaining why some feel vinyl sounds “warmer”.

I’m not trying to be the guy who claims the golden ear and the magical stereo equipment. But I think anyone with a decent pair of headphones and a decent ear can fairly reliably (80%+) tell the difference between a 128k MP3 and CD quality, without even having A/B sources.

I have a large MP3 collection spanning various years; I’ve tried to do 192k+, but in the early years nearly everything was at 128k. I’ll set my collection on random, something will come up, and I’ll think “ugh, this has to be 128”, and I’m usually right. Part of the issue is that encoding technology has improved over the years at the same time as average bitrates have gotten higher, but 128 is just harsh enough that even very common sounds (cymbals sound washy, for example) are usually noticeably modified.

It’s a continuum, though. We’d probably agree that almost everyone could hear the difference between a 32 kbps or 64 kbps MP3 and a CD, right? Most people could tell at 96. 128 is where you start to be able to hide it if you listen on something like stock earbuds or $20 speakers. I’d probably need an A/B comparison to tell the difference between most songs at 192 kbps and CD quality, and I probably wouldn’t be able to tell the difference often at 256.