The sound quality of "Playhouse 90"

I’m watching something on AMC right now in which they excerpted scenes from a 1959 episode of “Playhouse 90”. The program itself is of a serious nature, but I couldn’t help noticing when they showed these excerpts that the sound quality was stunning! I mean, it sounded almost like a CD. I contrasted this with the commercials shown intermittently during these excerpts, and they sounded like what I would expect something from the late fifties to sound like.

What, in this instance, caused the recording to sound so good? I’m literally shocked.

Playhouse 90 is available on DVD, and I’d WAG that the audio had been cleaned up for the DVDs and that whoever put the show on AMC used the masters made for the DVDs as their source.

That’s true: the sound quality of old 2-inch Ampex videotapes has a surprising brilliance. The reason is fairly simple: those tapes ran through the videotape recorder at 15 inches per second, the same speed used for studio-quality audio recordings. The faster the tape speed, the greater the frequency and dynamic range possible. The broad bandwidth required for a broadcast-quality television signal is what necessitated the 2-inch tape width and the high recording speed; the improved sound quality was a side benefit.
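
As a rough back-of-the-envelope illustration of why speed matters: on playback, the head’s gap sets a shortest recoverable wavelength, so the frequency ceiling scales directly with tape speed. This is only a sketch, and the 2.5-micron head gap below is an assumed, typical figure, not a spec for the Ampex machines.

```python
# Gap-loss sketch: playback output nulls out when the recorded wavelength
# shrinks to the size of the head gap, so the usable frequency ceiling
# rises in direct proportion to tape speed.
# The 2.5-micron gap is an assumed, illustrative value only.

GAP_M = 2.5e-6          # assumed playback head gap, metres
IPS_TO_MPS = 0.0254     # inches per second -> metres per second

for ips in (1.875, 3.75, 7.5, 15.0):
    speed_mps = ips * IPS_TO_MPS
    f_null_hz = speed_mps / GAP_M      # first extinction frequency, Hz
    print(f"{ips:6.3f} ips -> gap null around {f_null_hz / 1000:6.1f} kHz")
```

At the slow speeds used in consumer recorders the ceiling sits barely above the audible range, while 15 ips leaves enormous headroom, which is consistent with the “near-CD” sound on those 2-inch masters.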

The ironic thing is that the low quality of speakers installed in television sets before the 1980s gave little indication of the high fidelity sound available from these 2-inch tapes (which began to be phased out in 1978 with the introduction of an improved 1-inch system). Only now can we appreciate them.

You can hear a two-minute sample of this near-CD sound quality from a videotape of An Evening With Fred Astaire (1958). And this was just a pick-up recording made by an NBC affiliate in Wichita!

You can experience more of the audio brilliance of the old Ampex system with the DVD releases of Mary Martin in Peter Pan (1960) and The Judy Garland Show (1963) (note the several mentions of high sound quality in the customer reviews).

That Fred Astaire sample is astounding. What happened—did television just degrade in quality after about 1961 or so?

I feel that a similar phenomenon happened with music as well. Recordings on those Time Life Fabulous Fifties compilations sound about as clear as this Playhouse 90 example; then, all of a sudden, the quality of Top 40 seems to drop dramatically to fuzzy crap, essentially, until new noise reduction came out around 1968, when it began a slow climb back to the top, depending on whom you ask.

Thanks for all the help and recommendations.

I don’t know where you pulled that 1961 year from (The Judy Garland Show that I just referenced above was from 1963). The Ampex videotape recording system, with a 2-inch master recorded at 15 ips, was the standard for network broadcasting from 1956 to 1978. Which means that the masters for videotaped shows like The Ed Sullivan Show, Rowan & Martin’s Laugh-In, The Carol Burnett Show, and All in the Family have the same potentially high sound quality as the Fred Astaire and Playhouse 90 clips you saw. It depends on whether the distributor has gone back to the original master tapes and not relied on an intermediate master of inferior quality.

The technology didn’t change (if anything, it was slowly improving), but tastes in sound mixing did. The Phil Spector “Wall of Sound” (which mixed everything but the kitchen sink into one big glurge) became popular, as did false reverb. The latter was especially bad on the early Beatles records released in the U.S. by Capitol, until the Beatles got control with their own label in '67. The same recordings on British or Japanese pressings from the same era are sparkling clear.

Wasn’t there a problem in the early 80s where the early digital machines weren’t encoding things at the bit-rate they were supposed to, and were, in fact, encoding it at a lower bit-rate? Certainly, a lot of recordings from that era don’t sound quite right.

That sounds awfully fishy, Tuckerfan. CD audio has no compression, so the bit rate is fixed, determined by the sampling frequency and the bits per sample. Change either of those and it won’t sound right on playback. I remember reading reviews of some early CD releases of existing recordings, done by engineers not experienced with CD audio, that sounded much worse than the vinyl originals.

I heard it from an audio engineer and he was talking about studio equipment.

I’m sure you did, Tuckerfan. My point is that variable bit rates didn’t exist until real-time compression formats like MPEG came along. CD audio has no compression, and a single, invariable sample rate and bit depth. I’m not certain how something could be recorded at a “lower bit rate” without making the playback sound like it was on AM radio or having incorrect pitch, both of which would have been immediately obvious.
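
Just to put a number on that fixed rate: Red Book CD audio is 44.1 kHz, 16-bit, two-channel PCM with no compression, so the stream rate always works out to the same figure. A quick sanity check of the arithmetic:

```python
# Red Book CD audio: fixed sample rate, bit depth, and channel count,
# with no compression, so the bit rate is a single multiplication.

SAMPLE_RATE_HZ = 44_100    # samples per second, per channel
BITS_PER_SAMPLE = 16
CHANNELS = 2

bit_rate = SAMPLE_RATE_HZ * BITS_PER_SAMPLE * CHANNELS
print(f"{bit_rate} bits/s = {bit_rate / 1000:.1f} kbit/s")   # 1411200 bits/s = 1411.2 kbit/s
```

There’s no knob to turn that would quietly lower that rate; changing the sample rate or bit depth would show up immediately as wrong pitch or obviously reduced bandwidth, as noted above.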

Purely a ballpark figure (slightly influenced by the end of Playhouse 90’s run) based on ill research and lack of sleep. Sorry. :slight_smile:

And yes, that was what happened; the stinkin’ Wall Of Sound. Gah. I had blissfully forgotten. :stuck_out_tongue:

By false reverb, do you refer to plate and spring, but not echo chambers?

Ah, old videotape is awesome.

I don’t know what method EMI and Capitol used for fake reverb, just that they went overboard on the U.S. releases. Some discussion here.

Another bad invention was fake stereo, which RCA used on the old vinyl versions of Elvis Presley’s greatest hits, vols. 1-3.