Questions About Videotaped v. Filmed Sitcoms

To me at least, sitcoms filmed on videotape look fresher longer. Seventies classics like All in the Family and Sanford and Son look much better in repeats than their contemporaries Mary Tyler Moore and Bob Newhart, while Golden Girls and Roseanne look decades newer than their contemporaries Designing Women and Murphy Brown.

Is mine the minority view? If not, why so little love for video these days? (Offhand I can’t think of a single current sitcom that’s on video; I’m sure there are some, but The Daily Show/The Colbert Report are the closest I can come up with.)

Is video more expensive?

Several sitcoms that are still in first run already have that not-so-fresh look to them, and it makes me wonder how well they’ll do in syndication in a few years.

Did you mean to say single-camera sitcom? Actually, I don’t know what you’re getting at even if you did.

Most, if not all, television is filmed on digital cameras now, so there is no video vs film anymore. The differences between shows now lie in colour grading or lens settings.

Film vs videotape??? This is the 21st century, you know!

A distinction you might be seeing in modern TV is that between the look of the evenly lit sets of “single-camera” shows like 30 Rock or Community vs. the more traditional, stagey, filmed-before-a-live-audience look of “multi-camera” shows like Big Bang Theory or How I Met Your Mother.

IMDb lists several things as still being shot on 35mm negatives, including stuff like 30 Rock.

The most obvious difference to me is motion. Film is a series of still pictures with blur on moving subjects; video is fluid, as the bottom of the field is scanned roughly 1/60th of a second (NTSC) after the top on any given retrace. The contrast is obvious once you know what you’re looking at, especially within a show that uses video for studio sequences and film for location shoots. The folks restoring old Doctor Who serials for the BBC invented an entire processing technique to restore video motion to shows that existed only in kinescope (i.e., film) archive or sale copies.
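If you want to see the mechanism, here’s a toy numpy sketch – entirely my own illustration, assuming NTSC’s ~1/60 s field interval – of how weaving two fields captured a field apart produces the comb-tooth edges you see when interlaced video is frozen or badly deinterlaced:

```python
import numpy as np

H, W = 12, 24           # toy frame size
FIELD_DT = 1.0 / 60.0   # assumed NTSC field interval

def render(t, speed=120.0):
    """Toy scene: a 4-pixel-wide white bar moving right at `speed` px/s."""
    frame = np.zeros((H, W))
    x = int(t * speed) % W
    frame[:, x:x + 4] = 1.0
    return frame

top = render(0.0)            # field 1: scanned first
bottom = render(FIELD_DT)    # field 2: scanned one field interval later

interlaced = np.empty((H, W))
interlaced[0::2] = top[0::2]     # even scanlines come from field 1
interlaced[1::2] = bottom[1::2]  # odd scanlines come from field 2

# Frozen, the bar's edges zigzag between adjacent lines: classic combing.
for row in interlaced:
    print("".join("#" if v else "." for v in row))
```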

You may also be seeing the contrast in the relative picture quality of film cameras vs video cameras in any given era. Through at least the '70s, vidicon cameras were prone to things like overloading and clipping the signal in any excessively bright parts of the picture, which is why you sometimes see dark blotches in the middle of pyrotechnics or fire practicals.

I don’t have a verified cite for which U.S. sitcom was the last to use the “videotaped look”, but it may well have been one of the UPN or WB series. Don’t think there were (m)any on ABC, CBS or NBC after 1999.

Part of the shift towards film (or in the case of multi-camera sitcoms, the filmed look) might’ve had to do with the kind of sitcom each format became associated with: tape was for kid-oriented and/or lowbrow shows like ALF, Full House, Home Improvement and the like, while film was for the more ambitious, trendy productions like Cheers, Seinfeld and Friends. That division was never true in all cases (Roseanne was on tape and Family Matters was on film, for instance), but eventually, I guess, if producers wanted their new show to click with 20-somethings, they didn’t want the visual look to remind viewers of DJ, Stephanie and Michelle.

Grounded for Life started out as a single-camera show on Fox (2001-2003); then when it was picked up by the WB (2003-05) after Fox canceled it, it became a multi-camera show, though I’m not completely sure whether they used videotape or not. The show underwent a lobotomy during its run and I lost interest in it.

I’ve always assumed that the old filmed sitcoms like Mary Tyler Moore were shot in 16mm instead of 35mm, but I don’t know for sure.

Not true any more. My semi-pro grade camcorder can simulate 24fps for that cinema look, record 60i (interlaced, just like standard broadcast), and 30p (progressive). More professional cameras, like sports units, have even more options, and once footage is recorded, it can be edited down to other formats. I can make motion look smooth or jerky.
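To make the “smooth or jerky” point concrete, here’s a back-of-envelope sketch – my own toy code, not anything out of a real camera – of what “editing down” a 60p capture to a 24p cinema look involves: picking the nearest source frame for each output frame.

```python
# Toy sketch (my own invention, not any camera's firmware): decimating a
# 60p capture to a 24p "cinema look" by taking the nearest source frame
# for each output frame.

def decimate(src_fps, dst_fps, n_dst):
    """Indices of the source frames nearest each destination frame."""
    return [round(i * src_fps / dst_fps) for i in range(n_dst)]

print(decimate(60, 24, 8))   # -> [0, 2, 5, 8, 10, 12, 15, 18]
```

The uneven 2/3 stride between those indices is part of why faked film motion can read as slightly jerky compared to true 24fps capture.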

It’s easy to find out. MTM, for example, was shot on 35mm. Which I would have expected, as it was one of the premiere sitcoms of the Tiffany Network.

Bah. Not “premiere,” “premier.”

Sometimes I want to punch this language in the nose.

I should have known that information was somewhere on the Innertubes. (I assumed it was shot in 16mm because it’s one of the shows that really looks dated now in terms of picture quality.)

Go right ahead, it has it coming. “English doesn’t so much borrow from other languages as follow them into a dark alley, mug them and rummage through their pockets.”

This is the pilot to Roseanne. The credits and opening are filmed in a flat way; the show itself (starting at :48) is much different. What is the difference called? (Today, most TV shows look more like Roseanne’s opening credits and less like the actual show.)

So to avoid further nitpicking, let’s find the technical name for the different styles. I believe these differences were called *film* and *videotape* in the 20th century, which is when the shows referenced in the OP were recorded, and which is why I used 20th-century terminology. (I’m actually aware that videotape is an obsolete term, but, again, I believe it would be the correct one to use in describing a show from the ’70s or ’80s that was videotaped.) My understanding is also that some shows were videotaped and then transferred onto film, and while I’m sure this is no longer the process, the “look” of that process still exists.

Almost every hour-long drama is recorded on what I’m referring to as the “film” finish. Almost all news programs, talk shows, awards shows, and daytime soap operas are broadcast on what I’m referring to as the “video[tape]” finish. Many sitcoms were presented in one, and many in the other.

Following are two groups. One has what I’m referring to as the “video” style, the other as the “film” style.

STYLE A
Laugh In/1960s
All in the Family/1970s
Golden Girls/1980s
The Nanny/1990s
Absolutely Fabulous/21st century

vs.

STYLE B
Bob Newhart Show/1970s
Designing Women/1980s
Murphy Brown/1990s
How I Met Your Mother/21st century

All of the above were shows filmed in front of a live audience. One looks more like a home movie, the other like a cinematic movie, at least to me. Am I the only person who can see the difference in the footage?* And if not, does anybody know what the two types are called?

*I don’t think I’m alone, but I do have ocular migraines that cause me to see auras occasionally, so perhaps.

I always love it when these film vs video threads pop up!

I think the issue you’re concerned with is the way older filmed shows look compared to how older videotaped ones do. It’s fairly complicated and I’m no expert, but I think the key issue is that videotaped shows were initially mass-copied and distributed onto other videotapes which, although a generation down, are still direct ‘electronic to electronic’ duplications.

By the 1970s filmed shows were also duped and distributed via videotape, but the initial transfer is done via a device called a **telecine** (a video camera recording a film projection), and these devices can vary greatly in quality. Therefore copies of older TV shows originally shot on film but rerun by local TV stations from videotaped copies often suffer quality degradation more than those made strictly on video, because those dupes are always ‘electronic only’.

BTW, a **telecine** is sort of the opposite of a **kinescope**, which is a film camera filming a television monitor and is the device you’re thinking of in the above post. Kinescopes were the only way to record television programming before the advent of video recording, and they were obsolete by the 1970s.
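A side note for anyone curious about the NTSC flavor of telecine: fitting 24 film frames per second into 60 video fields per second uses the standard 2:3 (“3:2”) pulldown cadence. Here’s a minimal sketch of the mapping – the field labels are illustrative, not any real machine’s output:

```python
# Hedged sketch of 2:3 ("3:2") pulldown, the cadence NTSC telecine uses to
# fit 24 film frames/s into 60 video fields/s; labels are illustrative.

def pulldown_23(frames):
    """Map 24p frames to a 60-fields/s sequence of (frame, field) pairs."""
    fields = []
    for i, frame in enumerate(frames):
        count = 2 if i % 2 == 0 else 3   # alternate 2 fields, then 3
        for _ in range(count):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

# Four film frames -> ten video fields (4 * 2.5 = 10), i.e. 24 fps -> 60 fields/s.
for pair in pulldown_23(["A", "B", "C", "D"]):
    print(pair)
```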

Nifty! I knew there were productions that shot digital and filmized in post, like new Doctor Who, but I didn’t realize it was in-camera now. That’s pretty awesome.

At first glance, I’d call it lighting, camera motion, and depth of field. The opening is lit lower and filmed by one camera moving around the table; the show is brighter and uses a multi-camera setup that cuts from one viewpoint to another. “Depth of field” is photographer talk for “the range of distances from the camera in which stuff remains in focus.” The opening has a shallower depth of field: the background is slightly blurry, while the subjects in the foreground are in focus. The show is shot with both foreground and background sharp. Roseanne was a fairly big-ticket show, so I’d guess everything was done on film, but Flash video both commits and conceals a multitude of sins, so it’s tough to tell.
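For anyone who wants to put numbers on the depth-of-field point, here’s a quick calculator using the standard thin-lens approximations; the 50 mm lens, f-stops, and subject distance below are made-up examples, not anything from the actual Roseanne shoot:

```python
# Back-of-envelope depth-of-field calculator (standard thin-lens
# approximations); all example numbers are invented for illustration.

def dof_limits(f_mm, N, s_mm, c_mm=0.025):
    """Near/far limits of acceptable focus.
    f_mm: focal length, N: f-number, s_mm: subject distance,
    c_mm: circle of confusion (~0.025 mm for 35 mm film)."""
    H = f_mm**2 / (N * c_mm) + f_mm          # hyperfocal distance
    near = s_mm * (H - f_mm) / (H + s_mm - 2 * f_mm)
    far = s_mm * (H - f_mm) / (H - s_mm) if s_mm < H else float("inf")
    return near, far

# Wide open (f/2) vs stopped down (f/8) on a 50 mm lens, subject at 3 m:
for N in (2.0, 8.0):
    near, far = dof_limits(50, N, 3000)
    print(f"f/{N}: sharp from {near/1000:.2f} m to {far/1000:.2f} m")
```

In that example, stopping down from f/2 to f/8 roughly quadruples the zone of acceptable focus, which is the difference between the blurry-background opening and the everything-sharp multi-camera look.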

Actually, it isn’t. Productions that use tape just use digital storage systems instead of analog. Essentially, they’re computer data tapes, but they’re still physically tapes. DigiBeta is also still used for some editing and transfer stuff, especially when working with older material which may be coming from formats like Quad. Compressed video is more difficult to edit cleanly, and uncompressed video makes for enormous files, so sometimes it’s still the more cost-effective medium.

Sort of. For a significant period after video recording was invented, videotape was horrendously expensive. Rather than buy oodles of tapes for shooting and also oodles of tapes for transmission masters, production studios only bought one oodle, for tx masters, and bought maybe a quarter-oodle for shooting on, which were wiped and reused continually. Where the film came in was a process called kinescopy, in which a recording of the first transmission was created by literally pointing a 16mm camera at a small TV screen and filming the program as it ran. These 16mm prints were then used for time-shifting (in large countries like the US) or for sale to overseas broadcasters, which was especially handy when you were selling stuff to countries that used a different transmission standard than you did. Videotapes aren’t interchangeable between PAL and NTSC; 16mm film is universal.

Kinescoped shows do have a ‘look’ to them – typically zoomed in, crushed contrast, and either blur or moiré depending on how well the camera coped with the line structure of the monitor – but I doubt that’s what you’re talking about unless you’re going back to things like The Honeymooners. Shows that were shot on 35mm film in the first place would just have had reduction prints struck in the 16mm format for sale.

All of the things on both lists are multi-camera setups, where the cameras are on dollies and can pan and zoom to follow action. The older shows tend to have high-key (bright, crisp, pastel) lighting schemes; I’m not sure if that was a function of the camera or just the style at the time, but it’s common in things from the 1960s and early '70s. Your A list generally does seem to be on video, and your B list on film (except for HIMYM, which shoots full digital HD – AbFab may also, I’m not sure).

The way I tell the difference, especially with questionable compression like YouTube, is mostly down to what kind of noise the picture has. VT noise in many cases scales with the content of the video signal, and is horizontal in nature – dropouts on the media cause white flashes that run along the scanlines. (Where the white streaks are and how they move with the picture can actually tell you what kind of video the show was shot on, or at least what kind of stock it’s been sourced from. Quad dropouts look different from 1" tape dropouts, which look way different from home VHS dropouts.) Film grain, inherent in the stock, is a haze distributed evenly across the frame, and physical damage to film is either irregular blotches or runs vertically (i.e., lengthwise down the strip, as it travels through the camera or projector); it can be black, white, or colored, depending on whether the film is B&W or one of many different color systems, and whether the dirt or damage happened to the negative or a positive print.
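If it helps to visualize, here’s a toy simulation of that distinction – even film-grain haze versus horizontal tape dropouts – in numpy. The noise level and streak counts are arbitrary numbers of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(0)
H, W = 480, 640
frame = np.full((H, W), 0.5)          # flat mid-gray test frame

# Film-style noise: grain is a haze spread evenly over the whole frame.
film = np.clip(frame + rng.normal(0.0, 0.04, (H, W)), 0.0, 1.0)

# Tape-style noise: dropouts flash white *along* scanlines (horizontal).
tape = frame.copy()
for _ in range(20):
    row = rng.integers(0, H)
    start = rng.integers(0, W - 60)
    tape[row, start:start + rng.integers(10, 60)] = 1.0

print("film noise std per row:", film.std(axis=1).mean().round(3))
print("rows touched by dropouts:", int((tape == 1.0).any(axis=1).sum()))
```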

The two also deal very differently with things that are reflective or ‘whiter than white’. While reflective whites will bloom, or have a kind of a glow that runs over the borders of the object, on both formats, older TV cameras had a difficult time dissipating the small spots of illegally-high signal on the sensor before that spot got scanned again, so bloomed whites (or bright colored lights, like fireworks or flares) would also have a comet trail behind them as they moved, where charge remained from the previous field. Because of finite (low) bandwidth, video signals also have a variety of interesting problems with colors smudging into each other, spurious color in areas of the picture with fine high-contrast detail, checkerboard dots showing up when two high-contrast colors meet, etc. – the exact ones depend on what broadcast standard and format you use. For me, at least, most of the real giveaways are down to limitations of the different media.
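That comet-trail effect can be modeled as a simple first-order lag: some fraction of the charge survives each field instead of dissipating. Here’s a made-up toy model, not real tube physics:

```python
import numpy as np

# Invented first-order lag model (illustrative only): a vidicon-style tube
# dissipates charge slowly, so a bright dot moving one pixel per field
# leaves a fading "comet tail" behind it.

W, LAG = 20, 0.5        # LAG = fraction of charge surviving each field
sensor = np.zeros(W)
for field in range(6):
    sensor *= LAG                 # old charge decays but doesn't vanish
    sensor[field] += 1.0          # dot lands one pixel further each field
    print(" ".join(f"{v:.2f}" for v in sensor))
```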

(Don’t even get me started on compressed digital formats.)

Aside from that, up until recently film had a better dynamic range than videotape, so film productions could have blacker blacks and whiter whites in the same picture, although not all of them bothered. There was also previously a huge difference in overhead and setup time for the two kinds of shoot – video was used for things that had to be done in a hurry, like news and soap operas, where there just wasn’t enough time to lock down a film camera, shoot, and get rushes developed in time to do anything about them.

I also note that everything on your lists except the very new AbFab is an American production. Different studios had ver-r-r-r-ry different standards for lighting and set design sometimes. For pretty much anything done prior to DigiBeta, BBC staging and lighting design were unmistakable, and ITV only slightly less so.

(…I should just give up and write a book on this…)

(Missed the edit window – double post! Sorry.)

Both terms are used for both directions. Kinescope/kinescopy is more common in the US and telecine as both noun and verb is more common in the UK. You’re correct that they were generally obsolete as a storage method by then, as videotape had come way down in price, but the BBC was still making black and white 16mm telecine copies of shows like Doctor Who well into the 1970s for overseas sale. The Restoration Team working on the broadcast and DVD releases of old serials had a heck of a time with a couple of early '70s stories which were originally shot in color, because the BBC had junked their videotape copies and the only extant complete copies of some episodes were the overseas telecine prints.

You can still find people with kinescopes if you look around hard enough. Shops that specialize in transferring old Super8 home movies onto VHS or DVD sometimes have a similar setup with a DVcam or camcorder instead of a traditional TV camera.

Shows that were produced on film long ago look bad basically because the color dyes have faded and the celluloid has gotten scratched/broken/cut-and-pasted over decades of wear.

From what I understand, the initial quality of the film stock had a great deal to do with this. I’m sure that TV shows, with their limited budgets, were often forced to use film that wouldn’t stand up as well as the stuff used for blockbuster motion pictures (but even they deteriorate as time goes by).

Nowadays, prints of “Gilligan’s Island” (for example) can be restored to the brilliance they first had in the 1960s using digital technology. If you watch the documentary clips for the remastering of ST: TOS on YouTube, you can see the difference this makes.

I don’t know how long the signal on a videotape will last, but I suspect most of the older shows that were produced on tape exhibit similar deterioration of the picture and sound.

Kinescope was very primitive technology. The trick was synchronizing the film camera’s shutter with the scan rate of the screen to minimize visible banding from the scanning.

Depends on storage conditions. Broadcast tapes, which were sturdier than home formats, can survive decades, if they’re archived properly and not played often. The BBC Film and Videotape Library has archives that go back to the 1960s (although they’re a bit spotty, prior to the mid-'70s) that include 405-line black and white tapes that are still in playable condition. The bigger challenge these days is finding a machine that works well enough to run them, in fact. VHS tapes top out at 20-25 years, ish. If you had 8mm home movies transferred onto VHS back when VHS was new, you’d have better luck getting a DVD made from the original film stock than the degraded tapes.

How durable film is depends on the stock and emulsion used. Nitrate film has a nasty tendency to burst into flame, but acetate film quietly degrades into flakes and vinegar if left alone too long. Polyester film seems all right so far, but we’ll see. It also depends on whether you’re storing a negative or a positive print, particularly if you’re trying to keep a film made with one of the early color processes in viewable condition. Before multilayer (monopack) color stock, a lot of attempts at color involved shooting different color separations on separate strips of negative, tinting or dyeing the prints, and then cementing the layers of color together to make the distribution prints. Stuff laminated together with glue tends to come unglued over time, especially when it’s run repeatedly through a hot thing like a projector.

I remember reading about restoring Laugh In. Apparently it was all shot on 2" quad, but the machines were so expensive, and the loss from dubbing from one machine to another was so noticeable, that the show was edited exclusively with razor blades, using ferrofluid to visualize the recorded tracks.

Wow! Wow. Wow. Every one of your posts in this thread could be described by many posters as “too long, didn’t read”. You are a gem. Your knowledge of the media of capture and subsequent conversion for distribution of television material blows me away. I will return to this thread later to read what you have had to say in more depth, but for the moment let me say thank you for posting, and dammit, bless the Dope as a forum for having posters like you who can describe the technical aspects of a question like this as engagingly as you have. Seriously, thank you!!

P.S. I think I love you!