Why do some TV programs look "funny"?

Not necessarily. I don’t know how Friends is shot, but most sitcoms use a “three-camera” setup, where the actors go through a scene while three cameras are filming/videotaping them simultaneously, each from a different angle. The three segments can then be edited and spliced together later, according to the whims of the director.

Because of this setup, most scenes are shot only once or twice, which saves money. A good lighting director will accommodate this by having the set lit to look good from all angles.

The three-camera setup was first pioneered on the I Love Lucy show, IIRC.

I always found the original Twilight Zone to be a classic example of film v. video. Occasionally you’ll get an episode that is so obviously video that it’s jarring.

Sorry, I’ve nothing more to add. But those episodes always bug me.

Well, in the days of the original Twilight Zone, video cameras were different beasts.

Instead of CCDs, they used vidicon tubes, which are essentially imaging photomultiplier tubes.

And in the early days of vidicon cameras, they had difficulty dealing with non-optimal image conditions such as saturation. So you’ll notice dark halos surrounding bright lights and reflections, where saturation effects suppressed the gain of the pixels surrounding the saturated elements. Very early ones also had slow response times and bad image persistence problems. Thus, fast action subjects were blurry, and it forced pans to be kept slow and gradual.

Worse, they could be permanently damaged by bright lights. I remember as late as the 1980s that we had to be careful with video cameras and not point them at the sun, or even at overhead lighting, for fear of burning blind spots and afterimages into the vidicon tube.

All these factors contributed to the image quality of videotaped programs, and for all practical purposes limited their usefulness to stage and studio environments where lighting and reflections could be strictly controlled.

Erp… by “their” in this sentence, I mean the early video cameras of the 50’s. I should have been clearer.

Obviously, by the late 60’s, outdoor sporting events were routinely covered by video cameras.

Some of the differences discussed here are simply the differences between film and video. Other differences are the ones between old video cameras using tubes and newer ones using Charged Coupled Devices (CCDs).

The key difference is contrast ratio, that is the ability of the medium to simultaneously show detail in shadow areas and highlight areas. Film has a higher contrast ratio than video tape, but tape will one day get there. CCD cameras have a better contrast ratio than the old tube cameras.

The reason “excerpt” scenes and “making of” videos look different from the actual feature film on DVDs is that the finished film has been carefully treated to optimize the final print. Most notably it has been “color timed”. That means that every single shot in the movie has had the saturation/contrast/brightness/hue adjusted individually to make one shot look consistent with the others around it and to generally improve the overall look.

Also, most excerpts from films are merely “work prints” or “dailies”: a rougher, unfinished version of the scene. Since the scene wasn’t included in the final version of the film, there was no reason to color time it. Therefore it doesn’t look as good. These days a lot of productions use video dailies, I think, so the “excerpt” scene might never have been transferred from negative to positive film at all; it may have just gone directly from negative to video tape. For dailies and work prints this is often done in what is called a “one-light” transfer. That means an average hue/saturation/contrast/brightness is chosen for the entire reel of film. Thus, some scenes are too dark, some too light, etc., but it doesn’t matter because it is only a temporary print. Later, when the final negative conforming (splicing all the negative clips together) has been completed, the film will be color timed. The difference is most obvious.
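For the curious, the difference between a one-light transfer and color timing can be sketched in a few lines of Python. This is just an illustration, not how grading software actually works: shots are modeled as lists of 0-255 pixel values, and the function names and brightness offsets are made up.

```python
def one_light_transfer(shots, offset):
    """One-light: a single brightness correction chosen for the whole
    reel, applied to every shot regardless of how it was exposed."""
    return [[pixel + offset for pixel in shot] for shot in shots]

def color_time(shots, per_shot_offsets):
    """Color timing: every shot gets its own correction so that it
    matches the shots around it."""
    return [[pixel + offset for pixel in shot]
            for shot, offset in zip(shots, per_shot_offsets)]

# Two shots, one underexposed and one overexposed (pixel values 0-255):
reel = [[40, 50, 60], [180, 190, 200]]

print(one_light_transfer(reel, 10))  # both shots shifted the same way
print(color_time(reel, [60, -70]))   # each shot corrected to match
```

The one-light version leaves the two shots mismatched; the color-timed version brings both into the same range, which is the point of timing each shot individually.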

When I say “excerpts” above, I mean scenes that never made it into the finished films, NOT clips from the actual finished film itself. The latter, of course, would be color-timed.

If you’re watching in an NTSC country like the US, British video will have been “standards converted” from 625-line 50Hz PAL to 525-line 60Hz NTSC, and that does nasty things to a picture (particularly to motion; scrolling captions can be unreadable), so video will look even softer compared to film.
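To see why motion suffers, here is a deliberately naive sketch of the temporal half of standards conversion: 50 fields per second don’t line up with 60, so most output fields have to be synthesized by blending two source fields. Real converters also rescale 625 lines to 525 and do much cleverer motion processing; this toy version only shows the resampling, which is what smears movement and scrolling captions.

```python
def standards_convert(fields_50hz):
    """Naive 50 Hz -> 60 Hz conversion: for each output field, linearly
    blend the two nearest source fields.  Each "field" here is just a
    list of pixel values."""
    n_in = len(fields_50hz)
    n_out = n_in * 60 // 50
    out = []
    for i in range(n_out):
        pos = i * 50 / 60                  # position in source fields
        lo = int(pos)
        hi = min(lo + 1, n_in - 1)
        w = pos - lo                       # blend weight
        out.append([(1 - w) * a + w * b
                    for a, b in zip(fields_50hz[lo], fields_50hz[hi])])
    return out

# 50 one-pixel "fields" of a moving edge become 60 blended fields:
source = [[float(i)] for i in range(50)]
converted = standards_convert(source)
print(len(converted))                      # 60
```

Because most output fields are mixtures of two moments in time, anything moving gets a ghosted double image, which is exactly the softness people notice in converted British video.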

Dramas may have been shot on film, and this can be transferred directly to NTSC, but sitcoms don’t usually have the budget.
(In older sitcoms, film was used for scenes shot outside 'cos old video gear wasn’t very portable; there tends to be a noticeable change in look between these scenes and the studio ones. But of course you’d still see the whole lot standards converted to NTSC.)

There was a Monty Python sketch about this. They were afraid to go outside because the house was “surrounded by … film!”

Ah, this comes up every few months on the SDMB boards: someone is trying to understand why the image looks different between certain television programs, and what they are describing is simply the visual difference between film and videotape.

The first videotape recorder was demonstrated in 1951, but the image quality was substandard. The first commercially successful VTR was the Ampex VRX-1000, which was introduced in 1956 at a selling price of $50,000 (over $320,000 in today’s money). A 90-minute reel of 2-inch wide videotape, run at 15 inches/second, cost $250 (about $1,600 today). A color model was introduced in 1958.

Some videotaped shows from the 1960s that you can see today are the original Candid Camera with Allan Funt (the host segments, not the candid camera segments, which were filmed); The Lawrence Welk Show; Rowan & Martin’s Laugh-In; The Ed Sullivan Show; The Red Skelton Show; Dark Shadows; and six Twilight Zone episodes. The first situation comedy to be shot on videotape was All in the Family (1971). (The first situation comedy to be shot on film was I Love Lucy in 1951; many sitcoms of the 1950s were performed live.)

Home video: The first consumer VTR was the Ampex Signature V (1963), a reel-to-reel system which, as you can see from the photo, was affordable only if you were Hugh Hefner. The first home video system that offered pre-recorded videocassettes was Cartrivision in 1972. Betamax system VCRs were introduced in the U.S. in 1975, and the very popular VHS system came to the U.S. in 1977.

I’m pretty sure this is exactly what it is. If you take overcranked film (recorded at, say, 48 frames per second) intended to produce slow motion when played back on a regular 24-frames-per-second projector, but then decide not to use it as a slow-motion shot and speed it back up to get real-time motion, you will get this strobing effect. Sometimes it’s an artistic choice, other times it’s a side effect.
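The arithmetic is simple enough to sketch; the function name and the frame-as-integer model below are just for illustration:

```python
def restore_real_time(frames, shot_fps=48, playback_fps=24):
    """Overcranked footage (say 48 fps) projected at 24 fps runs at
    half speed.  Keeping every (shot_fps // playback_fps)-th frame
    restores real-time motion, but each surviving frame still carries
    the short 48 fps exposure, so motion looks choppy: the strobing
    effect described above."""
    step = shot_fps // playback_fps        # 2 for 48 -> 24
    return frames[::step]

one_second = list(range(48))               # 48 frames shot in one second
print(len(restore_real_time(one_second)))  # 24 -> plays back in one second
```

The choppiness comes from the exposure time, not the frame count: each kept frame was exposed for roughly half as long as a normal 24 fps frame, so there is less motion blur to smooth the gaps between frames.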

My guess here is that it’s treating video to try and resemble film. Film has grain, videotape does not, but you can artificially apply grain-like effects digitally to achieve a certain level of similarity to film, though it’s not particularly successful at achieving specifically that most of the time. It does give off a nice effect, however, that can be very moody or gritty.
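At its simplest, fake grain is just per-pixel noise overlaid on each frame. A minimal sketch, assuming a frame is a flat list of 0-255 pixel values (real grain plugins vary the noise per frame and weight it by brightness; this just adds clamped uniform jitter):

```python
import random

def add_fake_grain(frame, strength=12, seed=0):
    """Overlay pseudo-random noise on a frame to imitate film grain.
    Each pixel is nudged by up to +/- strength and clamped to 0-255."""
    rng = random.Random(seed)
    return [max(0, min(255, p + rng.randint(-strength, strength)))
            for p in frame]

clean = [128] * 8          # a flat mid-grey frame
print(add_fake_grain(clean))
```

A new seed per frame makes the noise "boil" from frame to frame the way real grain does; reusing the same seed would freeze the pattern in place, which looks like a dirty lens instead.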

Lord of the Rings and The Matrix are both colour-treated digitally to give a certain effect; for LOTR it was to make it seem painterly, like Alan Lee’s art; and for The Matrix it’s to give that slightly ‘off’ look to the Matrix world, to make the audience feel like something is ‘wrong’ with it. The same effect can be applied to TV shows. What it’s particularly good for is matching two shots that may have been filmed at different times, days or even weeks apart, or in separate locations, to make them more homogeneous.

Which is why so many shows got “lost”: the tapes got wiped and re-used. (Also, 2-inch tapes are big and heavy, so they cost a lot to store.)

BBC Worldwide, who market old Beeb classics, are now desperate for the surviving film dubs of things like Doctor Who. These turn up in attics in New Zealand or wherever because some geek at the local TV station saved them from the trash.
Sometimes the only record left is an off-air recording made by a fan pointing his film camera at the TV.

A standard in UK productions for many years was for location sequences to be shot on film, and studio sequences on VT … watching videotapes of shows from the Seventies, the disparity can be quite jarring. (Some of these old shows get processed and cleaned up during the digital remastering for DVD, which tends to make the change in colour and clarity less of a jolt. There’s a whole range of software tools used to clean up archive recordings, these days … )

The BBC started to use lightweight video units, that they could take out on location, sometime in the late Seventies (the first story for Tom Baker as “Doctor Who” was shot entirely on VT units), but the film/tape mixed format continued some way into the Eighties, nonetheless.

I think I can shed some light on this discussion.

I have actually been on the set of a Hollywood sitcom. I was an extra on an episode of “Caroline in the City” in 1998.

I also have often wondered about the difference in “texture” between certain kinds of video. The sitcoms always seemed to be shot on video rather than film, but there was something different in the picture quality between them and the soaps. So when I was sitting around the set all day waiting for my scene, I made sure to watch the four cameras very carefully to see if I could figure out how they worked. I could see the cameras themselves as well as a split-screened monitor showing the view from all four cameras.

Here’s the deal.

Sitcom cameras have a propeller/fan doohickey that spins through the light path during the take. Between takes you can see it wind down to a halt inside the camera, and, if it’s not blocking the view, what you see looks like sharp, soap-opera style video. When it’s spinning during the take, you suddenly get that “Friends” texture.

I don’t know if it’s done specifically to achieve that texture, or if it’s something to do with being able to reconcile the video with 24fps film speed in case they ever want to include a clip in a movie, but that’s how it works.

The film “28 Days Later” was shot almost entirely on digital video. Some of the scenes, particularly daylight in heavy downpour scenes, had a picture quality that reminded me of some scenes in Saving Private Ryan.

You’d have to check a “making of” documentary, but I wouldn’t be surprised if the battle scenes near the end were done the same way.

Filmmakers will do whatever it takes to get exactly the texture they want.

The films “Slacker” and “Nadja”, to get a certain effect in certain scenes, used a Fisher-Price PXL 2000 camera.

This was a late-1980s toy video camera that recorded video onto a standard audio cassette tape. I used one once at an after-school program I used to work for. It has a grainy-yet-clear picture quality that is nothing like anything I’ve seen before.

There’s a film festival every year devoted to its use.

Caroline in the City was shot on 35mm film, not on videotape. Likewise, Friends, Will & Grace, Everybody Loves Raymond, Frasier, and The Drew Carey Show are all shot on 35mm film. The studio cameras they use also have a dual video pickup so that the director can direct the camera operators on the fly.

I defer to you, Walloon, and I’ll even provide a substitute for the missing cite in your post:

From a page at the website of The Museum of Broadcast Communications in Chicago.

In that case, I have no idea what the fan contraption is for.

The fan-like contraption you saw inside the viewfinder allows the camera operator to see through the same lens that the film uses, instead of having to use a separate viewfinder. This is called single lens reflex (SLR), and is a feature that can be found on 35mm still cameras too.

In a movie camera, instead of having only one mirror that flips out of the way when you press the shutter button, there is an array of mirrors that spins around in synch with the shutter speed.

It’s pretty easy to tell whether a sitcom is a one-camera or three-camera setup. “Friends” and “Seinfeld” are traditional three-camera. “Scrubs” and “Arrested Development” are one-camera.

ZTV- All zombies all the time!

I wonder if this is the same thing I’m thinking of:

Some episodes of “Are You Being Served?” have this sharper look as compared to some of the other softer-looking episodes. And those sharper ones look more real-life to me- I get a much bigger sense that there are real people walking around on the set. I really don’t know how to describe it either!

The other instance of this that has always stuck with me is that footage of Lee Harvey Oswald being escorted through the basement of the Dallas police HQ and then being shot. Super sharp look and really feels real.