This odd quality some films and TV shows have--e.g., some old British sitcoms, some old Twilight Zone episodes

All my life I’ve encountered this in some TV shows and films. (I think always as watched on TV; I don’t believe I’ve ever seen it in a theater.) It’s this certain… quality… that I’ve never been able to describe. I know it when I see it. The best I can do is give examples.

Usually, I see it in old British shows--sitcoms, for example, or Doctor Who episodes. I also see it in some old Twilight Zone episodes, like the one with the used-car salesman trying to sell a cursed car.

How to describe this quality? It has something to do with the way things look when they move. There’s something unusually… 3-d-ish about it. But that’s saying too much, because it’s not quite right. I know I’m not being helpful.

But today I just encountered it again in an unexpected place. I am currently watching The Goonies on G4, and it’s got that quality! And my wife agrees! (Though she too is at a loss to describe or explain this quality.)

But my wife says The Goonies has never had that quality any other time she’s watched it. (I’ve only seen pieces of it before, but to my recollection, it’s never had that quality either.) So what’s different? This: we’re watching it on a high-definition TV screen. The movie itself isn’t HD, of course, but the screen is. That’s the only thing we can think of that’s different about this viewing.

And now we notice the screen seems to be lending this quality to many other things as well. The “Lollypop” commercial and the “Rollover Minutes” commercial are two examples.

So my question is: what’s going on? What do old British shows, early Twilight Zone episodes, and contemporary non-HD transmissions viewed on HD screens have in common that might give them this shared quality, which my wife and I both interpret as having something to do with there being something… off… about the way things move in relation to each other on the screen?

Anyone? Anyone?

Is it possible that they’re recorded on film rather than on videotape? Filmed things do seem to have a different “feel” to them than videotaped ones - it is kind of hard to explain.

I can’t answer for Goonies or the commercials, but for a long time British shows were shot on tape indoors and on film outdoors. Monty Python, for example, used Arriflex 16BLs outdoors, which could film in lower light and were much, much lighter and more portable than the video equipment of the day. They even did a sketch about it. (“Gentlemen. We are surrounded by film!”) Some Twilight Zone episodes were also shot on video, with the resulting ‘look’.

Uh…weed?

:cool:

I was also going to suggest that they were “filmed” on videotape. Except for Goonies, which may have had that sort of effect because G4, even the HD version of the channel, often shows 4:3 pan-and-scan versions of movies.

I think I do know what you mean about the way certain Twilight Zone, sitcom, British shows, etc. look (especially in motion), and I’ve always believed it to be due to using videotape instead of film. It’s also why soap operas have a different look than movies or other higher-quality shows.

Wait a minute. I think I know… now… what the OP is talking about. The best way to put it is: things seem to shimmy.

I figure it’s differing frame rates, or scan rates, or, where pan-and-scan is involved, some kind of processing artifact from that.

As far as The Goonies and the commercials go, some new TVs have a feature that makes what you’re watching look 3D-ish. I only saw it once, and I made the homeowner turn it off because it was bugging me; it was almost as if the foreground and background weren’t quite synced up.
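
If it’s what I think it is, that feature is usually called motion interpolation or motion smoothing: the set invents extra in-between frames. Here’s a toy sketch of the idea in Python--real sets estimate motion vectors, so a plain blend of neighbouring frames, as below, is only the crudest version, and the tiny frames are made up for illustration:

```python
import numpy as np

def midpoint_frame(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Fake an in-between frame by averaging two real ones."""
    mid = (frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2
    return mid.astype(np.uint8)

# Two tiny fake grayscale frames: one bright pixel moving right.
a = np.zeros((4, 4), dtype=np.uint8)
a[1, 0] = 255
b = np.zeros((4, 4), dtype=np.uint8)
b[1, 2] = 255

# The synthetic frame has a half-bright ghost in BOTH positions --
# maybe part of why foreground and background can look out of sync.
print(midpoint_frame(a, b))
```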

It took me a second to grok, but I understand what the OP is talking about, as I’ve seen it in the past myself. It seems to me that it’s always been particularly noticeable with British shows, though I have noted it with some American shows and certain films as well.

Some of the difference is almost certainly due to videotape vs. film, but I’ve also chalked it up to the differing standards between PAL and NTSC. I’m not an AV expert, but as I’ve always understood it, PAL has a higher resolution (625 scan lines versus NTSC’s 525) but a lower field rate (50 Hz vs. 60 Hz) and a lower frame count: PAL runs at 25 fps where NTSC runs at slightly less than 30. I don’t know this to be a fact, but I’ve always assumed this would result in a slightly different-looking video signal, even after adjustments were made to translate one signal to the other.
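
To put rough numbers on that mismatch, here’s a quick back-of-the-envelope sketch in Python, using the published nominal rates (nothing here is specific to any particular converter):

```python
# Nominal published rates -- rough illustration only.
PAL_FPS = 25.0            # PAL: 25 frames/s, 50 fields/s, 625 lines
NTSC_FPS = 30000 / 1001   # NTSC: ~29.97 frames/s, ~59.94 fields/s, 525 lines

# Converting one second of PAL material to NTSC means synthesizing
# extra frames from somewhere -- by repeating or blending real ones.
extra_per_second = NTSC_FPS - PAL_FPS
print(f"NTSC needs {extra_per_second:.2f} extra frames per second")  # ~4.97

# So roughly one output frame in six has no PAL source frame of its
# own, which is one source of the judder people notice in conversions.
print(f"about 1 in {NTSC_FPS / extra_per_second:.1f} frames is synthetic")
```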

I think it’s shutter speed.

At a normal frame rate you get a standard amount of motion blur, but if you increase the shutter speed while recording, the blur is reduced.

For example, when you intend a shot to be played back in slow motion, you increase the speed of the film, which reduces the length of the motion blur in each individual frame. If you play it back at regular speed, so the action is slow motion, the length of the motion blur looks right. But if you play it back faster, so the action runs at regular speed, the motion blur is shorter than expected and causes a staccato shimmy.

Videotape does this when the settings on the camera are adjusted for certain lighting conditions - the shutter speed adjusts automatically, and sometimes locks into a faster setting than it should. The interlacing of standard PAL video seems to emphasise it in certain conditions.
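
To illustrate with made-up numbers (and assuming the usual 180-degree-shutter rule of thumb, where film exposes each frame for half the frame period):

```python
def blur_length(speed_px_per_s: float, exposure_s: float) -> float:
    """How far an object smears across the frame during one exposure."""
    return speed_px_per_s * exposure_s

speed = 500.0  # object crossing the frame at 500 px/s (made up)

film_exposure = 1.0 / 48.0    # 24 fps film, 180-degree shutter -> 1/48 s
video_exposure = 1.0 / 250.0  # video camera auto-locked to a fast shutter

print(f"film blur:  {blur_length(speed, film_exposure):.1f} px")   # ~10.4 px
print(f"video blur: {blur_length(speed, video_exposure):.1f} px")  # ~2.0 px
# The much shorter smear is what reads as 'too defined' in motion.
```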

That Twilight Zone looks like a kinescope.

I know that a few TZ eps were videotaped to save money, but then they went back to filming them, so most of them look “normal,” for lack of a better word. There are a few (“Long Distance Call,” “Room For One More”) that have the weird quality due to the videotaping.

I know what you’re talking about! I’ve noticed it with the old Dark Shadows series; it’s kind of like every object in the picture is sharp and clearly defined - more 3D, like you said - whereas in other shows the foreground just kind of melts more into the background. It’s like the difference between the Fat Lady portrait in the first Harry Potter movie and the rest of the movie.

There were about six TZ’s shot on videotape; the used car lot was one. Another was Billy Mumy talking to dead Grandma on the toy phone, and “Twenty-two” (“room for one more, honey”) was another.

Oops, sorry, Freudian Slit.

The motion blur suggestion seems plausible. For one thing, a description of the effect that comes to mind sometimes is that things look “too defined” when they’re moving. My wife last night said things looked “too real”! And it makes sense to me to think that when I’m watching something on an HD screen, there’s less motion blur. (I take it motion blur involves a kind of “image echo,” and on an HD screen that echo would tend to be narrower, since there are more pixels on the screen. Also, does HD refresh faster?)

Can it be confirmed that videotape yields less motion blur than film? If so, why?

I’m intrigued by the suggestion that it’s due to some “3-d-ization” effect some new TVs have. Has anyone else heard of this? Do you know what the technology is called?

Can motion blur be reduced by a faster refresh speed in playback? And does HD have a faster refresh speed?

I’m pretty sure this is what the deal is (it’s also a pretty accurate description of the tech parts of it, pre-HD). Especially on earlier (pre-’90s) British TV this will be noticeable. The conversion process from PAL to NTSC and vice versa results in some frames being doubled, and it creates a weird kind of effect that could be described the way the OP does. Pan-and-scan on The Goonies could also cause something similar, especially when the image is being panned from center to one side or the other. I’m not sure why it would happen with old Twilight Zone episodes, as normally they were shot on film and transferred to video just like most TV shows of the period. Some of these things would tend to be more noticeable on HD.

I used to work in tape duplication and did PAL-to-NTSC conversions all the time. By the mid-’90s the technology started to really improve and get cheaper, and now I can hardly tell the difference - but it used to be very noticeable. Most people could tell something was “wrong” but couldn’t figure out what.
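
For anyone curious, the doubled-frame cadence is easy to see in a naive sketch. This just picks, for each output frame, the nearest source frame in time; real converters blend fields instead, but the repeating pattern is the point:

```python
PAL_FPS = 25.0
NTSC_FPS = 30000 / 1001   # ~29.97 fps

for out_frame in range(12):       # first ~0.4 s of output
    t = out_frame / NTSC_FPS      # timestamp of this output frame
    src = round(t * PAL_FPS)      # nearest PAL source frame
    print(out_frame, "->", src)

# Prints 0 1 2 3 3 4 5 6 7 8 8 9 as the source frames: roughly every
# sixth output frame repeats its predecessor, which is the doubled-frame
# judder described above.
```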

I was looking at a display of HDTVs in a store recently, and one screen had a sticker claiming a faster refresh rate. The video on it did look a lot more “real” and dimensional than on the other TVs, in a way I couldn’t quite pin down or even articulate.

So, yes, I think there’s something in that theory.

Does this cover the same thing that makes fairly current British TV shows look like they were filmed in the late ’70s or ’80s when broadcast in the US? I get this feeling whenever I watch the British Whose Line Is It Anyway? Shows may be from 1997, but they look more like something from 1985.

Videotape
From IMDb