60fps movies and TV -- should we force it?

Exactly. Rather than suspending disbelief and engaging with the narrative, all I can see are real-life actors doing their lines and actions on a set, as if I’m standing right there watching them. It’s far more realistic, but off-putting because I’m used to film.

When radio plays were first broadcast in stereo, some listeners complained that it sounded “too real” and lacked the “intimacy” of the old monaural productions.

I fucking hate motion smoothing on digital TVs. I turn it off immediately on every TV I watch, not just TVs I own, which means hotel rooms and the like where the settings aren’t locked out. I also hate high frame rate experiments in film.

This starts to approach the reason why. It’s not just being “used to” the conventional, traditional frame rate. I have read more than once about the psychological effect of perceiving film, that your brain goes into something like a heightened dream state which causes you to process the experience differently than if you were just, like, looking out the window at reality.

Movies are not reality, and it’s foolish to think that they’d be better if they were closer to it.

If you remember the old Dr Who episodes, the BBC used to be in the habit of shooting outdoor scenes on video and indoor scenes on film (or vice versa, I’m not sure). This was particularly jarring in Dr Who, which would shift from one to the other regularly. This certainly occurred during the Pertwee era, and I think the Baker era as well. One or the other would have been easier to take.

I did watch the first Hobbit movie in 60 fps, and it did look “more real”, but not in a good way. It did not look more like Bilbo and Gandalf interacting in Bag End; it looked more like Martin Freeman and Ian McKellen acting on a sound stage.

Part of this is the movie experience I am “used to”, and I think part of it is that directors are not yet used to 60 fps and the alterations it requires in staging/acting/etc.

Tom Cruise made a PSA about motion smoothing.

Youtube videos not embedding.

Traditional hand-drawn animation was usually shot “on twos” (12 unique drawings per second) as a cost-saving measure, and a lot of animation today keeps that 12fps look as an intentional artistic choice, even though animating at a full 24fps is barely more expensive for modern productions.

The shift to HDTV was challenging to a lot of makeup and prop people as traditional techniques became glaringly ugly under the now higher detail.

Special effects had a transition period during the shift to colour since stuff like chocolate syrup used to be used for fake blood in black and white movies.

Outside on film and inside on video. It was the same for Monty Python’s Flying Circus. It was so apparent, Python made a sketch about it.

Sorry, that was a bit off-topic. Might have to be a separate thread…

When I create animations in Adobe After Effects, I routinely enable AE’s native motion blur to make fast-moving objects (or fast pans) look more natural. This is particularly important when layering the animation on top of a live-footage composition shot at a normal frame rate (~24fps) that already has motion blur; I want to match the blur.

If you want to add motion blur in post to footage shot at a high frame rate (e.g. 60fps), you can use an After Effects effect like CC Force Motion Blur or Pixel Motion Blur.
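If it helps to see the underlying idea outside of AE, here’s a rough sketch that fakes a 24fps-style blur by averaging a few adjacent frames of 60fps footage with OpenCV. This is just naive frame blending, nothing like the optical-flow analysis Pixel Motion Blur does, and the file names are made up:

```python
# Crude frame-blending "motion blur": average a few adjacent 60fps frames
# for each 24fps output frame. A rough approximation only; real plug-ins
# use optical flow. File names are hypothetical.
import cv2
import numpy as np

cap = cv2.VideoCapture("input_60fps.mp4")
w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter("output_24fps_blurred.mp4",
                      cv2.VideoWriter_fourcc(*"mp4v"), 24, (w, h))

# Read the whole clip into memory for simplicity (fine for a short test clip).
frames = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frames.append(frame.astype(np.float32))
cap.release()

# 60 / 24 = 2.5 source frames per output frame. Blending a ~3-frame window
# around each output position approximates what a slower shutter would catch.
step = 60 / 24
for i in range(int(len(frames) / step)):
    start = int(i * step)
    window = frames[start:start + 3]
    blended = np.mean(window, axis=0).astype(np.uint8)
    out.write(blended)

out.release()
```

The blend window is the knob here: a wider window reads like a longer shutter (more blur), while a single frame gives you the crisp, stroboscopic 60fps look people are describing in this thread.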

Is anything on TV (other than some dedicated super-hi-def demo channels) actually broadcast in 60fps? Because otherwise, motion smoothing is just upsampling by your television, and that looks like crap.

I can’t find any TV shows shot at 60fps, but here’s a list of movies that are.

Vegas (video editing software similar to After Effects) has a good explanation of motion blur, frame rate, etc. here.

Motion interpolation by your TV may look pretty bad, but motion blur used in post-production using professional video editing software looks good.

Just a guess, but if you see a movie shot at 60fps that looks “artificial”, they probably didn’t want to spend the time and money to render an entire feature-length movie with proper motion blur (rendering times can be huge). Or, maybe the studios are grooming audiences to get used to 60fps movies without blur, so they don’t need to use it in the future.

Yeah, but I can’t watch it at home at the original 60fps as far as I know. So, I would be stuck with 30 fps and motion interpolation, which, again, looks like crap.

Betamax and VHS demonstrated that the “superior” format doesn’t always win.

So how much porn is available in 60fps?

I’m not a video expert, but I think there are a few inaccuracies in the OP. One thing to note here is that, in terms of technology, there’s a difference between frame rate and refresh rate. Most TVs in the past had a standard refresh rate of 60 Hz, which fit well with the 30 fps interlaced model (actually 29.97 after colour was introduced, but close enough). Today, gaming monitors and newer TVs are pretty much universally 120 Hz (and some gaming monitors have even higher rates).

As for frame rates, the new ATSC digital broadcast standard, as well as most non-broadcast digital sources, offers a variety of frame rates, typically 30 and 60 fps. In principle a 60 Hz refresh rate should handle 60 fps just fine. A major advantage of 120 Hz on a TV comes when you want to watch source material in its original 24 fps film format. Fitting 24 fps into a 60 Hz refresh doesn’t work cleanly, since each frame corresponds to 2.5 refresh cycles, so either the TV or the video source has to use a process called 3:2 pulldown, holding successive frames for alternately 3 and 2 refresh cycles to average out to 2.5, which can introduce judder artifacts. At 120 Hz, however, there are exactly 5 refresh cycles for each 24 fps frame.
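To make that arithmetic concrete, here’s a tiny sketch, just an illustration of spreading refresh cycles as evenly as possible (not how any actual TV or player schedules frames), showing how long each 24 fps frame gets held at 60 Hz versus 120 Hz:

```python
# How many refresh cycles each 24 fps film frame is held for, at a given
# refresh rate, if cycles are spread as evenly as possible across frames.
def pulldown_cadence(refresh_hz, film_fps=24, frames=8):
    return [(i + 1) * refresh_hz // film_fps - i * refresh_hz // film_fps
            for i in range(frames)]

print(pulldown_cadence(60))   # [2, 3, 2, 3, 2, 3, 2, 3] -> the 3:2 (2:3) pulldown cadence
print(pulldown_cadence(120))  # [5, 5, 5, 5, 5, 5, 5, 5] -> a perfectly even cadence
```

The uneven 2/3 alternation at 60 Hz is exactly what shows up as judder on slow pans; at 120 Hz every frame gets the same five cycles, so there’s nothing to alternate.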

A small nitpick on the “soap opera effect”: that may indeed be an impression some people get from high frame rates, but the term is much older than that and goes back to the days of CRT TVs. Back then the effect was associated with many direct-to-video productions like soap operas, but it was not so much the direct result of any particular technology as of cheap production values such as poor lighting and audio. It had nothing to do with soap operas being filmed at 60 fps; indeed, they weren’t filmed at all but almost always videotaped.

Just my thoughts – if I got anything wrong here I’m sure I’ll be quickly corrected! :slight_smile:

It occurs to me… is this realistic? I mean, when I’m turning my head and changing what I’m focusing on, is it a messy blur or a well-focused transition? I sort of get glimpses of items when my head is ‘panning’, but not a smooth, well-ordered scene.

This realism may be a case where the camera can do what eyes cannot do, or just does it differently, and it seems off when some people view it.

Most, if not all, newly produced porn from the big studios is shot in 4K at 60fps.

The Hobbit was shot at 48fps.

What does that mean, though? Soap operas and lots of television are recorded at 60fps, but they’re also shot with a lot of artificial light that some might call realistic but that, in most circumstances, isn’t. I have noticed that 60fps hi-res images have a different look to me, but nothing that would make me consider them more or less realistic. I don’t mean that I would be an objective arbiter of that, but I would like to know why you think it looks more realistic, if it’s something you can explain.

Great article! Key quote:

Film is just as much about what you DON’T show the audience as with what you DO. Shallow depth of field, motion blur, lack of sharpness, and movement all help to create movie magic. If images are too sharp and you see too much detail…that’s not always a good thing.

No, they’re not.

Anyway, all I mean by “60 is more realistic” is that it gets closer to the near-infinite “refresh rate” of our eyes observing the real world. That’s all.