60fps movies and TV -- should we force it?

The 60fps phenomenon in movie/TV content is certainly a strange one. In case you’re not familiar, your brain has been conditioned to treat 24fps as the correct framerate for fictional content. The real world, of course, runs at an effectively infinite fps. When presented with media at 60fps or higher, it looks “fake” to us, and it seems to be something the brain just can’t get over. It’s often called the soap opera effect, since soaps have traditionally been shot on video at 60 fields per second.

Here’s the downside, though – 60fps is objectively way better. Especially when the camera pans across something quickly, the 24fps image looks very jerky in comparison. If you get used to playing a certain video game at 60fps and then go back to 24 or 30, it feels awful.

It feels like we just need to get over this brain-block of 24fps=fiction. Should they force it? Will they be brave enough? If kids grew up with 60fps, they wouldn’t have this problem.

Turn on motion smoothing on your TV and you will see the difference right away.

We tried with our most recent TV purchase, but had to change the settings after a couple of days. But that’s two 55-year-olds who have been trained that “video” is inferior to “film”. With all the options available on newer TVs, this sort of thing will find a consensus, and movie theater screens will follow. Until then, majority preferences will, as always, prevail in the marketplace.

I’m not sure I buy your premise about 60fps looking fake. To me, reality looks like soap opera resolution (I call it “live look”). I am used to “film look” and enjoy it, if not prefer it, in media presentation, but in no way does it look like reality to me.

In fact, you said it yourself: reality is more like an infinite fps, though I bet that with a better understanding of the brain and visual cortex we could pin that down to an actual number, probably greater than 120. So if anything is “unreal,” it is the standard 24fps of film.

We’re just conditioned to enjoy the film look through sheer repetition of exposure.

I’m trying to think of a similar shift in tech that was difficult for people at first… what about color TV? Windows upgrades on PC? It’s tougher than I thought to think of new tech that is this disruptive despite being an improvement.

No, 60fps is objectively more realistic, but there is no way to declare that “better”. I can certainly see the argument that it’s better for gaming - you control the perspective, and want what you see to match what you expect. But movies/tv aren’t controlled by the viewer - they’re controlled by the director, who wants you to see what he wants you to see. 24fps has done a better job of that. I think that’s because there needs to be a disconnect from ultra-realism, especially on sets. This is very clearly evidenced in the horrendous “Hobbit” movies.

You should have been around here 3 years ago when we switched from vBulletin.

I think the reason 60 fps looks “fake” is because it looks more real. Stay with me for a second… because it looks real, the sets and the acting become more noticeable as non-real, and our brain rejects it. Wrap that same presentation in a 24 fps show, and our brain relaxes a bit and just accepts what it sees.

It’s similar to the uncanny valley effect. If a digital/animated representation of a person gets very close to real, but is not quite there, it looks hideous to us, more “fake” than if it were an obviously stylized cartoon human.

I asked about this here previously. Having insomnia in the middle of the night, I turned on the TV and was vaguely alarmed to see shows I’d seen a dozen times before playing on screen like video instead of film.

Some of the basic assumptions of the “fakeness” are not entirely correct for those of us who have reached a “certain age”.

Video formats used to run at 30 fps (60 interlaced fields per second), when film was 24. So it was very easy to spot the difference. Sitcoms taped “before a live audience” used studio lighting and controlled camera angles, and really didn’t look “wrong”. But action shows taped outside really looked different. You could spot a taped show in a second. That’s what I call the “soap opera effect”.

And that doesn’t even get into shutter speed. Video has different motion blur than film. With the standard 180° shutter, each frame of 24fps film is exposed for 1/48 of a second, so fast motion smears within the frame; a faster shutter shortens that exposure and freezes the motion instead. Saving Private Ryan’s D-Day battle was the most notable example of a fast shutter speed producing almost no motion blur, which gives a “hyper-real” feeling.
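If you want to see roughly how that plays out, here’s a toy calculation (the pan speed and shutter angles are made-up numbers for illustration, not anything from an actual production): the blur per frame is just how far the subject moves while the shutter is open.

```python
def exposure_time(fps: float, shutter_angle_deg: float) -> float:
    """Seconds the shutter stays open per frame (angle/360 of the frame interval)."""
    return (shutter_angle_deg / 360.0) / fps

def blur_pixels(exposure_s: float, pan_speed_px_per_s: float) -> float:
    """How far the subject travels while the shutter is open -- the blur streak."""
    return exposure_s * pan_speed_px_per_s

# Made-up example: a subject panning across a 1920-pixel-wide frame in 2 seconds.
pan_speed = 1920 / 2.0  # 960 px per second

for angle in (180, 90, 45):
    t = exposure_time(24, angle)
    print(f"24 fps, {angle:3d} deg shutter: {t * 1000:4.1f} ms exposure, "
          f"{blur_pixels(t, pan_speed):4.1f} px of blur")
```

Halve the shutter angle and you halve the blur, which is why the narrow-shutter footage reads as stuttery and “hyper-real.”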

For an example of high frame rates, look to the history of Doug Trumbull’s Showscan process.

I think the biggest problem was the amount of film required per movie (60 fps vs. 24 fps means 2.5× as much), and the technical difficulty of reliably moving 70mm film that fast. Digital removes those limitations.

Remember when the photos on your smart phone started moving a smidge on viewing? I turned that off as soon as it started.

I’m going to both agree and disagree with you here. You’re right that “better” is subjective, and we can’t necessarily say that a higher frame rate is better. And I agree that to me, 24fps does a better job of meeting the director’s vision.

But I disagree with the implication that 24fps is objectively better at it. That’s the way our brains are currently programmed, and a kid growing up on 60fps might have a different idea of what is better. Which gets to the heart of the OP.

My opinion is that over time, 60fps would eventually be recognized as better. I notice the panning effect in movies, and it does take me out of it a little. But I also suspect that it would take me years to get over 60fps, if I ever did. So forcing 60fps would be an investment in future generations, and pretty much everyone Millennial and older gets screwed.

Just to nip this in the bud right now: motion smoothing is different from shooting something in 60 fps. Motion smoothing takes something shot at 24 or 30 fps and interpolates fake frames to get it to 60. It’s a terrible, terrible system.
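For anyone wondering what “interpolates fake frames” actually means, here’s a toy sketch (purely illustrative; real TVs use motion-compensated interpolation in dedicated hardware, not a simple blend like this): the set manufactures in-between frames from the real ones, and that’s where the smeary, artificial look comes from.

```python
import numpy as np

def naive_smooth(frames):
    """Toy frame interpolation: turn a 24 fps clip into roughly 48 fps by
    inserting one blended "fake" frame between each pair of real frames.
    Real TVs use motion-compensated interpolation, not a plain average."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)  # the real frame
        blended = ((a.astype(np.float32) + b.astype(np.float32)) / 2).astype(a.dtype)
        out.append(blended)  # the manufactured in-between frame
    out.append(frames[-1])
    return out

# Example: three dummy 4x4 grayscale "frames"
clip = [np.full((4, 4), v, dtype=np.uint8) for v in (0, 128, 255)]
smoothed = naive_smooth(clip)
print(len(clip), "->", len(smoothed), "frames")  # 3 -> 5
```

Blending is the crudest possible version; actual motion estimation tries to track objects between frames, which is why it falls apart on fast or complicated motion and produces the artifacts people complain about.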

Whoa, I think you just figured out how my brain works.

And it explains my feelings about 3D movies. I have one bad eye (20/800, not corrected…it’s clear at about 16", and is my reading eye). So I see “real life” in 2D.

But, for some reason, 3D still works for me. It’s fun, but it is utterly unlike how I normally see things. So I pick movies that have little relation to ordinary life to watch in 3D. Best example is that I HAD to see Avatar with the polarized glasses on. I ignored the fact that none of it looks real to me (and ignored all the bad writing!), and just hung on for the ride.

I think @SlipperyPete nailed it. I don’t want a Marvel movie to be real life; my brain wants to know it’s a movie and enjoy it as such.


Oh, and if anyone here has two good eyes :stuck_out_tongue_winking_eye:, does 3D look realistic to you, more like your daily life than a 2D movie?

There is no such thing as “objectively better” in art. It’s subjective preferences all the way down.

Forcing artists to use a particular medium or style will not end well.

I don’t think he’s talking about forcing artists to do something. I think it’s more about if we should force our brains to adapt to it.

I actually reworded my post to say “has done better” to avoid that argument, because you’re correct. In the future, 60 may be better. (I doubt it, but it might.) I wonder if VR gear will almost demand exceedingly high fps rates if we want virtual spaces to become more common, or virtual media to be more immersive.

That depends on what you mean by “better.” There are some reasonably objective criteria by which art can be evaluated, but a lot of subjectivity comes in when you decide whether or how much each of those criteria matter.

That’s how I view it. At 60fps I notice the set and acting a lot more, which makes it look more like a stage play, or I guess, a soap opera.

I don’t have any studies to cite, so I’m just going to toss out some comments based on vague memories of what I’ve read. I seem to recall that human beings can process somewhere between 30 and 60 images per second, maybe a bit higher (75 ips?). We can react to images in as short a time as 13 ms (0.013 seconds), which works out to a little more than 70 ips (1 ÷ 0.013 s ≈ 77).

But it can be fatiguing. Yes, I’m an old fart who grew up with 24 fps and 29.97 fps. Watching 60 fps for extended periods actually bothers me a bit and tires my eyes. It might be different if I were gaming or having to study a video intensely for analysis, but I still think it would be less comfortable than a typical 30 fps video.

I’ve seen a couple movies that were projected in 120fps and I thought they looked incredible! I loved that for once when the camera panned I could actually still see what it was showing instead of a messy blur.