48 fps - is it the future?

I just recently watched The Hobbit in 48fps. Aside from the movie being cheesy in general, the frame rate ruined it for me. No matter how good a movie is, I can’t take it seriously if it’s filmed at such a high frame rate.

I’m not an old guy. I was born in 1990, which places me firmly within the younger generation. Yet watching 48fps films makes me feel like an old fart who can’t appreciate modern movies. It just feels too real; I can’t immerse myself in the film in the same way. Rather than making it look better, it makes it look mundane and less epic.

Do you think it’s solely cultural bias on my part, from being born 20+ years before it started to emerge, or will even the youngest people, who are still children, not buy into it either? I really hope it isn’t just me, because otherwise it will be far more difficult for me to enjoy movies.

That’s because you aren’t used to such realism, but familiar with the artifacts imposed by 24fps. Think of what it must have been like when silents (at 18fps) went to 24, then monochrome gave way to color, then sound was added and screens were widened. Each was a jarring change, but somehow people adjusted. I doubt if many want to return to the previous technology now.

Yes, it’s the future. It’s also the present.

How did you watch it? I missed it in theaters and want to see if I can tell the difference myself.

A lot of people apparently felt the same way I did, though. Our brains process 24fps film in a different way from real life; not so much with 48fps or videotape. But maybe the future idea is for film to look more like “real life”. Maybe kids and people in future decades will expect their movies to look the same as waking life rather than having a subtly different quality.

Perhaps CGI and filming techniques need to mature under this format before it looks more acceptable?

I was kind of shocked when I read your OP. I have a problem when I watch a movie in the theater and they pan the scene, because of the jerkiness from the low frame rate. I was thinking you’d be talking about how great 48 FPS was.

I’m twice your age, but I rarely see movies in the theater, so I guess I don’t have the conditioning. I don’t remember it looking like that when I was younger, and went to movies more often.

Also, I’ll report this for forum change.

a frame rate ruined a movie for you?

Just try talking to computer gamers about frame rates. :smiley:

One of the biggest artifacts of 24 fps film is a phenomenon known as “pan judder”; that is, when the camera is panned, the lateral rate of the image is such that it appears to jerk from frame to frame. Although this is highly artificial (albeit not entirely alien to the way we perceive panning of a stationary scene, as we’ll discuss in a moment), it is what we’ve become used to, and a more smoothly panned image at 30 or 48 fps appears “unnatural” in the medium. It should be noted that because of this effect, good cinematographers know to avoid panning rates that result in obvious judder, so when a higher panning rate becomes available at a greater frame rate, the result can be jarring, almost more like a zoom effect.
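To put a rough number on the judder: the image jump between consecutive frames is just the pan speed divided by the frame rate. Here’s a toy calculation; the pan speed and pixels-per-degree figure are made-up illustrative values, not anything from a cinematography standard.

```python
# Rough sketch: how far the image jumps between consecutive frames
# when the camera pans. Doubling the frame rate halves the per-frame
# jump, which is why 48 fps pans look smoother.

def displacement_per_frame(pan_deg_per_sec, fps, px_per_degree=50):
    """Pixels the image shifts between consecutive frames.

    px_per_degree is an assumed, illustrative screen-resolution factor.
    """
    return pan_deg_per_sec / fps * px_per_degree

for fps in (24, 48):
    jump = displacement_per_frame(pan_deg_per_sec=12, fps=fps)
    print(f"{fps} fps: image jumps {jump:.1f} px per frame")
```

A 25-pixel jump every frame reads as a visible stutter; halve it and the same pan looks noticeably smoother.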

Because video monitors (and especially modern high definition monitors) have a comparatively high effective refresh rate, animations using pure CGI can also look artificially smooth, and part of the evolution of CGI has been to implement filters which make images look as if they were filmed at 24 fps when such an effect is intended.

In reality, when you pan an image by rotating your own head, the effect is also not entirely smooth. If you slowly rotate your head, you will notice that you perceive a slight jerking as your eyes refocus every few degrees. This is because the eyes try to remain focused on a particular image or object despite the motion of the head or body, which helps you avoid seeing the image judder when you are in motion, e.g. walking. So you naturally anticipate some degree of judder, and when an image pans completely smoothly it looks as if the entire scene is moving with respect to you rather than you moving with respect to it. When it pans enough, you anticipate a jerking motion, which of course you don’t feel when you are sitting in your seat in a movie theatre.

Whether 48 fps is the wave of the future in terms of cinema remains to be seen. Previous attempts to implement a 48 fps film-based system have failed due to logistical and technical issues, e.g. it required heavier film cartridges, more expensive and delicate projection equipment, et cetera. However, with digital projection rapidly taking over from physical film, the logistical issues for both filming and projection are being overcome, and I suspect the format will be increasingly used for films for which it is of greatest benefit.

Stranger

This is my reaction too. It’s like saying the movie was ruined for you because you didn’t like the font used for the credits. I suspect people wouldn’t even notice the frame rate if they hadn’t known about it ahead of time.

You’d be wrong. There’s a reason they call it the soap opera effect.

Let’s move this to Cafe Society.

Colibri
General Questions Moderator

I have not yet seen a 48fps movie, but it is a big deal and really affected a lot of viewers of the Hobbit movie. It looked so fake that many reviewers recommended not giving a final opinion on the movie until you had seen it in 24fps.

And to my understanding, anyone would notice the new frame rate immediately, even if they couldn’t put their finger on it.

Aye, this is all true.

I saw The Hobbit in 48fps and it was difficult to watch some scenes because of the frame rate; as Mahaloth says, it looked “wrong” somehow.

A lot of this is the fact that techniques haven’t been developed yet that actually work with the new technology.

Think back 5 or 6 years ago when HD broadcasting really began in earnest: lighting and makeup often looked terrible via the new medium. It took/takes some time to adjust skills and techniques, but today, your local newscast in HD prolly looks great. Eventually, things will get sorted out and 48fps will look “correct”.

I don’t get it. Logically a more realistic format should allow better immersion into a movie. If you think it sucks, it’s probably because action movies are made for 12 year olds and trying to make them “more real” just ups the ante on how dumb the directing values are.

I haven’t experienced it, but I suspect that no one is going to adopt it, other than as a gimmick. Forget how good or bad it looks: it’s not likely to be a selling point that gets people into the theater, and 90% of those who see it aren’t going to care. So there’s no need to bother with it except as an experiment (no one argues that The Hobbit earned more money because of it).

As for it being “realistic” or not, it will probably be just as unrealistic as HDTV, which is nothing like how most people perceive the world (look in the distance and things will be blurry), but is constantly touted as being a more realistic view of things. It’s actually completely unrealistic; just very sharp. 48fps sounds like the same thing – nothing that anyone actually sees, but very sharp looking.

I’m wondering the same thing. It’s not showing in any theaters in my area. And would the 48 fps rate be apparent if the OP were watching it on DVD or Blu-Ray?

Exactly. That’s why the best paintings are photo-realistic.

The other thing I noticed in The Hobbit was that the action scenes seemed too clear. In 24fps you get more motion blur, which makes things seem to move faster than they really are. At 48 fps, while it was pretty awesome to see the whole scene in pin-sharp detail, the action looked a bit slower and less kinetic.

Interestingly enough, most PC/console games in the last 5 years have been adding artificial motion blur to deliberately make things look more ‘cinematic’.

Because of how contentious this topic is, and has been for some time with respect to videogames, I wonder whether the ability to sense higher frame rates varies among different people. As Quartz alluded to, I’ve seen gaming forums in which some posters argue vehemently that the human eye can’t detect any differences in frame rates higher than 30fps. I can discern a 60fps frame rate from 30fps immediately, and I don’t think I’m remarkable in that ability. Some people also seem not to notice screen tearing in games, but I do and find it extremely irritating.

I haven’t seen the 48fps version of The Hobbit but I did see the trailer, and it looked distractingly like it was shot on videotape, like a made-for-TV BBC production. I assume that’s what Mosier means by the soap opera effect. It just looks cheap and hyper-real somehow.

I’ve been trying to figure out something. I know the effect you guys are describing. But the times I’ve seen it have been when watching some old shows (some Twilight Zone episodes come to mind) on a normal TV, or when watching at least some movies on my father-in-law’s fancy TV. (Goonies, for example, acquired the effect we’re discussing when we watched it on his TV. I don’t know what was different about his TV.)

My question is, if it’s a frame-rate thing, how can it affect shows being watched on normal TVs (like mine), which I presume don’t have a high frame rate, and how can it be “acquired” by movies that were presumably filmed at 24 fps? Would a 24fps film shown at 48 fps get the weird effect somehow? I wouldn’t have thought so…
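My guess (and it is a guess) is that the fancy TV wasn’t just showing each frame twice, but running a “motion smoothing” mode that synthesizes in-between frames, so a 24 fps film effectively plays back at a higher rate and picks up the soap-opera look. The toy sketch below contrasts the two: plain duplication versus naive linear interpolation. Real TVs use motion estimation, not simple pixel blending, so treat this as an illustration of the difference only.

```python
# Toy sketch: doubling a frame sequence's rate two ways.
# Duplication shows each frame twice (nothing new is created);
# interpolation invents a halfway frame between each pair, which is
# what TV motion-smoothing modes do far more cleverly.

def double_rate(frames, interpolate=True):
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        if interpolate:
            out.append([(x + y) / 2 for x, y in zip(a, b)])  # new frame
        else:
            out.append(a)  # same frame shown twice; motion is unchanged
    out.append(frames[-1])
    return out

frames = [[0, 0], [10, 0], [20, 0]]  # a value sliding steadily upward
print(double_rate(frames))           # halfway frames appear in between
```

With duplication the motion is identical to the 24 fps original, which would explain why an ordinary TV looks normal; only the interpolating TV creates frames the film never had.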