Sounds really interesting. But . . . you were 30, right? I mean, when you lost your virginity?
Are you going to ask him ‘what’s it like?’ :dubious:
Are you kidding? A pro snow-mobile salesperson like RR has his pick of middle-aged divorcees. Toss his mullet, crack his trademark Rand Rover smile, and the deal’s half done.
[Electric guitar flourish]
:rolleyes: You’re trying too hard.
No, I don’t remember the frequency; somewhere around 30 Hz, I suppose.
Are you still thirteen? :rolleyes:
Perhaps. But you’re worth it.
Pretty much everything in that blog post captures the way I reacted to the HFR.
The blog post essentially boils down to: It doesn’t look like 24 fps film. And I prefer 24 fps film.
Whatev
No, he did not see the movie in three different ways. He saw bits of the movie as he moved from theatre to theatre in the cinema, so he never committed to sitting and watching the HFR 3D version straight through. Clearly he’s just clinging to legacy because it’s what he’s familiar with.
I’ve never understood this obsession with film, grain, and 24 fps. But then, I also don’t understand 3D, IMAX, and 70 inch TVs.
Isn’t it that high frame rates require new ways of making films that haven’t yet been perfected? So the first ones will seem weird?
I suspect so, and it’s also a matter of acclimatization for us as viewers; Vince mentions some of that towards the end of his analysis of the HFR 3D version of the movie.
Which would be fine (though maybe not for a movie that wants to be the biggest movie of the year), if that were what was said. But the people who made it say it looks great and that the flaw is with the viewer (as have many people in this thread).
There is the nostalgia. There is the way the brain absorbs images that are fed in a flickering sequence (as opposed to a video projector, where there is no flickering sequence and therefore no stroboscopic effect). There is the texture inherent in grain. We associate the unlifelike grain with cinema. The utter lack of grain “feels” wrong.
Back in the day, frame rates were not standardized, and people shot at slightly varying rates with slightly varying results. You can see it in footage of silent-era camera operators hand-cranking their film cameras.
18 fps was deemed too slow: we saw too much flicker and not enough consistency in the moving image. 30 fps was unnecessarily fast, a waste of precious film stock. 24 fps was determined to be the lowest speed at which the flicker would, for the most part, go away from the audience’s point of view.
In Europe, film cameras run at 25 fps. This has a lot to do with the electricity cycle over there. The USA is on a 60 Hz cycle, while the UK and many other countries run electricity on a 50 Hz cycle. 25 fps was a MUCH easier bit of math to do back in the day. It was important for silents and much, much more important for talkies. The audio recording device was “slaved” to the film camera by a cable. The film camera had a crystal-controlled motor. The crystal oscillated at a set frequency and was kept going (so to speak; I’m taking some liberties here) by a.c. electrical current.
60 Hz in the USA meant that an even divisor worked better; 24 fps worked. In the UK, a 50 Hz cycle meant that 25 fps worked better. (This is also tied to arc lights, which were fired off of a.c. power and had huge arc tips that flickered in time with the cycling of the power, etc., etc.)
Suffice it to say, 24 fps saved film stock but still provided a perceptibly smooth image.
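Just to make the arithmetic above concrete, here’s a quick back-of-the-envelope sketch in Python. The mains frequencies and frame rates are the ones mentioned in the post; the 2- and 3-blade projector shutter multiples (each frame flashed two or three times, pushing the on-screen flicker well above the frame rate) are my own assumption about standard projection practice, so treat this as an illustration rather than a definitive account:

```python
# Rough sketch of the numbers discussed above: how the camera frame rate
# relates to the local mains frequency, and how a multi-blade projector
# shutter raises the perceived flicker rate above the frame rate itself.
# (The 2/3-blade shutter detail is my assumption, not from the post.)

MAINS_HZ = {"USA": 60, "UK / much of Europe": 50}
FRAME_RATES = [18, 24, 25, 30]  # fps values mentioned in the post

print("Mains cycles per frame:")
for region, mains in MAINS_HZ.items():
    for fps in FRAME_RATES:
        print(f"  {region}: {mains} Hz / {fps} fps = {mains / fps:g} cycles per frame")

print("\nPerceived flicker if each frame is flashed 2 or 3 times:")
for fps in FRAME_RATES:
    for blades in (2, 3):
        print(f"  {fps} fps with a {blades}-blade shutter -> {fps * blades} flashes/sec")
```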
While I do think that’s somewhat true, Vince is very much a technophile and loves the newest and latest things. I mean, that’s exactly how he made his transition to video. When the Mark II came out, he begged and pleaded with Canon to borrow the camera for a weekend, then self-financed and produced a film showcasing the new technology (having never done video/film in the past). When he was solely a still photographer, he jumped on shooting digital right away, with no nostalgia about film or grain or anything like that. So he’s not exactly the kind of person I would expect to be technophobic or particularly nostalgic.
That said, I think he’s wrong about this, and as audiences get a little more acclimated to HFR, it will become more accepted visually. I haven’t seen The Hobbit yet, so I can’t really comment from experience, but I would guess that this is a matter of retraining your brain to accept this form of imagery. When HDTV first came out, I couldn’t understand its appeal or why anybody cared, despite being a very visual person myself (as a photographer). Now standard def annoys the crap out of me, and I wonder how I ever watched anything with such crappy resolution.
We’ll see how it all pans out with audiences as viewers adapt to HFR and filmmakers expand their techniques to adapt to the new technology.
I’m not disagreeing with you at all. But I think 24 or 25 fps was a reasonable compromise at the time between what looked smooth enough to most people and a cost-effective film budget.
I only say this because when I was first introduced to film at age 9 or so (yes!), the only preconceived notion I had about it was that it was wonderful, or so I was told. Although I became a film buff for the content, I quickly became disillusioned with the medium due to its (to me) obvious flaws: flicker, frames that always jumped just a little, spots and defects on individual frames that I could not ignore, and, at first, poor sound.
Modern-day video gets rid of many of these flaws, and is still improving. This is a good thing.
It reminds me of the history of disc/tape sound recordings. We all put up with crackle, pops, hiss, wow and flutter because the content was worth it. That doesn’t mean we didn’t hear it, and some of us were highly annoyed by it. I spent many hours trying to eliminate the dirt from my LPs. Digital CDs removed most of those objections.
My point is that poor technology – which we put up with for so long – is being replaced with better technology. It’s a change, but we are getting closer to enjoying the message behind the film or music without distraction, and isn’t the message more important than the medium?
48fps is not a better technology. It is a different setting. They could have done 48fps in 1928 if they wanted to.
I enjoy reading a book lit by a 40 watt bulb. I enjoy it more under a 70 watt bulb and I wouldn’t enjoy it at all under a 200 watt bulb.
Sometimes. But not always. And presenting reality is rarely the intent of a movie.
I’ve been looking at demonstration TVs showing super-crisp Blu-ray for almost a decade and have never had any desire to buy one (and I was one of the very early adopters of DVD). I loved Avatar in the theater and think it looks like shit on Blu-ray playing on the 70-inch TV at Best Buy. It looks overbright, sped up, and the fakeness of it all is highlighted.
Perhaps it is purely psychological on my part, but after nearly a decade of seeing it in snippets that way and three hours of seeing it in The Hobbit, I still hate it.
And if I am rejecting a technological advance just because I’m nostalgic for the way it used to be, then that is the first time.
I am curious to know how movie audiences are supposed to get used to it, though. The Hobbit is out. When is the next HFR movie being released? How regularly do you need to see it to get used to it? Will three a year suffice? It isn’t something that the theaters are going to be able to charge extra for, so who will be clamoring for more of it?
To a degree it reminds me of a presentation I attended by Jeffrey Katzenberg about a year before Monsters vs. Aliens came out when he was touting the wonders of 3D and trying to convince all the press in attendance that by now almost everything would be in 3D.
If a large portion of the audience thinks CGI megamovies look like crap at 48fps, who’s going to be pushing for Parental Guidance to be released that way (even though that is the type of movie that would probably be more acceptable visually)?
As Cartooniverse explained, what they could have done wasn’t the determining factor. If 48fps isn’t better technology, what do you consider 18fps? Are you upset because the industry went from 18 to 24? Motion had never been so smooth.
More illumination beyond an adequate amount is not better technology, so your analogy fails, big time. I’ll bet you don’t think a 2 watt bulb is adequate just because it’s not 200.
Making movies as close to reality as possible has been a goal of the industry for nearly 100 years. When widescreen came out, especially Cinerama, the technology was attempting to put the viewer in the scene more than ever before. This explains the multiple audio tracks, surround sound, sub-woofers, and a screen which is intended to cover the entire field of vision. In the 1950’s, movies had the advantage over TV in that they could make a viewing experience much more like reality, and they promoted that heavily.
I’m afraid you are not the audience it is intended for. Please do not go to the movies or waste your money on a Bluray. I suggest staying at home with a VHS player, a giant 9 inch screen, a single 2" speaker, and be sure to turn down that infernal, newfangled color shit.
Neither is a higher frame rate better technology. There is no point in the last 80 years at which someone, if they had wanted to, could not have made a movie for presentation at 48, 50, 100, or 200 frames per second.
More frames is not inherently better. Yes, 24 frames is better than 18. But it does not stand to reason then that 48 is inherently better than 24.
Are you under the impression that sound in movies is more realistic? It is clearer, it is louder, but it is rarely realistic. Because realistic sound would make for horrible movies.
Yes, because even though I’ve said I’ve eagerly adopted pretty much every technological advance in filmmaking, not liking this one change makes me a neo-Luddite.
I dunno; I think the explanation is a bit simpler than that. Any activity we participate in exists in a certain context. This could be the set of sounds, smells, lighting, etc. associated with an event. In the case of films, we’ve learned from the very first movie we watched to associate a low frame rate with cinema, and further, to go into a kind of suspension-of-disbelief mode. It’s not just low frame rates, either: it’s the film grain, and the judder, and the sprocket error, and a zillion other defects in the presentation. It’s even the cheap seats that smell a bit like vomit. It’s why theatres still give a cinematic experience even though home screens can often provide a higher-quality experience in every way.
I can’t wait until 48+ fps is the standard and we can all just get used to it. The only problem is that we’ll be loath to go back to the old stuff. Our kids will wonder how we ever put up with it.
I think there’s some talking at cross-purposes here… it’s entirely possible for both of these statements:
(a) 48 fps presents a visual cinematic experience that is closer to “real life” than 24 fps, an aim that cinematographers have been striving towards for decades
and
(b) many/most people will enjoy the film less in 48 fps
to be true. Now, you might want to stick some qualification on the second statement… they’ll enjoy the film less but would enjoy it as much if they got more used to it, they’d enjoy the film less only because of what they’re used to, etc., etc., but enjoyment is enjoyment; there’s no “should” about it.
By the way, the problem is NOT that the sfx are not good enough for 48 fps. The stuff I found most jarring in 48 fps was shots of people just standing around in city streets with nothing fancy happening at all (unless you count sets and costumes). I’m quite sure that the costumes and sets in The Hobbit were as absolutely perfect as they could be, but (as others have pointed out) I still got this “hey, I’m looking at a person wearing a costume standing on a set” feel, not because I could see seams or rough edges or errors of any sort, but just because of the way it felt and seemed.
I have a suspicion, btw, that a lot of the psychological issue had to do with it being in 3D. Imagine a technology that was absolutely perfect 3D… so that sitting in a movie theatre watching this film was utterly indistinguishable from looking through a window into a 3D world in which the action was actually taking place. The first time you saw a film in that format, and there was a guy dressed in a hobbit costume standing in a street made up to look like Hobbiton, I strongly suspect that you would be so distracted by the strange 3D-ness of it that you’d find it very hard to actually get swept up in the story (generic “you”).