Do the Blu-Ray Star Wars movies look cheesy to you?

I recently passed by an HDTV in a store playing the Blu-Ray version of Star Wars. I was surprised at how fake everything looked. It reminded me of a TV show shot on video. Rather than looking like real people flying a ship in space, it looked like actors on a set. When the ship flew through space, it looked like a plastic model in front of a black background with little lights on it.

I never got that feeling watching the lower-quality versions. I’ve seen the movies in the theater, on video, and on DVD, and they always looked real to me. The higher quality does look clearer and sharper, but now everything looks fake. Is it just me? I don’t have an HDTV, so maybe I’m just not used to watching things at that level of clarity.

It probably has nothing to do with the transfer, and everything to do with the settings of the HDTV. If it had interpolation enabled, then it was most likely suffering from the soap opera effect. It’s always bothered me, but today I was in a Fry’s where they had two TVs set up next to each other showing the same content, and it was amazing how different the two looked, with one of them having the “feature” enabled and the other not.

Yes, this has nothing to do with the high resolution of the Blu-Ray version, which is approximately the same as what you saw in the theaters. It is the TV’s motion compensation technology, which can be turned off.

Why anyone would ever turn it on, I have no idea.

Soap Opera effect is a good way to put it. I bet that’s what it was. Next time I’m in the store I’ll see if they can switch off the motion compensation.

Whew! I was worried it was yet another way Lucas was trying to ruin the franchise.

Like the linked article mentioned, it *sounds* like a great feature. More framerate = better, right? For sports, it’s absolutely true. Turning it on for a football game is excellent - you do notice a slightly smoother picture with no judder. But put in a Blu-ray, and it turns into a rerun of “Empty Nest”.

Higher framerate is always better, although it should be done in the recording rather than by display interpolation. Hopefully, we’ll finally be able to consign 24 fps to the technological dustbin it deserves once a generation has grown up seeing what proper footage looks like.

This reminds me of the CD vs vinyl debate that raged for years (with some still holding out). Video tape has been higher quality than film forever. But video tape has been used for lower-quality productions, so in people’s heads, better video quality = worse.
Note that higher fidelity is not always better. Sometimes you want a muddy, blurry, or badly colored image for a specific effect. That should be up to the filmmaker, just as a musician can add distortion effects to get the sound he/she wants.

I think it comes down to media creators learning to work in the medium. You want your hardware to be able to produce the crispest, sharpest, clearest picture with the most lifelike colors, and then let the filmmaker produce what they want you to see on it.

New films, shot digitally and mastered for digital projectors, look great in HD without disabling any features. Inception looked like I could walk into it on my 2011 Sony flat panel. Older films that are released on Blu-Ray may need to be remastered to make them look better in HD.

This seems related to the current controversy over the 48 FPS Hobbit movies, where PJ is taking a lot of grief from people who are perturbed by how said frame rate looks.

Regarding the Hobbit angle, I was under the impression that the motion compensation feature on TVs (which I have trouble even noticing, but drives several of my friends absolutely totally batshit insane) is adding extra frames to things in an attempt to make them smoother. But they don’t actually *have* any extra frames of film/video to insert, so they’re basically adding repeat/dummy frames into the show to get the frame-rate up?

In contrast, something like the Hobbit is actually filming real differentiated frames at a higher rate?

Is that true? If the former is true, why does inserting repeat frames supposedly make it look better? And if the latter is also true, would something with actual extra frames look like the Soap Opera Effect, or would it be demonstrably different because there are real frames changing there, not just duplicates inserted after the fact?

Simply repeating frames doesn’t alter the output in any way. When a TV is displaying a still image, for example, it’s really repeating that image many times a second, endlessly.

What this feature is doing is generating extra in-between frames by morphing one frame into the next, thus creating high-framerate output out of low-framerate recordings. Because these frames are created artificially, there’s a potential “uncanny valley” effect, with movement feeling false because the intermediate frames might not match real-world motion, but the time spans are so short that it’s unlikely. I think all the complaints really come down to the simple fact of the framerate being higher, as evidenced by the drama over the Hobbit’s production.
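In case it helps to picture it, here’s a minimal Python sketch of the idea. The function name and the simple cross-fade blend are mine and purely illustrative (real TVs estimate motion vectors and warp pixels along them rather than just blending), but it shows where the extra frames come from:

```python
import numpy as np

def interpolate_frames(frame_a, frame_b, num_inbetween=4):
    """Illustrative only: invent in-between frames by cross-fading two real ones.

    frame_a, frame_b: consecutive source frames as H x W x 3 uint8 arrays.
    num_inbetween=4 is what turning 24fps into 120fps would require.
    """
    a = frame_a.astype(np.float32)
    b = frame_b.astype(np.float32)
    invented = []
    for i in range(1, num_inbetween + 1):
        t = i / (num_inbetween + 1)        # blend weight between the two real frames
        invented.append(((1 - t) * a + t * b).astype(np.uint8))
    return invented
```

Every frame that comes out of that function is something the TV made up; none of them exist on the disc.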

It looked weird to me when I first switched to HDTV and Blu-Ray. I was used to watching a slightly more blurry picture and the sharper image didn’t look right. But I quickly adapted to the better picture, and now it looks normal and it’s regular DVDs or non-HD broadcasts that look “off”.

That’s funny. My friends and I always called it The ABC After School Special Effect.

The thing that bothers me about this is that, at 120fps, they ought to be able to run at the original 24fps (or 30fps) by just repeating the same frame 5 (or 4) times. Instead, your choice is either to have interpolation or to have it converted to 60fps, which 24fps doesn’t divide into evenly.
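To put some quick arithmetic behind that (using just the refresh rates already mentioned in this thread, nothing about any particular TV):

```python
# How many times each source frame must be shown to fill one second on a fixed-refresh panel.
for source_fps in (24, 30):
    for display_hz in (60, 120):
        repeats = display_hz / source_fps
        cadence = "clean repeat" if repeats == int(repeats) else "uneven, needs a 3:2-style cadence"
        print(f"{source_fps}fps on {display_hz}Hz: {repeats} repeats ({cadence})")

# 24fps on 60Hz: 2.5 repeats (uneven, needs a 3:2-style cadence)
# 24fps on 120Hz: 5.0 repeats (clean repeat)
# 30fps on 60Hz: 2.0 repeats (clean repeat)
# 30fps on 120Hz: 4.0 repeats (clean repeat)
```

So a 120Hz panel really could show 24fps material with nothing but whole-frame repeats, which is the point above.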

I also agree that higher resolution often looks worse, but I’ve found this is only true on LCDs. It looks fine on my CRT monitor.

Wow… now I know why I thought Blu-ray looked so weird the first time I watched it. Almost too real. I was watching LOTR, and kept expecting to see the boom mike guy go running by. It was disconcerting. So what do I do to my TV to make it not like that? Do I want to do that, or just get used to it?

Yes, you want to do it. As for how, it depends on your make and model. I think the link I posted earlier gives a number of different settings to look for.

Come again? :confused:

Actually, repeating frames is how anyone in NTSC-land (North America, for example) has been watching any film (and any TV series shot on film) for decades. See this section of a longer wiki article on the technique, which is called “telecine”.
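Since it came up, here’s a rough sketch of that telecine cadence (3:2 pulldown), ignoring interlaced field order and the fact that NTSC actually runs at 59.94 fields per second:

```python
def three_two_pulldown(film_frames):
    """Show each 24fps film frame for 3 video fields, then 2, alternately.

    Four film frames become ten fields, so 24 frames/sec becomes
    60 fields/sec of NTSC video with no new frames invented at all.
    """
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))   # the "3:2" cadence
    return fields

# Film frames A, B, C, D come out as the field sequence AAABBCCCDD.
print("".join(three_two_pulldown("ABCD")))   # prints AAABBCCCDD
```

That uneven 3-then-2 repetition is also where the familiar judder of film on 60Hz TVs comes from.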

The Soap Opera effect refers to the fact that television shows shot on tape (instead of film) have a higher framerate and look brighter and clearer. But since tape is cheaper and therefore used on lower-cost productions (soap operas and sitcoms, mostly), audiences associate the brighter, clearer look of high-framerate video with cheap productions and the grainier, darker look of 24fps film with better ones.

Video did have its drawbacks, but overall it had better objective fidelity. What you saw on the screen was closer to what you would see with your naked eye if you were standing on the set.

Sorry, but I don’t think this is the case, and it runs counter to everything I’ve ever heard about SD video vs film. (Let me please emphasize that I’m talking about SD “soap opera” video you’d record on analog tape; HD video is a different and more complex discussion.) Film has a much better contrast ratio than video. Because of this, video looks flat and lifeless compared to film, and that difference is immediately noticeable if you compare a shot-on-video TV series such as All In The Family to one shot on film, e.g. Cheers. I don’t know if this is because of a limitation of the storage medium/tape or a limitation in video cameras themselves. In any case, if video were higher quality and cheaper than film, why would most (or any) television series in the SD era be shot on film? My understanding is that film was used because it gave a better picture when broadcast. You could certainly make an argument that SD video preserved motion better than film – it did – but the overall image quality of film was (and some would argue still is) much better than video.

I believe it was because video recording equipment was, until the 1980s, large and bulky and required a steady power supply, whereas 16mm film gear - more than enough resolution for old-fashioned TV - could be run from batteries, or even clockwork. Even when video equipment became more practical, 16mm was still used a lot for documentaries, things like Around the World in 80 Days and The Living Planet. Super 16mm is still used for HDTV, because the resolution is more than sufficient, although it’s usually scanned and worked on digitally.

The contrast ratio difference is particularly noticeable in old episodes of Dr Who. One moment Jon Pertwee’s doing Venusian aikido on some guards and it’s just like a film!!! The next moment he walks indoors and it’s children’s television, and every time someone lights a torch the video tube overloads. Then they moved to shooting everything on video, and eventually they developed a technique that made video look like film, although some shows just shoot on film, and no doubt they’ll move to shooting digitally if they haven’t already.