3D - why is it ultimately unconvincing?

The main factors already listed are a lack of information and the image coming from just two light sources. But there is also the factor of your familiarity with 2D images on TV and at the movies. These aren’t very good either, but your brain adapts easily because of the consistency of the image. With 3D you get more cues that tell your brain the image is not right.

For me, it’s that the images do not have “sides”. They are not true holograms, so when you look at an angle you don’t get any new information.
Let me put it this way: if there were a piano on a stage, a guy sitting on the left side of the theatre would see a different image than a guy on the right. In a 3D movie, we all see the same image.

Yes. This, and the lack of motion parallax, is tolerable in cinemas, because individual viewers don’t move around much compared to the distance to the screen. At home in front of a stereo (“3D”) HDTV it’s probably a different story: you’re getting separate images for each eye, but nothing else.

That is a good point. Unless you are sitting right in the middle of the auditorium, even the aspect of 3D experience due to binocular disparity (the only aspect of 3D experience actually simulated by current technologies) is not going to be quite right.

Even in cinemas, people frequently move their heads distances equal to or greater than the distance between their eyes. Thus there ought to be (but, of course, there is not) at least as much 3D information available to them from motion parallax as comes from binocular disparity. In other words, no, the lack of motion parallax in a cinema setting is not negligible. (I do not know whether the issue raised by Ají de Gallina is a negligible or a significant factor in a cinema setting.)

I didn’t say it was negligible; I said it was tolerable. :slight_smile:

Due to strabismus, anisometropia, and consequent amblyopia, I am not able to perceive depth in the way other people can.

All three-D effects are therefore lost on me, from the Magic Eye to Viewmasters to old or new generation 3D movies, and I really hope they hurry up and give up so I can once again see movies I can actually see unencumbered.

Fortunately, the kinds of movies that I like (largely period pieces) are rarely shot in 3D.

How high does the framerate go? Is there a way to demonstrate the difference between the two framerates on my laptop’s LCD monitor?

Quoth njtt:

Depending on the subject, there may not even be such a thing as a “real scene” to use, meaning it’d have to be models. Many Viewmaster slides, for instance, feature cartoon characters.

Well, OK, but I can definitely recall noticing a distinct “cardboard cutout” effect in Viewmaster slides of natural scenes: rocks and trees and people and such. There is not much puzzle as to why 3D pictures of actual cutouts should look like cutouts, but it is not so clear why 3D Viewmaster pictures of real, ‘rounded’ objects should look like cutouts. That was the question I was trying to address (I do not know the answer), and in this context it is a red herring to say “they might really have been pictures of cutouts.” They were not.

I don’t distinctly recall ever viewing cartoons in the Viewmaster (it was many years ago), but if I had, I do not think I would have been struck by the flatness of the objects or characters. It is what I would have expected. However, I can clearly remember, as a child, being distinctly struck by the weird flatness of natural objects that stood out from the background but did not look ‘rounded.’ In my rather limited experience of 3D movies I have not particularly noticed this, so it may be something peculiar to Viewmaster (although ticker seems to be saying that he/she has noticed it in 3D movies too).

Some friends of mine got a new Samsung TV and complained that movies “looked like documentaries”. I checked it out and turned off the smooth motion feature and they were happy.

My TV goes up to 240 Hz, but the effect is visible at lower frequencies.

Your laptop runs at 60 Hz, which is in fact enough to demonstrate the effect, but you need the right source material. You would want a “p60” video (progressive-scan, 60 Hz), and compare that with a p24 version of the same video (likely shown by repeating each frame for 3 refresh intervals, then 2, alternately, i.e. the classic 3:2 pulldown). Unfortunately, I can’t point you to any examples off the top of my head.
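If you want to see the pattern concretely, here’s a rough Python sketch (nothing official, just illustrative; the frame labels and counts are made up) of how 24 fps material gets mapped onto a 60 Hz display:

```python
# Sketch: 3:2 pulldown, i.e. showing 24 fps film on a 60 Hz display by repeating
# each frame for 3 refresh intervals, then 2, alternately. Motion therefore
# advances unevenly (judder) compared with native 60 fps material.

def pulldown_3_2(film_frames):
    """Map film frames to display refreshes in a 3,2,3,2,... pattern."""
    displayed = []
    for i, frame in enumerate(film_frames):
        displayed.extend([frame] * (3 if i % 2 == 0 else 2))
    return displayed

film = [f"F{i}" for i in range(24)]      # one second of 24 fps film
shown = pulldown_3_2(film)               # 60 refreshes, only 24 distinct images

print(len(shown))    # 60
print(shown[:10])    # ['F0', 'F0', 'F0', 'F1', 'F1', 'F2', 'F2', 'F2', 'F3', 'F3']
```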

The new HDTVs get around this by performing their own motion estimation. This technique breaks down at times (if you have moving objects behind some static text, for instance), but even when working perfectly, it looks strange.

You might just stop by a Best Buy or some such and check out their HDTV section. Look for the TVs that advertise 120 Hz or better, and have a movie playing. Then see what happens when you toggle the smoothing setting (called Auto Motion Plus on Samsung sets; I’m not sure about the others).

Yep–exactly my situation, except that it was my own Samsung set. I don’t mind it for some types of content. But for movies it looks very strange.

It’s too bad in a way, since an increased frame rate would yield an improvement in quality, but the association is too strong in our brains at this point. Maybe if moviemakers increased the rate by 1 fps each year, we’d eventually have smoother movies :).

Whether the current artifacts in 3D movies will follow the same path depends on when (if ever) they really hit the tipping point in availability, I think. There will come a point where the technology curve hits a plateau, and there’s enough time for everyone to get used to the artifacts of that generation. Some time later, there will be improvements, but no one will want them because we’re all used to the old way.

Is this a woosh? Are there any films that you’ve encountered which had a 3D version, but not a 2D version?

I suspect the cardboard cutout problem comes from another problem with the geometry of a photographed image. I have a feeling that the disparity in view angle between the original scene and the projected image (and the inherent magnification of the image) creates a situation where the effective eye separation is much too small for rendering subcomponents in 3D.

In principle, when a 3D movie is shot, the separation between the cameras should be linked to the effective view angle of the subject (which is thus linked to the focal length of the lens in use). They do do this a bit, since this is the basis of creating forced depth. But it becomes a difficult problem when you also have to think about the viewing angle presented in the cinema. The problem with any photograph is that it has only one correct viewing distance: the distance at which the image subtends the same angle to your eye as the original scene did to the camera. (It is worthwhile experimenting with this to prove to yourself how much this affects our sense of realism in simple still photographs. You have to use one eye, and typically get quite close to the photograph, but there is a magic distance where the depth snaps in. Nothing like stereoscopic 3D, but remarkable if you have not experienced it. Landscapes are typically taken with medium wide-angle lenses, so the correct viewing distance puts your eyeball at a distance from the photograph similar to, or closer than, the width of the photograph.)
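To put a number on that, here is a quick back-of-the-envelope sketch; the lens, frame and print sizes are just example values I picked, not anything from a real production:

```python
# Sketch: the "correct" viewing distance is where the print subtends the same
# angle at your eye as the scene subtended at the camera, which works out to
# viewing_distance = focal_length * (print_width / sensor_width).

def correct_viewing_distance(focal_length_mm, sensor_width_mm, print_width_mm):
    """Distance at which the print subtends the camera's original field of view."""
    return focal_length_mm * (print_width_mm / sensor_width_mm)

# A landscape shot on a 36 mm wide frame with a 28 mm wide-angle lens,
# printed 300 mm wide:
d = correct_viewing_distance(28, 36, 300)
print(f"{d:.0f} mm")   # ~233 mm: your eye ends up closer to the print than the
                       # print is wide, matching the point about landscapes above.
```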

OK, in a cinema, movies are shot with all manner of focal lengths, zoom lenses are used, and the distance to the screen can vary by as much as 10:1 across the customers. You can pretty much assume that you almost never get to sit at the correct distance for a scene. That is bad enough for 2D movies. In 3D you get all of this, and you start to see the entire scene’s forced 3D depth with what amounts to the wrong eye spacing, and with the image subtending the wrong angle for the lens used. These issues may mean that the components of the scene simply don’t have the right internal depth geometry and so appear flat. The overall forced 3D depth places them at a depth in the scene, but that is all it can manage.
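One way to see how that flattening could happen: the disparity that places an object at a depth and the disparity difference across the object’s own thickness scale very differently. A quick sketch, using the usual ~65 mm eye spacing, a simple small-angle approximation, and otherwise made-up numbers:

```python
import math

# Sketch: angular disparity of a point is roughly eye_separation / distance.
# The disparity that *places* an object at a depth is much larger than the
# disparity *difference* across the object's own thickness, and it is the
# latter that makes it look rounded rather than flat. Numbers are illustrative.

EYE_SEP_M = 0.065  # typical interocular separation in metres

def disparity_deg(distance_m):
    """Approximate angular disparity, in degrees, of a point at this distance."""
    return math.degrees(EYE_SEP_M / distance_m)

# An actor 10 m away whose body is about 0.3 m deep front-to-back:
front = disparity_deg(10.0)
back = disparity_deg(10.3)
print(f"placement: {front:.3f} deg, roundedness: {front - back:.4f} deg")
# The placement signal survives wrong eye spacing and wrong viewing angle far
# better than the tiny roundedness signal does, so the object can still sit at
# a depth in the scene while reading as a flat cutout.
```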

It does vary from movie to movie. Avatar, for instance, spent the money to do it right, while the execrable The Last Airbender did it the cheap way with all of the elements flat. So it could just be that you’ve seen different movies.

In fact I’ve never seen “flattened” 3-D in movies since the “new” ones have come out (except where it was an artistic affectation). Nor have I seen the opposite: stuff that just jumps out at you to say “hey, I’m 3-D.” With one or two exceptions, the shots that do jump out at you have exact analogues in 2-D movies, so if you think those scenes are “gimmicky” you must think those 2-D movies are gimmicky as well. Avatar (the one with the blue people) had a lot of examples of this: since sci-fi movies tend to attempt stunning explorations of the Z-dimension anyway, it is natural that a lot of shots that happen all the time in 2-D sci-fi movies would seem to be done just for the 3-D effect.

I don’t see a whole lot of movies (to begin with), but it seems to me there were at least a couple of movies I’ve been to see with friends where for one reason or another we didn’t have the choice between 3D and 2D. Of course, I admit there are really only two English-language first-run theatres where we go, since all the others are in the back of beyond and we’re going by transit, so perhaps they are available elsewhere and I just haven’t noticed.

But if everyone else in the group wants to see the 3D version, I guess I’ll be sitting there watching a blurry-ass movie I can’t watch with both my eyes.

I’ve never seen the problems the OP refers to – 3D always looks good to me. Even the crappy 3D applied after-the-fact for Clash of the Titans, for all its faults, doesn’t look like “cardboard cutouts” to me.

Viewmaster used to make clay models of cartoon characters to photograph to give them a 3D appearance back in the pre-CGI days. (For over a decade they’ve been using computers to give cartoon characters a 3D effect.) I can’t recall any cases where they stuck in “cardboard cutouts”. One place that did have the “cardboard cutout” effect was in 3D comic books, as a byproduct of how they were created. But, even in the 1950s, there were some 3D comics that went out of their way to avoid that “cardboard cutout” effect.

Murch writes:

Actually, try looking at scenes 60 feet and 120 feet away. The further away, the less parallax, and the smaller the 3D effect. I don’t think this is the issue he claims. 3D is most impressive and visible for different planes relatively close to the viewer, for which a lot of excessive deep focusing isn’t necessary. It’s an abysmal movie, but Lookin’ at ya! actually handled this very well. So did the much better Creature from the Black Lagoon: lotsa items in different planes relatively close by.

I noticed it in the crappy 3D applied after-the-fact sequences in the most recent Superman movie. They had a definite Viewmaster (or paper tole) feel to them.

One thing about the 120 Hz or greater displays: they very often actually take the 24 Hz source, scale it up to 60 Hz, and then apply the effects to it. Since 60 is not a multiple of 24, not all frames are shown for the same amount of time, or, more likely, two frames are blurred into one another.

Combined with the fact that you are trusting a computer to make the new frames rather than actually having the stuff filmed at a higher rate, it’s no wonder it looks fake.
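For what it’s worth, the crudest fallback when the motion estimation fails is just blending the two neighbouring frames, which is roughly what that smearing amounts to. A toy sketch, with made-up pixel values:

```python
# Sketch: the crudest "new" frame is just an average of the two neighbouring
# film frames. Real sets do motion-compensated interpolation, but this toy
# version shows why a plain blend reads as a double image rather than genuine
# extra motion.

def blend_frames(frame_a, frame_b, weight=0.5):
    """Blend two frames pixel by pixel."""
    return [a * (1 - weight) + b * weight for a, b in zip(frame_a, frame_b)]

# A bright object one pixel wide moving one pixel to the right between frames:
frame_a = [0, 255, 0, 0]
frame_b = [0, 0, 255, 0]
print(blend_frames(frame_a, frame_b))  # [0.0, 127.5, 127.5, 0.0]: a ghosted smear
```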

Still, I agree that there is some “getting used to it” effect. I noticed that old colorized movies leave dark portions in black and white, which is actually more realistic than the commonly used blue filter. In low light, things do appear in shades of gray to the human eye. And yet it looks horribly wrong on film.