I am curious about whether the brain provides some post-processing of images to correct for perspective.
I was taking a photo of my car Sunday to put it up for sale. I set the zoom lens focal length down to 18mm and got very close to the car. But the result was too distorted (straight lines near the camera appeared curved) to use for an ad, so I backed off until I could use 50mm. Photogs are very familiar with this phenomenon. Most casual photographers mistakenly think that distortion is caused by the lens itself, but it is actually caused by the photographer’s position and the resulting perspective. A popular photography mag (maybe either Pop Photo or Peterson’s) ran an article on this years ago, with a clever demo showing two photos taken from the same position, one with a long lens and one with a wide angle. A small square showing the same part of the scene was selected from each photo and blown up (the one from the wide angle more so) until they were the same size, and the perspective and distortion (or lack thereof) were exactly the same in both.
However, the fact remains that when I was crouched at the front bumper of my car with an 18mm lens, lines looked curved, and when I looked over the top of my camera, they most decidedly did not look curved. But I am effectively comparing an 18mm lens to the equivalent of a 50mm lens.
When you are close to an object, does the brain somehow reinterpret the image to make it fit what it thinks must really be there? Another good example is standing at the foot of a tall building. As you first face the entrance looking forward and then lift your head to look up and finally see the top, your perspective changes as you start by looking at something that’s close, and gradually look farther and farther away.
But all the lines look straight. Yet when I took a series of pictures of a tall structure and merged them together, the result had lines that bow and narrow toward the top.
Is human perception of dynamically changing perspective filtered by the brain, or is there some optics phenomenon I am not considering?
Your camera lens faithfully records the light entering it and going onto its film/CCD/whatever. Your eye’s lens does a good job of imaging onto your retina. But bear in mind that your camera is projecting that scene onto a flat surface, while your eye is projecting that image onto the very curved retina on the interior of your eyeball. A flat plane doesn’t map exactly onto a sphere. Read up on flat maps and globes.
To top it off, when you look at a photo of the object your eye will faithfully replay what’s on the photo, and won’t try to somehow filter it so that it looks as the scene would to your eye (which is probably why you saw curved lines on your camera that looked straight to your eye).
The camera optics and your eye’s optics are optimized for somewhat different situations. It’s not surprising they get different results.
I’m not sure what you mean by “distortion”. There’s a technical effect called “distortion” that is a defect of the lens. Pictures taken with lenses having differing distortion of this sort will look different. Can you get a cite for that magazine article? It sounds interesting, but I’d have to read it to see exactly what they’re doing…
(The brain will do some weird processing of images – look up Mach Bands and the Modulation Transfer Function – MTF – of the Eye, but they’re not the sort of thing you’re describing.)
I’m not talking about defects or aberrations, but what the layman would call distortion. The extreme example would be a fish-eye lens. In this photo the edges of the room, desk, and tiles on the floor are all in reality straight, but the extreme wide-angle view of this space is “distorted” so those lines look curved. If you made the equivalent photo by pasting together many photos taken with a 50mm lens, lines would also look curved. But when your brain takes a series of “pictures” by turning your head from side to side and up and down, the resulting model your brain builds from a composite of those views is one of straight lines.
It was years ago and I don’t have it anymore but I’ll see if there’s an online archive. Meantime here’s a concrete description.
Walk down the street from your house and take a photo of it with an 18mm lens. Looking through the camera, farther objects appear really far, and straight lines on closer objects appear curved. Now take out a 200mm lens and take a photo of your house. Now you don’t even see close objects, but the *really* far objects do not seem much farther than the far objects.
Now take the two photos and in each one, find a window or some other feature of your house. Crop down to that feature and blow each one up so that they are the same size (the 18mm version will need more blowing up). All of what you perceived as distortion and flattening goes away, because in the end these are two photos of the same object taken from the same distance and same perspective.
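The crop-and-enlarge demo above follows directly from the pinhole projection model: image coordinates are proportional to focal length, so changing lenses from a fixed position only scales the whole image uniformly. A minimal sketch (the scene points and focal lengths below are illustrative values, not from the article):

```python
# Pinhole camera at the origin: a 3D point (x, y, z) projects to
# image coordinates (f*x/z, f*y/z). Perspective lives entirely in
# the x/z and y/z ratios, which depend only on camera position.

def project(point, f):
    """Project a 3D point onto the image plane at focal length f (mm)."""
    x, y, z = point
    return (f * x / z, f * y / z)

# Arbitrary scene points (units of distance in front of the camera).
scene = [(1.0, 2.0, 10.0), (-3.0, 0.5, 25.0), (0.2, -1.0, 4.0)]

wide = [project(p, 18.0) for p in scene]   # 18 mm lens
tele = [project(p, 200.0) for p in scene]  # 200 mm lens, same position

# Blowing up the 18 mm image by 200/18 reproduces the 200 mm image
# exactly: same perspective, just a smaller crop of it.
scale = 200.0 / 18.0
for (wx, wy), (tx, ty) in zip(wide, tele):
    assert abs(wx * scale - tx) < 1e-9
    assert abs(wy * scale - ty) < 1e-9
print("enlarged wide-angle crop matches the telephoto image")
```

Real lenses add their own small distortions on top of this ideal model, but the perspective relationship between objects is set the moment you pick where to stand.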
I’m not sure what you mean by this. If you took pictures with a 50 mm lens, I’ll bet the lines would look straight on each individual photo, just as they look straight to the eye. It’s not fair to say that a bunch of such photos stuck together look curved, because you can’t do the same thing with your eye. You remember them and interpret them as straight, but so do the individual 50 mm photos. So where’s the contradiction? The fisheye lens certainly doesn’t faithfully reproduce geometries, but we know that already – it’s like the problem of mapping a spherical globe onto a plane. There’s a bit of difference between your curved eyeball and a flat-field “normal” lens, too, but it’s not very significant in most cases.
I think in this case you are talking about distortion caused by the lens, characteristic of fisheye lenses. There are other wide-angle lenses designed to image straight lines as straight lines, called rectilinear lenses.
Dang, you learn something new every day. I was not familiar with this type of lens.
I’ll bet that pros use these to photograph the inside of homes for sale.
The rectilinear room photo in that link reminds me of the perspective of the digitally created views in Riven, the way you see this angular distortion near the lens (edges of the photo).
I never really got over this so I hope this zombie will be excused.
OK. Last summer I stood at the foot of the Arc de Triomphe and took several pictures to make a composite. Each fragment shows straight lines, but the overall picture looks distorted. However, my eyes stood in the same position with roughly equivalent focal length and presented my brain with effectively the same series of pictures. My brain didn’t think, “Hey, this looks curvy as my eyes move up.” The subjective experience was that the structure went straight up.
Suppose you stand on a railroad track and look down. (Please do not really go out and stand on a railroad track.) You glance to the left and see the tracks converge towards the vanishing point, and glance to the right and see the same. As you look straight down, that’s where the track is fattest, due to perspective. But as you turn and walk away, you are not left with a subjective impression of a track that is fat in the middle and curves together as it goes away in either direction; you remember two straight, parallel rails.
So I am concluding that the question I brought up really has nothing to do with optics–it’s a cognitive issue about how the brain is able to take perspective into account when interpreting what it sees. Any cognitive psychologists out there?
It’s partly cognition, e.g., you know a line is straight so you see it as straight even though the image is curved on your retina. And it’s partly projective geometry.
In this Flickr group there are a lot of photographs stitched together, including some 360 degree panoramas. There has to be some kind of distortion when you do that, because you are projecting a reality that is effectively the inside surface of a sphere onto a flat surface. Those are extreme cases, but even in less extreme cases, such as 180 degree panoramas, you have to make a choice between different kinds of distortion.
It’s the same kind of choices that map makers have when they project the surface of the Earth (which is nearly a sphere) onto a flat surface. For a relatively small area, you can ignore the distortion, but when you are mapping continents, you have to choose which projection to use.
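You can see why stitching must bend straight lines with a little arithmetic. A common way to stitch a panorama is to map the scene onto a cylinder (pan angle horizontally, projected height vertically). A sketch, with arbitrary illustrative numbers for the edge being photographed:

```python
import math

# A straight horizontal edge in front of the camera: points (x, y0, z0)
# with constant height y0 and constant distance z0. On any single
# rectilinear frame it images as a straight line; on a cylindrical
# panorama the same edge comes out curved.

y0, z0 = 3.0, 5.0  # illustrative: edge 3 units up, 5 units away

def cylindrical(x):
    """Map the point (x, y0, z0) to cylindrical panorama coordinates."""
    theta = math.atan2(x, z0)        # pan angle around the vertical axis
    v = y0 / math.hypot(x, z0)       # projected height on the cylinder
    return theta, v

# Sample the edge at several x positions and look at its projected height.
heights = [cylindrical(x)[1] for x in (-8.0, -4.0, 0.0, 4.0, 8.0)]
print([round(v, 3) for v in heights])

# The ends sit lower than the middle: the straight edge bows on the
# stitched panorama, exactly the sagging/narrowing seen in composites.
assert heights[0] < heights[2] and heights[4] < heights[2]
```

This is the same trade-off as in cartography: the cylindrical choice keeps vertical lines straight but bends horizontal ones, and other panorama projections just move the bending somewhere else.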
“Rectilinear” lenses are the ordinary lenses we usually use on cameras, including very wide angle ones. If they work properly then any straight line in the scene becomes a straight line on the film or CCD. If this doesn’t work exactly right you get pincushion or barrel distortion. Fisheye lenses are very wide angle lenses that don’t even try to be rectilinear.
It doesn’t matter that our eyes are projecting the world onto a curved retina. For one thing, our retina isn’t curved around the point the light rays all pass through - it’s curved more tightly than that. But most important, the eye and brain system is an evolved system for evaluating and modeling the world in front of us. We don’t really appreciate the image that forms anyway. Straight lines, for example, are recognized by the retina and are communicated to the brain as straight lines on the optic nerve. The brain doesn’t get to sample the image itself.
The issue that I was really trying to get at does not at all involve lens geometry or curvature of the retina, but simply perspective. Your brain seems to understand perspective such that you perceive a series of views from the eye as a 3D model of straight lines that intersect at right angles.
The human perceptual system is definitely capable of adapting to things like distortion - My experience is going to be my cite here (so sue me).
Whenever I get new spectacles, there is always a period during which I not only notice the chromatic aberration towards the edges, but also, moving my head seems to cause the visual field to move at a subtly wrong speed - like when you look through a fisheye lens. Furthermore, lines known to be straight can be perceived as curved when viewed off-centre through the new lenses.
After a week, the effects are not only no longer noticeable - they’re actually gone - chromatic aberration must still be there, but I can’t see it, the visual image must still be moving across my retina in the same way when I move my head, but it looks normal, and the straight lines look perfectly straight, no matter how hard I study them.
I believe there have been experiments where subjects wore goggles that made everything look upside-down - after a couple of weeks, everything just seemed ‘normal’ to them again and when the devices were removed, they were just as disoriented as when they first put them on. Human perception is a really weird thing.