Why do my phone/digital camera pictures look different from the preview screen?

Okay, that’s a lie. I understand it to some degree; I’ve done film photography (with, like, chemicals and a darkroom and everything), so at least as far as I can remember, I know about aperture, shutter speed, etc. Enough to function and understand how they affect the captured image. I know digital cameras, despite saving the picture to a disk/card, largely operate in a similar way (though I’m not clear on the specifics).

However, while my camera has options for shutter speed, aperture, and all that (even if I can’t remember how that screen works), my iPhone doesn’t. As far as I can see, anyway. Either way, with both digital cameras and camera phones, I often have lighting problems, such as when a scene is “obviously” light enough but the camera recommends flash. And sure enough, without flash the photo comes out really dark and it’s impossible to see anything.

My question is: obviously, if it’s on my screen, it’s in the camera’s memory somewhere. Sure, maybe it’s overwritten the next instant, but still, if a picture is good enough, why doesn’t there seem to be an option to just save the currently active frame buffer? Sure, hardcore photographers might get annoyed, but I imagine that most people just want to take the picture they see, not the same picture with weird lighting because of settings they don’t understand. Especially on a phone, where you don’t have that fine a level of control anyway. I know that, at the very least, my damned phone should just save what I see, with the same lighting I see on the preview screen.

So… is there any reason for this? Inertia from film cameras? Is it actually a technical hurdle? Maybe I’m just imagining the problem?

You’re saying that the live preview on the iPhone screen looks good, but when you hit the shutter button the photo is much darker?

The preview is taken at a higher ISO and/or slower shutter speed than the final image. On a camera with a variable aperture, it’s also taken wide open. It looks fine on a small screen, but if you did capture a single frame of the preview and looked at it closely, you’d see that it’s noisy and blurry.

The final image is taken at a lower ISO to reduce noise and a faster shutter speed to reduce motion blur, both of which make it darker.
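
To put rough numbers on that (these are made-up settings for illustration, not what an iPhone actually uses), here’s a quick sketch of how much darker the final exposure can come out:

```python
import math

# Hypothetical settings for illustration only -- not actual iPhone values.
preview = {"iso": 3200, "shutter_s": 1 / 15}   # bright but noisy, blur-prone live preview
final   = {"iso": 800,  "shutter_s": 1 / 60}   # cleaner, sharper, but darker final shot

def stops_darker(a, b):
    """Stops of brightness lost going from exposure a to exposure b."""
    iso_stops = math.log2(a["iso"] / b["iso"])                  # each halving of ISO costs 1 stop
    shutter_stops = math.log2(a["shutter_s"] / b["shutter_s"])  # each halving of exposure time costs 1 stop
    return iso_stops + shutter_stops

print(stops_darker(preview, final))  # 2 + 2 = 4 stops, i.e. roughly 1/16 the brightness
```

With those numbers the final shot renders about four stops darker than the preview, which in a dim room is the difference between “looks fine” and “can’t see anything.”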

A screen shot on my desktop computer is 72 dots per inch, and it looks great. But that same 72-dot-per-inch picture out of a printer would be far from ideal, a disaster if a portion of the pic is enlarged to any extent.

I don’t know what the resolution is on a phone’s screen, but it isn’t likely to be high. In your low-light conditions, the picture may look OK on a tiny display, but no such luck with the final result; bumping up the resolution requires more light, whether for still pix or video.
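
To illustrate the small-screen point with a toy simulation (purely synthetic numbers, nothing to do with a real phone sensor): shrinking a noisy full-resolution frame down to preview size averages blocks of pixels together, and most of the noise cancels out.

```python
import numpy as np

rng = np.random.default_rng(0)

# A flat grey "scene" buried in heavy sensor noise, standing in for a noisy
# full-resolution frame. All numbers here are made up for illustration.
full = 0.5 + rng.normal(0.0, 0.2, size=(2048, 2048))
print("full-res noise (std dev):", round(full.std(), 3))

# Shrink it the way a tiny preview screen effectively does: average 8x8 blocks of pixels.
block = 8
small = full.reshape(2048 // block, block, 2048 // block, block).mean(axis=(1, 3))
print("preview-size noise (std dev):", round(small.std(), 3))  # roughly 1/8 of the original
```

So the same frame that looks clean on the little preview screen would look grainy viewed at full size or printed.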

And everything tellyworth said.