Okay, that’s a lie. I understand it to some degree; I’ve done film photography (with, like, chemicals and a darkroom and everything), so at least to the extent I can remember, I know about aperture, shutter speed, etc. Enough to function and to understand how they affect the captured image. I know digital cameras, despite saving the picture to a disk/card, largely operate in a similar way (though I’m not clear on the specifics).
However, while my camera has options for shutter speed, aperture, and all that (even if I can’t remember how that screen works), my iPhone doesn’t, as far as I can see anyway. Either way, with both digital cameras and camera phones, I often have lighting problems: a scene is “obviously” light enough, but the camera recommends flash, and sure enough, without flash the photo comes out really dark and you can’t see anything in it.
My question is: obviously, if the preview is on my screen, it’s in the camera’s memory somewhere. Maybe it gets overwritten the next instant, but still, if what’s on screen looks good enough, why doesn’t there seem to be an option to just save the currently active frame buffer? Sure, hardcore photographers might get annoyed, but I imagine most people just want to take the picture they see, not that picture with weird lighting because of settings they don’t understand. Especially on a phone, where you don’t have that fine-grained control anyway. At the very least, my damned phone should be able to save what I see, with the same lighting I see on the preview screen.
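For what it’s worth, from the little I can find, the live preview frames do seem to be exposed to apps on iOS, so I assume saving one is at least mechanically possible. Here’s a rough sketch of what I mean; I’m not an iOS developer, so the class and helper names below are just mine, and it’s only meant to illustrate the idea of “save the frame the preview is already showing,” not how Apple’s camera app actually works:

```swift
import AVFoundation
import CoreImage
import UIKit

// Sketch only: keep the most recent preview frame around and save it on demand.
// Requires camera + photo-library permissions in the app's Info.plist.
final class PreviewGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let output = AVCaptureVideoDataOutput()
    private(set) var latestFrame: UIImage?   // whatever the preview last showed

    func start() {
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input), session.canAddOutput(output) else { return }
        session.addInput(input)
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "preview"))
        session.addOutput(output)
        session.startRunning()
    }

    // Called for every preview frame -- the "currently active frame buffer".
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        latestFrame = UIImage(ciImage: CIImage(cvPixelBuffer: pixelBuffer))
    }

    // "Save what I see": write the last preview frame to the photo library as-is.
    func saveCurrentFrame() {
        guard let frame = latestFrame else { return }
        UIImageWriteToSavedPhotosAlbum(frame, nil, nil, nil)
    }
}
```

So the frame I’m looking at apparently isn’t locked away anywhere; it just doesn’t get offered to me as a photo.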
So… is there any reason for this? Inertia from film cameras? Is it actually a technical hurdle? Maybe I’m just imagining the problem?