There’s another aspect to all this: sophisticated digital cameras can do distortion correction in software. The result also depends on whether the camera is full-frame or has a smaller sensor like APS-C, which effectively multiplies the focal length by 1.5.
To put some numbers on it (from a review of the Sony FE 28mm f/2 lens): when not corrected, it reaches a distortion value of −2.41% on an APS-C sensor and −4.82% on full-frame. When corrected in the JPEG output, the distortion on APS-C amounts to just −0.19%, so it is practically zero; on full-frame it comes to −0.60%.
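As a rough sanity check on those numbers, here’s a toy single-coefficient radial distortion model. The real lens profile has higher-order terms, and the coefficient below is back-fitted from the quoted full-frame figure rather than taken from any actual calibration, but it shows why a smaller sensor sees less of the same lens’s distortion: the APS-C corner sits at a smaller radius in the image circle.

```python
# Toy single-coefficient radial model: r_distorted = r * (1 + k1 * r^2),
# so the percent distortion at normalized radius r is 100 * k1 * r^2.
# k1 is illustrative, chosen to reproduce the quoted full-frame figure.

def distortion_pct(k1, r):
    return 100.0 * k1 * r * r

k1 = -4.82 / 100.0  # fitted so the full-frame corner (r = 1.0) gives -4.82%
crop = 1.5          # APS-C crop factor: corner radius shrinks by 1/1.5

ff = distortion_pct(k1, 1.0)
apsc = distortion_pct(k1, 1.0 / crop)
print(f"full-frame corner: {ff:.2f}%")   # -4.82%
print(f"APS-C corner:      {apsc:.2f}%") # about -2.14%
```

The toy model predicts roughly −2.14% at the APS-C corner, close to the measured −2.41%; the gap is the higher-order terms the single-coefficient model ignores.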
Long story short, it’s highly likely that the Earth’s curvature in that picture is genuine and not a lens distortion.
We all agree the horizon is a circle, and we all agree circles are curved. The dispute is about the area inside the circle.
“Seeing Earth’s curvature” means seeing that the area inside the horizon is convex. In your ISS pic, we can conclude the surface is convex, because we’ve seen maps of Cuba and we know it’s not that shape. But without those maps we have no way of knowing whether the circle beneath us is convex or a flat disc.
The only way to “see” the curvature of the surface is to see a line that’s supposed to be straight, and isn’t. Since the line has to be an actual visible line on the surface, it’s hard to do better than the Lake Pontchartrain causeway, whose chord-arc separation is about 1/1300 of its length – i.e., too little for the unaided eye to see from above.
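For the arithmetic behind that 1/1300 figure, here’s a quick sketch using the standard sagitta approximation for a shallow circular arc; the causeway length and Earth radius are round assumed values:

```python
R = 6_371_000.0  # assumed mean Earth radius, metres
L = 38_400.0     # approximate causeway length, metres

# Sagitta (maximum chord-to-arc separation) of a shallow arc: s ~ L^2 / (8R)
s = L * L / (8.0 * R)
print(f"sagitta: {s:.1f} m, ratio 1/{L / s:.0f}")  # about 28.9 m, roughly 1/1327
```

About 29 metres of bow over 38 km, consistent with the quoted 1/1300 – far below what an eye can pick out against an irregular shoreline.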
By the way: I doubt the 28 mm lens distorted the ISS pic much – no doubt it represents the scene well.
No, they can’t. What they can do is to convert some kinds of distortion into other kinds of distortion. Sometimes, in some contexts, the new kind of distortion is more tolerable than the previous kind. But it’s still distorted.
That sounds to me like the most nitpicky kind of nitpickery. I quoted specific numbers on distortion correction for the Sony 28mm lens for different sensor sizes, based on reference to a test pattern of horizontal and vertical lines. If there is perceptible curvature to the lines, it’s exhibiting typical wide-angle distortion, and if there is no perceptible curvature, it isn’t! Granted, the remaining distortion isn’t zero, but it’s close enough. There’s nothing mysterious about software correcting an artifact of a particular lens. After all, correcting shape, colour, and lighting has been the essential purpose of Photoshop for decades.
As a side note, the 28mm lens is quite popular with owners of digital cameras that have smaller sensors like APS-C, because the focal-length equivalence factor makes it effectively a 42mm lens on most brands and a 45mm on Canon bodies, which is very close to the general-purpose 50mm lens traditionally used on 35mm cameras.
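The equivalence is just multiplication by the crop factor; a trivial sketch, assuming the usual factors of 1.5 for most APS-C sensors and 1.6 for Canon’s:

```python
def equivalent_focal_length(focal_mm, crop_factor):
    """35mm-equivalent focal length of a lens mounted on a smaller sensor."""
    return focal_mm * crop_factor

print(equivalent_focal_length(28, 1.5))  # 42.0 - most APS-C cameras
print(equivalent_focal_length(28, 1.6))  # ~44.8 - Canon APS-C, i.e. about 45mm
```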
Needless to say, the Nikon Z9 used by the astronauts on the ISS is a high-end professional camera with a full-frame sensor.
It’s not a matter of the correction not quite getting rid of the distortion. Again, it’s not possible to get rid of the distortion at all in a flat image. All you can do is convert it to other sorts of distortion. If a software package claims to be removing over 90% of the distortion, it’s simply lying. What we see is spherical, and so nothing flat can be an accurate depiction of what we see.
In the limiting case of a circle at infinity (imagine standing on an infinite flat plane), the curvature would be zero.
Let us, for simplicity, consider a single eye. What it sees is some sort of projection, and the lens and retina introduce distortion and other aberrations. I have no trouble imagining a high-end optical instrument seeing much better than I do.
The optical instrument can certainly capture data better than the eye, and might be able to interpret it better. What it can’t do is represent it faithfully on a flat image. You could certainly make a camera device that could accurately measure the angular extent of a visible disk, but an image wouldn’t be an appropriate output format for that measurement.