Why, when photos or film footage are shown from space, do we see no stars in the background?
(The same thing seems to happen when film footage is taken at night; movies and TV shows with night scenes all seem to have somewhat poorly done fake night skies.)
I tried to find out if this topic has already been covered by our all-knowing benefactor, but the search engine turned up nothing.
Stars are faint. The methods we use to record light (film, video) are activated by light, and since stars do not provide very much of it, they often fail to register on the film or the videotape. Recording a star on standard still-camera film takes almost 5 seconds at f/1.8 on 200 ASA film. Any camera recording motion is going to keep exposures much shorter than 5 seconds (although it can use much faster film than ASA 200, of course). Usually, if the film or video were exposed to the light from the stars long enough to record them, any other object in the picture would be over-exposed and would appear as a washed-out white area.
Generally, if you want pictures of stars or planets, you need to have nothing else in the picture but stars and planets, then use a fast enough film (or a slow enough shutter speed) to record the faint light from the stars.
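To put some rough numbers on that, here is a quick back-of-the-envelope sketch (Python, purely illustrative) that scales the 5-seconds-at-f/1.8-on-ASA-200 figure quoted above to other apertures and film speeds, using the usual reciprocity rule: required time goes up with the square of the f-number and down with the film speed. Real film also suffers reciprocity failure at exposures this long, so treat the outputs as ballpark figures only.

```python
# Scaling the "5 seconds at f/1.8 on ASA 200" figure to other apertures and
# film speeds with the simple reciprocity rule. The baseline comes from the
# post above; everything else is illustrative.

BASE_TIME_S = 5.0
BASE_FSTOP = 1.8
BASE_ISO = 200

def star_exposure_time(fstop, iso):
    """Exposure time giving the same faint-star exposure as the baseline."""
    return BASE_TIME_S * (fstop / BASE_FSTOP) ** 2 * (BASE_ISO / iso)

for fstop, iso in [(1.8, 200), (2.8, 200), (4.0, 800), (5.6, 1600)]:
    print(f"f/{fstop}, ISO {iso}: about {star_exposure_time(fstop, iso):.1f} s")
```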
I thought that was supposed to be one of the “painstaking” details of the movie Titanic. Didn’t they reproduce the night sky? Except that the night was moonless, so the ocean would have been virtually pitch black. That’s one reason the boats didn’t go searching for survivors: they wouldn’t have been able to see them.
Film speed is secondary, folks! You have to understand FOCAL LENGTH and DEPTH OF FIELD. I won’t give a lecture on basic photography, but I will say that these two parameters play a vital role in your pictures. Ever wonder why someone in the background might come out blurry while the foreground is sharp? Your aperture setting determines depth of field, and your focal length determines the size of your image.
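For the curious, here is a rough sketch of the standard hyperfocal-distance formula, which is one way to see how aperture (together with focal length) governs how much of a scene comes out sharp. The 0.03 mm circle of confusion is just a commonly assumed value for 35mm film, not anything from the posts above.

```python
# Standard hyperfocal-distance formula: focus at this distance and everything
# from half of it out to infinity is acceptably sharp. The 0.03 mm circle of
# confusion is a commonly assumed value for 35mm film; treat it as approximate.

def hyperfocal_m(focal_length_mm, fstop, coc_mm=0.03):
    h_mm = focal_length_mm ** 2 / (fstop * coc_mm) + focal_length_mm
    return h_mm / 1000.0

for focal, fstop in [(50, 1.8), (50, 8), (200, 8)]:
    print(f"{focal}mm at f/{fstop}: hyperfocal distance ~ {hyperfocal_m(focal, fstop):.1f} m")
```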
If you want to photograph the moon, you need FOCAL LENGTH to get an image of decent size. Capturing the stars on film is another lecture.
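To put a number on the focal length point: the Moon subtends roughly half a degree, so by the small-angle approximation its image on the film is about the focal length times that angle in radians. A quick sketch with approximate figures:

```python
import math

# The Moon subtends roughly 0.5 degrees, so (small-angle approximation) its
# image on the film is about focal_length * angle_in_radians. That is why a
# long focal length is needed for a decent-sized Moon. Figures are approximate.

MOON_ANGLE_DEG = 0.52

def moon_image_mm(focal_length_mm):
    return focal_length_mm * math.radians(MOON_ANGLE_DEG)

for focal in [50, 200, 500, 2000]:
    print(f"{focal}mm lens: Moon image ~ {moon_image_mm(focal):.1f} mm "
          f"(a 35mm frame is 24 x 36 mm)")
```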
In short, if the pros aren’t “shooting for the moon or stars”, then the camera will not capture the moon or stars in the background.
Your eye, however, has an amazing range in all of the factors mentioned above. So the camera doesn’t see nearly all that meets the eye!
Hey, by the way, do you really think the creators of “Titanic” bothered to research the phase of the moon, its position in the sky, the times of moonrise and moonset, etc. to make their movie “realistic”? C’mon!
Can you say “acting”? I knew you could!
(Hey, I was cheering for the rats, anyway!)
In answer to the space question, I recall hearing that the reason the lunar pictures of Earth don’t show stars is that the sun’s light reflected off our planet is too bright and obscures the stars “surrounding” it. It’s basically the same reason you don’t see as many stars near a full moon, only compounded by the size of the Earth versus the size of the Moon.
“I guess one person can make a difference, although most of the time they probably shouldn’t.”
Actually, if they’re going to show any stars, I would hope they researched it enough to show the correct sky. To anyone who has done any stargazing, seeing Orion in a summer sky or the Big Dipper over an African sky would be very distracting - not to mention a full moon rising at midnight. On the other hand, I don’t think I’d be bothered by a completely random star field; only an obviously incorrect one would bother me.
As for the original question, think about this - if you opened the curtains in your house tonight and looked out without turning off the lights, would you see any stars? No, because the objects in your room are much brighter. And the Earth and maybe parts of your own spacecraft are lit by direct sunlight most of the time, and are even brighter.
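If you want a rough sense of the gap, you can compare a sunlit subject (using the sunny-16 rule, which is my assumption here) with the 5-seconds-at-f/1.8-on-ASA-200 star figure quoted earlier, both expressed as exposure values at ISO 200 so the difference comes out in stops:

```python
import math

# Comparing a sunlit subject with the star exposure quoted earlier, both as
# exposure values (EV = log2(N^2 / t)) at ISO 200 so the gap comes out in stops.
# The sunny-16 rule (f/16 at 1/ISO seconds in direct sun) is an assumption here;
# the 5 s at f/1.8 on ISO 200 figure is the one from the earlier post.

def ev(fstop, shutter_s):
    return math.log2(fstop ** 2 / shutter_s)

ev_sunlit = ev(16, 1 / 200)   # sunny 16 at ISO 200
ev_stars = ev(1.8, 5)         # the quoted star exposure

stops = ev_sunlit - ev_stars
print(f"Sunlit subject: EV {ev_sunlit:.1f}, stars: EV {ev_stars:.1f}")
print(f"Gap: about {stops:.0f} stops, roughly {2 ** stops:,.0f} times brighter")
```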
Handy, about depth of field: my answer was a general one, and it depends on the nature of the photo one is trying to compose. In general, many people think that just because you are shooting a picture at night, the stars should just be there in the background naturally. There’s a reason the background is lost - depth of field. (Think of a picture of a friend in the foreground with a rising moon in the background.)
As I mentioned, I wasn’t going to give a lecture on astrophotography. The MAIN point was that film speed is NOT the limiting factor here.
(I suppose that when a lens is focused at “infinity”… you expect to see what “infinity” looks like?)
By the way, Scr4, the asterism known as the “Big Dipper” IS visible over many parts of Africa. At the equator, for example, all stars from +90 to -90 declination are visible. This means that Polaris, the North Star, at just shy of +90, would lie right on the northern horizon if one were standing at the equator.
In general, your northern latitude minus 90 degrees gives the southernmost declination you can still see from where you stand; the same idea applies below the equator.
The bowl stars are at roughly +50 declination, so they can be seen from latitudes as far south as about -40 degrees.
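That rule is easy to check: a star’s maximum altitude above the horizon works out to 90 degrees minus the absolute difference between your latitude and its declination, so it rises at all only when that difference is no more than 90 degrees. A small sketch with rounded declinations, ignoring refraction and horizon haze:

```python
# A star's maximum altitude above the horizon is 90 - |latitude - declination|
# degrees, so it clears (or at the limit just grazes) the horizon whenever that
# difference is at most 90. Declinations are rounded; refraction is ignored.

def max_altitude_deg(latitude_deg, declination_deg):
    return 90.0 - abs(latitude_deg - declination_deg)

def ever_rises(latitude_deg, declination_deg):
    return max_altitude_deg(latitude_deg, declination_deg) >= 0

targets = {"Polaris (dec ~ +89)": 89, "Dipper bowl (dec ~ +50)": 50, "Orion's belt (dec ~ 0)": 0}
for name, dec in targets.items():
    for lat in (40, 0, -40):
        alt = max_altitude_deg(lat, dec)
        status = "rises" if ever_rises(lat, dec) else "never rises"
        print(f"{name} from latitude {lat:+}: {status}, max altitude {alt:.0f} deg")
```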
Didn’t Titanic go down around midnight? A waning moon that was only 11% illuminated would not be due to rise for a few hours, so it would be more than “practically” moonless.
One thing being overlooked is that the exposure latitude of film and video is far less than that of the human eye. When the contrast of a photo looks correct, the highlights and shadows fall outside the range that holds detail. If the contrast is lowered enough to capture the shadow and highlight detail, the photo looks like crap.
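Here is a toy illustration of that latitude point, assuming round numbers: a scene whose tones span about 14 stops and a recording medium that holds only about 7 stops centered on middle grey. These are illustrative figures, not measurements of any particular film or video format; everything outside the window clips to solid black or blown-out white.

```python
# Toy illustration: a scene spanning about 14 stops recorded on a medium that
# holds only about 7 stops centered on middle grey. Tones outside that window
# clip to solid black or blown-out white. Numbers are illustrative only.

MEDIUM_LATITUDE_STOPS = 7.0

def record(scene_stops_from_middle_grey):
    half = MEDIUM_LATITUDE_STOPS / 2
    return max(-half, min(half, scene_stops_from_middle_grey))

for tone in (-7, -5, -2, 0, 2, 5, 7):   # scene tones, in stops from middle grey
    rec = record(tone)
    note = "kept" if rec == tone else "clipped"
    print(f"scene {tone:+} stops -> recorded {rec:+.1f} stops ({note})")
```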