Although I'm an amateur astronomer, this was my first total solar eclipse. I always thought the sky would be black as night at totality, but it isn't. It occurred to me that the reason I thought so is the pictures shot through a solar filter, which makes everything but the sun look black. However, I understand you're supposed to remove the solar filter at totality, or you'll see nothing. So why do pics of totality have an all-black background?
They certainly don’t have to. You can expose for the sky, and get the dark blue (or whatever color you’d like to describe it as), but then you’ll end up overexposing the corona. Basically, there is a large difference in the exposures for the corona and for the sky (a wide ‘dynamic range’), so if you expose to get a lot of detail in the corona, you’ll end up underexposing the sky (hence it turning to black), but if you expose to get color in the sky, you’ll end up losing detail in the corona and having it look washed out.
You can actually balance the two by stacking exposures, if you wanted to. To me, a black sky is not far off from what it looked like in real life.
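The stacking idea can be sketched in code. This is a minimal, hypothetical illustration (not anyone's actual workflow in this thread): blend two registered exposures of the same scene, weighting each pixel by how well-exposed it is, so corona detail from the short exposure and sky tone from the long one can coexist in one frame.

```python
# Minimal exposure-blending sketch. Pixels are 0.0-1.0 grayscale values,
# and the two "images" are assumed to be perfectly aligned.

def well_exposed_weight(v):
    """Weight a pixel by closeness to mid-gray; clipped values get ~0 weight."""
    return max(0.0, 1.0 - abs(v - 0.5) * 2.0)

def blend(short_exp, long_exp):
    """Per-pixel weighted average of two registered exposures."""
    out = []
    for s, l in zip(short_exp, long_exp):
        ws, wl = well_exposed_weight(s), well_exposed_weight(l)
        if ws + wl == 0:            # both clipped: fall back to a plain average
            out.append((s + l) / 2)
        else:
            out.append((ws * s + wl * l) / (ws + wl))
    return out

# Toy data: the short exposure holds corona detail but crushes the sky to near 0;
# the long exposure shows sky tone but blows the corona out to 1.
short_exp = [0.55, 0.40, 0.02, 0.00]   # corona ... sky
long_exp  = [1.00, 1.00, 0.35, 0.20]
print(blend(short_exp, long_exp))
```

Real tools do this with many exposures and smoother weighting (exposure fusion), but the principle is the same: take each region of the picture from whichever frame exposed it best.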
Pulykamell, so I believe you are saying such photos are typically underexposed, hence the sky appears pitch black?
P.S. The sky at my piece of Greenville, SC got just dark enough to see Venus and Jupiter, but not much else. The sky was a weird deep blue, but never truly black like night.
Here’s the same photo developed two different ways from the eclipse yesterday:
The way to get some color in the sky
It just looks a lot more interesting and dramatic the first way.
If you expose such that you can see detail in an ordinary full moon, you also won’t see stars. Likewise, the corona is still bright enough that if you want to capture some detail, the background looks black. One of my photos. There’s actually a star in there (or planet? not sure yet), but so dim that you can’t see it without cranking up the brightness.
It’ll never get truly pitch black during an eclipse, since there’s still light scattering off the atmosphere outside the area of totality (the umbra). So it looks like just after sunset (also where you have only indirect lighting), except in all directions.
Sorry, my internet is being very wonky today and the second picture doesn't seem to have uploaded fully. I have no idea what's going on, but I'll try it again in a bit.
It’s all about your exposure. If you expose so that only the inner corona is visible, the landscape will be black. If you give it some more exposure (5 stops or so), the outer corona becomes visible, and landscape features and deep blue skies appear.
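For anyone unfamiliar with "stops": each stop is a doubling of the light reaching the sensor, so the gaps people are describing can be put in numbers. A quick sketch (the specific shutter speeds are just illustrative, not tied to anyone's camera settings here):

```python
# A "stop" is a factor of 2 in exposure: opening up N stops admits 2**N
# times as much light.

def exposure_ratio(stops):
    return 2 ** stops

print(exposure_ratio(5))   # 5 stops more exposure -> 32x the light

# Example: opening up 5 stops turns a 1/8000 s shutter into a
# (1/8000) * 32 = 1/250 s shutter, all else being equal.
shutter = (1 / 8000) * exposure_ratio(5)
print(shutter)             # ~0.004 s, i.e. 1/250 s
```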
Just to make this perfectly clear (this may be too obvious for experienced photographers to even mention, but it's often not understood by the rest of us):
There’s no such thing as a photograph that perfectly captures a visual experience.
Photography is an art, not a science. Particularly in conditions of unusual lighting and dealing with subjects at unusual or varied distances, there may be no way to get a photo that perfectly captures all elements of a scene such that someone looking at it will see what live observers saw. Talented photographers use a number of techniques to balance the factors that cause this, but for something like a total solar eclipse, there are necessarily compromises.
The human eye automatically compensates for changes in illumination and nearly instantly refocuses on a new object of interest. Under the eclipse, the sky may look dark indigo and twilight-ish when you look at the sky, and when you shift your attention to the sun and moon, you see a round black patch with an eerie corona wobbling around the edges. But there's no camera setting to capture both of those views in one static image. That's not a failure of photographers; it's an inherent limitation of photography. The same limitations exist in landscape or portrait photography as well; it's just easier for good photographers to compensate for them under more controlled conditions and make them less noticeable.
^ Very much so. And even photographers sometimes forget that.
Two of my total eclipse shots from yesterday:
- Shot at 1/8th second exposure
- Shot at 1/1000th second exposure
Not just that, but the human eye has a significantly wider dynamic range than most* cameras. Which means our eyes can see detail in really bright and really dim things at the same time. Which cameras can't do. If you adjust the camera exposure to show detail in the sunlit part of the landscape, for example, everything in the shadows looks completely black. HDR (high dynamic range) mode helps, but it's still not as good as human eyes.
*I was going to say “all”, but there are specialty cameras that have excellent dynamic range - like this.
Did anyone else have the same feeling I did as it got darker and darker - that the world gradually went from color to B&W, and then the same thing in reverse afterwards?
That’s not what I felt, but that makes some sort of sense. Color perception fades as light does.
For me, the thing about the quality of light that makes it surreal is that its directionality and quality (as in harsh vs. soft) remain the same, while its intensity diminishes. Normally, when you have an overcast sky that mimics the light intensity of, say, 80%–90% totality, you have diffuse lighting with very soft shadows, usually so soft that you can't even see them. But here you have overhead light that is harsh and direct, with the intensity dialed down. Shadows are still crisp; you still have a point light source, but at x% power. That's what's really freaky and weird about the eclipse to me, because in no other situation do you experience that with earthly lighting. Typically, when light is less intense on earth, it's softer (being diffused through cloud cover) or coming from very low on the horizon (after sunrise and before sunset).
That's called the Purkinje effect: in dim light, the eye's rods take over from the color-sensing cones, so colors wash out and sensitivity shifts toward the blue end of the spectrum.
Ya, it certainly wasn't *dark* during the 90+ seconds of totality. My first total eclipse. Somewhere around 90% obscuration, the temperature dropped significantly. It got a lot darker, but I'm having trouble describing the kind of darkness, as it wasn't akin to dawn or dusk. Someone said the street lights came on, but a quick glance didn't reveal any lit street lamps to me. The quality of light was definitely altered significantly, as were all the shadows. A tree near where we parked had the most interesting shadows 15 minutes post-totality, but I couldn't describe them to you.
nearwildheaven, that's maybe not a bad descriptor: moving from color to B&W.
The second photo demonstrates that the sky is still there. So the overexposed corona/sun is not the only thing hiding the stars; the sky glow, caused by Rayleigh and other scattering of sunlight, is bright enough to hide many of the stars.
Thanks for the explanations, everyone.
What is the magenta fringe around the lunar disk? Is it chromatic aberration in the lens? An effect from digital process? Or something else?
When I took my partial-eclipse-in-NJ photos I was amazed at how wide the dynamic range of the scene was.
The darkness-to-lightness scale might be compared to the octaves on a piano: some children's instruments have only an octave or two of keys, while a full piano has more than seven. The human eye would be the piano, while a digital camera would be more like a small synth keyboard with 3 or 4 octaves. That's why most sunset photographs have either a blown-out sunset or a totally underexposed foreground. If you want to avoid that, you can try a graduated ("half-dark") neutral-density filter to dim the sunset part of the frame.
But the eclipse blew out even the human eye’s ability to see light and dark together.
I used a 10 stop filter and still needed to shoot at 1/8000s in order to get a reasonable crescent. I imagine that at proper totality the problem was not nearly as bad.
Here’s my crude attempt at a partial solar eclipse, taken with too short a lens.
I haven't confirmed it yet, but I think that's the 656.3-nanometer hydrogen-alpha line. It showed up in some of my photos as well.
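For what it's worth, 656.3 nm is the hydrogen-alpha line: the n=3 → n=2 Balmer transition, which gives the chromosphere and prominences their pink/magenta color. As a sanity check, the Rydberg formula reproduces it (the formula gives the vacuum wavelength, about 656.5 nm; the commonly quoted 656.3 nm is the wavelength measured in air):

```python
# Rydberg-formula check for hydrogen-alpha (Balmer transition n=3 -> n=2).
R_H = 1.0967758e7          # Rydberg constant for hydrogen, in 1/m

def balmer_wavelength_nm(n_upper):
    """Vacuum wavelength of the Balmer transition n_upper -> 2, in nanometers."""
    inv_wavelength = R_H * (1 / 2**2 - 1 / n_upper**2)
    return 1e9 / inv_wavelength

print(balmer_wavelength_nm(3))   # ~656.5 nm in vacuum; 656.3 nm in air
```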