Okay, here are some numbers on whether you could see a red ring of the Earth's atmosphere from the moon during an eclipse of the sun by the Earth.
Two major assumptions here.
Approximately 4 vertical miles of the Earth's atmosphere contribute significantly to the scattered light that heads toward the moon. Yes, our atmosphere is a bit taller than that; my WAG is that the very lowest part absorbs too much light, while the upper part doesn't scatter enough. This number is probably right to within a factor of 2 or so either way.
The second assumption is that, within that representative 4-mile-tall/deep part of the atmosphere, 1 percent of the Sun's light gets scattered and makes it back out of the atmosphere. I doubt that number could be more than about 10 times higher, but it could be significantly lower (this is the part I am most unsure about).
As Squink? calculated, a 100-watt lightbulb would be about magnitude 26 as seen from the moon.
For those who don't know: a magnitude 4 star is 2.5 times brighter than a 5th magnitude star, a 3rd magnitude star is 2.5 times brighter than a 4th, and so on. The brightest stars in the sky are around 0 to 1st magnitude. The faintest stars visible to the naked eye in a decently dark sky away from a city are about 6th magnitude. A difference of 5 magnitudes is exactly a factor of 100; a 10-magnitude difference is 100 times 100 (10,000), and so on. The Sun and full moon are large negative numbers, for reference.
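That magnitude-to-brightness relationship is easy to sketch numerically; here's a quick check using only the standard definition that 5 magnitudes is exactly a factor of 100:

```python
# Each magnitude step is a brightness factor of 100**(1/5) ~ 2.512,
# so 5 steps multiply out to exactly 100.
ratio_per_mag = 100 ** (1 / 5)

def brightness_ratio(mag_diff):
    """Brightness ratio corresponding to a magnitude difference."""
    return ratio_per_mag ** mag_diff

print(round(brightness_ratio(1), 3))   # ~2.512 per magnitude
print(round(brightness_ratio(5)))      # 5 magnitudes -> factor of 100
print(round(brightness_ratio(10)))     # 10 magnitudes -> factor of 10,000
```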
The amount of solar insolation is roughly 1000 watts per square meter. One percent of that is 10 watts per square meter, or about 1 watt per square foot of scattered light headed back into space.
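As a unit sanity check on that square-meter-to-square-foot conversion (my own arithmetic, using 1 m ≈ 3.28 ft):

```python
# 1% of ~1000 W/m^2 insolation, converted to watts per square foot.
watts_per_m2 = 0.01 * 1000           # 10 W/m^2 scattered back out
ft_per_m = 3.2808                    # feet per meter
watts_per_ft2 = watts_per_m2 / ft_per_m ** 2
print(round(watts_per_ft2, 2))       # ~0.93, close enough to 1 W/ft^2
```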
Earth's circumference: 24,000 miles, at roughly 5000 feet per mile, with 4 miles "high" of atmosphere doing the scattering. That's about 2.5 x 10^12 square feet, or (at 1 watt per square foot) about 2.5 x 10^12 watts.
Dividing by the 100-watt bulb, that's 2.5 x 10^10 times brighter than the bulb as seen from the moon. A factor of 10^10 is 25 magnitudes; the remaining factor of 2.5 is about one more magnitude.
So that's about 26 magnitudes more light than the 100-watt bulb. Magnitude 26 minus 26 equals 0.
So the light from the "Earth ring" as seen from the moon would have an integrated magnitude of about 0, comparable to the brightest star in the sky.
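For anyone who wants to check the arithmetic end to end, here's the whole back-of-envelope calculation redone in code, using the same rounded inputs as above (24,000-mile circumference, ~5000 ft per mile, a 4-mile-deep scattering layer, ~1 W per square foot scattered moonward, and the magnitude-26 figure for the 100-watt bulb):

```python
import math

# Ring geometry, in feet (same rounded numbers as in the text).
circumference_ft = 24_000 * 5_000     # Earth's circumference, ~1.2e8 ft
depth_ft = 4 * 5_000                  # 4-mile-deep scattering layer
ring_area_ft2 = circumference_ft * depth_ft   # ~2.4e12 sq ft

# At ~1 W of scattered light per square foot, total power toward the moon:
total_watts = ring_area_ft2 * 1.0     # ~2.4e12 W

# How many magnitudes brighter than the 100 W bulb (mag ~26 from the moon)?
ratio = total_watts / 100             # ~2.4e10
mags_brighter = 2.5 * math.log10(ratio)
ring_magnitude = 26 - mags_brighter

print(f"{ring_area_ft2:.1e}")         # ~2.4e+12 sq ft
print(round(mags_brighter, 1))        # ~26 magnitudes brighter than the bulb
print(round(ring_magnitude, 1))       # ~0: about as bright as the brightest star
```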
That's fairly bright. The next question is: would that amount of light, spread into a thin ring about two degrees in diameter, still be visible? My WAG is probably, but barely.
Though it wouldn't be bright enough to cause problems seeing any cities/metro areas that might or might not be visible.
Did I mess up the math any here?