Anomalous color in Mars rover photos?

There’s an old conspiracy theory about the color of the Martian sky. The first Viking photo showed a blue sky, but a later photo, presumably with correct color calibration, showed the true red color. Some people seem to think that the first one was correct and that NASA is hiding the truth.

Unfortunately the latest Spirit images are not helping. Here is a photo of the color calibration target taken on Earth:

Here’s a Spirit image, on which you can just about make out the cal target:

Does anyone have a good explanation of why the blue swatch appears red in the Mars image? Not only that, but the cable tie visible just above the cal target has also shifted from blue to red.

There’s an ongoing thread about this at Slashdot:

It appears that the original story in that article is a conspiracy theory that should not be taken at face value, and the comments below it have some explanations of the use of filters and other factors which cause what appear to be color shifts.

There’s more about the sundial and color calibration issues here:
http://seattlepi.nwsource.com/local/153690_mars23.html
http://www.astrobio.net/news/article625.html

I don’t quite understand how the colour of the sky could be evidence of any kind of conspiracy. NASA would want to hide the true colour of the Martian sky because… :confused:

Cock-up, yes; conspiracy, no.

Anyway, I too am intrigued by that obviously fake-looking red blob over the blue patch. But on the subject of the “true colour”, it’s very difficult to know how human eyes would view a given scene. We know that a lot of orange-pink light is being scattered by dust in the atmosphere - should we remove the effect of that and balance the colours of the calibration chart back to the way they appear on Earth? I would say no, because the colours probably don’t look like that up there on Mars. I guess we will only know what the scene looks like to human eyes when the first human lands on Mars.

I’m more confused by the obviously fake red patches than anything, since this is a picture on the NASA site, and I doubt it has been hacked. Seems to me that perhaps their color curves are out of whack on the image.

As for color calibration so that the scene appears as it would to the human eye, it is not really all that hard as long as you have a known value to compare against. Which the target would provide if the colors weren’t screwed up.

I opened Photoshop and normalized the colors so that the parts of the calibration target that are supposed to be grey are truly grey. The picture was missing a little bit of blue, not enough to make much of a difference to the human eye. The blue wire in the picture also reads as red, so I’m wondering if the picture isn’t hacked after all. Adjusting the picture to make what should be blue actually blue turns the whole picture blue.
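For anyone who wants to see what that gray-point normalization amounts to numerically: Photoshop’s middle eyedropper effectively scales each channel so a known-neutral patch comes out equal in R, G and B. Here is a minimal sketch; the pixel values are invented for illustration, not measured from the Spirit image.

```python
# Gray-point white balance: scale each channel so a patch that should
# be neutral gray becomes equal in R, G and B. Values are illustrative.

def gray_point_balance(pixels, gray_patch):
    """pixels: list of (r, g, b) tuples in 0-255.
    gray_patch: the measured (r, g, b) of a patch known to be neutral."""
    r0, g0, b0 = gray_patch
    target = (r0 + g0 + b0) / 3.0            # preserve overall brightness
    gains = (target / r0, target / g0, target / b0)
    return [tuple(min(255, round(c * k)) for c, k in zip(px, gains))
            for px in pixels]

# A reddish-lit scene: the "gray" patch reads warm, the sky reads pink.
gray_patch = (180, 140, 120)
balanced = gray_point_balance([gray_patch, (200, 60, 50)], gray_patch)
print(balanced[0])   # the gray patch comes out neutral: (147, 147, 147)
```

After balancing, the grey patch is neutral, but everything that was genuinely reddish stays reddish, which is exactly why the sky remained pink.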

Because obviously the probe isn’t really on Mars. Our reptile overlords have been maintaining the charade for decades to distract us from noticing how many human babies are being taken to feed to the miniature space raccoons. Or something.

Anyway, the blue sky is evidence that all of this is being shot in the deserts of New Mexico. Occasionally they forget to change the sky to its supposedly true color of red or off-pink. I say supposedly because the Martian sky is in point of fact not red, but a lovely chartreuse-and-sienna plaid.

Is it not evidence of the atmospheric makeup?
I thought part of the conspiracy was also that there is life on Mars and NASA is trying to hide that by showing that the atmosphere is not inhabitable; if the sky were blue, a few loons might start to think Mars has the same atmospheric composition as Earth and is therefore breathable.

or have I just inadvertently started a new conspiracy theory… hang on, Total Recall…

Zaphod

Thanks. I waded through the discussion (are those “scores” inflated or what?) and found a satisfactory (to me) explanation:

I disagree. Imagine you took that calibration chart into a room lit by a red bulb. The colours would appear all screwed up. But if you took a photo under this light, you couldn’t then adjust the image so that the colour chart looked normal and say “A-ha! This is what the scene would look like to the naked eye.”

Do you see my point? You can’t tell how much of the colour shift is due to the ambient light and how much is due to the camera, image processing etc.

You misunderstand. If you took a picture in messed-up light, such as red light, the color calibration works the same as if you used sunlight. The color calibration lets you know exactly how much of the color is due to the ambient light, and how much is due to the camera (which, as long as it’s using RGB filters, will be pretty indistinguishable from the human eye; as explained elsewhere, the camera was not using RGB filters, so some of the colors are off). Calibrating a camera so that what it sees matches what a human would see has to take place with a human looking at the same scene. But once done, it would stay that way.

I’m not sure what you mean by ‘normal’ here anyway. A room lit with red light will look red to both the human eye and the camera. The grey areas on the color calibration chart will also look red. You can adjust the image so that the grey areas look grey, but this is not how the scene would appear to the human eye. All this does is let you subtract the effect of the lighting and see what the true color of things is.

Which is what I did to the image in Photoshop, and the sky was still pink. Not as pink as before, since the lighting in the scene was reddish, but pink nonetheless.

So, exactly WHAT color IS the Martian sky? Is it blue (like earth’s sky) or pink?

It’s pink. Trust me.

No, this isn’t quite true. If you take a picture of something that’s clearly blue in fairly pure red light, and have a grey point of reference which you later adjust to neutral gray in Photoshop, you will not see the “true” color of the blue object. Why? Because the object is not blue unless you have white light or light which at least contains some blue in it. You cannot create colors where there are none.

As for the photo itself, this little nitpick doesn’t really apply. It looks like some weird digital glitch to me. There’s absolutely no reason why something that blue would appear THAT fluorescent pink under any kind of lighting. If it were some sort of washed-up brown, sure it’s possible. But both blue objects rendered as vivid pink? It’s gotta be some weird digital screw up.

Actually, you can create colors where there are none; behold the digital magic of Photoshop. Using a known grey value, you can add in exactly the colors of light that are missing from the lighting in the photograph, so as to make things appear as they would under white light. The blue object, even when lit in reddish light, will still be ‘bluer’ than the things around it that are not blue (which will be redder). When enough blue to simulate white light is added equally across the whole image, the truly blue object becomes blue again.

I suppose this would not hold true if you used a red laser to illuminate the scene, in which case the light would be monochromatic, no amount of color adjusting would give you the correct colors, and you would be 100% correct. But that is an unlikely situation, and even a bulb covered in red foil emits a little blue light.

I think this is where the disagreement lies. I’m thinking of more realistic lighting conditions, where the light is tinted, not wholly monochromatic, such as in the Mars Rover pictures.

…surely there’s a difference between correcting for the colour that we know things are (i.e. how they appear on Earth), and the colour they appear to be under the given lighting conditions? Say the grey colour on the chart shows up as pink - should we correct it back to grey, or leave it on the basis that our eyes will see it as pink too?

My own guess is that on first stepping out of a spacecraft, everything on Mars would probably appear quite pink, but after a while our brains would adjust without our really noticing. Kind of similar to when you go skiing with tinted goggles - after a while everything looks “normal” colours (more or less), but when you take them off everything looks very blue (assuming the lenses are orange/yellow, as mine are).

Of course, quite what colour our brains would “adjust” the sky to is open to question…

No. You are wrong.

I work with Photoshop almost every single day. In most cases, such as tungsten lighting and fluorescent lighting, you could get a decent white balance using your method. Just open up Levels, select the middle dropper, click on something that should be neutral grey and voila!

But this method isn’t perfect, and will not work well with strong monochromatic light sources.

Your blue object, when illuminated by a purely red light, is not blue. It has no blue; it’s black. Pure red light has no blue component for the blue object to reflect back. Your blue object is no longer blue in any meaningful sense of the word. It’s black. And that’s how your camera will record it. If you had a green and a blue object next to each other and illuminated them with a red light, they would look the same. Your camera and Photoshop are not going to be able to recreate the real color.
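The arithmetic behind that point, using the same recorded-equals-reflectance-times-illuminant model (reflectance numbers made up for illustration): a channel that records zero stays zero under any gain, so no correction can bring the blue back.

```python
# Under purely red light, a blue surface reflects (almost) nothing:
# per-channel recorded value = reflectance x illuminant, and a zero
# channel cannot be multiplied back into a color by any gain.
# Reflectance numbers are illustrative.

red_light = (1.0, 0.0, 0.0)
blue_surface = (0.05, 0.1, 0.9)
green_surface = (0.1, 0.8, 0.1)

def record(refl, light):
    """Per-channel product of surface reflectance and illuminant."""
    return tuple(r * i for r, i in zip(refl, light))

print(record(blue_surface, red_light))   # (0.05, 0.0, 0.0): nearly black
print(record(green_surface, red_light))  # (0.1, 0.0, 0.0): same hue,
                                         # only brightness differs
```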

Seriously, just go out and try it.

Carl Sagan discussed this in his book Pale Blue Dot. The data is not sent as a Polaroid snapshot (of course). So the signal (data) sent back to Earth needs to be processed in order to recreate the image. When the original Viking photos came back, the guys in the processing lab (who were not planetary scientists!) mixed the coloring until it looked “right”… which, to the Average Joe, meant a blue sky. The blue skies in the photos DID surprise the planetary scientists, so they sent the images back for re-processing with closer attention paid to the color standards aboard the Viking craft. With that correction, the sky turned out to be pink.

The Martian sky is usually pink (salmon) in various shades depending on the amount of dust aloft in the atmosphere. However, like Earth, Mars experiences sunsets and odd weather and the color of the sky can change (hey, sometimes Earth’s skies are pink). IIRC, the Pathfinder mission had a photo of a blue-ish Martian sky.

http://antwrp.gsfc.nasa.gov/apod/ap971013.html

Criminy… I said in my own post that it wouldn’t work under a strongly monochromatic light source.

Please read the whole post before bashing parts of it. :frowning:

I also said that strongly monochromatic light sources had nothing to do with the OP, which was talking about a slightly reddish-lit scene and the possibility that the sky in the picture was really blue. Which it wasn’t, even after adjusting the color target to neutral grey. sigh

I also stated in my first post that this isn’t really applicable to the Mars photograph (“this little nitpick doesn’t really apply.”) I’m just saying you don’t need a light source as strongly monochromatic as a red laser. A regular ol’ red light (or green light or blue light or whatever) will screw up your photo enough to make the “true” colors of a scene impossible to determine, regardless of the grey point. Hell, it’s difficult enough color correcting fluorescent light sources, even with a well established grey point.

It’s just a nitpick. Besides, no matter how off those colors are, there’s no way you’ll turn a fluorescent red into a fluorescent blue unless you actually shift all the colors, not simply add or subtract red, green, or blue.