We’ve all seen fantastic color images “produced” by the Hubble telescope, Mars rovers, etc., but aren’t they all pretty much “color enhanced,” i.e. appealingly colored in by NASA artists? Aren’t the true colors of the galaxy / universe fairly monochromatic? If so, what is the rationale for doing this? To make space exploration more appealing to the dull masses?
Not an astronomer but…
… I would speculate that the universe is anything but monochromatic. The problem is that the range of wavelengths is far larger than our visual systems can cope with. False colors are assigned to wavelengths which would otherwise be invisible to us.
Here’s a link to the ESA’s page on Hubble image coloring:
Astronomical images
Images of astronomical objects are usually taken with electronic detectors such as a CCD (Charge Coupled Device). Similar detectors are found in normal digital cameras. Telescope images are nearly always greyscale, but nevertheless contain some colour information. An astronomical image may be taken through a colour filter. Different detectors and telescopes also usually have different sensitivities to different colours (wavelengths).
Filters
A telescope such as the NASA/ESA Hubble Space Telescope typically has a fixed number of well-defined filters. A filter list for Hubble’s WFPC2 (Wide Field and Planetary Camera 2) camera is seen to the right.
Filters can either be broad-band (Wide) or narrow-band (Narrow). A broad-band filter lets a wide range of colours through, for instance the entire green or red area of the spectrum. A narrow-band filter typically only lets a small wavelength span through, thus effectively restricting the transmitted radiation to that coming from a given atomic transition, allowing astronomers to investigate individual atomic processes in the object.
…
Galaxies are often studied through broad-band filters as they let more light through. Also, the processes in a galaxy are more ‘mixed’ or complicated, resulting from the combined output of billions of stars, so narrow-band filters give less ‘specific’ information about the processes there.
…
Assigning colours to different filter exposures
The astronomical images we see on the web and in the media are usually ‘refined’ or ‘processed’ compared to the raw data that the astronomers work on with their computers. In ‘pretty pictures’, for instance, all artefacts coming from the telescope or the detectors are removed, as they do not say anything about the objects themselves. It is very rare that images are taken with the sole intention of producing a ‘pretty’ colour picture. Most ‘pretty pictures’ are constructed from data that was acquired to study some physical process, and the astronomer herself probably never bothered to assemble the greyscale images into a colour image.
Natural colour images
It is possible to create colour images that are close to “true-colour” if three wide band exposures exist, and if the filters are close to the r, g and b receptors in our eyes. Images that approximate what a fictitious space traveller would see if he or she actually travelled to the object are called “natural colour” images.
To make a natural colour image the order of the colours assigned to the different exposures should be in “chromatic order”, i.e. the lowest wavelength should be given a blue hue, the middle wavelength a green hue and the highest wavelength should be red.
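The chromatic-order assignment described above can be sketched in a few lines of Python. This is a minimal illustration, not how Hubble pipelines actually work: the filter wavelengths and the simple min–max stretch are assumptions for the example, and real processing involves calibration and more careful scaling.

```python
import numpy as np

def natural_colour_composite(exposures):
    """Combine three greyscale filter exposures into one RGB image.

    `exposures` is a list of (central_wavelength_nm, 2-D array) pairs.
    Channels are assigned in chromatic order: shortest wavelength -> blue,
    middle -> green, longest -> red.
    """
    if len(exposures) != 3:
        raise ValueError("need exactly three filter exposures")
    # Sort by central wavelength, shortest first.
    ordered = sorted(exposures, key=lambda e: e[0])
    blue, green, red = (np.asarray(img, dtype=float) for _, img in ordered)
    rgb = np.dstack([red, green, blue])
    # Crude min-max stretch to [0, 1] for display.
    rgb -= rgb.min()
    if rgb.max() > 0:
        rgb /= rgb.max()
    return rgb

# Toy usage: three 2x2 "exposures" through hypothetical broad-band filters.
frames = [(814.0, np.ones((2, 2))),        # I-band-like exposure -> red
          (555.0, np.ones((2, 2)) * 0.5),  # V-band-like exposure -> green
          (435.0, np.zeros((2, 2)))]       # B-band-like exposure -> blue
print(natural_colour_composite(frames).shape)  # (2, 2, 3)
```

The same function would produce a representative-colour image if the input “wavelengths” were infrared or ultraviolet: the chromatic ordering still maps shortest to blue and longest to red, exactly as the excerpt describes.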
Representative colour images
If one or more of the images in a data set is taken through a filter that passes radiation outside the range of human vision – i.e. it records radiation invisible to us – it is of course not possible to make a natural colour image. But it is still possible to make a colour image that shows important information about the object. This type of image is called a representative colour image. Normally one assigns colours to these exposures in chromatic order, with blue assigned to the shortest wavelength and red to the longest. In this way it is possible to make colour images from electromagnetic radiation far outside the visible range, for example X-rays. Most often it is either infrared or ultraviolet radiation that is used.
Enhanced colour images
Sometimes there are reasons not to use a chromatic order for an image. Often these reasons are purely aesthetic. This type of colour image is an enhanced colour image.
The star Rigel is blue and Betelgeuse is red. Saturn is yellow and Mars is red. Mostly the colors are pastel, but Rigel is quite blue.
If by monochromatic you mean all one visible color the answer is no, the universe that we can see isn’t all one color.
If you mean all one wavelength, the answer again is no. Wavelengths range all the way from micropulsations (electromagnetic pulsations with periods ranging from maybe 1/10 second up to a few hundred seconds, and correspondingly long wavelengths) to gamma radiation, which, I believe, doesn’t really have a lower limit to wavelength.
The Cosmic Spectrum and the Color of the Universe. The cosmic latte color is not what you’d see if you merely averaged all the light reaching Earth, because the authors removed the redshifts from all the spectra before combining them.
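The de-redshifting step mentioned there can be sketched numerically: shift each observed wavelength grid back to the rest frame by dividing by (1 + z), interpolate onto a common grid, then average. This is a toy sketch under assumed inputs; the actual 2dF “cosmic latte” analysis is far more involved (flux calibration, weighting, and conversion to a perceptual colour space).

```python
import numpy as np

def rest_frame_average(spectra, rest_grid):
    """Average spectra after removing each object's redshift.

    `spectra` is a list of (z, observed_wavelengths_nm, flux) tuples.
    Each observed grid is mapped to the rest frame via wl / (1 + z),
    interpolated onto `rest_grid`, and the results are averaged.
    """
    total = np.zeros_like(rest_grid, dtype=float)
    for z, wl, flux in spectra:
        rest_wl = np.asarray(wl) / (1.0 + z)
        total += np.interp(rest_grid, rest_wl, flux)
    return total / len(spectra)

# Toy usage: two flat spectra, one redshifted, one at rest.
grid = np.linspace(350.0, 700.0, 10)
sample = [(0.1, np.linspace(300.0, 900.0, 50), np.ones(50)),
          (0.0, np.linspace(300.0, 900.0, 50), np.ones(50) * 3.0)]
avg = rest_frame_average(sample, grid)
```

For the flat toy spectra above the average is simply 2.0 everywhere; with real galaxy spectra the redshift removal matters, which is why the published colour differs from a naive average of the light as received.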
Universe’s Location on Chromaticity table