Assuming you define “true color” as color as perceived by the human eye / visual system.
These fabulous artificially multi-colored space photos remind me of those rainbow-hued pictures one always sees of the Mandelbrot set – a well-defined set where every point on the complex plane is either in the set or it ain’t, yet they colorize the pictures in spectacular ways.
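For comparison, the usual trick behind those rainbow Mandelbrot pictures is escape-time coloring: membership in the set is binary, but points outside it get colored by how many iterations they take to diverge. A minimal Python sketch (the iteration limit and the palette are arbitrary choices):

```python
import numpy as np
import matplotlib.pyplot as plt

def escape_time(width=800, height=600, max_iter=100):
    # Grid of complex values c covering the interesting part of the plane
    x = np.linspace(-2.5, 1.0, width)
    y = np.linspace(-1.25, 1.25, height)
    c = x[np.newaxis, :] + 1j * y[:, np.newaxis]
    z = np.zeros_like(c)
    counts = np.full(c.shape, max_iter)
    for i in range(max_iter):
        not_escaped = np.abs(z) <= 2
        z[not_escaped] = z[not_escaped] ** 2 + c[not_escaped]
        # Record the iteration at which each point first escapes
        counts[not_escaped & (np.abs(z) > 2)] = i
    return counts

# In/out is binary, but mapping escape counts through a palette gives the rainbow look
plt.imshow(escape_time(), cmap="rainbow")
plt.show()
```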
You almost never see color in deep sky objects through an eyepiece, but that’s really because they are so dim that your eye can’t pick up their color. Stars themselves are bright enough to see their colors, as is the Orion Nebula. Most other deep sky objects appear as little more than grey smudges in most eyepieces, even in relatively large telescopes.
Color cameras are not used much for this, for several reasons. One is that you are stuck with the color choices made at manufacturing time. Another is that color cameras use a Bayer filter to resolve color across the pixel grid, which reduces resolution and cuts the amount of light the camera can capture quite a lot. Effective sensor area is lost as well, because each full-color pixel has to be assembled from several single-color photosites, which reduces quantum efficiency.
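To make that concrete, here is a rough sketch of how an RGGB Bayer mosaic samples a scene: every photosite records just one channel, and a full-color image has to be interpolated back from neighbors (the array shapes and sampling here are illustrative only):

```python
import numpy as np

def bayer_sample(scene_rgb):
    """Simulate what a one-shot-color sensor records from a (H, W, 3) scene.

    Each photosite keeps only one channel, in a repeating 2x2 RGGB pattern;
    the other two channels at that location are simply thrown away.
    """
    h, w, _ = scene_rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = scene_rgb[0::2, 0::2, 0]  # R
    mosaic[0::2, 1::2] = scene_rgb[0::2, 1::2, 1]  # G
    mosaic[1::2, 0::2] = scene_rgb[1::2, 0::2, 1]  # G
    mosaic[1::2, 1::2] = scene_rgb[1::2, 1::2, 2]  # B
    return mosaic

# A monochrome sensor keeps the full signal at every photosite instead:
# mono = scene_rgb.sum(axis=2)
```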
When you shoot in monochrome, you can use LRGB filters to get natural colors. That’s Luminance, Red, Green, and Blue. That’s most of my planetary astrophotography, and the results are much better than when I shoot in color with a DSLR. Four times the work, though.
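The combine step itself is simple in principle: the R, G and B stacks supply the color, and the deeper luminance stack supplies the brightness and detail. A minimal sketch (the frame names are hypothetical, and real data would be calibrated and aligned first):

```python
import numpy as np

def lrgb_combine(L, R, G, B, eps=1e-6):
    """Blend separate L, R, G, B monochrome frames (0..1 arrays) into one color image."""
    rgb = np.stack([R, G, B], axis=-1)
    implied_lum = rgb.mean(axis=-1, keepdims=True)    # brightness implied by RGB alone
    ratio = L[..., np.newaxis] / (implied_lum + eps)  # rescale so brightness comes from L
    return np.clip(rgb * ratio, 0.0, 1.0)
```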
But the biggest advantage of monochrome cameras is that you can use them with narrowband filters to produce ‘false color’ images. Many of Hubble’s photos are narrowband, with the color image actually representing things like hydrogen-alpha or sodium or oxygen emission lines. In fact, there is a defined ‘Hubble Palette’ that looks like this:
Red: S-II
Green: Ha
Blue: O-III
So that’s a sodium, hydrogen-alpha and oxygen filter representing red, green and blue.
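In code, that mapping is just a channel assignment, something like this sketch, where each variable stands for a calibrated monochrome narrowband stack (the names are placeholders):

```python
import numpy as np

def hubble_palette(sii, ha, oiii):
    """Map three narrowband stacks onto the R, G, B channels of a false-color image."""
    rgb = np.stack([sii, ha, oiii], axis=-1)  # R = S-II, G = Ha, B = O-III
    rgb = rgb - rgb.min()
    return rgb / max(rgb.max(), 1e-6)         # normalize to 0..1 for display
```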
For me as an amateur in the middle of a city, narrowband filtering ignores street lights so light pollution doesn’t get in the way (unless you have sodium street lights and are imaging using sodium filters). However, narrowband imaging requires much longer exposures, so you need a larger scope and most importantly a very good tracking mount with an autoguider.
I would not expect that. Usually they keep the wavelengths in the same relative order when they false-color images. Those are all visible wavelength emissions: sodium-II is yellow; H-alpha is red; and O-III is green. So I would expect the H-alpha and Na-II to be swapped.
An approach to coloring images that I’ve seen in other contexts, for example photographs of the Earth’s surface from satellites taken to understand geology or land use, is to capture multiple wavelength bands through filters or similar means – and more than just 3 bands, for example 11 bands in some picture I remember reading about. Then, principal component analysis identifies the three strongest combinations of all 11 bands, and these are assigned to the 3 color channels the human eye distinguishes. Or, other principal components (i.e. not the 3 strongest) are chosen. In a certain sense these are the most informative pictures imaginable.
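A rough sketch of that PCA false-color approach in Python (the 11-band cube here is just random placeholder data; a real case would use registered images taken through the different filters):

```python
import numpy as np

def pca_false_color(cube, components=(0, 1, 2)):
    """Project an (H, W, N) multiband image onto 3 principal components and show them as RGB."""
    h, w, n = cube.shape
    flat = cube.reshape(-1, n).astype(float)
    flat -= flat.mean(axis=0)                  # center each band
    cov = np.cov(flat, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1]          # strongest component first
    chosen = eigvecs[:, order[list(components)]]
    scores = flat @ chosen                     # project onto the chosen components
    scores -= scores.min(axis=0)
    scores /= scores.max(axis=0) + 1e-6        # normalize each channel to 0..1
    return scores.reshape(h, w, 3)

# e.g. pca_false_color(np.random.rand(64, 64, 11))               # 3 strongest components
#      pca_false_color(np.random.rand(64, 64, 11), (3, 4, 5))    # other components instead
```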
I’ve used this method myself, for example using not wavelength bands but different polarization rotations to take maybe 50 different images of the same scene of a pile of clear plastic panes at random angles. The different panes each take on their own “color”.
They would pretty much have to be, given that the human eye can’t perceive infrared.
I’ve heard it said that it would make sense for images of distant objects, whose radiation has been seriously red-shifted into the infrared, to be “violet-shifted” back into the visible frequencies originally emitted.
But the actual scheme used is arbitrary: it can be anything that produces interesting and useful images.
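A quick sketch of that “shift it back” idea, using the standard relation between observed and emitted wavelength at redshift z (the numbers are just illustrative):

```python
def rest_wavelength(observed_nm, z):
    """Rest-frame (emitted) wavelength of light observed at redshift z: lambda_rest = lambda_obs / (1 + z)."""
    return observed_nm / (1.0 + z)

# e.g. H-alpha is emitted at about 656 nm (red); from a source at z = 10 it
# arrives at roughly 7.2 um, well into Webb's mid-infrared range.
print(rest_wavelength(7219.3, 10))   # ~656.3 nm
```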
When you get a chance, check out the Where is Webb page again:
They’ve added a 3D model of the solar system labelling several points of interest, including planets, moons, comets, asteroids, and other spacecraft. When you click on an object, you can then click a second link to take you to a close-up of it and see the solar system from its perspective. I just spent a fair bit of time exploring it and found it very interesting!
Cool - if very jerky.
I was surprised to see Halley’s comet still inside Saturn’s orbit.
According to this site (among others) it’s outside Neptune’s orbit.
Yeah, Hale-Bopp looks like it’s inside of Saturn’s orbit until you change the angle and see that it’s going straight “down” and is a long ways away now. This site makes it really easy to see the ecliptic for the major planets and just how random and scattered the Kuiper Belt objects and comets are in our system. No wonder it can be hard to spot comets until they’re nearby.
Hmm - I find that I cannot recreate this Halley (yes - not Hale-Bopp) phenomenon.
Probably something to do with the jerkiness on my laptop…
Sadly it is too slow to be usable for me.