James Webb Space Telescope general discussion thread

For research purposes, sure, but for public images stick to what’s instinctual (and pretty – gotta keep the taxpayers happy).

In her previous posting, about the live event when the images and data were released, she was more interested in the absorption lines in the spectral data.

Here is Dr Becky’s more detailed explanation of the released pictures:

Brian

I love how well Nova can prep a program on a topic and tag on the end the most recent stuff. Pictures released on the 12th, program airs on the 13th, we watch on the 14th. They did something like this with New Horizons almost exactly 7 years ago.

The program is presumably available on the app, online, etc.

Sorry (not sorry)

The actual filters used by scientific telescopes usually correspond to interesting spectral lines, or (occasionally) spectral ranges, and are chosen for what they can tell us about the composition of what we’re seeing. They might, for instance, correspond to one of the lines of hydrogen, one of ionized oxygen, and one of helium (among many possibilities). For scientific purposes, they usually pick red to represent one filter, green to represent another, and blue to represent a third, because that’s the easiest way for humans to see the content of three filters at once (if there are more than three filters used, then they’ll need multiple images to show all of them). The scientific images will come with a legend to show what each color represents, because it’s not always the same filters, and so what’s represented by red in one image might be represented by green in another image.
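As a rough sketch of what that channel assignment looks like in practice (the filter names come from the example above, but the arrays and shapes are illustrative stand-ins, not actual telescope products):

```python
import numpy as np

# Stand-ins for three calibrated exposures taken through different filters,
# each a 2-D array normalized to the 0-1 range (e.g. loaded from FITS files).
h_alpha = np.random.rand(512, 512)
o_iii   = np.random.rand(512, 512)
he_i    = np.random.rand(512, 512)

# The legend that would accompany the image: which filter feeds which channel.
legend = {"red": "H-alpha", "green": "[O III]", "blue": "He I"}

# Stack the three frames into an RGB composite in that order.
rgb = np.dstack([h_alpha, o_iii, he_i])
```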

For the pretty pictures released to the public, they’re still using those images taken in scientifically-interesting filters, because time on the telescope is too precious to waste it on non-scientifically-interesting filters. But they’ll usually at least try to use a filter that’s sort of close-ish to red for the red channel, a filter that’s sort of close-ish to green for the green channel, and one that’s sort of close-ish to blue for the blue channel (to within the limits of what filters they have available). So even though the Hubble pictures were all false color, they were at least a vague approximation of what the eye would see.

With infrared, of course, that’s not possible at all, because none of the filters would correspond to anything we can see. There, I imagine that they have some intern who just tries different mappings of filters to colors, and picks out whichever one they think is prettiest.

The Hubble has specified ‘palettes’ for choosing which color to show for each narrowband filter:

Most amateurs follow such conventions when not doing straight RGB imaging.

I used to do all my astrophotography with a DSLR in color. When I switched to monochrome with a filter wheel, the quality of my images went up dramatically. Shooting in color is convenient, but mono imaging with filters is much better.

The simplest filtering is LRGB (luminance, red, green, blue). Luminance is basically an unfiltered mono shot, which brings in more detail. This is ‘true color’ filtering, and the combined image will look much like what you’d see with the naked eye.

One nice thing about RGB filtering is that red wavelengths are typically not as sharp as blue, and the atmosphere attenuates each color by a different amount. So by shooting each color individually through a filter you can boost the colors back to where they need to be, and you can sharpen each color separately, giving you much more control over the final image.
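Here’s a minimal sketch of that workflow (all the arrays and gain values below are placeholders, not numbers from a real imaging session):

```python
import numpy as np

# Pretend these are calibrated, aligned monochrome frames, scaled to 0-1.
L = np.random.rand(1024, 1024)   # luminance: unfiltered, carries the fine detail
R = np.random.rand(1024, 1024)
G = np.random.rand(1024, 1024)
B = np.random.rand(1024, 1024)

# Boost each color independently to undo atmospheric attenuation
# (the gains are made-up values you would tune for your own sky and filters).
rgb = np.dstack([R * 1.10, G * 1.00, B * 1.25])

# LRGB trick: take the color ratios from the RGB stack but the brightness
# from the luminance frame, so the fine detail comes from L.
chroma = rgb / (rgb.sum(axis=2, keepdims=True) + 1e-9)
lrgb = np.clip(3.0 * chroma * L[..., None], 0.0, 1.0)
```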

More advanced filtering is done using narrowband filters, for example filters for Hydrogen-alpha, Oxygen, and Sulfur emissions. Because there is no ‘natural’ color for each of these, you simply assign a color to each one. This is ‘false color’ imaging.

For example, this is typical for Hubble colors:
SII – Red
Ha – Green
OIII – Blue

Combining them gets you an image like this, of the Orion Nebula:

Whereas an RGB filtered view looks like this:

The first is false color, the second true color.
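If you want to play with these palettes yourself, the assignment is really just a channel ordering, so swapping palettes is a one-line change. A toy sketch, with placeholder arrays standing in for real narrowband frames:

```python
import numpy as np

# Placeholder narrowband frames, each normalized to 0-1.
frames = {
    "SII":  np.random.rand(800, 800),
    "Ha":   np.random.rand(800, 800),
    "OIII": np.random.rand(800, 800),
}

def compose(frames, palette):
    """Stack narrowband frames into RGB in the order given by the palette."""
    return np.dstack([frames[name] for name in palette])

hubble_sho = compose(frames, ("SII", "Ha", "OIII"))   # the Hubble palette listed above
hoo        = compose(frames, ("Ha", "OIII", "OIII"))  # another common false-color choice
```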

I’ve always thought they went a little too far with color saturation. They look like something out of a ’60s TV show, where the introduction of color TV required exaggerated colors to show off the technology.

Oooooohhhhh. Pretty.

If you’re referring to the latest release of JWST images, I disagree. I think the visual presentation was fantastic. Some degree of enhancement, whether due to post-processing or whether due to the imaging process itself, is inevitable in astrophotography of any kind. The stuff you see in pictures is almost never the stuff you would see through the eyepiece of even the most powerful telescope, if indeed (in the case of faint, distant objects) you could even see anything at all. The false colours usually carry important scientific content.

All of the raw data is available if you want it. There’s usually a delay in releasing the raw data, but anything sufficiently old is out there. Hubble, Spitzer, JWST, etc. data can be found here. Grab the original sensor data or more mildly processed imagery. Not everything is so oversaturated.
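If you’d rather script it than click through the archive, something along these lines with astroquery should get you started (the target name is just an example, and the exact query fields are worth checking against the MAST documentation):

```python
from astroquery.mast import Observations

# Find public JWST observations of an example target in the MAST archive.
obs = Observations.query_criteria(obs_collection="JWST",
                                  target_name="WASP-96")

# List the associated data products and pull down the calibrated science files.
products = Observations.get_product_list(obs[0])
science = Observations.filter_products(products, productType="SCIENCE")
Observations.download_products(science)
```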

And that’s important to note as well. Even if we could go stick our own eyeball at JWST’s focus point, and we could see in infrared, we still wouldn’t see what JWST does, because JWST can do a long exposure, and the best our eyeballs can do is a few milliseconds of exposure. You aren’t going to see those clouds of gas unless you spend a fair amount of time detecting photons.
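A back-of-the-envelope way to see the difference (the source brightness below is a made-up stand-in for a faint patch of nebula, not a measurement):

```python
# Photon-counting toy estimate for a faint extended source.
photon_flux   = 0.05       # photons per second per square meter (illustrative)
mirror_area   = 25.4       # m^2, roughly JWST's collecting area
eye_glimpse   = 0.01       # seconds, the brief integration your eye manages
long_exposure = 10 * 3600  # a ten-hour exposure

print(photon_flux * mirror_area * eye_glimpse)     # ~0.01 photons: you see nothing
print(photon_flux * mirror_area * long_exposure)   # ~46,000 photons: a usable signal
```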

I was looking over the JWST schedule, and there’s some interesting observations already done and more coming up:

On July 10, Webb apparently used the NIRSpec instrument to look at the Wasp-39 system. Wasp-39b is another hot Jupiter, this one about 700 light years away, as opposed to the 1100 ly or so of Wasp-96 already imaged. Water has already been discovered in the atmosphere, but apparently there are some anomalies that Webb will help to resolve.

On July 11, Webb supposedly imaged the Trappist-1 system. This is really exciting, as the system is full of planets, including at least one in the ‘habitable’ zone. Again, NIRSpec was used to collect spectra.

On July 12, the Wasp-52 system was imaged with the NIRISS slitless spectrometer. This system is interesting because another ‘hot Jupiter’ was found there, but this one has tentatively given up a spectrum including sodium and potassium, which you wouldn’t expect in a hot Jupiter. Speculation is that it may have one or more volcanically active moons, and that’s where the sodium and potassium come from. Webb should resolve that. If it does detect an exomoon, it will be the first confirmed exomoon discovery, I think. There are several other candidates for exomoons, but none confirmed.

Also on the 12th, NIRCam was supposed to have imaged Neptune.

Also on the 12th, HAT-P-1 slitless spectroscopy. Another hot Jupiter, this one even closer (571 ly), and unusual because it’s larger and less dense than it should be.

On the 13th, the WR137 star was imaged. WR137 is a ‘Wolf-Rayet’ star with unusual emission lines. It’s about 6,000 ly away. It’s also a binary. This one will be really interesting.

On the 14th, Webb imaged asteroid 1998 BC1 and also took its spectrum. This is an asteroid in the belt between Mars and Jupiter, around 15 km in diameter.

Also on the 14th a bunch of Jupiter imagery was done with NIRCam.

Today, the 15th, exoplanet system HD149026 was imaged. This is a fairly close (250 ly) large yellow star. This one has an unusual exoplanet that orbits really close to the star (period of 2.5 days), has about 1/3 the mass of Jupiter but is much denser than a gas giant. It may be a huge, rocky world. It’s so hot it likely glows red.

Also more imagery and spectra of WR137, and throughout the last week lots of imagery of the Small Magellanic Cloud, which looks to be calibration stuff.

And that’s just some of the stuff imaged in the last week. The amount of data we are going to get from Webb will be incredible. In the next couple of days they are going to be looking at SN 1987A, a supernova, lots of high-redshift galaxies, and more imagery/spectra of Trappist-1.

This is the new golden age of astronomy. Between Webb, Gaia, TESS and Kepler, and the upcoming Nancy Grace Roman telescope, our knowledge of the universe is growing exponentially. And the best part is that all the data is open source and anyone can use it to do science and discovery.

Can the JWST only look at things that are on one plane? (i.e. Can it point itself in different directions to look at various objects?)

We usually see the JWST presented as below with the telescope looking “down” (I know there is no “down” in space, just orienting myself). Let’s say the top of the graphic is north, the right side east and so on.

So, the earth and sun are “behind” the telescope to the west and the telescope is pointed (mostly) “south”. Can the telescope ever look at something to the east? Or is it forever locked looking only at things that fall on that one plane (I assume the telescope can spin so it could look “up or north” if it wanted to but I am not even sure about that). I get that as it orbits it can look at things that are “behind it…to the west” today. Wait six months and that stuff will be in front of it.

From the (cited) wiki page:

The sunshield allows the optics to stay in shadow for pitch angles of +5° to −45° and roll angles of +5° to −5°.

That’s a fair amount of pitch ability. Roll is somewhat irrelevant, though it might be needed to get a diffraction spike out of the way of an interesting object. Yaw is the full 360 degrees, of course.

ETA: I should add that all of this stuff is as if you were sitting on the heat shield as if it were the floor. Not much ability to roll left or right, but you can pitch 45 degrees “up” and 5 degrees “down”. A reasonable view of the sky, but if the interesting object is directly above, you’ll have to wait until later in the year to view it.
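In other words, the quoted pitch range means Webb can only look at targets roughly 85° to 135° away from the Sun at any given moment. A toy check of that observable ring might look like this (the geometry is simplified down to a single Sun angle):

```python
def in_field_of_regard(sun_angle_deg):
    """True if a target at this angle from the Sun falls inside the ring
    implied by the +5 to -45 degree pitch range quoted above."""
    return 85.0 <= sun_angle_deg <= 135.0

print(in_field_of_regard(90.0))    # True: near the sweet spot
print(in_field_of_regard(180.0))   # False: directly "overhead" (anti-solar);
                                   # wait for the ring to sweep around
```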

Thanks for that excellent report on recent JWST observations. Just to clarify, last I heard Trappist-1 was believed to contain no fewer than three planets potentially in the habitable zone, with one being a particularly good candidate for life.

Video from post 194 in this thread explains (video cued to the explanation):

I think this is untrue of “deep field” images, in which individual objects have typically been redshifted by amounts that vary considerably. You’d need a separate filter for each object in the field, because interesting spectral lines originally emitted by the object are now all over the infrared spectrum.

I think it’s also untrue of distant galaxies that may be interesting because of their shapes and sizes. They are very faint, and narrow band filters pass very little radiation.

I think spectral line filters are more useful for nebulae within our own galaxy, as well as stars and sometimes planets. They certainly are often used, but there are subfields in astronomy where they’re not, including some of what JWST was intended to image.

JWST is more likely to observe interesting spectral lines spectroscopically, in the case of targets outside our own galaxy, isn’t it?

Sorry (okay, not sorry)

JWST can do both slitted and slitless spectroscopy. Furthermore, it has a grid of shutters over the sensor that allow it to isolate up to 100 targets at a time and take a spectrographic reading of all of them at the same time. This will allow Webb to easily group galaxy spectra together by redshift or other criteria, and speed up star analysis dramatically.

Multiple instruments on JWST have spectrographic capability of varying resolutions and features. JWST also has several types of coronagraph, which will allow it to directly image and take spectra of exoplanets and other objects near bright stars. JWST will supposedly be able to tell if an exoplanet has liquid water on its surface.

Another cool exoplanet hunting tool is that JWST can mask its aperture so that different parts of the main mirror act together as an interferometer, which will give it tremendous sensitivity for exoplanets close to bright stars.

JWST will be able to take spectra and surface temp readings of every known Kuiper belt object. It will probably discover a bunch more of them.

JWST will also be able to get detailed spectra from plumes above Europa and Enceladus.

JWST should also be able to see back to the moment when the first stars and galaxies formed. We already have spectra from JWST for galaxies whose light is 13.1 billion years old, or only a few hundred million years after first light. Shocking to me is that the spectra of those galaxies look almost the same as modern ones. I would have thought that early galaxies would be depleted in the heavier elements.

Apparently imaging the very first galaxies will require a 100 hour exposure, so we’ll probably not see that for a little while. The deep field we already saw was, I think, an 11.5 hour exposure. But we will get good spectrographic data for the first galaxies that formed. That’s either going to confirm things we have only theorized, or it will turn cosmology upside down.
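For anyone curious what “13.1 billion years old” means in redshift terms, astropy’s cosmology tools will do the conversion (assuming the Planck 2018 parameters; the exact number shifts a bit depending on which cosmology you pick):

```python
import astropy.units as u
from astropy.cosmology import Planck18, z_at_value

# Redshift at which the lookback time is 13.1 billion years,
# i.e. light emitted several hundred million years after the Big Bang.
z = z_at_value(Planck18.lookback_time, 13.1 * u.Gyr)
print(z)   # roughly z ~ 7-8 under Planck 2018
```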

I know a Doomsday Machine (aka planet killer) when I see one :slight_smile:

Sorry (not sorry):