What would UV or IR light look like?

I realize there have been threads like this before, but from what I can tell none have covered my question as I’m asking it.

Until recently I always assumed that if we could see UV or IR wavelengths they would look like some “new” colour, one that we cannot possibly imagine.

Recently I considered a new idea. The visible spectrum can be shown as a wheel, with red and violet merging nicely just as blue and green do. Could the range of colours in the visible spectrum simply be the range that a human brain is capable of imagining, “stretched” over the range that we’re capable of physically detecting? If we could see IR and UV, would IR look red and UV violet, with the wavelengths we now see as “red” looking somewhat orange and the “violet” wavelengths somewhat blue? Or if you could only see the middle of the visible spectrum, could light that we see as yellow look red, with green light appearing to be violet? Is the price for our (comparatively) wide visible spectrum a lesser ability to distinguish between different colours?

Here is a lazily edited image to illustrate the idea.

P.S. Not sure if this is a “General Question” or a “Great Debate” since I’m too ignorant on the matter to know if there is (or even ever could be) a concrete answer to this question.

We don’t see these things because our eyes don’t have receptors for them. There’s no way to say what they would look like. But we would probably compare them to objects which reflect a lot of these types of light, just as we use apples to describe the color red, and the sky to describe the color blue.

There is no simple or indeed single answer.

You could argue that our cognitive ability to determine colour is limited to the current gamut, and thus any extension of the physical wavelengths we can perceive must be fitted somehow into that gamut. Thus you get a remapping of colours, and that remapping could come in arbitrary ways.

We could imagine extending our eyes’ perception of colour in a number of logical ways. An obvious one is to extend the range of the cones, and the wavelengths over which their sensitivities overlap, so that the entire visible spectrum is simply stretched. This would mean that we would perceive UV as violet and IR as deep red. But it would also mean that all the currently visible colours would shift: oranges would become more yellow, yellows greenish, blues more green, all to make room for the IR and UV. This would be the easiest way of modifying the visual system. No new perceived colours, but everything in our current vision would take on a different colour.
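The “stretching” idea above amounts to a simple linear remap: every wavelength in a widened detectable range is assigned the hue of its proportional position in the old range. A minimal sketch, where the extended 300–1000 nm range is a made-up assumption for illustration:

```python
# Sketch of the "stretched spectrum" idea: a widened detectable range
# is mapped linearly back onto the hues the brain already knows.
# Both ranges below are illustrative assumptions, not physiology.

OLD_RANGE = (380.0, 750.0)   # approx. human visible range, nm
NEW_RANGE = (300.0, 1000.0)  # hypothetical extended range (UV..near-IR), nm

def perceived_wavelength(actual_nm: float) -> float:
    """Return the equivalent old-range wavelength whose hue a wavelength
    in the extended range would take on after 'stretching'."""
    lo_new, hi_new = NEW_RANGE
    lo_old, hi_old = OLD_RANGE
    frac = (actual_nm - lo_new) / (hi_new - lo_new)
    return lo_old + frac * (hi_old - lo_old)

# A wavelength we currently see as deep red (~700 nm) shifts toward
# orange/yellow, making room for IR at the red end of perception.
print(round(perceived_wavelength(700.0), 1))  # -> 591.4
```

Consistent with the post: 700 nm red lands around 591 nm, i.e. orange, while the extremes of the new range (300 nm and 1000 nm) map to the old violet and red endpoints.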

A more advanced modification would be to add new types of cones - adding specific IR or UV (or both) receptors, and modifying the colour coding system. This could be done in an almost arbitrary manner. But intrinsically the brain would be delivered 4 or 5 channels instead of 3. The manner in which the sensors provide an overlap in sensitivity would determine the form of the gamut possible, but in the end it is impossible to say what the brain would perceive. It would involve new brains. You may as well ask what colour octarine is, or try to imagine a blind person asking what red is.

Our retinas are slightly sensitive in the near UV, and people who have cataract surgery and don’t get a UV-absorbing replacement lens report seeing near UV. But because all the sensors in the eye are roughly equally sensitive to UV, it appears whitish, an annoying washing-out of contrast and colour definition.

The manner in which the eye perceives and codes colour is remarkable - and quite a bit different to the standard way that is taught. The brain does not receive red, blue, and green signals, and indeed the eye does not have red, blue, and green receptors either - not in the sense that there are receptors specific to the primary colours. It is much messier.

What, you can’t see the colours murple and gred?

If you put a UV filter on a video camera and point it at a UV light, you get this amazing yellow-purple-green color, which makes no sense, but that’s the best way I can describe it. Obviously it’s actually only RGB values, but it shimmers in an interesting way, and I’ve always imagined that if we could see UV it would look like that.

For IR I imagine it as a dull, throbbing, angry, earthy red/blue/black.

That’s a writer’s description, but probably as good as any scientific answer you’d get.

Actually, that’s debatable; there’s at least some evidence that a few percent of women have 4-channel color vision, although it’s limited to the normal range.

If I were to take a stab at it, I’d compare the visual spectrum to the auditory one. As we look at colors, the lowest and highest frequencies we perceive basically start to wrap around again, creating what would be analogous to a visual octave. Extending that analogy, we could imagine the spectrum repeating in the same way pitch does.

For instance, according to Wikipedia, the low end of red is 400 THz and the high end of violet is 789 THz. Thus, we could probably even go with the idea that the spectrum might repeat with every doubling of the frequency, just like with sound (i.e., A is 440 Hz and appears again at 220 Hz and 880 Hz). So we’d see a “low red” again in the 200-250 THz range and a “high red” again in the 800-1000 THz range, and it would repeat like that to the edge of perception.
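The octave analogy above can be made concrete: fold any frequency into a reference “visual octave” by repeatedly halving or doubling, exactly the way 220 Hz, 440 Hz and 880 Hz all read as the note A. A toy sketch using the 400 THz figure quoted above - that colour would actually repeat this way is pure speculation:

```python
import math

# Fold an arbitrary light frequency into one reference "visual octave"
# [400, 800) THz, the way pitch classes fold audio frequencies.
# The octave bound comes from the visible-spectrum figure quoted above.

OCTAVE_LOW_THZ = 400.0  # low end of red

def fold_to_visual_octave(freq_thz: float) -> float:
    """Return the equivalent frequency inside [400, 800) THz."""
    if freq_thz <= 0:
        raise ValueError("frequency must be positive")
    octaves = math.floor(math.log2(freq_thz / OCTAVE_LOW_THZ))
    return freq_thz / (2 ** octaves)

# A "low red" at 220 THz and a "high red" at 850 THz both fold back
# into the reference octave near the red end:
print(fold_to_visual_octave(220.0))  # -> 440.0
print(fold_to_visual_octave(850.0))  # -> 425.0
```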

Related to this, is there a reason that the top and bottom ends of the spectrum appear to “overlap” such that they can be put on a wheel? Why should the very top end of the frequency spectrum appear as an increasingly reddish blue (i.e. purple) such that it blends perfectly with the red at the opposite end?

Would it be possible to take a picture with a UV/IR sensitive camera and photoshop the results to look like this? Has anyone done so?

McCoy: C’mon, Spock, it’s me, McCoy. You really have gone where no man’s gone before. Can’t you tell me what it felt like?
Spock: It would be impossible to discuss the subject without a common frame-of-reference.

McCoy: You mean I have to die to discuss your insights on death?
Spock: Forgive me, Doctor.

----- Star Trek IV: The Voyage Home (1986)

The colors in a color wheel do not match up with those in the single-frequency spectrum. In particular, the reds and violets are blended at the join point. For instance, in this color wheel, the two upper-left colors are not to be found in the EM spectrum.

The CIE chromaticity diagram is a good way to see this. In this figure, the single-wavelength colors are labeled around the outside. Notice that the blues stop at the bottom left and the reds stop at the bottom right. All the colors across the bottom (the red-blue “merging” colors) are composite colors, as are all the colors in the inner regions.

How to see IR.

I believe most modern digital SLR camera sensors are capable of detecting IR/UV. But the lenses probably have coatings to filter these wavelengths out.

I know there are filters available for blocking out most of the IR/Visible spectrum, and capturing “reflective UV”. And, the other way around to capture IR.

I’m not a photographer, and don’t have the equipment, but it’d be easy enough to composite images as long as one is able to capture a channel for R, G, B, IR and UV.
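Compositing the five captured channels down to a displayable image, as suggested above, could be done any number of ways. One simple sketch: blend the UV channel into blue and the IR channel into red. The blend weights here are arbitrary illustrative choices, not any photographic standard:

```python
# Sketch of compositing five captured bands (R, G, B, IR, UV) into a
# displayable RGB pixel by folding the invisible bands into their
# nearest visible neighbours. Weights are arbitrary for illustration.

def composite(r, g, b, ir, uv, w_ir=0.5, w_uv=0.5):
    """Blend IR into red and UV into blue; clamp to the 8-bit range."""
    clamp = lambda v: max(0, min(255, round(v)))
    return (clamp(r + w_ir * ir), clamp(g), clamp(b + w_uv * uv))

# Made-up per-pixel band values:
print(composite(100, 80, 60, 200, 40))  # -> (200, 80, 80)
```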

But, why stop there? Let’s go into Xray and Microwave. :cool:
ETA: Our clothes would have to be made of pretty good insulating material if we could naturally see infrared; assuming we were still concerned with concealing our hoo-hoos, bazoombas, and wang’a’langas.

These might be an example of ‘seeing’ IR light.

Didn’t notice this before I posted my above link.

Modern DSLRs have an IR-blocking filter over the sensor; the company in the link I provided above removes this filter and replaces it with various filters to enhance the IR effect.

I had one of my cameras modified a few years ago with an IR 72 filter. Great fun; IR images handheld with normal exposure times.

Like this and this.

You can see IR and UV like that, but you’re just substituting other colors. There are many ways to do that by varying the intensity and consistency of colors ordinarily perceived.

Well, consider this.

You look at an apple. It looks red to you.

I look at the same apple. It looks red to me.

How the heck do we know that “red” looks the same way in both our brains? In other words, the apple reflects a certain pattern of light, our eyes receive it, and we both use the same word to describe the pattern we see. But we don’t see with our eyes: nerves in our eyes are stimulated, and our brains interpret those signals.

So there’s no way to say that the image that forms in my brain when I see a red apple is the same as the image that forms in your brain when you see the same red apple.

Or to look at it another way, imagine talking to someone who is color blind. You want to describe “red” to them. You can say, “it’s the color you see when you look at that red apple”, but they won’t understand how that color is different from the color you see when you look at a green apple. Both apples look the same to them.

Sure, you could analogize to “dark grey” and “light grey”. But it wouldn’t make sense to say to a color blind person that if they could see red they’d see it as dark-dark grey and green as light-dark grey.

Cell phone cameras and other cheapish digital cameras can see infrared. Just point one at the business end of a TV or cable box remote. It looks somewhere between purple and white.

I’ve occasionally seen spots of the same color when taking pictures of chrome or tinsel in sunlight.

All the time. The place we see this most often is in remote sensing, but also in astronomical images. False colours are very common. In any imaging system that has either different spectral ranges than the eye, or more than three bands (or both), one needs to perform a mapping from that space to RGB. Even the early Landsat imagers had four channels and saw into the IR. The common (trivial) colour map was to just drop one of the visual colours (green, I think) and replace it with the IR channel. The result was actually quite visually acceptable, and since the IR channel was sensitive to foliage, the green in the image did actually correspond to vegetation, so the images made some sort of sense. When you get to multispectral images it gets arbitrarily complex. You get imagers that have 80-plus channels, compared to our puny three. Anything goes.
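The drop-one-band substitution described above is trivially simple in code. A minimal sketch following the mapping as the post recalls it (IR displayed in the green slot); pixels are (R, G, B, IR) tuples with made-up values:

```python
# Simple false-colour mapping: discard the captured green band and
# display the near-IR band in its place, so IR-bright vegetation
# renders green in the output image. All pixel values are made up.

def false_colour(pixel):
    r, g, b, ir = pixel
    return (r, ir, b)  # IR channel shown as green

# Toy pixels: vegetation reflects strongly in near-IR, soil does not.
vegetation = (40, 90, 30, 220)
bare_soil = (120, 100, 80, 90)

print(false_colour(vegetation))  # -> (40, 220, 30), bright green
print(false_colour(bare_soil))   # -> (120, 90, 80), dull brown
```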

Probably the more interesting question would be: ignoring what the notion of colour would be, what additional nuances and information would we see if we could see into the UV and IR? (Worth pointing out that there is a big difference between near IR and the deep IR used by heat imagers. There is essentially no possible way we can see into the deep IR; our body heat alone makes it essentially impossible. But near IR, much as seen by digital cameras, is conceivable.)

Depends on what you mean by “IR”. Some synthetic fabrics are transparent to near IR, and that can be an issue with cameras that see in that range: Someone wearing such clothes appears, in that camera, to be naked. But that’s basically just the same process as visible light: Ambient light bounces off the skin, and into the camera. Most natural fibers, though, are just as opaque in this band as they are to visible light, regardless of their insulative properties.

Given that you mention insulation, you’re probably referring to thermal IR, or infrared in the band that’s produced by our body heat. You can definitely see that through normal indoors clothes (to a camera in that band, a winter coat seems to be a cloak of invisibility), but in not nearly enough detail to be an offense to modesty.