Can the sun harm your eyes even with sufficient UV protection?

If you view the sun through enough UV resistant glass to effectively remove all UV, will it still harm your eyes? If so, what if you also removed all infrared as well? In general, does biology know what intensity (W/m^2 for example) is required to harm your eyes for a given wavelength of non-coherent light?

Well, of course it will harm your eyes. Sunlight is bright enough that just the visible rays, when focused on a point – as the eye’s optics do on the retina – will burn wood. Infrared has less energy than visible light, so removing it has relatively little effect on the potential for retinal damage.
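
To put a rough number on that, here's a back-of-the-envelope sketch of the irradiance on the retina when you stare straight at the sun. Every input below (ground-level solar irradiance, pupil size, the eye's effective focal length, ocular transmission) is just a typical ballpark assumption, so treat the output as order-of-magnitude only:

```python
# Back-of-the-envelope retinal irradiance from staring at the sun.
# All inputs are assumed ballpark values, not measurements.
import math

solar_irradiance = 1000.0      # W/m^2 at ground level (assumed)
pupil_diameter = 3e-3          # m, constricted daytime pupil (assumed)
eye_focal_length = 17e-3       # m, effective focal length of the eye (assumed)
ocular_transmission = 0.5      # fraction of light reaching the retina (assumed)
sun_angular_diameter = math.radians(0.53)  # the sun's apparent angular size

# Power collected through the pupil.
pupil_area = math.pi * (pupil_diameter / 2) ** 2
power_on_retina = solar_irradiance * pupil_area * ocular_transmission

# The sun's image on the retina is roughly focal_length * angular_diameter across.
image_diameter = eye_focal_length * sun_angular_diameter
image_area = math.pi * (image_diameter / 2) ** 2

retinal_irradiance = power_on_retina / image_area   # W/m^2
print(f"Image of the sun on the retina: ~{image_diameter * 1e6:.0f} micrometres across")
print(f"Retinal irradiance: ~{retinal_irradiance / 1e4:.0f} W/cm^2")
```

With those assumptions you get a few tens of W/cm^2 concentrated on a spot roughly 150 micrometres across, orders of magnitude more than the retina gets from an ordinarily bright scene. The exact figure swings a lot with the assumed pupil diameter, but the order of magnitude is why "it will burn" tends to be taken for granted.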

I can’t answer your other question, but there’s certainly plenty known about how much damage a photon of a given energy can do when absorbed by tissues, and how it accumulates.

I think you are being pretty hasty in saying "of course it will harm your eyes." It is not obvious to me that sunlight sans UV or infrared will burn wood when focused on a point. Oh, and how small a point, exactly (it makes a big difference)? And saying infrared has less energy than visible light, and therefore removing it has little effect, seems like a pretty bold statement as well (also a plainly wrong one). According to Wikipedia, infrared accounts for more solar irradiance than visible and UV light combined. Infrared also affects tissue differently from visible and UV light (it heats it more efficiently, of course; it’s also known as infrared heat).

I wouldn’t be surprised if there was plenty known, but I’m posting here to hopefully be a recipient of said knowledge! My google-fu hasn’t turned up what I’m looking for…

Here is the solar spectrum. Note how little of the red area under the curve is UV. So the answer to your first question, whether it will still harm your eyes, is undoubtedly yes.

That’s a little more contentious, but the visible portion is still maybe 40% of the spectrum coming through, so I’m pretty sure it would still cause your eyes damage, although it would take a little longer.
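
If you want to sanity-check that 40% figure, one quick sketch is to treat the sun as a 5778 K blackbody and numerically integrate Planck's law over the UV, visible, and IR bands (the real ground-level spectrum differs a bit, since the atmosphere absorbs most of the short UV and chunks of the IR):

```python
# Sketch: split an idealized 5778 K blackbody (solar) spectrum into UV/visible/IR.
import numpy as np

h = 6.626e-34    # Planck constant, J s
c = 2.998e8      # speed of light, m/s
kB = 1.381e-23   # Boltzmann constant, J/K
T = 5778.0       # effective solar surface temperature, K

def planck(lam):
    """Blackbody spectral radiance at wavelength lam (metres)."""
    return (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * kB * T))

lam = np.linspace(10e-9, 100e-6, 200_000)   # 10 nm to 100 um, uniform grid
dlam = lam[1] - lam[0]
spectrum = planck(lam)
total = spectrum.sum() * dlam

def band_fraction(lo, hi):
    mask = (lam >= lo) & (lam <= hi)
    return spectrum[mask].sum() * dlam / total

print(f"UV      (< 400 nm):   {band_fraction(10e-9, 400e-9):.0%}")
print(f"Visible (400-700 nm): {band_fraction(400e-9, 700e-9):.0%}")
print(f"IR      (> 700 nm):   {band_fraction(700e-9, 100e-6):.0%}")
```

That comes out at roughly 12% UV, 37% visible, and 51% IR for the idealized blackbody, so "maybe 40%" for the visible portion is in the right neighborhood.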

But UV is ionizing. UV causes sunburn, visible doesn’t. It’s more complicated than you are realizing.

Again, isn’t it more complicated than just considering the incident energy? You have to consider how the energy is deposited in the tissue, such as whether each photon dislodges an electron (UV) or causes molecular wiggle (IR) or just causes orbital transitions (visible).

Eye damage from looking directly at the Sun is caused by heating, so the total energy pretty much is all you need to worry about. And even if it’s not enough to burn wood, well, you can damage living tissue much more easily than you can burn wood. There’s no problem dipping a wooden spoon in a pot of boiling water, but you’d never do the same with your hand.

Many surfaces absorb infrared more efficiently than visible light, but not all. The only reason we think of infrared radiation as “heat” is because most radiative heat sources we deal with (fire, light bulbs, electric heaters, etc) are much cooler than the sun, and therefore most of their radiative output is in the infrared.
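
You can see this with Wien's displacement law, which gives the wavelength where a blackbody's emission peaks. The temperatures below are rough ballpark figures, just to illustrate the point:

```python
# Wien's displacement law: peak emission wavelength of a blackbody at temperature T.
# Source temperatures are rough ballpark figures, not measurements.
WIEN_B = 2.898e-3  # Wien's displacement constant, m*K

sources = {
    "electric heater element (~1100 K)": 1100,
    "candle flame (~1800 K)":            1800,
    "incandescent filament (~2800 K)":   2800,
    "surface of the sun (~5800 K)":      5800,
}

for name, temperature in sources.items():
    peak_nm = WIEN_B / temperature * 1e9
    print(f"{name}: emission peaks near {peak_nm:.0f} nm")
```

Everyday radiant heat sources peak a micrometre or more into the infrared, while the sun peaks right in the middle of the visible band, around 500 nm.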

To be fair, even Feynman got it wrong -

Bolding mine.

This is just wrong. Infrared radiation affects molecules differently from other types of radiation (though it is most similar to microwaves). It transmits energy directly to molecular vibration. It heats things up. Visible light transmits energy by promoting electrons to higher orbitals, which then re-emit radiation as they relax back to their lowest energy state. The only way visible light can heat something up is through the re-emission of infrared radiation that is subsequently re-absorbed. This re-absorption is not perfect, and a lot of the energy can be diluted into the environment (heating up the air, heating neighboring tissue over a larger area, escaping the eyeball, etc). UV, in turn, is ionizing radiation, whose effects are different again.

I’d be pretty surprised if you didn’t need to worry about the reflectivity of various visible frequencies. That aside, I think we are all on board that damage to living tissue is a possibility! The question is what is the threshold, and how far above or below that threshold is the visible spectrum coming from the sun when considered alone? I mean, I’ve stared at the sun for maybe 10 seconds through UV-protecting glass without any problems, and on the internet there are various claims of staring for many minutes without any problem, so I don’t think it is clear that, experimentally, the visible spectrum alone can cause damage, and if it can, the amount of light is not too far beyond that threshold.

Any mechanism which results in light being absorbed can and almost always will heat up the thing that’s absorbing the light. There need not be any infrared photons involved at all.

But how efficiently? There is a reason infrared light is preferred in ovens/toasters/heaters/heat lamps, etc, over white light. Are you disputing this? If so, please explain.

Because it’s a lot easier to make. You can get infrared just by running current through an exposed nichrome wire, which is basically what’s going on in a toaster. To get hot enough to emit visible light, you need a material with an extremely high melting point, like tungsten, and you have to put it in a partially evacuated bulb.

I agree with you, but you are dodging my point. I’d rather not distract the thread from the possibility of someone actually answering my question, but I do find your insistence on totally ignoring the different efficiencies of energy transference to heat for different light frequencies perplexing. Transmission/reflection/absorption spectra are different for different materials and frequencies and re-emission spectra are different for different materials and frequencies. Different frequencies have different attenuation depths for different materials, determining how well you can focus energy on a point, and so on…

He’s not dodging the point. Absorption is absorption. The case where the absorbed energy is not converted to heat is the exception at any frequency. There is no fundamental quality to infrared radiation that makes it more efficient. At room temperature and at cooking temperatures, the primary thermal radiation frequencies are in the infrared, but at higher temperatures that radiation is visible or UV. It’s all thermal radiation.

Well, let’s consider what your eye would absorb. Your retina is filled with pigmented rod and cone cells, as well as a network of blood vessels. Opsins, the pigments that are used for vision, efficiently absorb visible light (obviously). Hemoglobin, the major pigment in blood, efficiently absorbs shorter wavelengths of visible light (again obviously, since blood is red). Here are some absorption spectra for opsins and hemoglobin. The lens and cornea absorb most of the UV spectrum, protecting the retina, but they can be damaged by large UV exposures, causing cataracts in the long term. Acute UV damage to the cornea is possible from a “sunburn” of sorts, but this can happen without directly looking at the sun. ETA: Water, another thing that’s present in your eye, can absorb IR very efficiently. (Which is convenient, since lots of things that we like to heat up contain water.)

So, in the OP’s scenario, blocking UV will protect the cornea and lens, but these parts weren’t in much danger from looking directly at the sun anyway. UV blocking will not protect the retina, which can still absorb visible light and IR quite efficiently. Staring at the sun with UV protection is still a very bad idea.

(Fun fact: the “blue” pigment in the eye can absorb UV light, so some people with the earliest artificial lens replacements can see an extra color of sorts).

The question is about eye damage. I agree that regardless of frequency, absorbed energy eventually translates to heat. But the question is how. Absorbed UV radiation will eventually result in heat, but it’s not the heat that damages the eye in this case; it’s the individual photons’ ability to alter chemical bonds.

For visible light, the radiation energy is absorbed by raising an electron’s potential energy, which is then given back when the electron relaxes to the ground state. This process only indirectly causes heat; the absorbed energy is returned to the environment as the electron relaxes. I’m actually not completely clear on how this works (maybe someone here can explain the physical process), since naively heat would never be produced, just continued emission and reabsorption of visible light. My guess is that in the time between emission and reabsorption there are slight changes in the potential of a given molecule due to molecular vibrational states and interactions with neighboring molecules, so the emission energy is often different from the absorption energy, and over the course of many such transitions across neighboring molecules the vibrational modes are slightly raised and the energy is diffused across many molecules. Infrared light, on the other hand, has photon energies that directly correspond to vibrational modes of molecules, and its absorption directly takes the form of heat.

The point is that if you are trying to cause tissue damage by focusing visible light on a point, the fact that the absorption and conversion to heat is indirect matters. On a microscopic scale it may be crucial: instead of a small number of molecules being directly heated to a higher temperature, you may have a larger number of molecules being indirectly heated to lower temperatures. When trying to damage tissue (with non-ionizing radiation), you want to raise the temperature of a microscopic region as quickly as possible, before the heat diffuses into the surrounding tissue.
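
To put very rough numbers on that argument: photon energy is E = hc/λ, and comparing it with ballpark energy scales for molecular vibrations and chemical bonds shows roughly where the regimes change. (The vibrational and bond energy ranges below are order-of-magnitude figures, not values for any particular tissue.)

```python
# Photon energies at a few wavelengths, in electron-volts, for comparison with
# rough molecular energy scales (ballpark figures in the comment at the end).
h = 6.626e-34    # Planck constant, J s
c = 2.998e8      # speed of light, m/s
eV = 1.602e-19   # joules per electron-volt

wavelengths_nm = {
    "far IR (10 um)":    10000,
    "near IR (1 um)":     1000,
    "visible (550 nm)":    550,
    "UV-A (350 nm)":       350,
    "UV-C (250 nm)":       250,
}

for name, nm in wavelengths_nm.items():
    energy_ev = h * c / (nm * 1e-9) / eV
    print(f"{name}: ~{energy_ev:.2f} eV per photon")

# Rough comparison scales (order of magnitude only):
#   molecular vibrational quanta:   ~0.05 - 0.5 eV
#   typical chemical bond energies: ~3 - 5 eV
```

So a far-IR photon carries roughly one vibrational quantum, a visible photon generally sits below typical bond energies, and it's only well into the UV that a single photon carries enough energy to break a typical chemical bond outright.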

Thanks!

But without any real data I’m not convinced of this. It seems like it must be a pretty close call, given that you can stare at the sun for a while without any problems. What’s the threshold? Maybe if we cut out 50% of the visible light, we could stare indefinitely without problems? Given the difficulty of doing controlled studies of vision loss due to staring at the sun, isn’t it possible that no harm is done at all even with no protection (beyond UV)?

Well, after a bit more digging here’s a thorough review of the literature on mechanisms of retinal damage. About damage from looking at the sun, it says:

So it looks like we can ignore heating pretty much entirely (admittedly not what I was expecting). But both UV and visible light cause photochemical damage. Blocking UV would somewhat protect someone looking at the sun, but full protection would also require a fair amount of visible light blocking.

There’s nothing magical about infrared radiation. I think a lot of folks associate it with heat transfer because of the way they are taught heat transfer in high school (conduction, convection, and radiation, with the latter usually presented as infrared). While most objects we are familiar with tend to radiate infrared at the temperatures we are familiar with, that doesn’t mean other frequencies of electromagnetic radiation transfer energy any less efficiently. Microwave ovens, for example, work quite well at transferring energy at frequencies that are much lower than infrared.

It is true that different materials absorb and reflect different frequencies of electromagnetic radiation. But, if the radiation gets absorbed, it gets converted into heat, plain and simple. At high enough power levels, visible light and infrared will both easily damage your eyes.

Once you get part way through the ultraviolet part of the electromagnetic spectrum, things change a bit. At these higher frequencies, electromagnetic radiation becomes “ionizing”, meaning that it can strip the electrons off of atoms and create ions. Ionizing radiation is what most people think of when they hear the term “radiation” since it causes damage at much lower power levels than non-ionizing radiation, and is well known to cause things like cell damage and cancer. For example, UV light will cause cataracts at power levels well below where infrared and visible light will cause heat damage to your eyes. There’s a reason suntan lotions and sunglasses are made to be UV blockers.

With UV protection, you’ll stop lower exposure levels of sunlight from damaging your eyes. Higher exposure levels can still damage your eyes from heating though. If you are foolish enough to look directly at the sun, it will be mostly heat damage that destroys your retinas, not the ionizing effects of UV rays. Something that only blocked UV would not offer much practical protection in that scenario.