Let's say I lie outside on June 25 and it's 95 degrees. Then I switch universes to the same place, time, and weather conditions - except it's 80 degrees. Would I get more sunburnt in the first situation?
My instinct tells me the opposite. You're more likely to sweat at a higher temperature, and the layer of sweat on your skin should give some protection against UV radiation (not very effective protection, though).
That may be true, but what I’m wondering is whether the actual sun rays are stronger. I thought of this while watching the evening newscast. The reporter was talking about precautions to take during the current heat wave, and one of them was wearing sunblock. I thought, why is it more important to wear sunblock during a heatwave than it would have been otherwise on the same day?
Yes, but I’m talking about a situation in which all other conditions are the same. Let’s say today was the exact same except it was 80 degrees instead of 95. Would the sunlight be less powerful?
Nope. Sunburn depends on how much UV radiation you get, which doesn't depend on the temperature.
Far more than temperature, the level of UV exposure is affected by the time of year (thanks to the axial tilt of the earth and the sun's apparent position in the sky), cloud cover (which reflects some of it), altitude (a thinner atmosphere lets more through), and so on. It's why sunburn can be a problem even in the snow - UV rays reflect rather well off snow.
Several weather sites will give the UV index alongside temperature, humidity, etc., to give you an idea of how much risk there is, so you can actually check this yourself sometime. A cold front can move through a region without budging the UV index, which is the operative number for sunburn risk.
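To make the point concrete, here's a minimal sketch (my own illustration; the only hard numbers in it are the standard WHO UV index categories): the burn-risk category is read straight off the UV index, and the temperature never enters into it.

```python
# Map a UV index to the standard WHO exposure categories.
def uv_risk(uv_index: float) -> str:
    if uv_index < 3:
        return "Low"
    elif uv_index < 6:
        return "Moderate"
    elif uv_index < 8:
        return "High"
    elif uv_index < 11:
        return "Very High"
    return "Extreme"

# Same UV index at two very different temperatures -> same burn risk.
for temp_f, uv in [(95, 9), (80, 9)]:
    print(f"{temp_f} F with UV index {uv}: {uv_risk(uv)} risk")
```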
However, the level of UV radiation and the temperature both tend to be correlated with the amount of sunlight we get. So it is true that warmer temperatures are correlated with more sunburns, because, well, more sunlight tends to occur in summer, which is when it's warmer.
The reason for your instinct is that all else usually isn’t equal. If it’s cooler one June 25th than on another, there’s some reason for that. Likely, it’s cooler because it’s cloudy, which would make the cooler day also the less sunburny day.
Now, it’s possible that it’s cooler just because it was cloudy earlier in the day, but it just cleared up, and the temperature hasn’t had a chance to rise yet. In that case, either day would be equally burny, but that’s not so common.
Sunburn is quite a serious problem for people in Antarctica (and presumably in the Arctic too). Use of sunscreen on any exposed skin is strongly recommended. It does not even have to be direct sunlight. The UV can reflect off the ice to burn you.
Here's a thought experiment. Imagine that you are in a swimming pool getting x amount of UV. Now you jump out of the swimming pool into a hot tub. The same amount of water is blocking UV in each case - only the water temperature is different. You don't expect the UV to change here, do you?
A change in air temperature makes no more difference than a change in water temperature.
As others have said, you might think there’s a difference with air temperature because it is pretty strongly related to the amount of sunlight being received. But in your “all else equal” scenario, it’s no different than moving from the pool to the hot tub.
Most of the filtering of the sun's UV-B occurs in the ozone layer, which sits roughly 60,000 to 100,000 feet above sea level. Per the Wikipedia article, the layer's density/thickness varies with the seasons: in the US, there's more ozone in late winter and early spring, and less in late summer and early fall. Depending on your timing (late spring vs. late summer), the same amount of exposure time could get you a more (or less) severe sunburn.
All other things being equal, no: a photon is a photon, and it does not have a temperature. The temperature we speak of is a property of the medium (in this case, air) through which the photons are traveling. A UV photon at a wavelength of 300 nm has exactly the same amount of energy, regardless of what it’s traveling through - and a sunbeam composed of X photons at 300-nm UV has the same sunburn potential, regardless of what it’s traveling through.
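To put a number on that (my own arithmetic, not part of the post above): a photon's energy is fixed entirely by its wavelength via E = hc/λ, and there is no temperature term anywhere in it.

```python
# Quick sketch of the arithmetic: a 300 nm UV photon carries about
# 4.1 eV whether the air it crosses is 80 F or 95 F, because
# E = h*c / wavelength has no temperature in it.

h = 6.626e-34        # Planck constant, J*s
c = 2.998e8          # speed of light, m/s
wavelength = 300e-9  # 300 nm, expressed in meters

energy_joules = h * c / wavelength
energy_ev = energy_joules / 1.602e-19  # convert joules to electron-volts

print(f"Energy of a 300 nm photon: {energy_ev:.2f} eV")  # ~4.13 eV
```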
When it's 30 or 40 degrees out, people tend to bundle up when they go out, and they don't stay out for too long; it's not really necessary to warn people about the dangers of sun exposure in that setting. When it's 90, people head en masse for the lake or lie on a towel next to the pool with most of their skin exposed to the sun for long periods, creating a potential for sunburn; a reminder about sunblock is helpful to many people here.
Note that if you’re out cross-country skiing on a warm winter day (30-40F), during which time you’re likely to leave your face/head uncovered, you can easily get a severe burn. In this situation you are being hit with direct sunlight as well as light that’s reflected off of the snow at your feet. You can actually burn faster than you would for a similar exposure time in the summer.
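For a rough sense of why the snow matters, here's a back-of-the-envelope sketch. The 80% figure is the commonly quoted upper bound for how much UV fresh snow reflects (an assumption on my part, not something from the post above), and treating the reflected dose as simply additive is a crude simplification.

```python
# Crude model: total UV dose = direct beam + the fraction reflected
# back up off the snow. The albedo value is an assumption.

direct_uv = 1.0        # normalize the direct UV dose to 1.0
snow_uv_albedo = 0.8   # assumed: fresh snow reflecting up to ~80% of UV

total_uv = direct_uv * (1 + snow_uv_albedo)
print(f"Relative UV dose over fresh snow: {total_uv:.1f}x the direct beam")
```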
Warnings on the evening news tend to be given when they will do the most good. Thousands of people headed for the water in July? “Remember to wear sunblock.” A few hundred people cross-country skiing on a mild, sunny afternoon in February? I hope they’re smart enough to think for themselves, because the folks on the evening news ain’t gonna say shit about it.