Normal household microwave ovens use radiation at 2.45 GHz to heat food. This is much less - by many orders of magnitude - than visible light, which is in the hundreds-of-THz range. I would guess, then, that the energy of a photon of visible light is much more than that of a photon of microwave radiation. But we are constantly exposed to visible light and being bombarded by it from all directions every second of our lives, which seemingly doesn't do us any harm. Why is this? Is it because the total wattage of visible light (i.e., the number of photons?) that hits us is too little? Is it because visible light is spread across many different frequencies, whereas microwave oven radiation is monochromatic?
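To put rough numbers on my guess, here's a quick back-of-the-envelope sketch (the 540 THz figure is just a representative green-light frequency, nothing special about it):

```python
# Back-of-the-envelope photon energy comparison: E = h * f.
PLANCK_H = 6.626e-34   # Planck's constant, J*s
EV = 1.602e-19         # one electronvolt in joules

def photon_energy_ev(frequency_hz):
    """Energy of a single photon at the given frequency, in eV."""
    return PLANCK_H * frequency_hz / EV

microwave = photon_energy_ev(2.45e9)   # microwave oven, 2.45 GHz
visible = photon_energy_ev(5.4e14)     # green light, ~540 THz (representative)

print(f"Microwave photon: {microwave:.1e} eV")   # ~1e-5 eV
print(f"Visible photon:   {visible:.2f} eV")     # ~2.2 eV
print(f"Ratio:            ~{visible / microwave:,.0f}x")
```

So a visible photon carries a couple of hundred thousand times the energy of a microwave-oven photon, which is what prompts the question.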
Different RF frequencies have different penetration depths.
2.45 GHz radio waves can penetrate the human body a few centimeters, so if the power is sufficiently high, they can overheat and damage internal organs.
Light waves, on the other hand, cannot penetrate more than a few millimeters, so any damage is localized to the skin.
And of course, there’s nothing inherently dangerous about microwaves. Things like mobile phones, wifi, bluetooth, GPS, all operate on microwave frequencies.
What makes radio waves potentially dangerous is power. A microwave oven puts out about a kilowatt. Putting your hand in one will be painful because the power is so high. But so would putting your hand next to a 1000-watt incandescent bulb.
For comparison, a mobile phone transmits at most a watt or two, and wifi around a tenth of a watt - orders of magnitude less.
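A rough sense of scale, using round numbers and (unrealistically) assuming the wifi antenna radiates equally in all directions:

```python
# Rough power-density comparison: inside a microwave oven vs. near a wifi router.
# Round numbers only; real routers don't radiate isotropically.
import math

OVEN_POWER_W = 1000.0        # roughly a kilowatt, per the post above
OVEN_CROSS_SECTION_M2 = 0.1  # assumed cross section of the cooking chamber
WIFI_POWER_W = 0.1           # ~100 mW transmit power, a typical wifi figure

oven_density = OVEN_POWER_W / OVEN_CROSS_SECTION_M2

def wifi_density(distance_m):
    """Power density at a given distance, spread over a sphere (inverse square)."""
    return WIFI_POWER_W / (4 * math.pi * distance_m ** 2)

print(f"Inside the oven:     ~{oven_density:,.0f} W/m^2")
print(f"1 m from the router: ~{wifi_density(1.0):.3f} W/m^2")
```

That's roughly a million-to-one difference in power density, which is why the oven cooks and the router doesn't.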
Visible light has been penetrating the atmosphere for a couple of billion years. Any organism with a biochemistry that was too sensitive to those wavelengths was removed from the gene pool long ago. Visible light doesn’t mess us up because we are accustomed to living under it.
Worth pointing out that visible light is only just too low in energy to cause damage. You only need to step a little further, into the UV, to get to energies that cheerfully cause very real damage - from sunburn to cancer.
The question is whether a photon is able to disrupt something when it hits - basically, can it shift an electron enough to allow a chemical change to occur. This is usually framed in terms of ionising radiation, but there is something of a blurry area at the low end, where disruption might occur before true ionisation does. Microwaves are vastly too low in energy to do this, by many orders of magnitude. The damage they do is by cooking you. That needs power, as your body is constantly cooling itself (just to get the energy of ordinary living out of the tissues), so you need to overwhelm this natural cooling before you can start to cause damage anyway. But energetic photons do damage one by one, and the total power delivered doesn't affect the ability to cause damage - more photons just means more damage.
This harks back to one of the critical early demonstrations of the quantum nature of light: the photoelectric effect, where whether electrons get knocked loose depends on the frequency of the light, not its intensity.
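To put the "many orders of magnitude" in perspective, here's a rough scale comparison against a typical chemical bond (the 3.6 eV figure is just a ballpark C-C single-bond energy, used for scale):

```python
# Rough scale comparison: photon energy vs. a typical chemical bond.
# Ballpark numbers for illustration only.
PLANCK_H_EV = 4.136e-15   # Planck's constant in eV*s
BOND_EV = 3.6             # ballpark C-C single-bond energy, eV

for label, freq_hz in [("microwave (2.45 GHz)", 2.45e9),
                       ("visible  (~540 THz)", 5.4e14),
                       ("UV       (~1.2 PHz)", 1.2e15)]:
    energy_ev = PLANCK_H_EV * freq_hz
    print(f"{label:22s} {energy_ev:9.2e} eV  "
          f"({energy_ev / BOND_EV:.1e} of a bond)")
```

A visible photon comes in just under the bond energy, a UV photon just over it, and a microwave photon is millions of times too weak - no number of them, delivered one at a time, will ever break that bond directly.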
The thing with microwaves is that they are at the resonant frequency of hydrogen; that is why they can transfer energy into stuff with hydrogen in it, like last night's pizza or your own body.
An analogy would be breaking a crystal cup with sound waves: hit the right frequency, not too low or too high, and it breaks even if the noise is at a very tolerable level for a person.
Microwaves really aren't all that dangerous. Sure, if you get enough of them concentrated in one place (like inside a microwave oven) then you can cook things, but you can do the same thing with visible light. 2.45 GHz does happen to be a frequency at which the energy from the incoming wave is coupled fairly well into water, as well as certain sugars and fats. Other than that, though, there's nothing special about microwaves.
Microwaves are used all the time without causing harm. The automatic door opener at your grocery store is most likely the type that blasts you in the face with microwaves and listens for the echo. If it detects an echo it opens the door. Cell phones, cordless phones, wi-fi, police radar guns, GPS, Bluetooth, and a whole bunch of other things use microwaves, and they don’t cause harm (no matter how much you may read to the contrary on the internet).
There is an important distinction between ionizing and non-ionizing radiation. Part way through the ultraviolet portion of the electromagnetic spectrum, the radiation becomes energetic enough to strip electrons off of atoms and create ions (hence, ionizing radiation). This is what causes things to fade when left out in sunlight, and it also causes cell damage and things like cancer. The visible light spectrum is too low in frequency to be ionizing, and as the OP notes, microwaves are even lower than that.
So yeah, the inherent danger in microwave frequency radiation is pretty small.
You do a lot more damage to your body by walking out into sunlight than you do sitting next to a wi-fi antenna. Sunlight contains ionizing radiation. Wi-fi signals don’t. While your body is designed for a certain amount of exposure to sunlight, as this is how your body produces vitamin D for example, you can get cancer from too much exposure to the sun. You can’t get cancer from sitting next to a wi-fi antenna.
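To put a rough number on where that ionizing boundary sits in the UV, here's a quick sketch using hydrogen's ionization energy (13.6 eV) as a convenient yardstick; other atoms and molecules ionize at somewhat different energies:

```python
# Roughly where in the UV does radiation start to ionize?
# Compare photon energy at a few wavelengths with hydrogen's
# ionization energy (13.6 eV), used here as a convenient yardstick.
H = 6.626e-34      # Planck's constant, J*s
C = 3.0e8          # speed of light, m/s
EV = 1.602e-19     # one electronvolt in joules
IONIZATION_EV = 13.6

for label, wavelength_nm in [("violet edge of visible", 400),
                             ("mid UV", 200),
                             ("far UV", 90)]:
    energy_ev = H * C / (wavelength_nm * 1e-9) / EV
    flag = "ionizing" if energy_ev >= IONIZATION_EV else "not ionizing (for H)"
    print(f"{label:24s} {wavelength_nm:4d} nm  {energy_ev:5.1f} eV  {flag}")
```

So by this yardstick you have to get well into the far UV before photons can outright ionize, though (as noted upthread) chemical-bond damage starts at lower energies than that.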
Yes. Microwaves are actually probably not dangerous to humans. I read of an experiment recently where an entire house was heated with microwaves. Rather, the house wasn't heated; the humans inside were. Very energy efficient. The only problem was hot spots - the experimenters used better diffusers than a microwave oven uses in order to spread the microwaves around more evenly.
Well, that, and pacemakers - that would be bad if you had one. Probably.
Evolution …
Yeah, when microwaves damage things, it’s by heating them up and cooking them. Which is something any form of energy at all can do. Put a living thing in a one-cubic-foot box with ten 100-watt light bulbs, and they’ll be cooked just as thoroughly as they would be by a 1000 watt microwave oven.
AIUI, radiation interacts most strongly with things similar in size to its wavelength.
More like Stockholm Syndrome.
I was recently released from the hospital due to sunburn. just sayin’.
Think of it like heat.
Heat is beneficial to humans, even an absolute necessity sometimes (I know that from living through 60+ Minnesota winters).
But going into a blast furnace would be clearly “hazardous for human beings”.
So the amount involved is quite relevant. Humans are rather fragile – many things that we enjoy or even require are only OK within a narrow range; too much or too little is harmful.
Absorption of electromagnetic radiation only occurs when charge can move over a distance comparable to the wavelength. (That's oversimplified, of course – in fact absorption still occurs off wavelength, but rapidly less efficiently.) Whether a given type of radiation is absorbed depends on whether the absorber has electric charge that is mobile over the right distance. In short, whether there is an "antenna" of the right size.
So, for example, infrared radiation has a frequency that corresponds well with the natural vibrational motions of molecules, so almost any solid or liquid material absorbs IR well. Molecular gases will absorb weakly, and only in very specific bands. (This is why we can speak of greenhouse and non-greenhouse gases – some gases absorb in the relevant wavelengths, many do not.) By taking careful note of exactly what frequencies of IR light are absorbed (or reflected) by a substance, it is often possible to identify it chemically. This is how New Horizons identified the various components of Pluto's surface.
Visible light, interestingly, corresponds to very little in the way of motion at the atomic scale. It is too high in frequency to excite vibrations, and too low to excite the transition of electrons from one level to another in an atom or small molecule. There are only two ways for an ordinary substance to absorb visible light strongly: one is to contain transition metals (e.g. iron or titanium), which have unusually closely-spaced electron levels, so that transitions between them absorb visible light. This is what gives the color to almost all minerals. The other way is with large molecules – if the molecules are long enough, there will be room, essentially, for electrons to run back and forth and form an "antenna" long enough to absorb visible light. This is the origin of color in organic dyes. The green of plants comes from a big molecule – chlorophyll. The color of humans is dominated by the color of blood, which itself comes from the iron atom at the center of every hemoglobin molecule, and by melanin, which is a big molecule. Still, on the whole, there is not much in humans that easily absorbs visible light. This is why your eyes have to have special molecules to detect the stuff – you can't just see with your skin, the way you can feel heat (IR) and even tell where it's coming from with your skin.
When you get to the ultraviolet, you begin to be able to boost electrons to higher orbits, at least in the lightest of atoms and in medium to large molecules. This means, again, most things absorb in the UV. Since boosting electrons quite often changes the chemical bonding, it also means UV is usually pretty destructive of chemical bonds. This is what causes things made of natural fibers (or synthetics made from hydrocarbons) to bleach and become brittle in the Sun: the UV destroys the chemical bonds that make them strong and also give them their color. Obviously UV is also dangerous to your skin, because it can break chemical bonds there. Your body goes to some trouble to create stuff that absorbs UV and transmutes it harmlessly to heat to avoid that.
Microwaves tend to be absorbed by most materials because they correspond to rotational motions of molecules. Microwave ovens in particular use a frequency that liquid water absorbs well, so when you flood something with microwaves, the oscillating field sets the water molecules spinning like crazy. Of course, they crash into other molecules and the motion dissipates into increased jostling and general wiggling – i.e. heat. Since you're mostly made of water, you can be "cooked" by microwaves in the same way. But all it's doing is heating you up – unlike UV, it cannot generally cause chemical reactions, i.e. direct damage. (Heat damages you because it deranges the rates of key chemical reactions, and ultimately because it causes proteins to lose the precise shape required to do their jobs.)
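For a rough sense of the pure bulk-heating point, here's a back-of-the-envelope heating rate for a cup of water, assuming (optimistically) that all the oven's power ends up in the water:

```python
# Rough heating-rate estimate for a cup of water in a microwave oven.
# Assumes all the magnetron power ends up in the water (an overestimate).
POWER_W = 1000.0        # ~1 kW oven
MASS_KG = 0.25          # ~250 g, about a cup of water
SPECIFIC_HEAT = 4186.0  # J/(kg*K), liquid water

heating_rate_k_per_s = POWER_W / (MASS_KG * SPECIFIC_HEAT)
time_to_boil_s = (100.0 - 20.0) / heating_rate_k_per_s  # from 20 C to boiling

print(f"Heating rate: ~{heating_rate_k_per_s:.1f} K/s")
print(f"Time to boil: ~{time_to_boil_s:.0f} s")
```

About a degree per second - entirely a matter of dumped power, with no chemistry involved until the temperature itself starts doing damage.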
Materials in which electrons are held rigidly tend to absorb poorly; diamond and glass, for example, are transparent. On the other hand, materials in which electrons are free to move absorb wonderfully, so for example metals absorb nearly any kind of radiation below the far UV.
X-rays tend only to be absorbed by the heaviest atoms (chiefly heavy metals) that have sufficient energy between their energy levels; otherwise most matter is transparent to them. When you get up to gammas, absorption occurs in the nucleus, between nuclear energy levels, and depends on the exact structure of the nucleus. In the other direction, in the radio regime, matter is exceedingly transparent and you have to make unusual efforts – build big antennas – to absorb it at all. An exception is large regions of plasma (gas made of ions and electrons), which is the gaseous equivalent of a metal and can readily absorb, reflect or refract radio waves.
I should add that just because something doesn’t absorb light doesn’t mean it will be transparent. Generally, in fact, it will not – because it will reflect or scatter the light. The question of absorption can be quantified as the wavelength- (or frequency-) dependent albedo of an object or substance – how much of the incident radiation is absorbed, versus reflected and scattered (or more rarely transmitted). The albedo of a person in the IR is low – you are nearly “black” and will absorb most infrared radiation that falls on you. In the visible, your albedo is high – you are “white” and will reflect or scatter most of the radiation that falls on you. In the microwave and UV you’re pretty black, in the radio almost entirely transparent (because you’re very thin on the length scale of radio waves), and so forth.
See Easy-Bake Oven, which allowed children to bake cakes using the heat emitted (in its earlier versions) by an incandescent light bulb.
Jesus… how badly burned do you need to be to be admitted to the hospital? :eek:
To be fair, your injury was due to invisible light, not visible light.
Especially if you paint your hand black, so that it absorbs most of the incident light (instead of reflecting most of it).
If the energy of a given photon is high enough, that photon by itself can damage living tissue. But that's only applicable to UV, X-rays, and gamma rays. Visible, infrared, radio and microwave photons don't have enough energy in any one photon to do that; the only way they can damage you is if too many of them hit you at once.
Direct sunlight at sea level has an intensity of about 1000 watts per square meter. Nice and warm, but not enough to cook your flesh. Use a magnifying glass to concentrate all those rays into a much smaller area, and you can heat things enough to light them on fire.
A microwave oven may have 1500 watts of power, with the cooking chamber having a cross section of maybe 0.1 square meters. Assuming the microwave rays are evenly dispersed, this would be an intensity of 15,000 watts per square meter. Imagine the heat of 15 suns on your back (again, painted black for best absorption), and you start to understand the difference.
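Spelling out that arithmetic with the same round numbers:

```python
# Sanity check of the "15 suns" comparison above.
SUNLIGHT_W_PER_M2 = 1000.0     # direct sunlight at sea level, roughly
OVEN_POWER_W = 1500.0          # oven power assumed in the post above
CAVITY_CROSS_SECTION_M2 = 0.1  # assumed cross section of the chamber

oven_intensity = OVEN_POWER_W / CAVITY_CROSS_SECTION_M2
print(f"Oven intensity: {oven_intensity:,.0f} W/m^2 "
      f"(~{oven_intensity / SUNLIGHT_W_PER_M2:.0f} suns)")
```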
No they aren't. The resonant frequency of hydrogen is 1.4 GHz, and most commercial microwave ovens run at 2.45 GHz. Not even particularly close.
I stand corrected; I checked, and it has nothing to do with resonant frequencies of hydrogen or anything else for that matter.