Help me settle a bet: More electromagnetic radiation inside a room from a cell phone, the sun, or something else?

We recently went to a fun science trivia night at the local brewery, and one of the questions was:

Sitting in this room right now, what is the single most powerful source of electromagnetic radiation you are receiving?

I answered “the sun” and another team said “the lights [in the room]”, but the correct answer was apparently “your cell phone, because of the inverse square law”.

I challenged that answer because intuitively it didn’t seem quite right, so we took it to email. Several hours and much math and suffering later, we still don’t have a conclusive answer. What do you think, Dopers?

Factors to consider:

  • The event took place inside a west-facing room at Worthy Brewery in Bend, OR. On the satellite view, it is the very lower left (southwestern) part of the building, directly below the darker solar panels and south of the telescope (yes, our local brewery has a small telescope!). The room is located at 44.055179, -121.260996 and facing due west.
  • It was about 7:00 PM PDT when that question was asked (we observe daylight saving time).
  • The room let in a lot of natural light, with many ceiling-to-floor windows, but they were partially shaded (about 30% shaded).
  • That room had many light fixtures, probably LED. It also had a projector that was on.

Our estimates so far:

  • Solar irradiance at that time of day hitting our town: About 300 W/m^2
    • After accounting for angle, attenuation by the windows, etc.: About 100 W/m^2
  • Phone: ~0.5 W continuous, 2-4 W peak (if we were being generous)
  • Another person: ~100W+ at idle, maybe a bit more if they were thinking really hard
  • LED bulbs: 10W each
  • Projector: Maybe 150 W

It seems like even if the sun were further attenuated (by the partial blinds or just the layout of the room), it would still be a contender… it just has to beat the phone’s measly 4 W.

And even if the sun weren’t the strongest source, it seems like any of the other sources would still win out.
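
For what it’s worth, here is a minimal Python sketch that just lines those estimates up side by side. The 0.5 m² sunlit cross-section of a seated person is an assumed illustration, and the projector/LED figures are electrical draw rather than measured radiated output:

```python
# Rough tally of the estimates above (a minimal sketch; the geometry factors
# are assumed for illustration, not measured).

indoor_sun_irradiance = 100.0  # W/m^2, the angle/window-adjusted guess above
sunlit_cross_section = 0.5     # m^2 of a seated person facing the windows (assumed)

sources_w = {
    "sunlight on you": indoor_sun_irradiance * sunlit_cross_section,  # ~50 W
    "projector": 150.0,            # electrical draw, not all of it radiated at you
    "one LED fixture": 10.0,
    "person next to you": 100.0,   # metabolic heat, only partly radiated toward you
    "cell phone (peak)": 4.0,
    "cell phone (typical)": 0.5,
}

for name, watts in sorted(sources_w.items(), key=lambda kv: -kv[1]):
    print(f"{name:>22}: {watts:6.1f} W")
```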

If the question had been about radio-frequency radiation only, then yes, probably the cell phone would win. But given the way it was phrased, “electromagnetic radiation” would include all the other sources of visible light and infrared that we tend to take for granted, but are nonetheless quite powerful compared to a power-sipping cell phone.

Who was right? How can we best settle the bet?

Probably start by agreeing amongst yourselves on what the question actually was. Photons of all energies from DC to gamma rays, visible light only, radio only or ???

After that we can begin to catalog the emissions and absorptions of each.

Probably the Sun, maybe the room lights or projector, but nothing else is even close. Unless the windows are very, very small, the Sun is accounting for hundreds of watts of power. The room lighting might be accounting for hundreds of watts, if it’s a large room (but still with relatively small windows), and they’re somehow still incandescent lights. Each human body in the room is shedding around a hundred watts, but most of that is going to be via convection or conduction, not radiation. A cell phone can’t get anywhere even close to 100 watts, unless the battery has just shorted out.

“Because of the inverse square law” is just incoherent as an answer, because the question asked for “most powerful”, and the inverse square law doesn’t change power, just intensity. A cell phone might be the most intense source in the room, for any person in the room, but that’s not what the question asked.

Are we sure that Randall Munroe hasn’t already answered the question? If we’re sure, we could ask him politely.

Agree, and the question is worded very poorly. But based on their answer, I am going to assume they meant EM field intensity at your skin. In other words, if we look at all of the EM sources that are beaming energy to you (the Sun, room lights, cell phone, TVs, etc.), and we sort them from highest to lowest in terms of W/m² at your skin, which is #1?

Since the question asks about the source of radiation, wouldn’t the sun be the clear answer pretty much everywhere in our solar system? I think they intended to ask about how much radiation was reaching people in the room, but as for sources, it’s difficult to rival the sun.

The inverse square law actually works against the phone and for the Sun.

If you move ten more feet away from the Sun, the electromagnetic radiation drops so little it couldn’t be measured. For the cell phone, 10 feet away is a big deal.
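
A minimal sketch of that comparison, assuming the phone sits about 30 cm from your torso:

```python
# Inverse-square sketch: move 10 feet farther from each source and compare
# the fractional drop in intensity (I is proportional to 1/r^2).

AU_M = 1.496e11   # Earth-Sun distance in meters
STEP = 3.048      # 10 feet in meters
PHONE_R = 0.3     # assumed ~30 cm from phone to torso

def frac_drop(r, dr):
    """Fraction of intensity lost by moving from distance r to r + dr."""
    return 1.0 - (r / (r + dr)) ** 2

print(f"Sun:   intensity drops by {frac_drop(AU_M, STEP):.1e} (immeasurably small)")
print(f"Phone: intensity drops by {frac_drop(PHONE_R, STEP):.1%} "
      f"(about {((PHONE_R + STEP) / PHONE_R) ** 2:.0f}x weaker)")
```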

I mean, for that matter, we’re also receiving radiation from a lot of other stars, some of them much more powerful than the Sun.

(side point - I love these questions, so very, very much. This is why I’ve stuck with The Straight Dope. Now back to your regularly scheduled question).

Well, it was a science trivia night, and the entire category was on electromagnetic radiation. If they wanted to specify a particular part of the spectrum, they would’ve (some of the other questions were more specific), but since they didn’t… I think it’s only fair to assume it’s the entire spectrum as generally understood from, say, a standard high-school chart of the EM spectrum.

So that should be everything we commonly know about, including RF, visible light, body heat as IR, etc. Maybe not any exotic hypothetical phenomena that would take a few PhDs to even be able to name… just the regular stuff.

My understanding is that most sunlight that passes through the atmosphere is in visible light anyway, and the windows probably block most of the UV. I looked up some common solar heat gain coefficients of windows in our state (which I think is a measure of how much total broad-spectrum solar energy remains after a window blocks some of it), and they let in from 17%-40% of the original sunlight.

This doesn’t take into account the angle of the sun above the horizon vs the perpendicular of the windows, but given that the illumination is so diffuse anyway (the sunlight is “everywhere” in the sky relative to tiny little us), I don’t think that would make that big of a difference.

Depending on the variables, by my calculations there should still be somewhere between 20-50 W/m^2 inside the room… even at the worst case. That might be enough to make sunlight lose out to an incandescent bulb, or a “single” LED fixture consisting of several bulbs… but either would still be a lot more than the cell phone (which typically operates at around 0.5 W idle).
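
Here is a quick sketch of that attenuation chain. The incident-irradiance endpoints, the SHGC range, and the 70% unshaded fraction are the rough figures from this thread, not measurements:

```python
# Attenuation chain for sunlight reaching the room; all inputs are the rough
# figures from this thread rather than measurements.

incident = (100.0, 300.0)  # W/m^2 on the glazing: angle-adjusted low, raw high
shgc = (0.17, 0.40)        # solar heat gain coefficient range for local windows
unshaded = 0.70            # about 30% of the glass was shaded

low = incident[0] * shgc[0] * unshaded
high = incident[1] * shgc[1] * unshaded
print(f"Indoor solar irradiance: roughly {low:.0f} to {high:.0f} W/m^2")
```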


Yeah, the peculiar phrasing of:

“Sitting in this room right now, what is the single most powerful source of electromagnetic radiation you are receiving?”

makes for several different possible answers =/

  • Are we measuring power or intensity?
  • Do we care about absorption into the body, or just what hits the skin?
  • Does the “source” itself have to be in the room with us, or just transmit into it?

I think that’s the fairest way to interpret that question… total EM intensity reaching a human-sized perfect absorber in that room.

Really? That’s news to me (if we’re talking about intensity received by us, as opposed to the star’s output power). What other sources can compete?

The trivia host also shared another person’s post from an email group he’s part of:

Interesting question. Probably too many variables to get a precise answer, but let’s dance.

Let’s start with an estimate of the sun’s power. The sun delivers about 1250 W/m^2 when straight overhead, measured at the surface of the earth with no cloud cover. The drop in power approximately follows a cosine function, and looking at a sunrise/sunset chart for Bend, OR on March 18, 2025, we see sunrise is at 7:11 AM and sunset is at 7:15 PM. The meridian is at 1:13 PM. That’s close enough to a 12-hour day. To compute the cosine value we equate 12 hours to 180 degrees, which gives 15 deg/hr.

You were in the room at 6:45 PM, which is 0.5 hr before sunset, so we take the cosine of 90 - 7.5 degrees, which comes to 0.1305.

Now we need to adjust that for the latitude and time of year. The peak sun angle for Bend, OR this time of year is 45.3 degrees. The cosine of 90 - 45.3 = 0.7108. Combine the two cosine terms and get 0.1305 * 0.7108 = 0.0928.

I just found an online calculator that gives the combined elevation angle as 4.68 degrees for that time and date. The cosine of 90 - 4.68 = 0.0816. Let’s use this number, but at least my estimate was pretty close…

So 1250 W/m^2 x 0.0816 gives 101.99 W/m^2. Again, this is outside with no cloud cover or other obstructions. Although the sun is a wide-band emitter, including at RF frequencies, the vast majority of the radiation is between 250 and 2500 nm (UV to IR). We will neglect the RF spectrum and assume a very small percentage of the UV-to-IR energy makes it through your south-facing windows with sunshade. The sun’s azimuth angle at that point was 265 degrees, so there is only a reflected path into the room. The glass will also block much of the UV that might happen to reflect off an outside surface before making its way to the window.
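
Here is the same back-of-envelope arithmetic as a quick sketch, keeping the assumed 1250 W/m^2 clear-sky figure and the calculator’s 4.68-degree elevation:

```python
import math

# Same back-of-envelope solar estimate as above; the 1250 W/m^2 clear-sky
# figure and the online calculator's 4.68-degree elevation are assumed inputs.
clear_sky = 1250.0     # W/m^2, sun straight overhead, no clouds
elevation_deg = 4.68   # sun elevation near 6:45 PM PDT

# cos(90 - elevation) is the same as sin(elevation)
irradiance = clear_sky * math.cos(math.radians(90.0 - elevation_deg))
print(f"Horizontal irradiance outside: ~{irradiance:.0f} W/m^2")  # ~102 W/m^2
```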

I would say it’s safe to assume the contribution from the sun will be minimal given the time of day and direction of the windows. It’s a bit of a guess, but you can make an argument that the indoor lighting would dominate. Other sources that might contend with the cell phone are WiFi, radiated heat from people and equipment, as well as other broadcasts in the area such as radio/TV stations and cell towers.

A person expends about 2500 kcal per day. That’s about 436,000 J/hr, or roughly 120 watts. Not all of that is radiated heat, though. Much is conducted or convected away from the body, some does work, and much goes into evaporating sweat. See BODY (HUMAN) HEAT TRANSFER for a discouraging primer on how complicated it gets. With what is radiated, you have to consider that it radiates in all directions, not just towards the person sitting next to you. If you consider yourself as the source, does that mean you’re radiating yourself? I don’t think that works…
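
To put rough numbers on that, here is a minimal sketch; the 60% radiated fraction and the solid-angle fraction intercepted by a neighbor are assumed illustrations, not measurements:

```python
# Metabolic-heat arithmetic; the 60% radiated fraction and the solid-angle
# fraction intercepted by a neighbor are assumed illustrations.

KCAL_TO_J = 4184.0
daily_kcal = 2500.0

metabolic_w = daily_kcal * KCAL_TO_J / 86_400  # average power over a day
radiated_w = metabolic_w * 0.60                # rest leaves via convection, work, sweat (assumed split)
toward_neighbor_w = radiated_w * 0.05          # small solid-angle fraction (assumed)

print(f"Average metabolic power:   {metabolic_w:.0f} W")   # ~121 W
print(f"Radiated (assumed 60%):    {radiated_w:.0f} W")
print(f"Reaching a nearby person:  {toward_neighbor_w:.1f} W")
```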

For the cell phone and other emitters in the area, the inverse square law applies. Every time you double the distance to the emitter you only receive 1/4 the power. That gives your cell phone a big advantage since it’s in your pocket. Of course half the phone is facing away from you so only half is radiated into your body. If you have the WiFi enabled in your phone that will contribute to the electromagnetic radiation too.

A cell phone can legally put out up to 4 watts. Usually it’s less than that, and it’s dynamic, meaning the output power is negotiated with the cell tower. If the tower is farther away, the phone will put out more power to compensate. You can strengthen your position by saying the cell tower is way down the street ;o)

The other thing to consider is that the cell phone’s 4 watts is peak power. Usually you would be working with average power, so if the phone is only transmitting 10% of the time, the average power would be 0.4 watts. That’s in contrast to the sun, which is a continuous emitter, so the average power is the same as the peak power (neglecting solar storms/coronal mass ejections…).
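
As a rough sketch of that duty-cycle argument (the 10% transmit fraction and the half-toward-the-body factor are assumed illustrations):

```python
# Duty-cycle sketch for the phone; the 10% transmit fraction and the
# half-toward-the-body factor are assumed illustrations.

peak_w = 4.0        # legal maximum transmit power
duty_cycle = 0.10   # fraction of time actually transmitting (assumed)
toward_body = 0.5   # roughly half the radiated power heads your way

average_w = peak_w * duty_cycle
print(f"Average radiated power:   {average_w:.1f} W")                 # 0.4 W
print(f"Average power toward you: {average_w * toward_body:.1f} W")   # 0.2 W
```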

Did the room have WiFi? LED lighting? I think the room lighting would win if you had incandescent lighting…

LED room lighting would mostly be in the visible spectrum, which would largely be reflected off the body. Maybe you could bend the question a bit and say "absorbed electromagnetic radiation."

Similar conclusions… there’s a lot of EMR in that room, and the phrasing of the question will likely determine what the winner is. But the cell phone is not necessarily the most powerful OR most intense in any of those scenarios…

Wait, is the xkcd guy a Doper?

Not that I’ve ever heard.

Not if we’re talking intensity. I was responding to @TriPolar, who was talking about total power.

Although, in radio waves, the Sun is only the second-brightest source in the sky. In that band (though not in the overall spectrum as a whole), Sgr A*, the black hole and associated matter at the core of our Galaxy, is brighter.

I suspect that all of the above options are dwarfed by the thermal (infrared) radiation from the walls and furnishings. Unless I’ve done my math wrong, if you’re sitting in a “thermal bath” at room temperature (293 K), you would absorb a little over 400 W of thermal radiation from your surroundings for every square meter of surface area you have exposed. (P/A = \sigma T^4, for the eggheads like me.)
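
A minimal sketch of that calculation, assuming emissivity 1 and a skin temperature of roughly 306 K for comparison:

```python
# Stefan-Boltzmann sketch for the "thermal bath" argument; emissivity 1 and a
# ~306 K skin temperature are assumed round numbers.

SIGMA = 5.670e-8  # W m^-2 K^-4

def blackbody_flux(t_kelvin):
    """Radiant power per square meter from an ideal (emissivity 1) surface."""
    return SIGMA * t_kelvin ** 4

room, skin = 293.0, 306.0
print(f"Absorbed from 293 K surroundings: {blackbody_flux(room):.0f} W/m^2")  # ~418
print(f"Emitted by ~306 K skin:           {blackbody_flux(skin):.0f} W/m^2")  # ~497
print(f"Net radiative loss:               {blackbody_flux(skin) - blackbody_flux(room):.0f} W/m^2")
```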

Of course, your body radiates all this power (and more) away, because you are personally warmer than your surroundings. But if we’re asking “what source do you receive the most EM radiation from?”, it’s almost certainly the walls and the furnishings of the room.

Hmm, interesting! Do you mean because of their thermal mass, the energy they stored from the sun and maybe the HVAC and such?

Is it fair to count that as a “source” if they’re just absorbing and re-emitting energy from elsewhere?

But I guess you can say that about any energy, if you trace it back far enough… human output comes from calories which come from food which comes from animals and plants which ultimately lead back to the sun and the big bang and yadda yadda… hmm :thinking:

This exemplifies one of my favourite observations about comfort levels indoors. A room can feel hot or cold irrespective of the air temperature. It is the temperature of the walls, floor, and ceiling that can dominate your perception of the environment. This is especially apparent in rooms with brick or stone walls, or bare stone or concrete floors. They can take ages to heat up or cool down (depending upon what you need) and a room can feel as if the HVAC system just isn’t doing anything, even when the air temperature is exactly what is desired.

Low emissivity glass is an interesting factor here as well. One side of my house is essentially a very large wall of low-e glass. Standing next to it is slightly weird. To the touch the glass can be quite warm, but it doesn’t feel warm to stand next to. Ordinary windows do. The difference in emissivity is remarkable.

With that wording, it has to be the Sun, right? If the host wanted the answer to be “your cellphone”, the wording should be “what is the source of the single most powerful dosage of electromagnetic radiation you are receiving in this room right now?”

It’s true that the walls & furniture are absorbing and re-emitting photons, but there’s no sense in which you can identify any particular absorbed photon with any particular emitted photon. The energy from the absorbed photons gets turned into kinetic energy of the atoms in the object, and then later a different photon gets emitted using some of the atoms’ kinetic energy.

It’s like if you had a faucet dripping into a leaky bucket. There are drops going in and drops coming out, but you can’t identify any particular droplet entering the bucket with a droplet exiting the bucket.

The question in the thread title and the question in the OP are very different questions.

The title asks about EM radiation “inside a room”. The quoted quiz question asks about EM radiation “you are receiving”.

The latter has ambiguity about what “receiving” means. My clothes prevent me from receiving a lot of EM radiation. Also, if something passes through my body unabsorbed, did I “receive” that radiation?

For some choice of interpretations, “cell phone” seems like a good answer to me for the quiz question.