Help me settle a bet: More electromagnetic radiation inside a room from a cell phone, the sun, or something else?

Yeah, I wish it were phrased more precisely. But if we steelman it and give it the fairest possible interpretation, it's still an interesting question, even assuming a perfect human-sized absorber that soaks up every wavelength that hits it.

Any particular reason why?

Even if most of your body is covered and blocking visible light, your face, neck, arms, hands, etc. are still enough surface area (according to my back-of-the-envelope math) to give light (from the sun or the bulbs) a fighting chance. 0.5 W from the phone just isn't a lot of power, even if you absorb 100% of it.

Hmm. Looking at the photos of the brewery now, those are much bigger windows than I was picturing. Here's a crack at the sun candidate in more detail, then.

Relative to maximum sun, you're keeping maybe 10% due to time of day, 70% due to latitude, and 50% due to glass transmission (which kills mostly IR). Then there's geometry: most of what reaches you is ambient rather than direct light (maybe a 10% factor on its own), and the viewable solid angle is restricted by the walls/floor/ceiling and the seat location, so call the combined access 20%, which seems generous. Multiplying those out gives about 0.7% of the ideal 1000 W/m², so roughly 7 W/m².

Arms, head, and neck presented for maximum light access would be maybe 0.09 m² of surface, yielding about 0.6 W.
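
If you want to poke at the assumptions, here's the same back-of-the-envelope chain as a quick Python sketch. The factors and the 0.09 m² of exposed skin are just the rough guesses above, not measurements.

```python
# Back-of-the-envelope: sunlight absorbed by exposed skin vs. ~0.5 W from the phone.
# Every number below is a rough guess from the conversation, not a measurement.

PEAK_SOLAR_IRRADIANCE = 1000.0  # W/m^2, full direct sun at the surface

factors = {
    "time of day": 0.10,         # well off solar noon
    "latitude": 0.70,
    "glass transmission": 0.50,  # windows kill mostly IR
    "geometric access": 0.20,    # mostly ambient light; generous solid-angle estimate
}

fraction = 1.0
for f in factors.values():
    fraction *= f                # ~0.007, i.e. ~0.7% of peak sun

irradiance = fraction * PEAK_SOLAR_IRRADIANCE  # ~7 W/m^2 at the seat

exposed_skin_area = 0.09         # m^2: face, neck, arms, hands
sun_power = irradiance * exposed_skin_area     # ~0.6 W absorbed from sunlight

phone_power = 0.5                # W, the (generous) fully-absorbed phone figure

print(f"sunlight on exposed skin: ~{sun_power:.2f} W")
print(f"phone (100% absorbed):    ~{phone_power:.2f} W")
```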

So, yeah, I agree with you that the details will matter and that the cell phone isn’t a slam dunk.

Wouldn’t one of those tri-field meters settle this?

Is that a real thing, or like a tricorder from Star Trek?