A guy I know is arguing against the installation of a cell tower in his neighborhood. I’m tempted to tell him that if he’s that worried, he should never use a smartphone.
His claim is based on the increase in cell phone traffic, and the higher power of cell towers, since what may be the last study of harmful effects, back in 2007.
I’m looking at it from the viewpoint of signal strength. I calculate that a (max) 0.5 W phone held 4 inches from the center of your head presents about 0.5 W/sq-ft of radiation there.
A 100W cell tower (if that’s accurate) presents 0.7mW/sq-ft at a distance of 100 ft. The cell phone is delivering on the order of 700 times the signal strength, although granted not 24/7.
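For what it's worth, here's that inverse-square arithmetic as a quick Python sketch. It assumes an ideal isotropic point source (real phone and tower antennas are directional, so treat this as a ballpark only); under that assumption the phone-to-tower ratio comes out around 450x, the same order of magnitude as the ~700x above.

```python
from math import pi

def power_density(watts, feet):
    """Power density (W/sq-ft) at a given distance from an
    ideal isotropic point source: P spread over a sphere."""
    return watts / (4 * pi * feet**2)

phone = power_density(0.5, 4 / 12)   # 0.5 W phone, 4 inches from your head
tower = power_density(100, 100)      # 100 W tower, 100 ft away

print(f"phone: {phone:.3f} W/sq-ft")          # ~0.358 W/sq-ft
print(f"tower: {tower * 1000:.2f} mW/sq-ft")  # ~0.80 mW/sq-ft
print(f"ratio: {phone / tower:.0f}x")         # 450x
```

The exact ratio depends on what you assume about antenna gain and how the phone is held, but any reasonable set of assumptions lands in the hundreds.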
But I could be all wrong…and am looking for a more informed analysis.
I am not debating whether this radiation actually causes any damage - just that a smartphone is potentially providing a much stronger signal.
Assuming you already have some sort of cell service (that is, you’re going from poor signal to better signal with the tower, not no signal at all to some), you’ve still got plenty of cell phone-tower signals going across your local atmosphere.
Plus, if the tower is further away/the signal is worse, the phone will ramp up the transmit power in order to reach the more distant tower.
Not only does the power drop off with the square of the distance - which means the cell phone is the greater source of RF, per the calculation above - but having the cell tower located nearby REDUCES the RF power from the cell phone.
This article claims that because the cell tower is closer, the phone can transmit at a lower power than if the tower were farther away. Not sure how one would calculate the impact of exposure due to signals to other phones in the cell.
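To put rough numbers on that power-control effect: under an idealized free-space (distance-squared) path-loss model, the transmit power a phone needs to reach the tower grows with the square of the distance. The reference figures below (10 mW at 500 ft) are invented purely for illustration - real handsets use stepped power control and a per-band maximum, not this formula.

```python
def required_tx_power(distance_ft, ref_distance_ft=500, ref_power_w=0.01):
    """Phone transmit power needed to deliver the same received
    power at the tower, under ideal distance-squared path loss.
    The reference point (10 mW at 500 ft) is made up for
    illustration, not a real handset spec."""
    return ref_power_w * (distance_ft / ref_distance_ft) ** 2

# Halving the distance to the tower cuts the needed power 4x;
# a tower 4x as far needs 16x the power (up to the handset's cap).
for d in (250, 500, 1000, 2000):
    print(d, "ft ->", required_tx_power(d) * 1000, "mW")
```

So a nearby tower means your own phone - the dominant source by the earlier calculation - radiates less, which is the argument for denser towers.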
Point is, your colleague should be lobbying for more densely packed towers to reduce exposure (even though there is no solid evidence of harm).
On review, Dzeiger beat me to it.