This column has got me curious. Is it really possible to measure the distance to the horizon accurately? How does one go about it?
Calculating it based on the assumption that the Earth is a perfect sphere:
You are standing on the surface, so your line of sight to the horizon is tangent to the sphere and forms a right triangle: the radius of the Earth plus your height (center of the Earth to your eye) is the hypotenuse, the radius out to the horizon point is one leg, and your line of sight is the other leg. The angle at the center of the Earth will be:
arccos(R/(R+H))
where R is the radius of the Earth and H is your height. This angle in degrees, multiplied by the circumference of the Earth and divided by 360, is the distance to the horizon along the surface.
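Here is a minimal sketch of that arc-length calculation in Python, assuming a perfectly spherical Earth with a mean radius of 6,371,000 m and an eye height of 1.7 m (both values are my own assumptions for illustration, not from the column):

    import math

    R = 6_371_000.0   # assumed mean Earth radius, metres
    H = 1.7           # assumed eye height, metres

    # Angle at the center of the Earth between you and the horizon point.
    theta = math.acos(R / (R + H))          # radians

    # Fraction of the full circle times the circumference gives the surface distance.
    arc_distance = (math.degrees(theta) / 360.0) * (2 * math.pi * R)

    print(f"Arc distance to the horizon: {arc_distance / 1000:.2f} km")

With those assumed numbers the result comes out to roughly 4.65 km.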
If you don’t like the trig function, Pythagoras will give you the straight line of sight distance (DH) as the other leg of the triangle:
(R+H)^2 = R^2 + DH^2
DH = SQRT((R+H)^2 - R^2)
It will be very nearly the same as the arc distance because the angle is so tiny. In either case you end up taking the difference or ratio of two numbers that are identical to several decimal places, which can be a problem for numerical calculation, but these days your average scientific calculator carries enough precision to get the first couple of digits of the result without worrying about it.
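A quick sketch of the Pythagorean version, using the same assumed R and H as above, shows how close the straight-line and arc distances are at human eye heights:

    import math

    R = 6_371_000.0   # assumed mean Earth radius, metres
    H = 1.7           # assumed eye height, metres

    # Straight line of sight: the other leg of the right triangle.
    dh = math.sqrt((R + H) ** 2 - R ** 2)

    # Arc distance along the surface, for comparison.
    arc = math.acos(R / (R + H)) * R

    print(f"Straight-line distance: {dh / 1000:.4f} km")
    print(f"Arc distance:           {arc / 1000:.4f} km")

Both come out near 4.65 km; the difference only shows up far down the decimal places.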
No, it’s only a theoretical calculation.
Thanks. I was thinking that there was no way to make it precise.