A question about height vs. the distance I can see?

I posted this question (or one similar) on here once before, but for the life of me, I can’t find it.

Anyway, if I am at 35,000 feet in a plane and I look out the window, how far am I seeing if I look all the way to the horizon line?

In the old thread I started, someone posted a chart that was Height vs. Viewable Distance or something like that… I can’t find that on google either. Maybe I’m using the wrong search terms, but I know it exists, I’ve seen it.

So, if I am 10,000 feet high on a clear day, how far am I looking (in miles) if I look as far as my eye can see?

Here’s a distance to horizon calculator.

Darnit, beaten by a google search. But I worked out the trigonometry for myself. :slight_smile:

Let the adjacent side be the tangent radius of the Earth at the observed horizon point, the hypotenuse be the line from the center of the Earth to the elevated observer (length = height above the Earth plus the radius of the Earth), and the opposite side be the distance from that observer to the horizon.

First, to find angle A (at the center of the Earth, the angle between the observer and the horizon he sees), take the arccosine of the adjacent (approximately 3960 miles) divided by the hypotenuse. 10,000 feet = 1.894 miles, so the hypotenuse is 3961.894 miles, and angle A is 1.7717 degrees.

Now, to find the opposite side, take the tangent of A and multiply by 3960. Distance = 122.5 miles. The online calculator agrees with me.

Working notes:
hypotenuse = radius (of earth) + height over earth

radius of earth = 6370 km, 3960 miles

adjacent = radius of earth

opposite = distance seen

angle A (at center of earth) = arccos (adjacent / hypoteneuse)

opposite / adjacent = tan(A)

opposite = tan(A) * adjacent

10000 feet = 1.894 miles

cos A = 3960 / 3961.894 = 0.99952

A = 1.7717 degrees

opposite = tan(1.7717°) * 3960 = 122.5 miles
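
In case anyone wants to fiddle with other heights, here's the same calculation as a quick Python sketch (using the same round radius and conversion figures as above; the function name is just mine):

```python
import math

EARTH_RADIUS_MI = 3960.0   # approximate radius of the Earth, in miles (same figure as above)
FEET_PER_MILE = 5280.0

def horizon_distance_mi(height_ft):
    """Distance to the horizon, in miles, for an observer height_ft above the surface,
    using the arccos/tan construction from the working notes."""
    h = height_ft / FEET_PER_MILE                       # height in miles
    hypotenuse = EARTH_RADIUS_MI + h                    # center of the Earth to the observer
    angle_a = math.acos(EARTH_RADIUS_MI / hypotenuse)   # angle A at the center of the Earth
    return EARTH_RADIUS_MI * math.tan(angle_a)          # opposite side = distance seen

print(horizon_distance_mi(10000))   # ~122.5 miles
print(horizon_distance_mi(35000))   # ~229 miles, for the airliner in the OP
```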

About 120 miles in theory. In practical terms, atmospheric conditions probably won’t allow that.

Wiki discussion of the horizon calculation:

As they note, if the height of the object is much smaller than the radius of the Earth, a fairly simple approximation suffices: D = SQRT(13H), where H is height in meters and D is distance to the horizon in kilometers. In feet and miles, that becomes M = SQRT(1.56F), where F is height in feet and M is distance in miles. That gives me 124.9 miles, pretty close to the more precise 122.5 just calculated.
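
And for anyone who'd rather not punch the buttons by hand, here are both rule-of-thumb versions as a small sketch (the 13 and 1.56 constants are taken straight from the formulas above; the function names are just mine):

```python
import math

def horizon_approx_mi(height_ft):
    """Rule of thumb: distance to the horizon in miles ~ sqrt(1.56 * height in feet)."""
    return math.sqrt(1.56 * height_ft)

def horizon_approx_km(height_m):
    """Metric version: distance in kilometers ~ sqrt(13 * height in meters)."""
    return math.sqrt(13 * height_m)

print(horizon_approx_mi(10000))   # ~124.9 miles, vs. 122.5 from the exact trig
print(horizon_approx_km(3048))    # 10,000 ft is about 3048 m -> ~199 km
```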

Thanks everyone.

When you get far enough into space, you can see exactly 50% of the earth’s surface, but never any more than that. Does the equation tell you at what point that becomes possible?

It never does. There is no point from which you see exactly 50% of a sphere’s surface, although you can get arbitrarily close to that figure.

Ah-ha, when I was typing that I felt qualms about using the word “exactly”; I had a hunch it wasn’t quite right. I should go with my intuition more. Anyway, to rephrase:

Once you’re far enough out in space, there’s a point where a maximum of the sphere’s surface can be seen, where going further out won’t reveal any more of it. Does the equation show how far away that point is?

No, there’s no such point. The further out you go, the more you can see, though the returns are diminishing, with 50% of the surface being the asymptotic limit.

That is, suppose you start at the North Pole and begin levitating. You’ll be able to see a point in the Northern Hemisphere just as soon as your line of sight to it coincides with a tangent to the Earth at that point. But you would have to go out infinitely far for your line of sight to become tangent to the Equator; thus, you are forever approaching, but never reaching, 50%.

That’s the same question. You’ll asymptotically approach 50%, but you’ll never reach it. That is to say, there’s some distance at which you’ll see 40% of the surface, then if you go out further than that you can see 45%, then 49%, then 49.9%, and so on. There is some distance at which you can see 49.999999%, but there is no distance at which you can see 50%.

Now, you could ask “How far out do you have to be to be able to see almost half of the surface”, but then you’d have to specify what you meant by “almost”.
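
For what it's worth, you can put a number on "almost" with the same angle A from the horizon calculation: the visible cap covers a fraction (1 - cos A)/2 = h / (2(R + h)) of the surface, which only reaches 1/2 in the limit. A quick sketch of that rearrangement (my own, using the round radius from earlier):

```python
EARTH_RADIUS_KM = 6370.0   # same round figure used earlier in the thread

def fraction_visible(height_km):
    """Fraction of the sphere's surface inside the horizon, seen from height_km up."""
    return height_km / (2.0 * (EARTH_RADIUS_KM + height_km))

def height_for_fraction(frac):
    """Height needed to see a given fraction of the surface (frac must be < 0.5)."""
    return 2.0 * frac * EARTH_RADIUS_KM / (1.0 - 2.0 * frac)

print(fraction_visible(400))        # ~0.03: from about 400 km up you see roughly 3% of the surface
print(height_for_fraction(0.49))    # ~312,000 km (about 49 Earth radii) to see 49%
print(height_for_fraction(0.499))   # roughly ten times farther again for 49.9%
```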

Well, you all are neglecting gravity. Lasers mounted at the equator aimed due North will curve slightly due to Earth’s gravity, meeting eventually over the North pole. That’s how far away you need to be, but I’m not doing the calculations to figure just how far away that is.
:slight_smile:

And what if the laser beams are on sharks? Frikken’ sharks?

Ooh. Good point. The total deflection for light passing by a spherical mass is (in appropriate units) 4M/b, where M is the mass of the object and b is the impact parameter. Since we’re starting at the point of closest approach (instead of far away from the mass), we’ll only have a deflection of 2M/b, M will be the mass of the Earth, and b will be the radius of the Earth. So the distance needed will be the radius of the Earth squared divided by twice the mass of the Earth, or 4.586 × 10[sup]15[/sup] meters (about half a lightyear).
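
A quick numerical check of that, in SI units rather than geometric ones so G and c appear explicitly (constants are the usual textbook values):

```python
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_EARTH = 5.972e24   # mass of the Earth, kg
R_EARTH = 6.371e6    # radius of the Earth, m

# Deflection from the point of closest approach is half the usual 4GM/(c^2 b),
# with the impact parameter b equal to the Earth's radius.
deflection = 2 * G * M_EARTH / (C**2 * R_EARTH)   # radians, about 1.4e-9

# Distance at which a beam bent through that tiny angle crosses the axis:
distance = R_EARTH / deflection                   # = c^2 R^2 / (2GM), about 4.6e15 m

LIGHT_YEAR = 9.461e15   # meters
print(distance, distance / LIGHT_YEAR)   # roughly half a light-year, as advertised
```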

As a practical matter, what are the limits of the resolving power of a telescope in the earth’s atmosphere? Assume there is a wonderfully clear fall day in New York, and I have a very powerful telescope mounted at (let’s say) 1000 feet on the Empire State Building. Given that I can see (according to the calculator) 38.7 miles away, can I actually see individual people moving around 20 to 30 miles away with a big telescope?

Way too much work.

Since a tangent line is perpendicular to the radius, you have a right triangle, and by Pythagoras the distance to the horizon is

((r + h)^2 - r^2)^(1/2), where r is the radius of the Earth and h is the height above the surface.

For your numbers, 3961.894^2 - 3960^2 = 15004.067236
sqrt(15004.067236) = 122.491090435 miles
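
Or, plugging the same numbers straight in (purely illustrative):

```python
import math
print(math.sqrt(3961.894**2 - 3960**2))   # 122.49109..., same answer as the trig route
```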