A couple of questions about the horizon

I was idly wondering how far away the horizon is at ground level; a quick Google search tells me it’s approximately 2.9 miles.

My real question is: how long does it take an average passenger jet at altitude to travel from one horizon, pass directly overhead, and disappear over the other horizon, from the perspective of the observer? It’s not simply a matter of dividing 6 miles by the jet’s average speed, because the jet is moving at a higher altitude, but my maths fails me.

Also while researching this question I came across a post on a different website where someone asked a different question.

They stated that in ordinary circumstances the horizon always appears at eye level: it’s at eye level when standing on the ground, when in a tall building, and when in a passenger jet at 30,000 ft. At what height does the horizon start to drop below eye level?

Thanks in advance :slight_smile:

Not sure of the exact answer, but you can probably use the facts on this page to double-check any math suggestions proposed:

Although it’s kinda the reverse question there - so not sure

Oh and I believe the horizon is the point where you can’t see any further.

It would usually not be at eye level, but you could see it without having to tilt your eyes too much in most conditions.

The distance travelled by an object at altitude A from horizon to zenith as perceived by an observer at altitude a should be (R+A)*(arccos(R/(R+A))+arccos(R/(R+a))) where R is the earth radius. (Set your calculator to radians before plugging in numbers).
And DataX is correct: the horizon is always below eye level. The reason we see it at eye level is that we’re looking at it!
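
For anyone who wants to plug in numbers, here’s a rough Python sketch of that formula. The Earth radius, the 30,000 ft cruise altitude, the 5’7" eye height, and the 560 mph speed are just assumed values (borrowed from figures that come up later in the thread), not anything definitive:

```python
import math

R = 20_902_000.0  # Earth's radius in feet (about 3,959 miles); assumed value

def horizon_to_zenith_path_ft(A, a):
    """Arc length travelled by an object at altitude A (ft) between the
    observer's horizon and the observer's zenith, for an observer at
    altitude a (ft): (R+A)*(arccos(R/(R+A)) + arccos(R/(R+a)))."""
    return (R + A) * (math.acos(R / (R + A)) + math.acos(R / (R + a)))

A, a, speed_mph = 30_000.0, 5.5833, 560.0   # hypothetical inputs
one_way_mi = horizon_to_zenith_path_ft(A, a) / 5280.0
total_mi = 2 * one_way_mi                    # horizon to horizon
print(f"horizon to horizon: {total_mi:.0f} miles, "
      f"about {total_mi / speed_mph * 60:.0f} minutes at {speed_mph:.0f} mph")
```

That comes out to roughly 430 miles and about 46 minutes, which lines up with the worked numbers further down the thread.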

This was discussed in a recent thread with a conclusion that, ignoring the effects of refraction,
*Distance to the horizon is the geometric mean of eye height and Earth’s diameter.* As Wikipedia shows, this yields OP’s 2.9 miles for eyes 5’7’’ above sea level.
The effect of refraction is variable, but you can get to the right ballpark by increasing the non-refraction number by 10%.
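
As a quick sketch of that rule of thumb (the Earth-diameter figure in feet and the flat 10% refraction bump are just the assumptions described above):

```python
import math

EARTH_DIAMETER_FT = 41_804_000.0  # about 7,918 miles; assumed value

def horizon_distance_mi(eye_height_ft, refraction=False):
    """Distance to the horizon as the geometric mean of eye height and
    Earth's diameter, optionally bumped ~10% to allow for refraction."""
    d = math.sqrt(eye_height_ft * EARTH_DIAMETER_FT) / 5280.0
    return 1.10 * d if refraction else d

print(f"{horizon_distance_mi(5.5833):.1f} mi")                   # ~2.9 miles for eyes 5'7" up
print(f"{horizon_distance_mi(5.5833, refraction=True):.1f} mi")  # ~3.2 miles with the 10% bump
```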

Disposable Hero writes:

> They stated that in ordinary circumstances the horizon always appears at eye
> level: it’s at eye level when standing on the ground, when in a tall building, and
> when in a passenger jet at 30,000 ft. At what height does the horizon start to drop
> below eye level?

There is no level at which the horizon suddenly changes in how far below your eye level it is. If you’re four feet tall, it’s a tiny amount below your eye level, although you don’t notice it. If you’re seven feet tall, it’s a larger but still tiny amount below your eye level. If you’re on the fifth floor of a building, it’s a slightly larger amount below your eye level. If you’re on the fiftieth floor of a building, it’s a still larger amount below your eye level. If you’re in a plane 10,000 feet up, it’s a yet larger amount below your eye level. If you’re in a plane 40,000 feet up, it’s an even larger amount below your eye level.

You have come to think of your eye level as being a certain angle, depending on how tall you are, what floor you’re usually on, and how high up you usually are in a plane, even though that level isn’t really your eye level. That is, if you could actually map your visual field onto a sphere around you, the horizon would never be at precisely the level that divides the sphere in half unless your eyes are lying on the ground. The higher you are off the ground, the lower the horizon gets.

Let’s see if we can’t work this out. There’s a pedestrian whose eyeball is 5’7" off the ground, and he/she just barely has a line of sight to a snail on the ground 2.9 statute miles away. At the same time, a passenger in a jet at 30,000 ft has a direct line of sight to that same snail at a distance of 212.1 miles. So if those two people are on opposite sides of the snail, they have a line of sight to each other at a distance of 215.0 miles. If the plane flies directly over the head of the pedestrian, it’s within view for 430.0 miles. Divide that number by the speed of the plane. Wikipedia says 546-575 mph. If we use 560 mph, that gives almost exactly 46 minutes.

This ignores refraction, haze, etc.
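
Here’s a small sketch reproducing those figures step by step, using the same tangent-distance approximation and the assumed 560 mph:

```python
import math

D_FT = 41_804_000.0  # Earth's diameter in feet; assumed value

def horizon_mi(h_ft):
    # Tangent-line distance to the horizon for an eye h_ft above the surface.
    return math.sqrt(h_ft * D_FT) / 5280.0

visible = 2 * (horizon_mi(5.5833) + horizon_mi(30_000))  # ~2.9 mi + ~212.1 mi, doubled
print(f"{visible:.1f} miles in view, about {visible / 560 * 60:.0f} minutes at 560 mph")
```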

Sure, except that at the precision we’re doing these calculations to, you might as well ignore the person’s height completely.

Just giving this a quick think-over, I believe we must have the following:

Assuming a spherical earth (as some folk do), the horizon must always fall below “eye level” (meaning below the horizontal).

But assuming an infinite flat earth (do any flat earthers actually believe it’s infinite?) then the horizon at infinity must always asymptotically approach the horizontal from below. This must be true no matter how tall a building, or how high an airplane, from which the observer observes.

Another related question I have been trying to find the answer to for decades, and I can’t find it even now with Google: what two points on earth are the furthest apart while still having a line of sight between them, with the actual geographical relief between them not high enough to obstruct that line of sight?

My guess would be two peaks in the Andes, where the curve of the mountain range would allow a view across the Amazon basin. Or else two high peaks on Pacific islands. They do not have to be actually visible, depending on atmospheric conditions, but theoretically in line of sight, each at horizon distance from the same point between them.
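
I can’t say which real pair of peaks wins, but as a back-of-the-envelope sketch, the theoretical ceiling for two summits is just the sum of their individual horizon distances. The 6,000 m heights below are made-up round numbers, not actual mountains:

```python
import math

R_KM = 6371.0  # mean Earth radius in km; assumed value

def max_sight_km(h1_m, h2_m):
    """Theoretical maximum line-of-sight distance (ignoring refraction)
    between two summits h1 and h2 metres above the low ground between
    them: each summit's horizon distance sqrt(2*R*h), summed."""
    return (math.sqrt(2 * R_KM * h1_m / 1000.0)
            + math.sqrt(2 * R_KM * h2_m / 1000.0))

# Two hypothetical 6,000 m peaks with nothing but sea-level terrain between
# them could, in principle, just see each other from about 550 km apart.
print(f"{max_sight_km(6000, 6000):.0f} km")
```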

Based on a somewhat related problem I had to solve once, I’m not immediately so sure of this.

My problem: In a large bay, whales and boats were observed from the top of a hill or cliff at the water’s edge. Observations were made with a theodolite, which measured the whale or boat’s angle from true north (IIRC that was called the “azimuth”) and the angle of the object below the horizontal (called the declination), from which the object’s distance could be computed. (This distance was typically in the low numbers of miles.)

The height of the observer above sea level was also relevant, and figured into the computation. (So far, what we have is a standard first-week-trigonometry problem, similar to finding the height of a building by standing a known distance away and noting the angle of line-of-sight (inclination) to the top. But wait, it gets worse . . . )

The azimuth and declination of a series of observations were entered into a computer program (which it was my job to write), which computed the object’s position in X-Y coordinate terms, and then analyzed the respective paths of the whales and boats. (The object of the research was to determine if the presence of the boats disrupted the behavior of the whales.)

And now, here’s where it gets messy: Some other competing researchers were measuring the distance to the object taking the earth’s curvature into account, which we all thought would make only a negligible difference at such close distances. (I’m pretty sure we were right about that.) But our PI thought that the competing researchers would take that as an opportunity to criticize our work. So we needed to do our computations with a spherical-earth model too. This, you didn’t learn in your first week of trig class.

It turns out, of course, that the radius of the earth is necessarily part of that computation. And the radius of the earth is a big number. And the radius of the earth varies, by a few thousand km IIRC, from one place on the earth’s surface to another. And the altitude of the observer seems like a tiny number compared with the earth’s radius, and even compared with the variation in the earth’s radius. And the desired results should be smallish distances in the range of just a few miles or km. And all these numbers had to be plugged into a thoroughly obscure and messy trigonometric formula that I had to devise myself, having failed to find it in several trig and surveying textbooks.

First question: Since the earth’s radius varies on the order of a few thousand km IIRC, do we need to know the exact radius there where the observations were made? Or could the radius vary by a few thousand km without making a significant effect on our results? Next question: It is obvious that the distance computation is very sensitive to the observer’s altitude, which therefore had to be accurate to the nearest meter. So, in a computation involving the earth’s radius of about 6400 kilometers, can the formula still be sensitive to variations on the order of 1 meter in the observer’s altitude?

I wrote a program which ran the formula with a range of reasonable figures for all these variables, and found: Yes, it works out well. Large variations in the earth radius (on the order of kilometers) did not significantly affect the result, while small variations in the observer’s altitude (on the order of meters) did.

So, to answer Chronos: In my problem, at least, the computed distance to an object at sea level, in the range of a few km, was very sensitive to the observer’s altitude. So my first intuition is that it might be for the discussion of this thread as well.
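
For what it’s worth, here’s a minimal sketch of that kind of spherical-earth computation and sensitivity check. The law-of-sines formula is my own reconstruction, not the original program, and the 100 m cliff, 1 degree depression angle, and perturbation sizes are made-up round numbers:

```python
import math

def ground_distance_m(obs_height_m, depression_deg, earth_radius_m=6_371_000.0):
    """Arc distance along a spherical Earth from the point directly below the
    observer to a sea-level object sighted depression_deg below the horizontal,
    via the law of sines in the centre-observer-object triangle."""
    R, h = earth_radius_m, obs_height_m
    d = math.radians(depression_deg)
    s = (1.0 + h / R) * math.cos(d)   # sine of the (obtuse) angle at the object
    if s > 1.0:
        raise ValueError("sight line passes above the horizon and misses the sea")
    central_angle = math.asin(s) - (math.pi / 2 - d)   # angle at Earth's centre
    return R * central_angle

base = ground_distance_m(100.0, 1.0)   # 100 m cliff, 1 degree below horizontal
print(f"baseline: {base:.0f} m")
# Perturb Earth's radius by 20 km: the result moves by well under a metre.
for dR in (-20_000.0, 20_000.0):
    print(f"R {dR:+.0f} m -> {ground_distance_m(100.0, 1.0, 6_371_000.0 + dR):.0f} m")
# Perturb the observer's height by 1 m: the result moves by roughly 60 m.
for dh in (-1.0, 1.0):
    print(f"h {dh:+.0f} m -> {ground_distance_m(100.0 + dh, 1.0):.0f} m")
```

Which bears out the observation above: the answer barely cares about the exact Earth radius, but it is quite sensitive to the observer’s altitude.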

That’s because the thing you were observing was at a much lower height than you were. Whichever one is at the greater height matters more, and when one’s height is 5,000 times the other’s, the other’s height matters almost not at all.

Where did you get this idea?

The radius varies from about 6357 km at the poles to about 6378 km at the equator - a range of about 21 km (or from roughly 6353 km to 6384 km if you count surface topography - still only about 31 km).

So I’m remembering the numbers that wrong? Possible. This was a project I did 30-some years ago. Anyway, I know I had the right numbers at the time. :smack:

I believe so.

The earth would be a strange-looking planet indeed if the distance from the center to the surface varied by anything like 30%.

At first glance, it might seem that way, but no.

If, by “ignore the person’s height completely” you mean that we could substitute 5’3" instead of 5’7" and get pretty much the same answer, then yes, I agree. But if by “ignore the person’s height completely” you mean we could use ZERO as the person’s height and get pretty much the same answer, then I disagree. That would change the answer by about 1.3%.

Using h=pedestrian’s height, a=plane’s altitude, d=Earth’s diameter, and f=the distance the plane can fly while in view of the pedestrian, we get f=2*(sqrt(ad)+sqrt(hd)). Given h=5.5833, a=30000, d=41804000 (all in feet), we calculate f=2270305, which is 429.98 miles. If we change h to zero, we get f=2239750, which is 424.20 miles. That’s a decrease of 1.346%.

OTOH, if we lower the plane’s altitude from 30,000 ft to 29,200 ft, we get f=2240240, which is 424.29 miles. That’s a decrease of 1.324%.

Using zero as the pedestrian’s height has a larger effect than reducing the plane’s altitude by 800 feet.
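
Plugging those three cases into that formula, for anyone who wants to check (everything in feet, matching the numbers above):

```python
import math

D = 41_804_000.0  # Earth's diameter in feet, the value used above

def f_miles(h_ft, a_ft):
    # f = 2*(sqrt(a*d) + sqrt(h*d)): how far the plane flies while in view.
    return 2 * (math.sqrt(a_ft * D) + math.sqrt(h_ft * D)) / 5280.0

base = f_miles(5.5833, 30_000)   # ~429.98 miles
print(f"baseline: {base:.2f} mi")
for label, f in [("pedestrian height zeroed", f_miles(0.0, 30_000)),        # ~424.20 mi
                 ("plane lowered to 29,200 ft", f_miles(5.5833, 29_200))]:   # ~424.29 mi
    print(f"{label}: {f:.2f} mi, {(base - f) / base:.3%} below baseline")
```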

Suppose you’re standing on a perfectly smooth, perfectly spherical Earth. The “horizon”, from your point of view, is an imaginary circle on the ground, 2.9 miles away from where you’re standing. Whenever you look at a point on this circle, your eyes are looking slightly downward. The downward angle is arccos(R/(R+h)), where R is the radius of the Earth and h is the height of your eyeball. At 5’7", the downward angle is roughly 0.042 degrees. If you go to the top of a 100 ft tall building, you’re looking downward at about 0.177 degrees. That’s roughly 1/3 the width of a full moon. I guess the question here is, can you tell the difference between holding your arm out at a 90 degree angle versus holding your arm out at an 89.823 degree angle?

Looking at the horizon from an altitude of 30,000 ft, your eyes are looking downward at 3.068 degrees. Can you tell the difference between a 90-degree angle and an 87.932 degree angle?

d’oh! 86.932
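
Here’s a tiny sketch of that dip-angle formula at a few heights (the Earth-radius figure is an assumed value):

```python
import math

R_FT = 20_902_000.0  # Earth's radius in feet (about 3,959 miles); assumed value

def horizon_dip_deg(eye_height_ft):
    """How far below the horizontal the horizon appears: arccos(R/(R+h))."""
    return math.degrees(math.acos(R_FT / (R_FT + eye_height_ft)))

for h in (5.5833, 100, 1000, 30_000):
    print(f"{h:>8} ft up: horizon sits {horizon_dip_deg(h):.3f} degrees below eye level")
```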

And when you’re doing calculations to a precision far coarser than that, like we are here, 1.3% is indeed negligible. If we were doing our calculations to sufficient precision that we had to worry about the person’s height, then we’d need to start by throwing away the assumption that the Earth is spherical, because of course it’s not. On the real Earth, the variation in height of the ground over any horizon-scale distance is going to be much greater than the height of a person. If, then, we’re ignoring hills, then it only makes sense to ignore human height, too.

If a person is standing on a hill (or, conversely, standing in a depression while surrounded by hills) then you get answers that vary wildly from 2.9 miles. You could get 40 miles or 400 feet depending on the topography. But the OP started out by saying “I was idly wondering how far away the horizon is at ground level; a quick Google search tells me it’s approximately 2.9 miles.” It seems to me that this establishes the context in which we should be approaching the problem. But if you ignore human height too, you don’t get 2.9 miles, you get zero miles.