Horizons

I was standing on the beach at Myrtle Beach last week, looking out toward the horizon, and I thought to myself: how far can I see before the horizon drops off? I don’t know.

So, how far can one see before the curvature of the Earth makes the horizon drop off? I realize that it depends on what height you’re looking from, so let’s say from a height of six feet. Anyone?

Almost 3 miles

I seem to remember asking this question a long time ago and I think the answer was 20 miles (assuming you are at sea level on a clear day with no obstructions).

While I don’t claim to know the real answer I am almost certain it is more than 3 miles.

Actually, Lance is right. I just found this: http://www.agric.wa.gov.au/agency/Pubns/agmemos/SRang/1998/memo998/horizon.html. It says that if you multiply 1.23 by the square root of the height (in feet) you are looking from, you get the distance you can see in miles.
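
For anyone who wants to punch it in, here is a minimal Python sketch of that rule of thumb (the function name is mine; the 1.23 constant is just the feet-to-miles shortcut quoted above, not an exact formula):

import math

def horizon_miles(height_ft):
    # Rule-of-thumb distance to the horizon, in miles,
    # for an eye height given in feet: 1.23 * sqrt(h).
    return 1.23 * math.sqrt(height_ft)

print(horizon_miles(6))    # roughly 3 miles for a six-foot viewpoint
print(horizon_miles(100))  # roughly 12.3 miles from a 100-foot tower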

I understand how height affects how far you can see, but wouldn’t height stop affecting the distance you can see after a certain height? That is to say, even at some magnificent height, I wouldn’t be able to see so far that I could see myself from the back.

Let’s try this out with some geometry…

r: radius of Earth
h: height of person (6 feet)
d: distance person can see

http://www.geocities.com/zai_h/man-on-earth.jpg

Taking the mean radius of the Earth as 6.37E6 meters, assuming it to be perfectly spherical, and taking 1 foot as 0.3048 meters, we have:

d = r * tan { acos [ r / ( r + h ) ] }

Which is about 4.83 kilometers, or 3.00 miles. Remember that this requires a perfectly spherical Earth, and that no obstacles can be in your line of sight.
Note: The distance a person can see is measured from the head of the person to a spot on the ground, and this value can be increased by increasing the height of the object said person is looking at. If the object were as tall as the person, then the distance would double in the example above.
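
As a quick numerical check of the post above, here is a rough Python sketch (the variable names are mine; the radius, the six-foot height, and the doubling for an equally tall object come from that post):

import math

R = 6.37e6        # mean Earth radius in meters, as assumed above
h = 6 * 0.3048    # six feet converted to meters

# Straight-line (tangent) distance from the eye to the horizon:
# d = R * tan(acos(R / (R + h))), equivalent to sqrt((R + h)^2 - R^2)
d = R * math.tan(math.acos(R / (R + h)))

print(d / 1000)          # about 4.83 km
print(d / 1609.344)      # about 3.00 miles

# An object as tall as the observer adds its own horizon distance,
# so it stays visible out to roughly twice as far.
print(2 * d / 1609.344)  # about 6 miles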

The formula is simple, if you use a few assumptions. The line of sight to the horizon is tangent to the earth’s sphere. That distance x and the earth’s radius R to that horizon are two legs of a right triangle, whose hypotenuse is the earth’s radius R plus your height h above the ground. So by the pythagorean theorem:

x^2 + R^2 = (R + h)^2
or
x = sqrt(2Rh + h^2)

If h is a lot smaller than the Earth’s radius, x is approximately just the square root of (2Rh). Of course, R is expressed above in miles and h is in feet, so you have to convert. The constant that you multiply the square root of your height (in feet) by is sqrt(2R/5280), which comes out to about 1.23.

If h is large, then that approximation no longer works; x is closer to (R + h), as you’d expect.
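
Here is a rough sketch of how good the sqrt(2Rh) shortcut is and where it gives out (the radius value and the sample heights are mine, just for illustration):

import math

R_miles = 3959.0  # approximate Earth radius in miles

def exact_miles(h_ft):
    # Exact tangent distance x = sqrt(2Rh + h^2), with h converted from feet to miles.
    h = h_ft / 5280.0
    return math.sqrt(2 * R_miles * h + h * h)

def approx_miles(h_ft):
    # Small-h shortcut: x ~= sqrt(2R/5280) * sqrt(h_ft) ~= 1.23 * sqrt(h_ft).
    return math.sqrt(2 * R_miles / 5280.0) * math.sqrt(h_ft)

# A person, a hill, an airliner, and a height comparable to the Earth's radius:
for h_ft in (6, 1000, 35000, 3959 * 5280):
    print(h_ft, round(exact_miles(h_ft), 1), round(approx_miles(h_ft), 1))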

How far away would you have to be to no longer see any part of another person? That person would disappear when they were twice as far away as your horizon. They would start to lose their “bottom” when they passed your horizon. That effect is sometimes adduced to show that people knew that the earth was round long before Columbus (they did), but three miles is a long way, and telescopes hadn’t been invented yet either.
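
A rough sketch of that “twice as far” figure, under the same assumptions as above (both people six feet tall, no refraction; the numbers are just the example from this thread):

import math

R_ft = 3959.0 * 5280.0  # approximate Earth radius in feet

def horizon_ft(height_ft):
    # Distance to the horizon, in feet, for a given eye or object height in feet.
    return math.sqrt(2 * R_ft * height_ft)

# The other person drops completely out of sight when they are beyond
# your horizon distance plus their own.
you, them = 6.0, 6.0
print((horizon_ft(you) + horizon_ft(them)) / 5280.0)  # about 6 miles, twice your own horizon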

I probably could have added a little more detail to my original post, but I didn’t want anyone to beat me to the answer.

Let the radius of the Earth (in feet for the purposes of this example) be R, and the viewing height be H (also in feet). Actually, you can use any unit of length as long as you use the same unit for both the radius and the viewing height. I am using feet because the OP specified a height of six feet.

The distance to the horizon (assuming a spherical Earth and neglecting refraction or any other weird-ass effect of the light) will be:

sqrt((R+H)^2-R^2)

or

sqrt(2RH + H^2)

These are equivalent, so you can use whichever you think is easier to punch into the calculator.

This equation gives the distance from the “eyes” to the horizon, meaning, in the case of the original post, that the result is the distance from six feet above sea level to the horizon, and not the distance from the viewer’s feet to the horizon. At a height of six feet this difference is negligible. However, for very large H the difference is significant.

In response to WIGGUM’s post about a magnificent height: if you were looking at the horizon from a height of a gazillion light years, it would be pretty close to a gazillion light years to the horizon (or a gazillion light years plus R). The same equation applies.

The arc distance from the base of the observer is limited to one quarter of the Earth’s circumference as H approaches infinity, meaning that no matter how high you get, you can only see the side of the Earth that is facing you.
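
Here is a rough sketch of that limit, using the arc length R * arccos(R/(R+H)) from the observer’s feet to the horizon; the sample heights are arbitrary, and the quarter circumference works out to roughly 10,000 km:

import math

R = 6.37e6  # mean Earth radius in meters

def arc_to_horizon(H):
    # Arc distance along the ground from the observer's feet to the horizon.
    return R * math.acos(R / (R + H))

# Eye level, airliner, low orbit, geostationary-ish, Moon-ish, and absurdly high:
for H in (1.8288, 1e4, 4e5, 3.6e7, 3.8e8, 1e15):
    print(H, round(arc_to_horizon(H) / 1000, 1))

print(math.pi * R / 2 / 1000)  # quarter circumference: the value the arc distance approaches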

Dang

Look at all those people with the answer.

That is why I didn’t bother to explain my answer the first time.

I just came up with a killer formula that’s relatively simple:

Let:
r = radius of sphere
h = height of observer standing on sphere
D = arc distance from foot of observer to horizon

then

D = r*(arccos(r/(r+h)))

given that arccos returns radians. If it returns degrees:

D = r*(π / 180)*(arccos(r/(r+h)))

(Hope that pi character comes out OK.)
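
In case anyone wants to check that the radians and degrees versions agree, here is a minimal Python sketch (math.acos returns radians, so the degrees branch below just simulates an arccos that returns degrees):

import math

r = 6.37e6      # sphere radius in meters (Earth, as earlier in the thread)
h = 6 * 0.3048  # observer height in meters

# arccos returning radians:
D_radians = r * math.acos(r / (r + h))

# arccos returning degrees (simulated), converted back with pi/180:
acos_deg = math.degrees(math.acos(r / (r + h)))
D_degrees = r * (math.pi / 180.0) * acos_deg

print(D_radians, D_degrees)  # both about 4.83 km of arc for a six-foot observer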

Over the ocean, at least, mirages and other optical illusions may sometimes allow you to see much farther.

We can see the sun after it has set below the horizon because of changes in the density of the air. The changing density causes the index of refraction to change, which bends the light.

If warm air is above cool air, you can see ships beyond the horizon that appear to float in the air.

Sometimes temperature changes make the ground or water appear to loom up like a wall. There is often distortion and magnification involved that makes the land or sea look like mountains or castles in the sky. This is called Fata Morgana.

Some people believe that such illusions led early explorers to the discovery of Iceland and Greenland. Some even believe that they might have led the Vikings to set off for America.

Proof:


Let
r = radius of sphere
h = height of observer's eye above sphere
a, b, c = sides of triangle described below
A, B, C = angles of triangle described below
A = point/angle at eye
B = point/angle at center of sphere
C = point/angle at horizon

By definition of the horizon, line AC is tangent to the sphere's surface at C.

This also means that AC is perpendicular to BC.

This means triangle ABC is a right triangle.

c is the radius of the sphere plus the height of the observer's eye, r+h
a is the radius of the sphere, r
C is 90 degrees (pi/2 radians)

From the Law of Sines:
  a       c
----- = -----
sin A   sin C

Rearranging:
          a               r
sin A = ----- * sin C = ----- * 1 = r/(r+h)
          c             r + h

For complementary angles A & B, sin A = cos B

B = arccos(r/(r+h))

The arc distance from the observer's feet to 1/4 around the sphere is:
(2 * Pi * r)/4 = (pi/2) * r

The ratio of B to 1/4 the circumference is:
B/(pi/2) {in radians}

This ratio is the fraction of a quarter circumference's arc that is subtended by angle B.

So this ratio times the quarter-circumference will give us the arc
distance from the observer's feet to the horizon.

   B
------ * (pi/2) *r = B * r = r * arccos(r/(r+h))
(pi/2)
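
For what it’s worth, here is a quick numerical sanity check that this arc distance and the straight-line distances posted earlier agree almost exactly for a six-foot observer (same radius and height as above):

import math

r = 6.37e6      # mean Earth radius in meters
h = 6 * 0.3048  # six feet in meters

straight = math.sqrt((r + h) ** 2 - r ** 2)  # tangent-line distance, eye to horizon
arc = r * math.acos(r / (r + h))             # arc distance, feet to horizon

print(straight, arc, straight - arc)  # both about 4828 m; the difference is about a millimeter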

Then AWB went off on a tangent of his or her own.

Segment BA = r + h
Segment BC = r

Apply the Pythagorean theorem to get the length of CA, which has been done at least two times already in this thread. This is how far away the horizon appears to be.

Now to measure the arc length from the base of the observer to the horizon. The formula given by AWB is correct, but AWB went about it in the wrong way. Here’s the simple way.

Step 1. Screw the law of sines.

Step 2. By definition, the cosine of one of the non-right angles of a right triangle is the adjacent side over the hypotenuse. Therefore cosine(B) = BC/BA.

Step 3. Substitute r+h for BA and r for BC and solve for B to get B = arccos(r/(r+h))

Step 4. Let D be the point where BA intersects the sphere.

Step 5. By definition arclength DC = r * B with B in radians.

Step 6. Repeat step 1.

(Note: I inserted a hard return to try to straighten out the thread. -manhattan)

I have inserted some hard returns into the posts of AWB and Lance Turbo to stop the thread from scrolling. So if it appears that there are stylistic errors in either of those posts, the fault lies squarely with me, not them.

But at least the thread doesn’t scroll.

Carry on.

Maybe one of you mathematical wizards can calculate the apparent distance to the horizon on the Moon? The radius of the Moon is 1737.4 km, according to NASA and JPL.

Jeeze, talk about repeating the same thing over and over and over and over and over again. Glad I got the first shot at least. :P

Given the same parameter of a six-foot-high viewpoint, the horizon of the Moon will appear 2.52 kilometers away, or 1.57 miles, with the figures you gave, jab1.
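
For anyone who wants to check that number, here is a quick sketch using the figures from this thread (Moon radius 1737.4 km, six-foot eye height):

import math

r_moon = 1.7374e6  # Moon radius in meters (1737.4 km, per jab1's figure)
h = 6 * 0.3048     # six feet in meters

d = math.sqrt(2 * r_moon * h + h * h)  # straight-line distance to the lunar horizon

print(d / 1000)      # about 2.52 km
print(d / 1609.344)  # about 1.57 miles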

I never could master trigonometry and I never took any math higher than algebra. Besides, I made that request to point out that that is why the horizon looks so close in the photos taken by the Apollo astronauts.

If I didn’t have a calculator, I don’t think I could balance my freakin’ checkbook! :o

Since the formulas involve the square root of R, and the Moon’s radius is about 1/4 of the Earth’s, the formula for the Moon gives about 1/2 of the Earth distance to the horizon.

Zor

OK, you won, so how do you get the 1.23 from your formula?

I don’t know, RM Mentock, you want to tell me where it is in the OP? :)