How far can a shooting star be seen?

Most nights I wind down in the hot tub, and once or twice a week I see a shooting star. On Christmas Eve I was traveling to Mom’s and saw two totally awesome shooting stars. I was about 180 miles north of my house and perhaps a little east. Would I have seen these if I had been sitting in my hot tub?

Well, this site indicates that fireballs start at around 80 km high. Without doing any fancy maths, intuition says “yes”: you could probably have seen those particularly bright meteors from 180 miles away, provided atmospheric conditions permitted and assuming they were high off the horizon from where you actually saw them. They’d probably be a lot dimmer, though.

Reasoning behind the intuition: I’ve seen 14,000-foot mountains from probably 50 miles away, and, from fairly modest peaks, cities at the same distance. So something 80 km up is not going to drop below the horizon for a long, long distance.

Since a line of sight operates in both directions, your question is equivalent to asking how far away the horizon is for a meteor at the height which it starts glowing.
Meteors start to light up about 70 miles (110 km) above the earth’s surface, and the formula for the distance to the horizon is d = (h^2 + 2hr)^(1/2), where h = height and r = the earth’s radius (6,380 km).

d = (110^2 + 2 × 110 × 6,380)^(1/2) = 1,189 km.
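A quick check of that arithmetic in Python, using the same figures (110 km height, 6,380 km earth radius):

```python
import math

h = 110.0   # height at which meteors start to glow, km
r = 6380.0  # Earth's radius, km

d = math.sqrt(h**2 + 2 * h * r)  # straight-line distance to the horizon
print(f"{d:.0f} km ({d / 1.609:.0f} miles)")  # -> 1190 km (739 miles)
```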

Now, that 1,189 km (739 miles) is the straight-line distance from the meteor to the horizon, not the distance along the ground from a point directly under the meteor to the horizon, so it’s a bit off in terms of driving distance, but to get a better answer someone will have to break out their trig textbook.
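Breaking out the trig textbook, so to speak: the ground distance is the arc subtended by the angle at the earth’s center between the sub-meteor point and the horizon point, which works out to arccos(r / (r + h)). A quick sketch with the same numbers:

```python
import math

h = 110.0   # meteor height, km
r = 6380.0  # Earth's radius, km

# Central angle between the point directly under the meteor and the
# point where the meteor's line of sight grazes the horizon.
theta = math.acos(r / (r + h))
print(f"ground distance: {r * theta:.0f} km ({r * theta / 1.609:.0f} miles)")
# -> 1176 km (731 miles), only slightly short of the straight-line figure
```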

If the meteor is high in the sky, people 180 miles away from you will see it easily.

h^2 = a^2 + b^2 (the square of the hypotenuse equals the sum of the squares of the other two sides) is the formula for finding the lengths of the sides of a right triangle. I’m ignoring the curvature of the earth for this exercise; though it’d add some distance, it’s not enough to matter in this case.

(739 miles)^2 = (70 miles)^2 + x^2, where x is the distance along the ground.

So 546,121 = 4,900 + x^2, which gives x^2 = 541,221, or x = 735.7 miles.
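The same arithmetic in Python, for anyone who wants to check it:

```python
import math

d = 739.0  # straight-line distance to the horizon, miles
h = 70.0   # meteor height, miles

x = math.sqrt(d**2 - h**2)  # ground distance, ignoring Earth's curvature
print(f"{x:.1f} miles")     # -> 735.7 miles
```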

You’d have seen it from home no problem.

I guess it depends upon the brightness of the meteor.

Hmmmm. After I typed the above post I decided to search Google for “stellar magnitude”, typed it in and hit enter, and my partial message was posted. Hmph!

Anyway, I wanted to look up stellar magnitude because I’ve forgotten exactly how it’s defined. I suspect that bright meteors at the limb of the earth could be seen from the moon if some means of blocking out the bright earth were available.

Having looked up magnitudes, I find there are two of them: apparent magnitude and absolute magnitude. Apparent magnitude is how bright an object appears to us, and absolute magnitude is how bright it would appear if it were about 33 light years (10 parsecs) away. I suspect meteors would not be visible to the naked eye at that distance.


David Simmons, that system of absolute magnitude is used for stars. It’s not appropriate for meteors, because a meteor viewed from a distance of 10 parsecs would be quite thoroughly invisible! It would be equally invisible from the Moon. Brightness drops off with the square of distance. The typical meteor begins to burn about 100 km from Earth, and the Moon is 4,000 times farther away than that. So the meteor would be only one 16-millionth as bright if viewed from the Moon!
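To put that in magnitude terms, here is a back-of-the-envelope check, assuming the standard relation of 2.5 log10(flux ratio) magnitudes and the moon’s average distance of about 384,400 km:

```python
import math

moon_distance = 384_400.0  # km, average Earth-Moon distance
meteor_height = 100.0      # km, height where a typical meteor burns

ratio = (moon_distance / meteor_height) ** 2  # inverse-square dimming
delta_m = 2.5 * math.log10(ratio)
print(f"{ratio:.1e}x dimmer, about {delta_m:.0f} magnitudes fainter")
# -> 1.5e+07x dimmer, about 18 magnitudes fainter
```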

For meteors, we define “absolute magnitude” as the magnitude that the meteor would have if viewed from directly beneath–that is, if the meteor were at the observer’s zenith.

Which brings us back to the OP. To judge whether a meteor will be visible from a distance, remember that the meteor must not only remain above the horizon, it must remain bright enough to be seen. Imagine a meteor directly overhead. From more distant vantage points, it would be seen lower and lower in the sky. This cuts into its brightness for two reasons–the inverse square law (the light travels a longer line of sight to reach the observer), and atmospheric absorption.

According to this site, the combined effect amounts to about three magnitudes of attenuation between the zenith and 15 degrees of elevation, and a little more than five magnitudes between the zenith and 5 degrees. How much attenuation can a meteor suffer before it becomes invisible? It’s hard to say: it depends on the brightness of the meteor, the darkness of your sky, and how closely you’re observing. Just for the sake of argument, though, let’s use a drop of five magnitudes as the effective limit.
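For scale, the magnitude system is logarithmic, with five magnitudes defined as a factor of exactly 100 in brightness:

```python
# Each magnitude step is a brightness factor of 100**(1/5), about 2.512.
for mags in (3, 5):
    print(f"{mags} magnitudes = {100 ** (mags / 5):.1f}x dimmer")
# -> 3 magnitudes = 15.8x dimmer
# -> 5 magnitudes = 100.0x dimmer
```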

The question then becomes, if an object 100 km high is at the zenith, from what distance will it appear to be 5 degrees above the horizon? (Somebody help me out; I suck at spherical trigonometry.) That would be a typical radius of meteor visibility.

I get 695 km. Didn’t need any spherical trig, though, just the Law of Sines.
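A sketch of that Law of Sines calculation, using the thread’s 6,380 km earth radius and a 100 km meteor height; the triangle runs from the earth’s center to the observer to the meteor, and the ground distance comes out right at 695 km:

```python
import math

r = 6380.0  # Earth's radius, km
h = 100.0   # meteor height, km
alt = 5.0   # apparent altitude above the horizon, degrees

# Interior angle at the observer: the line of sight sits `alt` degrees
# above the horizontal, i.e. 90 + alt degrees from the local vertical.
obs = math.radians(90.0 + alt)

# Law of Sines: sin(angle at meteor) / r = sin(obs) / (r + h)
met = math.asin(r * math.sin(obs) / (r + h))

# The angles of the triangle sum to pi; the remainder is at Earth's center.
center = math.pi - obs - met

print(f"ground distance: {r * center:.0f} km")  # -> 695 km
print(f"line of sight:   {(r + h) * math.sin(center) / math.sin(obs):.0f} km")
# -> 707 km
```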

All right, call it 400 miles. I have a feeling that’s a little on the high side; I suspect that meteors bright enough to remain visible after five magnitudes of attenuation have to burn down farther into the atmosphere before they get that bright. But with respect to the OP:

If they were “totally awesome”, they must have been pretty bright. It sounds like you saw them out of the windshield of your car, which would indicate they were somewhat low in the sky. You don’t mention one important point–were they in the same direction as your house, or in the opposite direction? If they were in the opposite direction, and were already low in the sky from your vantage point, then they probably weren’t visible from your home. But if they were in the direction of your house, they might have been more nearly overhead from home and would have been even more awesome!