When a satellite looks straight down, how much atmosphere is it looking through?

Presume a satellite is at 36,000 km altitude. If it looks straight down at the Earth, how much sea-level-equivalent atmosphere is it looking through?

What if the satellite is at 360 km altitude and looking down at a 45-degree angle?

90% of the atmosphere is below 16,000 meters, which is why most airplanes can’t go higher than that. With that in mind, I would think the difference between the two scenarios would be a fraction of a percent or something similar.

Yeah, I think the angle would have more effect than the altitude. Remember that the colors of a sunset are caused by the light shining at an angle that takes it through more atmosphere.

For all practical purposes orbital altitude doesn’t matter at all. Once the satellite is high enough to maintain orbit, going higher doesn’t change anything. So in that sense **Telperion** is right.

What has a huge impact is look angle, which **Telperion** overlooked in the OP’s question, undermining his/her conclusion.

Look angles vary from straight down, through 45 degrees, to grazing the horizon. To a (very) crude first approximation the total atmosphere burden is proportional to the sine or cosine of the look angle, depending on which way you call zero degrees and which you call 90 degrees.

In either case at a 45 degree look angle you’re talking about 1.4 times as much atmosphere as there is looking straight down.
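That first-order flat-earth approximation can be sketched in a few lines (the airmass relative to nadir scales as one over the cosine of the off-nadir look angle):

```python
import math

def airmass_factor(look_angle_deg):
    """Flat-earth airmass relative to nadir, with the look angle
    measured from straight down (0 deg = nadir)."""
    return 1.0 / math.cos(math.radians(look_angle_deg))

print(airmass_factor(0))    # 1.0 (straight down)
print(airmass_factor(45))   # ~1.414, the "1.4 times as much" figure
```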

I think reality is more complex than a simple sine(LookAngle) or cosine(LookAngle) because of the exponential distribution of atmosphere density. But I don’t want to think hard enough to figure that out for sure.

It can be calculated from air density (1.225 kg/m³) and air pressure (101,325 Pa) at sea level: the scale height works out to about 8.4 km.
For satellites in low orbit, you can use trigonometry when looking at an angle. For satellites in higher orbits, you need to account for the curvature of the Earth.
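That 8.4 km figure can be reproduced from the numbers given; a quick check (g taken as standard gravity):

```python
rho0 = 1.225      # sea-level air density, kg/m^3
p0 = 101_325.0    # sea-level pressure, Pa
g = 9.80665       # standard gravity, m/s^2

# The total mass of an atmospheric column per unit area is p0/g; dividing
# by sea-level density gives the equivalent depth of sea-level-density air.
scale_height_m = p0 / (rho0 * g)
print(round(scale_height_m / 1000, 2))  # ~8.43 km
```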

Yeah, you’re totally right. I was thrown off by the straight down formulation in the title and skimmed over the rest of the question.

Thanks. I was using this table: U.S. Standard Atmosphere adding up each km and trying to guess at the missing values so I really wasn’t sure if my result was right.

Follow-up question: Would K band radar be sufficient to make it to sea level at a middling angle? Seems like K band SAR could give very good resolution.

Would mm wave radar be sufficient to detect and track targets at 10 km altitude at a middling angle? According to my very approximate calculations, the atmosphere from satellite to 10 km altitude, straight down, is equivalent to a bit more than 1 km of sea-level atmosphere. CIWS mm wave radars are well capable of that, although I don’t know if that’s because they have greater access to power when mounted on a ship than they would on a satellite.

Tangentially, the moisture/weather in an area affects the satellite transmission times, effectively making it seem like more/less atmosphere (from the point of view of the satellite, anyway).

It gets easier with CAD software.

For a 360 km orbital altitude, a simple cos(45°) calculation (assuming a flat earth) gives a slant viewing distance of about 509.1 km. Because of the curvature of the earth, the actual viewing distance is about 524.4 km, a difference of about 3%.
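Both distances can be checked with a little trigonometry on the Earth-center/satellite/ground-point triangle (Earth radius taken as 6371 km, an assumption for illustration):

```python
import math

R = 6371.0                 # mean Earth radius, km (assumed)
h = 360.0                  # satellite altitude, km
theta = math.radians(45)   # look angle off nadir

# Flat-earth slant range: altitude divided by cos(look angle).
flat = h / math.cos(theta)

# Curved earth: the slant range d is the smaller root of
#   d^2 - 2(R+h)cos(theta)*d + ((R+h)^2 - R^2) = 0,
# which places the endpoint of the ray on the sphere of radius R.
b = -2 * (R + h) * math.cos(theta)
c = (R + h) ** 2 - R ** 2
d = (-b - math.sqrt(b * b - 4 * c)) / 2

print(round(flat, 1), round(d, 1))  # ~509.1 km flat vs ~524.4 km curved
```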

NOAA actually uses time-of-flight errors in GPS satellite signals to gauge atmospheric humidity.

Atmospheric absorption is a non-issue at any reasonable frequency. The problem is getting a detectable return signal. With radar, doubling the range requires a 16x increase in power. And space-based systems have very little power available.
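The 16x figure falls out of the radar range equation, where the received return power scales as 1/R⁴; a toy check, with all the hardware constants folded into one arbitrary factor:

```python
def received_power(transmit_power, range_km, k=1.0):
    """Monostatic radar return: P_r = k * P_t / R^4, where k lumps
    antenna gains, wavelength, and target cross-section together."""
    return k * transmit_power / range_km ** 4

# Doubling the range cuts the return by 2^4 = 16, so matching the
# original signal strength takes 16x the transmit power.
p1 = received_power(1.0, 100.0)
p2 = received_power(16.0, 200.0)
print(p1 == p2)  # True
```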

IIRC, the Soviets used a 5 kW X-band radar to track ships from low orbit. With modern electronics, L-band may work better for detecting aircraft.

You misunderstood my point. It had nothing to do with curvature of the Earth, although you’re 100% right that the curvature affects the distance to the ground; we can’t use a simple cosd(45) because the Earth’s surface is dropping away as your look angle changes away from satellite nadir.

What I was saying is that the density of the atmosphere is exponentially distributed. The bottom meter is thicker than the one above, and so on up to the top. What I don’t know, without some careful reasoning and careful calculus, is whether that exponential distribution means we can’t use a simple cosd(look angle) even while ignoring the Earth’s curvature.

IOW, cosd(45) gives us the distance from the satellite to a flat Earth’s surface. But then we need to integrate the density along that diagonal path up to the top of the atmosphere towards the satellite to see how large that integral is versus running the same integral from nadir straight up towards the satellite.

My intuition is they’ll come out to differ by the same cosd(45) factor, but then a nagging “no, wait, what about …” makes me hesitate. And my calculus is rusty enough I’m not going to do the work to answer that nag. But intellectual honesty compels me to admit to it. And I invite somebody more proficient on the math or with better tools to answer definitively.

I don’t know much about electromagnetism, so I don’t know if K band or mm wave count as reasonable frequencies for looking through about 10 km of atmosphere.

Yes, satellites can’t rely on brute strength for anything. Is it not also possible to increase range by using high sensitivity, low noise receivers and amplifiers combined with signal processing?

Why would L-band work better for aircraft?

It’s got to be just cos(45). Imagine a light beam, 1 square inch in cross section, projected straight down; it’s viewing the surface of the earth through 14.7 pounds of air. Now tilt your beam 45 degrees. How much does this new column of air weigh? 14.7/cos(45).

When you tilt your path 45 degrees from vertical, the path through each infinitesimally thick layer of atmosphere increases in length by 1/cos(45), regardless of the altitude of any given layer.

All of that disregards the earth’s curvature. If that curvature is taken into account, then for very shallow angles, e.g. an angle that’s nearly tangent to the surface of the earth, the path length through lower/denser altitudes could be lengthened more than the path length through higher/more rarefied altitudes. I suspect the difference here would not be very much, since the earth’s atmosphere is very thin compared to the radius of the earth itself (100 miles vs. 4000 miles).
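The flat-earth part of that layer-by-layer argument can be checked numerically for an exponential atmosphere (scale height of 8.4 km assumed): the slant-path column integral comes out exactly sec(45°) times the vertical one, independent of the density profile.

```python
import math

H = 8.4  # scale height, km (assumed)

def column(look_angle_deg, top_km=200.0, steps=100_000):
    """Integrate relative density exp(-z/H) along a straight slant
    path from the ground up to top_km altitude, flat-earth geometry."""
    sec = 1.0 / math.cos(math.radians(look_angle_deg))
    path_len = top_km * sec
    ds = path_len / steps
    total = 0.0
    for i in range(steps):
        s = (i + 0.5) * ds   # midpoint of this path segment
        z = s / sec          # altitude at that point on the path
        total += math.exp(-z / H) * ds
    return total

ratio = column(45.0) / column(0.0)
print(round(ratio, 6))  # ~1.414214, i.e. sec(45 deg)
```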

To throw another wrench into the works, don’t we also have to take refraction into account? Especially if, per MichaelEmouse’s questions about radar, we’re talking about a RF signal? I remember seeing an online horizon calculator where the answer was significantly different for an RF return vs. a visual horizon.

Yes, it is, but it only helps you so far. For comparison, the “Zaslon” X-band radar in the MiG-31 can detect an airliner-sized plane out to 400 km. This is similar to the range from a low-orbit satellite. On one hand, “Zaslon” requires about 30 kW, which is completely impractical for a satellite. On the other hand, a satellite could have a much larger antenna, if it can be folded for launch.

In the 1970s, signal processing could not cope with the strong noise in the return signal at lower frequencies. Even in X-band, it was only marginal. The Soviet satellite only worked well on calm seas, since high waves created too much noise. Today’s signal processing is far, far better. And a lower frequency will get you a stronger signal out for the same power input.