Based on a somewhat related problem I once had to solve, I'm not so sure about this.
My problem: In a large bay, whales and boats were observed from the top of a hill or cliff at the water's edge. Observations were made with a theodolite, which measured the whale's or boat's angle from true north (IIRC that was called the "azimuth") and its angle below the horizontal (called the declination), from which the object's distance could be computed. (This distance was typically a few miles.)
The height of the observer above sea level was also relevant, and figured into the computation. (So far, what we have is a standard first-week-trigonometry problem, similar to finding the height of a building by standing a known distance away and noting the angle of line-of-sight (inclination) to the top. But wait, it gets worse . . . )
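For the flat-earth version, the trig really is that simple: the observer's height is one leg of a right triangle and the depression angle fixes the other. A minimal sketch in Python (the 100 m height and 1° depression are made-up example numbers, not figures from the actual study):

```python
import math

def flat_earth_distance(observer_height_m, depression_deg):
    """Distance to an object at sea level, ignoring the earth's curvature:
    a right triangle with the observer's height as the vertical side and
    the depression angle as the angle of the sight line below horizontal."""
    return observer_height_m / math.tan(math.radians(depression_deg))

# Example: observer 100 m up, object sighted 1 degree below the horizontal.
print(flat_earth_distance(100.0, 1.0))   # ~5729 m
```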
The azimuth and declination of a series of observations were entered into a computer program (which it was my job to write), which computed each object's position in X-Y coordinates and then analyzed the respective paths of the whales and boats. (The object of the research was to determine whether the presence of the boats disrupted the behavior of the whales.)
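The distance-and-azimuth-to-X-Y step is just a polar-to-rectangular conversion. Something like this, assuming azimuth is measured clockwise from true north and we want x pointing east and y pointing north (I no longer remember the exact conventions my program used):

```python
import math

def to_xy(distance_m, azimuth_deg):
    """Convert range and azimuth to local rectangular coordinates, with the
    observer at the origin, x pointing east, y pointing north, and azimuth
    measured clockwise from true north (the usual surveying convention)."""
    az = math.radians(azimuth_deg)
    return distance_m * math.sin(az), distance_m * math.cos(az)

# Example: an object 2000 m away on a bearing of 45 degrees (northeast).
print(to_xy(2000.0, 45.0))   # ~(1414 m east, 1414 m north)
```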
And now, here's where it gets messy: Some other, competing researchers were measuring the distance to the object while taking the earth's curvature into account, which we all thought could only make a trivial difference at such short distances. (I'm pretty sure we were right about that.) But our PI thought the competing researchers would take that as an opportunity to criticize our work, so we needed to do our computations with a spherical-earth model too. This, you didn't learn in your first week of trig class.
It turns out, of course, that the radius of the earth is necessarily part of that computation. And the radius of the earth is a big number. And the radius of the earth varies from one place on the earth's surface to another (by roughly 20 km between the equatorial and polar radii, it turns out). And the altitude of the observer seems like a tiny number compared with the earth's radius, and even compared with the variation in the earth's radius. And the desired results should be smallish distances in the range of just a few miles or km. And all these numbers had to be plugged into a thoroughly obscure and messy trigonometric formula that I had to devise myself, having failed to find it in several trig and surveying textbooks.
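I no longer have the exact formula I worked out, but the general shape of the spherical-earth computation is a triangle whose vertices are the earth's center, the observer, and the object at sea level. Here is a sketch of that approach (assuming a perfectly spherical earth of mean radius 6371 km; not necessarily identical to what my program did):

```python
import math

def spherical_earth_distance(observer_height_m, depression_deg, earth_radius_m=6371000.0):
    """Ground distance (along the sphere) to an object at sea level.

    Triangle: earth's center O, observer P at radius R + h, target T on the
    sphere at radius R.  The sight line from P dips `depression_deg` below
    the local horizontal, so the angle of the triangle at P (between PT and
    the downward vertical PO) is 90 degrees minus the depression.  The law
    of sines gives the angle at T, the angle sum gives the central angle at
    O, and R times that central angle is the distance along the surface.
    """
    R, h = earth_radius_m, observer_height_m
    dep = math.radians(depression_deg)

    angle_p = math.pi / 2 - dep                  # triangle angle at the observer
    sin_t = (R + h) * math.sin(angle_p) / R      # law of sines
    if sin_t > 1.0:
        raise ValueError("sight line passes above the horizon and never meets the sea surface")
    angle_t = math.pi - math.asin(sin_t)         # obtuse, because P lies outside the sphere
    central_angle = math.pi - angle_p - angle_t  # angle at the earth's center
    return R * central_angle

# Example: observer 100 m above sea level, object sighted 1 degree below horizontal.
print(spherical_earth_distance(100.0, 1.0))      # ~5885 m (flat-earth answer: ~5729 m)
```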
First question: Since the earth's radius varies by a couple dozen km from place to place, do we need to know the exact radius where the observations were made? Or could the radius be off by tens of km without having a significant effect on our results? Next question: It is obvious that the distance computation is very sensitive to the observer's altitude, which therefore had to be accurate to the nearest meter. So, in a computation involving an earth radius of about 6400 kilometers, can the formula still be sensitive to variations on the order of 1 meter in the observer's altitude?
I wrote a program that ran the formula with a range of reasonable figures for all these variables, and found: yes, it works out well. Large variations in the earth's radius (on the order of kilometers) did not significantly affect the result, while small variations in the observer's altitude (on the order of meters) did.
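A sweep along those lines, reusing the spherical_earth_distance() sketch from above, shows the same pattern: shifting the earth's radius by 20 km moves the answer by well under a meter, while shifting the observer's altitude by 1 m moves it by tens of meters.

```python
# Reusing spherical_earth_distance() from the sketch above.
base = spherical_earth_distance(100.0, 1.0)      # baseline: h = 100 m, depression = 1 degree

# Vary the earth's radius by +/- 20 km (roughly the equatorial-polar spread):
# for these example inputs the answer moves by only a fraction of a meter.
for R in (6351000.0, 6371000.0, 6391000.0):
    d = spherical_earth_distance(100.0, 1.0, earth_radius_m=R)
    print(f"R = {R / 1000:.0f} km -> {d:7.1f} m  ({d - base:+.2f} m)")

# Vary the observer's altitude by +/- 1 m:
# the answer moves by roughly 60 m per meter of altitude, for these inputs.
for h in (99.0, 100.0, 101.0):
    d = spherical_earth_distance(h, 1.0)
    print(f"h = {h:.0f} m    -> {d:7.1f} m  ({d - base:+.2f} m)")
```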
So, to answer Chronos: In my problem, at least, the computed distance to an object at sea level, in the range of a few km, was very sensitive to the observer's altitude. So my first intuition is that it might be for the discussion in this thread as well.