This question has never been sufficiently explained by any gun nut I’ve asked, and I am certainly at a loss to figure it out myself…
The precondition is: A rifle and scope is “dialed in” (sighted properly) to be dead center on the crosshairs @ 100 yards when shooting horizontally.
Why is it necessary then, with the same setup, to raise (elevate) one’s aimpoint above the intended target (at the same linear distance, i.e. 100 yds.) when shooting uphill or downhill to achieve a dead center shot? For the sake of argument, let’s say the angle is 45 degrees.
Yeah, I confused my shoot over with my shoot under… Thanks.
But your Chuck Hawks Cabela’s quote doesn’t quite explain it for me.
Line of sight or not, 100 yds is 100 yds.
A ballistic trajectory is approximately parabolic. So draw/visualize a parabola and align it to the angle of the barrel at slug exit. Notice that when you hold the barrel horizontally, the slug begins its journey at the apex of the curve, where the path falls away from the line of aim at its maximum rate, but the further you move the barrel from horizontal, the more slowly the path falls away from the line of aim at exit. Presumably the sight is tuned to the horizontal descent curve.
Draw a vector of the velocity at the point the bullet leaves the gun barrel. Then break the vector into its x and y parts. The x part will tell you how long it will take to get to the target (for a target at the same horizontal distance, the angled shot takes longer, because the x component of the velocity is smaller). You can calculate the x and y trajectories independently and then add them together at the end.
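To make that concrete, here’s a minimal sketch of the x/y decomposition, in a vacuum and with a purely hypothetical 2700 ft/s muzzle velocity; it steps the two components independently and reports how far below the bore line the bullet sits when it reaches the target’s horizontal distance:

```python
import math

def drop_below_bore_line(muzzle_fps, slant_yards, angle_deg, g=32.174):
    """Split the muzzle velocity into x and y parts, fly each independently,
    and report (in feet) how far below the straight bore line the bullet sits
    at the target's horizontal distance. Vacuum only, no drag."""
    angle = math.radians(angle_deg)
    slant_ft = slant_yards * 3.0
    vx = muzzle_fps * math.cos(angle)       # horizontal component of velocity
    vy = muzzle_fps * math.sin(angle)       # vertical component of velocity
    t = (slant_ft * math.cos(angle)) / vx   # time to reach the target's x distance
    bore_line_height = vy * t               # height of the bore line at that x
    bullet_height = vy * t - 0.5 * g * t * t
    return bore_line_height - bullet_height

print(drop_below_bore_line(2700, 100, 0))   # level shot: ~0.20 ft
print(drop_below_bore_line(2700, 100, 45))  # 45 deg uphill, same slant range: ~0.20 ft
```

Measured straight down, the drop comes out about the same either way; the difference discussed later in the thread is in how that drop sits relative to the tilted line of sight.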
The bullet is pulled from a direct path by gravity, and the horizontally “dialled-in” sight aims high to compensate. When the desired trajectory is horizontal, the entire 1g force is applied perpendicular to the path. When the trajectory is uphill or downhill, the 1g force is applied at an angle greater or less than 90-deg relative to the desired trajectory, so either uphill or downhill the component of gravity perpendicular to the trajectory is less than the 1g of the horizontal case. Less deflection relative to the desired path means that one should aim less high than the horizontal case, i.e. apparently aiming low relative to a horizontally dialled-in sight.
For insight, consider the extreme case of aiming vertically up or down. In either case, the bullet would not be deflected at all from the desired trajectory by gravity.
Post #2 pretty much got this right. With a 100-yard zero, you would need to hold under if you were shooting level at a target closer than 100 yards. Likewise, if you were shooting either uphill or downhill and the straight line distance to target is still 100 yards, your horizontal distance is reduced. If it’s a 45 degree slope, the horizontal distance is 70.7 yards, so hold under the same as if it was level but only 70.7 yards away.
As a footnote, there’s a second-order effect that the uphill and downhill situations are not quite identical, because the parallel component of gravity slows the bullet in the uphill case and accelerates it in the downhill case, so the time of flight is different. You’d have to aim low in both cases, but slightly lower in the downhill case.
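A quick sketch of the rule as stated above (treat the shot as a level shot at its horizontal distance), ignoring drag and the second-order footnote; the 2700 ft/s muzzle velocity is just a placeholder:

```python
import math

def equivalent_horizontal_yards(slant_yards, angle_deg):
    """Hold as if the target were a level shot at its horizontal (map) distance."""
    return slant_yards * math.cos(math.radians(angle_deg))

def vacuum_drop_inches(muzzle_fps, level_range_yards, g=32.174):
    """Drop over a level range in a vacuum -- crude, but enough to show the trend."""
    t = (level_range_yards * 3.0) / muzzle_fps
    return 0.5 * g * t * t * 12.0

print(equivalent_horizontal_yards(100, 45))   # 70.7 yd
print(vacuum_drop_inches(2700, 100))          # ~2.4 in of drop for a level 100 yd shot
print(vacuum_drop_inches(2700, 70.7))         # ~1.2 in for the 70.7 yd hold
```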
Try it this way: If you’re shooting uphill at a 45 degree angle at a target that’s 100 yards away (slant range), then that target is only 70 yards away horizontally. (And 70 yards away vertically, but that’s not important now).
From gravity’s POV the target is only 70 yards away. Since that’s closer than the zero-sighted range, you need to aim lower to score a bull.
The same numbers apply if shooting downhill at a 45 degree angle. (Net of Riemann’s small adjustment just above).
Because the equivalent horizontal range goes as the cosine of the sight-line angle, and the cosine stays close to 1 until the angle gets steep, the effect is real small until you get to real large angles. For most normal shooting situations it gets lost in the noise of overall accuracy & repeatability. Which is why it can remain controversial and confusing to the non-physics/math geeks in the crowd.
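To put rough numbers on how small it stays, here’s a quick loop over a hypothetical 300 yd slant range:

```python
import math

slant_yd = 300  # hypothetical slant range
for angle in (5, 10, 15, 20, 30, 45):
    horiz = slant_yd * math.cos(math.radians(angle))
    print(f"{angle:2d} deg: horizontal distance {horiz:5.1f} yd "
          f"(only {slant_yd - horiz:4.1f} yd less than the slant range)")
```

Below roughly 10-15 degrees the difference is only a few yards out of 300, which is indeed lost in the noise.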
Guys, I’m using 100yds. and 45 deg. as nice & easy conceptual numbers, not necessarily as an example of a real-life situation. It is understood that: @ 100 yds. the effect generally doesn’t come into play, and that it is progressively magnified with increasing range and angle of inclination/declination. I haven’t hunted for decades, so why I care, who knows? It’s just something that stuck in the back of my mind and never went away.
ANYWAY… As I interpret some of the comments, it has to do with the horizontal distance of the trajectory, as derived from the angle of flight, not the line-of-sight distance?
Here all along I thought projectile drop was simply a function of flight time based upon a constant, invariable, acceleration of gravity, period. So it followed (in my mind) that flight time being the same over any 100 yd path, (be it horizontal, 22.5, 45, or 68 degrees) a projectile would be subject to the same magnitude of gravitational force. Hence my confusion.
So if I understand correctly, the calculation for the acceleration of gravity is a (vector?) function based upon the number of degrees from horizontal a projectile is traveling? Simply put: Gravity ain’t the same for an object traveling at an angle?
The magnitude of the force of gravity is always the same, and you can treat the flight time (to a first approximation for small angles) as always the same.
The difference is the direction in which gravity is acting relative to the path of the bullet.
If the path of the bullet is horizontal, gravity is acting at right angles to the path, and has the greatest possible effect in deflecting the bullet. If the bullet is traveling at a 45-degree angle (whether uphill or downhill), then gravity is acting at 45 degrees to the path. It is only the component of the gravity force vector that is perpendicular to the path that deflects the bullet, so now that component is smaller.
Think about the extreme case of firing the bullet vertically, whether up or down. Now gravity won’t deflect the bullet off its path at all, because gravity is only acting parallel to the path.
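The same idea in numbers: only the component of gravity perpendicular to the aim line pushes the bullet off that line, and it shrinks as the cosine of the angle. A tiny sketch:

```python
import math

g = 1.0  # work in units of one g
for angle_deg in (0, 30, 45, 60, 90):
    perpendicular = g * math.cos(math.radians(angle_deg))  # bends the bullet off the aim line
    along_path = g * math.sin(math.radians(angle_deg))     # only speeds up or slows down the bullet
    print(f"{angle_deg:2d} deg from horizontal: "
          f"{perpendicular:.3f} g across the path, {along_path:.3f} g along it")
```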
Just a brain fart here, BUT: If I’m on the right track… It would require a greater energy input to cause an object to travel a horizontal path of distance x, than the same object @ 45 degrees over distance x ?
That’s a second order effect, forget about it for now.
The important thing is - how large is the force pulling the bullet down from the aimed path. That force is largest when gravity is acting exactly at right angles to the bullet’s path, i.e. when the aimed path is horizontal. The force pulling the bullet off its aimed path is zero when gravity is acting parallel to the bullet’s path, i.e. when firing vertically up or down. Do you get that so far?
Now you should be able to see that the force is an intermediate amount for a “diagonal” shot aimed uphill or downhill.
I don’t understand: If the measurable, quantifiable, force of gravity is always the same, irrespective of the angles we are discussing here, how can it have varying effects on an object at said angles? It seems contradictory.
No intention to be argumentative here… I just do not understand.
ETA The examples of vertical (up and down) projectiles have always served to perplex me…sorry.
But gravity works over time, not distance. And it still takes the same amount of time for the shot to get there, so it falls the same distance.
It took me a few moments to figure out what’s going on. Ignoring velocity loss due to air resistance or gravity, the bullet will drop the same distance from its straight-line target regardless of the angle. But note that it always drops straight down. From the perspective of the scope, a straight-down drop from an angle appears smaller than it would be head-on. And conversely, a given amount of drop from the scope would correspond to a greater amount of physical drop if it happened to be at an angle.
To be more concrete: suppose the rifle’s muzzle velocity is 1500 ft/s and the target is 300 feet away. The bullet will drop 0.64 feet at the end of its path regardless of angle.
If looked at head-on, this makes an angle of 0.12 degrees, so you would point the scope down at that angle if calibrating for that muzzle velocity and distance. But if you were shooting up at 45 degrees, it would look like only 0.09 degrees. So the pre-compensation in the scope would be too much, and the shooter would have to point down a bit to correct for it.
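Those figures check out with simple vacuum kinematics (g = 32.174 ft/s², no drag):

```python
import math

v = 1500.0     # ft/s, from the example above
dist = 300.0   # ft to the target
g = 32.174     # ft/s^2

t = dist / v                    # time of flight: 0.2 s
drop = 0.5 * g * t * t          # straight-down drop, independent of angle
print(round(drop, 2))           # 0.64 ft

head_on = math.degrees(math.atan2(drop, dist))                                  # level shot
uphill_45 = math.degrees(math.atan2(drop * math.cos(math.radians(45)), dist))   # seen along a 45 deg sight line
print(round(head_on, 2), round(uphill_45, 2))                                   # 0.12 vs 0.09 degrees
```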
Do you understand that if you fire a bullet vertically straight up in the air, gravity will not deflect the bullet from its aimed path? It will just act to slow the bullet down.
I drew you a diagram, I think you’ll get it in a few seconds with a picture, much easier than words. Remember that the amount by which a bullet is deflected from its aimed path depends on the size of the force acting at right angles to the aimed path. See here:
So, if the amount of force acting to deflect the bullet from its aimed path is the full 1g of gravity when aiming horizontally, and zero when aiming vertically, then when aiming at an intermediate diagonal angle uphill or downhill, it must be some intermediate figure less than 1g, right? So now see the diagram I posted.