A golf/baseball physics question, and an old thread

I stumbled across this very old Straight Dope thread on a Google search about golf dynamics:

The crux of the thread, basically, is which will hit a golf ball farther: a golf club or a baseball bat. That’s exactly the question I was trying to answer with my Google search.

I want to get back to that question, but in the meantime I want to ask about what seems to be a definitive answer in the thread to a sub-question, and an answer that doesn’t feel right to me.

Member **Omniscient** claims that a pitched ball (i.e., one moving with some velocity in a direction opposite the swinging bat) will travel farther than a stationary ball hit off a tee:

How can that last line be correct? That ball speed in the opposite direction can add to the energy of the bat? That seems wrong on so many levels. For one, aren’t we dealing with a simple conservation of momentum problem here? Before impact the two objects have momenta with opposite signs. Doesn’t conservation of momentum say that a stationary object (momentum zero) will rebound with more momentum in the opposite direction (in the case of this collision) than one moving in the opposite direction?

I know that we are dealing with impact dynamics for objects (the club or the bat) that have very high coefficients of restitution, but I don’t understand how that affects the question at hand. You can’t get a COR over 1.0, which is what would seem to be required if what **Omniscient** suggests is true.

I think I get where he is coming from, conceptually, when I imagine the case of a tennis player just “blocking” the incoming shot, and the ball rebounding back over the net with seemingly no effort of the player’s own. But…that ball is never going to come off the racket with more momentum than it had before (even if the signs are reversed).

Anecdotally, I know that I have hit softballs off a tee well over 300’, which was at least as well as I could do on a pitched ball. And I think I can guarantee that his “guarantee” (that MLB players can’t consistently hit a teed ball out of the park) is worthless. On the contrary, I’m sure that every person who ever played in the major leagues could do it. It ain’t that hard.

So…that’s my first physics question. My second one is the same as was offered in that old thread: Which would go farther, a teed golf ball hit with an aluminum softball bat, or a teed golf ball hit with a driver?

The one thing that was not addressed in the old thread (when talking about the distances the respective balls travel) is the difference in the masses of the balls. Softballs routinely travel over 300’, and up to probably 400’ or more for the biggest hitters; a softball has a mass of 6.8 ounces. Golf balls routinely travel over 300 yards, and up to probably 400 yards or more for the biggest hitters; a golf ball weighs 1.6 ounces.

The aerodynamics of the two are different, to be sure. But the question I want to answer is this: Which one, the batter or the golfer, is transferring more energy to the ball?

What bounces higher, a ball dropped from one inch or a ball dropped from one foot?

Which deflects the bounced surface more, the ball dropped from one inch or the ball dropped from one foot?

When two objects collide, the speed at which they rebound away from each other depends on the speed at which they approached each other, regardless of the speed each object has relative to the observer. For example, consider a ball traveling at 100 MPH (relative to the ground) that hits a stationary bat (again, relative to the ground): the speed at which the ball travels away from the bat (note: relative to the bat) will be the same as if a bat swung at 100 MPH hit a stationary ball (say, on a batting tee).

To understand this, it is important to realize that the frame of reference of the observer is arbitrary. For example, imagine the ball is moving at 100 MPH relative to the ground, and you are in a car going the same speed alongside the ball. It would look to you like the ball was stationary (well, actually, like it was falling straight down, since gravity is still working). Now this stationary-looking ball is met by a bat that looks to you like it’s going 100 MPH in the opposite direction. From the car’s frame of reference, a 100 MPH bat hit a stationary baseball, while from the ground’s frame of reference a 100 MPH ball hit a stationary bat. The result is the same, at least with respect to how quickly the ball rebounds away from the bat.

Momentum is conserved because when the ball rebounds away from the bat, the bat also rebounds away from the ball. Again, consider a ball traveling 100 MPH (relative to the ground) that hits a stationary bat (relative to the ground). When the ball bounces off the bat, the bat will be pushed back a little bit, due to conservation of momentum. In the case where a bat swung at 100 MPH hits a ball on a batting tee, the bat slows down a little bit as a result of the collision, again due to conservation of momentum. The frame of reference is still arbitrary - in one frame momentum is conserved by the bat being pushed back, and in the other momentum is conserved by the bat slowing down.

Once you see that the frame of reference is arbitrary - that with a given closing speed the same thing happens regardless of whether the speed is due to the motion of the bat, the ball, or both - it becomes obvious that a higher ball speed before the collision must result in a higher ball speed after the collision.
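The frame-shift argument above can be sketched numerically with a 1-D collision model. The 5 oz and 30 oz masses are only rough ball and bat weights, and a coefficient of restitution of 1 (perfectly elastic) is a simplification:

```python
# A sketch of the frame-shift argument, using a 1-D collision with a
# coefficient of restitution e. Masses (5 oz ball, 30 oz bat) are rough
# figures; e = 1.0 (perfectly elastic) is an illustrative simplification.

def collide(m_ball, v_ball, m_bat, v_bat, e=1.0):
    """Return (ball, bat) velocities after a 1-D collision with restitution e."""
    p = m_ball * v_ball + m_bat * v_bat          # total momentum, conserved
    closing = v_bat - v_ball                     # approach speed
    ball = (p + m_bat * e * closing) / (m_ball + m_bat)
    bat = (p - m_ball * e * closing) / (m_ball + m_bat)
    return ball, bat

# Ground frame: 100 MPH ball meets a stationary bat.
b1, bat1 = collide(5, -100, 30, 0)
# Car frame (every velocity shifted by +100): stationary ball, 100 MPH bat.
b2, bat2 = collide(5, 0, 30, 100)

# The ball's rebound speed relative to the bat is identical in both frames,
# and the two outcomes differ by exactly the 100 MPH frame shift.
print(b1 - bat1, b2 - bat2)
```

With e = 1 the relative rebound speed equals the closing speed in either frame, which is the whole point: only the closing speed is physical; the split between "bat speed" and "ball speed" depends on the observer.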

If what you say is correct, then a one hundred mile an hour fastball hitting a brick wall should drop to the ground and die at the face of the wall.

ETA: smarter heads prevail

If you’re coming in my direction, whether you are a ball or a car moving just as fast as a ball, the first thing I have to do in order to send you in the opposite direction is to overwhelm the momentum you have coming in my direction. If you are in a car, I am not going to be able to do that. You will overwhelm my bat, and we will both go in your direction.

Which is the same thing that happens to the ball in this circumstance, of course, when the bat does work on it.

Equal and opposite and all that 17th century stuff.

Hitting a moving car with a bat is the same as hitting a stationary car with a bat moving at swing speed + speed of car. The car will move back a little on its springs, but since it is not very elastic, most of the energy will go into deforming the metal or breaking the glass.

What you see with a moving car will depend on your frame of reference. If it’s the car, then it is the same as the stationary scenario above. If it’s the ground or the bat, then the car will continue to move forward, albeit more slowly.

What makes you think that hitting a ball with an object moving 100 MPH is different than a ball and an object moving towards each other, each traveling 50 MPH relative to the ground?

As it works out in the real world:

A slightly slower ‘Home Run Derby’ 80 MPH pitch is sometimes crushed just as far (and then some) as a blazing 98 MPH pitch, but it has little to do with all that was explained above. Well, some… but other variables come into play for the observer.

In the real baseball world, when the batter can sit back on a certain pitch and location AND swing with perfect timing, or get ahead and generate more bat speed, you have a new set of variables to contend with, because now you have extra bat speed over a ‘typical’ game scenario and are more likely to connect at the sweet spot of the bat… plus, game-time conditions are usually early- to mid-July, which results in decently high humidity with warm air, creating far-flying balls.

Toss in the greatest HR hitters, and at the Home Run Derby specifics above, you can expect ‘measuring tape’ home runs, which means, “WOW… that was about as far, or farther, as I have ever seen a HR ball go!”… even though it was hit off an 80 MPH pitch and not a 100 MPH pitch.


Because velocity, and thus momentum, are vector quantities, meaning they have both a magnitude and a direction. And in this case, their directions are opposite. Some energy must be used up in turning the ball around.

To continue with your line of reasoning, what would be the difference between the bat moving 1000 mph and the ball moving 50 mph, and the other way round? I don’t know at exactly what speed, but at some point the ball is going to be moving too fast for the bat to even turn it around at all. At that point the ball would push the bat back and keep on going forward.

The difference there is 950 MPH. It’s the same case, observed from two different frames of reference, one moving 950 MPH relative to the other.

And yes, at some point the speed of the ball will be fast enough that it will knock the bat back. That speed will depend on the relative masses of the bat and ball. So there is some “optimum” pitched ball speed that will result in the ball traveling farthest. For a baseball and bat, that speed isn’t zero (and it’s probably below 1000 MPH).

For the normal baseball case, because the ball weighs much less than the bat, the bat speed doesn’t change much between when the ball is on a tee and pitched at 100 MPH. The ball will bounce off the bat faster in the second case, but the bat won’t be moving that much slower after hitting the ball, so the ball is moving faster relative to the ground.

Let me throw out some made-up numbers, to see if that helps.

Case 1: ball on tee, bat moving 100 MPH. Closing speed is 100 MPH, and ball bounces off bat with that same speed relative to the bat. After the hit, the bat is moving 10 MPH slower. The ball is moving 190 MPH, and the bat is moving 90 MPH.

Case 2: ball pitched 100 MPH and bat moving 100 MPH. Closing speed is 200 MPH, and ball bounces off bat with that same speed relative to the bat. After the hit, the bat is moving 20 MPH slower. The ball is moving 280 MPH, and the bat is moving 80 MPH.

The key is the bat doesn’t slow that much, because it weighs more than the ball.
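Those made-up numbers turn out to be internally consistent: they match a perfectly elastic 1-D collision (coefficient of restitution 1) between a ball and a bat 19 times heavier. The 19:1 mass ratio is inferred from the numbers themselves, not a real bat/ball figure:

```python
# The "made-up numbers" above correspond to a 1-D elastic collision
# (restitution e = 1) with a bat 19 times the ball's mass. Both the ratio
# and e = 1 are inferences from the example, not real equipment figures.

def exit_speeds(v_ball, v_bat, mass_ratio=19.0, e=1.0):
    """Post-collision (ball, bat) speeds; bat is mass_ratio times the ball."""
    m, M = 1.0, mass_ratio
    closing = v_bat - v_ball
    ball = (m * v_ball + M * v_bat + M * e * closing) / (m + M)
    bat = (m * v_ball + M * v_bat - m * e * closing) / (m + M)
    return ball, bat

print(exit_speeds(0, 100))     # Case 1, ball on tee:   (190.0, 90.0)
print(exit_speeds(-100, 100))  # Case 2, 100 MPH pitch: (280.0, 80.0)
```

In both cases the ball leaves the bat at the closing speed relative to the bat's post-collision speed, and momentum is conserved (the ball's gain is 19 times the bat's loss).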

Nothing.

Roughly speaking: A baseball bat is how many times heavier than a baseball? That’s how much faster than normal the pitcher would have to throw to get up to the optimum home run speed.

No. A ball traveling 50 MPH hit by a bat moving 1000 MPH will initially be traveling 950 MPH faster than a ball traveling 1000 MPH hit by a bat moving 50 MPH.

That is simply not true. Let’s put your experiment in space where there is no nearby body to create a privileged frame of reference. How could we determine whether the bat is moving 1000 MPH and the ball is moving 50, the ball is moving at 1000 MPH and the bat is moving at 50, or they are each moving at 525 MPH?

A baseball is about 5 ounces, and a bat about 30 ounces, so a factor of six. So if we believe the 100 MPH batspeed figure (which I think was pulled out of the air, but sounds about right), the optimum pitch speed would be around 600 MPH.
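One way to read that rule of thumb (an interpretation, not something stated outright in the post) is as a zero-net-momentum condition: a pitch at (bat mass / ball mass) × bat speed gives ball and bat equal and opposite momentum, so beyond that speed the ball wins the exchange.

```python
# Sketch of the rule of thumb above as a zero-net-momentum condition.
# The 5 oz / 30 oz weights and 100 MPH bat speed are the rough figures
# from the post, not measured values.
m_ball, m_bat = 5.0, 30.0      # ounces
bat_speed = 100.0              # MPH
pitch_speed = (m_bat / m_ball) * bat_speed
net_momentum = m_bat * bat_speed - m_ball * pitch_speed
print(pitch_speed, net_momentum)   # 600.0 0.0
```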

We’re not talking about space, we’re talking about a batter at the plate. All speeds relative to the ground.

A further complication is the spin on the ball. What was the pitch that most pitchers feared to make lest it end up in the stands? Not a slower than usual fastball, but a hanging curve – one that breaks, but not as much as it should. If it were solely a matter of action and reaction, then a hanging curve wouldn’t be as easy to hit for a HR since it was going even slower.

A recent study showed that curve balls did indeed travel further than fastballs when hit because they developed extra lift from their spin. If thrown properly, this isn’t a problem because the batter doesn’t make good contact, but if the curve doesn’t break, the spin becomes lift and the hanging curve ends up in the stands.
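For a rough sense of scale on the lift claim, here is a toy Magnus-lift estimate. The lift coefficient is an assumed illustrative value, not a figure from the study mentioned above:

```python
# Toy Magnus-lift estimate for a spinning baseball. The lift coefficient
# Cl = 0.2 is an assumption for illustration only; real values depend on
# spin rate and speed.
import math

rho = 1.2                      # air density, kg/m^3
A = math.pi * 0.0366**2        # baseball cross-section (radius ~36.6 mm)
v = 45.0                       # ball speed, m/s (~100 MPH)
Cl = 0.2                       # assumed lift coefficient from backspin
lift = 0.5 * Cl * rho * A * v**2
weight = 0.145 * 9.81          # a ~145 g baseball

# Fraction of the ball's weight offset by lift; even this crude estimate
# gives a substantial fraction, which is why backspin extends carry.
print(lift / weight)
```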

And what difference does that make?