I was watching my beloved Red Sox the other day, and my favorite marvel of nature, Tim Wakefield, was pitching. For those who don't know, he is a knuckleball pitcher. As such, his fastball is about 70 mph and his standard knuckleball goes as low as 53 mph.
I am a reject when it comes to physics, but I would imagine there is a minimum speed at which a pitch can cross the plate. I would assume gravity would put, say, a 20 mph pitch in the dirt before it reached the plate. Am I correct in this assumption? If so, what is the minimum speed one can throw a pitch and still have it cross the plate?
Well, that's an interesting question. And more complicated than it looks.
A pitch traveling at 20 mph would be moving at 264 feet per second. If thrown flat, it would take roughly 1/4 second to get to the plate (60 feet 6 inches from the pitching rubber to the leading edge of home plate). In that time it would drop approximately 8 feet (one quarter of 32 feet per second per second, if I'm right).
Given that an overhand pitcher's release point should be between 7 and 8 feet above the level of home plate (pitcher's height plus the overhand throw plus the height of the mound), the pitch should be at or very near the ground as it reaches home plate.
A slower pitch can still be achieved if the pitcher throws the ball with upward velocity, as in the eephus pitch of days long gone. I'm not aware that a speed was ever clocked for one of those.
How far is it from the plate to the mound, and how high off the ground is the mound?
I did think of the eephus pitch, and I believe it should be excluded from the discussion. Let us assume that the pitch does not go more than a foot or two above the batter's head.
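For what it's worth, here's a rough way to put a number on that constraint: a little Python toy model (my own, not anything from this thread) that sweeps launch angles and keeps the slowest throw whose arc stays under a cap. The 7-foot release height and 8-foot arc ceiling are just my guesses at "a foot or two above the batter's head," and it ignores air resistance and spin entirely:

```python
import math

G = 32.174          # gravity, ft/s^2
PLATE_DIST = 60.5   # rubber to plate, ft
RELEASE_H = 7.0     # assumed release height above home-plate level, ft
APEX_CAP = 8.0      # assumed ceiling on the arc (no eephus moonballs), ft

def slowest_legal_arc():
    """For each launch angle, find the slowest throw that still reaches the
    plate at ground level, then keep only arcs that stay under APEX_CAP."""
    best = None
    for tenth_deg in range(0, 451):                  # 0.0 to 45.0 degrees
        theta = math.radians(tenth_deg / 10)
        # Solve y(PLATE_DIST) = 0 for v, where
        #   y(x) = h0 + x*tan(theta) - g*x^2 / (2 * v^2 * cos^2(theta))
        v = math.sqrt(G * PLATE_DIST**2 /
                      (2 * math.cos(theta)**2 *
                       (RELEASE_H + PLATE_DIST * math.tan(theta))))
        apex = RELEASE_H + (v * math.sin(theta))**2 / (2 * G)
        if apex <= APEX_CAP and (best is None or v < best):
            best = v
    return best * 3600 / 5280                        # ft/s -> mph

print(f"slowest pitch under the cap: about {slowest_legal_arc():.0f} mph")
```

With those made-up numbers it comes out around 44 mph: even the small allowed upward angle buys you a lot over a perfectly flat throw.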
I have also wondered about this after watching countless celebrities and the like throw out first pitches and rarely get them to home plate. Made me wonder (never having pitched before) if it was that hard and if you needed that much steam to get the ball over the plate.
Well, I never could get above about 35 mph on pitching machines when trying out in school.
That kills one's dreams of major league glory pretty quickly.
Distance from pitcher’s rubber to home plate: 60 feet 6 inches
Height of mound: 10 inches at the rubber
jon: I don't believe it would be 8 feet… you have to remember that 32 ft/s^2 is an acceleration, not a velocity. s = 0.5at^2, so it would be (0.5)(32)(0.25^2) = 1 foot of vertical drop in a quarter second.

Assuming a full 8 feet of vertical drop is available, the minimum speed to just reach the plate comes from 8 = 0.5(32)t^2, so t = sqrt(1/2) s, and v = 60.5 ft / sqrt(1/2) ≈ 85.6 ft/s ≈ 58.3 mph. Wow… there's obviously an error somewhere in my work, since Wakefield gets his 53 mph knuckleball there just fine.

And jon… where did you get 264 ft/s? (20 mi / 1 hr)(1 hr / 3600 s)(5280 ft / 1 mi) = 29.333… ft/s.
duh… :slap … Vf^2 = 2as, so Vf = sqrt(2·32·8) = 8·sqrt(8) ≈ 22.6 ft/s ≈ 15.43 mph.
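To keep the arithmetic from the last couple of posts honest, here's the whole thing as a few lines of Python (same idealized setup everyone is using: no air resistance, no spin, released dead flat):

```python
import math

G = 32.0  # ft/s^2, close enough for this

# 20 mph in ft/s
v20 = 20 * 5280 / 3600
print(f"20 mph = {v20:.1f} ft/s")                        # ~29.3, not 264

# vertical drop in a quarter second: s = 0.5 * a * t^2
print(f"drop in 0.25 s = {0.5 * G * 0.25**2:.2f} ft")    # 1 ft, not 8

# flat throw from 8 ft up: time to fall, then speed to cover 60.5 ft
t = math.sqrt(2 * 8 / G)
v_min = 60.5 / t
print(f"fall time = {t:.3f} s, min flat-throw speed = "
      f"{v_min:.1f} ft/s = {v_min * 3600 / 5280:.1f} mph")  # ~58.3 mph
```

So a perfectly flat 58 mph pitch released 8 feet up just barely scrapes the plate; anything slower has to be thrown with some upward angle.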
Note: when the pitcher releases the ball, the release point is closer to the plate than 60' 6".
Good point, Philster. Assume 55', then, and a height of 8'. So the question becomes: (a) how long does it take a ball to drop 8', and (b) what is 55 divided by (a)? That's the answer in ft/sec.
I have no idea what the answer to (a) is, not having taken physics.
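(a) is just free fall: t = sqrt(2h/g), about 0.71 seconds. A two-line check, taking the 55-foot release distance and 8-foot drop above as given:

```python
import math

G = 32.0                   # ft/s^2
t = math.sqrt(2 * 8 / G)   # (a) time to fall 8 ft: ~0.707 s
v = 55 / t                 # (b) required speed: ~77.8 ft/s
print(f"(a) {t:.3f} s   (b) {v:.1f} ft/s = {v * 3600 / 5280:.1f} mph")
```

That comes out to about 53 mph, which, amusingly, is right where Wakefield's slow knuckleball lives.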
I suppose if we are talking about Randy Johnson, who stands 6'10", the ball could drop eight feet and still almost be in the lower reaches of the strike zone.
I was just amused imagining what a Wakefield off-speed pitch might look like.