At what point would increasing the distance of the rubber from the plate have a significant, maybe 20 points or so, effect on the batting average of most MLB Players? Would 2-3 feet make a difference?
90 mph is 475,200 feet per hour, or 132 fps. A 90 mph fastball gets to the plate in 0.458 seconds. (This isn’t exact, since the ball is released from the hand well closer to the plate than 60.5 feet, but for the sake of an answer bear with me.)
Moving the mound back two feet means the ball would get there in 0.473 seconds, or 0.015 seconds more. I don’t see that as a difference maker, certainly not a 20-point difference maker. My guess is you’d have to move it back about 10 feet to have that sort of difference. At that distance the ball would take 0.534 seconds to get there, or about 0.076 seconds more.
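The arithmetic above can be sketched in a few lines. This is a back-of-the-envelope check only; it assumes the pitch holds a constant 90 mph over the full distance and ignores air drag and the actual release point:

```python
# Flight-time check, assuming constant speed (no drag) over the full distance.
MPH_TO_FPS = 5280 / 3600  # 1 mph = 1.4667 ft/s

def travel_time(distance_ft, speed_mph):
    """Seconds for a pitch at speed_mph to cover distance_ft."""
    return distance_ft / (speed_mph * MPH_TO_FPS)

base = travel_time(60.5, 90)
print(round(base, 3))                           # ~0.458 s from the rubber
print(round(travel_time(62.5, 90) - base, 3))   # ~0.015 s extra for 2 ft
print(round(travel_time(70.5, 90) - base, 3))   # ~0.076 s extra for 10 ft
```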
But a ball thrown at 90 mph won’t remain at 90 mph. How quickly does it lose velocity?
Distance is also a factor in pitch control. Sneaking one over the corner gets harder with even a few extra feet, which would also improve hitting stats. Some pitchers would have to slow their pitches to maintain accuracy.
I’d like to expand on Lamar Mundane’s example. According to this web site, the average distance from where the ball leaves the pitcher’s hand to where it makes contact with the bat is 53.5 feet. At 90 mph (132 feet per second), that takes about 0.405 seconds.
It would take the ball about 0.420 seconds to travel two feet farther (55.5 feet). We’re used to thinking of pitches in terms of speed, not reaction time. A pitch that traveled the original distance (53.5 feet) in that amount of time (0.420 seconds) would be going about 86.8 mph. So by moving the pitching rubber back by two feet, we’ve made a 90 mph pitch the equivalent of an 87 mph pitch (in terms of reaction time).
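The "equivalent speed" conversion above reduces to a simple ratio: same flight time over a shorter reference distance means a proportionally lower apparent speed. A minimal sketch (constant speed assumed, function name is mine):

```python
def equivalent_speed(speed_mph, actual_ft, reference_ft):
    # Same flight time, shorter reference distance -> lower apparent speed.
    return speed_mph * reference_ft / actual_ft

# A 90 mph pitch over 55.5 ft, expressed as a speed over the usual 53.5 ft:
print(round(equivalent_speed(90, 55.5, 53.5), 1))  # ~86.8 mph
```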
20 points is a big difference in batting average. Let’s say batters could hit 20 points higher against an 83 mph pitch than against a 90 mph pitch (I don’t know that this is true; I just suspect 83 mph is in the right range). At that speed (about 121.7 feet per second) it would take about 0.440 seconds for the ball to travel 53.5 feet. A 90 mph pitch would travel about 58 feet in the same amount of time, or about 4.5 feet farther. So moving the pitching rubber back by 4.5 feet (to 65 feet) would make a 90 mph pitch roughly the equivalent of an 83 mph pitch thrown from the standard distance.
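Running that step the other way: given a target "feels like" speed, solve for the release distance that gives the batter the same reaction time. Same constant-speed assumption as above, and the function name is my own:

```python
def distance_for_equivalent(speed_mph, target_mph, reference_ft=53.5):
    # Equal flight times: reference_ft / target = d / speed
    #   ->  d = reference_ft * speed / target
    return reference_ft * speed_mph / target_mph

d = distance_for_equivalent(90, 83)
print(round(d, 1))          # ~58.0 ft release distance
print(round(d - 53.5, 1))   # ~4.5 ft farther back
```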
None of this takes into account the problem of pitch control that TriPolar pointed out.
I was thinking the extra few feet could have an effect based upon the hitter seeing the ball for an instant longer.
Move the pitcher back to the center field wall and I could get on base. (Base on balls)