I was listening to Vin Scully this morning on a tape from the 1980s (hey! we have our own ways of amusing ourselves) rattling on, as is his wont, about how minor league batters let pitchers get away with 2-0 cripple pitches that their major league counterparts drive into the gaps.
Now, I hate Scully’s consistent lack of reasoning here: obviously, MLB hitters are just flat-out better than minor league batters (that’s why they’re in the major leagues), and it’s pretty well unprovable that the big difference shows up on a 2-0 count specifically. But this got me thinking:
How much better are MLB hitters than minor leaguers? If Babe Ruth had played the 1921 season, for some bizarre reason, in the minors, would he have batted .400? .500? .600? You see it all the time: young players come up to the majors and start hitting well from the get-go, and very often these are NOT players who burned up the minor leagues statistically, which makes me think that the difference between the two groups of hitters, while certainly significant, is much subtler than the Vin Scullys of the world believe.
Could we design a study (or has anyone done so?) that would quantify the difference? I might suggest isolating every batter over a long period of time who played a full minor league season and then a full major league season the following year, and comparing the difference in their batting stats. That difference would then need to be normed against the ordinary year-to-year change for players of the same ages who stayed put, comparing consecutive seasons that are both at the MLB level or both at the minor league level, to gauge how much of the improvement is attributable simply to the difference between being 21 and 22, or 23 and 24, or whatever the average ages turn out to be in the first study. Any stats geeks out there know if someone has tried to study this effect in this fashion?
I hope so, because if no one tells me YES, I may sink a week of my life into doing this study, and I ain’t got that much time free these days.
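In case I do end up sinking that week into it, here’s a rough sketch in Python/pandas of the comparison I’m imagining. Everything in it is a placeholder: the column names, the 400-plate-appearance cutoff for a “full” season, and batting average as the stat of choice. It’s just to make the design concrete, not a finished implementation.

```python
import pandas as pd

FULL_SEASON_PA = 400  # assumed cutoff for a "full" season


def consecutive_pairs(df: pd.DataFrame) -> pd.DataFrame:
    """Pair each full season with the same player's season one year later.

    Expects columns: player_id, year, age, level ('MLB' or 'MiLB'),
    pa (plate appearances), avg (batting average) -- all hypothetical names.
    """
    df = df[df["pa"] >= FULL_SEASON_PA].sort_values(["player_id", "year"])
    nxt = df.groupby("player_id").shift(-1)  # next season's row, per player
    pairs = df.assign(
        next_year=nxt["year"],
        next_level=nxt["level"],
        delta=nxt["avg"] - df["avg"],  # change in batting average
    )
    # Keep only back-to-back seasons (year N followed by year N+1).
    return pairs[pairs["next_year"] == pairs["year"] + 1].dropna(
        subset=["next_level", "delta"]
    )


def level_effect(seasons: pd.DataFrame) -> float:
    """Estimated batting-average cost of jumping from the minors to MLB."""
    pairs = consecutive_pairs(seasons)
    # Study group: a full minor league season followed by a full MLB season.
    promoted = pairs[(pairs["level"] == "MiLB") & (pairs["next_level"] == "MLB")]
    # Control group: players who stayed at one level both years, restricted
    # to the ages seen in the study group, so ordinary aging (21 -> 22,
    # 23 -> 24, etc.) is matched rather than mistaken for a level effect.
    stayed = pairs[pairs["level"] == pairs["next_level"]]
    stayed = stayed[stayed["age"].isin(promoted["age"].unique())]
    # Raw change for promoted hitters, net of the change expected from
    # aging alone, is the quantity the study is after.
    return promoted["delta"].mean() - stayed["delta"].mean()
```

A negative number out of `level_effect` would be the points of batting average the jump costs, net of normal year-to-year improvement; batting average could just as easily be swapped for OPS or anything else once the data is in hand.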