I’m going to change the parameters to make this an actual math problem. The result won’t reflect reality exactly, but the orders of magnitude involved are so far apart that I think it will still show which event is more likely to occur.
Assume a 162-game season. (So far, we’re still grounded in reality)
Assume the player gets exactly 4 at-bats per game. (Bye bye reality)
Assume we’re talking about 1 player and trying to decide whether it would be more likely for him to get exactly one hit per game or hit .500 for the season.
Assume this player has a career BA of .300.
The probability of this player getting exactly 1 hit in 4 at-bats is 0.4116 (0.3 * 0.7^3 * 4, since any of the 4 at-bats could be the one with the hit).
The probability of doing this 162 games in a row is 3.508 * 10 ^ -63 (0.4116 ^ 162)
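If you want to reproduce those two numbers yourself, here’s a short Python sketch using the figures assumed above (the variable names are my own):

```python
from math import comb

# Assumptions from above: a .300 hitter, exactly 4 at-bats per game, 162 games
ba, abs_per_game, games = 0.3, 4, 162

# P(exactly 1 hit in 4 at-bats) = C(4,1) * 0.3 * 0.7^3
p_one_hit = comb(abs_per_game, 1) * ba * (1 - ba) ** (abs_per_game - 1)
print(p_one_hit)           # ~0.4116

# P(exactly one hit in every one of the 162 games)
print(p_one_hit ** games)  # roughly 3.5e-63
```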
The probability of hitting at least .500 for the season is 1.596 * 10 ^ -26, which can be found using this formula:
p = the sum of (0.3^n)*(0.7^(648-n))*C(648,n) for all n = 324 to 648, where 648 is the season’s total at-bats (162 games * 4 at-bats), 324 hits is exactly .500, and C(648,n) is 648 choose n.
Hitting .500 is 4.55 * 10 ^ 36 times more likely than getting exactly one hit in every game. (That’s waaaaay more likely for the math impaired.)
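Rather than wading through 325 terms by hand, here’s a quick Python sketch of that sum and the ratio (standard-library math.comb; again, the variable names are mine):

```python
from math import comb

ba = 0.3
total_abs = 162 * 4      # 648 at-bats for the season
needed = total_abs // 2  # 324 hits = exactly .500

# Binomial upper tail: P(at least 324 hits in 648 at-bats at a .300 clip)
p_500 = sum(comb(total_abs, n) * ba ** n * (1 - ba) ** (total_abs - n)
            for n in range(needed, total_abs + 1))
print(p_500)  # roughly 1.6e-26

# Compare with the exactly-one-hit-in-every-game probability from earlier
p_one_hit_every_game = (comb(4, 1) * ba * (1 - ba) ** 3) ** 162
print(p_500 / p_one_hit_every_game)  # roughly 4.55e36
```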
Are we done?
Hell no!
The odds of this .300 hitter putting together a 162-game hitting streak are better than the odds of getting exactly one hit in every game. In fact, they are a lot better:
The chance of getting at least one hit in a given game is 1 - 0.7^4 = 0.7599, and 0.7599 ^ 162 is 4.8144 * 10 ^ -20.
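The same kind of check works for the streak, still using the assumptions above:

```python
ba, abs_per_game, games = 0.3, 4, 162

# P(at least one hit in a game) = 1 - P(going 0-for-4)
p_hit = 1 - (1 - ba) ** abs_per_game
print(p_hit)           # 0.7599

# P(the streak covers all 162 games)
print(p_hit ** games)  # roughly 4.8e-20
```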
This event is more likely than hitting .500.
However, it is important to note that these events overlap: the same season could, for example, include both a 162-game hitting streak and a .500 average.