According to the story, the casino handled about $600 million in bets, and its results deviated from expectation by about 5% of that, or $30 million.
We can get a feel for the likelihood of this by making some simplifying assumptions: (1) The only game the casino offers is a wager on a fair coin flip, with a fair payout; therefore, the casino’s expected profit is zero; and (2) Every gambler bets the same amount of money on each flip.
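This simplified model is easy to simulate directly. The sketch below is a minimal illustration (the bet count, bet size, and number of trial quarters are arbitrary choices for the demo, not figures from the story): it confirms that the casino's expected profit is zero and shows the scale of the random swings around it.

```python
import random

def simulate_quarter(n_bets: int, bet_size: float, rng: random.Random) -> float:
    """Casino profit over one quarter: n_bets fair coin-flip bets of bet_size each."""
    wins = sum(rng.random() < 0.5 for _ in range(n_bets))  # flips the casino wins
    losses = n_bets - wins
    return bet_size * (wins - losses)  # casino collects b per win, pays b per loss

rng = random.Random(42)  # seeded so the demo is reproducible
profits = [simulate_quarter(10_000, 1.0, rng) for _ in range(500)]

mean = sum(profits) / len(profits)
sd = (sum((p - mean) ** 2 for p in profits) / len(profits)) ** 0.5
print(f"mean profit ~ {mean:.1f}, SD of profit ~ {sd:.1f}")
```

With 10,000 one-dollar bets per quarter, the mean profit comes out near zero and the standard deviation of profit near $100, i.e. b·√n (equivalently, 2b times the √(n/4) standard deviation of the win count).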
Let b = the dollar amount of each bet, and n = the number of bets in the quarter during which the casino lost money; then n = $600 million divided by b. We can then ask: for various values of b, what is the likelihood that profit deviates from its expected value of zero by at least $30 million?
It should be obvious that the likelihood rises as the bet size rises and the total number of bets falls. Mathematically, the standard deviation of the number of coin flips the casino wins is (n/4)^0.5, and each win beyond the expected n/2 swings profit by 2b, since the casino collects b rather than paying b. A deviation from expected profit of $30 million therefore requires a deviation in wins of 15 million divided by b, which works out to 30 million / (b × n^0.5) standard deviations.
If b = 1, then n = 600 million, and we need a result about 1,225 standard deviations away from expectations. This is effectively an unobservable event. Likewise with any b up to 100,000, at which point a loss of $30 million is still 3.87 standard deviations away from normal, roughly a 1-in-18,000 event in one tail.
Finally, with b = 1,000,000, we are only 1.22 standard deviations away from normal, a result we will achieve (in one tail) about 11% of the time. Not common, but hardly far-fetched. With b = 10,000,000, we are a mere 0.39 standard deviations away from normal, which will be achieved about 35% of the time.
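The arithmetic for all of these cases can be packaged into a few lines. In this sketch, the dollar figures come from the story; the normal approximation to the binomial, and the particular bet sizes tried, are the assumptions. For each candidate bet size it prints the z-score of a $30 million deviation and its one-tailed probability.

```python
import math

TOTAL_WAGERED = 600_000_000  # ~$600M in bets over the quarter
DEVIATION = 30_000_000       # the ~$30M deviation from expected profit

def loss_odds(bet_size: float) -> tuple[float, float]:
    """z-score and one-tailed normal probability of a $30M profit deviation."""
    n = TOTAL_WAGERED / bet_size              # number of bets at this size
    sd_profit = bet_size * math.sqrt(n)       # SD of profit: each bet is +/- b
    z = DEVIATION / sd_profit
    tail = 0.5 * math.erfc(z / math.sqrt(2))  # P(Z > z) for a standard normal
    return z, tail

for b in (1, 100_000, 1_000_000, 10_000_000):
    z, tail = loss_odds(b)
    print(f"b = ${b:>10,}: z = {z:8.2f}, one-tail probability = {tail:.3g}")
```

Note that the standard deviation of profit, b·√n, shrinks relative to the fixed $30 million target as b falls and n rises, which is the whole story in one line of code.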
Conclusion: If the accounting and the games are honest, the casino must have been victimized by an unlucky, but hardly unimaginable, run of winning bets in the million-dollar range, or larger.