Sorry I haven’t been back here before now. I’ve got a few ideas to write (some of which have already been touched on in various forms), so I’ll go ahead and spell them out and see what people agree or disagree with.
Admittedly, some of this stuff I’d never given much thought to until this discussion. I mean, I knew about expected values and that sort of thing going in, but I’d never thought about basing lottery “strategies” (i.e., whether to play or not) on them before.
Anyway, you’ve made some good points, Roadfood. Here are some thoughts that occur to me:
First, consider a game that costs, say, $1 to play. Initially, let’s say this game offers a 100% chance of winning a net of $1 (you pay $1 and immediately get $2 back). You’d be a fool not to play. The expected value of your net winnings in this game is $1.
Now alter the game. You pay $1 and have a 50% chance of winning $4 back. Do you play? What if instead it’s a 1% chance of winning $200?
Or, in general, what if the $1 game gives you a 1/n chance of winning $2n back (where n is some real number >= 1)?
The relevant feature of all these games is that they each give an expectation of winning a net of $1.
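Just to make that pattern concrete, here’s a quick sketch (Python is just my choice for illustration, and the function name is my own) of the net expectation, done with exact fractions for integer n:

```python
from fractions import Fraction

# Each game costs $1 and pays back $2n with probability 1/n, so the
# net expectation is (1/n) * 2n - 1 = 2 - 1 = $1, whatever n is.
def net_expected_value(n):
    cost = Fraction(1)
    win_probability = Fraction(1, n)
    payout = 2 * n
    return win_probability * payout - cost

# The sure-thing game, the 50% game, the 1% game, and the 1/googol game:
for n in (1, 2, 100, 10**100):
    assert net_expected_value(n) == 1  # always a net $1 expectation
```

The expectation is identical no matter how you slide n around; only the shape of the gamble changes.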
In particular, would you pay $1 to have, say, a 1/googol chance of winning two googol dollars? I certainly wouldn’t; my chance of winning is so small that I’m virtually guaranteed to lose my dollar, even though the expectation of the game is that I would actually win one dollar.
So here we have a class of games in which the expected value of winning is the same ($1) for every game, yet it certainly seems that some games are good deals, while others are bad.
This raises the question: At what point do these games switch from being good deals into bad deals? How large does the probability of winning have to be before it’s a good decision to play?
I’ll say a bit more on this later.
Another (extreme) example, similar to one Roadfood mentioned earlier:
Two people, A and B, play a game. We’ll say that B has only a 1/googol chance of winning, while A wins otherwise (there are no ties).
When A wins, B pays $1,000,000 to A.
When B wins, A pays one googolplex dollars to B.
If you played this game, would you rather be A or B? The expected value of winning is overwhelmingly in B’s favor. On the other hand, the odds of winning, period, are overwhelmingly in A’s favor.
I would certainly play in A’s position, in spite of the expected values. Even playing a lifetime’s worth of games, it’s doubtful in the extreme that I would ever lose a single game.
On the other hand, if A and B were immortal beings, playing the game over and over again throughout eternity, it’s clear that player B will kick the shit out of A in the long run, provided B has enough funds to survive the initial drought of wins he will almost certainly experience when they begin playing.
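One way to see both sides at once is to compare B’s expected value per game with B’s chance of ever winning during a lifetime of play. The numbers below are deliberately scaled-down stand-ins of my own choosing (a one-in-a-million win chance and a $10^30 prize, since a googolplex won’t fit in any machine number), but the shape of the result is the same:

```python
# Stand-in numbers, scaled down from the googol version of the game:
p_b_wins = 1e-6              # B's chance of winning one game (not 1/googol)
a_prize = 1_000_000          # what B pays A when A wins
b_prize = 10**30             # what A pays B when B wins (not a googolplex)

# Per-game expected value from B's point of view: hugely positive.
ev_for_b = p_b_wins * b_prize - (1 - p_b_wins) * a_prize
assert ev_for_b > 0

# Yet over a lifetime of, say, 10,000 games, B almost surely never
# wins even once: P(at least one win in k games) = 1 - (1 - p)**k.
lifetime_games = 10_000
p_b_ever_wins = 1 - (1 - p_b_wins) ** lifetime_games
print(p_b_ever_wins)         # about 0.01: A wins every game ~99% of the time
```

Scale those stand-ins back up to googol territory and the split only gets more extreme: B’s expectation grows beyond comprehension while B’s chance of ever seeing a win shrinks to effectively zero.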
Now back to the earlier question regarding the class of games I described earlier:
At what point do these games switch from being good deals into bad deals? How large does the probability of winning have to be before it’s a good decision to play? (Or maybe there isn’t a single “point” dividing good games from bad; maybe a better description is that the expected value becomes less and less relevant as the probability of winning decreases.)
I’ve tried to illustrate what I think are two of the main factors in answering this question, time and funds, by giving the example of the game between A and B:
Any game from that class (in which you have an expected net win of $1) is worth playing when 1. you have enough time to play the game often enough to give yourself a “reasonable” chance of winning occasionally, and 2. you have enough funds to cover the cost of all those games. (I’m going to leave “reasonable” undefined.)
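Condition 1 can be made a bit more quantitative. If you win a single game with probability p, the chance of winning at least once in k plays is 1 − (1 − p)^k, so you can solve for the number of plays needed to reach some “reasonable” target chance. A sketch (the function name and the 50% default target are my own choices):

```python
import math

# Number of plays k needed for P(at least one win) to reach `target`:
#   1 - (1 - p)**k >= target  =>  k >= log(1 - target) / log(1 - p).
# math.log1p(-p) keeps the denominator accurate even for tiny p.
def plays_needed(p, target=0.5):
    return math.ceil(math.log(1 - target) / math.log1p(-p))

print(plays_needed(1 / 2))      # 1 play for the 50% game
print(plays_needed(1 / 100))    # 69 plays for the 1% game
print(plays_needed(1e-6))       # 693147 plays for the one-in-a-million game
# For the 1/googol game the answer is on the order of 10**100 plays;
# hence the "enough time" and "enough funds" conditions above.
```

Roughly, halving your chance of winning doubles the time (and dollars) you need before a win becomes even a coin-flip proposition.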
In this sense, an immortal being with unlimited funds (who, for whatever reason, likes to win money anyway) would consider any of the games from that class to be a “good” game.
I understand that the immortal being is only a theoretical concern, however. The initial question was whether or not it’s a good deal to play a single game. At this point I would claim it’s a good deal when the individual considering the game thinks there is a “reasonable” chance of winning that single game. (I’ll assume paying the single dollar won’t be a problem here.) A 100% chance of winning? Obviously a good deal. A 1/googol chance of winning? Obviously a bad deal (to me).
As I mentioned, I’m going to have to leave “reasonable” undefined here. I don’t think there are going to be any hard and fast rules by which you can define it; it is in large part up to the individual considering the game. Still, as I’ve mentioned before, it would be a function of (among possibly other things) how well you can afford that single game, and how much benefit the winnings would give you.
Ultimately, I’m not saying that the expected value is necessarily irrelevant when considering a single game, but I am saying that it becomes less relevant as the chance of winning the game decreases.
Anyone agree or disagree with any particular points?