How many lottery tickets must be bought to assure every combination of numbers?

How many possible combinations again?
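
For a standard pick-six-of-49 game (an assumption on my part, but it matches the 1-in-14-million figure that comes up later in this thread), it’s a one-line computation:

[code]
from math import comb

# Number of distinct 6-of-49 tickets: C(49, 6)
print(comb(49, 6))  # 13983816 -- roughly 1 in 14 million
[/code]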

Just a couple more things I thought of:

First, the succinct way to put my point: If you buy a lottery ticket, you’re going to lose; that’s the expected outcome. If you run the numbers through your statistical equations and they say that the expected value of that ticket is a penny, the outcome is that you’re going to lose. If you run the numbers through your statistical equations and they say that the expected value of the ticket is ten million dollars, the outcome is that you’re going to lose. The “expected value” doesn’t change the outcome. You’re just going to lose.

Now, a way to give the lie to anyone who believes that “expected value” has meaning w.r.t. the lottery: Let’s say that the jackpot in the lottery this week is one million dollars. I’ll make you the following proposition: I’m going to buy one ticket. I put a hundred dollars down on the table and you put a thousand dollars on the table. If my ticket wins (and by “wins” here I mean hits all six numbers), then I get the $1100. Otherwise, you get it.

Would you take that bet? I would hope that any rational person (unless they just couldn’t raise the $1000) would jump at that bet.

The next week, the lottery jackpot is one billion (one thousand million) dollars. I’ll make you the same proposition: I buy one ticket, if it wins I get your thousand dollars, otherwise you get my hundred.

Would you take that bet? Of course not! The “expected value” of my ticket is now a thousand times more than it was last week!

Yeah, right. You’d take the bet, regardless of the lottery prize. Your decision would be based solely on the fact that the odds are overwhelmingly in your favor that my ticket would lose. But when we turn it back around to the decision to buy a ticket or not, suddenly the expected value of the ticket is a significant factor?
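
To put numbers on that: the bet-taker’s expected value depends only on the odds of my ticket winning, and the jackpot never enters into it. A quick sketch (assuming the usual 1-in-13,983,816 odds of a 6/49 game):

[code]
# Side bet: I stake $100, you stake $1000.
# You collect the $1100 unless my single ticket hits all six numbers.
p_ticket = 1 / 13_983_816  # odds of one ticket winning a 6/49 draw

# Your expected profit on the side bet:
ev_taker = 100 * (1 - p_ticket) - 1000 * p_ticket
print(f"{ev_taker:.2f}")  # 100.00 (to the penny) -- no jackpot term anywhere
[/code]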

OK, let’s look at a different extreme. I have a 53-card deck (a regular deck plus one joker), and for the price of $1, I’ll let you pick a card at random. If you pick a heart or a spade, you win $3. Would you play that game? Remember, the most likely outcome here is that you lose one dollar. But the expected value (in the statistical sense) of your winnings is almost a buck fifty.

Now, supposing you would play that game: How unlikely does a win have to be before it’s “effectively zero”? Would you play the game if you had to draw a spade to win, but the prize was $6? What if only the ace of spades is a winner, but it’s worth $78?
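
For what it’s worth, all three versions of the game have exactly the same expected value; only the odds change. A quick check, assuming the 53-card deck above:

[code]
# 53-card deck (52 cards plus one joker); each game costs $1 to play
games = [
    ("heart or spade, $3", 26 / 53, 3),
    ("any spade, $6",      13 / 53, 6),
    ("ace of spades, $78",  1 / 53, 78),
]
for name, p_win, prize in games:
    print(f"{name}: EV = ${p_win * prize:.2f}")
# All three print EV = $1.47; only the odds of a win differ.
[/code]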

You don’t seem to be understanding my point. You’re trying to get me to pin down the line between when expected value is a relevant concept and when it’s not. I don’t know the answer to that, and I’ve never claimed that there is some rule for determining when expected value is reasonable to use.

It still remains, however, that in the real-world example of your typical state lottery, where the odds against winning are so overwhelmingly long in comparison to the number of trials a person can reasonably undertake, it is just incorrect to claim that it’s reasonable to buy a ticket when the prize is over $14 million, and not reasonable to buy a ticket when it’s less than $14 million. The odds completely dwarf the “expected value”; you’re going to lose in either case, so how can one be more reasonable than the other?

And remember, that’s how this sub-discussion got started. Someone asked if it was reasonable to only buy a ticket when the prize was over $14 million. Someone else answered that it was, and I took issue with that claim. I still have seen nothing that supports that specific claim. If you believe that that specific claim is correct, then tell me why, in terms of that specific claim, rather than by analogies in which the relative sizes of the odds and the number of trials are different.

Ok, an answer to your question that might actually help make my point: It depends on how many times I get to play the game. In the first example, if you say I can only play once, I won’t play. Why? Because even though the “expected value” is a dollar fifty, that is a statistical concept that just does not have a real meaning in a single trial. You yourself said that the most likely outcome is that I’ll lose, and in a single trial, the likely outcome is more significant than the expected value.

And if you dispute that, let me ask you the question: If you were on the playing side of that game, would you play if I said you could only play once? Of course you would, because the expected value is greater than the cost of a play! I don’t think so…

But if you say I can play a hundred times, I’d probably play. Why? Because now the statistical concept of expected value has a realistic meaning. The odds are slightly less than fifty/fifty that I’ll win, right? So in a hundred plays, the expectation is that I’ll win, what, 49 or so times? So it is perfectly rational to believe that if I play a hundred times, I’ll win enough times that my overall winnings will exceed my outlay.

But that’s only true because the number of times I’m going to play is a bit less than 50 times the odds of drawing a winning card. In other words, the number of trials is large compared to the odds of winning. And so “expected value” has meaning.
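
If anyone wants to check that, here’s a quick simulation of a hundred plays of the heart-or-spade game (assuming an independent draw from the full 53-card deck each time):

[code]
import random

def net_after_plays(n_plays, p_win=26/53, cost=1, prize=3):
    """Net result of n_plays independent $1 draws at a $3 prize."""
    wins = sum(random.random() < p_win for _ in range(n_plays))
    return wins * prize - n_plays * cost

# How often do 100 plays come out ahead?
trials = 10_000
ahead = sum(net_after_plays(100) > 0 for _ in range(trials))
print(ahead / trials)  # roughly 0.999: a hundred plays almost always profit
[/code]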

But getting back to the lottery, even if you buy a hundred tickets, the number of trials (100) is so incredibly less than the odds of winning (14 million) that you have no realistic expectation of winning on any ticket. And so “expected value” is meaningless. It’s different from your card game because you just can’t play enough times for it to be rational to believe that you’ll win even once, let alone win enough times for “expected value” to come into play.

And again reminding you of how this started: I did say that if you could play the lottery more than 14 million times, it would be reasonable to only play when the prize was more than $14 million. But in the real world, when you’re talking about buying one ticket or even a few hundred tickets or even a few hundred tickets in every lottery from now until you die, your reasonable expectation is that you’re not going to win, ever, and so “expected value” has no rational meaning.
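
And the “you’re not going to win, ever” part is easy to quantify (again assuming 1-in-13,983,816 odds per ticket):

[code]
p = 1 / 13_983_816  # chance that any one ticket wins

# Probability of never winning across n independent tickets
for n in [1, 100, 10_000, 1_000_000]:
    print(f"{n:>9} tickets: P(no win ever) = {(1 - p) ** n:.8f}")
# Even a million tickets leave about a 93% chance of never winning once.
[/code]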

Actually, I have to argue with Q.E.D.'s reply about the odds of someone who buys multiple tickets. I believe the answer is 1 in (the typical odds) MINUS (the number of tickets bought MINUS one). Let me elaborate:

If there are 10 numbers, and you buy 1 ticket, your odds are 1 in 10. If you buy two tickets, then only 1 of those tickets can possibly be the winning ticket. This doesn’t make your odds of winning 1 in 5, it makes it 1 in 9.

Say your two tickets are numbers 1 and 2. We know that the winning number can only be one of those two numbers. We therefore know that your second ticket will not be the winner - so we can eliminate that from the possible picks. So the odds are reduced by the number of tickets you purchased, except for one ticket.

Computer Science 340 Combinatorics may have paid off after all!

Ok, my last post was done in haste, and I think Chronos’ card game is on the road to making my point.

Let’s make the game a little different, though. I have a regular deck of 52 cards. You can pay $1 to buy a “ticket” to play the following game: You draw a card at random. If it’s one of the four aces, I’ll pay you a thousand dollars! If it’s any other card, I keep your dollar.

Now here are the statistics (someone can correct me if I get any of this wrong, but try to focus on my point):

Odds of winning: 1 in 13.
Cost of one play: $1.
Payout of a win: $1000.
Expected value of a ticket: $1000/13 = $76.92

So is that a good game to play? Hell yes! $1 gets you almost $77? You’d be a fool not to play!

But here’s the catch: You can only play ONCE.

Are you still a fool not to play? Or are you a fool TO play? Correct me if I’m wrong, but I’ve yet to hear anyone say that expected value is dependent on the number of plays. So the expected value of your single ticket is still almost $77. It still costs only one dollar. But you only get one play, so your odds of getting that $77 are 1 in 13, period.

So the expected value tells you that it’s a great game. But the odds of a win in your one play tell you that it’s a pretty bad game.

Which is the more relevant statistic, expected value or the odds against winning? Which one do you use in your decision to play or not? If you still say that you use expected value, come on over and I’ll let you have that one play (but I’ll only let one person have one play, of course, because otherwise the situation from my side becomes as in the next paragraph).

Now, as I’ve said before, if I up the allowed number of plays, expected value comes into play. If I let you play, say, 26 times, then expected value becomes more relevant than the odds, because the odds say that you’re likely to win twice in those 26 plays. So now the expected value tells you what those likely two wins will actually earn you. Spend $26 on those plays, win twice, and take home $2,000, which is just $76.92 times 26.
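
The 26-play arithmetic, spelled out (a sketch; I’m assuming the deck is reshuffled between plays, so each draw is an independent 1-in-13 shot):

[code]
p_win, cost, prize, plays = 1/13, 1, 1000, 26

expected_wins = p_win * plays              # 2.0 wins expected
expected_take = expected_wins * prize      # $2000, i.e. $76.92 * 26
p_at_least_one = 1 - (1 - p_win) ** plays  # chance of winning at least once

print(round(expected_take, 2))   # 2000.0
print(round(p_at_least_one, 3))  # 0.875: you'll very likely win at least once
[/code]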

If I could play the lottery 28 million times, I’d agree that the expected value of each ticket would be relevant. But when I can play only a vanishingly few times COMPARED to the odds against winning, the odds are all that’s relevant.

Just to point out, I don’t think this is a very effective example: whether to take that $1100 side bet depends in no way on the expected value of the lottery ticket, only on the odds of winning (which don’t change between the two scenarios).

That being said, I agree with you that the expected value can be a misleading parameter when it comes to deciding whether or not to “take the bet.” It’s not a worthless parameter, understand – it’s still useful in classifying scenarios – but other things like probability of payoff and amount of risk are important, too.

For your $76.92 card game, sure I’d play! If I lose, I’m only out a buck! Pffft; I’ll skip buying a cup of coffee next week. But if I win, $1000 is a nice chunk of change. But I get your point; it’s not only expected value that’s important. Here, let me try to make your point:

As long as we’re throwing out hypotheticals, how about this one: Donald Trump offers a sporting game of dice – one game only! Game goes like this: The Donald puts up $1,000,000. You put up $10,000. You roll a die. If you roll a six, you get the Donald’s cool million. Roll anything else, you lose your $10,000. Do you take the bet? It’s an awesome expected value! But, hmmmm… you’re likely to lose $10G.

What about if the numbers were $1000 and $100,000? Less risk, but less reward. Or $100,000 and $10,000,000? More risk, more reward. Or – oooh, I know! – what if the numbers were $100,000 and $10,000,000, but you won by rolling a one through five, and only lost on a six? You’ve got a great expected value, and a good chance of winning… but what if you lost?
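
Putting rough numbers on those variants (a sketch; “stake” is what you risk, “prize” is what the Donald puts up):

[code]
# (stake, prize, winning faces out of 6)
variants = [
    (10_000,   1_000_000, 1),
    (1_000,      100_000, 1),
    (100_000, 10_000_000, 1),
    (100_000, 10_000_000, 5),  # win on one through five, lose only on a six
]
for stake, prize, ways in variants:
    p = ways / 6
    ev = p * prize - (1 - p) * stake
    print(f"risk ${stake:,}: P(win) = {p:.2f}, expected net = ${ev:,.0f}")
[/code]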

Actually, there’s a famous pseudoparadox relating to using the expected value of a game to decide whether to play.

Basically, you pay to play, and you flip a coin until it comes up heads. I count the number of tails you got, call it n, and give you 2[sup]n[/sup] dollars. So the expected value of a play is 1/2 + 2/4 + 4/8 + …, which is infinite. That means you should be willing to pay an arbitrarily large amount of money to play the game, right?
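
(This is the St. Petersburg paradox. A simulation shows the tension nicely: the expected value is infinite, but sample averages crawl upward very slowly. A sketch:)

[code]
import random

def play():
    """Flip until heads; payout is 2 ** (number of tails)."""
    tails = 0
    while random.random() < 0.5:
        tails += 1
    return 2 ** tails

for n in [100, 10_000, 1_000_000]:
    avg = sum(play() for _ in range(n)) / n
    print(f"average payout over {n:>9} plays: ${avg:.2f}")
# The theoretical mean is infinite, yet the running average grows only
# logarithmically, roughly log2(n)/2 dollars.
[/code]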

I would still most assuredly be a fool not to play that game. The odds do not overwhelm the expectation, since the odds are part of the expectation. And the odds do not overwhelm the prize, because the expectation (which takes into account both the odds and the prize) is greater than the cost.

If the only relevant factor is most likely outcome, how about reversing the card game: In my 53 card deck, you pick a card, and if it’s a heart, spade, or the joker, you win $4. But the cost to play is now $3. You only get one play. You’re probably going to win, right? That is to say, winning is the most likely outcome. Should you play?
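
The numbers behind that reversal (same 53-card deck):

[code]
p_win = 27 / 53  # heart, spade, or the joker
cost, prize = 3, 4

print(f"P(win) = {p_win:.3f}")                        # 0.509: winning IS likely
print(f"EV of prize = ${p_win * prize:.2f}")          # $2.04, under the $3 cost
print(f"expected net = ${p_win * prize - cost:.2f}")  # about -$0.96 per play
[/code]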

Ok, please explain it to me in more detail than that. Let’s get back to my original question: Please explain how the “expected value” has meaning in a single trial. You say that “the expectation is greater than the cost.” I’d like to hear more detail as to how that applies to a single trial.

And you really, really believe that it’s statistically/mathematically wise to play a single trial of a game where your odds of winning are 1 in 13? And your rationale is that, IF you win, you’ll win more than you paid to play? Again, explain that, I don’t get it.

My “proof” that expected value has no real-world meaning in a single trial:

Forget the lottery, forget the card games. I have a coin (proven to be honest, etc.), and I’m going to flip it in a (proven to be) random way.

You can pay $1 to buy a “ticket” in a very simple game: If the coin comes up heads, I pay you $2, if it comes up tails, I keep the dollar you paid for the ticket.

So, clearly, the expected value of a ticket is $1. And the cost of a ticket is also $1.

If you play the game 100 times, what is the expected outcome? You pay $100 for 100 tickets that have an aggregate expected value of $100. So you start with $100 and you end with $100.

Now we’re going to play the game twice. What’s the expected outcome? Still easy. You pay $2 for two tickets with an aggregate expected value of $2. So you start with $2 and you end with $2.

Now we’re going to play just once. What’s the expected outcome? You pay $1 for one ticket with an expected value of $1, so you start with $1 and you end with $1.

Uh, wait a minute. Can someone show me how to play this game exactly once and end up with a dollar? If the coin comes up heads, you’ll end up with two dollars. And if it comes up tails, you’ll end up with zero dollars.

There is no physical way in the real world to play the game once and end up with that “expected value”. It’s an abstract concept that is valid in multiple trials, but meaningless in one trial. Show me how to play once and end up with a dollar, and I’ll retract everything I’ve said.
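
If it helps, here’s that same point as a quick sketch: a single play ends at $0 or $2, never at the $1 “expected value,” while the average over many plays does settle at $1:

[code]
import random

def final_holdings(start=1, cost=1, prize=2):
    """Start with $1, buy the $1 ticket, flip once: what you hold afterward."""
    wealth = start - cost
    if random.random() < 0.5:  # heads
        wealth += prize
    return wealth

print(sorted({final_holdings() for _ in range(1000)}))  # [0, 2] -- never 1

n = 1_000_000
print(sum(final_holdings() for _ in range(n)) / n)  # ~1.0, but only as an average
[/code]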

Ok, so here’s a hypothetical: They’re going to have a one-time super-duper lottery; same deal as usual, hit all six numbers to win, one in 14 million chance of winning. Further, they are going to sell exactly one ticket of any given number combination, so you are guaranteed that if you win, you’ll get the entire prize, you won’t have to share it. They are going to keep the sales open until all 14 million tickets are sold, thus absolutely guaranteeing that there WILL be a winner. The prize will be one trillion (one thousand thousand million) dollars. Each person is allowed to buy one ticket. The cost of the ticket will be your entire net worth, which for the sake of simplicity in your case is, let’s say, one million dollars. So the expected value of your ticket is way, way, WAY over what it’s going to cost you, but it is going to cost you everything you own to buy that ticket.

You’d be a fool not to play, right?

Right? You said it yourself, right up there, you would still “most assuredly be a fool not to play”, your words.

You ARE going to lose, the odds say so overwhelmingly, but since the expected value of your one ticket is so incredibly high, you’d be a fool not to play. Right?

Once again, please explain this to me. My logic and math say that you’d be the biggest fool in the world to risk your entire worth on a single one in 14 million shot. But your expected value tells you that you’d be a fool NOT to. I don’t get it.

Ok, so I should have done the math before my last post. So say the prize is one million trillion (one thousand thousand thousand thousand million) dollars. By my calculations, a one million dollar ticket has an expected value of a bit over $71 billion.

So NOW you’d be a fool not to play.

First of all, a one trillion dollar prize, at odds of 1 in 14 million, has an expectation of only $71428.57, so no matter how you slice that one, a megabuck for a ticket for such a lottery is a ripoff.

But there’s another problem, here, that of the actual value of money. One trillion dollars is not worth a million times as much as one million dollars, to me (and, I suspect, to most folks). I could buy anything I wanted with a trillion bucks, but then, I could probably buy anything I wanted with a billion, and most of what I wanted with a few million. On the other hand, my entire net worth (however much that might be) is worth an awful lot to me, in so far as I’d be a lot less happy if I lost it. So in terms of actual value to me, the expectation for such a lottery is not good.

However, this is not an issue (or at least, not much of an issue) for smaller amounts of money. To a very good approximation, $1000 is worth a thousand times as much to me as is $1. Losing the dollar wouldn’t break me, and I could just add the $1000 to my bank account. So in that case, it is a good bet.

And in the coin-flipping example, yes, the expected value of a single play of that game is one dollar. What would you argue the expected value should be? Any other number one might try to claim as the expected value would be even worse.

There are so many things going on in this thread that it’s hard to know where to start. It seems to me that there are at least a couple factors that are confusing the issue.

One complicated issue here is the perceived diminishing marginal utility of money. Basically, the more money a person has, the less importance he tends to place on each additional dollar he gets. This makes a good deal of sense–with the first $20000 you make, you can get food and shelter, which most people perceive as pretty important. With the next $20000 you might do things which are less critical to survival (entertainment, etc.) and hence have lower utility. It’s pretty difficult to be quantitative with the idea of “utility,” and different people have different utility functions, but for most people there’s definitely a fairly rapid decrease in the utility of additional money.
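
For a rough illustration of how this changes the math (log utility is one conventional choice here, my assumption, not anything anyone in this thread has committed to), consider Roadfood’s trillion-dollar wager in utility terms:

[code]
import math

# Bernoulli's classic fix: value money by log(wealth) rather than wealth.
wealth = 1_000_000   # your entire net worth, per the super-duper lottery
prize  = 10**12      # the trillion-dollar jackpot
p      = 1 / 14_000_000

# Expected log-wealth if you bet everything on one ticket (keep $1 behind
# so the logarithm is defined -- a stand-in for "ruined"):
eu_bet  = p * math.log(prize) + (1 - p) * math.log(1)
eu_keep = math.log(wealth)

print(eu_bet, eu_keep)  # ~0.000002 vs ~13.8: a terrible bet in utility terms
[/code]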

This is part of the problem with bets like Roadfood’s super-duper lottery wager above, where the ticket price is your entire net worth: most people place a much higher utility on their current holdings, from $0 up to $N, than on future additions to those assets, from $(N+1) up to some much larger number.

(Another problem with this specific proposal, which makes me wonder if Roadfood understands what “expected value” actually means, is that these numbers don’t add up. A prize of 1 trillion dollars ($1E12) with a 1/14E6 chance of winning gives a ticket expected value of $1E12/14E6 = $71429, well under his stated ticket price of $1E6. This would be a foolish bet even in the absence of diminishing marginal utility.) [On preview I see this has been corrected.]

This diminishing marginal utility makes itself felt most in large wagers (those with a large buy-in or a large payout or both). In principle you should be able to reduce this effect by dividing both ticket price and payout by some large number to make both values small relative to your current total assets. (For example, dividing the above [corrected] wager by 1E12: would you buy a $0.000001 ticket with a 1 in 14E6 chance of winning $1E6? That’s a pretty high payout, and the cost is vanishingly small.)
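
Concretely, for the divided-down version (a quick check):

[code]
p = 1 / 14_000_000
cost, prize = 0.000001, 1_000_000  # the wager above, divided through by 1E12

print(f"EV = ${p * prize:.4f} per ticket vs. cost ${cost:.6f}")
# EV = $0.0714 per ticket vs. cost $0.000001 -- over 70,000 times the price
[/code]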

Another issue which Roadfood might be trying to get at is a practical issue. The ticket price is not the only “cost” of buying a lottery ticket; there’s also a cost in time and effort (in actually purchasing the ticket, not losing it, checking the numbers). These are pretty large hassles (relative to the $1 price)–maybe they double or triple your effective ticket price.

So, while I don’t agree with Roadfood’s unconditional statement that for a low-odds game only the odds of winning, not the expected value of winning, matters, there is a level of abstraction here (in equating money with abstract utility or value), common in dealing with abstract probability problems, that not everyone in the conversation may be using.

Um, Roadfood, no offense meant, but after reading this I lean back to the doesn’t-understand-expected-value camp. “Expected value” is a technical term. It does not mean the “value” you’re “expected” to have, though it’s true that in a large enough number of trials with a reasonable game these two concepts converge. The fact that you can’t end up with your “expected value” doesn’t invalidate the concept at all.

Yeah, I realized my mistake and fixed the numbers.

But you’re just dodging the issue here. I mean, you’re not saying that the relative values that the money amounts have to you change the mathematically derived “expected value”, are you? The expected value is what it is.

When you said that you’d be a fool not to play the one-time card game, I assumed that you were basing that decision strictly on your understanding of the odds and the expected value. I tried to make an extreme example to see if that decision would stand up to a real-world situation involving a lot of money. Rather than addressing that directly and telling me why the high expected value would make you a fool not to play the super-duper lottery, you tell me about the value of money to you. That’s a distraction from the real question, which I’ll ask one more time as plainly as I can:

How does the concept of expected value have meaning in a single trial?

You missed the point entirely. Of course the expected value is a dollar. But you can’t actually GET that expected value in a single trial. Expected value is an abstract concept, that was the point.

No, I’m not trying to get at any of that. Factor in whatever other costs you want, in the end you still have some cost for the ticket, some odds against winning, and some computed expected value of the ticket. And, one more time:

How does expected value have meaning in a single trial?

Thank you. Thank you, thank you, thank you!

THAT’S the point I’ve been trying to make! You call it a “technical” term, I call it an “abstract” term, but we’re saying the same thing, and I really DO understand what it means, thank you. It doesn’t, as you say, mean the value you’re expected to have. What it DOES mean, again as you say, is what you can reasonably expect in a large number of trials.

If expected value is NOT, as you say, the value you’re expected to have, then how does it have meaning in a single trial?

So, I believe that this at least partly supports what I’ve been saying. Why do you even care if losing the dollar would hurt or not? If the expected value is the most important determinant in your decision, the cost of losing the dollar should be irrelevant.

Ok, so the super-duper lottery was over the top and lost the point. Let’s get back to the real world, back to my card game. Would losing $10,000 hurt you? I’ll set up the card game (52 cards, draw one of the aces, you win, any other card you lose) as follows: $10,000 to play, $500,000 if you win. Is $500,000 worth approximately 50 times $10,000 to you? If not, then pick some other numbers that will work for you, and try again to just focus on my point.

The odds are 1 in 13 that you’ll draw an ace.
The prize is $500,000.
The expected value, then, is $38,461. That’s almost four times the price of a ticket.

Same offer: you can play once. Are you still a fool to not play? Again, if you say yes, EXPLAIN IT TO ME! How does risking $10,000 at a 1 in 13 chance to win $500,000 make any sense at all? The odds say you’ll lose that one try, so what does expected value mean in the case of a single trial?
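
The one-shot numbers, laid out (a sketch):

[code]
p_win, cost, prize = 1/13, 10_000, 500_000

print(f"expected net = ${p_win * prize - cost:,.0f}")  # +$28,462 on average
print(f"P(lose ${cost:,}) = {1 - p_win:.3f}")          # 0.923, or 12 in 13
[/code]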

You and Omphaloskeptic keep saying that I don’t really understand what expected value means, and yet you haven’t tried to explain it or answer my simple question of how it has meaning in a single trial.

I’m with Roadfood on this one. The expected value is a poor predictor for a single trial. If you disagree, is there any amount that’s too much for a chance to play the game in post 89?