How 50/50 is actually 2 in 3 odds

In one of the SD books (forget which one), there was a long column about choosing one of three doors to find a prize. You choose a door, and the host eliminates one door. He then gives you the option to switch between the remaining two doors. And instead of 50-50 odds, your chances are now much better if you switch.

There were a lot of other examples of this, but I still can’t understand it.

This being the SDMB, I figure that it’s been talked about for people like me already, and I really, really want to understand how you can have two options and have better odds with one than the other. GAH, I can’t seem to remember the details.

Someone please explain this logic to me.

Here’s the handy link to the original column:
On “Let’s Make a Deal,” you pick Door #1. Monty opens Door #2–no prize. Do you stay with Door #1 or switch to #3?
And a recent thread on the issue: The BEST and FINAL answer on the MONTY HALL PARADOX!

You should definitely see the aforementioned threads for the answer to the Monty Hall problem. However, I will comment on this.

Having two options does not necessarily imply that your expected gain from each option is the same. Suppose I give you the choice of shooting yourself in the head or not shooting yourself in the head. Let’s say that you “win” by not dying. Clearly, one of the options gives you a better chance of winning than the other.

This example isn’t exactly the same as the Monty Hall problem, because the sample space doesn’t change. But the principle above is applicable to many problems.

in my opinion, this question is entirely moot. a game show is an awful example for working out odds, because of a HUGE outside factor: human strategy. monty’s strategy means the problem won’t always have the same answer. other than that, both the previous thread and the original post explain how the odds are 2/3 in a problem without any outside factors. now how about we just let this question be?

jon from denver

Er, yes, that’s why we have game theory. This is a game-theoretic problem.

That’s why we make assumptions about Monty’s strategy, and work out the odds for every possible strategy he might have. The so-called “Monty Hall problem” deals with one of those strategies.

Those who don’t understand have a right to ask. If you’re tired of explaining it, leave it to those of us who aren’t.

ultrafilter,
you are totally right. this is a game-theory problem, which is what cecil ultimately approached it as. what i’m getting tired of, though, is everyone who approaches it as a pure probability problem. every thread i’ve seen on the subject likens it to the “two children, one is a girl” problem, which has no relation to game theory. this might actually be an interesting topic to pursue as game theory, so please direct us to links that pertain to that aspect of the problem, or put some of your own thoughts in this thread.

Crank the numbers up a bit and it becomes more obvious. So you play “Let’s Make a Deal” and Monty asks you to pick a door. One door contains a wonderful prize. The others contain useless garbage. The only thing is that there are one billion doors and you must pick one. Let’s say you pick door number 5,947,034. Monty knows if this is a winner or not, but you don’t. Monty now exposes the contents of 999,999,998 other doors that he knows contain garbage…he will never expose the grand prize. This leaves, say, door 2,634,001 with its contents still concealed along with your original pick of door 5,947,034, whose contents are also still concealed.

Now, Monty asks if you would like to change doors. If you switch you change your odds from one in a billion to one in two.

My thoughts were that it’s a game theory problem. Here are some more of my thoughts: Game theory problems often involve the solution of related probabilistic problems. Once you’ve specified Monty’s strategy, figuring out your expected gain from a particular strategy is a probabilistic problem. The comparisons to the “two children problem” are, as you say, not entirely valid. But the two problems look similar to those who aren’t familiar with the subject.

To get back to the OP, the 2/3 probability of winning when you switch is based on the assumption that Monty will open a door. If he doesn’t open a door but allows you to switch, there’s no difference (in terms of expected gains) between switching and not switching.

That’s not entirely correct. What I should have said is that, in the scenario where Monty doesn’t open a door, you have three strategies: stick with your current door, switch to door A, or switch to door B. All choices have the same expected payoff.
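One way to convince yourself of this is to just enumerate the nine equally likely (prize, pick) pairs. Here’s a quick Python sketch; the strategy names are my own labels for the three choices described above.

```python
from fractions import Fraction

# If Monty opens no door, a "strategy" is just which door you end up
# holding: your original pick, or one of the two other doors.
def final_door(pick, strategy):
    others = [d for d in "ABC" if d != pick]
    return {"stay": pick, "switch_a": others[0], "switch_b": others[1]}[strategy]

for strategy in ("stay", "switch_a", "switch_b"):
    # Prize location and first pick are independent and uniform, so the
    # nine (prize, pick) pairs each carry probability 1/9.
    win = sum(Fraction(1, 9)
              for prize in "ABC" for pick in "ABC"
              if final_door(pick, strategy) == prize)
    print(strategy, win)
```

All three strategies come out to a 1/3 chance of winning, which is the “no difference” claim above.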

Welcome to the SDMB, Evno, and thank you for posting your comment.

Please include a link to Cecil’s column if it’s on the Straight Dope web site. To include a link, it can be as simple as pasting the web page address into your post (make sure there is a space before and after the URL).

Cecil’s column can be found on-line at the link provided by SmackFu.


moderator, «Comments on Cecil’s Columns»

No; the odds change from one in a billion to 999,999,999 in a billion (for the same reasons that the odds in the three-door Monty Hall problem change to 2 in 3).
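A billion doors is too many to simulate directly, but the same effect shows up at any scale. Here’s a rough Python simulation with 1,000 doors; the function name, trial count, and seed are my own choices.

```python
import random

def play(n_doors, switch, trials=100_000, seed=1):
    """Many-door Monty Hall: Monty opens every door except your pick
    and one other, never revealing the prize."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        prize = rng.randrange(n_doors)
        pick = rng.randrange(n_doors)
        if pick == prize:
            # You hold the prize; Monty leaves a random goat door closed.
            other = rng.choice([d for d in range(n_doors)
                                if d != pick])
        else:
            # Monty is forced to leave the prize door closed.
            other = prize
        final = other if switch else pick
        wins += (final == prize)
    return wins / trials

print(play(1000, switch=True))   # close to 999/1000
print(play(1000, switch=False))  # close to 1/1000
```

Switching wins whenever your first pick was wrong, which with 1,000 doors is 999 times out of 1,000 — the same logic that gives 2/3 with three doors.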

I understand the linking rules, but the reason I didn’t is that I couldn’t remember any of the key words in the column. Next time, superfriends!

The main point, of course, that people miss in this problem is that odds/probabilities can change when you get more information.

I have a playing card, face down, and I ask you to guess what it is. You guess something, say the eight of clubs, and have a 1 in 52 chance of being correct. Then Something Happens – like, I tell you the card is a diamond. Or I turn the card face up and ask if you want to change your guess. OK, somewhat obvious, but the point is that the probability has changed, and your best strategy is usually to change your guess.
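The card example can be made concrete with simple counting: conditioning on the new information just means throwing away every card inconsistent with it. A small Python sketch (the deck encoding is my own):

```python
from fractions import Fraction

# Build a 52-card deck as (rank, suit) pairs; suits S, H, D, C.
deck = [(rank, suit) for suit in "SHDC" for rank in range(1, 14)]
guess = (8, "C")  # your original guess: the eight of clubs

p_before = Fraction(1, len(deck))  # 1/52 before any information

# Something Happens: you learn the card is a diamond.  Keep only the
# worlds consistent with that information.
diamonds = [card for card in deck if card[1] == "D"]
p_stay = Fraction(sum(card == guess for card in diamonds), len(diamonds))
p_switch = Fraction(1, len(diamonds))  # any single diamond guess

print(p_before, p_stay, p_switch)  # 1/52, 0, 1/13
```

Sticking with the eight of clubs now wins with probability 0, while switching to any diamond wins 1/13 of the time — the information changed the odds, so the best strategy changed too.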

so it looks like we’ve all learned a valuable lesson: change is good. in most of the scenarios that have been presented, the odds seem to favor changing your choice after more information has been gained. we also know that we could change around the situation to possibly make the odds go the other way. it seems that it’s all about the elements of the scenario, just like most things are.

wow! Probability sure works some neat tricks. Here’s how I proved it to myself. With the following rules

  1. There are 3 doors, A, B and C. A has $ behind it and B and C have ¢ behind them.
  2. No matter what door you first choose Monty will then open one of the other two doors revealing ¢
  3. You then have the choice among the closed doors

Now it starts off with a 1/3 chance at each door. I’ll put each chance in parentheses and try to make a flow chart.

Your choice:
  A (1/3)
    Monty opens B (1/6)
      stay with A (1/12) or switch to C (1/12)
    Monty opens C (1/6)
      stay with A (1/12) or switch to B (1/12)
  B (1/3)
    Monty opens C (1/3)
      switch to A (1/6) or stay with B (1/6)
  C (1/3)
    Monty opens B (1/3)
      switch to A (1/6) or stay with C (1/6)

We now have eight fractions; taking them in order, we assign each to one of four outcomes.
Stay and win, switch and lose, stay and win, switch and lose, switch and win, stay and lose, switch and win, stay and lose
Ok, now add up the odds. There are 4 outcomes.
Switch and win: (1/6) + (1/6) = 2/6 = 1/3 ~ 33.3%
Switch and lose: (1/12) + (1/12) = 2/12 = 1/6 ~ 16.7%
Stay and win: (1/12) + (1/12) = 2/12 = 1/6 ~ 16.7%
Stay and lose: (1/6) + (1/6) = 2/6 = 1/3 ~ 33.3%

So if you switch, you will win twice as often as you lose. If you stay, you will lose twice as often as you win.
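The arithmetic above can be checked mechanically. Here’s a Python sketch that rebuilds the same eight leaves, under the chart’s assumptions: prize fixed behind A, first pick uniform, Monty choosing uniformly among his legal goat doors, and the final stay/switch decision flipped at 1/2 each.

```python
from collections import defaultdict
from fractions import Fraction

PRIZE = "A"  # as in the flow chart: $ behind A, ¢ behind B and C
totals = defaultdict(Fraction)

for pick in "ABC":
    # Monty may open any door that is neither your pick nor the prize.
    monty_options = [d for d in "ABC" if d not in (pick, PRIZE)]
    for opened in monty_options:
        for action in ("stay", "switch"):
            final = pick if action == "stay" else \
                    next(d for d in "ABC" if d not in (pick, opened))
            # leaf probability: pick (1/3) x Monty's choice x coin flip (1/2)
            p = Fraction(1, 3) * Fraction(1, len(monty_options)) * Fraction(1, 2)
            outcome = f"{action} and {'win' if final == PRIZE else 'lose'}"
            totals[outcome] += p

for outcome, p in sorted(totals.items()):
    print(outcome, p)
```

The four totals come out to 1/3, 1/6, 1/6, and 1/3, matching the sums in the post above exactly.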

And winning twice as often as you lose is always a good thing. The other, bad.

Although this is a moot point since, as Cecil said

damn that Cecil is one smart fellah