The reason we can't consider it 1:2 is that Monty's choice of which door to open is not always random. His hand is forced two-thirds of the time, since you have two chances in three of picking a goat. This means that, two-thirds of the time, he is effectively telling you which door contains the prize. This means you are also not choosing randomly.
So basically, it goes back to Nemo’s post above. To put it a different way:
2/3 chance to pick goat. Switch and win.
1/3 chance to pick prize. Switch and lose.
There is no provision made in a probability table for switching. You can only look at the relative odds of every possible choice.
Perhaps in response you can argue that the entire concept of a probability table is therefore flawed, and as far as its ability to capture the subtleties of this hypothetical goes, that might be true - or it might not. I don't know.
What I do know is that when you look at the gross odds for every possible outcome at every stage of the process, the final odds turn out to be 50-50 no matter how you look at it.
I know that my table isn't clear, because you can't lay it out properly in a text-only format. But please note that for Monty's draw, if the contestant chose the prize, I gave a 1/2 probability of Monty picking the first goat and 1/2 of picking the second goat. Further, if the contestant chose one of the goats, then I assigned a probability of 2/2 that Monty would choose the other goat and a probability of 0/2 that Monty would choose the prize.
The problem is that you are ascribing a probability to switching and using a probability tree, while what I'm doing is setting up a table of possible outcomes, all of which are equally likely. The probability tree is the wrong approach to this problem: the 50-50 probability you introduce has nothing to do with the chance of picking the right door after one has been revealed; it's the probability of making either choice if you do so randomly.
Let's say that instead of flipping a coin, you roll a six-sided die, and if it shows 1-4 you stay with the original door; that gives a 1/3 chance you switch. Now the correct probability tree goes:
1/3 chance you picked the right door
1a) 1/3 chance you switch = 1/9 chance you lose
1b) 2/3 chance you don’t switch = 2/9 chance you win
2/3 chance you picked the wrong door
2a) 1/3 chance you switch = 2/9 chance you win
2b) 2/3 chance you don’t switch = 4/9 chance you lose
Sums:
Chance you lose 5/9
Chance you win 4/9
Now let's move on to the probability if you decide to always switch.
1/3 chance you picked the right door
1a) 1/1 chance you switch = 1/3 chance you lose
1b) 0 chance you don’t switch = 0 chance you win
2/3 chance you picked the wrong door
2a) 1/1 chance you switch = 2/3 chance you win
2b) 0 chance you don’t switch = 0 chance you lose
Sums:
Chance you lose 1/3
Chance you win 2/3
And to hammer the point home: the probability if you decide to never switch.
1/3 chance you picked the right door
1a) 1/1 chance you don’t switch = 1/3 chance you win
1b) 0 chance you switch = 0 chance you lose
2/3 chance you picked the wrong door
2a) 1/1 chance you don’t switch = 2/3 chance you lose
2b) 0 chance you switch = 0 chance you win
Sums:
Chance you lose 2/3
Chance you win 1/3
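For what it's worth, the three trees above can also be checked numerically. Here's a rough Monte Carlo sketch (my own illustration; the function name and door encoding are assumptions, not anything from the thread):

```python
import random

def play(switch_prob, trials=100_000, seed=1):
    """Play the game many times; the contestant switches doors
    with probability switch_prob. Returns the observed win rate."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        prize = rng.randrange(3)   # door hiding the prize
        pick = rng.randrange(3)    # contestant's first pick
        # Monty opens a goat door that isn't the contestant's pick
        monty = rng.choice([d for d in (0, 1, 2) if d != pick and d != prize])
        if rng.random() < switch_prob:
            # switch to the one door that is neither picked nor opened
            pick = next(d for d in (0, 1, 2) if d != pick and d != monty)
        wins += pick == prize
    return wins / trials

print(play(1/3))   # roughly 4/9, matching the tree above
print(play(1.0))   # roughly 2/3: always switch
print(play(0.0))   # roughly 1/3: never switch
```

The three printed rates land on the same 4/9, 2/3, and 1/3 that the trees predict.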
That’s a second rigorous mathematical proof that you’re wrong.
My point is that there is a difference between the game where Monty picks first and the game where Monty picks second. Cecil's point in the column is that there's a difference between the game where an ignorant Monty picks (and randomly picks only goats) and the game where a knowledgeable Monty picks (and always picks only goats).
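That difference can be checked with a quick simulation (my own sketch; the function name and door encoding are mine). For an always-switching contestant, a knowledgeable Monty yields a 2/3 win rate, while an ignorant Monty, counting only the rounds where he happens to reveal a goat, yields 1/2:

```python
import random

def switch_win_rate(monty_knows, trials=200_000, seed=3):
    """Win rate for an always-switching contestant. If monty_knows is
    False, Monty opens a random unpicked door, and rounds where he
    accidentally reveals the prize are thrown out."""
    rng = random.Random(seed)
    wins = played = 0
    for _ in range(trials):
        prize = rng.randrange(3)
        pick = rng.randrange(3)
        unpicked = [d for d in (0, 1, 2) if d != pick]
        if monty_knows:
            monty = rng.choice([d for d in unpicked if d != prize])
        else:
            monty = rng.choice(unpicked)
            if monty == prize:   # prize revealed; this round doesn't count
                continue
        # contestant switches to the remaining closed door
        final = next(d for d in (0, 1, 2) if d != pick and d != monty)
        played += 1
        wins += final == prize
    return wins / played

print(switch_win_rate(True))    # about 2/3: knowledgeable Monty
print(switch_win_rate(False))   # about 1/2: ignorant Monty, goat rounds only
```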
Suppose we have a game where you purchase one out of 1,000,000 lottery tickets. Once you do that, the lottery commissioner, who knows which one is the winning ticket, destroys 999,998 tickets which he knows are all losers, leaving only two tickets left, including yours. Do you really think that both tickets have an equal chance of being winners? If you do, I’d like to propose a little game.
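If it helps, here's a small simulation of that lottery setup (my own sketch; the names are mine). The other surviving ticket turns out to be the winner essentially every time:

```python
import random

def other_ticket_wins(n=1_000_000, trials=2_000, seed=7):
    """You buy one of n tickets; the commissioner, who knows the winner,
    destroys n-2 tickets that are all losers. Returns how often the
    other surviving ticket is the winner."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        winner = rng.randrange(n)   # winning ticket number
        yours = rng.randrange(n)    # the ticket you bought
        # The commissioner keeps your ticket plus one more: the winner
        # if you don't hold it, otherwise an arbitrary loser.
        other = winner if yours != winner else (yours + 1) % n
        hits += other == winner
    return hits / trials

print(other_ticket_wins())
```

The rate is (n-1)/n, so with a million tickets the other ticket wins all but about one time in a million.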
Xema has already noted this, so I’ll quote him:
The point of this problem is to compare the odds of winning when you switch against the odds when you don't, not to calculate the overall odds of winning if you switch half the time. Here, let's use the six outcomes from your own probability tree, and label them based on whether the contestant switched or stayed with the original door:
1/6 - picked the prize, switched: switch-and-lose
1/6 - picked the prize, stayed: stay-and-win
1/6 - picked goat 1, switched: switch-and-win
1/6 - picked goat 1, stayed: stay-and-lose
1/6 - picked goat 2, switched: switch-and-win
1/6 - picked goat 2, stayed: stay-and-lose
Adding these up, overall the contestant wins 1/2 of the time (since the choice to switch or stay is 50-50). However, “switch-and-win” occurs 1/3 of the time, and “stay-and-win” occurs 1/6 of the time, meaning the contestant is more likely to win when switching.
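Spelling out that arithmetic as conditional probabilities (a worked check; the variable names are mine):

```python
# Joint probabilities from the coin-flip tree (switch with probability 1/2)
switch_win  = 2/3 * 1/2   # picked a goat, then switched      -> 1/3
switch_lose = 1/3 * 1/2   # picked the prize, then switched   -> 1/6
stay_win    = 1/3 * 1/2   # picked the prize, then stayed     -> 1/6
stay_lose   = 2/3 * 1/2   # picked a goat, then stayed        -> 1/3

overall_win   = switch_win + stay_win                       # 1/2
win_if_switch = switch_win / (switch_win + switch_lose)     # 2/3
win_if_stay   = stay_win / (stay_win + stay_lose)           # 1/3
print(overall_win, win_if_switch, win_if_stay)
```

The 50% overall rate is real, but it's the average of a 2/3 strategy and a 1/3 strategy.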
I don't want to seem like I'm ignoring the thoughtful replies, but I'm a bit tired and just want to try a different approach.
I understand that the chance after the first draw is 1/3. My point is that even before the first draw, you know that one losing choice will be eliminated.
Now I was hesitant to embrace the following concept, but it seems to be the only way to reconcile the result. Specifically, the fact that you know that a losing choice will be eliminated means that your odds from the first draw are not 1/3 and 2/3 but 50-50.
Here's why. From the first draw, there are only 2 possibilities:
1) You pick the prize; Monty shows you a goat; the remaining choice is a goat.
2) You pick a goat; Monty shows you the other goat; the remaining choice is the prize.
It doesn't matter which goat you pick in 2), since Monty is obliged to show you the other goat.
It doesn't matter which goat remains in 1), since Monty has already shown you the other one.
Therefore, your odds were never 1/3 vs 2/3. It only appeared that way because you are calculating the odds without taking into account the effect of having a losing choice revealed - always.
In fact, this situation is no different from letting Monty pick first. Is it?
If Monty goes first and always picks one of the goats, aren't you in precisely the same situation you would be in after the first draw?
And my response is NO. You are given more information when two-thirds of the time his choice is forced. Your choice of whether to switch is not random, while it is if Monty goes first. Your chart does not account for that extra information, and assumes that the choice at the end is 1/2.
Plus there's just the simple fact that I actually wrote a program, and switching always gives better results, at a ratio of 2:1. And, honestly, that wasn't what I was expecting. After I saw it actually happen, all of a sudden I could make sense of all the explanations that had seemed so wrong before.
There's just no way around the experimental evidence. Heck, the experiment is so well known, it's actually used to test animal behavior. Some birds will actually figure it out faster than humans, because they don't let false logic get in the way. If a bird and a human play side by side, the bird will win more often once it finally gets it, while the human wins more only early on, before the bird catches on.
OK, I’m willing to admit I’m deluded if confronted with peer-reviewed scientific evidence. I’m sure you’re a very good programmer, but since I used to be one too, I know how easy it is to make a mistake. So I can’t accept your program as dispositive, but I will accept online citations to authoritative sources (which would NOT include things like wikipedia).
Everyone is quick to emphasize that Monty’s choice gives you additional information. Well, if that’s true (and my claim is that it is not), then it shouldn’t matter whether you get that information before or after you make your own choice. In fact, if it really is valuable, it should be even more so if you have it ahead of time.
The argument really revolves around the idea that there are 3 choices. Initially, that’s true, and if we left things as they were after the contestant chooses a door, there is absolutely no doubt that he would only win 1/3 of the time.
But you know before the first choice that whatever he picks, one of the 2 that remain will be eliminated. Therefore, there are only 2 choices in actual fact - the one the contestant makes and the one left after Monty shows you a goat.
I think this is about as obvious as it could possibly be. The 3rd choice is not only an illusion but a transient one at that.
What you are saying seems to be “there are only 2 possibilities,” and “therefore, your odds were never 1/3 vs 2/3;” in fact, the odds “are not 1/3 and 2/3 but 50-50.”
Why do you think that “there are only 2 possibilities” leads inexorably to the odds being 50-50?
It is different. In one case, Monty is constrained in which door he can pick; in the other case, you are constrained. In both cases, Monty’s pick gives you information you didn’t have before the game started, but the information you receive is different.
No, you are not.
Again, please refer to my previous post, where I point out that according to your own analysis, when Monty picks second, your odds of winning are 2/3 if you switch doors. Is your own analysis incorrect?
To start with the last point, I never said that switching increased your odds to 2/3. If I had, why would I still be arguing about this? Isn’t that the heart of the issue? I did say that 1/2 of that 2/3 went to make up part of the 50% chance of winning. That is not the same thing - well, it may or may not be. I’m not completely sure at this point. I’m glossing over it because my current tack is to argue that there were never 3 choices to begin with.
Monty's choice should tell you even more if he makes it first - shouldn't it? Monty has to show you a goat. This is true whether he goes before you or after. If it's Monty's turn, you get a goat. So if getting a goat from Monty is so informative, why is it less so if he shows you the goat first? It makes no difference, if this is the basis of your argument.
I've already shown with the probability tables the correspondence between choices and odds. In fact, the table doesn't care about how many possibilities there are, only about their weights.
And your position is that two choices necessarily yield a 50-50 chance of winning. But that is incorrect.
Here’s an easy example: You are asked to select a card at random from a fair deck and place it face down to your left. A team of 10 people then each select a card, look at all 10, agree which is the highest, and place it face down to your right. You now get to choose one of the two cards - if it proves to be the higher of the two, you win $100.
You have two choices, and you knew in advance that you would - but I’d hope it’s obvious that the probabilities of each card being a winner are not equal.
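A quick simulation of the card game (my own sketch) confirms that the two cards are far from equally likely to win; the single random card beats the team's best-of-ten only a small fraction of the time:

```python
import random

def your_card_wins(trials=100_000, seed=5):
    """You draw one card; a team draws ten and keeps the highest.
    Returns how often your card beats theirs (ties count as losses)."""
    rng = random.Random(seed)
    wins = 0
    deck = [rank for rank in range(2, 15) for _ in range(4)]  # 2..14, ace high
    for _ in range(trials):
        rng.shuffle(deck)
        yours = deck[0]
        theirs = max(deck[1:11])   # the team's best of ten
        wins += yours > theirs
    return wins / trials

print(your_card_wins())   # well below 0.5
```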
It just works out that way here; I wouldn't make it as a general claim. Even I can see that this is a special case and that usually 2 possibilities would have very different odds.
I'd like to outline what happens if Monty picks first - given that he still always has to pick a door with a goat:
Monty picks   Player picks   Result
goat 1        goat 2         loss
goat 1        prize          win
goat 2        goat 1         loss
goat 2        prize          win
Each scenario is equally likely. The result is that the player wins 50% of the time.
I submit that this sequence is statistically the same as the one where the player picks first. The only difference is that in this case, the illusory nature of the 3rd choice is highlighted.
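For what it's worth, a simulation of this Monty-picks-first variant (my own sketch), where the player then chooses at random between the two remaining doors, does bear out the 50% figure for that particular game:

```python
import random

def monty_first_win_rate(trials=100_000, seed=9):
    """Monty opens a goat door before the player chooses; the player
    then picks at random between the two remaining doors."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        prize = rng.randrange(3)
        monty = rng.choice([d for d in (0, 1, 2) if d != prize])  # a goat door
        pick = rng.choice([d for d in (0, 1, 2) if d != monty])
        wins += pick == prize
    return wins / trials

print(monty_first_win_rate())   # about 0.5 for this variant
```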
So, when you STAY with the original door, you win the prize 1/3 of the time and get the goat 2/3 of the time. When you SWITCH to the other door, you win the prize 2/3 of the time and get the goat 1/3 of the time. So according to your own analysis, it’s better to switch.
If you don’t believe that, then what, in your own analysis is incorrect?
Define “even more.” What Monty’s choice tells you is simply different, depending on when he makes it. There’s no requirement that the order or amount of information you get steadily increase or decrease your odds.
I’ve never included the concept of switching or not in my analysis - which is why you had to add them in yourself. The concept is irrelevant to the construction of a probability tree. How you managed to assign those concepts to each row of the tree, I have no idea.
My argument does not in any way relate to what I know my options will be. What is relevant, and I have stated this a few times, is the fact that one of the choices will be eliminated.
I can’t really adapt that idea to your example in a way that is exactly comparable, but I’ll try to come close.
It would be as if, in your example, you knew that 9 of the 10 cards had to be four 2's, four 3's, and one 4 (assuming that ace is high and not low). In that case, the only real unknown is the value of the 10th card.
And, as a follow-up, perhaps you could explain why you would say
when you’ve “never included the concept of switching or not in [your] analysis.”
I honestly have no idea how you’re approaching this problem, because it looks like you’re saying contradictory things as well as completely missing the point of the whole question. Perhaps the best thing to do would be for you to restate in your own words what, exactly, you think this problem is and what question you’re answering.
They may seem contradictory, but they're not. My focus has been on the probability of winning at the second draw. Since my conclusion is that the odds of winning are even, that precludes the possibility that they are not - if it turns out that I'm right. You can disprove something directly, or you can do so indirectly by proving something that makes what you want to disprove impossible.
However, I have changed my argument over the course of the discussion. I'm not trying to hide that fact, but it may be contributing to the confusion. I started by arguing that the initial odds were 1 in 3 to win, and I accepted the proposition that those odds described the hypothetical until after Monty eliminated one of the choices by opening a losing door.
I’m not sure whether or not that argument was valid but either way, I’m ditching it in favor of the argument that the odds were never 1 in 3 to begin with but 1 in 2. I’m not going to restate that argument here or subsequently since it’s already been made in my most recent posts.
BTW, I have no idea if this argument necessarily has any merit either. To be honest, I constantly flip-flop between thinking I might have some genuine insight here and thinking that I’m completely full of shit and my arguments are pure sophistry. Personally, I think the odds are 50-50, but I might just be saying that out of habit.
Do you think that the concept of switching is irrelevant? If it's irrelevant, why is it explicitly mentioned in the column title? If it's not irrelevant, why are you ignoring it?
There is no “second draw” in the standard Monty Hall problem. This leads me to believe you’ve constructed your own version of the problem which is completely different from the one discussed in the column.
Please state exactly how you think the Monty Hall problem is constructed and exactly what question you’re answering.
And with this, you’ve hit on the fundamental error in your analysis: There is an option to switch, and a probability tree can’t capture that. Therefore, a simple probability tree is not an appropriate model for this problem. There is a more complicated model that does capture what you need here, but it’s not going to give you much insight beyond the correct arguments given by zut, Xema and others.