Can somebody explain the two envelope paradox to me

But that’s a very different game. And in your scenario, you know exactly what game you’re playing.

Buck’s point is that in the two envelope scenario, in a real sense, you don’t know what game you’re playing.

If I find $200 in the envelope, this means I’m either playing a $100/$200 game, or a $200/$400 game. In order to decide whether to switch, I have to have an idea what the chances are that I’m playing each of these games. (If the probability is high enough that I’m playing the 200/400 game, I should switch. If the probability is high enough that I’m playing the 100/200 game, I should keep my money.)

But I have no information helping me determine which game I’m probably playing. So I have no basis from which to reason about whether to switch or not.

If you know that upper bound, then you have a way to work out some probabilities once you open the first envelope. For at least some values found in that first envelope, you’ve got a basis from which to reason about whether to switch or not.

But if you don’t know that upper bound, this basis is taken away. You will have no idea what the chances are that the value you see in the first envelope is the lower value and what the chances are that it’s the higher value. Since you need to have an idea about these probabilities in order to reason about whether to switch, you will have no basis from which to reason about whether to switch.

You are mixing prior and posterior probabilities. When you are first presented with two envelopes of money, it is true that you have a 50% chance of picking the higher amount and a 50% chance of picking the smaller amount. However, once you pick an envelope, you cannot use the 50/50 probabilities to determine the expected value of the envelope you didn’t pick. You cannot assume that the EV of the other envelope is (0.5)(2X) + (0.5)(0.5X), because you do not know what the probability distribution of the values is relative to your envelope. So you cannot say that the probabilities are 50/50. In fact, since you don’t know the value of your envelope, nor do you know the distribution (the starting values of the two envelopes), you can’t determine the posterior probability at all after you have selected one envelope.
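To make that concrete, here is a minimal Python sketch (mine, not from the thread, and the uniform prior in it is entirely made up for illustration): once you commit to *some* prior over the smaller amount, the posterior after opening an envelope becomes computable. Without a prior, there is nothing to compute.

```python
def posterior_holding_smaller(observed, prior):
    """P(we hold the smaller amount | observed value), given a prior
    weight over the smaller amount s of the pair (s, 2s)."""
    w_smaller = prior(observed)       # weight for the pair (observed, 2*observed)
    w_larger = prior(observed / 2)    # weight for the pair (observed/2, observed)
    # Either envelope of a given pair is equally likely to be picked,
    # so those 1/2 factors cancel in the ratio.
    return w_smaller / (w_smaller + w_larger)

# Purely illustrative prior: smaller amount uniform on (0, 100].
uniform_prior = lambda s: 1.0 if 0 < s <= 100 else 0.0

# 0.5: EV of switching is 0.5*$80 + 0.5*$20 = $50 > $40, so switch.
print(posterior_holding_smaller(40, uniform_prior))
# 0.0: $150 can only be the larger amount under this prior, so keep.
print(posterior_holding_smaller(150, uniform_prior))
```

Different priors give different answers for the same observed $40, which is exactly the point: the 50/50 posterior is an assumption, not a given.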

This is similar to the Monty Hall problem in the sense that the posterior probabilities are different than the prior probabilities (although in the Monty Hall problem, you can actually calculate the posterior probabilities).

This would be an excellent opportunity for someone to explain, in clear layman’s terms, the notions of “prior probability” and “posterior probability” and the ways one should be careful in using them to reason about probability.

Ah, but you say, with X as a random variable, the other envelope is a random variable Y where Y~2X with chance .5 and Y~X/2 with chance .5. Then E(Y) = 2*E(X) with chance .5 and = .5E(X) with chance .5. Since E(E(Y)) = E(Y), we have E(Y) = 1.25E(X).

Except it’s not true that Y is distributed in that way: it’s by definition distributed identically to X, since the two envelopes are indistinguishable. It is immaterial that Y will be realized as exactly half or double whatever X is realized as. If you didn’t have the choice of which envelope to pick, and were then told you could either switch or not, you could derive the expected value of Y in that way, but in that situation you have no reason to assume the other envelope is larger with 50% probability. If you’re forced to start at a particular envelope, and then another envelope is brought out and you are told there is a genuine 50% chance it holds double, you should switch; but that’s not the situation as presented.
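A quick simulation (illustrative only, in Python) makes the point: when the pair (a, 2a) is fixed first and you pick at random, always-switching and always-sticking come out identical, with no 1.25 factor anywhere.

```python
import random

def play(a, switch):
    # The pair (a, 2a) is fixed before you choose; you pick at random.
    envelopes = [a, 2 * a]
    random.shuffle(envelopes)
    return envelopes[1] if switch else envelopes[0]

trials, a = 1_000_000, 100
stick = sum(play(a, False) for _ in range(trials)) / trials
swap = sum(play(a, True) for _ in range(trials)) / trials
print(stick, swap)   # both ~150, i.e. 1.5a: Y is distributed identically to X
```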

There is a very simple way out of this paradox which avoids any appearance of special pleading. :cool:

Your expectancy from switching is 0.5(2X’) + 0.5(0.5X’’),
where X’ and X’’ are the expected values of X for the respective cases where X is the smaller (resp. larger) of the two amounts.

The paradox disappears once you allow X’ not equal to X’’. And indeed they’re not equal; it is easy to see that for any pdf,
X’’ = 2X’
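Here is a numerical check of that identity (my own illustrative Python; the lognormal below is an arbitrary stand-in, and any distribution of the smaller amount with a finite mean behaves the same way):

```python
import random

held_when_smaller, held_when_larger = [], []
for _ in range(1_000_000):
    s = random.lognormvariate(0, 1)    # arbitrary pdf for the smaller amount
    x = random.choice([s, 2 * s])      # X = the amount you end up holding
    (held_when_smaller if x == s else held_when_larger).append(x)

x1 = sum(held_when_smaller) / len(held_when_smaller)   # X'
x2 = sum(held_when_larger) / len(held_when_larger)     # X''
print(x2 / x1)                          # ~2.0, i.e. X'' = 2X'
print(0.5 * 2 * x1 + 0.5 * 0.5 * x2)    # switching EV = 1.5X'
print((x1 + x2) / 2)                    # E(X) itself: the same 1.5X'
```

With X’’ = 2X’ plugged in, the switching expectancy collapses to 1.5X’, exactly the expected value of the envelope you already hold.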

Quoth Glowacks:

This would be correct if the stakes were “points”, or something abstract like that. But in the concrete situation with money, we can put an even stronger condition on it, since the gamemaster must have a finite bank.

Let’s change things a little. Suppose you are offered 2 identical envelopes, one of which has nothing of value, the other of which has something of indeterminate value. You pick an envelope. It has expected value equal to half of whatever the thing of value is. You then determine the other envelope has the exact same expected value, so everything is as it seems.

Now let’s go back to the original problem, and say that once you’ve decided on an envelope, you will be required to pay whatever is in the smaller envelope before getting to claim what’s in your envelope. If the pair is X and 2X, paying X up front leaves you with a net of either $0 or X, which is exactly the empty-envelope game above. So this is functionally identical to the above problem, but also functionally equivalent to the original problem, because the amount you will have to pay has already been fixed by the organizers.

There is an even simpler way out of the paradox (because really there is no paradox).

Let’s define X as the value in the envelope you are holding.
The envelope you are not holding contains either 2X or .5X with equal probability.

That’s it.

Introducing garbage beyond that just produces more garbage. This isn’t a question of a paradox, it’s a question of why some mathematical formula with the word “expected value” may or may not have any bearing on reality.

I am going to try to explain this without resort to statistics. The fallacy is that you are using X twice in the same equation to mean two different things.

The equation that leads to the apparent paradox is (0.5X + 2X)/2 = 1.25X. But X is used to represent two different things in the same equation. You can only talk about 0.5X if you know that you are holding the higher amount, and you can only talk about 2X if you know you are holding the lower amount. Instead, the equation assumes both at the same time, which of course can’t be true.

(The equation above does apply to a different situation: You have an envelope, and are given a choice of two others. One of those contains an amount that is twice the envelope you are holding, and the other has half the amount. You must switch. In that case X is the same amount everywhere it is used, and the expected value is 1.25X where X is the value of the middle envelope. Probabilities are equivalent to Little Nemo’s game.)
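For what it’s worth, a short illustrative simulation of that middle-envelope variant (my sketch, not from the thread) really does land on 1.25X, because there X is one fixed amount:

```python
import random

X = 100            # value of the envelope you hold; you must trade it away
trials = 1_000_000
# The two other envelopes hold 2X and X/2; you get one at random.
print(sum(random.choice([2 * X, X / 2]) for _ in range(trials)) / trials)  # ~125
```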

On preview I see that I am just echoing what septimus said but I’ll let it stand.

This is circular back to what creates the paradox in the first place. It is incorrect to say that the envelope you are not holding contains either 2X or .5X, as explained first in the post you quoted.

If it were true that “The envelope you are not holding contains either 2X or .5X with equal probability”, then it must be true that the expected value is 1.25X. And it’s not. So there must be something wrong with the statement.

I don’t think this is adequate. If you have a dollar, and I tell you that I have put two dollars in one envelope and fifty cents in the other, and offer you the opportunity to pay me a dollar for one of the envelopes, your best strategy is to pay the dollar: the expected value is 0.5($2) + 0.5($0.50) = $1.25, more than the $1 you paid.

So if, as you put it, in the original scenario, the envelope you’re not holding contains either 2X or .5X with equal probability, then you should switch. But that can’t be right.

What people in the thread are showing is that the statement is false, or at least, if true, not knowable as true just based on the scenario as described. We don’t know if the probabilities of 2X and .5X are equal or not.

As has been stated, the apparent paradox applies even if you open one of the envelopes. If it has $10, the other has $5 or $20 and the expected value is $12.50. So much of the discussion in this thread dealing with the two unknown values is a bit of a red herring.

The problem is not, as has been stated, one of an improper probability distribution (i.e., one where the sum of the probabilities is not 1). The same problem can arise with a proper probability distribution. For example:

Suppose the possible amounts are 1, 2, 4, 8, etc. The nth outcome is 2^n, for n = 0, 1, 2, …. Now let’s choose a probability distribution that is well defined: the ex ante probability that 2^n is in the envelope is (1-p)p^n. These probabilities are all positive and sum to 1 provided 0 < p < 1.

You open the envelope and discover 2^m dollars. The other envelope has either 2^(m-1) or 2^(m+1) dollars. What are the probabilities? The amount 2^(m-1) had an ex ante probability 1/p times as large as 2^m’s, and 2^(m+1) had a probability p times as large as 2^m’s, so the relative likelihood of the larger and smaller amounts is p^2 to 1. The conditional probabilities are therefore 1/(1+p^2) for the lower amount 2^(m-1) and p^2/(1+p^2) for the higher amount 2^(m+1).

Sticking gives you 2^m for sure. Switching has an expected value of 2^(m-1)/(1+p^2) + 2^(m+1)·p^2/(1+p^2). A little algebra shows that switching has a higher expected value if p^2 > 1/2, i.e., p > 1/sqrt(2), or about 71%.

But all this reasoning can be done before seeing the amount – you always want to switch (if p is high enough). That would mean even before opening the envelope you’d want to switch, and then switch back, ad infinitum.

The paradox is explained by asking what value entering this game has. Ex ante, if you never switch you’ll get 2^n with probability (1-p)p^n. The ex ante expected value is the sum over n of 2^n(1-p)p^n, which is (1-p)/(1-2p) if p < 1/2 and infinite if p >= 1/2. The value with switching allowed cannot be smaller. So if the probability is high enough to make you want to always switch, the ex ante expected value is infinite. This means you can’t use expected value to analyze this game.
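A quick numerical check of the above (my own illustrative Python; exact arithmetic rather than simulation, since the interesting regime has infinite expected values):

```python
from math import sqrt

# Gain from switching when holding 2^m, using the conditional probabilities
# derived above: 1/(1+p^2) for 2^(m-1), p^2/(1+p^2) for 2^(m+1).
def switch_gain(p, m=5):
    stick = 2 ** m
    swap = (2 ** (m - 1) + p ** 2 * 2 ** (m + 1)) / (1 + p ** 2)
    return swap - stick

for p in (0.5, 0.7, 1 / sqrt(2), 0.75, 0.9):
    print(round(p, 4), switch_gain(p))   # gain crosses ~0 at p = 1/sqrt(2)

# Ex ante EV of never switching: sum over n of 2^n * (1-p) * p^n.
p = 0.4
print(sum(2 ** n * (1 - p) * p ** n for n in range(500)),
      (1 - p) / (1 - 2 * p))   # both ~3.0; the series diverges once p >= 1/2
```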

Hmm…I think there’s a problem in the assumptions.

2 envelopes, one with x, one with 2x.

If I choose the one with x, then the expected value over many trials would be (x+2x)/2 = 1.5x, but if I originally chose the one with 2x, then the expected value would still be (2x+x)/2 = 1.5x.

Or to put it in real terms, if there’s an envelope with $1 and another with $2, and I open one or the other 100 times, then I should end up with about $150.

You can’t end up with 1.25x from the average of x and 2x.
edit: ah, I think it’s because you’re redefining an unknown constant as a variable. x should be the base value, not whatever is in the first envelope.

Now the question is why you can’t re-derive the probability distribution for Y based on what you know about its relationship to X. That is because while it is true that the value of Y is half or double the value of X with equal probability, it is not true that the distribution of the value of Y follows the distribution of half or double the value of X with equal probability: the probability that Y~2X is dependent on X. That dependence on X could be removed if the values could be distributed uniformly across the entire positive numbers, but they can’t be. That is the faulty assumption that the paradox is based on.

edit: Perhaps I am incorrect in emphasizing that aspect, but I’m gleaning most of what I’m talking about from that article, where it appeared to be the lynchpin to the entire case. I’ll have to think about it a bit more.

The next-to-last sentence of my post would read better as: So if p is high enough to make you want to always switch, then the ex ante expected values of always switching and never switching are both infinite.

(but was over my time limit to edit)

I don’t really see the paradox at all, as long as you make sure that the X you are talking about is always the same. Suppose we analyze the game like this:

I am given two envelopes, one has X in it, the other 2X. If I pick an envelope at random, the expected value is 1.5X.

Let us suppose I have now picked an envelope. What is the expected value in the other envelope? Clearly, still 1.5X, exactly the same as the expected value of my envelope. If my envelope has X (a 50% chance), there is a 100% chance that the other envelope has 2X. Similarly, if my envelope has 2X (a 50% chance), there is a 100% chance the other envelope has X. The expected value of the other envelope is 1.5X.

It doesn’t seem to matter whether you know what X is. Assuming my adversary has fixed X ahead of time, if I open the envelope and see $100, there is a 50% chance that I’m playing the $100/$200 game, and a 50% chance that I’m playing the $50/$100 game. As long as that’s the case, it doesn’t make a difference which envelope I pick.

This is exactly what I wanted to post. The apparent paradox comes from the fact that you’re not using the variable X consistently in the traditional phrasing. On the other hand, if you use X as above, you get a correct analysis.

I’ve seen people make the argument that in the real world you can use your prior beliefs about the distribution of X to inform your choice, and I don’t doubt that for a minute, but I think that it’s a more difficult approach than most people realize. For instance, suppose that you open the first envelope and see how much is in it. What values would make you believe that you’ve picked the smaller envelope?

Anyone who says that this situation can’t be analyzed with probability or expected values is just silly.

Excellent example, Old Guy. I was trying to come up with one, but was trying to have values at every integer and so was not arriving at a proper probability distribution. Your example also clearly demonstrates why looking just at expected value when making decisions can screw you up.

For example I would gladly play the following game with anybody who came around. You pay me $1,000,000, and in return you start flipping coins and count the number of heads in a row you get before your first tail (N). I will then pay you (2^N)/N cents. This game has infinite expected winnings for the player, but the game master will basically always come out far ahead.
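Here is an illustrative simulation of that coin game (a sketch in Python; note the quoted payout (2^N)/N is undefined at N = 0, so I’m assuming a 1-cent payout there, which only helps the player):

```python
import random

def winnings_cents():
    n = 0
    while random.random() < 0.5:   # heads: keep flipping
        n += 1
    # The payout (2^N)/N is undefined at N = 0; assume 1 cent there.
    return (2 ** n) / n if n > 0 else 1.0

trials = 1_000_000
avg = sum(winnings_cents() for _ in range(trials)) / trials
print(avg)   # typically a few cents per play in any finite run,
             # despite the formally infinite expected value
```

The expected value is infinite because the nth term contributes 1/(2n) cents and that series diverges, but the divergence is so slow that the gamemaster keeps the $1,000,000 in practice.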

We can’t assume that there’s an equal probability that we’re in the 50/100 or 100/200 game. We know that one envelope contains twice as much as the other, and seeing the $100 value in one of them limits it to those two possibilities, but without any additional information, where do you get that there’s a 50% chance that we’re playing one or the other?

Yes, we’re definitely playing one or the other, but we can’t say anything about what the chances are that we’re in one distribution or the other without making assumptions about the likelihood of those distributions. To say that there’s a 50% chance of being in either is to assume that either distribution is equally likely, which is a baseless assumption. If we’re in the 50/100 game, then there’s a 0% chance that the other envelope has $200, not a 50% chance, and if we’re in the 100/200 game, there’s a 100% chance it has $200, not a 50% chance.

That’s the crux of the paradox: we assume these two scenarios are equally likely. Whether we should switch depends on how strongly we believe we’re in the 50/100 game versus the 100/200 game, and seeing the $100 gives us no information about which of those two we’re in, even though it seems like it does.