2 envelopes problem revisited

Link to wikipedia page

The short version for those who don’t want to click the link:
I have 2 sealed envelopes, one of which has exactly twice as much money as the other.
I tell you if you pick one of the envelopes you can keep the cash.
After you pick an envelope I offer you the opportunity to take the other one instead.
You reason thusly:
Let X be the value of the envelope you are holding.
The other envelope must have either 2X or X/2 value with 50-50 probability.
The expected value of the other envelope is therefore 50%(2X) + 50%(X/2) = (5/4)X
The expected value of the other envelope is more than the one in your hand, therefore you should switch.
Doing so leaves you in the same position as before so according to the above analysis, you should continue to swap forever.
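For anyone who wants to see the "obviously wrong" part concretely before reading on: a quick Monte Carlo sketch (Python; the amounts $100/$200 are an arbitrary choice) showing that always switching and never switching average out the same, contrary to the (5/4)X argument.

```python
import random

def trial(switch, smaller=100):
    """One play: the envelopes hold `smaller` and 2*`smaller`; pick one at random."""
    envelopes = [smaller, 2 * smaller]
    random.shuffle(envelopes)
    picked, other = envelopes
    return other if switch else picked

random.seed(0)
n = 100_000
keep_avg = sum(trial(switch=False) for _ in range(n)) / n
swap_avg = sum(trial(switch=True) for _ in range(n)) / n

# Both averages hover around 150, the mean of 100 and 200:
# switching gains nothing.
print(keep_avg, swap_avg)
```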

I'm sure there have been many threads asking how to resolve this problem; this is not one of those.

This thread is about what is wrong with what I am about to show.

Clearly the expected value (in reality) of the other envelope is the same as the one in your hand.
Let p be the probability that the other envelope contains twice as much money (2X) as the one in your hand. Therefore the probability of the other envelope containing X/2 is (1-p).

The expected value of the other envelope is p(2X) + (1-p)(X/2) which as stated above is equal to the value of the envelope in your hand.
So solve for p:
p(2X) + (1-p)(X/2) = X
2p + (1-p)/2 = 1
4p + (1-p) = 2
3p = 1
p = 1/3
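Mechanically the algebra checks out, as a quick sanity check with exact rationals confirms (X is arbitrary here, since it cancels out):

```python
from fractions import Fraction

X = Fraction(1)          # the amount in hand; any X > 0 cancels out
p = Fraction(1, 3)

# p(2X) + (1-p)(X/2) should equal X when p = 1/3
other_ev = p * (2 * X) + (1 - p) * (X / 2)
print(other_ev == X)  # True
```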

So it turns out that the probability of the other envelope containing twice as much cash as the one in your hand is always 1/3 :dubious:

What is wrong with this reasoning?

If one envelope has twice as much money, you should be using 2x and x, or x and x/2. To use 2x and x/2 means that one envelope has four times as much money.

I would just say that the total amount of money is x, and that the envelope you hold in your hand (let’s call it envelope a) has an expected value of:

(1/2 * 1/3 * x) + (1/2 * 2/3 * x) [that is, a 50% chance of having 1/3 of all of the money and a 50% chance of having 2/3 of all of the money]

the envelope that you don’t hold (let’s call it envelope b) also has an expected value of (1/2 * 1/3 * x) + (1/2 * 2/3 * x)

Therefore, ev(a)=ev(b) and switching accomplishes nothing
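That computation can be checked directly; a small sketch with an arbitrary total of x = $300:

```python
from fractions import Fraction

x = Fraction(300)  # total money across both envelopes (arbitrary example)

# 50% chance of holding 1/3 of the total, 50% chance of holding 2/3 of it
ev_a = Fraction(1, 2) * Fraction(1, 3) * x + Fraction(1, 2) * Fraction(2, 3) * x
ev_b = Fraction(1, 2) * Fraction(2, 3) * x + Fraction(1, 2) * Fraction(1, 3) * x

print(ev_a, ev_b, ev_a == ev_b)  # 150 150 True -- both equal x/2
```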

No. If this envelope contains X, the other envelope contains either 2X or X/2.

I don’t have the time or patience to read the whole wiki article in detail, and I probably wouldn’t understand it in the end anyway. But their description of the paradox seems to be a calculation that shows switching is an advantage based on the average of the two amounts, without noticing that the same calculation could be used to argue for not switching, with the same result. I don’t really understand the paradox here.

The paradox is that anyone with half a brain can see that the expected value of each envelope is the same as the other, whilst the analysis I demonstrated says otherwise.

That's fine, but to be fair, all you have accomplished is proving something that everyone knows anyway.
The problem is to identify where the reasoning that leads to the paradox goes wrong.

But anyway, as I stated that is not what this thread is about.

The problem is that there is no information as to how the values in the envelopes are chosen. You are naively assuming that since there is no information all situations are equally likely, but there is no reason for this to be true.

In stat speak, you are attempting to apply Bayes' rule without a proper prior.

In a bit more detail, there is no proper prior for which it is equally likely that the other envelope has either twice as much or half as much. This prior would have to assign an equal value to all numbers of the form a·2[sup]n[/sup] where n runs from minus infinity to plus infinity. But such a prior cannot be normalized to sum to unity.
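A quick numeric illustration of why such a prior can't be normalized (the constant weight c and the cutoffs N are arbitrary choices):

```python
# Try to normalize a "flat" prior over the amounts a*2**n, n = -N..N.
# Whatever constant weight c > 0 you pick, the total mass is c*(2N+1),
# which grows without bound as N increases -- so no such prior sums to 1.
c = 1.0
totals = {N: sum(c for n in range(-N, N + 1)) for N in (10, 100, 1000)}
print(totals)   # {10: 21.0, 100: 201.0, 1000: 2001.0}
```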

If you say that the second envelope holds either 1/2 A or 2A, there’s your problem, because the A in 1/2 A is not equal to the A in 2A. You’ve got two different As.

The proper way to analyze it is to assume that there is a fixed probability density f(x) which represents the relative probability that there are x dollars in the smaller envelope.

Then if you see Y dollars when you open the envelope, your expected value for staying is obviously Y,
while your expected value for switching, according to Bayes' rule, is ((Y/2)f(Y/2) + 2Y f(Y)) / (f(Y/2) + f(Y)).

So you should switch if ((Y/2)f(Y/2) + 2Y f(Y)) / (f(Y/2) + f(Y)) > Y, or more simply if 2f(Y) > f(Y/2).

The first version of the analysis made the assumption that there was a flat prior but this leads to an improper distribution, hence the apparent contradiction. That doesn’t mean that the assumption that switching envelopes makes no difference is any better, since that would inherently assume that 2*f(Y) = f(Y/2).
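A sketch of this analysis with one concrete (and entirely arbitrary) proper prior: suppose the smaller envelope holds 2^k dollars with weight proportional to r^k, k = 0..K. With r < 1/2 the rule 2f(Y) > f(Y/2) says you should stay, except at the smallest possible amount, where f(Y/2) = 0 and switching is guaranteed to double your money:

```python
from fractions import Fraction

# Arbitrary proper prior: smaller envelope holds 2**k dollars
# with weight r**k, for k = 0..K.
r, K = Fraction(1, 3), 20
weights = {2**k: r**k for k in range(K + 1)}
f = lambda x: weights.get(x, Fraction(0))

def switch_ev(Y):
    """Posterior expected value of the other envelope, given yours holds Y."""
    num = Fraction(Y, 2) * f(Fraction(Y, 2)) + 2 * Y * f(Y)
    den = f(Fraction(Y, 2)) + f(Y)
    return num / den

# At Y = 1 (the smallest possible amount) the other envelope must hold 2:
print(switch_ev(1))       # 2
# For larger Y, since r < 1/2 we have 2f(Y) < f(Y/2), so staying wins:
print(switch_ev(4) < 4)   # True
```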




When you use 2X, you are assigning the variable X to the value of the money in your envelope if that money is the smaller amount. Let’s call this X1 instead. When you use X/2, you are assigning the variable X to the value of the envelope if that money is the larger amount. Let’s call this X2 for clarity.

But by definition, X1 is X2/2. They are not the same thing. But you proceed to treat them as if they are. The rest of your calculation errors flow from here.

There are a number of ways to analyze this problem but that’s the simplest way to look at it, in my opinion.

Fonzie had that, too.

I was thinking this way, then I changed the way I considered the problem.

Instead of X, let’s say you opened your envelope, it has $100. What is in the other envelope? Absent information regarding the total amount of money available, the other envelope is going to have either $200 or $50.

If $100 is X1, then the other envelope has $200. If $100 is X2, then the other is $50.

The expected value of each envelope is the same, but once you open your envelope and find its value to be $100, how do you get the value of the other envelope to equal $100, without changing the probability of $200 vs $50? Does that mean it's more likely that $100 is X2? How does that happen?

You and the OP are mixing unconditional expectation with conditional expectation. Before you open your envelope, both must have the same unconditional expectation. But after opening it and seeing how much money it contains, the expected value of the other envelope changes. Consider a simple example where the two pairs ($1, $2) and ($2, $4) are equally likely. Before you open yours, both envelopes have an expectation of $9/4 = $2.25. But when you open:
[ul][li]If your envelope contains $1, the conditional expectation of the other is $2 (because it must contain $2).[/li][li]If yours contains $4, the conditional expectation of the other is $2.[/li][li]If yours contains $2, then the conditional expectation of the other is (1/2)$4 + (1/2)$1 = $2.50.[/li][/ul]
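Enumerating that example makes the point concrete (this assumes the two pairs ($1, $2) and ($2, $4) are equally likely, which is what the conditional values above use):

```python
from fractions import Fraction
from collections import defaultdict

# The two equally likely pairs from the example.
pairs = [(1, 2), (2, 4)]

# Each pair with probability 1/2, each side of it with probability 1/2,
# so each (yours, other) outcome below has probability 1/4.
outcomes = []
for a, b in pairs:
    outcomes += [(a, b), (b, a)]

# Unconditional expectation of the envelope you hold:
ev_yours = sum(Fraction(y, 4) for y, _ in outcomes)

# Conditional expectation of the other envelope, given what you see:
cond = defaultdict(list)
for y, o in outcomes:
    cond[y].append(o)
cond_ev = {y: Fraction(sum(v), len(v)) for y, v in cond.items()}

print(ev_yours)   # 9/4
print(cond_ev)    # values: 1 -> 2, 2 -> 5/2, 4 -> 2
```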
Once you open your envelope you’ve changed the information available to you about the other envelope.

The OP’s reasoning wrongly assumes that if the two envelopes have equal unconditional expectation, they must also have equal conditional expectation.

Schroedinger wept.

That’s only if you know the totals. If you don’t, but only know that the envelopes contain X, 2X, and 4X, when you open an envelope and it contains $20, you still don’t know which one of the three you opened.

How did we get three envelopes all of a sudden?

I think Ascenray has it right: there are two As.

Does this change if you have no idea what the amounts are (as implied in the OP)? Upon finding $4, you don’t know whether the other envelope has $2 or $8.

There’s also nothing in the OP indicating that you make your choice AFTER opening your envelope. Does it change anything if you have a sealed envelope in your hand?

To clarify my example, there are only two envelopes. The possible outcomes are
[ul][li]Your envelope contains $1; the other contains $2[/li][li]Yours: $2; other: $1[/li][li]Yours: $2; other: $4[/li][li]Yours: $4; other: $2[/li][/ul]

If you really have no idea what amounts could be in the two envelopes then it’s meaningless to talk about the expected value of the envelopes, because there is no probability distribution that would model such a situation (as OldGuy pointed out).

This is the key. You can’t have the same variable standing for two different amounts.

E.g. let’s say the envelopes contain $500 and $1000.

If the second envelope contains A/2, then A must equal 1000. If the second envelope contains 2A, then A must equal 500.

What you need to do is say, the two envelopes hold A and 2A dollars respectively, where A = 500, in this case.

The two cases are that you are holding A in your hand, or you are holding 2A in your hand. The probabilities of each of those happening are 0.5, agreed? (Barring any psychic ability on your part.)

Therefore, with p(0.5), the value of the second envelope is 2A (because you are holding A), and with p(0.5), the value of the second envelope is A (because you are holding 2A).

Expected value of envelope in your hand = (0.5 x A) + (0.5 x 2A) = 1.5A
Expected value of second envelope = (0.5 x 2A) + (0.5 x A) = 1.5A
So the expected value of both envelopes is the same, and is the mean of the higher and the lower values (in this example $750), which is, er, as you would expect, right?
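The same arithmetic in code, using the $500/$1000 example from above:

```python
from fractions import Fraction

A = Fraction(500)   # the smaller amount, as in the $500/$1000 example

# 50% chance you hold A (so the other is 2A),
# 50% chance you hold 2A (so the other is A)
ev_hand  = Fraction(1, 2) * A + Fraction(1, 2) * (2 * A)
ev_other = Fraction(1, 2) * (2 * A) + Fraction(1, 2) * A

print(ev_hand, ev_other)   # 750 750 -- both equal 1.5A
```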
Now, I didn’t study maths beyond school, so maybe I’m missing some subtlety here, but I’m really not seeing any paradox here at all, other than using one letter to stand for two different values and expecting an equation to work.

Can anyone explain to me in layman’s terms why this so-called problem is worthy of its own Wikipedia entry, etc? :confused: