That depends. If you open one envelope, and it’s $10, and you know that the house’s budget is such that they would never put as much as $20 into an envelope, then you shouldn’t switch.
Now, the details of this will vary depending on the real-world situation. But in any real-world situation, there will come a point where, on seeing one of the two values, you should conclude that it’s probably the larger of the two envelopes.
The presumption is the envelopes are correctly labelled, else what’s the point of this?
This is the thing.
If you cannot look before switching, then it’s pure chance and does not really matter. Just pick one.
A good game show then shows you what the alternative was, so the audience can cheer or groan along with you. Good thing there aren't 3 envelopes.
If you can look in the envelope before switching, then:
Say it contains $10. That means the other contains $20 or $5.
Switch, and you lose $5 or gain an extra $10. 50-50… “Do you feel lucky, punk?”
So it would seem the scenario is the same for not knowing what’s in the envelope.
But, you open the envelope you chose and it contains $20. You will never know if your other choice was $10 or $40 unless Monty shows you. Same thing only different.
Either way, you choose and open one envelope and it contains $X which has a 50-50 chance of being the big prize or the little prize.
Either way, someone handed you free money.
(You can’t have more than 2 envelopes in this game, or else the wording is a lie…)
I think that’s the only way the wording wouldn’t be a lie. If there are only two envelopes, and each says the other one has either 2x or 0.5x its amount of money, they can’t both be correct unless they both have zero money, which makes the other premise a lie. The only way it makes sense is if Envelope #1 gets replaced by a brand new Envelope #3 (or if Monty opens up Envelope #1 while you aren’t looking and changes what’s in there) once you have Envelope #2 in your hand, but the description of the paradox doesn’t make it clear whether that’s the actual scenario.
Someone gives me an envelope with money and I say Thank You. I’m not going to open it in front of that nice person, that would be rude. I’ll put the envelope in my pocket and open it later. I couldn’t care less what would be in another envelope I could have had. One in the hand and all that.
Maybe I’m missing something, but: why can’t they both be correct?
Say I seal a $10 in one, and a $20 in the other, and write on each envelope that ‘the other one either has 2x or 0.5x the amount in this one.’ Isn’t that true of the one with the $10 (because the other one in fact has 2x that amount) and true of the one with the $20 (because the other one in fact has 0.5x that amount)?
Assuming truthfulness on the part of the envelope filling and labeling. …
Both statements are true. The “paradox” (which is really a different sort of logical conundrum) is that the normal logic of expected value says you should always switch to the other envelope. And having switched, the same logic says you should switch again. etc. Opening an envelope, as others have said, tells you exactly zero. Unlike the Monty Hall problem, you’re no smarter for knowing the dollar value of a single envelope.
Setting aside peeking in envelopes and going back to just choosing between two closed envelopes then opening your choice, or switching then opening your second choice …
The problem is confusing discrete outcomes of single trials, e.g. you open only envelope A or B, with the outcome of playing the game umpteen times. If you flip one coin one time, the one outcome you will not get is what expected value predicts: a 50/50 split on one half of a head and one half of a tail.
Concretely, if the envelopes contain $10 & $20, opening one or the other will not give you the expected value of $30/2 = $15. $15 is NOT a possible outcome of a single trial.
This two-envelope scenario is not exactly the same thing, but it barks up a similar tree. Leading to similar confusion on the part of folks thinking about it.
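The single-trial point is easy to check with a quick simulation (a sketch, reusing the $10/$20 pair from the example above; the trial count is arbitrary):

```python
import random

# Single trials vs. the long-run average, using the $10/$20 pair above.
amounts = (10, 20)
trials = [random.choice(amounts) for _ in range(100_000)]

# No individual trial ever pays the "expected value" of $15 ...
no_fifteen = all(t != 15 for t in trials)

# ... yet the long-run average sits right on it.
average = sum(trials) / len(trials)
```

Expected value describes the average over umpteen plays, never the outcome of any one play.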
Another way to phrase the problem that is logically equivalent and may be easier to understand since it avoids some tricky language is this:
These two externally identical envelopes form a related pair. Call them A & B. We have put $10 in one and $20 in the other. You may open either envelope and keep whatever you find inside. Now please choose one to open using whatever method you like.
That gets all the misleading fluff out. All really good word problems include stuff meant to mislead. This simpler version of the words cuts to the chase. But it’s the exact same chase as the OP’s.
The mistake here, which also happened on another thread I participated in, is trying to assess probability on an infinite number range. It simply cannot be done. You may have a formula, but you really know nothing about anything as long as an infinite range is in play.
So your formula correctly tells you that you have two envelopes with dollar values $1x and $2x. At this point your expected return from picking one envelope is actually $1.5x. Problem is that x can still be infinite, so there are no odds on x being anything at all. Can’t be calculated.
Once you open one of the envelopes, infinity goes away and you can calculate odds. That’s why you need to open an envelope. Because there is no answer without doing that, because even though you have a formula, you can’t apply it.
There is not an infinite amount of money in the world. Infinity is removed from consideration by the terms of the problem.
Even if they’d said “we wrote two numbers on bits of paper” in a 0.5 or 2.0 ratio, that would still exclude infinities. You can’t sensibly multiply infinities by ordinary integers. Yes, there are advanced mathematical techniques which are analogous. But it’d be misleading bordering on an outright lie to call them “multiplication”.
I reject entirely the idea that infinities are present in the problem.
As so often in these threads, xkcd has a relevant cite:
There’s no valid reason to exclude an upper bound. There’s also no valid reason to restrict the problem to integers only. So your conclusion is incorrect on multiple levels.
That’s not quite precisely true. You can come up with a distribution on an infinite integer range, and then once you have that distribution, use it to do probability calculations. What you can’t have is a uniform distribution, where all numbers are equally likely.
Right, they’re not. But it’s not enough to say that infinities aren’t present. We also have to say precisely how they’re not present. We’re used to glossing over the question of what probability distribution we have, because we default to assuming a uniform distribution. But since we can’t do that here, we have to explicitly specify our distribution.
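To make that concrete: here's one non-uniform distribution over all the positive integers (the specific formula P(n) = (1/2)^n is just an illustration, not anything from the original problem):

```python
# A legitimate probability distribution over ALL positive integers:
# P(n) = (1/2)**n. The probabilities sum to 1, but are far from uniform:
# each value is half as likely as the one before it, so arbitrarily
# large amounts are possible yet vanishingly improbable.
probs = {n: 0.5 ** n for n in range(1, 100)}
total = sum(probs.values())  # partial sum; converges to 1 as the range grows
```

What you can't write down is a distribution where every positive integer gets the *same* probability p: any p > 0 makes the sum infinite, and p = 0 makes it zero.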
I think the mistake lies in reassigning the value of x in the middle of the calculation.
Let’s assume that the value of the money inside the first envelope you pick is x. What is the value of the money inside the other envelope? It’s either .5x or 2x (with an equal chance of it being either). So the expected value of the money in the other envelope is 1.25x and the expected gain is .25x.
And this is where the classic version of the paradox goes off the rails. It repeats the calculation I just did above with x now being the value of the money inside the second envelope.
That’s wrong. We’ve already established that x is the value of the money in the first envelope. So the other envelope must contain either .5x or 2x. There’s no way both envelopes can contain a value of money equal to x.
So let’s work out this new expected value. If we switch from an envelope containing .5x to one containing x, we gain .5x. And if we switch from an envelope containing 2x to one containing x, we lose x. Both possibilities are equally probable. So the expected loss of this exchange is .25x.
We don’t have an endless chain of 25% gains each time we switch envelopes. We have an endless chain of alternating 25% gains and 25% losses, which average out to zero percent.
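A quick sanity check of that averaging-out (a sketch, using a fixed $10/$20 pair like the one mentioned elsewhere in the thread): always switching and never switching pay exactly the same in the long run.

```python
import random

def average_payoff(switch, trials=100_000, pair=(10, 20)):
    """Average winnings when the pair is fixed in advance and you
    pick one envelope at random, then either keep it or swap."""
    total = 0
    for _ in range(trials):
        mine, other = random.sample(pair, 2)  # random deal of the fixed pair
        total += other if switch else mine
    return total / trials
```

Both strategies converge on ($10 + $20) / 2 = $15; the "always switch" chain buys you nothing.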
Upthread, I mentioned doing this with a $10 envelope (call it Envelope A) and a $20 envelope (call it Envelope B). And let’s say there’s an equal chance of me having just handed you either Envelope A or Envelope B. Whichever one I gave you, you have it now. That’s now yours.
[Long, meaningful pause.]
Hey, would you like to swap? If you don’t swap, you’ll of course gain nothing and lose nothing; the expected gain from not swapping is, as it were, 0. If it’s A and you swap, you’ll gain $10. If it’s B and you swap, you’ll lose $10. So, is the expected gain from swapping — also 0?
But he prefaced that by saying — when you’ve just gotten the first envelope — “So the expected value of the money in the other envelope is 1.25x and the expected gain is .25x.”
I’m asking whether it’s instead the case that — when you’ve just gotten the first envelope — the expected gain from switching to the other envelope is 0. If he’s right, it seems like the expected gain of .25x would mean there’s an incentive to switch; if I’m right, it seems like the expected gain of 0 would mean there’s, uh, not. Which is it?
Let’s just think about the first switch scenario and the expected value calculation. It doesn’t matter what X is, so let’s set it to $10.
There are two scenarios here.
One envelope contains $10 and the other contains $20 (or $5). You have picked one at random. There is a 50% chance you picked either one. Then it doesn’t matter if you change.
The first envelope you choose is assigned $10. Then the other envelope is assigned either $5 or $20, with a 50% chance of being either, and you get an expected value of $12.50 if you switch.
Play this out a couple of times with a buddy, and you will see that it is correct.
Why doesn’t the usual expected value formula work in the first scenario? Because the other value is already set. The randomness is just the shuffling of the envelopes.
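The two scenarios above can be played out in code instead of with a buddy (a toy simulation using the $10 / $5-or-$20 amounts from the post, not a rigorous proof):

```python
import random

def scenario_1(trials=100_000):
    """The pair ($10, $20) is fixed in advance; only your pick is random.
    Average gain from switching: ~$0."""
    gain = 0
    for _ in range(trials):
        mine, other = random.sample([10, 20], 2)
        gain += other - mine
    return gain / trials

def scenario_2(trials=100_000):
    """Your envelope is pinned at $10 first, THEN the other is filled
    with $5 or $20. Average gain from switching: ~$2.50 ($12.50 - $10)."""
    gain = 0
    for _ in range(trials):
        other = random.choice([5, 20])
        gain += other - 10
    return gain / trials
```

The difference is entirely in *when* the amounts are decided: shuffle a fixed pair and switching is a wash; re-randomize the other envelope after your amount is fixed and switching really does pay.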
If you want an answer to this, you first have to make up an exact procedure for how the amounts are decided. If you do that, all paradoxes disappear.
OK, pretend I’m one of the math-clueless because clearly I’ve misconceptualized something.
I’ll take this at face value as fully unambiguously true:
I accept that if, e.g., you tell me ahead of time that one envelope contains $2, I can assume the odds on the other being $1 or $4 are 50/50 to within a hair’s breadth, such that it would take millions of trials for the difference to emerge.
Conversely, if you tell me one envelope contains 1 Beeelion dollars [pinky to mouth], I’m probably better off assuming the other is $500M rather than $2B. I can accept that logic. OTOH, if you can swing putting $1.5B into two envelopes for the hell of it, you may well be able to swing $3B. Elmo might well offer someone that bet just to watch them squirm in indecision.
OK, now I’ve accepted that there’s one envelope with $1B in it and a different envelope with $500M, or maybe $2B, but with $500M the (slightly / somewhat / rather ) more likely.
How does that knowledge assist me in deciding to initially choose envelope A or B? IMO it does not. And if I have chosen A, then you give me that knowledge and ask whether I choose to switch, have I learned anything useful to inform my decision? I think not.
In the case where I choose, e.g., A and then the host says: “You are now holding the $1B envelope; wanna go double or halvsies on envelope B?” In that case, and only that case, I can see that the bigger the number, the less likely that switching leads to double rather than half.
However that is not the problem before us as I understand it.
Where have I gone wrong?
Here’s a different take on the gratuitous “at random” that folks seem to toss into this problem at, well, random points in the problem exposition.
One interpretation is that the total amount is chosen by non-random processes; e.g. Elmo wants to spend $3B to watch some prole sweat. Why’d he pick $3B? Heck, who knows why Elmo chooses to do anything? But it wasn’t random in the rigorous mathematical sense of “random”.
Now $3B can only be divided one way into something that satisfies the “double or half” criteria: $1B and $2B. OK. So now he grabs two envelopes, stuffs the money (the checks?) in, and flips a coin to decide which envelope to label “heads” and which “tails”. We’ve just injected true randomness but no infinity.
Or am I tripping over something simpler, that the problem *as properly stated* involves "magic" envelopes where indeed the amount of money (or the simple number written inside, to take issues of the practical limitations on amounts fully off the table) is dynamic?
IOW imagine I’m holding an envelope (“A”) with the number 100 inside. But I don’t know that. Then Og whispers in my ear “Psst: it’s 100”. OK, so now I know the other is 50 or 200, so I choose to switch to “B” on the logic of expected value.
Is somebody who understands the problem as properly stated asserting that the value in the other envelope now “magically” changes to be double or half of what I’m now holding? IOW, If my “B” is now holding 50, then A has magically become 25 or 100, whereas if my “B” is now holding 200, then “A” magically changes to 100 or 400? And since I don’t know whether B is 50 or 200 I have no way to decide between 25, 100, 100, or 400 as the now-current value of “A”?
Or is it that, given the stated nature of infinite sums, the bigger I ratchet the number up by lucky repeated swaps, the more the odds will tilt back towards smaller numbers? Such that, given a truly vast number of trials, the numbers will converge towards someplace?
Is this what we’re talking past each other about?
It’s clear to me that several people are discussing this problem at several levels of mathematical knowledge. But beyond that it’s unclear to me that we’re even all discussing the same problem.
I’m sure not claiming I’m right in my understanding of either the problem or the solution. The closer I look the less I’m sure of.
But I feel like the folks who’re (probably rightly) mentioning infinite distributions haven’t quite “dumbed down” their explanations well enough for semi-proles like me to get it.
I pick up one envelope at random, take out a cigarette lighter, and burn it along with its contents. Then I take the second envelope, put it in my pocket and go home.
But what if, instead of playing it out multiple times with a buddy, I play it out multiple times with you and your like-minded buddy?
I hand you an envelope. I hand him an envelope. I ask you if you want to switch with him, and I ask him if he wants to switch with you. If this is scenario #1, you both reason that “it doesn’t matter if you change.” If this is scenario #2, you both reason that you have an expected increase in value over the long run, and so you both enthusiastically agree to switch — after which, all is revealed, and one of you says “well, I gained X,” and the other one says “and I lost X.”
We play this out as many times as you like: one of you always gains some amount with a switch, and the other always loses the exact same amount with that switch. Is it true that you should both keep agreeing to switch? Or do each of you eventually conclude that, as with Scenario #1, there’s no such expected-value incentive?
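Played out in code (a sketch of this two-player version, again assuming a $10/$20 pair like the one mentioned upthread): every agreed swap is exactly zero-sum.

```python
import random

rounds = 10_000
your_gain = 0
his_gain = 0
for _ in range(rounds):
    a, b = random.sample([10, 20], 2)  # deal one envelope to each player
    # Both of you "enthusiastically" agree to switch every round.
    your_gain += b - a
    his_gain += a - b

# Whatever one player gains on a round, the other loses exactly.
net = your_gain + his_gain
```

The net across both players is zero on every single round, not just on average, which is the tell that the ".25x expected gain" reasoning can't be right for both of you at once.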