Screw the theory, just gimme one of the damn envelopes already!
[sub]Well, that’s how I would approach the problem anyway[/sub]
True, but I’m more engineer than mathematician. “I solve practical problems.”
If all you are trying to do is get better than 50-50 odds, and not necessarily the best odds, there is a simpler approach (and it’s easier to understand why the approach works):
[spoiler]Given the distribution of possible amounts, pick a random number R between the minimum and maximum possible values. The amount in the envelope you open is x; the (unknown) amount in the other envelope is y. If x < R, switch to the other envelope. If x > R, keep the one you have.
Consider these three cases:
Case 1: R is greater than both x and y.
Case 2: R is less than both x and y.
Case 3: R is between x and y.
In case 1, x < R, so you switch envelopes and get amount y, which is the greater amount only half the time; your win rate is 50%. Similarly, in case 2, x > R, so you keep amount x, which is the greater amount half the time, and your win rate is again 50%.
But in case 3, if x < R < y, you switch and end up with y; and if y < R < x, you keep x. Either way you walk away with the larger amount, so your win rate is 100%.
There is no way to determine the combined win rate for the three cases because we don’t know the relative likelihoods of cases 1, 2, and 3. But since the possibility of case 3 is non-zero, the combined win rate must be greater than 50%.[/spoiler]
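For anyone who’d rather see it run than take it on faith, here’s a quick Python sketch of that rule. The stuffing method (smaller amount uniform on $1–$500, the other envelope holding double) and the corresponding min/max bounds are made-up assumptions just so there’s something concrete to simulate; the cutoff R is drawn uniformly between those bounds, exactly as described.
[code]
import random

def play_once(min_possible=1, max_possible=1000):
    """One round of the random-cutoff strategy described above."""
    # Made-up stuffing method: smaller amount uniform on [$1, $500], other envelope holds double.
    small = random.uniform(1, 500)
    envelopes = [small, 2 * small]
    random.shuffle(envelopes)
    x, y = envelopes                                 # x is the envelope we open; y stays sealed
    r = random.uniform(min_possible, max_possible)   # random cutoff R
    kept = y if x < r else x                         # switch if x < R, otherwise keep x
    return kept == max(x, y)                         # did we end up with the larger amount?

trials = 200_000
wins = sum(play_once() for _ in range(trials))
print(f"Got the larger envelope in {wins / trials:.3f} of trials")   # comes out well above 0.5
[/code]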
I hate seeing a thing phrased less abstractly than it should be, so I’ll note that
The specific function 1/(1 + e[sup]-x[/sup]) is of no importance. Just, choose a cutoff value via any continuous distribution you like, and stay with your peeked-at envelope if it exceeds the cutoff and switch otherwise. If you happen to choose a cutoff value between the two envelopes’ values, you end up taking the higher envelope for sure; if not, well, you’re basically choosing between the two envelopes at random in an ordinary un-clever fashion, but at any rate, you haven’t worsened your odds of grabbing the better one. So, sometimes no boost, and sometimes a gain, hence a gain overall.
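For concreteness, here’s roughly what that decision rule looks like in Python; the loc and scale parameters are arbitrary knobs I made up, not anything dictated by the puzzle, and any other continuous distribution would serve just as well.
[code]
import math
import random

def decide_keep(x, loc=50.0, scale=25.0):
    """Keep the opened envelope (value x) with probability 1 / (1 + e^-((x - loc) / scale)).
    Equivalently: stay iff x exceeds a cutoff drawn from a logistic distribution.
    loc and scale are arbitrary; any continuous distribution works."""
    keep_prob = 1.0 / (1.0 + math.exp(-(x - loc) / scale))
    return random.random() < keep_prob
[/code]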
I disagree with this. The outcome is still 50/50. Why does selecting a random number in your mind add any pertinent information to your choice?
Using the envelope scenario, why would it make a difference if I selected $1 in my mind versus $167,284? It still adds no information to what is actually in the envelopes.
I realize now TroutMan already said basically the same thing as I did, in the post immediately prior to mine (as opposed to the post I quoted). :smack:
(The phrasing “there is a simpler approach” threw me. This isn’t different from the Randall Munroe answer; just more abstract. There’s nothing about Randall Munroe’s use of the logistic function specifically that would tend towards better results than any other way of choosing a cutoff value.)
It’s true: You don’t gain any new information about the envelopes by picking a random cutoff.
The purpose of the cutoff is just to commit yourself to a course of action such that you are more likely to switch when you see low values than you are when you see high values. This is all you need.
You could avoid making any random choice and just say “I’ll always switch if I get an envelope with less than $100, and stay otherwise”. And then the probability you choose the better envelope will be 50% * (1 + probability that $100 is between the value of the two envelopes). You get a benefit just in case there’s any chance the cutoff value for your strategy (in this case, $100) is between the value of the two envelopes.
The only problem is, perhaps the envelopes are stuffed in such a way that it’s never the case that $100 falls between their two values. Then there’s no benefit in using $100 as a cutoff value; doing so is as good as just picking at naive random.
So, the reason (the only reason!) to make the cutoff choice randomly is so we can say “Even though I have no idea how the envelopes’ values are being determined, I nonetheless might pick any cutoff value, and therefore there is definitely some chance my cutoff value falls between the value of the two envelopes, and so some gain by doing so”. But, yeah: If you knew ahead of time it was possible for $100 (or whatever) to fall between the value of the two envelopes, then you wouldn’t need to do anything random; you could just settle on that cutoff value deterministically and in just the same way get a better-than-naive strategy.
It’s a very contrived goal, to be sure, this business of crafting a strategy for envelope switch-or-stay such that, no matter what method is used for initial envelope stuffing, the corresponding odds of getting the larger envelope are better than even, though the amount by which they are better cannot be precomputed without knowing the stuffing method and may be arbitrarily small. A very contrived goal, indeed, but it’s what the random cutoff strategy accomplishes.
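To put numbers on the fixed-cutoff version, here’s an exact check of the 50% * (1 + probability the cutoff falls between the two values) formula. The $100 cutoff and the stuffing method (smaller amount drawn uniformly from a small made-up set, the other envelope holding double) are purely illustrative assumptions.
[code]
from fractions import Fraction

# Fixed-cutoff rule: switch if the opened envelope is below C, stay otherwise.
# Claim: P(win) = 1/2 * (1 + P(C falls between the two envelope values)).
C = 100
smaller_amounts = [20, 60, 90, 150, 400]   # made-up stuffing: smaller value uniform over this set
weight = Fraction(1, len(smaller_amounts))

p_win = Fraction(0)
p_between = Fraction(0)
for s in smaller_amounts:
    lo, hi = s, 2 * s
    # Each of the two envelopes is opened with probability 1/2.
    # Opening lo: we win iff we switch (lo < C).  Opening hi: we win iff we stay (hi >= C).
    p_win += weight * Fraction(1, 2) * (int(lo < C) + int(hi >= C))
    p_between += weight * int(lo < C <= hi)

print(p_win, "vs", Fraction(1, 2) * (1 + p_between))   # both come out to 7/10 for this example
[/code]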
This would depend on my personal financial situation. Right now, if I had an envelope that contained $1,000, I’d immediately take it and keep it and not open the other one. Because $1,000 is meaningful enough to me at the moment that I cannot afford the risk of going for the other envelope and having it turn out to only contain $20 or something like that.
Now, if I were rich and had nothing to lose, sure, why not try the other envelope just out of sheer curiosity; after all if I miss out on $1,000, so what.
Or rather, I suppose, keeping in mind the distinction between possibility and probability, I should’ve written “there was a positive probability” in place of the bolded words.
This kind of evaluation is what I was talking about when I mentioned opportunity cost in one of my examples. If the envelope you examine contains an amount of money that isn’t financially significant to you, you’ve got nothing significant to lose by switching, even if the other envelope turns out to be less valuable. In essence, if the contents of the envelope make you go, “Oh, yessss!”, keep it. If your response is more like, “Hey, I could go get a burger”…not so much. This doesn’t give you a better-than-even chance of improving your take, so it doesn’t answer the original question. It is, however, pragmatic. (Also, it doesn’t require fancy math. :D)
Dude, you touched it last.
Ignoring special considerations (for example, if you see $666 you might bet it’s the larger sum since your pal hopes you’ll reject this unwholesome number), the best strategy is to pick some positive X and keep an envelope with more than $X. Nevermind what X is — in the absence of any other assumptions, pick X arbitrarily!
Best is to treat the ($X,$2X) paradox without opening either envelope.
BTW, Little Nemo, can you resolve that paradox? It’s much tougher than the SocSec Trust Fund “paradox”!
That’s the answer I was shooting for, though I must say there are certainly some interesting pragmatic solutions as well.
The puzzle was created by El Camino College mathematician Leonard Wapner based on a principle identified by Stanford statistician David Blackwell where an unrelated random variable can aid in predicting the unpredictable.
I came across this originally in the Futility Closet blog, which is a compendium of stuff drawn from science, technology, sociology, history, math, pop culture, chess, and just about everything else.
The source was here: Blackwell's Bet - Futility Closet
I realize I was insufficiently abstract in phrasing this before, as well.
Here’s the entire, general solution:
Pick any rule under which you are more likely to stay when you see a higher value than when you see a lower value (and, therefore, conversely, more likely to switch when you see a lower value than when you see a higher value). Then you’re done. That’s it.
Why does this work? Well, there are two envelopes out there, High and Low, and four things that can happen: stay on High, stay on Low, switch on Low, and switch on High. By design, the strategy above ensures P(stay on High) > P(stay on Low) and P(switch on Low) > P(switch on High). So the two cases where you “win” are more likely than the corresponding two cases where you “lose”; thus, you are more likely to win than to lose.
[Now, it happens that this is equivalent to “Generate a cutoff value from some distribution in some manner”, but only because any probabilistic yes-or-no decision can be thought of as implicitly “Generate a cutoff value from some distribution in some manner, so that you choose yes with some probability and no with complementary probability”. But nevermind that; that’s not necessarily the best way to think about it.
Put another way: Any strategy for how to play this game is of the form “If you see X in the initial envelope, then keep it with probability F(X)” for some function F, and essentially entirely described by F. And the observation is that F yields a guaranteed better-than-even strategy just in case F is an increasing function (i.e., you are more likely to stay when you see a higher value), which is exactly what we would naively try anyway]
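As a sanity check on the “any increasing F” claim, here’s a quick simulation sketch. The particular F (x / (x + 100)) and the stuffing method are arbitrary assumptions, chosen only so the script has something to run against.
[code]
import random

def keep_probability(x):
    """Any increasing F will do; x / (x + 100) is an arbitrary choice."""
    return x / (x + 100)

def stuff_envelopes():
    """A made-up stuffing method the player is assumed to know nothing about."""
    small = random.choice([5, 80, 300])
    return small, 2 * small

def play_once():
    lo, hi = stuff_envelopes()
    x, y = (lo, hi) if random.random() < 0.5 else (hi, lo)    # open one envelope at random
    kept = x if random.random() < keep_probability(x) else y  # stay with probability F(x)
    return kept == hi

trials = 200_000
wins = sum(play_once() for _ in range(trials))
print(wins / trials)   # lands above 0.5 (around 0.55 for these made-up numbers)
[/code]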
My understanding is that a number of different arguments have been put forth to resolve the paradox. But none of them has convinced a consensus of mathematicians.
I wasn’t aware the Social Security Trust Fund situation was regarded as a paradox.
We’ve of course had many threads on the (X, 2X) envelope paradox before; my knee-jerk reaction to hearing it mentioned is to want to repeat what I said in this post, and, indeed, in the rest of that thread.
So the problem would actually be solved if there were three envelopes rather than 2:
– Pick one envelope at random.
– Note the amount but reject the envelope.
– Pick another envelope at random.
– If it is greater than the first envelope then accept it.
– Otherwise pick the third envelope.
Once I heard of this algorithm, that’s probably what I’d do unless the first envelope contained $10,000 or more in which case I might go for the sure thing.
If we assume the three envelopes always have distinct values, and you are exposed to them in uniformly random order, then that strategy has 2/3 odds of picking the larger out of the second and third envelope. Hooray! (Proof: just look at the six equiprobable orders in which the envelopes might be presented to you, and observe what happens in each). In case you care, this is also 1/2 odds of picking the largest out of all three envelopes (in the 1/3 of cases where the first envelope is the largest, you can’t pick it and are screwed, of course).
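If anyone wants the six orders spelled out, the brute-force enumeration is short; the three dollar amounts below are placeholders, since only their relative order matters.
[code]
from itertools import permutations

values = (10, 50, 120)   # any three distinct amounts

beats_rest = 0     # times we picked the larger of the second and third envelopes
best_of_all = 0    # times we picked the largest of all three
orders = list(permutations(values))
for first, second, third in orders:
    # Reject the first; take the second if it beats the first, otherwise take the third.
    pick = second if second > first else third
    beats_rest += (pick == max(second, third))
    best_of_all += (pick == max(values))

print(beats_rest, "out of", len(orders))   # 4 out of 6  ->  2/3
print(best_of_all, "out of", len(orders))  # 3 out of 6  ->  1/2
[/code]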
Just adopting from the outset “Keep an envelope if it contains $10,000 or more; otherwise, switch” is a perfectly fine strategy that has better than even odds of picking the larger of two envelopes if there’s any chance one envelope has more than $10k while the other has less.
The ONLY reason to not use a fixed cutoff in answering this problem is to get rid of the objection “Maybe the envelopes are stuffed in such a way that they’re always either both below or both above $10k, so you get no advantage from making that your cutoff”. By picking a variable cutoff capable of taking on any value, you are assured some advantage no matter how the envelopes are stuffed (though the magnitude of that advantage cannot be determined without knowing the stuffing method and may be trivial…).
And, again, the easiest way to think of all of this is like so: ANY mindset under which you’re more likely to keep high values (and accordingly, more likely to switch away from low values) results in better-than-even odds. The better-than-evenness is only neutered if the envelope values never fall in a range your mindset distinguishes between. But if you’re always somewhat more likely to keep an envelope with somewhat more money in it, you’re golden.
Maybe we are talking past each other, but I still do not see how this increases the choice above 50/50. Just to be clear, let’s hash out the ground rules: the amount of money in the envelopes is absolutely unknown, right? It could be $1 or it could be a googolplex dollars, correct? And every amount in between has an equal chance of appearing?
Or, as others have suggested, are we assuming that there is a reasonable upper limit? Like if it is a game show, chances are it won’t be more than $100,000 or $1 million at the absolute most. If it is in a bar with your buddies, it would be $50 or $100 max. In no scenario are we ever talking multiple millions. So the context of the game does matter in real life.
If scenario one, then the amounts are infinite and it is always preferable to switch (because any finite number will be eclipsed by the infinite possibilities above).
If scenario two, then I could see your point. If I am with my bar buddies and can discern that the cheap bastards will think $1 is a low prize and $20 is a good prize, then I could pick a number like $5 as a cutoff point to switch and have better than even odds of switching.
But I thought we were talking about scenario #1 where there was no outside information at all.