THE PREDICTOR IS NEVER, NEVER, EVER WRONG!!! That’s the rule.
If the boxes were transparent, everyone would take both boxes, the predictor would know it, and one box would always be empty. Since one box is opaque, you choose just that one box because you know, WITH ABSOLUTE CERTAINTY, that it contains $1M. If you choose both, then you know WITH ABSOLUTE CERTAINTY that one box will be empty and you will walk away with $1K.
Those are the only two possibilities under the “Predictor is always correct” rule. NEVER would you choose both boxes and have $1M in the opaque box. NEVER.
NEVER would you choose only the opaque box and have it be empty. NEVER.
***I say “never” because the rules state that the Predictor makes his predictions with such certainty that an incorrect guess is an infinitely negligible possibility. So the only way a one-boxer would EVER be wrong is if you chose to bet $1M against $1K on those infinitely negligible odds.
And yes, two-boxers admit that they will never win $1,001,000 and that they will always walk away with $1K. So why advocate for that position being the correct one?
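To put actual numbers on that bet, here is a minimal expected-value sketch (my own illustration; eps is an assumed error probability for the Predictor, not something stipulated by the rules):

```python
# Expected-value sketch: treat the Predictor as erring with tiny probability eps.
def expected_values(eps):
    one_box = (1 - eps) * 1_000_000      # $1M unless the Predictor erred
    two_box = 1_000 + eps * 1_000_000    # always $1K, plus $1M only on an error
    return one_box, two_box

for eps in (0.0, 1e-9, 1e-6):
    one, two = expected_values(eps)
    print(f"eps={eps}: one-box EV ${one:,.2f} vs two-box EV ${two:,.2f}")
```

With eps at or near zero, the one-boxer expects roughly $1,000,000 and the two-boxer roughly $1,000, which is exactly the bet described above.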
jtgain, you have a gross misunderstanding of the two-boxer argument. None of them have said anything about there being a 50/50 chance, or even ANY chance, of there being a million dollars in box B if you choose both boxes.
Those of you who keep saying “it makes no sense to be a two boxer” simply are not following the two-boxer logic. You keep restating the 1-boxer logic, and that’s fine. It’s good logic. But just stating the 1-boxer logic over and over and over again does not disprove the 2 boxer logic. You simply refuse to acknowledge that there’s any paradox at all.
I am a one-boxer, but I am a one-boxer paradoxer, whereas it looks like most of you one-boxers deny that there’s even a paradox at all. And that’s just silly.
I can’t restate the two-boxer logic any better than has already been laid out here, especially in Mr Sunshine’s post where he laid it out in 4 very easy steps. If you think that there’s no paradox, and that the two-boxer logic is incorrect, then I think you simply don’t understand the two-boxer logic and maybe never will.
Try this mental exercise instead: Assume you are The Predictor and the Chooser. You are never wrong. Ever. You can’t game the system and predict something wrong and walk away with $1,001,000 or $0. $1000 and $1,000,000 are still your only actual options.
So you make your prediction: I’m going to choose only box B and walk away with $1,000,000. And then the moment comes to actually choose. Now at this point in time, what SHOULD you do, if you want to maximize your winnings? Regardless of what’s POSSIBLE under the constraints of your prediction, what SHOULD you choose? You know box B has $1,000,000. It’s literally impossible for B to be empty at this point in time, because you were The Predictor and the box’s contents were sealed and unchangeable. Your predictions are never wrong. Obviously you are going to pick only box B. But what should you pick, if you wanted to maximize your winnings? Both A&B. We know it’s IMPOSSIBLE to walk away with $1,001,000, but taking both boxes is what you should have done, since both boxes were full.
As a real-world example of this problem, I opened a new bank account yesterday and, after going through the usual rounds of questions, the cheerful account manager slid a piece of paper toward me asking whether (a) I would take overdraft protection and pay a monthly fee or (b) waive the protection and pay no fee. Complete with checkboxes and a line for a signature…
She said she was good at predicting who would take which option at this point in the application process, and sure enough she was exactly correct. Moral: people aren’t as clever as they think and are often easily read, especially by folks who deal with a lot of people in a narrow set of circumstances.
I would imagine with a little practice many of us could take the “Predictor” seat and send a few folks packing with a mere $1000 to show for their appearance on the show…
Yeah, but it’s safe to assume that anyone who would reason this way has a “tell” of some kind, so the alien correctly pegs them as two-boxers. Since the alien has a (presumably near-perfect) track record of predicting who the two-boxers are, it’d be foolish to think yourself sufficiently unique that you can wheedle an extra grand out of the deal. So unless you never even considered being a two-boxer before the offer, consciously or subconsciously, just take B.
The two-boxer argument is like trying to make sense of time-travel paradoxes. You can’t do it by applying the rules of cause and effect that work in our universe, because our universe doesn’t allow time travel.
Our universe also doesn’t allow a perfect predictor, but the paradox assumes that you’re in a universe that does.
To my thinking, there definitely is a paradox as a logical argument can be made for either choice.
I think that what it comes down to is that this paradox is a reductio ad absurdum argument that a perfect predictor is not logically possible.
In the universe we live in, quantum uncertainty combined with the butterfly effect makes any sufficiently complex system unpredictable, regardless of your level of knowledge of the initial conditions.
Newcomb’s paradox seems to show that a different kind of universe - one where a perfect predictor could exist - isn’t possible even in principle because the existence of such a predictor leads to a paradox.
If we’re in our universe, the predictor is impossible, and the scenario doesn’t make sense: from a false premise, anything can logically follow.
If we’re in a universe that allows a perfect predictor (i.e. one which violates unidirectional causality), then the two box argument is wrong.
If you’re somehow going to allow the predictor to exist in our universe, then you have to allow him to actually predict as specified by the OP – i.e. he’s not predicting your choice “before” the boxes are “set,” he’s predicting your choice after, and the two box argument is still wrong, since it’s predicated on a non-existent time when you can change your mind and “fool” the predictor.
Or else you can ignore the logical arguments altogether, and point out that empirically, the outcome is always what the predictor says, and the two-box argument is STILL wrong.
That last is the most powerful argument of all, I think. Assuming we’re agreeing that “wrong” is “getting less money than you could have,” it’s pretty hard to argue with the experimental results.
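To make the empirical point concrete, here is a quick Monte Carlo sketch (my own toy model; the 99.9% accuracy figure is an assumption, not part of the OP):

```python
import random

# Toy model: box B gets $1M only when the Predictor reads the chooser as a one-boxer.
def average_payoff(takes_both, accuracy=0.999, trials=100_000):
    total = 0
    for _ in range(trials):
        misread = random.random() > accuracy
        predicted_both = (not takes_both) if misread else takes_both
        box_b = 0 if predicted_both else 1_000_000
        total += box_b + (1_000 if takes_both else 0)
    return total / trials

print("average one-box payoff:", average_payoff(takes_both=False))  # ~ $999,000
print("average two-box payoff:", average_payoff(takes_both=True))   # ~ $2,000
```

The one-boxers walk away with roughly a thousand times more money on average, which is the experimental result the two-box argument has to explain away.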
Fundamentally, I think the problem here is that the two boxers are trying to say “this is a logical reason why you should choose two boxes”, ignoring the fact that it’s a losing proposition. What they should be arguing, I’d say, is, “The fact that the two-box strategy never produces a winning outcome is proof that the initial scenario is impossible.”
You explained this much better than I tried to in post 125. The hypothetical is a fun thought experiment, but it’s a cheat, a non sequitur. The Predictor is a supernatural entity, whose supernatural traits are hand-waved away by decree. He’s not magical or God, we are told–he’s just never, ever wrong. Adding “but he could be wrong” is the cheat, the attempt to make the non sequitur disappear.
The thought experiment can exist in this universe, with no need for backward causality. It works much better if you ignore the brackets in the OP: “Nearly infallible”, rather than “(Nearly) infallible”. Set his predictive ability arbitrarily high (anywhere from 51% to 99.9999999% accuracy will work, and will probably create the same two tribes).
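That “51% and up” range checks out arithmetically; here is a small break-even sketch (mine, not from the OP) showing where one-boxing starts to pay more in expectation:

```python
# One-box EV:  p * 1_000_000
# Two-box EV:  1_000 + (1 - p) * 1_000_000
# They are equal when 2p * 1_000_000 = 1_001_000, i.e. p = 0.5005.
break_even = 1_001_000 / 2_000_000
print(f"break-even accuracy: {break_even:.4%}")   # 50.0500%

for p in (0.51, 0.75, 0.999999999):
    one_box = p * 1_000_000
    two_box = 1_000 + (1 - p) * 1_000_000
    print(f"p={p}: one-box EV ${one_box:,.0f} vs two-box EV ${two_box:,.0f}")
```

Anything above about 50.05% accuracy already favors taking only box B in expectation, so 51% is comfortably inside the range.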
The important thing you need to understand to be a two-boxer is that the Predictor works by being an amazing judge of your character, and bases all his estimates on stimuli available to him at the time he puts the money in the box. He cannot actually see into the future, but he can read people’s character to a ridiculously high degree.
As such, if he puts the money in the box at 16:00 and you choose at 17:00, any number of people could enter the room in the intervening hour and check the boxes. If they see $1M they would think to themselves, “Ah, he’s almost certainly going to take just this box, but he would be better off taking both.” And they would be right. There is no paradox, just a very counter-intuitive solution, where the people who make the “wrong” decision get rewarded heavily in advance.
A question for 1-boxers. Say I had a time machine; you pick the box at 17:00 and find it has $1M in it. I then send you back in time to 16:30. The $1M is already in the box; when it comes back around to 17:00, what do you do now?
So, your solution to avoid causality violation is to induce causality violation?
By the way, my answer is: sure, that works the first N times. Then the predictor realizes his prediction percentage has fallen way under 50%, and the general solution (as opposed to the solution in one specific case, and did I mention it was just the one time?) violates the terms of the original problem again.
This is why the 2 box argument is simply denying the omniscience of the Predictor (which I agree is illogical, but is one of the rules that we must follow). According to the rules, the Predictor would see that you are going to use a time machine and try to one-up him and take both boxes. And when he sees you will take both boxes, there is no $1M.
And those people seeing money in both boxes would be wrong to say that the chooser should take both boxes. Why? Because if the chooser was so inclined, there wouldn’t be money in both boxes in the first place. So the answer that the chooser *should* take both boxes is wrong, because by having that mindset, you don’t allow the $1M in the first place.
Yup. This is exactly where I stand on the issue. The paradox is real, and it provides evidence that a perfect predictor is impossible.
To everyone who keeps saying “there is no paradox”: all you are doing is restating the logic of the one-boxer, over and over and over and over again. I have not seen ANYONE successfully refute the 2-boxer logic. It simply cannot be done. That’s why this is a paradox. You seem to think that just restating the 1-boxer logic somehow refutes the 2-boxer logic, but it doesn’t.