But I don’t know I’m the simulation. I might be me.
I’ve no idea. It doesn’t matter. If I had to guess, it’d probably be that those 1000 people were carefully chosen and/or coached to take both boxes, since that would mean the people running the experiment never have to give away a million dollars, which I imagine they don’t want to do.
Monty Hall wanted to give away a few cars. These people want to give away a few million. Dunno why, and they seem squicky to me, but I think they will sometimes give away $1M.
One box. A thousand bucks is not chicken feed, and it’s pretty good payment for one minute’s work.
But if they’ve done this experiment on 1000 people, then even if they all two-box, they’d still have given away a million dollars! I think you just have to assume the people running the experiment have very idiosyncratic motives for doing so and don’t care how much they end up giving away; often these kinds of thought experiments involve “eccentric billionaires”.
I see we’ve done this before
There were already 5 links in the very first reply, including this one.
Yeah, but if they had predicted one box each time, they would have given away a BILLION dollars.
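For what it’s worth, here’s the back-of-the-envelope arithmetic behind that exchange. The policy labels are mine, and it assumes every subject’s choice matches the prediction (a perfect “success rate”):

```python
# Totals over 1,000 trials under two extreme prediction policies, assuming
# every subject's choice matches the prediction.
trials = 1_000
always_two_boxes = trials * 1_000       # opaque box left empty; everyone takes $1,000
always_one_box = trials * 1_000_000     # opaque box filled; everyone takes the $1M

print(f"always predict two boxes: ${always_two_boxes:,}")  # $1,000,000
print(f"always predict one box:   ${always_one_box:,}")    # $1,000,000,000
```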
People go straight to supernatural explanations, or superintelligent computers perfectly predicting the future, or the universe being a simulation. But by far the most obvious explanation is that it always predicts both boxes, because it doesn’t want to give away a million dollars. Getting a 100% success rate is easily explained by coaching. It wouldn’t take much coaching; simply explaining that the $1,000 is theirs to keep, and that there is no downside to taking it, would do.
So, considering this: I would be data driven. I don’t need to know how it works, and I don’t need to accept its “predictions” claim. The observed data over a thousand trials is that those who picked two boxes got $1,000 and those who picked just the opaque box got $1,000,000. That pattern may suddenly change, but I’d go with it.
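To make the data-driven reading concrete, here’s a minimal expected-value sketch. Everything in it beyond the two payoff amounts is an illustrative assumption on my part: the accuracy variable p would be estimated from the thousand observed trials, and the break-even figure just falls out of the payoff arithmetic.

```python
# Expected-value comparison for one-boxing vs. two-boxing, treating the
# observed trials as an estimate of the predictor's accuracy p.

def expected_values(p: float) -> tuple[float, float]:
    """Return (EV of taking one box, EV of taking both) at accuracy p."""
    one_box = p * 1_000_000                # paid only if it foresaw one-boxing
    two_box = 1_000 + (1 - p) * 1_000_000  # $1,000 always, plus $1M if it erred
    return one_box, two_box

for p in (0.5, 0.6, 0.9, 0.999):
    one, two = expected_values(p)
    print(f"p = {p:.3f}: one box ${one:,.0f}, two boxes ${two:,.0f}")

# One-boxing comes out ahead whenever p > 0.5005, and a 1000-for-1000
# observed record suggests p is far above that threshold.
```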
Again, this is just mistaking correlation for causation. Everyone who owns a Bugatti Veyron has a net worth over 10 million dollars. So if I want to make at least 10 million dollars, all I need to do is obtain a Bugatti Veyron, and that will cause me to earn 10 million dollars or more.
It is the observed association, and it’s unlikely to be reverse causation; that’s even less tenable than the supercomputer’s predictive powers.
Except we know it is. It cannot be a causal relationship; the boxes are filled before we come into the room.
What we don’t know is if anyone actually came away with a million bucks. For all we know all 1000 people walked away with a grand and no one walked away with a million. Given how this would be a good outcome for the people organizing the experiment, this seems very likely.
I put forth a similar possibility and was assured that no, we have data that it has been “correct” in both cases a significant number of times.
But what I’m trying to point out is that, unless the computer has precognition, no choice you make can possibly affect the result.
It really isn’t clear to me why anybody’s arguing that it can.
You Cannot Affect What’s In The Boxes By How Many You Take, unless the computer is using precognition rather than any analysis of your past history. The problem as stated is just nonsensical.
Maybe I can clear it up: nobody is.
I have the evidence of one thousand people being in the same situation before me. I’m not going to reject the evidence I have witnessed because I don’t understand the process that produced those results.
I have observed that everybody who made choice #1 gained $1,000 and that everybody who made choice #2 gained $1,000,000. I’m going to follow the evidence and make choice #2.
Sorry, I only scanned answers after the first 20 or so, but it doesn’t seem like my question has been answered.
According to the scenario presented, the computer will put $1 million into the unopened box if it knows you will choose one box. It will put $1,000 into the open box:
(Bolding mine.) So why on earth would I not follow this scenario:
Before entering, I say: “Hey computer, I am going to take ONE box.” And I mean it.
Then I go in, disregard the open box containing $1,000, and take one box: the unopened box, which I know contains $1 million, because per the OP:
Or did the OP mean, “If it predicted you will take the first box (the open one with $1k in it), it will put $1 million into the second”? If so, that changes things.
So why on earth is anybody arguing that how many boxes you take affects how much money you wind up with?
Nobody is arguing that ‘how many boxes you end up deciding to take’ retroactively affects how much money is in the box.
The argument is that the factors that lead you to take either one or two boxes are the same ones the predictor looks at to make its prediction, which is why the prediction is accurate.
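A toy simulation can show how that works without any retrocausation. This is only a sketch of the common-cause idea; the “disposition” variable, the 95% read accuracy, and all the names are my own illustrative assumptions, not anything specified in the problem:

```python
import random

# A hidden disposition drives BOTH the predictor's guess and the player's
# later choice. The boxes are filled strictly before the choice is made,
# so nothing here is retrocausal.

random.seed(1)
trials, correct = 10_000, 0
payouts = {"one-boxer": [], "two-boxer": []}

for _ in range(trials):
    disposition = random.choice(["one-boxer", "two-boxer"])  # the common cause
    # The predictor reads the disposition (imperfectly) and fills the boxes FIRST.
    misread = random.random() > 0.95
    prediction = ({"one-boxer": "two-boxer", "two-boxer": "one-boxer"}[disposition]
                  if misread else disposition)
    opaque = 1_000_000 if prediction == "one-boxer" else 0
    # Only then does the player choose, driven by the same disposition.
    choice = disposition
    payouts[choice].append(opaque if choice == "one-boxer" else opaque + 1_000)
    correct += (prediction == choice)

print(f"prediction accuracy: {correct / trials:.1%}")  # ~95%, no precognition
for kind, vals in payouts.items():
    print(f"{kind}s averaged ${sum(vals) / len(vals):,.0f}")
```

Under these assumptions the one-boxers average close to $950,000 and the two-boxers close to $51,000, even though no choice ever changed what was already in a box.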
Maybe.
It seems to me that some people are saying, “Why would you take two boxes? You should only take the one box; then you’ll get the million.” But I’m too tired right now to go back through the thread looking for examples.