A man was walking along the street one day when he looked down and saw these numbers painted on the ground. What number should replace “xx”?
16 06 68 88 xx
Samarm, good solution, but how do you explain the missing dollar? The guys paid $9 each, or $27, for the room; the bell boy has $2; that’s $29. Where’s the missing dollar?
There is no missing dollar. My earlier post broke down the answer.
Here are two more ways of writing the same answer:
The students ended up paying $27.
The desk clerk got $25 of it.
The bell boy got $2 of it.
OR
The students have $3.
The bell boy has $2.
The desk clerk has $25.
Add them up = $30.
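If it helps, here’s the same bookkeeping as a quick Python check (the dollar figures are from the puzzle; the variable names are just mine). The point is that the $27 the students paid already contains the bell boy’s $2, so adding the $2 to the $27 counts it twice:

```python
paid_by_students = 27   # $9 each from the three students
kept_by_clerk    = 25   # what the room actually cost after the $5 refund
kept_by_bellboy  = 2    # pocketed out of that refund
kept_by_students = 3    # $1 handed back to each student

# The $27 splits into the clerk's $25 plus the bell boy's $2...
assert paid_by_students == kept_by_clerk + kept_by_bellboy
# ...and everything together still accounts for the original $30.
assert kept_by_students + kept_by_bellboy + kept_by_clerk == 30
print("no dollar is missing")
```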
It sounds like you folks are making the miller’s daughter problem more complex than it needs to be. All she needs to do is reach in and grab both stones, and open her hand for everyone to see.
But keeping it in the spirit of how the original problem listed them, shouldn’t the answer be L8 (you can’t really get a perfect upside-down 7)?
You mean, this?
“It seems to me that the given reasoning about the expected value of switching would only apply to problems like “You have $100 and may, if you like, flip a coin to double or halve your money. Is it a good bet?” As written, the problem is not equivalent.”
I agree that the problem is not equivalent. But how is it not equivalent? Why is it not equivalent?
Game 1:
I pick a number. I write it on a piece of paper. I then write double that number on another piece of paper. You then choose one of the pieces of paper at random, look at the number, and can either keep it or switch to the other paper.
Game 2:
I pick a number. You then look at it. You then either keep it, or switch to twice it or half it, depending on a coin flip.
These two games are not the same, yet they are very similar. And I’m still waiting for a clear explanation as to why what turns out to be a winning strategy in game 2 is not a winning strategy in game 1.
For further reflection…
Game 3:
I pick a number. I write it down on piece of paper A. I then flip a coin, and based on the coin flip, write double or half the number on piece of paper B. You look at piece of paper A and can either keep or switch.
Is this game equivalent to game 1 or game 2 or neither? Why? If it’s not equivalent to game 1, why not?
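For anyone who wants to poke at the numbers, here’s a rough Python simulation of the three games. The specifics are my own assumptions, not part of the puzzle: the host’s number is drawn uniformly from 1 to 100, and we compare “always keep” against “always switch”. It’s not a proof, just a sketch of where the 25% shows up and where it doesn’t:

```python
import random

TRIALS = 100_000

def game1():
    # Host writes n and 2n; the player picks one paper at random.
    n = random.randint(1, 100)
    papers = [n, 2 * n]
    first = random.randrange(2)
    return papers[first], papers[1 - first]   # (keep, switch)

def game2():
    # Player sees the host's number; switching doubles or halves it on a coin flip.
    n = random.randint(1, 100)
    return n, (2 * n if random.random() < 0.5 else n / 2)

def game3():
    # Host writes n on paper A, then 2n or n/2 on paper B after a coin flip.
    # The player always looks at A, so the payoffs work out like game 2.
    n = random.randint(1, 100)
    return n, (2 * n if random.random() < 0.5 else n / 2)

for name, game in [("Game 1", game1), ("Game 2", game2), ("Game 3", game3)]:
    keep = switch = 0
    for _ in range(TRIALS):
        k, s = game()
        keep += k
        switch += s
    print(f"{name}: always-keep avg = {keep / TRIALS:.2f}, "
          f"always-switch avg = {switch / TRIALS:.2f}")
```

On those assumptions, Games 2 and 3 show switching paying roughly 25% more on average, while in Game 1 the two averages come out the same.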
Actually, Game 2 would also have a guaranteed 25% extra payout, so in terms of payout they would be identical.
Also, I have been thinking of the pirate problem and I have managed to get a completely different answer.
Pirate 10 says he keeps all the money and none of the pirates disagree except 9.
Reasoning:
Assume 1 pirate: he would state that he gets all the money.
Assume 2 pirates: pirate 2 would give himself 100, and pirate 1 can’t do anything about it.
Assume 3 pirates: pirate 3 says he gets all the money. Pirate 2 would say no, but pirate 1 gets the same amount of money either way and, since pirates value life, he is bound to say yes.
Assume n pirates: pirate n-1 says no, but all the other pirates say yes, since they would have received no money in the first place anyway.
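If anyone wants to see that induction cranked through mechanically, here’s a rough Python sketch. Two of the assumptions are mine, not the puzzle’s: a proposal passes with at least half the votes (proposer included), and a pirate who gets the same amount either way votes yes, as in the reasoning above:

```python
def pirate_shares(n_pirates, coins=100):
    # Backward induction, pirates numbered 1 (lowest rank) .. n (the proposer).
    # Assumptions: a proposal passes with at least half the votes (proposer
    # included), and a pirate who gets the same amount either way votes yes.
    allocation = {1: coins}                  # one pirate keeps everything
    for n in range(2, n_pirates + 1):
        fallback = allocation                # payoffs if this proposer walks the plank
        proposal = {p: 0 for p in range(1, n + 1)}
        proposal[n] = coins                  # the proposer tries to keep it all
        yes = 1                              # his own vote
        for p in range(1, n):
            if proposal[p] >= fallback[p]:   # indifferent pirates vote yes
                yes += 1
        assert 2 * yes >= n, "proposal would fail under these assumptions"
        allocation = proposal
    return allocation

print(pirate_shares(10))   # pirate 10 keeps all 100; only pirate 9 objects
```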
Pirates are ranked top to bottom, from A-J.
Pirate J is first up to bat.
His proposal is that each pirate be given ten doubloons and then pay one doubloon to each pirate who outranks him.
Pirate J gets 10, gives up 9. Ends with 1.
Pirate I gets 11, gives up 8. Ends with 3.
Pirate H gets 12, gives up 7. Ends with 5.
Pirate G gets 13, gives up 6. Ends with 7.
Pirate F gets 14, gives up 5. Ends with 9.
Pirate E gets 15, gives up 4. Ends with 11.
Pirate D gets 16, gives up 3. Ends with 13.
Pirate C gets 17, gives up 2. Ends with 15.
Pirate B gets 18, gives up 1. Ends with 17.
Pirate A gets 19, gives up 0. Ends with 19.
Anyone getting more than 10 will surely be happy to get more than his “fair share.” And Pirate J doesn’t feel like swimming, so he’ll be happy to stay alive.
6-4 in favor. Maybe even 10-0 in favor, as pirates certainly understand the concept of paying “protection.”
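Just to double-check the arithmetic on that scheme, here’s a quick Python tally (the rank letters and the ten-doubloon deal are from the proposal above; everything else is mine):

```python
ranks = "ABCDEFGHIJ"          # A outranks B outranks ... outranks J

net = {}
for i, pirate in enumerate(ranks):
    dealt    = 10                      # every pirate is handed ten doubloons
    pays     = i                       # one doubloon to each pirate above him
    receives = len(ranks) - 1 - i      # one doubloon from each pirate below him
    net[pirate] = dealt - pays + receives

print(net)                                      # {'A': 19, 'B': 17, ..., 'J': 1}
print(sum(net.values()))                        # 100 -- no doubloon goes missing
print(sum(1 for v in net.values() if v > 10))   # 5 pirates beat the even split; add J and it's 6-4
```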
So what’s the answer to the miller’s daughter, black stone question?
You didn’t see it?
She pulls a stone out and makes it unavailable for viewing by the crowd (I envision it being a pebble small enough for her to swallow). She then pulls out the other black stone, and displays it to all and sundry, “proving” that her original pick was the white one.
The only way the mayor can contradict her is to admit that he rigged the whole thing in his favor.
There are these three people who go to a hotel and get a room at the Hotel -Gry. The room costs $30/night
…d&r…
Consider: if the piece of paper you choose has $0.99 written on it, would you swap? Seeing the number can affect the chances that it is the higher one.
You have to consider, as best you can, how the person choosing the numbers chose them, since it’s not specified how it’s done, which brings in second-guessing him, etc.
Most people say ‘chosen randomly’, which doesn’t help, since, say, a 50/50 chance of $1 or $2 is technically random; what they normally mean is ‘each number is equally likely’, which is impossible over an unbounded range.
BTW I don’t get it either, but I thought this might help.
Also the midget’s wife held an umbrella over his head to keep the rain off.
kaylasdad99,
Wouldn’t the converse also be true? Without using deception she could get the mayor to draw the pebble (thereby losing, as he would draw a black stone).
Considering the two-envelope paradox:
As Shade has alluded to, a lot does depend on how the numbers were chosen for the envelopes. While certain cases can be explained, it’s not clear (to me anyways) if the essential problem has been solved.
It’s fairly easy to see that for any bounded distribution, there’s no problem – given N in Envelope A, the probability of B = 2N will not be 50% for all N.
Of course, you’ll just say you pick the amounts of A and B to be any positive number. The problem here is that this is a uniform distribution over an infinite range. Its (normalized) integral over all values is > 1 (it’s infinite), meaning it’s not a proper probability distribution. You can’t apply the rules of probability to it and expect them to work. I’m not exactly sure how to explain it simply; I think the problem is that you can’t derive a particular expectation from it.
However, there are still distributions for which you can get the paradox. In particular, you can have a decreasing distribution (so that its integral over all x is 1) in which the probabilities of the neighbouring values 2x and 0.5x are sufficiently close to each other that the expected gain from ‘switching’ is still greater than 1.
This approach is taken from David Chalmers’ paper The Two-Envelope Paradox: A Complete Analysis?. He notes that in the paradoxical distributions, the expectation of the value in either envelope is infinite. It’s an important point, which might resolve the paradox. While in the original paper he felt it was enough to leave it at “infinity’s, like, weird, you know”, he did later expand on it – see the note at the top of that paper about his “St. Petersburg Two-Envelope Paradox”. I’m still not entirely sold on it, though.
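To make that concrete, here’s a small Python sketch of one such distribution. I’m using Broome’s standard example as a stand-in (I don’t have Chalmers’s exact construction to hand): the pair of envelopes is (2^n, 2^(n+1)) with probability (1/3)(2/3)^n. Whatever amount you see, the conditional expectation of the other envelope comes out strictly larger, even though the unconditional expectation of either envelope is infinite:

```python
from fractions import Fraction

def pair_prob(n):
    # P(the envelope pair is (2**n, 2**(n+1))), for n = 0, 1, 2, ...
    return Fraction(1, 3) * Fraction(2, 3) ** n

# Suppose you open an envelope and see 2**k (k >= 1).  You are holding either
# the small half of pair k or the large half of pair k-1, each with chance 1/2
# within its pair.
for k in range(1, 6):
    seen = 2 ** k
    w_small = pair_prob(k) / 2        # weight: yours is the small half of pair k
    w_large = pair_prob(k - 1) / 2    # weight: yours is the large half of pair k-1
    p_double = w_small / (w_small + w_large)
    expected_other = p_double * (2 * seen) + (1 - p_double) * (seen // 2)
    print(f"see {seen:3d}: P(other = 2x) = {float(p_double):.2f}, "
          f"E[other] = {float(expected_other):.2f}")
```

The printout gives P(other = 2x) = 0.40 and E[other] = 1.1 times whatever you saw, so ‘switch’ always looks better – exactly the paradoxical behaviour, and only possible because the expectations involved are infinite.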
An interesting similar variant is the Kraitchik paradox, presented as a wager:
Two people bet on who has less money in their wallets. Whoever wins gets the money in both wallets. Both players will reason: “If I lose, I only lose the money in my wallet. But if I win, I will win more money than I have in my wallet.” So both will take the bet.
p.s. I bet the miller’s daughter ate the missing dollar.
Sorry, I’m not following your objection. Hers is the task to draw the white stone and save both her virtue and her father’s mill. Given that she already knows of the mayor’s duplicity, she has no motive to employ any stratagem to allow him to get away with it.
1, 4, 9, 16, 25, 36, 49, 64, 81, 100. The square numbers.
Each locker gets toggled once by every student whose number is a factor of the locker’s number. Thus, if a locker starts closed and ends open, an odd number of students must have switched it. The only numbers with an odd number of factors are the square numbers. Tada!
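And a brute-force check for the skeptical, assuming the usual setup (100 lockers, all starting closed, with student k toggling every k-th locker):

```python
N = 100
is_open = [False] * (N + 1)        # index 0 unused; every locker starts closed

for student in range(1, N + 1):
    for locker in range(student, N + 1, student):
        is_open[locker] = not is_open[locker]   # student k toggles every k-th locker

print([k for k in range(1, N + 1) if is_open[k]])
# [1, 4, 9, 16, 25, 36, 49, 64, 81, 100] -- exactly the squares
```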