Logic Puzzles

RE: the paper and the strangeness thereof. If there are no accepted solutions or explanations, what are some of the controversial ones? That problem is making my head spin.

I believe I have the solution to the pirate problem. Call P1 the first pirate to go and P10 the last. P1 offers this arrangement:
90 to P1 himself, 1 to P3, 2 to P5, 3 to P7, and 4 to P9. All those that receive coins vote yes. Reasoning:

If it came down to eight pirates swimming, leaving only P9 and P10, P9 would propose 100 for himself and zero for P10. The 1-1 tie vote would make it so.

P8 knows this… so if he had a chance he would offer 1 to P10, nothing to P9, and 99 for himself. P10 would vote yes, since it is either that or get nothing from P9’s plan.

P7 knows this… so if he had a chance, he would offer 1 to P9, nothing to P8 and P10, and 99 for himself. P9 would vote for this, since it is either get 1 coin under P7’s plan or get nothing under P8’s.

P6 knows this… so if he had a chance, he would offer 2 to P10, 1 to P8, and 97 for himself. P10 would vote yes since it would beat what P7-9 would offer him. P8 would vote yes, as it would give him 1 coin as opposed to zero under P7’s plan.

P5 knows this… so if he had a chance, he would offer 2 to P9, 1 to P7, and 97 for himself. P9 would go for it, as he would only get 1 under P7’s plan; P7 would go for it because he would get nothing under P6’s plan.

P4 knows this… so if he had a chance, he would offer 3 to P10, 2 to P8, 1 to P6, and 94 for himself. P10 knows that pirates P5-9 would give him less, so he votes yes; P8 gets 2 as opposed to the 1 that P6 would offer; and P6 votes yes because it avoids getting zero from P5.

P3 knows this… so if he had a chance he would offer 3 to P9, 2 to P7, 1 to P5, and 94 for himself. P9 votes yes as this gives him 1 more coin than P5 would offer, ditto for P7. P5 would go for it rather than get stiffed by P4’s plan.

P2 knows this… so if he had a chance he would offer 4 to P10, 3 to P8, 2 to P6, 1 to P4, and keep 90 for himself. P10, P8, and P6 would make more in this scheme than any later scheme and vote yes, as would P4 who would otherwise get stiffed by P3.

But… none of them will have that chance. P1 knows all of this. He keeps 90 for himself, 4 to P8, 3 to P6, 2 to P4, and 1 to P2. The others receiving coins know that they would fare worse from succeeding pirates so they all vote yes, the vote is 5-5. Make it so.

arrgh, I messed up the last paragraph. As corrected, it should read:

But… none of them will have that chance. P1 knows all of this. He keeps 90 for himself, 4 to P9, 3 to P7, 2 to P5, and 1 to P3. The others receiving coins know that they would fare worse from succeeding pirates, so they all vote yes; the vote is 5-5. Make it so.
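For comparison, here is a minimal backward-induction sketch of the textbook variant of this puzzle (an added illustration, not the escalating-bribe scheme above): it assumes a proposal passes on a tie, the proposer votes for himself, and a pirate votes yes for any strictly better payoff, so single-coin bribes suffice.

```python
# Backward induction for the pirate game, textbook assumptions: tie
# passes, proposer votes yes, and a pirate accepts any strict improvement.
def pirate_split(n_pirates=10, coins=100):
    allocation = [coins, 0]            # two pirates left: senior takes all
    for k in range(3, n_pirates + 1):
        needed = (k + 1) // 2 - 1      # yes votes needed beyond the proposer
        prev = allocation              # split in the (k - 1)-pirate game
        new = [0] * k
        # pirate i in this game becomes pirate i - 1 once the proposer is
        # thrown over; bribe the cheapest pirates with one coin more than
        # their fallback payoff
        cheapest = sorted(range(1, k), key=lambda i: prev[i - 1])[:needed]
        for i in cheapest:
            new[i] = prev[i - 1] + 1
        new[0] = coins - sum(new[1:])
        allocation = new
    return allocation

print(pirate_split())   # [96, 0, 1, 0, 1, 0, 1, 0, 1, 0]
```

Under these stingier assumptions P1 can keep 96, bribing P3, P5, P7, and P9 with one coin each for the same 5-5 vote; the more generous 90/4/3/2/1 split above also wins 5-5, since each bribed pirate still beats his fallback.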

Re MaxTheVool’s double-or-half problem:

It seems to me that the given reasoning about the expected value of switching would only apply to problems like “You have $100 and may, if you like, flip a coin to double or halve your money. Is it a good bet?” As written, the problem is not equivalent.

Suppose the amounts are called S and L, and list the possible events, plugging in some values to make a specific example:

Draw S and hold it - value $100
Draw S and switch - value $200
Draw L and hold it - value $200
Draw L and switch - value $100

Since you don’t know that we used $100 and $200 (it might be $50 and $100), you have no way of knowing whether $100 is S or L. So our table collapses to:

Draw <something> and hold it - expected value $150
Draw <something> and switch - expected value $150

Looks like an even bet to me.
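The collapsed table can be checked with a quick simulation (amounts assumed to be $100 and $200, as in the worked example above): always holding and always switching come out the same.

```python
import random

# Simulate the two-paper bet: one paper holds $100, the other $200.
random.seed(0)
amounts = (100, 200)
trials = 100_000
hold = switch = 0
for _ in range(trials):
    pick = random.randrange(2)      # draw a paper at random
    hold += amounts[pick]           # value if we keep it
    switch += amounts[1 - pick]     # value if we trade for the other
print(hold / trials, switch / trials)   # both hover around $150
```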

MSU, damnit, I’ve been trying for the last hour to find a fault with that solution and I freakin’ can’t. I give up. I was convinced that erislover’s method of looking at it from top down was going to prove fruitless.

Damn, I hate to be wrong.

Kudos to you, MSU 1978, for stumping me and also props to you, eris, for being the first one to put forth the top-down method.

Don’t need any weighings! Zero blocks are “aluminimum”.

(unless you are the current President of the United States :wink: )

Or perhaps shivering from proximity to a massive amount of recently purchased frozen non-sex-toy vegetables?

(looks left and right, ducks, and runs)

All that you’re saying is true. Trouble is, you’re not answering my question. It’s a bit of an odd question, because I’m not really asking for an answer or a proof. What I’m saying is “here’s a chain of reasoning. It leads to an obviously false result. Where’s the flaw in this chain of reasoning?” You’ve provided a different set of arguments about the same topic, but that doesn’t point out the flaw.

To make things very precise, here’s the chain of reasoning:

(1) There are two papers. On one of them is a number, and on the other is a number twice the first number.
(2) I choose a paper at random.
(3) I therefore have a 50% chance of having the higher of the two numbers, and a 50% chance of having the lower.
(4) I look at the number. Call it N. I now realize that the other paper must have either 2*N or N/2 on it.
(5) Some quick arithmetic tells me that my expected payoff if I switch papers is 1.25 * N dollars, as compared with N dollars if I don’t.
(6) So I switch.
(7) But this chain of reasoning applies no matter what N is.
(8) Therefore I always come out ahead by switching.
(9) Therefore, it is a better strategy to pick a paper at random and take the other than it is to pick a paper at random.
(10) Which is ridiculous.
Where is the flaw?
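One way to probe the chain is to fix a concrete prior and enumerate. In the hypothetical setup below (the smaller amount is $50 or $100 with equal probability, each paper drawn with equal probability; my example, not part of the original puzzle), step (5) is actually valid when N = 100, but it fails when N = 50 or N = 200, and switching gains nothing overall, which suggests the break is at step (7): the reasoning does not apply for every N at once.

```python
from fractions import Fraction
from itertools import product

# Four equally likely (held, other) outcomes under the assumed prior:
# the smaller amount is 50 or 100, and either paper may be drawn.
outcomes = []
for small, pick_small in product((50, 100), (True, False)):
    held = small if pick_small else 2 * small
    other = 2 * small if pick_small else small
    outcomes.append((held, other))

# Conditional on seeing $100, switching really is worth 1.25 * N...
seen_100 = [other for held, other in outcomes if held == 100]
print(Fraction(sum(seen_100), len(seen_100)))     # 125

# ...but seeing $200 means the other paper is certainly $100, and over
# all outcomes holding and switching are worth exactly the same.
print(sum(h for h, _ in outcomes), sum(o for _, o in outcomes))   # 450 450
```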

Ultrafilter

The miller’s daughter draws a stone from the bag, and then tells the mayor to show everyone the stone that remains in the bag.

Ooops. Sorry. Page two. [blush]

The flaw is that it is not obviously ridiculous. You estimate your payoff based on your knowledge of the situation. You expect a 25% higher payoff because you lack the knowledge to know whether the other is higher or lower.

Responses to Ultrafilter’s questions.

  1. In addition to the pronoun “he”, depending on your definition of ‘word’ it could also be “he-he” when used to represent someone laughing.

  2. The daughter could pick out the black stone, cause some sort of commotion, and throw the stone off to the side. She then tells the evil mayor she’s lost the stone she originally selected before noting its colour; however, if he looks in the bag he can see the opposite of what she chose. Whaddaya know, it’s black! She must have selected the white one.

  3. Man, I loved Zelda. I’d say yes. Red items are generally good for the hero, blue less so.

Chit! I’m now the third person to say “Whoops, didn’t see the second page!”

Shalmanese:

I thought of that too. But the problem is, it really only applies after you have seen the number on one paper.

The chain of reasoning, however, leads to the conclusion that even before you see either number, it is a better strategy to pick one, then switch, than it is to just pick one. This is clearly ridiculous.

However, you do bring up a good point. Maybe the flaw is in the very fact that the benefit of switching is determined after seeing one number, and then applied to the situation before seeing either number. In other words, maybe it is always better to switch, but only after you see a number. So it would not be better to pick one at random and then switch without looking at the first one, because then you would have better odds if you switched back to the first one.

I don’t really have much faith in this answer though…

As Shalmanese points out, you don’t know enough to use the original reasoning. If you step back a little and look at the problem from outside, you’ll see that as valid as that reasoning may be, it just doesn’t apply. The flaw is using your procedure in the wrong kind of situation.

Take another look at the first paragraph of my original reply.

The miller’s daughter grabs the stone and holds it covered in her hand. She is too afraid to look at the stone, so she has the mayor pull out the ‘other’ stone and show it to the crowd. Since it is black, her stone must be white.

On the twelve coins: when the problem was presented to me, it was billiard balls, and one was heavier OR lighter. Since you had to determine whether the object was heavier or lighter, you can’t go to 13, as you won’t always know whether it’s heavier or lighter.
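A counting sketch of that 13-ball obstruction (an added argument under the assumed rules: three weighings, find the odd ball and say whether it is heavy or light):

```python
# Each weighing has three outcomes (left heavy, right heavy, balance),
# so three weighings distinguish at most 3**3 = 27 cases; 12 balls that
# could each be heavy or light give only 2 * 12 = 24 cases, which fits.
print(3 ** 3, 2 * 12)   # 27 24

# With 13 balls there are 26 cases, still under 27, but a first weighing
# of k vs k splits the cases into 2k, 2k, and 26 - 4k, and each branch
# must fit in the 9 cases the two remaining weighings can resolve.
feasible = [k for k in range(1, 7) if 2 * k <= 9 and 26 - 4 * k <= 9]
print(feasible)   # [] -- no first weighing works, so 13 balls fail
```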

Have you heard the one about the three students who are heading back to school, but their car breaks down? They find a hotel and share a room that the desk clerk says costs $30. The students each pitch in $10 and get the room. Later the clerk realizes that he charged the weekend rate instead of the weekday rate of $25. The clerk tells the bell boy to give the students $5. The bell boy is lazy and doesn’t want to deal with the change, so he keeps $2 and gives each student $1. Now, the students each paid $9 for $27, plus $2 for the bell boy equals $29. What happened to the extra dollar?

  1. $30 was paid by the guys to the desk clerk.

  2. The desk clerk gave the bellhop $5 to give back to them.

  3. The desk clerk now has $25.

  4. The bellhop gives $3 of the $5 back to the guys, leaving himself with $2.

  5. The guys now have $3.

$25 + $2 + $3 = $30
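The same accounting, restated as arithmetic (a minimal sketch): the riddle adds the bellboy’s $2 to the $27 when it should be subtracted, because the $27 already contains it.

```python
paid = 3 * 9                # students are out $27 after their $1 refunds
clerk, bellboy = 25, 2      # where that $27 actually went
print(paid)                 # 27
print(clerk + bellboy)      # 27 -- the $27 splits into $25 + $2
print(paid + 3)             # 30 -- add the $3 refunded, not the $2 kept
```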

Will someone explain how they got 29 minutes to the first puzzle?

I don’t get it. :frowning:

The number of cells doubles each minute.

By 30 minutes, the dish is full.

Therefore the dish must have been half-full 1 minute earlier - at 29 minutes.

Hope this helps.
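The same step in code, under the stated assumptions (one cell at minute 0, doubling every minute, dish full at minute 30):

```python
# Population at minute t is 2**t cells; the dish is full at minute 30,
# so it must hold half that number exactly one minute earlier.
full = 2 ** 30
half_time = next(t for t in range(31) if 2 ** t == full // 2)
print(half_time)   # 29
```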

:smack:

It helps a great deal, thank you.

As soon as I read the answer it made perfect sense.
Now I feel somewhat stupid.