Tricky Logic Puzzle

(Not sure if this is the right forum for this)

There’s a gang of murderous pirates. They are motivated as follows:
Highest priority: They want to live. Above all else, each pirate will never take an action that results in his dying.
Next priority: They are greedy. They will act so as to maximize the amount of gold they end up with.
Third priority: They are bloodthirsty. They will act so as to kill as many of their fellow pirates as possible.

They are also (of course) brilliant logicians with perfect calculating skills, and are all aware of all of the above about each other. They are also totally untrustworthy, and know that none will ever keep his word if it means getting less gold.
Let’s assume that there are P pirates, numbered 1 … P (with 1 being the most senior and P the most junior), and they have G indivisible and interchangeable gold pieces.
They are going to divide up the gold pieces in the following fashion: The juniormost pirate will propose an allocation scheme (i.e., “1 gets 7 gold, 2 gets 5 gold, 3 gets 0 gold, etc.”). Then they will vote on it, each pirate (including the juniormost) voting yea or nay. If a majority vote yea, the scheme is accepted, everyone walks off with their gold, and no one dies. If there is a tie, or if a majority vote nay, the juniormost pirate is killed, and the next juniormost now proposes a scheme, etc.
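The acceptance rule can be pinned down with a tiny sketch (my own illustration, not part of the puzzle); the one point worth encoding is that a tie counts as a rejection:

```python
# Minimal sketch of the acceptance rule: a strict majority of yea votes
# passes the scheme; a tie or a nay majority kills the proposer.

def proposal_passes(votes):
    """votes: dict mapping pirate number -> True (yea) / False (nay)."""
    yeas = sum(votes.values())
    return yeas > len(votes) / 2

# With 4 voters, 2 yeas is a tie, so the scheme fails and the proposer dies.
print(proposal_passes({1: True, 2: True, 3: False, 4: False}))   # False
print(proposal_passes({1: True, 2: True, 3: True, 4: False}))    # True
```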

What happens for various choices of G and P? For instance, G=100 and P=10?

(Note: I have actually left undefined a subtle but important characteristic of pirate behavior. So, assuming my understanding of this problem is correct (which is not necessarily a safe assumption), your first response should be “But Max, what happens when X?”)

What happens if P = 2?

Surely it occurs to the pirates that the more junior members that get killed, the fewer left to share the loot and the more gold per pirate. So #1 has a strong motive to vote down every proposed scheme, while the juniormost pirates have a strong motive to propose a plan that gives them the best chance of survival. You might end up with the guys at the bottom of the totem pole giving up any claim to gold just to live, while the guys in the middle would have a complicated risk/benefit balance of how much gold they can claim and not get killed for it. Assuming all pirates are perfectly logical, there should be some optimum arrangement, but hang me by the scuppers if I can figure it out.

Ok; after some thinking, the best arrangement I can come up with is that P proposes that all the gold be given to the top 51% of the pirates. His fellow 49%ers will be unhappy of course but they’ll be outvoted. I just can’t tell whether the majority might decide there’s a better plan. Plus it just occurred to me that the pirates like killing for its own sake, not just for gold, so all other factors being equal they’ll vote down a proposal just to see guys die.

It would seem clear to me that if P=2, the seniormost pirate will vote against any scheme proposed by the junior pirate. Because the vote would then tie, pirate 1 gets to kill off pirate 2, and Pirate 1 takes all the gold.

That case seems pretty trivial, at least.

And Lumpy’s right – the seniormost pirate always has the most to gain by voting NO (and thus killing off junior pirates – his survival is basically assured, unless the missing characteristic of pirate behavior is that a majority of junior pirates can gang up on senior pirates and kill them).

So, Max – what say ye? Is there ever a way for the seniormost pirate to get killed? If the only way a pirate dies is by being a junior pirate with a scheme that’s voted down, then it seems like Pirate 1 is safe no matter what.

I don’t think this would work. (I assume by “top 51%” you mean seniormost?). At least half of the seniormost pirates would reckon that they could get a lot of juniors killed off by voting NO just as a matter of principle until the numbers get smaller. And thus guarantee higher shares for themselves.

I would argue that for the junior pirate (making the plan), most schemes he comes up with will get him killed, only because the rest of the pirates are going to be trying to get the juniors killed off (while increasing their shares).

At first thought, I would think that Junior’s only chance of survival would be to convince the junior-most 51% to vote for his plan (out of fear of death).

But this will take much more pondering.

Max – here’s a question about pirate behavior (I don’t know if this is the subtle thing you mentioned or not).

If a pirate has two courses of action, which will not only satisfy priority 1 (staying alive), but such that both choices will definitely satisfy priority 2 equally (i.e. they will get the same gold no matter what) – will they also work to maximize priority 3 (killing off the fellows)?

i.e. if they can vote YES on a plan that gives them the known maximum gold, will they go ahead and vote YES, if they know that by voting NO, they will still get the same gold AND get to kill more pirates? (Is satisfying priority 2 enough, or will they try to maximize the satisfaction of ALL priorities simultaneously, while keeping the importance levels intact?)

Ok, so we figured out the case where P=2. If P=3 what happens?

1 will vote against, as we determined that it is always in his best interest.
2 will vote for it, because if he votes against, we’re at P=2 and he’ll die.
3 will vote for it, because if he doesn’t he dies.

The amount of gold shouldn’t matter here, or the specific plan.

If P=4, 2 and 3 should vote against so that the scenario above is in place. Same with P=5 and 6. If P=7 the lowest 4 should vote for it because they’ll all live if they do and die if they don’t. Somebody else can figure out the higher ones or find out where or if I messed up.

P=1: The lone pirate gets all the gold.

P=2: No matter what split #2 proposes, #1 votes against it. Tie vote. #2 is killed. #1 gets all the gold.

P=3: #3 will vote yes on any split to save his own life. #2 will vote yes on any split to avoid the situation where P=2. #3 proposes that he keep all the gold. Vote carries.

P=4: #3 will vote no on any split, to force situation where P=3. #4 will vote yes to save his own life. #4 proposes a split where #1 and #2 each get one gold piece. This buys their votes (if the vote fails, P=3 and they get nothing) and the vote carries.

Above this, I’m not sure. I have to figure out how many votes each pirate has to buy, and whether one gold piece will always be sufficient to buy a vote.

Unless there’s only 1 gold piece to divvy. In which case #4 is f***ed, because he can’t buy both votes (#1 and #2).
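The small cases worked out in this thread can be checked with a short backward-induction sketch. To be clear about my assumptions: a vote costs one gold more than the voter’s fallback (because of the bloodthirstiness tie-breaker), a pirate who would die anyway votes yea for free, and ties between equally cheap voters go to the most senior pirate. That last tie-break is exactly the ambiguity the thread hasn’t settled, so treat the output as reliable only up to P=4, where no such ties arise:

```python
# Backward induction over the number of pirates still alive.
# Assumptions (mine, not settled in the thread): bribe = fallback + 1,
# a pirate who dies in the fallback votes yea for free, and ties between
# equally cheap voters are broken by seniority.

def solve(P, G):
    """Gold each pirate ends with; None marks a pirate who dies."""
    outcome = {1: G}                      # P=1: the lone pirate keeps all
    for n in range(2, P + 1):
        fallback = outcome                # payoffs if this proposal fails

        def cost_of_vote(i):
            if fallback.get(i) is None:   # dies in the fallback: yea for free
                return 0
            return fallback[i] + 1        # must strictly beat the fallback

        need = n // 2                     # votes to buy besides proposer's own
        cheapest = sorted(range(1, n), key=lambda i: (cost_of_vote(i), i))
        bribes = {i: cost_of_vote(i) for i in cheapest[:need]}
        if sum(bribes.values()) <= G:
            outcome = {i: 0 for i in range(1, n)}
            outcome.update(bribes)
            outcome[n] = G - sum(bribes.values())
        else:                             # can't buy enough votes: proposer dies
            outcome = dict(fallback)
            outcome[n] = None
    return outcome

print(solve(2, 100))   # {1: 100, 2: None} -- #2 dies, the cap'n takes all
print(solve(3, 100))   # {1: 0, 2: 0, 3: 100} -- Junior keeps every piece
print(solve(4, 100))   # {1: 1, 2: 1, 3: 0, 4: 98}
```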

I agree with the rest of your assessment (and I find it interesting that when P=2, the cap’n gets all the gold, but when P=3, “Junior” gets every last piece).

Good call on P=4. Should probably ignore the rest of what I said then.

One gold piece will probably not be sufficient to buy a vote. For instance, if P=5, then 1 gold piece each would not buy votes from #1 and #2, because they know they can get 1 gold piece each when P=4 – and if this comes to pass, they get to kill #5 in the process. So at P=5, if we determine that #5 needs to buy votes #1 and #2 (I haven’t reasoned out yet whether this is the case) – it would take at least 2 gold pieces to buy them.

Although #5 has a better shot at buying #3, because if P=4, #3 gets squat.

I think Monstre’s question is answered in the OP.

Following that:

The juniormost pirate just has to propose a split that will buy enough votes. To buy a vote, he has to offer more than that pirate would get if P went to P-1.

P=5: Junior pirate has to buy two votes. He proposes a split where #3 gets one gold piece, and #1 or #2 gets two gold pieces.

P=6: Junior pirate has to by three votes. He proposes a split where #4 gets one gold piece, and two-of-the-top-three get two gold pieces. (#3 will sell his vote because he only gets one gold piece when P=5. #1 and/or #2 will accept because two gold pieces now is better than a chance at two gold pieces later.)

And I think it continues like that. The juniormost pirate offers the pirate two steps above him one gold piece, and two gold pieces to enough of the rest to buy the votes he needs.

Until there aren’t enough gold pieces to go around, and Monstre’s Law kicks in.

All that logic and I can’t spell “buy”.

Robot Arm, that’s the pattern I’m seeing, too.

Pirate P can always buy the vote, given that there’s enough gold.

The number of votes #P needs to buy is: floor(P/2) (and with his own vote, that tips the vote to the majority).
The reason they will buy is that if the scheme goes to P-1 pirates, they will get less gold if and when that scheme succeeds.

Also, I’m coming up with the minimum amount of gold needed to make it work (for “Junior”) as only 1 more gold piece than the number of votes needed to buy. (i.e. floor(P/2) + 1 pieces of gold required). After P=5, there will always be one person that needs to be bought off with 2 gold. And the rest can be bought with 1 gold piece each (because they would have gotten 0 on the last round).
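A quick sanity check of the vote-count arithmetic (the gold formula here is the conjecture from the post above, not a settled result):

```python
# Junior's own yea plus floor(P/2) bought votes makes floor(P/2) + 1 yeas,
# which is always a strict majority of P.

def votes_to_buy(P):
    return P // 2                  # votes Junior must buy besides his own

def is_strict_majority(yeas, P):
    return yeas > P / 2

for P in range(2, 12):
    assert is_strict_majority(votes_to_buy(P) + 1, P)   # buying enough passes
    assert not is_strict_majority(votes_to_buy(P), P)   # one fewer never does

# The conjectured minimum purse: one 2-gold bribe plus 1 gold for the rest.
def conjectured_min_gold(P):
    return P // 2 + 1

print(conjectured_min_gold(10))   # 6 gold pieces for P=10
```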

Of course, the primary problem I see with the pattern is that it is pretty arbitrary, at the higher numbers, whom to offer the “bribes” to. What if a higher-ranked pirate chooses not to accept the 1 gold-piece buyoff, thinking he might luck out and get offered the “2 gold piece” bribe later? (Although chances are that he will get 0 next round.)

Maybe this is the “what if” on pirate behavior? Max, what if a pirate is offered a bribe and knows it’s perfectly likely he will get less gold (i.e. NO gold) at the next round (after killing off Pirate #P). Will he go for the sure thing? Does this satisfy #2 (maximizing gold) – as it is maximizing what you know you can guarantee yourself – anything beyond that point is a “maybe” only.

Pirate P can’t offer the senior pirates just the same amount of gold they’d get for P-1, because of the bloodthirstiness rule. All other things being equal, all pirates want to see P dead. So he has to offer at least 1 gold more than they would get for voting him down. (Gold trumps blood every time).
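That pricing rule is simple enough to state as a one-liner (hypothetical helper name, just to make the “fallback plus one” point concrete):

```python
# Gold strictly trumps blood, so a bribe must beat the voter's fallback
# by at least one piece; matching it isn't enough.

def min_bribe(fallback_gold):
    """Smallest bribe that buys a vote from a pirate who'd otherwise get
    `fallback_gold` (plus the fun of watching the proposer die)."""
    return fallback_gold + 1

print(min_bribe(0))   # 1: pirates who'd get nothing sell for 1 gold
print(min_bribe(1))   # 2: pirates who'd get 1 gold sell for 2
```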

So here’s a little table I did that helps illustrate the pattern I’m talking about.

Tell me what you think, Max. (And Robot Arm).

Now, I think this only works if a pirate will choose a known amount of gold over a “luck of the draw” situation that might yield 0 gold for him instead (in the next lower case). Because the choice of who gets the 2 gold pieces is arbitrary at each level. Those who got 0 in case P-1 should get bribed with 1 gold in case P. ONE of those who got 1 gold in case P-1 will get bribed with 2 gold in case P. The others all get 0. So there are different ways to fill that chart.

If we start at case P, there’s uncertainty at who would get specific bribes at lower levels. BUT, if the pirates getting bribed at case P don’t accept it, there’s a very real possibility they lose the vote and get nothing at P-1.

So will they choose certainty over uncertainty?

Exactly, but I don’t think the offers always have to go to the most senior pirates. You only need a majority, so you need to offer bribes of 1 gold to those who would get nothing in case P-1. And then there’s only one more vote to buy (by my reckoning).

Hmm, that’s not quite the way I see it.

When P=5, either of the top two pirates will only sell his vote for two gold pieces, and #5 needs to buy one of those votes. The way gamblers figure such things, a one-out-of-two chance to get 2 gold pieces has an expected value of 1 gold piece.

When P=6, any of the top three would only sell his vote for two gold pieces (have to offer them more than what they get if P=5), and #6 needs two of those votes. A two-out-of-three chance to get 2 gold pieces is worth 1.333 gold pieces.

When P=7, the top 4 pirates would only sell for two gold pieces, and junior has to buy two of those votes. Two-out-of-four chance at 2 gold pieces equals 1 gold piece.

When P=8, the top 5 pirates would sell for two gold pieces, and junior needs to buy three. Expected value is 1.2

The expected value for the top P-3 pirates never goes below 1 (so you always have to offer 2 to be certain of buying a vote), and never goes above 2 (so no vote will ever cost 3).
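Those expected values are easy to verify; a k-out-of-m shot at a 2-gold bribe is worth 2k/m:

```python
# A k-out-of-m chance at a 2-gold-piece bribe has expected value 2*k/m.

def bribe_ev(k, m):
    """Expected gold from a k-in-m shot at a 2-gold bribe."""
    return 2 * k / m

print(bribe_ev(1, 2))   # P=5: 1.0
print(bribe_ev(2, 3))   # P=6: about 1.333
print(bribe_ev(2, 4))   # P=7: 1.0
print(bribe_ev(3, 5))   # P=8: 1.2
```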

Admittedly, I’m still working on this theory.

How 'bout it, Max, are we getting close?
On preview, I see monstre’s chart. I think the flaw is at P=6: #1 will not sell his vote for just one gold piece, because he can vote no and then have a 50-50 chance at two gold pieces when P=5. In probability terms, that’s equivalent, so the bloodthirstiness rule applies and #6 tells no tales.

That’s why I feel it comes down to that question: will a pirate choose a known amount of definite gold over an unknown possibility of more gold? Or will he let probabilities dictate his choice, even though he might lose the gamble? This is an aspect of pirate behavior I don’t know how to decide. Is this what you were talking about, Max?