New post! And it comes with an activity, just like Highlights! Yay!
I was told there would be no modulo arithmetic on the exam.
As first pirate, I would propose 1/3 split among pirates 1-3 along with killing 4-5.
3: Mmm, good answer. I e-mailed mine to heebie because I wasn't sure if I was playing outside the rules, but yours sounds right.
maybe a 1/3 (or some other split) among pirates 1, 2 and 4 while killing pirates 3 and 5.
I'm mandating that answers have to be submitted in the thread. And the puzzle is still alive! Sorry, Lemmy.
I guess we are to assume that the order is randomized (and completely known or unknown when it is the 1st pirate's turn?) by some fair ungameable system?
I said the first pirate would propose he go 4th (50% of gold, 3 dead pirates). But whether the others would vote to let him live would have to depend on how farsighted they are and what their proposals are, so more of a psychological than logical solution.
Are these perfect-logician pirates and do they know the order in which they have to propose solutions?
I don't think you're allowed gratuitous killing Lemmy, and without it your plan won't work. Pirates 2 and 3 only get one third of the gold, and don't get to see anyone die. But if they vote down the first two plans, they are guaranteed to see two people die and still have a shot at a third of the gold.
I think pirate one has to give half the gold to pirate 2 and half to pirate three and none for himself. It is the only way to ensure a three vote majority. Every other outcome gives the others a chance to see some bloodshed for the same amount of money.
I don't think I understand your question, JP.
Fun! I assume that the pirates do not trust each other (and so can't make agreements). In that case, I think the first pirate gets almost all the gold. Here's my analysis, working backwards:
a. Assume that no agreement is reached before the fifth turn. In that case, the fifth pirate gets all the gold and gets to see all of the others die.
b. Assume that no agreement is reached before the fourth turn. In that case, there is no solution that the fifth pirate will agree to (because there is nothing he would like better than seeing the fourth pirate die *and* keeping all the gold for himself).
c. Assume that no agreement is reached before the third turn. In that case, the third pirate can give himself all 100 pieces of gold. He will vote in favor, and so will the fourth (because the fourth pirate knows that he will die on the fourth turn if no agreement is reached on the third).
d. Assume that no agreement is reached before the second turn. In that case, the second pirate can give himself 98 pieces of gold, and the fourth and fifth pirates one piece each. He will vote in favor, and so will the fourth and fifth pirates (because it's better than what they would get on the third turn).
e. So we come to the first turn. The first pirate can give himself 97 pieces of gold, and then give 1 piece to the third and 2 pieces to the fourth (or the fifth). Each will vote in favor because it is better than what he would get on the second turn.
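For anyone who'd rather let a computer grind through the backward induction in a-e, here's a minimal Python sketch (the helper name is made up). It assumes the reading of the rules used above: fixed proposal order, a strict majority is needed so a tie kills the proposer, and priorities run live > gold > bloodshed. Ties over whom to bribe are broken toward the earlier pirate, so it doesn't settle the "fourth or fifth" question.

```python
def outcome(n, gold=100):
    """For n pirates (index 0 = current proposer), return a list of
    (alive, gold) pairs describing the equilibrium result."""
    if n == 1:
        return [(True, gold)]                    # lone pirate keeps it all
    cont = outcome(n - 1, gold)                  # what happens if this plan fails
    # Minimum bribe for each later pirate's yes vote: free if he dies in the
    # continuation, otherwise one coin more than he'd get there (equal gold
    # means he'd rather vote no and watch the proposer die).
    costs = [0 if not alive else g + 1 for alive, g in cont]
    need = n // 2                                # votes to buy beyond the proposer's own
    cheapest = sorted(range(n - 1), key=lambda i: (costs[i], i))[:need]
    spend = sum(costs[i] for i in cheapest)
    if spend > gold:                             # can't buy a majority: proposer dies
        return [(False, 0)] + cont
    alloc = [0] * (n - 1)
    for i in cheapest:
        alloc[i] = costs[i]
    return [(True, gold - spend)] + [(True, a) for a in alloc]

for n in range(1, 7):
    print(n, ["dead" if not alive else g for alive, g in outcome(n)])
# n = 5 prints [97, 0, 1, 2, 0], matching steps a-e above.
```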
And no unauthorized killing?
pffft. some pirates these are.
Knowing whether they know who gets to make the 2nd proposal before voting on the first does seem critical.
At the time that the 1st pirate is making his proposal does each pirate know his number?
13 sounds like the unexpected egg paradox.
no unauthorized killing? they are pirates.
Let's call the pirates Awilda, Blackbeard, Ching Shih, Dampier, and Eustace.
Awilda offers 1 gold piece to Ching Shih and 2 gold pieces to Dampier, and proposes to keep the remaining 97 for herself. Ching Shih and Dampier naturally vote for this proposal, and Blackbeard and Eustace get nothing.
Just realized there's an ambiguity in 13 -- when I say I assume no agreements, I mean no agreements other than those explicitly specified by the rules.
13a is wrong, because the 5th pirate can't kill the 4th. A tie vote means #4 lives.
Pwned by widget. That's what I get for taking the time to choose nice names for the pirates.
Widget is good. Does he get to assume a fixed order like that? Or is the next pirate to come up with a plan chosen at random after the previous pirate is killed?
I've heard it before but I don't remember the answer.
On the quibbling front may we assume the plans must not cut up the gold coins and must be deterministic (ie no pirates a and b flip a coin for n coins)?
20: Oops, you're right. I misread the rules. Never mind.
I've never seen this one before, but here's my reasoning:
Pirate 5 will always vote no, because if all the proposals fail, he gets all the gold.
Pirate 4, knowing this, will vote yes to anything number 3 proposes, because if it comes down to him and number 5, his proposal fails and he dies.
Therefore if 3 gets to make a proposal, it will be "3 gets everything".
But 2 can't win a proposal. 5 will vote against him, as will 3, because if three gets to make a proposal, he gets everything.
So 2 will vote yes to anything that 1 proposes.
So 1 has his own vote, and 2's vote. He just needs to buy off one more pirate. He can't buy off 5 or three, but he can buy off 4.
1 proposes that he gets 99 gold pieces, and 4 gets 1 gold piece. He wins with the votes of 1, 2, and 4.
Do I get a hug?
WIDGET IS THE WINNER! Boy that was fast.
Will, open your arms.
It seems to me that the first pirate should propose that pirates 2 and 3 get all the gold. He votes yes and lives, 2 and 3 vote yes and each get half the gold, 4 and 5 are shit out of luck, and no one dies.
If 2 and 3 are too crazy to be happy with half the total, #1 dies.
No, wait, I retract 24. A tie *does* mean a plan fails, right?
mmmm... ok the 5th pirate will never vote for any plan because if all the previous four plans fail he gets all the gold.
If the 4th pirate is the one to set the contract (first three plans fail) he will always die since the 5th pirate will vote to kill him. This means he has to vote for any plan proposed by the 3rd pirate if he wants to live.
If the 3rd pirate sets the contract, he can safely give all the gold to himself, since the 4th pirate has to vote for the plan to live (we'll assume pirates would rather live than see someone else die); so he should vote against any bill that doesn't involve him getting all the gold.
If the 2nd pirate sets the contract he's dead because the 5th will always vote no, the 4th might be bribable with money, but it won't matter because the third will vote no, and he'll be killed.
So knowing this when the first pirate sets the schedule he writes off 3 and 5 (always vote no on his bill) and instead offers 4 whatever the value of seeing two pirates killed is to vote for the distribution, while reserving the rest for himself. Two has to vote for him cause he dies otherwise, and one has the third vote. So.
1. 100-2p
2. 0
3. 0
4. 2p
5. 0
if p is greater than 50 the problem collapses to.
3. 100
4. 0
5. 0
There.
20: No, a tie vote means #4 dies.
if three gets to make a proposal, he gets everything
Why? He's got two votes against him. 3 would propose a 50%/50% split with 4 or 5, and they'd outvote the other one.
Wow, these are coming in too fast for me to read and respond to. I'm working through slowly. Hang on. Although at this point you all can referee better than I can.
30: Oh, crap. It's me who didn't read the rules. Forget almost everything I've said.
Beat by preview, but I point out mine is a more complete answer.
My 10 was assuming the voting order was not fixed in advance. (Do we have a ruling on this yet?) If the order is fixed, widget is right.
the 5th pirate will never vote for any plan because if all the previous four plans fail he gets all the gold
This reasoning is incorrect, because Eustace cannot ensure by himself that all the previous plans will fail: maybe Awilda, Ching Shih, and Dampier (for example) will combine against him. So if Eustace always votes against all plans he may get nothing. So he may do better by voting for a plan in which he gets some gold.
I say we reduce taxes on the rich, because this invariably increases the total amount of treasure, and then sell the idea to Fox as a reality show (with real bloodshed!) for even more treasure. We also spread the rumor that pirates 4 and 5 are having a gay affair, don't wear skull and crossbones flag pins in their lapels, and are secret Muslims. Are any of the pirates live blogging?
I'm beginning to realize why I never did well in school.
Ack, I can't be on this thread. I have to be on a thread talking to students about whether OTC Plan B will "just lead to girls being irresponsible."
So what's the other puzzle, Heebie?
25 has an error, in this sentence: But 2 can't win a proposal. 5 will vote against him, as will 3, because if three gets to make a proposal, he gets everything.
2 just needs to sweeten the pot for 4 and 5. So he offers them each one piece of gold.
29: Same error as 25, in this step: If the 2nd pirate sets the contract he's dead because the 5th will always vote no, the 4th might be bribable with money, but it won't matter because the third will vote no, and he'll be killed.
36: The order is fixed in advance. (And pirates are perfect game-theory style economists. My apologies if these omissions deprived anyone of a big hug.)
And pirates are perfect game-theory style economists.
This may actually be the wrong way round.
Pirate 5 picks up the barometer and notices that a hurricane is coming. He leaves the beach to seek shelter and comes back afterwards and picks up the heavy gold that is left after the storm. (If he is a Web 2.0 Pirate he leaves a webcam running so he can satisfy #3 by recording the others getting killed either by each other or by the storm, plus he can sell the footage to The Weather Channel and/or Fox.)
40: Shouldn't I save it for a future Puzzling Friday Adventure?
Ok, taking 42 and 37 into account.
I'd say the game ends on round one with a distribution of
1. 100-2p (where p is value of seeing a dead pirate)
2. 0
3. 0
4. 2p or 0
5. 0 or 2p
One can pick which one he buys off.
49: I was being silly. economists --> pirates.
also:
||
SWPUTL?
|>
48: That wouldn't pass, because pirates 2, 3, and whichever is getting screwed between 4 or 5 will all vote no, because then they get to see 1 die. Pirates like to see other pirates die!
Then it would go to 2, who would keep 98 pieces, and give one to 4 and 5 each, and that would pass.
48 is still wrong, because if A proposes the division (98,0,0,2,0) then B, C, and E vote against it: once A is killed they know that B will propose the division (98,0,1,1) -- see 13d for the reasoning -- and they will all be better off (B by 98 gold pieces, C by the death of A, and E by 1 gold piece).
48: Also, I had interpreted the rules to mean that each pirate prefers even one piece of gold to seeing any number of other pirates die ("in order of priority" = lexical priority), which means there's no need to calculate a value in gold pieces of seeing a pirate die.
The assumption in 55 is how the problem was presented to me, too.
No, I didn't know the answer to this in advance, but working backwards...
If there is only one pirate left (pirate 5), his plan assigns all the gold to himself, he votes "yes" and the plan passes. This satisfies objectives 1 and 2 for pirate 5
If there are two pirates left (#5 and #4), there is no plan that pirate 4 can draw up that will not result in his death - even if the plan is "give all 100 pieces to pirate 5", pirate 5 should still vote "no" and since the plan must pass by a strict majority it will fail. Pirate 4 will be killed* and we will move to the "only pirate 5 left" condition. Voting "No" on _any_ plan proposed by pirate 4 will satisfy objectives 1, 2 and 3 for pirate 5. Pirate 4's actions are irrelevant.
If there are 3 pirates left (#3, #4, #5):
- pirate 5 should always vote no, regardless of the plan, even if the plan is "give all 100 pieces to pirate 5". Pirate 5 will always do better if the plan fails, since the inevitable result of pirate 3's plan failing is that pirate 5 gets all the gold, and all the other pirates are dead.
- likewise, pirate 4 should vote "yes" on any plan proposed by pirate 3, even if the plan is "give all 100 pieces to pirate 3" - the reason? if he votes no, the plan will fail and he will die in the "2 pirates left" scenario. any 'no' vote will lead to his own death, so objective 1 takes precedence.
- pirate 3 should therefore plan to give himself all the gold, and vote for his own plan. Pirate 4 will vote "yes" and feel lucky to be alive, Pirate 5 will vote no and be mightily pissed off, and the plan will pass.
For 4 pirates (#2,#3,#4,#5):
- pirate 3 should vote against any plan, because if it comes to a 3-pirate-left scenario, he gets all the gold. Even if the plan is "give all gold to pirate 3," he should still vote no, because if the plan fails, he gets all the gold anyway, plus one more pirate dies.
- pirate 4 should vote for any plan that allocates him even one piece of gold, and vote against any plan that allocates him 0 gold. If he votes no, the plan will fail, and he will end up alive, but with 0 gold from the 3-pirates-left scenario. If he is allocated even one piece of gold in the plan, then objective 2 supersedes objective 3 and he should vote yes. If he is not allocated any gold, he should vote no, since obj. 2 will not be satisfied either way, but he will get to watch one extra pirate die (objective 3).
- pirate 5's reasoning is identical to pirate 4
Pirate 2 should therefore propose a plan where he gets 98 pieces of gold, and pirates 4 & 5 each get one piece. Pirates 2,4,5 vote yes, pirate 3 votes no, the plan passes.
In the initial scenario (all 5 pirates), pirate 1 should offer 2 gold pieces to pirate 4, 1 piece to pirate 3, and the remaining 97 to himself. Pirates 1, 3, 4 should vote yes (pirate 4 gets 2 pieces of gold this way, and only 1 piece if the plan fails and we move on to the 4-pirates-left scenario; likewise pirate 3 should vote yes because he gets 1 more piece than he would under the 4-pirate scenario). Pirates 2, 5 should vote no, and the plan will pass.
*no, he won't fight back - those are the RULES, and I didn't make them up
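A tiny sanity check of that last step, assuming the 4-pirate continuation (98, 0, 1, 1) derived above, the strict-majority rule, and the live > gold > bloodshed priorities (the helper name is just for illustration):

```python
CONTINUATION = [98, 0, 1, 1]     # pirates 2-5 if pirate 1's plan fails (all survive)

def passes(plan):
    votes = 1                                    # pirate 1 backs his own plan
    for offer, fallback in zip(plan[1:], CONTINUATION):
        # Equal gold isn't enough: the voter would rather vote no and watch
        # pirate 1 die. (Nobody dies in this continuation, so gold decides.)
        votes += offer > fallback
    return votes >= 3                            # strict majority of 5

print(passes([97, 0, 1, 2, 0]))   # True  -- pirates 1, 3, 4 say aye
print(passes([98, 0, 1, 1, 0]))   # False -- pirate 4 isn't strictly better off
```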
55: well if you assume that it's a lot easier. So yea (98,0,0,1,1)
Good puzzle, heebie.
I can't say it was too easy, obviously, because I got it wrong. But it wasn't too difficult, either, because I did understand how to proceed.
Dammit, Widget types faster than I do (and is less verbose)
I used to ask this as an interview question all the time. It gets really interesting when you have five pirates and only one gold piece...
Empirically, the boss pirate divides up the gold by rank according to a prearranged plan, and if enough people don't like his division they either desert or elect a new leader. Killing the old leader may or may not be part of the solution. In general the leader was able to keep a large share, but only so large.
Here's a tricky followup question: can you say what happens if there are six pirates?
61: As much as I enjoyed this as a blog post, I'd be really pissed if asked this in a job interview.
The interviewer would be lucky I wasn't carrying a cutlass.
Assuming that "first, second, third pirate" are pre-ordained roles, and not randomly assigned, the first pirate's proposal is that 100% of the gold goes to the last pirate.
59 gets it right. Yay, Heebie!
And I'm obviously wrong because I forgot the voting thing and wasn't patient enough to think it all the way through. Sigh. Lemme see if PK can figure it out....
is it time for pirate jokes yet?
YES.
63: There should be multiple solutions. Consider the distribution from the five pirate solution, (97, 0, 2, 1, 0) to start with. The first pirate will give himself 94 pieces, offer the second nothing, and buy off three of the remaining four by offering them one more gold than they would get in the five pirate solution, e.g. (1, 3, 2, 0), or (0, 3, 2, 1), etc.
One multipart one that I like:
Q: What's a pirate's favorite vegetable?
A: ARRRtichokes.
Q: What's his favorite body part?
A: His ARRRRms.
Q: What's his favorite animal?
A: An ARRRmadillo.
Q: What's his favorite letter?
A: P for pirate!
what kind of socks is he wearing? ARRRgyle!
A pirate walks into a bar with a steering wheel sticking out of his crotch.
The bartender says, "Hey, did you know you have a steering wheel on your crotch?"
The pirate says, "Arrr, it's drivin' me nuts!"
A pirate walks into a bar and the bartender says, "Hey, I haven't seen you in a while. What happened, you look terrible!"
"What do you mean?" the pirate replies, "I'm fine."
The bartender says, "But what about that wooden leg? You didn't have that before."
"Well," says the pirate, "We were in a battle at sea and a cannon ball hit my leg."
"Oh, I see," says the bartender, "But what about that hook? Last time I saw you, you had both hands."
"Well," says the pirate, "We were in another battle and we boarded the enemy ship. I was in a sword fight and my hand was cut off."
"Oh," says the bartender, "What about that eye patch? Last time you were in here you had both eyes."
"Well," says the pirate, "One day when we were at sea, some birds were flying over the ship. I looked up, and one of them shat in my eye."
"So?" replied the bartender, "what happened? You couldn't have lost an eye just from some bird shit!"
"Arr," says the pirate, "it was the day after I got me hook."
A horse walks into a bar. The bartender says, "Why the long face?"
A nun, a priest, and a rabbi walk into the bar. The bartender says, "What is this, a joke?"
72: there's more to the answer than that...
72: Don't you need information about the risk preferences of the pirates, though?
The five-pirate solution is (97, 0, 1, 0 or 2, 0 or 2). If the last two pirates are risk-neutral, the new first pirate needs to offer them 2 each (to beat an expected payoff of 1 + watching a death) in order to get their votes. That would make the new distribution (95, 0, 1, 2, 0 or 2, 0 or 2).
If they're risk-averse, then you need to compare their disutility for taking the risk of being the pirate who gets nothing in the five-pirate scenario against their utility from watching the new first pirate die, which means you're going to need to calculate the p that Asteele wanted to know above.
If they're risk-preferring (and they could be! they're pirates!) then the new first pirate would need to offer more than 2 to the new fifth or sixth pirate.
"Arrgh! What time is it? Where's me bottle o' rum?" the pirate said groggily.
PK says the first pirate should offer everyone 20 gold pieces, because that's fair, and all this game theory crap just pisses him off.
Did you tell PK that pirates aren't always fair? (not to mention economists...)
81: A nun, a priest, and a rabbi walk into the bar. The bartender says, "What is this, a joke?"
And the priest says. "Yeah, yeah, I know; most guys just leave her hanging on the tree."
87: He knows that. It wasn't so much fairness per se he got hung up on as forgetting that the pirates actively like killing other pirates, if they can get away with it.
That said, I didn't tell him much because after I explained the logic of pirates 5 and 4 he started to get it and yelled at me to shut up, he didn't want to hear any more.
So thanks a lot for making PK yell at me, Heebie.
Q: What kind of salad do pirates like?
A: ARRR-ugula!
Q: What if they're at sea and can't get arugula?
A: What? How about some ARRR-tichokes?
Q: At sea? Like from a jar?
A: Okay, how about a-GARRRR?
Heebie, I'm so happy you're posting here. The combination of math/logic and nurturing/fostering/adopting posts nicely reminds me that there are other crazy (in a nice way) women like me.
91: they can use LARRRD for dressing. mmmmmmm, lard.
91: Q: What kind of salad do pirates like?
A: ARRR-ugula!
Pirates, the original coastal elites.
84 is right: you can't say what will happen unless you know the attitudes of the pirates to risk (and their knowledge of each others' attitudes to risk).
So widget wins again.
(You may also need to know whether pirates can credibly make promises, since if the first pirate of the six were to be killed, the second pirate would have a choice (as in the original puzzle) and so may be able to make a promise about which way he'll make that choice before the vote on the first pirate's proposal, and so possibly sway a voter.)
PK was right about pirates, and according to the research of Herbert Gintis et al, few people (except sociopaths, economists, and some libertarians) are game-theory rational. Pirates are like soldiers of a tiny, predatory nation, and they must be loyal to each other. Up to a point.
So thanks a lot for making PK yell at me, Heebie.
ARRRR, instigating fights!! That's what inspires the pirate in me!
Q: What do you get if try to build a neural network that will pass the Turing Test, and have all the inputs be conversations with pirates?
A: ARRRRR-tificial intelligence.
Thanks, Jackie! I'm glad you're glad. :)
I'm glad that Heebie and Jackie are glad. (Cue Eric Clapton).
I'm furious that John Emerson and Heebie and Jackie are glad.
pirate jokes i can't recall, here's a cowboy's
so two cowboys walk into the bar, the first one says, -look, she is my gf
-which one, which one?
the first cowboy shoots and points: - the falling one
104: these were cowboys, jms. Who said anything about women?
I know better than to play puzzles with you high-information people, so I'll just say 'Shiver Me Timbers!'
103: That's roughly the plot of Willie Nelson's "Red-headed Stranger". Moral: don't touch a cowboy's horse, even if you're his GF.
Read, that IS a joke. Americans don't really kill their girlfriends in a casual manner like that. Not even cowboy-Americans.
95: I'll balance that against my guilt for participating in upsetting PK.
Note that if the pirates can credibly commit to future actions then the five-pirate solution breaks down as well. For example, the second pirate could, on the first round, promise pirates three through five an equal share, which they prefer to the outcome in the case where commitment isn't possible. But then the third one could offer a three-way split of the gold, and then the first might try offering even more money to the fourth and fifth, and so on. Maybe you wind up with multiple solutions, or none at all? I'm not sure. At any rate, the position of the pirates later in the order is much strengthened by the availability of bargaining.
For 6 pirates, it gets tricky because there are 2 optimal distributions, and we'll probably end up appealing this to the rules committee. But given that our pirates are "perfectly logical economists" I'm going to say that they will vote for a plan where their allocation is higher than their expected allocation in the next lower plan. (as opposed to their maximum possible allocation in the next lower optimal plan, or their guaranteed minimum allocation in the next lower optimal plan)
1 pirate {100} 1 Aye
2 pirates {x,100-x} 1 Aye, 1 Nay, fails
3 pirates {100,0,0} 2 Ayes, 1 Nay
4 pirates {98,0,1,1} 3 Ayes, 1 Nay
5 pirates {97,0,1,2,0} 3 Ayes, 2 Nays
alt: {97,0,1,0,2}
2 solutions for 5 pirates, equally valid and subject only to the lead pirate's whim, therefore the expected value is 1 gold piece each for the two lowest ranked pirates.
that yields 3 solutions for 6 pirates:
6 pirates {95,0,1,2,0,2} 4 Ayes, 2 Nays
or {95,0,1,2,2,0}
or {95,0,1,0,2,2}
meaning the expected value is 4/3 gold piece for each of the 3 lowest ranked pirates
there are 6 solutions for 7 pirates
7 pirates {95,0,1,2,2,0,0} 4 Ayes, 3 Nays
or {95,0,1,2,0,2,0}
or {95,0,1,2,0,0,2}
{95,0,1,0,2,2,0}
{95,0,1,0,2,0,2}
{95,0,1,0,0,2,2}
expected value is 1 gold piece for each of the 4 lowest ranked pirates
For N pirates,
- the second ranked pirate can't be bought off (and so should be allocated 0). The 2nd ranked pirate will always vote for you if N=3, or against you otherwise, regardless of the gold allocation.
- the third ranked pirate can be bought off with 1 gold piece (not necessary if N=3)
- any lower ranked pirates can always be bought for 2 gold pieces (or 1 if N = 4)
So the payoff for the top ranked pirate is
100 - (2*ceiling((N-1)/2) - 1)
i.e. for 7 pirates, the top pirate gets
100 - (2*3-1) = 95 gold.
at 50 pirates, he gets 51 gold
at 100/101 pirates, he only gets 1 gold
(1 for himself, 1 for pirate 3, 2 each for 49 other pirates in the range 4-100)
at 102+ pirates, the top pirate will always be killed, because there is no way to buy off a majority of pirates, even if he takes no gold for himself.
Is this sufficiently run into the ground?
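If anyone wants to poke at that closed-form payoff, here's a quick sketch of the formula as stated (the function name is made up; it bakes in this comment's expected-value tie-breaking assumption, which other comments in the thread read differently for six-plus pirates, and the 102+ claim is disputed later on):

```python
import math

# The top pirate's take under the formula above (N >= 4).
def top_pirate_gold(n, gold=100):
    spend = 2 * math.ceil((n - 1) / 2) - 1    # bribes: 1 for pirate 3, 2 apiece after that
    return gold - spend if spend <= gold else None   # None: no affordable majority

for n in (5, 7, 50, 100, 101, 102):
    print(n, top_pirate_gold(n))   # 97, 95, 51, 1, 1, None
```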
72: Okay, I thought about this a bit more on the bike ride home, and concluded that the first pirate can do better than 94 pieces in the six-pirate version.
Start with the five-pirate solution, (97, 0, 2, 1, 0). The first pirate can then screw over pirates 2 and 4, keeping their 99 gold pieces, and bribing 3, 5, and 6 to vote yes to his offer by upping their gold by 1. So the six-pirate solution is: (96, 0, 1, 0, 2, 1).
First pirate proposes to take nothing and divide the money four ways among the other four. That way everyone gets their top priority, i.e., living, and four of them get a guaranteed payout on top of that.
Priority c. is a red herring and at least the first few comments seemed to overprivilege it substantially (I only read up to 20 or so).
The benefit of my proposal is that it requires no further votes, meaning that every pirate knows he will live -- something he's not willing to trade for the pleasure of seeing other pirates die, according to the hierarchy given.
Damn it -- I go off the meds for a couple days and look what happens.
before reading this thread, i'm going to go with 16 gold to pirate 2, 34 gold to pirate 3, and 50 gold to pirate 4.
(reasoning: Pirates 4 and 5 can expect 50 gold each, so you can't have both their votes. Buy one, lose the vote of the other. Pirate 3 probably can't get more than 34 gold on his own, so he can be bought for that. Pirate 4 is looking at likely death, so anything sounds good to him. But he could vote against your plan since he'd like to see you die, so you have to give him the max he could get if he proposed a similar plan on his turn. You get nothing, but get to live, which is #1.)
i went for the glory, but alas i was not up to the task
Can someone explain why the vile Kotsko's idea isn't the solution?
114: You didn't need to divide it four ways, because you don't need four "yes" votes.
Attempts at realism ruin this kind of puzzle. Unfortunately, social scientists don't believe that there is such a thing as "reality" (**snort ** chuckle ** giggle**).
111
Bargaining is only viable if punishing a pirate for reneging on a promise is worth at least one piece of gold.
Thanks Heebie for the puzzle. It was pretty fun to work out.
As with all problems that rely on long backward-induction chains, the "right thing to do" is more indeterminate than that, even if you're a perfect economist, so long as there is less than metaphysical certainty that it is *common knowledge* that everyone is such an economist who can never make a mistake.
Because what if #1 proposes 20/20/20/20/20, like PK suggested? What are the other 4 'perfect economists' supposed to do, given that something that should have been impossible has just happened? Well, it depends on what they believe might happen next. Do you interpret his action as a mistaken "tremble" that has only a tiny tiny probability of repeating itself? Or as an indication that, in fact, you were mistaken in your original belief that #1 was a perfect economist? And #1's anticipation of that reaction can influence his first choice.
More concretely, suppose #1 offers 98/1/1/0/0. Using backward induction, #2 sees that #1 fucked up; he will be able to offer 98/0/1/1, etc., as outlined by Widget, next turn so why should he vote for #1? And 4&5 would be better under that, etc. But #3 would rather have #1's division than #2's division, so will vote yes, and #2 would rather live than die, so the key question is: why did #1 make that mistake? Does it signify something about the other players' not being what he thought? Suppose it does, and means they're all crazy motherfuckers, as crazy as this #1 guy--then, fuck, he'd better just vote Yes and hope he makes it out of there alive. But if #1 is *certain* #2 would make that calculation, and vote yes to 98/1/1/0/0, then even if #1's not a crazy motherfucker, maybe he should pretend to be, etc.
All of this is dealt with very well in Roger Myerson's Game Theory textbook, which I recommend if you're into that thing. He does a nice job showing just how mindfucky the "beliefs" side of equilibrium analysis can get.
Heebie:
In a couple minutes, check out the flickr. BR has been cheating on you.
I should check my Marcus Rediker to see if there are jokes in the appendix.
Pkrate walks into a bar. He has a ship's wheel sticking out of his fly. Bartender says "hey, cap'n, I don't mean to pry, but did you know you have a ship's wheel sticking out of your crotch?"
Pirate says "Arr! It's drivin' me nuts!"
125: A businessman tried to use that joke on me and a girlfriend at a bar. We just looked blankly at him. Then he sat down and tried to explain why, actually, it's really funny, and maybe we weren't smart enough to get it.
This is a good illustration of Leifer's maxim that game theory is a theory of games that do not need to be played.
126: he didn't know how to sell it, clearly. Sad.
128 to 129.
Also: damn, damn, damn, damn, damn.
Sifu, you were just shot down. Isn't there something that needs to be attached to a wall?
Sifu's dignity has been attached to a wall and now he can't even urinate.
No Pirate is going to make the mistake of bringing a knife to a gunfight, or worse, some fancy mathematics.
You need to watch The Good, The Bad, and The Ugly. Or, remember how Americans actually do this overseas.
The first pirate is going to pull two guns and with one he'll shoot Pirate #2, the quickest Pirate, while fanning his second weapon on the other three. We're now down to four pirates while he drops his empty weapon and quickly pulls a replacement.
He fires a second time, again at the remaining fastest gun. We now have three pirates; and our prepared Pirate is still holding a loaded weapon on the two slowest Pirates.
We hear him say, "I vote I get 100 pieces, and each of you has a chance to live."
There's a fairly good game-theory game out there that my eight-year-old nephew thinks is great. The premise is that you are all a bunch of colonial traders trying to get shrunken heads or some shit, and each go round you trade under a different set of operating rules, like we want yellow beads this time and will give you four reds for one green unless you don't have any greens in which case you get blues, which this go round end up being either intensely desirable or the kiss of death. It's a well-designed little teaser of a game, and I'd bet PK is old enough for it. And you don't have to do any freaking set theory, thank God.
134: how can you fan a revolver while holding a revolver in the other hand? Do you use your forehead?
136: prehensile *cough* *cough* of course
In the life of Genghis Khan you really do have very complex multi-person games as the various leaders try to pull together dominant coalitions by attracting splinter groups and malcontents from each of the other players. The big players were important because of the coalitions they commanded, but large parts of each coalition could shift sides at any moment, and each sub-coalition could break into sub-sub coalitions. Mongol armies were organized so that the least trusted allies were on the front line, and any ally who showed up late was assumed to have been thinking about switching sides.
137: turn your head and then cough, please.
136, Pirates don't have revolvers but single-shot hand-held cannons, so your meaning of the term "fan" here is meaningless. To fan also means to move something back and forth -- so imagine your wrist using the weapon to alternately cover one then the other of the two pirates.
135: Totally racist.
That said, I need to find a book or something of these games that are age-appropriate. Until he learns to be wrong gracefully.
140: but if the three remaining pirates rush him, the chance they'll get shot is only .33, the chance they'll die is even less, and the chance they'll get between .33 and .5 of the gold is higher yet.
Racist as all hell, sure. Actually, the beads and shrunken heads and whatnot are barely even present as decorative elements: it's all about the rules and navigating them to your advantage. I tried to find the game online but no googling would get me there.
Sifu, you're being silly; men who do not trust each other will not allow each other to get in range "to rush." Watch a Samurai movie. Or a decent Western.
The rules as laid out were simple; I want to live, I want the gold, I'm willing to kill to live and get the gold. In every other scenario I'm likely dead or I get nothing. Preparation pays off. The first guy who knows he's shooting and does so has the advantage. See The Unforgiven. And you can never carry too many pistols.
Have you seen Quigley Down Under? Pirates carried the hand-held equivalent of shotguns in the amount of metal they put in the air. If two Pirates line up, it gets even simpler, because you'll take two out with one shot. Close combat with loud reports is fast and furious and unforgiving.
The scenario as described gives me no other option. Full out war.
143: Ooh, it's an online game thingy. Hmm.
JM's game could be rewritten with Celtic headhunters. Only the Welsh and the Irish would care.
Sifu, if you want to be on the mark, simply call me shrill and too literal. I plead guilty.
JE, I'm part Irish, and I'll drink to that.
Nah I've been defending a losing proposition, here. Pirates can shoot everybody.
Well, obviously. You can't shoot ninjas. Pirates can't shoot bears, either, but they can sure pleasure 'em.
Well, obviously. You can't shoot ninjas. Pirates can't shoot bears, either, but they can sure pleasure 'em.
Well, well, well. Three holes in the ground.
I was wondering, though. After Pirate #1 gets all the gold, does he send it to B.PhD, just to show his heart was really in the right place?
[I grew up on Westerns and Samurai movies, so my values may be a little ... economically irrational.]
153: don't kid. Thursday's your turn in the ground.
That must be a polite version of 'fuck you, clown'. Okay, I know how to take a hint.
Aw, I was hoping you'd say "Who you calling clown, fuck?"
Play with meeeeeee Iiiii aaam suuuuuuuper tiiiiiiiired!
Aw, I was hoping you'd say "Who you calling clown, fuck?"
Play with meeeeeee Iiiii aaam suuuuuuuper tiiiiiiiired!
I tried to make it funnier, really I did. But I, too, am tired.
I can shoot ninjas with a gun that fires bullets of pure love.
I can shoot ninjas with a gun that fires bullets of pure love.
Well yeah, how else do you expect to get more ninjas.
Just don't ask to be their friends.
I refute the claim that pirates have only one-shot cannon-style armaments.
You have 25 highly-motivated race horses and a race track on which 5 horses may run at a time, but no stopwatch. Assuming each horse runs the same speed in every race, how many races do you have to hold in order to find out the first, second, and third fastest horses?
I'll be darned. I used to do these as a teenager, I think out of crossword puzzle books, but it is such a vague description I didn't think I would find them on line. But there they were at the top of search.
The woman whose last name was Edwards had a pair of goldfinches making a nest in her yard, but not in the garage. The robins built their nest in the eaves of a front porch, but it wasn't Ms. Martin's front porch.
Just time-killers, like crossword or jigsaw puzzles
167: I'm going to go with 8, as I can't figure out any easy way to do it any faster. But I am in pretty not-so-great shape at the moment.
What is the harder problem you have, heebie?
Favorite Greek city-state of pirates?
SpARRRta.
170: Keeping her hands off of me.
Favorite basketball team?
TARRRRRRRR Heels.
Favorite attorney directory?
MAAAARRRRtindale-Hubbell.
Why are all of the world's pirates voting for Barack Obama?
Because their favorite salad green is ARRRugula.
166, Stanley, of course you are right. But my original post [134] acknowledged both "old-timey" pirates and modern ones, i.e., Americans. Corporations negotiate with an Army sitting outside -- there are very few countries with anything economically useful where we don't have a military presence.
As Fred Ward (Dix) said to a Vietnamese police officer in the movie "Off Limits," (while a helicopter with a sniper floated above him in the background), "Don't you people get it? We're never outgunned."
And I suspect that the examples you've provided bolster my contention that these guys are not likely to show up with fancy-pants mathematics. If the first Pirate suggested a split of 98,1,1,0,0 etc. without a gun in both hands, he'd already be dead.
Legal hero?
John MARRRRshall, particularly MARRRRRbury v Madison.
176.LAST: YOU ARRRRRRREN'T TRYING TO SAY THAT ARRRRRRRRTIFICIAL MATH PROBLEMS ARRRRRRREN'T REAL ARRRRRRRRRE YOU?
178 ... I RRRRRationalized it was a RRRRReal math problem that must be solved as a RRRReal pirate, by RRRRRemoval of RRRRReal threats ... piratically.
And forget about the term "Pirate" as a term of art here adding some meaningful distinction -- you could have used "Economists on an Island" and I'd still be right, they just don't call themselves pirates anymore. Companies don't hire them to give stuff away.
ohayo and sorrrry to disrupt the rrrr jokes
so a sailor (or a pirate, you can substitute) tries to describe the sea to his hometown buddies:- i don't know, i have no words, you just sail and sail and sail and there are no any bars
a strange place by all measures
180: Given that Mongolia is about the most "landy" place in the world (along with Tibet, Afghanistan and a few others) probably leads to some interesting conceptualizations of the sea in general within your culture.
The ocean is a desert with its life underground and a perfect disguise above.
How old are the pirates? Do they have arrrrrthritis? How many are arrrrgumentitive? Is one called Limbaurrrrrrgh?
182: Quote those funky lyrics white boy!
Come on, guys. I know you're here. You're just ashamed to admit that you're on the internet on a Saturday, when custom demands that you be satisfying your carnal desires.
Go Presidential if you feel like it.
Well, I'm sort of around but working (dammit).
John yelling at an empty room isn't quite entertaining enough to procrastinate to, so that doesn't help.
I usually procrastinate on my couch, but have avoided injury so far. Maybe I'm not doing it right.
It's funny, I like logic problems generally, but this class of problems, where you're inducing a solution from what a group of other perfectly rational people would do or believe, I find absolutely impossible. I don't know if it's a mental block because I find them so psychologically implausible or what, but I can't do them at all.
I recompiled Lynx to fix a bug, so that I might then have unfogged crack monkey refreshed on my file server where it can then not bother and it goes faster.
Sadly, I didn't need to do all that, I just needed to remove the space after the equals sign.
I wanna be in a Mongol Pirate Horde.
max
['Cowardice, sir? You jest!']
-I recompiled Lynx to fix a bug, so that I might then have unfogged crack monkey refreshed on my file server where it can then not bother and it goes faster.
-Sadly, I didn't need to do all that, I just needed to remove the space after the equals sign.
</read>
I wanna be in a Mongol Pirate Horde.
Admiral
to clarify: hui is khoton, one of our tribes
After Pirate #1 gets all the gold, does he send it to B.PhD, just to show his heart was really in the right place?
YES.
I'm supposed to be writing a plenary speech, for delivery on Monday morning. SIGH.
192: One of the great admirals and explorers, but the castration part bothers a lot of sissy weenie types. You have to be willing to do what it takes.
Lass mich dein Pilot sein
In wolkenlosen Lüften
Voll Sehnsucht ruf ich deinen Namen
Wir werfen alles Geld zusammen
Because he has a pure heart, Pirate #1 will win, send his gold to B.Phd, run the corporatists out of town, put up his guns, and retire to a cabin in the woods, where (between projects) he will make shrill comments on obscure academic blogs because he reads the posts too literally and because irony and cynicism drive him batshit crazy.
Going back to the tedious reality shit: the unrealistic part of this is the "no pirate trusts any other pirate" part. In bargaining situations of this sort, negotiating alliances and cutting deals is what always happens. The winners are those who combine a powerful presence, charisma and persuasiveness, ruthlessness when necessary, and trustworthiness when possible and desirable. The general pattern is for pirate-type leaders to be generous and fair to their followers and allies almost all the time, while saving their treachery for key moments when a follower seems capable of threatening them. "There's honor among thieves" doesn't mean that thieves are always honorable, just that if they weren't honorable most of the time, they wouldn't be very successful (and small-time, loner thieves usually aren't).
Our innate "tribalness" works even for pirates, motorcycle gangs, engineers, academics. We're insiders to the extent we're members of "outsider" groups. What's been of interest to me, once I got in organizations big enough to have a significant number of clearly-delineated outsider groups, like DoS, is how tribal these groups are, with their own language, codes, style, priorities, etc. We would compete inside our groups for alpha positioning, but members of other outsider groups were at some level like citizens of another country, not even employing the same language quite right.
You of course see this on "the internets," where it is impossible to know "all the traditions" ... all the tribal styles and specialized group-thinks ...
112
"at 102+ pirates, the top pirate will always be killed, because there is no way to buy off a majority of pirates, even if he takes no gold for himself."
This is wrong: you don't have to buy off a pirate who will otherwise get killed. So you can't have a solution which leads to a majority of the pirates getting killed, as they will support anything the top pirate suggests.
189
"It's funny, I like logic problems generally, but this class of problems, where you're inducing a solution from what a group of other perfectly rational people would do or believe, I find absolutely impossible. I don't know if it's a mental block because I find them so psychologically implausible or what, but I can't do them at all."
Isn't this a symptom of autism?
One of the more deceptively subtle types, yes.
167 169
"You have 25 highly-motivated race horses and a race track on which 5 horses may run at a time, but no stopwatch. Assuming each horse runs the same speed in every race, how many races do you have to hold in order to find out the first, second, and third fastest horses?"
Seven. Divide the horses into five groups of five and run five heats. Then run the five heat winners off against each other. Let the first, second and third horses in the runoff be A, B and C respectively. Clearly A is the fastest horse overall. Furthermore the only candidates for second and third fastest overall are B, C, the horse who finished second in B's heat, and the two horses who finished second and third to A in A's heat. So race these five horses. The winner is second fastest overall and the second-place finisher is third fastest overall.
Cute problem.
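A quick simulation of the seven-race procedure above, for anyone who wants to see it click (the helper names and "speeds" are made up; a race here just sorts its entrants by speed):

```python
import random

def race(entrants, speed):
    return sorted(entrants, key=lambda h: speed[h], reverse=True)

def top_three(speed):
    horses = sorted(speed)
    heats = [race(horses[i:i + 5], speed) for i in range(0, 25, 5)]   # races 1-5
    final = race([heat[0] for heat in heats], speed)                  # race 6: heat winners
    a, b, c = final[:3]
    heat_of = {h: heat for heat in heats for h in heat}
    # Only candidates for 2nd/3rd overall: B, C, the runner-up in B's heat,
    # and the horses that finished 2nd and 3rd behind A in A's heat.
    candidates = [b, c, heat_of[b][1], heat_of[a][1], heat_of[a][2]]
    second, third = race(candidates, speed)[:2]                       # race 7
    return [a, second, third]

random.seed(1)
speed = {i: random.random() for i in range(25)}            # 25 horses, distinct speeds
print(top_three(speed) == race(list(speed), speed)[:3])    # expect: True
```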
James exists on an entirely different puzzle plane than the rest of us, I'm afraid.
I passed out and I rallied and I sprung a few leaks.
to clarify: hui is khoton, one of our tribes
I was thinking more along the lines of a paleo-Scottish/Viking/Mongol pirate horde sailing the Tarim Basin in wheeled longboats.
Because that would be cool.
max
['Big axes!']
The last Viking, Karl XII of Sweden, was defeated by the Russians near the Black Sea in 1709. In the following decades, the last westward-expanding Mongols, the Kalmyk/Torgut Oirats, were subjugated by the Russians too; some remain near the Caspian sea, but one large group (the Torguts) returned to Xinjiang in Chinese territory.
The Kalmyks were intermittently allied with the Russians during this period, but in 1709 they didn't fully support the Russians, though I believe they sent a token force.
More at my URL, but that has to be updated following Khodorkovsky's Where Two Worlds Met and various books on Karl XII.
In many respects the Cossacks were hybrid Viking-Mongols or Viking-Turks (Cossack = Kazakh). Some aspects of their organization were nomad-like, but like the Vikings and Varangians they also were seagoing pirates and smugglers on the Black Sea right up until the Russian Revolution (Lermontov writes about Black Sea smuggling). Cossacks under Mazeppa were allied to Karl XII, but Peter the Great crushed them before they could do anything.
In short, Max should read up on the Cossacks.
Gogol's Taras Bulba presents a vivid picture of Cossack life. It's fiction, but Gogol was a serious student of Russian and Ukrainian history.
There wasn't much going on here so I went looking for puzzles, and now I feel bad.
(Also: Shearer! Wow!)
201: "It's funny, I like logic problems generally, but this class of problems, where you're inducing a solution from what a group of other perfectly rational people would do or believe, I find absolutely impossible. I don't know if it's a mental block because I find them so psychologically implausible or what, but I can't do them at all."
189: "Isn't this a symptom of autism?"
Not at all. Liking these puzzles is a weak symptom of autism. Thinking that puzzles like this have any bearing on the way real people behave is a symptom of a particularly debilitating form of autism called "being an economist"
Vaguely related to 210: What is the word for "person who makes an astute observation about himself and then obstinately insists it is generalizable to the entire human population"?
[Not related to anything in this thread; this is from a RL interaction]
I sometimes call such people keen observers with their head up their ass.
They can really give you a riveting, detailed description of their colon. But still, it is just their colon.
A lot of second tier novelists are like this.
These are wrathful pirates, not cold-hearted logic machines, so when #1 offers them one or two pieces of gold while he keeps ninety-seven, they will be outraged and cut him down in a spasm of irrational passion.
Well, I would.
211: Penetratingly self-absorbed.
204: Interesting James. Good stuff. Although I will also say that it gives some insight into your approach to some of the discussions here.
Dude, helpy-chalk, that's eight words. Aren't you the guy who is the expert on 2+2?
Seriously, though, that's pretty funny, because this guy actually is a second-tier novelist, more or less.
204: Wow, that is impressive.
203 is also elegantly concise in its explanation.
213: Herbert Gintis has actually come up with experimental evidence that this might be the case. The experiments seem a little weak but it's nice to see economists trying to rehab.
Actually, according to Rediker, dividing up the loot wasn't the captain's job; the crew elected a quartermaster as a sort of separation-of-powers deal, who would divide up loot and be responsible for things like rations and watchbills.
See paper: An-AAARGH-chy
http://www.peterleeson.com/An-arrgh-chy.pdf