The idea that there is one course of action an actual person should take is to be rejected, and the question to be rephrased to more accurately reflect the intent of the questioner.
Ok: Find the paradox and resolve it! And.....go!
Go for the switch. Even at n/2, n/2>0.
Switch looks unambiguously preferable to me. The value of keeping your original envelope is n; of switching is ½(2n)+½(n/2), which comes out to 1¼(n). Switching has a ¼n greater expected value, no?
But I must be missing something, because I don't see the paradox.
6: Now, before you look in the envelope: should you switch back?
6: As far as I can see, you don't actually know the distribution on the second envelope...
So, wait, you were handed the envelope at random? There are two envelopes, with $k and $2k, and you've just been given one of them and are being asked if you should switch? Or is it that you're given an envelope and then Evil Heebie Puzzlemistress prepares a second envelope? Regardless, you should switch, since there's a higher expected value in switching, as TLL is saying, but it's easier to illustrate with the Evil Heebie system.
Oh, duh. I get the paradox now. You can make exactly the same argument I made as to why you should switch from the second back to the first -- the first contains either twice or half as much as the second.
Huh. I'll have to think about that.
Go for the switch. Even at n/2, n/2>0.
The question was should you switch and then switch back.
This line of thinking anyway only trades on the fact that it's new money to you, which makes it an uninteresting kind of answer.
Swap the envelope.
Straightforward utility calculation says: swap.
0.5 * 2N + 0.5 * N/2 > N
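For what it's worth, that inequality checks out in a couple of lines (a sketch; N = 100 is an arbitrary stand-in for the unknown amount):

```python
N = 100.0                            # hypothetical amount in the first envelope
swap_ev = 0.5 * 2 * N + 0.5 * N / 2  # half the time you double, half the time you halve
print(swap_ev > N, swap_ev)          # the naive swap EV works out to 1.25 * N
```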
[preview]
Crap, pwned by LB.
Unless we assume Truly Evil Heebie, who will always give you the cheapskate envelope and then wander off to buy $n/2's worth of Corona and cupcakes.
Regardless, you should switch, since there's a higher expected value in switching, as TLL is saying,
As LB is saying. TLL's point had nothing to do with the expected value of switching.
I can't have a Corona with a slice of lime and salt around the rim, you jerk.
Crap, pwned by LB.
And by not reading the question.
I can't have a Corona with a slice of lime and salt around the rim, you jerk.
Excuuuuse me for imagining you might want to spend some of your ill-gotten gains on a present for Jammies, little miss Evil Hormonal Heebie
13: And my formatting was prettier.
The paradox is making my head hurt. I wasn't participating in the probability conversation the other day, but everyone saying 'not intuitive' was right -- I can get to a problem where I can't see the answer easily faster in probability than in almost any other topic.
This is like "Deal or No Deal". It's all free money, who cares how much. Now, if there were a penalty for being wrong (other than being punked on national tv) it might be more difficult. Or if some competitor gets the difference, then one would have to do some math first.
As LB is saying. TLL's point had nothing to do with the expected value of switching.
Some of us former mathematicians are able to understand how handwaving works, philosopher boy! You probably drank all of Jammies' Corona.
Excuuuuse me for imagining you might want to spend some of your ill-gotten gains on a present for Jammies, little miss Evil Hormonal Heebie
Oh, pooh. Jammies would be happy if I let him out of the basement.
there is a proverb saying something about some bird in the hand, then some other bird in the sky
which i always recall when i happen to watch the millionaire
so i'll never become a millionaire with that logic
Hmm... This is a really good question. Even though the economically rational thing is generally to make the switch (except under a logarithmic utility function modeling risk aversion), and I'd probably use that logic to talk myself into taking the money, I'd be fairly apathetic in this scenario.
If the multiple were larger, say 5 or 10 times the amount of money I was offered, suddenly my answer would start to change a lot depending on the amount of money on offer.
I don't see a lot of difference between $50 and a 50% shot at $100 or $25, nor between $500,000 and a 50% shot at $250k or a cool million. But I do see a lot of difference between $10 versus a 50% shot at $100 (where I would take the latter in a second) and $100k versus a 50% shot at $10k or a million (where I'd think really hard, and possibly take the sure thing that's a good year's salary).
There are two envelopes, with $k and $2k, and you've just been given one of them and are being asked if you should switch? Or is it that you're given an envelope and then Evil Heebie Puzzlemistress prepares a second envelope?
I think it only makes sense to swap if things proceed the second way, which also prevents you from swapping back.
Our version of that is "A bird in the hand is worth two in the bush."
I'm not getting why you should swap back. Perhaps I am being stupid.
21: But snarkout, TLL wasn't hand-waving. He was making the perfectly straightforward and correct point that either way you've got more money than you had before you found the first envelope. That would still be true if the second envelope contained either $2n or $.25n, at which point the expected value of swapping is less than that of not swapping.
initial setup: you have 2n, the other envelope is n or 4n. Expected value of switching: 5/2n (10/4n)
after switching case 1: you: n, other: 2n or 4n. Expected value of switching: 3n (12/4n)
after switching case 2: you: 4n, other: n or 2n. Expected value of switching: 3/2n ( 6/4n)
overall expected value of switching twice: (12/4n + 6/4n)/2 = 9/4n
9/4n
29: After swapping, you still don't know what you have: if you got the 2k envelope, swapping back gets you half of what you have; if you got the 0.5k envelope, swapping back gets you twice what you have; therefore the utility calculations give the same recommendation as before, and homo economicus will be swapping back and forth for eternity.
Huh, not sure how I pasted that into the comment box twice.
at which point the expected value of swapping is less than that of not swapping
I AM SMRT.
The neat thing about the problem is that it looks totally different depending on how I describe it. If I describe it as "Here are two envelopes, X and Y. I'm going to hand you X. One of the envelopes has twice as much money as the other. Do you want to keep X, or switch to Y?" it's very obvious that switching doesn't change your odds at all.
But described as Heebie originally did, it sounds as if switching is rational. And I can't see any real distinction between the two descriptions of the situation.
Of the two ways of looking at it, I'm finding the one that suggests there's no difference more persuasive, but I can't see the error in the other one yet.
This line of thinking anyway only trades on the fact that it's new money to you, which makes it an uninteresting kind of answer.
New money to me is always interesting, Ben. Don't be so provincial.
after switching case 1: you: n, other: 2n or 4n. Expected value of switching: 3n (12/4n)
If the envelope you started with has 2n, then after switching, it still has 2n. These aren't magic envelopes.
34: anyway, you get my point, I'm sure.
You should not switch back. There are 3 envelopes: (1/2)x, x, and 2x. You are initially handed x. Switching from x to "one of the others" increases your expected takeaway, as explained above. But, on switching back, there's no probability about anything at all, so thinking of it in terms of "it could be twice as much or it could be half" is wrong. I mean, that's technically right, but it's irrelevant: if you switch back you get x, and you know that.
These aren't magic envelopes
Ben brings up the very good point that we aren't taking into account the value of the envelopes. What if one of the envelopes is magic, or is made out of platinum, or was signed by George Washington?
I'm not getting why you should swap back. Perhaps I am being stupid.
You haven't counted yet, so you don't know whether you got the 2n or the .5n envelope, meaning that the other envelope either has twice as much (you got .5n) or half as much (you got 2n) as the envelope you're holding now.
But haven't we been in this situation before?
Is the idea.
But Brock is right in 39.
When you're holding the first envelope, it makes sense to think of the value in the other envelope as being dependent on the value in the current envelope (especially if things work as in 10's second option). The other envelope has either twice or half as much as the value in this envelope.
That isn't true after you swap.
... homo economicus will be swapping back and forth for eternity ...
Yes, but with each switch the economic pie gets bigger, GNP increases, and the rising tide lifts all boats. Even if Heebie, as the banker, takes a small cut of each transaction, with the size of her cut depending on the total amount in both envelopes.
Someone please figure this out soon. I'm stumped.
32 - Yeah, that's the paradox, but I think it's a trick of artful phrasing. Suppose it's a casino game. LB is being asked to risk a net $50 for a potential $100 gain (assuming a fair coin toss, she'll come out $25 ahead per game, over the long run). The switching-back part doesn't have the same utility calculations; assuming a fair coin toss, LB is being asked to wager her envelope with an expected value of $125 for a guaranteed payoff of $100, which will lead to penury and ruin and the study of philosophy.
Yeah, 39 was basically why I couldn't see why I'd swap back.
One of the envelopes is in fact covered with a very beautiful glitter, but it is toxic, while the other is plain manila, but if you kiss it, it will turn into a rainbow. However you have been bewitched by an evil sorcerer and rendered unable to distinguish the two. Should you switch?
This doesn't seem that difficult to me.
It's true that the situation after the swap is similar to the situation before the swap (50% chance of doubling, 50% chance of halving the money), but that similarity isn't what's important.
What's important in solving a problem like this is calculating the expected value. In the first instance you're going from an envelope with a (known) expected value of $3 to an envelope with an expected value of (6 + 1.5)/2 = $3.75: a good trade! In the second case you're offered a chance to switch from an expected value of $3.75 to $3: not a good trade.
The reason they aren't symmetrical is that there's a constraint in the second case (you only double your money if you start out worse) that doesn't exist in the first case.
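The two trades above can be spelled out numerically (a sketch using the hypothetical $3 example):

```python
first = 3.0                              # known value of the envelope you're handed
other_ev = (2 * first + first / 2) / 2   # other envelope: $6 or $1.50, 50/50
print(other_ev)                          # 3.75, so the first switch is a good trade
# Switching back involves no probability at all: you return, with
# certainty, to the $3 envelope, giving up expected value.
print(other_ev > first)
```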
You're too quick for us, ttaM, you shifty-eyed Scot.
You can make exactly the same argument I made as to why you should switch from the second back to the first -- the first contains either twice or half as much as the second.
Yeah, this is a false paradox:
In this case, you have two envelopes that define the system. One contains $k and the other contains $2k, but you don't know which is which. If you choose one at random, it has an expected value of $1.5k, so swapping makes no sense.
It's only if you know you have one envelope with $k and are offered two new envelopes with the dollar amounts $2k or $k/2 that it would make sense to choose one at random, since now you're holding an envelope of certain (relative) value and can choose between two envelopes of uncertain but expected (relative) value $1.25k > $k.
There are 3 envelopes: (1/2)x, x, and 2x.
That is a much clearer description of the situation.
Okay, for you guys who are convinced you should switch - how would you justify your strategy with LB's X and Y formulation?
One of the envelopes is in fact covered with a very beautiful glitter, but it is toxic, while the other is plain manila, but if you kiss it, it will turn into a rainbow. However you have been bewitched by an evil sorcerer and rendered unable to distinguish the two.
There's a fifty percent probability, however, that at some point in the future (say "Tuesday") you'll be able to distinguish between the two for the first time, and also a fifty percent probability that, for one hour on Monday, you'll be able to distinguish between the two, following which you'll immediately and completely forget about it, and then on Tuesday you'll be able to distinguish between the two. It's currently 12:35 and you can distinguish between the two, and no one else is around and you lack access to a calendar: what credence should you put on the proposition that it's Monday?
For the hour on Monday between noon and one, I should have said.
I'm not understanding how LB's X and Y formulation is the same problem.
Spoiler. An analysis of the two-envelope problem by David Chalmers.
35: That's because the way the question is phrased is deliberately ambiguous. It leads to confusing the clear 2-envelope case with the clear 3-envelope case that you and Brock have already answered in such a way that it seems like a 2-envelope case where swapping makes sense.
the probability conversation the other day
I missed that, which thread was that in?
Wait, I think our immediate verdict on the second swap is wrong. Here's another way to look at it. Instead of expected utility vis-a-vis what you have in total, let's look at expected utility vis-a-vis projected gain or loss.
Suppose the first envelope has 10. That's what you have now. Swapping will either give you 10 more or take away 5. So the expected gain/loss is (10 - 5) / 2 = 2.5.
Now you have the second envelope. If you have 20 now, swapping back will lose you 10. If you have 5, swapping back will gain you 5. So the expected gain/loss is (-10 + 5) / 2 = -2.5. Negative! Don't swap!
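That gain/loss bookkeeping is a couple of lines of arithmetic (a sketch reproducing the $10 example above):

```python
first = 10.0
# First swap: gain 10 (to 20) or lose 5 (to 5), each with probability 1/2.
swap_gain = ((2 * first - first) + (first / 2 - first)) / 2   # (+10 - 5) / 2
# Second swap: you hold either 20 or 5; swapping back always returns 10.
back_gain = ((first - 2 * first) + (first - first / 2)) / 2   # (-10 + 5) / 2
print(swap_gain, back_gain)   # 2.5 and -2.5
```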
But described as Heebie originally did, it sounds as if switching is rational. And I can't see any real distinction between the two descriptions of the situation.
The distinction is that the way heebie described it, and the way it's described in 10's second option, the values the second envelope can have depend on the value in the envelope you pick up.
(The value of what is contained by what "the other envelope" denotes depends on which envelope you pick up, of course, but that's beside the point.)
I really think the simple answer is right - you switch once and stand pat, for reasons described.
In making the switch, you accept the risk of losing n/2 in exchange for the potential n upside. You lose the upside (and risk) if you switch back, no matter how you describe that switchback.
Can we get some clarification? I agree with Ben in 59 - if Heebie is preparing a second envelope, you should switch if you think Heebie will play fair. If the envelopes are being offered initially, as in LB's 35, it makes no difference for reasons that I believe are related to the St. Petersburg lottery problem.
Suppose you've already made the first swap and consider the expected value of swapping back.
(n/2 + 2n/2)/2 + (4n/2 + 2n/2)/2 = 9n/4, which is less than the (expected) 5n/2 that you have in hand at this point. So you should switch and not switch back.
On preview, pwned by ben in 59.
23: it's not a proverb, but it should be: A bird in the hand can only shit in your hand, but a bird in the sky can shit on your head.
You should switch back. I know this because the handsome boy told Kevin Spacey so in 21.
39, 47, and 49 are all kind of persuasive, but I'm still puzzled.
Say Heebie makes up three identical sets of envelopes, A1 and B1, A2 and B2, and A3 and B3. All the A envelopes have N dollars, and all the Bs have either 2N or N/2.
She walks up to me with A1 and B1, and says "Here are two envelopes. One has twice as much money as the other. Which do you want?" And I'm indifferent, right? My expected value is the same for the one as for the other.
She walks up to Brock, and hands him A2, and says "Look inside. However much money you see in there, B2 has either twice or half as much. Do you want to keep A2 or switch for B2?" And on the expected value argument I originally made in 6, he switches.
And then she walks up to ttaM, and hands him B3, and says "Look inside. However much money you see in there, A3 has either twice or half as much. Do you want to keep B3 or switch for A3?" And on the expected value argument I originally made in 6, he switches too.
Surely ttaM and Brock can't both have improved their expected value, right? To put it another way, Heebie repeats the experiment a hundred times -- half the time the A envelopes have more, the other half the time the B envelopes have more. At the end of those hundred reps, Brock and ttaM are going to have the same amount of money, right?
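Purely as arithmetic, that hundred-rep experiment is easy to simulate (a sketch; the $10 base amount and 100,000 reps are arbitrary choices, and the A/B setup is exactly as described above):

```python
import random

N, reps = 10.0, 100_000
brock = ttam = 0.0
for _ in range(reps):
    a = N                               # every A envelope holds N dollars
    b = random.choice([2 * N, N / 2])   # every B envelope holds 2N or N/2
    brock += b                          # Brock switches A -> B and keeps B
    ttam += a                           # ttaM switches B -> A and keeps A
# ttaM averages exactly N; Brock averages about 1.25 * N, because in this
# setup the B envelopes really do have the higher expected value.
print(brock / reps, ttam / reps)
```
(In the symmetric version, where A and B each hold the larger amount half the time, the two averages come out the same.)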
The point is that on the first switch, the chance of getting 4n is worth the risk of getting only n; on the second switch, the chance of getting 4n is not worth the risk of losing 4n, because you may already have it.
Basically, the best envelope is better than the worst envelope is bad (of course, it's still free money). If you suppose you had 3n, and had to decide whether to switch to one of 2n or 4n, there's no expected advantage either to switching or switching back.
62: If heebie is playing fair, then it doesn't matter when she prepares the second envelope, it'll contain the same contents whatever you decide.
11
"Oh, duh. I get the paradox now. You can make exactly the same argument I made as to why you should switch from the second back to the first -- the first contains either twice or half as much as the second."
As has been mentioned above, it is not exactly the same. When you switch back, if you double your money you are starting from a smaller amount than if you halve it, so your winnings are smaller compared to your losses than in the first situation. I.e., in the first situation you will win n or lose n/2, while in the second situation you will win n/2 or lose n.
63, 67: L.! Have you been around and I just haven't noticed? And geez, you must be nearly out of college now, right?
So in essence, according to the switch strategy, Brock and ttaM are going to trade envelopes, which can't possibly yield a winning strategy over time for both of them.
re: 66
You are forgetting that I would have conker the envelope bearer on the head and taken ALL the envelopes ... muuuhaahahah
All the A envelopes have N dollars, and all the Bs have either 2N or N/2.
She walks up to me with A1 and B1, and says "Here are two envelopes. One has twice as much money as the other. Which do you want?" And I'm indifferent, right? My expected value is the same for the one as for the other.
No. You're indifferent if, not knowing which is which, you're told that one envelope contains $N and the other contains $2N, but in the scenario you describe you should always pick B.
63 and 67 assuming, of course, that "switching back" does not mean a guaranteed switch back to the medium envelope, and that it's a 50-50 shot each time.
66: In the case you described for Brock and ttaM, they will actually have increased their expected value, but that's because they were offered the 3-envelope problem and not the 2-envelope problem.
If the A envelopes all had $2N or $N at random, and the B envelopes all had the opposite amount, then it's the two-envelope problem and they won't increase their expected value by swapping for envelope B.
Does someone need to run a simulation to prove that either strategy yields the same results?
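Such a simulation is short; here's a sketch of the symmetric two-envelope version (the $10 base amount and 100,000 trials are arbitrary):

```python
import random

k, trials = 10.0, 100_000
keep_total = switch_total = 0.0
for _ in range(trials):
    pair = [k, 2 * k]
    random.shuffle(pair)        # you are handed one of the pair at random
    keep_total += pair[0]       # strategy 1: stand pat
    switch_total += pair[1]     # strategy 2: always switch
# Both strategies hover around 1.5 * k: with no information about which
# envelope you hold, switching neither helps nor hurts.
print(keep_total / trials, switch_total / trials)
```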
Right. I think the 3 envelope formulation is misleading.
re: 75
"switching back" does not mean a guaranteed switch back to the medium envelope
That's how it's presented in the original description.
Hi, LB. Yes, I'm nearly out. I've been lurking.
74: You could use a conker though, if you're accurate with them.
74: We would totally have all believed that 'conker' was some Scots word for unspeakable violence if you hadn't corrected it.
73: So you're saying that with the problem described as I did in 66, Brock will have more than ttaM after 100 repetitions? Why, and how would you know if you were Brock or ttaM?
She walks up to me with A1 and B1, and says "Here are two envelopes. One has twice as much money as the other. Which do you want?" And I'm indifferent, right? My expected value is the same for the one as for the other.
But what is that expected value? What would you be willing to pay to have the opportunity to play this game in the first place?
It's the answer to that question (and you'll probably need to define the situation more precisely before that question even makes sense) that makes it clear where the paradox comes from.
That's how it's presented in the original description.
There is no medium envelope. But it is a guaranteed switch back to the other envelope.
re: 82
Yeah, we carry around a little shrunken head of one of our previous victims, which we swing at them on the end of a string.
The normal English use of 'conker' comes from that.
It's why 'conk' is sometimes slang for 'head'.
Another way to think of it: "just like before, it could be half or twice what you currently have" is not strictly true. If you had n, and were given 2n, switching can only take you down; if you were given 0.5n, switching can only take you up. It would only be the exact same math if swapping from the 0.5n could lead to either 1n or 0.25n , and the 2n could lead to either 4n or 1n.
35, 66:
I think the point here is that after you know the amount you have more information. If it is a large amount you more likely picked the larger envelope and should not switch. If it is a small amount you more likely picked the smaller envelope and should switch. Large and small will depend on your expectation before looking in the envelope.
85: Only after the green pithy stuff is peeled off, though.
62: If heebie is playing fair, then it doesn't matter when she prepares the second envelope, it'll contain the same contents whatever you decide.
I just read the Chalmers paper in 55, and this does sort of resemble the St. Petersburg lottery, in that in theory I would always be willing to exchange my winnings for another lottery ticket with infinite expected value. I withdraw 62, and that's my response to 65; the fact that it's bizarre doesn't mean a thing in regards to the fact that it will always be better to switch to the unknown ticket assuming ideal circumstances.
re: 88
Naturally. Then you soak it in vinegar and freeze it for a few days.
73: So you're saying that with the problem described as I did in 66, Brock will have more than ttaM after 100 repetitions? Why, and how would you know if you were Brock or ttaM?
No, they should both switch every time, and they will both end up with the same amount at the end (probabilistically speaking). Because the average amount of money in envelope B is higher than the average amount in envelope A.
Thanks, Lady Heeb. Now I'm depressed and smoke is coming out of my ears.
P.S. 47 seems persuasive to me.
69 - The paradox is a little hairier than that, James -- you can get to the counterintuitive result that it's always best to switch even assuming that Heebie is not just right but God, and the money distribution is truly unbounded. Neat problem, Heebie!
87: Right. The linked analysis in 55, while I don't understand it fully on a quick reading, suggests that that's the answer -- that there really can't be a uniformly 50-50 chance that the other envelope is 2n or n/2, it's always going to depend on the value of n when you see it. Unfortunately, the reasoning is kind of over my head -- I might be able to work my way through it eventually, but I don't get it right now.
AUGH! How can people find 47 persuasive? The expected value of the two envelopes can't be any different, because you don't have any information about the one that you don't have about the other! The fact that you were handed one and not the other is completely immaterial!
Posting without looking:
Yes. The expected outcome if you switch is (2n + (n/2))/2 = 3n/2, or 50% more than n. The trick here is the hidden exponential in 2n vs. n/2: the upside is an order of magnitude larger than the downside.
OK, I retract 95. The problem in the post was stated where you know the contents of the first envelope. I had forgotten that part.
Yeah... In the two-envelope version, the real paradox comes when you supposedly get to open the envelope you chose. That's what causes your envelope to go from being one of two identical envelopes containing an expected $1.5N to an envelope containing $100 that you chose with a 50% probability, while the other envelope contains either $50 or $200.
It seems that no new information has been given to you that would compel a swap, since you took your 50% chance and the other envelope had equal expected value. But it also looks like the other envelope has a higher expected value, so you should swap.
The link by pfd gave some sort of complicated calculus method out of this for generalized distributions from which the dollar amounts are chosen, which seems unnecessary. But it's certainly a naggingly difficult thing to explain.
Yeah, if you have a prior probability distribution over the average amount of the two envelopes (before opening the first) that's anything but completely flat, then the amount in the first envelope gives you information about what's in the second. And you *don't* have a flat distribution, because that would mean you wouldn't be any more surprised if the first envelope contained 30 dollars than if it contained 10^30 dollars.
91: No, they should both switch every time, and they will both end up with the same amount at the end (probabilistically speaking). Because the average amount of money in envelope B is higher than the average amount in envelope A.
That's silly -- you can't say both that 'they should both switch every time' and that 'the average amount of money in envelope B is higher'. Under the 'switch' strategy, Brock always takes B, and ttaM always takes A. If the average amount of money in B were really higher than in A, then Brock should switch and ttaM should stand pat. And if that's the case, what's the difference between what Brock knows and what ttaM knows that lets Brock figure out he should switch but doesn't let ttaM figure out he should stand pat?
The winning strategy: figure out what you think $1.5n is (the average of the two envelopes) based on what you know about the person giving them to you. Then if the contents of the first envelope is lower than that, switch.
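That threshold strategy can be tested against blind switching (a sketch: the exponential prior with mean $50 and the $75 threshold are both hypothetical choices standing in for "what you know about the person giving them to you"):

```python
import random

trials, threshold = 100_000, 75.0       # threshold: your guess at 1.5n
blind_switch = thresholded = 0.0
for _ in range(trials):
    n = random.expovariate(1 / 50.0)    # hypothetical prior on the smaller amount
    pair = [n, 2 * n]
    random.shuffle(pair)
    first, second = pair
    blind_switch += second              # always switch, sight unseen
    # Peek at the first envelope; switch only if it looks low.
    thresholded += second if first < threshold else first
# Blind switching averages 1.5 * E[n] = $75; the threshold strategy does
# measurably better, because a low first envelope really is evidence
# that you're holding the smaller one.
print(blind_switch / trials, thresholded / trials)
```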
For the record, I have no idea how to resolve the paradox.
99: some sort of complicated calculus method
That's "some sort of complicated method of fluxions method" to you, bub.
Note that 102 has nothing to do with risk aversion.
101: Oh, wait, I misread your comment and thought they were both starting with the same lettered envelope. Hm.
The expected value of the envelope is .5np1 + 2np2, where p1 is the probability that the second envelope is lower and p2 is the probability it's higher, which are not equal, because of the non-uniform prior distribution over the amounts.
Damn, should have looked. I made the same mistake that got ironed out 90 messages up-thread. Also it's 5n/4, not 3n/2.
103 - Some googling suggests that you are not alone.
108: Or to be more precise, ½np1 + 2n(1-p1).
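Plugging numbers into that formula locates the break-even point (a sketch; n = $100 is arbitrary):

```python
# EV of the other envelope, per the formula above: (n/2)*p1 + 2n*(1 - p1),
# where p1 is the probability that the other envelope is the smaller one.
n = 100.0
for p1 in (0.5, 2 / 3, 0.8):
    ev = (n / 2) * p1 + 2 * n * (1 - p1)
    print(p1, ev)
# Setting EV = n and solving gives p1 = 2/3: standing pat only wins once
# the chance that the other envelope is smaller exceeds two thirds.
```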
I think the point here is that after you know the amount you have more information. If it is a large amount you more likely picked the larger envelope and should not switch. If it is a small amount you more likely picked the smaller envelope and should switch.
Ah! Thanks, James!
Yeah, that's pretty much it, and it (sort of) resolves the paradox I gave in 99.1, which is that somehow the expected value of the other envelope has to equal the $100 you found, which would suggest a 2/3 chance of the other envelope containing $50 versus a 1/3 chance of the other envelope containing $200.
Basically: since the amounts in the two envelopes are $N and $2N, we define the probability distribution of both envelopes the moment we define the probability distribution for N. And the distribution for the $2N envelope will necessarily be more stretched out in such a way that the smaller number will on average be more likely. That's how the paradox gets resolved.
(I hope no one else wrote this while my boss was over here)
I think this is a misstatement of the old "let's make a deal" puzzler, isn't it? Switching in that scenario makes sense because additional information is introduced.
Here the decision seems like it would be determined by the marginal utility you assign to an extra 0.25n dollars that an initial switch theoretically represents.
Now we can address the conundrum of whether you should giggle pathologically or shit your pants. For me, it would depend on who "you" are.
LB, if it helps, try this scenario:
1. I give you the opportunity to purchase, for $20, an envelope with either $40 or $10 in it. I've determined which it is by fair coin toss. Are you right to buy it?
2. I give you the opportunity to purchase, for either $10 or $40 (determined by fair coin flip), a spanking new $20 bill. Are you right to buy it?
Obviously the answer to the first is "yes" and the second is "no" -- it's counterintuitive, but if you assume fair odds and uniform prior distribution, you should always try to keep the unknown. The secret is that in real life it's not going to be fair odds and uniform prior distribution, as PMP says in 112, since whoever's playing this game with you doesn't have an infinite amount of money.
Similarly, a St. Petersburg lottery ticket really is worth an arbitrarily large amount of money, assuming you're in Gambler's Heaven where they play an infinite number of games (the expected value rises the more you play), despite the fact that that makes no sense and the real-world value of a ticket would be about $3.
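The St. Petersburg game itself simulates in a few lines, and shows why the infinite expected value never materializes at the table (a sketch):

```python
import random

def petersburg() -> float:
    """One St. Petersburg game: the pot starts at $2 and doubles on each heads."""
    pot = 2.0
    while random.random() < 0.5:    # flip until the first tails
        pot *= 2
    return pot

samples = [petersburg() for _ in range(100_000)]
# Each extra doubling is half as likely, so every term contributes $1 to the
# EV and the theoretical sum diverges; but the sample mean crawls upward only
# about as fast as the log of the number of games played.
print(sum(samples) / len(samples))
```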
Ah well, 100 nailed it well before I got to write my realization (which was actually already in pdf23ds's link, I just didn't bother paying enough attention to it because I was so annoyed at trying to interpret calculus equations in plaintext).
Also, from now on I should be able to remember pdf23ds's pseudonym without checking, because I now think of him as Doctor Slack if he suddenly became the Michael Jordan of working with Adobe Acrobat.
The secret is that in real life it's not going to be fair odds and uniform prior distribution, as PMP says in 112, since whoever's playing this game with you doesn't have an infinite amount of money.
Well, and also that there's not really any such thing as a well-defined uniform distribution over the positive real numbers. Its probability density function would be 0 everywhere.
I never actually read that link myself, because it contained calculus. (And because I wasn't planning on really getting sucked into this thread, I just wanted to make sure people knew this was a standard problem.)
117 - Yeah, good call. Wow, it's been a long time since I took my probability seminar.
117: Well, I'm really fuzzy on the math here, but I do believe that there's such a thing. The probability of any given point, or finite subrange, would be 0, but it would have a Lebesgue measure of something or other and mumble mumble mumble. But as I said, I'm fuzzy on the math.
Obviously the answer to the first is "yes" and the second is "no" -- it's counterintuitive, but if you assume fair odds and uniform prior distribution, you should always try to keep the unknown.
So, assuming fair odds and uniform prior distribution, look back at my 66. Who has more money after 100 repetitions, Brock or ttaM? Surely they have to have the same amount, right? Which means that 'keep the unknown' doesn't work.
I don't understand it completely, but it seems as if you can't assume fair odds and uniform prior distribution. (and of course if you can't, pd23ds' analysis works.)
LB! I thought at least you could get my name right!
I maintain that (1) this is not a tricky problem, as stated, (2) my analysis in 39 is right, (3) the two-envelope problem is a different problem, which is more paradoxical, although 55 seems to address it adequately.
Aw, crap. I was trying so hard not to just call you pdf that I got the end of it right and dropped the 'f'.
Chris. You have a blog, and it says your name is Chris. I can spell Chris.
123: But the problem, as stated, doesn't have a third envelope. And look at my 66 again. Who has more money after 100 reps, you or ttaM?
Now is it still simple?
120: Well, even if we construct something like a uniform infinite distribution for $N that's mumble mumble (and I assure you, I'm at similar constraints for dealing with non-complex functions, I did not pay enough attention in Measure Theory for this), the resulting distribution for $2N would still be a doubly-stretched mumble mumble that apparently makes the probabilities work out correctly according to the link.
If Jesus wants each envelope to have more money in it than the other one, that's what will happen, and the money will keep doubling until everyone has been blessed. Just keep switching.
If this doesn't work, contact President Palin immediately. It's the loaves and fishes economic plan she inherited from Dubya.
125: LB, in 66, you're not indifferent, you want B1. And I'm not indifferent, I want to switch to B2. And ttaM's not indifferent, he wants to keep B3, and not switch. All this is driven by your setup, which is that "All the A envelopes have N dollars, and all the Bs have either 2N or N/2". This means the B's have a higher expected value. Read 39 again.
Loaves and fishes were the original Ponzi scheme. Lucky Jesus had enough self-control to not push his luck and do it five times in a row until the whole crowd was buried by mountains of bread.
Well, I heard that Jesus did push it, and when the people found out they had been taken, well they crucified him. Literally.
125: In 66, you've fixed the probability distributions in such a way that ttaM would be balancing a loss of $N versus a gain of $N/2 while Brock balances a loss of $N/2 versus a gain of $N. And they would be able to figure this out after a few tries at the game, settling into an equilibrium where both take envelope B each time.
If the $N becomes a random variable, and you flip a coin each time to decide if the B envelope contains $N/2 or $2N, then it's still clear that the B envelope is a better choice, and after several trials (a lot more than in the first case) Brock and ttaM will probably settle on choosing envelope B after they notice its contents have a wider variability.
But basically, this is an artifact of labelling the two envelopes and denoting one as the "center" envelope (i.e., the one containing $N). Once you do so, the other can be identified and it will always be a better choice.
Now, here's where gangstas/impoverished grad students are quicker on the uptake than philosophers and mathematicians: They would grab both envelopes, given that the original owner appears to be a) up a cliff b) up a tree or c) on stilts, and run like crazy. None of this computing the odds, positing imaginary 3rd envelopes or talking in "n" and "2n" nonsense.
One of our kittens is insane, BTW. Or maybe hyperactive. It's kind of a toss-up at this point. If I hand you one kitten and offer to switch that kitten w/the other kitten, what are your chances of ending up with the insane kitten?
To continue on 128, and to answer your question "Who has more money after 100 reps, you or ttaM?" more directly, if ttaM and I both switch every time for 100 reps, I'll end with more money. Using your notation from 66, he'll have 100*N, and I'll have 50*(1/2)N+50*2N=125N.
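For anyone who wants to check that arithmetic, here's a quick simulation of the setup in 66. The dollar amount N=$100 is an assumed value purely for illustration; the envelope-stuffing rule (every A holds N, every B holds 2N or N/2 on a fair coin) is as described above.

```python
import random

random.seed(1)
N = 100          # dollars in every A envelope (assumed value, just for illustration)
reps = 100_000   # enough repetitions for a stable average

ttam_total = 0   # ttaM keeps envelope A every time
brock_total = 0  # Brock switches to envelope B every time
for _ in range(reps):
    ttam_total += N
    # each B envelope holds 2N or N/2 on a fair coin flip
    brock_total += 2 * N if random.random() < 0.5 else N / 2

print(ttam_total / reps)   # exactly 100.0
print(brock_total / reps)  # close to 1.25 * N = 125.0
```

The per-rep average for the always-switch strategy hovers around 1.25N, matching the 125N-after-100-reps calculation.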
Of course you freaking swap it. n/2 is still free money, and 2n would be awesome.
If I hand you one kitten and offer to switch that kitten w/the other kitten, what are your chances of ending up with the insane kitten?
It depends, is the kitten I've currently got attacking my thumb like it stole something?
120
"Well, I'm really fuzzy on the math here, but I do believe that there's such a thing. The probability of any given point, or finite subrange, would be 0, but it would have a Lebesgue measure of something or other and mumble mumble mumble. But as I said, I'm fuzzy on the math."
No. The integral from 0 to infinity is the limit of the integral from 0 to x as x goes to infinity. So if the integral from 0 to x is always 0 then so is the integral from 0 to infinity.
What I really want to know is, how can I get an envelope with a couple thou in it to cover closing costs, so I can use the 5k we've already got to shave a point off our interest rate?
103: Aargh, got posted on a Friday afternoon when I got sucked into the maw of a giant clusterfuck at work.
I heard it originally from christian h here. His formulation was a bit more of the classic 2-envelope where someone else opens the envelope, but does not tell you the amount, but offers the switch.
Thanks for reminding me of why I hated real analysis, Evil Heebie. (The answer is to keep the money, since Heebie is evil, if right.)
Also, kittens are *supposed* to be hyperactive.
137. Have the seller pay the point out of the sale proceeds. Your realtor should have already negotiated that for you.
141: Will do. At the moment the seller's offering 2,500 to cover repairs, which is insane--there's no way that'll cover repairs. However, we have learned that the house the seller is currently occupying is being foreclosed on, *and* they have a sick child with medical expenses (no idea on the age of the child), so they need cash, and they need it quick. So we want them to do the freaking repairs *and* give us the 2,500 for closing costs.
Any other advice, hmmm?
email me for free advice that will be worth what you pay for it.
You want to switch the first time and then not switch the second.
At the start your expected payoff is n.
Once you switch your expected payoff is 1.25*n.
Should you switch again your expected payoff goes back to being n. You lose.
This isn't a paradox.
I was away, and now I'm back; what is currently being debated? Reading the thread, I'm confused about the current questions.
As far as the original post, I think snarkout's description in 115 is the simplest explanation of why the two scenarios are not symmetrical:
1. I give you the opportunity to purchase, for $20, an envelope with either $40 or $10 in it. I've determined which it is by a fair coin toss. Are you right to buy it?
2. I give you the opportunity to purchase, for either $10 or $40 (determined by fair coin flip), a spanking new $20 bill. Are you right to buy it?
The answer is to figure out if there's a way to track down the owner, you amoral robots.
26 some bird in the hand was titmouse and the other one in the sky is crane, i looked up
and the first version is considered to be better
not s/b "now".
The answer is to figure out if there's a way to track down the owner, you amoral robots.
JP Stormcrow?
147: This explanation has only rendered the proverb more confusing to me.
Here is another way to think about the problem where two envelopes are prepared and you are given one at random. Suppose you pick a threshold x and switch if the first amount is less than x. Then if both amounts are less than x you will always switch and get the larger amount half the time. If both amounts are greater than x you will never switch and get the larger amount half the time. But if one amount is less than x and the other amount is greater than x then you will switch iff the first envelope contains the smaller amount so you will always get the larger amount.
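The threshold observation above is easy to sanity-check in code. The $100/$200 pair and the $150 threshold are assumed values, purely for illustration:

```python
import random

random.seed(0)

def play(low, high, x):
    """One round: receive a random envelope of the pair, then switch iff
    its amount is below the threshold x. Returns True if you end up
    holding the larger amount."""
    first, other = random.sample([low, high], 2)
    kept = other if first < x else first
    return kept == high

reps = 50_000
# Threshold strictly between the two amounts: you get the larger one every time.
print(all(play(100, 200, x=150) for _ in range(reps)))  # True
# Threshold below both amounts: you never switch, so you win only half the time.
wins = sum(play(100, 200, x=50) for _ in range(reps)) / reps
print(round(wins, 1))  # 0.5
```

So a threshold strategy beats even odds exactly when the threshold happens to land between the two amounts, as the comment says.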
One titmouse is definitely not worth two potential cranes. That is, if we are presuming that the "bird in the bush" is one that I am trying to shoot, as I think the proverb assumes. Cranes are far easier to shoot than titmeese.
147 cranes are obviously more valued than titmouses
maybe they look beautiful in the sky
sorry, i got confused too
so the titmouse is more like valuable b/c it's in the hand already
94
"Right. The linked analysis in 55, while I don't understand it fully on a quick reading, suggests that that's the answer -- that there really can't be a uniformly 50-50 chance that the other envelope is 2n or n/2, it's always going to depend on the value of n when you see it. Unfortunately, the reasoning is kind of over my head -- I might be able to work my way through it eventually, but I don't get it right now."
Yes. Suppose for example there are ten pairs of envelopes prepared containing (1,2),(2,4),(4,8) ... (512,1024) dollars and you are allowed to pick one pair at random and open one of the envelopes. Then most of the time you will see an amount between 2 and 512 (inclusive) and you will know the other is equally likely to be n/2 and 2*n so you will gain on average by switching. But if you get a 1024 you are certain to lose 512 by switching and this will balance all your gains so you will just break even by always switching. It is impossible to have a distribution that doesn't tail off at the top like this.
Heebie's formulation is the 2nd in a series of three "related" problems that I think expose some of the issues involved. (And I think each may have all been invoked somewhere on this thread, for instance 115, but tough shit, it's my puzzler and I'll post if I want to.)
(Assume the voice is an "honest broker", and you can't just grab and run and there is basically a lawful framework. You know, not like a presidential election in the US.)
Problem 1: You have $10 - someone offers to trade you an envelope that has a 50% chance of having $5 and 50% chance of having $20. This is straightforward, you should make the trade assuming trust, etc. for E(x) = $12.50.
Problem 2: Basically, what Heebie said, but lets say specify $10 in the envelope you open. This seems like 1 and you should switch, but see below.
Problem 3: Same as 2, but you don't see the amount. (Any argument on switching the 1st time here clearly leads to the infinite regress.)
Napi's solution basically takes problem 2 and turns it into problem 1 by stipulating a third envelope. I think the *real* problem with both 2 and 3 is that there is no single real world where they can happen as stated. In 2, once the problem assumes the person has $10 in the envelope, there is no actual "two-envelope" world where the 50% 2x, 1/2x condition can be met. Same is true for formulation 3, only worse.
Hey I know! Let's have a Bayesian vs frequentist fight!
And I'll add that the evolution of LB's thinking on this (as evidenced by her comments) paralleled mine almost exactly. (Although it started from a slightly different formulation of the problem.)
I think you should definitely switch if the first envelope contains an odd number of dollars. What kind of omnipotent tester puts loose change in one of the envelopes. In any event, you would hear the quarters jingling.
Hey I know! Let's have a Bayesian vs frequentist fight!
This is all I can think of anytime I see Bayesian vs frequentist come up.
Just claim you're a Bayesian.
"I hired the best tutors at Chicago to teach [Kenneth French] statistics. After 5 years, I gave up. The man is an imbecile. Then one of my Ph.D. students, a classic bullshit artist, claimed that as a 'Bayesian,' he wasn't bound by the same rules as everyone else. A lightbulb went off in my head. Obviously that would never fly at a Chicago, but surely it would work at that Truck School. I was right."
Shearer's 154 makes sense. As long as the distribution of possible payouts has a finite highest payout, there are some values for the first envelope for which you shouldn't switch.
It is only when you make the assumption that there is no highest possible payout that you should always switch. It doesn't really matter in that case because it would be highly unlikely that either envelope contained less than some absurdly high but finite amount.
147: titmouse . . . is considered to be better
Well of course it is.
There's equivocation going on here between two possible games.
1. Heebie prepares two envelopes: one contains $1000, one contains $500, she shuffles them and gives you one. She then offers a swap. Your expected value is $750 whether you swap or not.
2. Heebie prepares an envelope that contains $1000. She gives you it. She then tosses a coin (privately), heads she prepares an envelope with $500 in it, tails she prepares an envelope with $2000 in it. She then asks you if you want to swap. You accept, since your expected value is $1250.
On the description given, you can't tell which game she's playing. Only the first is equivalent to the "pick an envelope of two".
Note that if we start with the second game, then asking if you want to swap back is to revert to the first game.
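The two games above are easy to tell apart in a simulation (a sketch, using the same dollar amounts as in the comment):

```python
import random

random.seed(3)
reps = 100_000

# Game 1: two envelopes ($500 and $1000) are shuffled; you get one at random.
keep = swap = 0
for _ in range(reps):
    mine, other = random.sample([500, 1000], 2)
    keep += mine
    swap += other
print(keep / reps, swap / reps)    # both near 750: swapping changes nothing

# Game 2: you hold $1000; a coin flip puts $500 or $2000 in the other envelope.
hold = switch = 0
for _ in range(reps):
    hold += 1000
    switch += 500 if random.random() < 0.5 else 2000
print(hold / reps, switch / reps)  # 1000.0 versus roughly 1250: swap
```

Only in the second game does switching have an edge, which is the equivocation being pointed out.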
163: I've tried every kind of mouse, brother. Only the titmouse truly satisfies.
DS, darling. How I miss your worthy words at times. What is this untoward messing about with probabilities and whatnot? What about the human factor, the human factor: I ask you!
Y'all can make all the titty jokes you want, but you have to admit that the titmouse is a freaking adorable bird.
165: What about the human factor, the human factor: I ask you!
Parsimon mon coeur, well said as always. I can only agree.
(One might even add, what about the homonid-rodential factor? But I've added enough of that already.)
Please, we're not all francophones here. Homonid translated into English is homonest.
Hm, well, my MOTHER is a fan of the titmouse, B. So there's that.
Speaking of which, I'm not at all happy about this declared disaffection for the hot librarian motif that floats about here and there. Don't let the perfect asshole be the enemy of the good angel.
Birdwise, I saw a flock of egrets in a small pond on a recent bike trip. They looked like this. Not my photo, though.
Don't let the perfect asshole be the enemy of the good angel.
See, you don't even need the strikeouts.
Strikethrough, I mean. Whatever. I'm just over here doing my homonest thing.
I noticed that. I'll not forgive Palin for this, in any case. (I can't believe I said her name. No!)
Assuming that the first envelope contains a known amount of money N, and the second has a 50% chance of containing 2N and a 50% chance of containing N/2:
You should make the first switch. You shouldn't make the second switch.
The apparent paradox is that the situation seems symmetrical. The second envelope might contain twice as much as the first, or it might contain half as much as the first. The first envelope might contain twice as much as the second, or it might contain half as much as the second. So should you switch back and forth forever?
No. The situation is not symmetrical. In the first case, what you know is how much money, N, is in the envelope you've got. Whether the envelope you haven't got has twice as much or half as much, that amount of money N is the same. The expectation value of what you get from switching is (1/2) (2N + N/2) = 5N/4.
In the second case, call the amount of money in the envelope you've got M. In this case, you don't actually know M; you know N, the amount in the first envelope.
If the envelope you haven't got has twice as much, then M is smaller: M = N/2. If the envelope you haven't got has half as much, then M is larger: M = 2N. The expectation value of what you get from switching is (1/2) (2(N/2) + (1/2)(2N)) = N. (Well, of course--you KNOW the first envelope has N.)
No paradox. Stick with the second envelope.
I told you people the other day that probability was counterintuitive, but would you listen?
They never listen, Gonerill. They have to work it out for themselves. Eventually they may laugh.
176: What were the chances that we'd listen?
"We'd" s/b "they'd." Those who weren't tackling the Real Titmouse Issues.
176: Actually, the right answer here is intuitive, but trying to explain leads to confusion.
Intuitively, starting from x, you either get 2x or 1/2 x. If you lose, you lose 1/2 x, if you win, you gain x.
Even more intuitively, starting from $100 you either lose $50 or gain another $100.
But when you ask, "Why not change back? The other envelope is still either 1/2x or 2x the one in your hand," the answer to that is not intuitive.
179: You surely messed that one up, DS. Don't tell me you were secretly absorbed over whether the first envelope was just not enough? And felt the need to maximize. What is the expectation value?
Does anyone actually think this sort of thinking about choices generalizes to, you know, life? I genuinely don't know the answer.
Don't tell me you were secretly absorbed over whether the first envelope was just not enough?
I, of course, would take the first envelope and flee in case the cops arrived on the scene. But I don't assume my instincts are generalized.
The description linked in 55 is actually interesting. The amount of money in the first envelope is drawn from a given probability distribution. If that probability distribution tapers off (say because there's a fixed amount of money in the world), then it's not equally likely that the other envelope will contain x/2 or 2x the first one. If the probability distribution doesn't taper off, then you're dealing with infinities, and you probably can't talk in non-infinite terms about how much money is in the envelope and expected values and such.
Does anyone actually think this sort of thinking about choices generalizes to, you know, life? I genuinely don't know the answer.
The specific details of this paradox, probably not. But the process of reasoning through the mathematical implications of a problem stated in possibly vague words is surely useful in life. If nothing else, it will let you reassure yourself that no matter what the convincing man on the other end of the phone says about the deal he's offering you, he must be lying or mistaken.
Does anyone actually think this sort of thinking about choices generalizes to, you know, life?
Absolutely. There are two well known similar problems, the Grocery Store Line Choice and the Traffic Jam Lane Choice. In both of those, it's clear that whichever lane one chooses will turn out to have been the slow one. They are examples of the circularity of causation at the quantum level (where quantum means 'one thing', as in 'if it's not one thing it's another'). In other words, if you're Schrödinger's cat, you're always a goner.
There's a really neat description of why the traffic jam paradox isn't actually a paradox in that Traffic book.
I presume that the disembodied voice in the problem is a deliberate trope to distinguish the case at hand from the one-envelope problem that A. Michael Spence rode all the way to the Nobel prize.
For if we posit that the entity making the offer to switch is a real person possessing a similar utility function and actual knowledge of the value of the envelopes (as opposed to knowledge of the probability distribution function describing their values), we can reason as follows:
The other envelope contains either a greater or lesser amount of money than the one in my hand. If it were greater, then the other person would not offer to switch with me. Therefore I should keep my envelope.
IOW, the problem as stated implicitly rules out the existence of asymmetrical information.*
*the game theoretical concept, not McMegan's blog; would that it were so easy!
This must have been said before, but I want to win the thread, so:
As has been shown above, switching the 1st time increases your expected value from n to 1.25n, so switch.
If you switch back to the 1st envelope, which has n in it, you can either lose n (if the 2nd envelope has 2n in it) or gain 0.5n (if the 2nd envelope has 0.5n in it).
Assuming a 50-50 chance of either outcome, the expected value of switching back is:
(0.5*-n)+(0.5*0.5n)
This equals 0.25n-0.5n, or -0.25n. So you should not switch back.
The nonexistent "paradox" comes from the assumption that you can just assume the 2nd envelope has n and the first one now has either 0.5n or 2n. In fact this assumption is invalid and makes an accurate comparison of the expected value of the first switch and the second switch impossible.
so that birds proverb was a Russian national proverb, pretty wise imo
our proverbs tend to sound more like avanturistic
for example, 'am'd khun argatai - man alive has always some means'
or 'am'd yavbal altan ayaganaas us uuna' - you'll drink from the golden cup if alive'
and the proverbs emphasize not survival, but hope and luck and are used in the betting situations
saying that proverb our man will pick up the second envelope most probably
I'm winning this thread, though.
Just wait. This is the Hands on a Hard Body of threads.
The winner of Innocence was decided by executive fiat, if you'll recall.
197 - Which was briefly available on Google Video, but has now been taken back down.
Do you people really believe in disembodied voices from the sky? And I used to respect you so much!
Live and learn, I guess. I should have known that people who will go and get married have no sense about anything else either. Life is hard for people who are right all the time. (Me, not Heebie!)
Actually I rather like winning threads.
140: This kitten is hyperactive for a kitten. And is annoying the other kitten no end.
For reasons unknown, both of them think the Biophysicist's jeans are a scratching post. This annoys the Biophysicist.
203: Heebie is Donald P. Sinclair. She and the other FPPs are demeaning us while they make side bets via e-mail. I for one won't tolerate it.
It's a race! I'm winning!
Heebie: married. Sifu: married. After I tried to tell them.
Now, disembodied voices from the sky.
Next: prosperity theology, personal relationships with Jesus, and earnest approaches to their unsaved friends.
A number of people have given what seems to me to be the right answer. But a little more kibitzing to help dispel the appearance of paradox can't hurt. Here's my shot. First, let's rehearse the answer:
Scenario one. Two envelopes, A with $200, and B with $400. You are given one on the basis of a coin flip. Do you swap? No point. Because the chances are 0.5 that you were given A and 0.5 that you were given B. If you were given A your profit after swapping would be $200 and if you were given B it would be -$200. Expected profit: 0.5*$200 + 0.5*(-$200) = $0.
Scenario two. Three envelopes. A with $200, B1 with $400 and B2 with $100. Do you swap A for one of B1 and B2? Sure. Because the expected profit is 0.5*$200 +0.5*(-$100) = $50, which is >0.
What about swapping again? No. Because the chances are 0.5 you chose B1 at the first swap, and the result of swapping a second time would be a profit of -$200, and the chances are equally 0.5 that you chose B2 at the first swap, in which case the result of swapping a second time would be a profit of $100. Expected profit, 0.5*(-$200) + 0.5*$100 = -$50, which is less than zero.
Hiya, read!
Hmm, overfilled the comment-box. From the beginning of the last para in 207:
What about swapping again? No. Because the chances are 0.5 you chose B1 at the first swap, and the result of swapping a second time would be a profit of -$200, and the chances are equally 0.5 that you chose B2 at the first swap, in which case the result of swapping a second time would be a profit of $100. Expected profit, 0.5*(-$200) +0.5*$100 = -$50, which is less than zero.
But why the appearance of paradox?
Well, the first swap in the three envelope case is a matter of either halving or doubling your money. You end up with either $0.5x or $2x. The appearance of paradox arises from describing the second swap in the same way. But that is misleading, because the 'x' you would be halving is not the same as the 'x' you would be doubling. You equivocate in saying that the second swap leaves you with either $0.5x or $2x. With the first swap in the three envelope case, the 'x' or 'your money' referred to is the same whether you are halving or doubling, leaving you with an expected positive profit. With the second swap, the 'x' or 'your money' you would be doubling is one-quarter the 'x' or 'your money' you would be halving, leaving you with an expected loss.
And continuing from 210:
Putting it in the scare-quotes-formal mode helps to obviate the impression that you're in some sort of spooky superposition of states after the first swap. "Your money" refers either to the $400 or to $100, full stop. It's just that if it refers to, say, the $400, in which case swapping back to $200 is halving 'your money', then the result of swapping back to $200 from $100 should not really be described as doubling 'your money', but rather as doubling 'what would have been your money'.
Interestingly, you can treat scenario one as describable via a similar equivocation. Swapping is a matter of either doubling $200 ('x' or 'your money' if you were originally given A) or halving $400 ('x' or 'your money' if you were originally given B), each with probability 0.5. The difference is that here the 'your money' you would be doubling is not one quarter but rather one half the 'your money' you would be halving, leaving you with an expected profit/loss of 0.
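Running the three-envelope numbers from the scenario above confirms the asymmetry between the two swaps:

```python
import random

random.seed(4)
reps = 100_000

# Envelopes: A = $200, B1 = $400, B2 = $100.
# First swap: trade A for one of B1/B2 at random.
# Second swap: trade whatever you got back for A.
first_profit = second_profit = 0
for _ in range(reps):
    b = random.choice([400, 100])
    first_profit += b - 200    # expect +$50 per play
    second_profit += 200 - b   # expect -$50 per play

print(first_profit / reps, second_profit / reps)  # near 50.0 and -50.0
```

The first swap is worth +$50 on average, and undoing it costs exactly that back, so there's no perpetual-money machine.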
Heebie is Donald P. Sinclair.
This is exactly what I always think of when I say, "And.....GO!" Everyone staring back at me blankly.
.........GO! ......GO! ........The race has begun, and technically you're winning because you're nearest the door.
||
soup, once the power's back on, let us know how you & Houston are.
Anyone else out there in Ike's path?
|>
No deaths or injuries in Galveston, I read.
Where's your IKE index NOW, Sifu? YEAH!
214: Only because they haven't been found yet.
Well they must not be trying very hard to be found, then.
Dead people are funny that way. (Funny ha ha, not funny peculiar.)
You're doing great. Keep it up.
So here's the explanation, essentially that of the link in 55, that I like (roughly as explained by a friend of mine who is a philosopher of probability, though I may have added mistakes).
As has been pointed out above, the answer to the question depends on the method Heebie/God used to assign money to the envelopes. If they choose in some way that always puts, say, less than $1000 in the envelope, then the problem resolves easily (for example, if you have more than $500 in your envelope, don't switch). The interesting case is when Heebie/God has an infinite bankroll. So let's say Heebie has told us how she assigned money to the envelopes, and furthermore that she's done it as follows:
The two envelopes have $10^n and $10^(n+1), and the chance that Heebie chose that pair is 1/2^n.
Notice that 1/2+1/4+1/8+...=1, so this is sensible.
Ok, now say you open the envelope and see x=10^n dollars. Well either the envelopes contain x and 10x, or they contain x and x/10. And the latter is twice as likely as the former. So switching should net you (1/3)*10x+(2/3)*x/10 which is bigger than x. So you should always switch!
But now what happens before you open the envelope? Well no matter what number you see when you open the envelope you'll want to switch. But does that mean you should want to switch before you open the envelope? No!
Here's why. How much would you be willing to pay to play this game? Well, even if you don't switch you expect to get 1/2 + 10/4+100/8+... which is bigger than any finite number. So you should be willing to pay any amount of money to play even the one envelope version of this game. Now everything makes sense. Before you open the envelopes you should think of each of them as being worth the same as each other but more than any finite amount of money, so don't switch. The moment you open one of them you're going to be disappointed (no matter what happens you'd be willing to pay that money to play the game again), so it's not surprising that you'd want to switch.
For example, here's another version of the envelope paradox. Now there's no relationship between the two envelopes. Heebie's just put 10^n with probability 1/2^n in one of them and 10^m with probability 1/2^m in the other. (This version will work with even more envelopes.) Now when you open one envelope you should always be willing to immediately spend that money to buy the next envelope! But before you open the envelope there's no point in swapping it for another envelope that, as far as you know, is no different.
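The "twice as likely" claim in the 10^n setup can be checked empirically. Here we condition on the opened envelope showing $1000 (i.e. n = 3, an arbitrary choice for illustration):

```python
import random

random.seed(5)

def sample_n():
    """n = 1, 2, 3, ... with probability 1/2^n (a geometric distribution)."""
    n = 1
    while random.random() >= 0.5:
        n += 1
    return n

# Whenever the opened envelope shows 10^3, tally what the other one holds.
smaller = bigger = 0
for _ in range(500_000):
    n = sample_n()
    first, other = random.sample([10**n, 10**(n + 1)], 2)
    if first == 10**3:
        if other == 10**2:
            smaller += 1
        else:
            bigger += 1

# Seeing x, the other envelope holds x/10 about twice as often as 10x.
print(smaller / bigger)  # close to 2
```

That 2:1 ratio is what makes the conditional expectation of switching, (1/3)*10x + (2/3)*x/10, come out above x.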
There was also a lot of discussion of this topic over here in the mathblogosphere.
212: Dumb as it is, Rat Race cracks me up. I think that there should be don't-think-too-much madcap race or chase movie made every few decades to serve as an exemplar of the state of physical comedy for that period. Right now, in addition to RR, you have It's a Mad, Mad, Mad, Mad World and probably some Buster Keaton or Charlie Chaplin film from the '20s in the sequence.
So you should be willing to pay any amount of money to play even the one envelope version of this game. Now everything makes sense. Before you open the envelopes you should think of each of them as being worth the same as each other but more than any finite amount of money, so don't switch. The moment you open one of them you're going to be disappointed (no matter what happens you'd be willing to pay that money to play the game again), so it's not surprising that you'd want to switch.
This is why the envelope problem resembles the St. Petersburg lottery (another knotty problem in probability), which is the following game: you purchase a round at the table and flip a fair coin. The starting pot is a dollar. If it's tails, the game is over. If it's heads, the pot doubles. When the game ends, you keep the pot. Despite the fact that this game has an infinite expected value (expected payoff is 1/2 * ( $1 ) + 1/4 * ( $2 ) + 1/8 * ( $4 ) + 1/16 * ( $8 ) + ..., which is a divergent series), the fair value for a play is somewhere around three bucks; similarly, if you adjust the starting pot up or down, the fair value to play moves, even though the expected value supposedly hasn't changed.
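A simulation makes the gap between the infinite expected value and the typical payoff vivid (a sketch; the pot starts at $1 as in the description above):

```python
import random
import statistics

random.seed(6)

def petersburg():
    """One St. Petersburg game: the pot starts at $1 and doubles on each
    head; the first tail ends the game and you keep the pot."""
    pot = 1
    while random.random() < 0.5:  # heads
        pot *= 2
    return pot

payoffs = [petersburg() for _ in range(100_000)]
print(statistics.median(payoffs))  # a typical game pays only a dollar or two
print(statistics.mean(payoffs))    # the sample mean is unstable, creeping upward
                                   # toward the infinite expectation as you add games
```

Half of all games end on the very first flip, which is why nobody sane pays $100 for a ticket despite the divergent expected value.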
225: and the complication is similarly the issue with thinking of the probability distribution over the set of dollar amounts as flat?
The complication with the St. Petersburg lottery problem actually ends up involving things like decision-making strategy and the idea of diminishing utility (and the fair value is actually a little higher than I remembered - the Wikipedia page on the problem is good), as well as real-world constraints on the amount of money the house would have available to pay.
In Cryptic's revised version of the problem in 222, the envelopes basically resemble tickets to the St. Petersburg lottery, which are always theoretically worth more than any finite amount of money, even though it would be absurd to pay, e.g., $100 for a ticket.
226: You're doing really well, lil' buddy! Almost there!
Huh, it occurs to me that the St. Petersburg lottery problem is related in some ways to the "double on a loss" blackjack betting strategy that I've seen some very smart people fall for.
225, 228: I am thinking this is basically the same flaw with various "progressive" betting schemes. (Double your bet after each loss until you win, gaining a dollar after each sequence.) In practice you run into house limits; in Gambler's Heaven at some point you get to bet $1,099,511,627,776 to get back your dollar for that "sequence".
Fuck you Sifu. For that I am *so* winning this thread.
Everyone's doing great. Want some gatorade? It's got electrolytes.
re: 230
IIRC, that sort of Martingale strategy does 'work' but only if you've got infinite amounts of money and a casino that doesn't impose betting limits.
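Here's a sketch of why the Martingale fails in practice without that infinite bankroll. The $1000 bankroll, 100-round session, and 18/38 win probability (even-money roulette bet) are assumed values for illustration:

```python
import random
import statistics

random.seed(7)

def martingale(bankroll=1000, rounds=100, p_win=18/38):
    """Even-money bets with a house edge, doubling the stake after each
    loss and resetting to $1 after a win. Stops if the next doubled bet
    can't be covered."""
    stake = 1
    for _ in range(rounds):
        if stake > bankroll:
            break  # busted out of the doubling progression
        if random.random() < p_win:
            bankroll += stake
            stake = 1
        else:
            bankroll -= stake
            stake *= 2
    return bankroll

results = [martingale() for _ in range(5000)]
print(statistics.mean(results) < 1000)                # True: it loses on average
print(sum(r > 1000 for r in results) / len(results))  # yet most sessions end ahead
```

That's the seduction: the typical session shows a tidy string of one-dollar wins, while the rare catastrophic losing streak quietly eats the average.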
I figured out the "double on a loss" strategy independently when playing computer games. But even then I was shocked when I would lose 10 times in a row, something beyond the realm of human probability as I saw it.
234: well, right. Similarly, the St. Petersburg lotterly only works if you have infinite money and the casino has no betting limits.
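For the double-on-a-loss discussion above, here is a rough Python sketch (mine; the $1,000 bankroll and $500 table limit are made-up numbers) of why limits kill the scheme: you collect lots of $1 wins until a losing streak pushes the next recovery bet past the table limit or the bankroll, and on a fair coin the game stays fair.

```python
import random

def martingale(bankroll: int, table_limit: int, rounds: int) -> int:
    """Bet $1; double the bet after every loss, reset to $1 after a win.
    The scheme breaks down when the next bet exceeds the bankroll or the limit."""
    bet = 1
    for _ in range(rounds):
        if bet > bankroll or bet > table_limit:
            break  # can't place the recovery bet
        if random.random() < 0.5:  # fair coin: no house edge
            bankroll += bet
            bet = 1
        else:
            bankroll -= bet
            bet *= 2
    return bankroll

random.seed(42)
finals = [martingale(bankroll=1_000, table_limit=500, rounds=200)
          for _ in range(10_000)]
print(f"mean final bankroll: {sum(finals) / len(finals):.0f}")
```

The distribution of outcomes is the tell: many sessions end a few dollars up, a minority end badly down, and the average hovers near the starting bankroll. Add any house edge at all and the average tips negative.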
"lotterly"? That's the second time I did that; the first time I caught myself.
Heebie I don't think it's fair for you to try.
when i was in Las Vegas i tried to play a slot mashine, i put into it 10$, won it back, lost it again, thought i already lost 20$ and left, then i realized lost my breakfast in a paper bag, i forgot that i put it near the mashine, when i got back it was gone or maybe i couldn't find it between many mashines in a row
a banana with chips and water, we had to go very early to the Grand Canyon so i bought it in the evening
Lotterly, St. Petersburg has become a haven for infinitely endowed gamblers.
i could buy something to eat at the skywalk though, did not need to buy anything beforehand, so it convinced me like again that i'm right when i do not prepare things and go things just spontaneously
do, the cyrillic interferes with my typing, coz our ds are gs, sometimes it happens
In actual gambler world, I recall seeing numbers in the range of 80-90% for estimating the percentage of "fresh" money that goes into a slot machine that stays there in the end. In gambling probability talk, the rapid pace of slot machines results in exponents on steroids.
You can always tell when exponents are on steroids though because they grow hair on their properties.
And their mantissas are unreal!
As I hear it, first place is you're fired.
I'll do everything I can to help the most deserving commenter win.
If I knew how to close a comment thread, I would suggest sending me bribes. In two envelopes. No—three!
I hope heebie wins.
Suppose you were given a box with a truck in it, and you were told that you could switch to a box with either half a truck or two trucks. You wouldn't switch then, now would you?
ONE POINT FIVE TRUCK WILL MELT IN THE FIRES AS SURELY AS TWO
NONE SO BEAMISH SHALL PREVAIL
I hope Satan wins. He always seems to have such a rough time of it.
I wonder what Satanists are thinking. Isn't that kind of like betting on the Washington Generals?
Suppose you were with the one you love, and you were offered to either halve them, or double them. Now it gets harder to choose, again.
I've been doing very well for myself lately.
Actually an algorithmic strategy to win this thread would be interesting; you'd have to successfully guess (more accurately than anybody else) when the comments would be closed. Obviously, being able to close the comments yourself would be helpful in this, but assuming a baseline level of honesty on their part, and assuming nobody uses robots to comment, it becomes a relatively doäble, but fairly complex, problem.
What I would do is wait until there's a post/thread about why women are always so unemotional and sexually enthusiastic whereas men are so emotional and sexually apathetic, and yet men blame women for being emotional and sexually frigid, and then slip my comment to this thread in when it'll only be on the sidebar for a second before it's displaced.
I've been doing very well for myself lately.
Maybe then you should let heebie win.
If the problem can't be solved on the assumption that there are no robots, then the problem can't be solved, I say.
thread about why women are always so unemotional and sexually enthusiastic whereas men are so emotional and sexually apathetic, and yet men blame women for being emotional and sexually frigid
Sadly, all of the controversy in that thread would be over the question of whether it is possible to make those sorts of generalizations at all, rather than the accuracy of that specific generalization.
263: see I'm thinking you'd need some automation, to monitor the thread.
Note my assumptions above, though.
This was my "other way":
if ($comment->entry_id == 9208)
{
$comment->email($comment->email . ":" . $comment->author);
$comment->author("Beefo Meaty");
}
So it doesn't involve the power to close threads, per se.
As has been pointed out above, the answer to the question depends on the method Heebie/God used to assign money to the envelopes. If they choose in some way that always puts, say, less than $1000 in the envelope, then the problem resolves easily (for example, if you have more than $500 in your envelope, don't switch). The interesting case is when Heebie/God has an infinite bankroll.
I've been avoiding this thread since my last sub-100 comment, but I really don't understand this. Why does heebie need an infinite bankroll for the problem to be interesting? Why doesn't a bankroll large enough to cover $3n suffice? I mean, couldn't heebie put $1000 into an envelope, then flip a coin and put either $500 or $2000 into another, then hand me the $1000 envelope?
271: A quick review of her personal weblog suggests that heebie is blowing all her coin on capricious car-accessory purchases, rather than sending you the tribute you so richly deserve, ben.
So make it $.50, $1, and $2.
I'm not going to think, when I open the first envelope and see $1, that I should swap, because, had I seen heebie's yearly salary, there would be no way the other envelope could contain twice as much, so in the actually obtaining situation it's more likely that the other envelope does contain twice as much, and one would have to be a fool to do so.
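To put numbers on the bounded $.50/$1/$2 version: assume (my assumption, for concreteness) that heebie flips a fair coin to choose the pair ($.50, $1) or ($1, $2), then hands you one envelope of the chosen pair at random. A quick Monte Carlo sketch in Python shows the sensible strategy depends on what you see:

```python
import random

PAIRS = [(0.5, 1.0), (1.0, 2.0)]  # assumption: each pair equally likely

def play(switch_on_one: bool) -> float:
    """Always switch on $.50, never on $2; on $1, follow the flag."""
    pair = random.choice(PAIRS)
    i = random.randrange(2)
    mine, other = pair[i], pair[1 - i]
    if mine == 0.5:
        return other   # the other envelope must hold $1
    if mine == 2.0:
        return mine    # the other envelope must hold $1
    return other if switch_on_one else mine

random.seed(0)
n = 200_000
avg_switch = sum(play(True) for _ in range(n)) / n   # should be near 1.375
avg_stay = sum(play(False) for _ in range(n)) / n    # should be near 1.25
print(avg_switch, avg_stay)
```

Seeing $.50 or $2 tells you the other envelope exactly; only the $1 case is a genuine gamble, and there switching really is worth $1.25 on average, with no paradox, because the top amount is bounded.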
||
Someone send word to the Garance: I'm sorry, she can't be my blogcrush anymore. I've fallen for Kate Klonick of TPM.
|>
I'm loaded. The first envelope has a big $50, and I'm taking the winner to Sizzler's.
Garance does not want to be your blogcrush, perv. You're probably thinking of Jessica Valenti, a known hussy.
274: isn't the obvious conclusion, if you accept the "bubble" defense, that McCain is an incompetent executive you can't get his own campaign in order?
Someone should ask McCain if his conduct in this campaign can serve as a foretaste to world leaders for the sort of honorable treatment they'll get from a McCain administration, should he be elected.
"you" s/b "who". Excuse me. My mind is going.
That video is much more hard hitting than anything I've seen from Garance. Kate seems more crush worthy.
Also, if McCain is elected, the nation will collapse, and this election will be used by future generations as evidence that representative democracy doesn't work.
279.last: That's a grim outcome indeed, rob. We must fight with all our power to subvert this nasty conclusion.
This thread is real evidence of the depth of Heebie's egotism. It wasn't enough to always be right, it wasn't enough to get a 1000+ comment thread with her debut, now she wants a thread of *infinite* length.
Also, I saw this woman again last night and we spent half the evening in pleasant discussion and the other half making out like crazed teenagers, so it's possible I was overreacting before.
280: It's possible. Sounds excellent!
280: Did you tell her that you aired your anxieties about your previous date to a bunch of strangers on the internet? I hear that's a big turn-on.
Haven't read the whole thread, but isn't this a conditional probability problem? The tradeback is not the same as the initial trade because it's dependent on the probability embedded in the initial trade. If you work out the conditional probabilities correctly, the expected value of the tradeback will be n (vs. 1.25n to stop at one swap.)
Assume the double swap. Call B the probability of getting one envelope or the other; call A the probability of getting back your original envelope after a double swap. P(A|B), the probability of A given B, is 0.5 for each possible initial outcome: P(A∩B)/P(B) = .5/1. P(A∩B) is 0.5 because you have a 50% chance of getting each envelope, and that will always occur in conjunction with a swap back because we're assuming the double swap. P(B) is 1 because you know you're getting the original envelope back in either case. The expected value of a double swap is n (taking back your original envelope.) Therefore the expected value of a double swap is 0.5n + 0.5n = n. You should trade once and stop at an expected value of 1.25n.
the expected value of the tradeback will be n (vs. 1.25n to stop at one swap.)
Of course the expected value of the tradeback is n. There's only one other envelope and you know it contains n dollars. You shouldn't really need to do much calculation to figure that out.
A the probability of getting back your original envelope after a double swap
There are two envelopes. The probability of getting the original back after a double swap ought to be one.
The simpler way to say that is that, assuming the double swap, there's a 0.5 chance you'll be richer for a moment and then back at n, and a 0.5 chance you'll be poorer for a moment and then back at n; 0.5n + 0.5n = n expected value for a double swap.
Dude, SP, the simpler way is to say that the double swap is a noöp.
You will not be the last poster, Sifu. Never!
I'm glad you've come to accept the inevitable, Sifu.