Another probability “paradox”, the *two-envelope problem*^{1}, goes like this:

Before you are two envelopes, A and B. One of them contains $X and the other $2X (which is equivalent to $Y/2 and $Y). You pick one envelope and are (1) asked if you would like to keep it or switch, or (2) open it, view its contents, and then asked if you would like to keep it or switch. Which strategy, keeping or switching, is likeliest to win you the big bucks?

**No peeking solution**

The traditional paradoxical solution to (1) argues as follows. Suppose you pick A, which can be said to have $X (recall you do not peek). There is then a 50% chance that B contains 2X (dropping the dollar signs, but understanding they are still there, lurking) and a 50% chance that B contains X/2. The “expected value”^{2} of B is said to be

0.5 * 2X + 0.5 * X/2 = 5/4 X.

So clearly you should switch, since the expected value of the envelope you did not pick is larger than X, which is the value of the envelope you hold.

But that’s nuts, because you could just as well have picked B, in which case the expected value of A is 5/4 X, too. Oh, what to do!
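The naive reasoning can be put in code. This is a minimal sketch; the function name and the example value X = 100 are illustrative, not part of the original argument.

```python
# The naive "always switch" calculation: treat the envelope in hand as
# holding x, and average the two possibilities for the other envelope.

def naive_expected_other(x):
    """Naive expected value of the unopened envelope:
    a 50% chance it holds 2x and a 50% chance it holds x/2."""
    return 0.5 * (2 * x) + 0.5 * (x / 2)

print(naive_expected_other(100))  # 125.0, i.e. 5/4 of X, so "switch"
# Run the same computation from the other envelope's point of view and
# it also returns 5/4 of that envelope's amount -- hence the paradox.
```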

There’s a lot of nervousness about this problem; it has, as I said, been called a “paradox” because it does not conform to our understanding of the theorems of decision analysis.

To solve this, understand that there are unstated, tacit premises here. One is that X > 0; another is that X is real, i.e. X can take infinitely many values, and even approach infinity itself. The first assumption (positivity) is unproblematic. The trouble comes from assuming X is “real.”

Now, “real” numbers are one of those creations whose names lie. That is, “real” numbers are *un*real, in that most of them cannot be seen nor can they be constructed. Actual-real numbers are integers, or ratios of integers, where the integers can be large but finite. What if instead of assuming X “real”, we assume what is true, that X is an integer and is finite? Does that help us out of the paradox?

Take a close look at that “expected” value. Can we actually ever see “5/4 X”? We cannot. By the premises, we can only ever see X or 2X (or Y/2 or Y). The value “5/4 X” is impossible. Thus, we do not ever “expect” to see it.

Well, this is no answer, is it? It is just fretting over an unfortunate naming of a function of the data. Not quite.

What we have learned, in this paradox, is that expected value, whatever be its name, is not always useful as a criterion for decision making when we have discrete, finite information. We just assumed that it would be.^{3} Thus, the trivial solution to this paradox is nothing deeper than acknowledging that a one-number summary of a complex situation is, at times, insufficient for a decision criterion.

We can say there is a 50% chance that the envelope you did not pick has 2X and another 50% chance that it has X/2. And that’s it; that’s as simple as we can make it. We cannot do away with either of the probabilities, or bury them inside some function.

The answer, strictly according to probability and *not* expected value: Switching or keeping makes no difference.

Note *carefully* that there is not one word here, or anywhere, about “repeated” trials of this game. *You only get to play once!*

**Peeking solution**

At first blush, seeing $X when you open, say, envelope A is no different than the no-peeking situation. After all, B can still contain $2X or $X/2, right? Well, if X is real, then maybe that’s so (though there is another argument, which is discussed in Part II).

But if X is actual-real, discrete and finite that is, then the situation *is* different. Suppose the units of X are dollars: it makes no difference what currency you use; the only point is that X comes to us in discrete, indivisible chunks (we could do pennies if you insist on the most basic unit).

It’s also true that if somebody were to play this game, they would not have an infinite amount of money available. There would be at most N dollars, where N can be as large as you like, just not infinite.

Suppose you pick A and find it contains $1. Do you switch or keep? Switch! Because there is a *0% chance* that B contains $1/2, which is an impossible amount because, don’t forget, X comes in discrete units and cannot be found in fractions. In fact, if you see *any* odd value of $X then you should switch, because we are 100% sure that the other envelope contains $2X! Switching will always double your money if $X is odd.
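The odd-amount rule can be sketched as a (hypothetical) helper; the whole-dollar assumption from above is what does the work.

```python
# If the envelopes hold whole-dollar amounts X and 2X, then 2X is always
# even; an odd opened amount must therefore be the smaller amount, X.

def should_switch_on_odd(opened_dollars):
    """True when the opened amount is odd, which guarantees the
    other envelope holds double."""
    return opened_dollars % 2 == 1

print(should_switch_on_odd(1))   # True: the other envelope cannot hold $0.50
print(should_switch_on_odd(7))   # True: it must hold $14
print(should_switch_on_odd(10))  # False: $10 could be either X or 2X
```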

What about $X even? Well, then we are closer to the no-peeking solution, because we might think there is a 50% chance that the envelope you did not pick has 2X and another 50% chance that it has X/2. But we actually have *more* information, which says we should switch only up to a point.

The total amount of money in the game is 2X + X = 3X, which must be less than or equal to N. Suppose you open, say, A and find some amount W such that W + 2W > N. Then you are *100% sure* that B has W/2, because if B held 2W the total would be 3W > N, which is impossible. You should keep, and not switch, any envelope where three times its amount is greater than N. We can call these keeper Ws “large.” One consequence is that large Ws will always be even.

But we *should* switch when X is even and we know that 3X < N, because we gain information just by knowing this fact. Just how that information plays out will be explained in Part II.

Thus, when N is known, and X is actual-real, we have a solution *guaranteed* to maximize your profit. (Incidentally, expected value calculations sometimes work here, but only when X is odd or “large.”)
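Putting the pieces together, here is a hedged sketch of the peeking strategy when N is known. The function and its labels are my illustrative assumptions; the even-and-small branch is the case whose full justification is deferred to Part II.

```python
def peek_strategy(opened_dollars, n):
    """Decide what to do after opening one envelope, given that the
    total in both envelopes cannot exceed n whole dollars."""
    if opened_dollars % 2 == 1:
        return "switch"  # odd: the other envelope must hold double
    if 3 * opened_dollars > n:
        return "keep"    # "large": the other envelope must hold half
    return "switch"      # even and small: switch (see Part II)

print(peek_strategy(1, 100))   # switch: $0.50 is impossible
print(peek_strategy(40, 100))  # keep: 3 * 40 = 120 > 100
print(peek_strategy(10, 100))  # switch: even and small
```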

*In Part II: the solution for unknown or unevenly distributed N.*

————————————————————————————————

^{1}I’ve not seen either of these solutions in exactly this way, which means little; that is, the solutions might be out there and merely unknown to me, which is the most likely case. Mark D. McDonnell and Derek Abbott offer a repeated-trials “randomly” switching solution for real X when one can peek. They also wisely consider and offer a solution for a bounded N. We examine this in Part II.

Sandy Zabell is credited with bringing this problem to the attention of Bayesian statistics. But he called it the “exchange paradox.”

John Norton also fingered the reliance on expected values to be the sticking point in the no-peek game, but was loath to give up on the device. He invented an “improper probabilities”—i.e. probabilities which are not probabilities—solution to leap from the pit.

^{2}“Expected value” is the sum of everything that can happen multiplied by the probability of everything that can happen. Think of it as a weighted average.
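A toy illustration (mine, not from the text) of a weighted average that is itself an impossible outcome, just like 5/4 X:

```python
# Expected value of a fair six-sided die: each face has probability 1/6.
from fractions import Fraction

faces = [1, 2, 3, 4, 5, 6]
expected = sum(face * Fraction(1, 6) for face in faces)
print(expected)  # 7/2, i.e. 3.5 -- a value no single roll can ever show
```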

^{3}Expected value can also be used to say that the expected value of the envelope you hold is 0.5 * X + 0.5 * 2X = 3/2 X. But that’s also the expected value of the other envelope. Both are equal—and *impossible* values—so switching or keeping are equivalent. Once more, sticking to the probabilities gives the unambiguous answer.

19 July 2010 at 8:09 am

Matt:

Is not EV = 1.5X, i.e. 0.5*X + 0.5*2X?

19 July 2010 at 8:40 am

Bernie,

In one view, yes. See footnote 3.

19 July 2010 at 9:26 am

Two equal and undecidable probabilities? The cats are starting to look nervous.

Normally when one starts mentioning envelopes and maximising sums of money, the politicians look nervous too.

19 July 2010 at 9:39 am

I’m wondering if the problem comes from the mechanism for selecting the numbers. Does the analysis of the paradox implicitly assume that the first number is a positive real variable with a flat distribution? That would be impossible, wouldn’t it? As would a positive integer variable or a positive rational variable, etc.

Then, given a positive real variable W with some valid probability distribution (not flat), you would need to analyze the problem as a conditional probability, e.g., given that W=4, what is the probability that X=4 (W is the original number) versus the complementary probability that X=2 (W is the doubled number).

I don’t have time to chase this around right now. Maybe someone else does.

19 July 2010 at 10:15 am

Kevin,

See Part II (tomorrow, I hope).

19 July 2010 at 10:31 am

Matt I must be dense:

You say

“There is then a 50% chance that B contains 2X (dropping the dollar signs, but understanding they are still there, lurking) and a 50% chance that B contains X/2.”

Why X/2?

This paradox reminds me of the conundrum of how you choose among 2 or more equally qualified candidates for a position, or how you choose between H and T in a coin toss. The former event is captured effectively and dramatically in Dumas’ Man in the Iron Mask. There is no issue as long as you do not have to legitimate or explain your choice – you simply and arbitrarily choose. What tends to happen when people are forced to rationalize their decision is that they create criteria that ensure a single, but essentially arbitrary outcome, e.g., birth order of male twins, seniority in promotions (recall Michael Caine and Stanley Baker in Zulu). The consequences of there being potential legitimacy afforded to other possible heirs led to the standard practice in Byzantium and Islam of slaughtering all other potential legitimate family members.

19 July 2010 at 11:25 am

I think the “paradox” in the no-peeking case is simply a result of bad math. If X is a random variable with a known distribution, and f() is a function, then Y=f(X) may make sense and it may be possible to calculate an expected value for Y. But “f(x)=2x or x/2” is not a proper function of x, and saying Y=f(X) makes no sense.

To do it properly, let L and 2L be the amounts in the envelopes, and assume Prob(X=L)=1/2 and Prob(X=2L)=1/2. The expected value of X is then E(X)=L/2+2L/2=1.5L. Let Y be the amount in the other envelope. Then Y=3L-X, and E(Y)=3L-E(X)=1.5L. The expected values of X and Y are the same. Note “Y=3L-X” describes Y as a proper function of X, while “Y=2X or X/2” does not.

19 July 2010 at 11:44 am

Forget all about probabilities, all I know is that IF I know there is a large sum of money in the envelope, I’d probably think thrice about trading it, but IF I know there is only a penny or a dollar, I wouldn’t hesitate to trade. IF only we know how to assimilate all possible, potential useful information properly. ^_^

Bernie,

After you open your envelope and find x dollars, you then reason that the remaining envelope has either $x/2 or $2x.

19 July 2010 at 12:13 pm

There’s a decent write-up of a Bayesian solution to this ‘paradox’ in The American Statistician, 1992, by Christensen and Utts. Link for anyone who has access to JSTOR is http://www.jstor.org/stable/2685310.

19 July 2010 at 1:26 pm

Unfortunately, this problem is called a “paradox” not “because it does not conform to our understanding of the theorems of decision analysis”, but simply because the expected value is not calculated correctly.

Suppose that one envelope contains A dollars, and the other contains B dollars, with A>B (A=2B is a special case).

Then either you get the envelope with A dollars, and you lose A-B by switching, or you get the envelope with B dollars, and you gain A-B by switching. Since the probability of each outcome is 0.5, then the expected value of switching is zero, and you should be indifferent whether to switch or not to switch.

19 July 2010 at 1:30 pm

JH:

Got it. Obviously I missed the point of the paradox, ergo I was being truly dense. Many thanks.

19 July 2010 at 6:07 pm

I pick Envelope B, no switching. Please send me the cash, or a check for the same amount.

19 July 2010 at 6:09 pm

I prefer simpler decision strategies. Like, how do you tell if a bug is dead? Step on it. If it goes crunch it’s dead. If it goes *squish* it’ll be dead. Probabilistically speaking, of course. Learned that one in grade school.

19 July 2010 at 10:40 pm

If you know N, then it’s not so much fun. What would be fun, is not knowing N. In that case, I would just want to see what one of the envelopes has and I would decide based on whether I’m willing to risk losing half of what I have. So, I pick strategy 2, even though it doesn’t answer the probability question.

20 July 2010 at 7:39 am

If it’s $100 I’ll switch because I need $200, or else my legs get broken.

20 July 2010 at 10:12 am

SM,

Right, that is more fun. It will turn out that we still have to specify some kind of guess about possible values of N. That is, it can never be entirely unknown.

Joseph Levy,

Well, saying that the calculation as I illustrated it (which I also do not agree with) is incorrect is a fine claim, but there must still come a proof that it is incorrect. That is, we must definitively say why expected value cannot be thought of in that way. My proof is that it cannot be used at all. Let’s remember that expected values have the tinge of frequentism about them; that is, of repeated trials. But we’re not in a repeated trials situation. You only play once!

Wesley,

You wouldn’t happen to have a copy of this paper? I’m not behind a university firewall and don’t have JSTOR.

JH, Bernie,

See Part II.

20 July 2010 at 10:38 am

Joseph Levy is right that the expected value is not calculated correctly for the ‘non-peeking’ scenario.

Restating the problem with amounts A and B with A>B leads to amounts A-B, B, A+B, (i.e. an arithmetic series) the expected value of which is simply the arithmetic mean of the two possible values.

The correct way to calculate the expected value (without restating the problem) is to note that the corresponding values i.e. X/2, X, 2X are a geometric series for which the expected value is the geometric mean of the two possible values i.e. E = exp(0.5 ln (X/2) + 0.5 ln (2X)) = X.

Thus, the error in reasoning that leads to the paradox is the calculation of an arithmetic mean instead of a geometric mean.

There is no need to peek, nor to consider a bounded distribution to avoid infinity.

Footnote: the probability distribution of the three values X/2, X, 2X is a combination of two separate probability distributions. The first value X may be considered to be a random variable in uniform distribution, i.e. the values are evenly distributed along the number line. However, because the second value is either half or twice the first value, its values are not uniformly distributed along the number line. Instead, its probability distribution decreases with increasing X (it is the logarithms of the values that are evenly distributed). The two sums of money should therefore be treated as pairs of values with different probability distributions.

24 July 2010 at 12:23 am

I would like to emphasize StephenPickering’s last sentence: “The two sums of money should therefore be treated as pairs of values with different probability distributions.”

If we know one of the values is X, there is nothing in the problem saying or implying that there is a fifty percent chance that the set of values is (X/2, X) and a fifty percent chance that it is (X, 2X).