Before we begin, see this similar article about a different mugging scenario, which demonstrated that all probability is conditional and that decision isn’t probability.
Here’s the serious version of Pascal’s Mugging, Eliezer Yudkowsky’s (“Yud” of Less Wrong), which makes little sense; paragraph breaks mine.
Blaise Pascal is accosted by a mugger who has forgotten his weapon. However, the mugger proposes a deal: the philosopher gives him his wallet, and in exchange the mugger will return twice the amount of money tomorrow. Pascal declines, pointing out that it is unlikely the deal will be honoured.
The mugger then continues naming higher rewards, pointing out that even if it is just one chance in 1000 that he will be honourable, it would make sense for Pascal to make a deal for a 2000 times return.
Pascal responds that the probability for that high return is even lower than one in 1000.
The mugger argues back that for any low probability of being able to pay back a large amount of money (or pure utility) there exists a finite amount that makes it rational to take the bet — and given human fallibility and philosophical scepticism a rational person must admit there is at least some non-zero chance that such a deal would be possible.
In one example, the mugger succeeds by promising Pascal 1,000 quadrillion happy days of life. Convinced by the argument, Pascal gives the mugger the wallet.
The previous article and the paragraph breaks should be a giveaway that the whole thing is silly. But if it isn’t obvious, here’s the breakdown.
We’re trying to get inside Pascal’s head and form some list of evidence that is probative of the proposition Y = “The mugger will give me X another day”, where X varies according to the deal. First, the mugger is always wrong: there is no probability for his promise. That probability exists only in Pascal’s head, deduced from the evidence Pascal decides to accept.
Some character comes up to you on the street and says “I’m a philosophical mugger. Gimme $100 and tomorrow I’ll give you $200”, and you’d form the evidence E = “This guy is a lunatic.” From which you’d deduce the probability of Y is 0; i.e. Pr(Y | E) = 0. Further, the higher the amount the mugger claims he’ll give you, the more you’d add to your evidence of the fellow’s insanity. “This guy is a lunatic. He has the power of granting 1,000 quadrillion happy days of life? Give me a break. And fetch me a straitjacket.” From this, too, you’d deduce the probability of Y to be 0.
If you like, you can say the “weight” of the probability of Y has increased, in the sense that each addition to the evidence list would on its own lead to a 0 probability of Y (or close to it, depending on how you internally phrased each item of evidence). Learn more about probability “weight” in this fine book.
Anyway, why the interest in such a simple non-problem? Because of the mistaken belief that “events” have probabilities. The mugger insists Pr(Y) > 0, and therefore ups his offer so that whatever decision rule (see the first article about decision rules) Pascal uses, eventually the rule will say it’s optimal to hand over his cash.
This is true; it follows. But it follows only because of the false belief that events have probabilities. There is no such thing as “Pr(Y)”: it doesn’t exist. It is always wrong. Only Pr(Y | E) exists; so we must have evidence E.
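Here is a minimal sketch, in Python with invented numbers, of how the exploit works once one grants a fixed, unconditional Pr(Y) = p > 0: the mugger simply raises X until a naive expected-value rule flips. The figures (the $100 wallet, p = 10^-9) are illustrative only.

```python
# Sketch: if Pascal grants a fixed unconditional Pr(Y) = p > 0,
# an expected-value decision rule can always be exploited.
# Numbers here are illustrative, not from the original story.

wallet = 100          # what Pascal is asked to hand over
p = 1e-9              # any fixed "probability" that the mugger pays up

def expected_gain(promised_x, p, wallet):
    """Expected value of taking the deal under the naive rule."""
    return p * promised_x - wallet

# The mugger keeps doubling the promise until the rule says "take it".
promise = 200
while expected_gain(promise, p, wallet) <= 0:
    promise *= 2

print(f"With p = {p}, the rule flips once the mugger promises {promise:,}")
```

The escape, as argued above, is to refuse the premise: given E = “This guy is a lunatic”, Pr(Y | E) = 0, and no promised X changes the answer.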
Is Pascal’s Mugging salvageable? Suppose Pascal’s E is the proposition “The mugger might pay me X”. But that carries no more information than “The mugger might not pay me X”, which is to say “The mugger might or might not pay me X”, which is a tautology, and which thus gives no information about Y.
No. The only way to save PM as a sensible problem is for Pascal himself to add evidence like, “I like the way this fellow looks; he has an honest face. People with honest faces keep their promises in bizarre situations like this sometimes, but not usually. He can keep his promise of giving me X if he really wanted to.” That’s a mouthful, but that’s how our minds work.
Still, there’s no reason to suppose anybody would think like that (except in fanciful, fantastic situations). If X is “Mugger will give me 1,000 quadrillion happy days of life”, nobody would believe it. I doubt anybody would believe it if the mugger promised a buck-fifty.
Such is the utility of the vaunted “thought experiment”, a favorite of economists and con-womyn.
Of course “events don’t have a probability” but their occurrences do. In fact, you said so using words like “likely” and terms like “Pr(Y) > 0”. Presumably you aren’t arguing otherwise.
The real problem with the outlined scenario is that the mugger gives no reason to foster the belief he would ever have the amount promised let alone be honestly willing to repay it. Note that banks limit their risks by insisting on collateral or by keeping to small amounts and, even then, insist on some indication you would have the money for repayment.
In the real Pascal’s Wager, humans supposedly are betting on the reward of Heaven. (Yeah, he said the existence of God but he really meant the existence of reward/punishment in the afterlife.) That at least has a believable non-zero chance of occurring. It seems your quoted example merely merges it with the St. Petersburg paradox by providing escalating (but believable) payoffs, but the quoted article offers no believable reason the proposed payoffs would occur.
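For what it’s worth, here is a small sketch of the St. Petersburg flavour the commenter mentions: with payoffs doubling as probabilities halve, every round contributes the same amount to the expected value, so the partial sums grow without bound. The setup (payoff 2^k with probability 2^-k) is the standard textbook version, not anything from the mugger story.

```python
# St. Petersburg paradox sketch: payoff 2**k with probability 2**-k.
# Each term contributes exactly 1 to the expected value, so the
# truncated sums grow without bound as more rounds are allowed.

def truncated_ev(max_rounds):
    return sum((2 ** -k) * (2 ** k) for k in range(1, max_rounds + 1))

for n in (10, 100, 1000):
    print(n, truncated_ev(n))   # prints 10.0, 100.0, 1000.0
```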
DAV,
No, whenever a term like “likely” is used, there are always tacit or implied premises. Always. But it would be tedious in colloquial speech to spell these out each and every time. I go over this in Uncertainty. Occurrences, events, propositions and so on do not have probabilities. Saying “Pr(Y)” is always wrong.
All,
I’ll leave it as a homework problem to show why Pascal’s mugging is not equivalent to Pascal’s wager.
This is akin to the Ludic fallacy from Nassim Nicholas Taleb’s book, The Black Swan.
Pascal (the Pascal in the story) is getting caught up in the game (probability) and losing touch with reality. It’s a fun thought experiment, but the example has no place in the real world — give me $100 today and I’ll give you $100 trillion tomorrow, probabilistically, anyway.
Note: Funny, I just wrote that and turned to see my $100 trillion note from Zimbabwe taped to my filing cabinet. So it is possible, given hyperinflation, for the example to work. Just change your conditionals …
Briggs,
No, whenever a term like “likely” is used, there are always tacit or implied premises. Always.
I didn’t say there weren’t implicit premises. I said that the occurrences of events have a probability. That doesn’t mean it is an absolute probability.
This example was intended to remind us of Pascal’s Wager by the use of the name Pascal. Maybe coincidental but I doubt it.
Looking at the inflation rate for Zimbabwe …
On November 14, 2008, the annualized rate of inflation was “89.7 Sextillion (10^21) percent.” And prices were doubling every day. Markets broke down afterwards, so no inflation rates were calculated after that date (https://www.cato.org/zimbabwe).
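A back-of-the-envelope sketch of what the quoted figure implies, assuming only the “prices doubling every day” detail (the starting amount and time spans are invented for illustration):

```python
# Back-of-the-envelope sketch: if prices double every day, a note's
# purchasing power halves every day. Figures are illustrative only.

purchasing_power = 100.0   # "today's dollars" worth of the note
for day in range(1, 31):
    purchasing_power /= 2
    if day in (7, 14, 30):
        print(f"after {day} days: worth about {purchasing_power:.10f}")
# After a month the note buys roughly a billionth of what it did.
```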
There’s a non-zero chance of winning the lottery too. In fact, most draws have produced a winner in the past, so it is clearly a much better idea to buy lottery tickets for the 100 bucks.
Absurd theories produce absurd results; that is how you know the theory is absurd. There’s even a Latin phrase for it, Reductio ad Absurdum, hence it must be true?
My favorite example of probability gone mad is …
There was a group who sold some raffle tickets, with a grand prize and some minor prizes. On the tickets, the group listed the probability of winning each prize.
The group held a weekly drawing and handed out whichever prize was drawn. One week (not the last), they drew the winner of the grand prize. Despite that, they continued selling raffle tickets, still advertising the grand prize.
When someone questioned them about selling tickets with the advertised grand prize already handed out, the organizer stated that the probability of drawing the grand prize had not changed; it remained the same regardless of whether the grand prize had already been awarded or not.
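A tiny sketch of the point being made: the advertised, unconditional figure is not the probability relevant to a buyer who knows the grand prize has already gone. The ticket count here is invented for illustration.

```python
# Raffle sketch: the advertised probability is computed as if all prizes
# were still available; the probability conditional on the known facts
# (grand prize already drawn) is what actually matters to a buyer.

tickets_sold = 1000

pr_grand_advertised = 1 / tickets_sold    # the figure printed on the ticket
pr_grand_given_already_awarded = 0.0      # conditional on what is now known

print(pr_grand_advertised, pr_grand_given_already_awarded)
```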
Pascal’s wager isn’t about probability – it’s about expected value. Expected value is usually estimated (and sometimes defined) using probability estimates, but that’s an over-simplification.
We do not know, for example, the probability of an asteroid big enough to wipe out life hitting us in the next ten years; but the expected value of the loss is infinite from the human perspective, a result you cannot obtain using EV = P × Cost for any P without cooking the books by assuming the cost infinite.
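A sketch of the arithmetic point: once the cost is taken as infinite, any strictly positive P multiplied by it gives the same infinite expected loss, so the choice of P does no work. The probabilities below are invented for illustration.

```python
# Sketch of the EV = P x Cost point: with an infinite cost, every
# nonzero probability yields the same (infinite) expected loss.

cost = float("inf")
for p in (1e-12, 1e-6, 0.5):
    print(p, p * cost)   # inf in every case
```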
DAV: I “grok” the meaning of events not having a probability. Probability implies randomness, and if you’ve been around here a while you recognize that random does not exist. If I drop a ball, will it EVER fail to respond to gravity and not fall? That would be absurd.
Instead of random chance, there is instead ignorance of the factors; a thing may have probability in your mind because you don’t know the factors or it would be impossible to compute anyway. But with sufficient computational power it is nearly certain that absolutely nothing happens randomly because all caused things require a cause, and nothing happens without a cause. Now those are assumptions you are free to not hold, but if you don’t, then ANYTHING can happen at any time, and you still have the problem of not being able to assign a probability, since the probability itself could also change at any time.
As someone else pointed out, Pascal’s wager isn’t about probability or likelihood, it is a “return on investment” social calculation. The cost of believing in God is basically zero, the reward is potentially enormous OR the reward is zero. There’s no “downside” in other words, no cost.
Now if I were to demand $100 from you and offer as a return a reward in heaven, now suddenly you are comparing real, immediate costs with YOUR calculation of likely returns in the future.
In what way is that different from gambling at Las Vegas? No meaningful difference, it is a gamble, and the offeror of the bet usually has an advantage. Those particular odds ARE built into the machines; but would you play if you knew you must lose? Instead, you think that you might get *lucky* and beat the odds; but there is no such thing as beating the odds; permitting someone to win now and then is also built into the system.
Combinations and permutations exist in specific scenarios, but if you could know all of the factors there still isn’t chance. Suppose I shuffle a deck of cards; it is a physical process, and with sufficient computing power you could know in advance how the cards are going to shuffle. Skilled card players can, to an extent, shuffle cards and have them ordered the way they wish.
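A small illustration of that last point: if the “randomizing” procedure and its inputs are fully known (here a seeded pseudo-random generator stands in for full physical knowledge of the shuffle), the resulting order is perfectly predictable. This is only an analogy, not a model of a physical shuffle.

```python
import random

# Stand-in for "knowing all the factors": a shuffle driven by a known
# seed is completely reproducible, so the order is no mystery at all.

deck = list(range(52))
random.seed(2016)
random.shuffle(deck)
first_order = deck[:5]

deck = list(range(52))
random.seed(2016)
random.shuffle(deck)
second_order = deck[:5]

print(first_order == second_order)   # True: same knowledge, same "shuffle"
```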
Perhaps there’s another way to look at this. Tweak it only slightly: The Mugger becomes a willing borrower (presumably having a poor credit rating) desperate for cash. Pascal becomes a Mafiosi-type lender, i.e., at usury rates. (One could quibble whether short-term payback at 2X the borrowed amount transcends traditional ‘usury’ rates, but let’s not bother.)
Now, we have a situation defined by precedent, in which the prospect for payback to Pascal is established, but not guaranteed. Of course, there is an implicit heavy, non-monetary penalty for failure to pay back the debt with exorbitant interest, which increases the incentive for a timely payback, but does this also increase its probability? Generalizing, when more information is added, can this meaningfully increase an event’s predictability? Just asking.
That guy’s alias is “Fatty Potter”, LOL
Michael 2,
Er, no. And random means unknown. It’s not a thing.
Probability is a statement of confidence in a result and thus is dependent upon the knowledge at hand.
Odds in gambling refer to the payout in either direction. It is possible to beat them if your assessment of the situation is better than that of the offerer. In stat problems the assumption is that the odds are “break even” and interchangeable with probability. Rarely happens in real life.
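A sketch of the odds/probability distinction being drawn: quoted payout odds imply a break-even probability, and a bet has positive expected value only if your own assessment of the chance, conditional on what you know, exceeds that implied figure. The stake and odds below are invented for illustration.

```python
# Payout odds of "k to 1" break even when the event has probability 1/(k+1).
# A bet is worth taking (in EV terms) only if your assessed probability
# exceeds that break-even figure.

def break_even_probability(odds_against):
    """Implied probability for payout odds of `odds_against` to 1."""
    return 1.0 / (odds_against + 1.0)

def expected_value(stake, odds_against, assessed_probability):
    win = assessed_probability * stake * odds_against
    lose = (1.0 - assessed_probability) * stake
    return win - lose

odds = 4.0                                 # 4-to-1 payout
print(break_even_probability(odds))        # 0.2
print(expected_value(100, odds, 0.25))     # positive: your edge
print(expected_value(100, odds, 0.15))     # negative: the offerer's edge
```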
In situations where the utility function can’t be determined or is not concave, no optimum can be calculated. Common examples of this are Pascal’s Wager, the St. Petersburg Paradox, and this weird mugger problem — all essentially the same problem without a real solution.
This overall seems quite similar to Pascal’s wager. Being religious has at least a small non-zero cost. The new movie, Silence, highlights a possible very large cost. Either way the offer is to convince the sceptic to bet on the big prize.