# Probabilities Aren’t Decisions

The title says it all. Yet the fact that it needs saying implies, truly, that people often mistake the two.

If you decide to bet fifty bucks the USA beats Germany, I cannot learn the probability you formed for that proposition. That is, you had some list of premises probative of “USA beats Germany” (including tacit premises on the date of the game, knowledge of the laws of the game, etc.). Now if I knew those premises, I, like you, would be able to deduce the probability of P = “USA beats Germany.”

Unless you’re a mathematical geek, that list of premises will be vague, in flux, imprecise, not articulable. The probability of P given this imprecision won’t be a number. Most probabilities aren’t numbers. If you say, “I think there’s a good chance P is true”, then this “good chance” is not a number.

And there’s no reason in the world it should be—though because of a strange desire that everything should be quantifiable, some insist all probabilities, and all decisions, must be numbers. This is yet another doorway through which over-certainty enters. Forcing somebody to state a unique single number for a probability when the evidence (premises) under consideration does not warrant it, and indeed says the probability is “fuzzy”, is always a mistake.

But even if you are a mathematical geek and have compiled a set of premises that allow a quantification of P—and there’s no *a priori* reason the mathematician’s premises and probability are superior to the sports bettor’s premises and probability—there’s no direct way to move from that P to a decision. Probabilities aren’t decisions. In fact, it is the conflation of the two, decision and probability, that leads some to conclude that all probabilities must be unique numbers.

Now there exists a field called *decision analysis* that shows how to take probabilities as inputs to a function which provides a decision. But there is a choice of function, depending on how one views the consequences of making a decision. That’s a subject for another day. For us, it’s enough to understand there is another input: the things risked and rewarded. For sports betting, these things are money, but in other situations they could be anything.
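To see how the choice of function matters, here is a minimal sketch in Python. The even-odds $50 bet, the $500 starting wealth, and the log utility are all made-up assumptions for illustration, not anything from the post:

```python
import math

def expected_utility(outcomes, u):
    """outcomes: list of (probability, change in wealth); u: a utility function."""
    return sum(p * u(x) for p, x in outcomes)

# Hypothetical even-odds $50 bet, starting from $500 of wealth.
bet = [(0.5, 50.0), (0.5, -50.0)]
wealth = 500.0

# A risk-neutral bettor values the bet at its expected dollar gain: exactly 0.
risk_neutral = expected_utility(bet, lambda x: x)

# A risk-averse bettor (log utility of final wealth) values the same bet at
# less than doing nothing (the difference below is negative), so she declines.
risk_averse = expected_utility(bet, lambda x: math.log(wealth + x)) - math.log(wealth)
```

Same probabilities, opposite decisions: the choice of utility function, not the probability, settles what to do.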

Money sure sounds like a number, and it is. But the difficulty is knowing how much money to risk and how much one would like as a reward. Figuring these kinds of numbers is just like figuring probabilities: they require precise, explicit premises which allow us to deduce the amounts. You won’t have these with betting whether P is true, just like you won’t have the exact premises which allow a deduction of the probability of P.

The same math geek can, of course, invent a list of premises which allow deduction of the amounts, but there’s no telling—in advance—whether his premises are superior to yours. Forcing a number to be precise when it doesn’t want to be allows you to make formal calculations, and this exercise gives the appearance of rigor and certainty. And that’s the problem: *appearance*.

This is the problem with decisions of major consequence. Take rampant, out-of-control global warming (which will strike any day now). Economists plug fixed probabilities of doom and the monetary values of that doom into gorgeous equations and out come answers. These answers are then reified. They *become* the reality; or rather, conditional prophecies of the form “Unless thou repent of carbon dioxide and put us in charge of all aspects of your life, you will pay exactly this much.”

Anyway, that’s beside the point—which is that probabilities aren’t decisions. Knowing the probability of anything, unless that probability be an extreme probability (0 or 1), does not tell you what to do.

More later…

*I’ve decided to extend the special on typos through Saturday. Enjoy.*

These days, a nice numerical probability tells you that more money needs to be spent on research so that researchers can come up with another nice numerical probability, so that way more money can be spent ad infinitum.

Are you about to take a dive into the worlds of Peter Walley and Glenn Shafer? That would be fun…

So, people don’t know what to do or what decision to make even if they evaluate that there is a 0.7 probability of making a $100,000 profit and a 0.3 probability of breaking even. Of course, no one knows that there are consequences to their decisions and choices, since they are math geeks.

And, of course, probabilities cannot talk, therefore they cannot tell you what to do.

I don’t get it?! You have some good points in your posts, yet those points seem to be blurred out by some strange imaginings of what other people know, think, and do wrong. Perhaps it’s me… I live in a different world.

The essence of gambling has always been two-fold: knowing (or believing) how right you are and managing your money. Interestingly, the management part requires the probability part to go from being a measure of certainty to an estimation of winning N/M times.

Assuming you have narrowed the probability part as well as –er– possible, the management part is still hard. Kelly’s rehash of Bernoulli’s utility function (as originally written) doesn’t take into account the distribution of payoffs over time (i.e., from one race to the next), so it pretty much forces placing bets on all horses. Seeing a horse with a huge payout and a large probability of winning doesn’t automatically mean betting a huge amount if similar future horses (i.e., same probability) will pay diddly-squat. Yeah, it’d be great IF you win, but if you lose you could be placing yourself in a very deep hole that’s difficult to climb out of.

… but it is very likely that the cat will eat the bacon it is tied to, to paraphrase a Dutch proverb. Economists love to confuse this by adding more BS to the equation.

First the punter has his estimate of the likelihood his team will win.

And then the economist goes screwballs.

What is the satisfaction he will get from winning the bet compared to his dissatisfaction from losing the bet? First it is assumed that these cannot necessarily be quantified, but can be ordered. Then they lay out some equation as if all of these parameters could be quantified and have been measured.

DAV,

I think that Kelly’s analysis has some interesting results. If you have superior (but not perfect) information, and a finite endowment, there is a strategy to maximize the long-term growth of your resources.

Doug M,

Believe me, it won’t work in modern racing as written. For one, his estimation of the probability of winning was exactly that of the crowd. Shuffling money around might get you ahead but your edge is really small. You’d be better off taking the money to a bank.

Doug M,

If you are talking about investing in, say, the stock market, the distribution of payoffs is needed there as well. Fortunately, in the stock market you want to bet with the crowd. OTOH, you could still dig yourself a really deep hole if you place a lot on a (relatively) sure thing and turn out to be wrong. Kelly/Bernoulli would tell you to do just that.

The lesson here is not to gamble on sports – that, and math philosophers should probably stay out of politics.

I think you should lay off what other people believe. I don’t know anyone confusing decision and probability. We get the difference.

When you bring climate change into that, a simple cost benefit analysis shows anyone who isn’t a liar, crook, or moron, that less pollution and the cleaner technology that goes with it is worth the cost.

JMJ

JH,

Your example is incomplete.

1. You give 0% to the chance of losing your original stake, which is exceedingly rare for real investments.

2. You don’t give the amount of money invested.

Let’s adjust your probabilities a little.

0.4 $100,000 profit

0.2 Break Even. No profit, No loss.

0.2 Lose 50% of original investment

0.2 Lose entire investment stake.

More realistic, but still not enough by itself to make a decision. Without knowing the initial investment, you still don’t know anything about the down side risks.

If the original investment is only $1 the answer is probably hell yes, but if the original investment is $100,000,000 the answer is going to be different.
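The arithmetic behind this can be checked with a few lines of Python (a sketch, taking the $100,000 profit as fixed regardless of stake, as the adjusted probabilities above imply):

```python
def expected_profit(stake):
    """Expected profit under the adjusted probabilities listed above."""
    return (0.4 * 100_000            # 0.4: $100,000 profit
            + 0.2 * 0                # 0.2: break even
            + 0.2 * (-0.5 * stake)   # 0.2: lose half the stake
            + 0.2 * (-stake))        # 0.2: lose the entire stake

# A $1 stake has a healthy positive expectation...
small = expected_profit(1)             # +$39,999.70
# ...but a $100,000,000 stake is ruinous in expectation.
huge = expected_profit(100_000_000)    # -$29,960,000
```

Identical probabilities, opposite answers: the stake (the down-side risk) is doing the deciding, not the probabilities alone.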

JMJ,

“When you bring climate change into that, a simple cost benefit analysis shows anyone who isn’t a liar, crook, or moron, that less pollution and the cleaner technology that goes with it is worth the cost.”

Your cost / benefit analysis isn’t simple, it’s overly simplistic.

The worst case scenario (very low probability) is that we have to go back to pre-industrial CO2 emissions levels to avoid catastrophe.

Wind and Solar will never be able to provide more than a tiny fraction of our current energy needs. We can’t build nuclear fast enough and we can’t transfer off liquid fuels for transportation fast enough to maintain current energy consumption.

The cost you are talking about is going back to a pre-industrial life style. There are no benefits to this if the worst case scenario doesn’t come true.

MattS,

You are right. My quick example may not contain enough information for a multi-tenth-billionaire to make a decision. Please feel free to add any premises that would make the decision easier or possible. Time frame. Risk attitude. Alternatives. Utility function. And so on.

Anyway, still, your example demonstrates that, due to uncertainty, we should not overlook those probabilities; there is information contained in them that can affect our decision making… even if the probability is assessed to be a non-quantitative “extremely rare.”

A while ago, a good friend, who works in a well-known investment firm in NYC, told me he was given a very important and exciting project. A project to calculate the insurance premium to insure the investments of filthy-rich investors! So, 0% chance of loss might seem extremely rare to us, but not to some rich investors.

I am not a tenth-billionaire, but life is good!

Matt, I am in no way countenancing a reversion to primitive technology (quite the opposite, while you apparently advocate being stuck in the status quo). And I don’t know why you think I need a lesson in risk aversion. If you knew risk aversion a little better, though, you probably wouldn’t seem so conservative.

JMJ

@ Jersey McJones

Only if you live forever.

“When you bring climate change into that, a simple cost benefit analysis shows anyone who isn’t a liar, crook, or moron, that less pollution and the cleaner technology that goes with it is worth the cost.”

You can’t possibly say that without knowing the cost. And this cost will be paid in dollars as well as human lives (taking money away from more immediate needs). This is dangerous thinking from inside a single-issue box. Guys like Bjørn Lomborg have thought this through in a much better way. I certainly wouldn’t consider him a liar, crook, or a moron. Just a smart guy who thinks outside of single issues.

DAV,

You have a different takeaway from Kelly than I do…

“you could still dig yourself a really deep hole if you place a lot on a (relatively) sure thing and turn out to be wrong. Kelly/Bernoulli would tell you to do just that.”

Well then, it wasn’t a sure thing, now was it? If you truly had a sure thing, you would bet the farm on it. But you never do. What the Kelly system does is set a maximum. You should never bet more than X, where X is a function of your confidence in the outcome and the payoff for being right. And since we know that we are always overly certain, we should never wager more than 1/2 X.

The Kelly criterion sets an upper bound, and not a minimum.
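That upper bound is the standard Kelly fraction, f* = p − (1 − p)/b, with p the judged win probability and b the net odds (profit per unit staked). A minimal sketch, including the half-Kelly discount for over-certainty mentioned above:

```python
def kelly_fraction(p, b):
    """Maximum fraction of bankroll to wager under the Kelly criterion.
    p: probability of winning; b: net odds (profit per unit staked).
    Clamped at 0: a negative Kelly fraction means don't bet at all."""
    return max(0.0, p - (1.0 - p) / b)

# Even-money bet (b = 1) that you judge 60% likely to win:
full_kelly = kelly_fraction(0.6, 1.0)   # -> bet at most 20% of bankroll
half_kelly = 0.5 * full_kelly           # -> 10%, hedging your over-certainty
```

Note the formula uses your probability as an input and still only yields a cap, not a command: it tells you the most you should stake, never that you must.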