For readers in a hurry, here’s the answer: There is no probability of a nuclear war. Nor of a conventional one. Nor even of you being victor in your next King Of The Hill Battle.

Thanks to Ken Fitch for pointing us to an article with the same name, by an author who disagrees with me and says there *is* a probability of a nuclear war.

The author is Alex Tabarrok, who provides a table from another author who compiled various sources, each providing its own probability of a nuclear war. These are “annualized”, meaning you can use the numbers and say “There is an X% chance of a nuclear war this year.” And the same next year.

If there was a probability of a nuclear war. Which there isn’t. Not *unconditionally*.

The compiler meant a nuclear war with Russia, and so that is the proposition to which her experts put their minds. Will there be a nuclear war? I take it to mean the use in anger of a nuclear weapon of any kind. The answer is yes or no. The answer is yes if the war is caused to happen by some entity, and no if it is caused not to happen, for any number of reasons.

If we knew the cause, then we’d know the answer. God knows what will happen. Could anybody besides God know? Well, it’s not logically impossible. There might be somebody out there who has thought through every possible contingency (note that word!), and judged correctly every action every actor would take, up to and including whether the button is pushed or not, for every day between now and the crack of doom. There are such things as prophets.

Barring direct revelation, we only have our wits to go on. And our wits have fits. The best we can do is gather all those bits of information which we believe are probative of the question and think through all possibilities relative to this.

Which everybody already knows. It’s the same thing people who bet on sports, horses, or stocks do.

Now everybody also knows even the best sports bettors don’t bat a thousand, to sportily mix a metaphor. Surely who will win tonight’s game is much easier to predict than whether the bombs will start flying. This means the obvious: (1) even experts don’t know exactly what to look for, and (2) the causes of a win or a war are too many to grasp.

Same thing is true even for rolls of dice! The causes aren’t hard to know in principle; it’s just some bouncing around using the equations of motion. If you knew, precisely knew, the initial conditions, and the causes, then you’d know exactly what would happen on the toss.
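The dice point can be sketched in a toy model. Here is a minimal Python sketch, assuming a made-up “doubling map” dynamic in place of real bounce physics (the map and all constants are illustrative only): the outcome is a fixed function of the initial condition, yet so sensitive to it that ignorance of the throw looks like randomness.

```python
def die_face(x0, bounces=40):
    """Toy deterministic 'die': each bounce doubles the state (mod 1),
    so the final face is fully determined by the initial condition x0,
    yet exquisitely sensitive to it."""
    x = x0
    for _ in range(bounces):
        x = (2.0 * x) % 1.0
    return int(x * 6) + 1  # map the final state to a face, 1..6

# Identical initial conditions always give the identical face...
assert die_face(0.123456789) == die_face(0.123456789)

# ...but a billionth of a difference in the throw can change the outcome,
# which is why the roll *looks* random when we can't measure the throw.
print(die_face(0.123456789), die_face(0.123456790))
```

Nothing in the function is random; the “chance” is entirely in our ignorance of `x0`.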

If you knew, precisely knew, what was on Vlad’s mind, and will be on Vlad’s mind, and the minds of all his generals, and the minds of all our generals, and the minds of all our goading neocon let’s-have-another-war armchair generals, and the minds of all peoples everywhere who will influence these other minds, and all conditions of the world that will influence the minds of those peoples everywhere, and you knew the calculus of free will, why, then you’d know whether or not there’d be a nuclear war. Simple!

Barring that level of precision, you can instead gather what little you do or can know. You’ll be left with a set of propositions you believe are probative to “Nuclear war is launched”. Best thing you could do is then say, “Given what I know, the chance of a war is slim”, and leave it at that. Without quantifying “slim.” Why?

Quantifying “slim” adds to the list of propositions you already hold about “Nuclear war is launched”. Some of those propositions will be wrong, some right. Saying you know how to get to an exact number means saying you know the exact mathematical way the evidential propositions you’re using relate to “Nuclear war is launched”. Well, you might be right about that, but you’re probably wrong. It’s just too complex, and there’s no way to verify your suppositions.

Still, the question of war *is* important. Decisions must be made regarding it. Decisions require knowing the uncertainty of the question. This is where the temptation to quantify the probability becomes, to many, overwhelming. Some of the decisions will be monetary, and money is a number; therefore, the probability must be made into a number, too, so that some fancy equations will work.

That leads to over-certainty. You’re cramming the problem into a quantified model because you want numbers. That means ignoring all the other non-quantified bits of the problem. The equations become bloated in esteem.

There will be some who say you need quantified probabilities to make decisions, which is false. We make decisions all the time without quantifying anything. The outcome of a mistake about the war is not quantifiable, whether in saying there will be a war when there won’t be, or the opposite. It guarantees over-certainty to put artificial numbers on outcomes, just so some formal “decision analysis” model, fed with artificial probabilities, will work.



Probability of nuclear war: “It’s just too complex, and there’s no way to verify your suppositions.”

What is ‘nuclear war’: “I take it to mean the use in anger of a nuclear weapon of any kind.”

By that definition “nuclear war” HAS occurred (WWII, Japan); the probability was 100 percent. Exactly.

The question has been posed as if the event MIGHT happen, which is wrong given that it has happened. Thus, considering a human event that has occurred once, the question becomes, ‘Might it happen again?’

The familiar cliche that history repeats, or rhymes – based on established records of different batches of humans doing again pretty much what their ancestors did – strongly suggests that, given humans are involved, the probability of ANOTHER nuclear war (per the definition given) is 100 percent.

The only uncertainty is when, where and why.

If one is inclined to think the answer is zero, or less than 100 percent, part of the rationale should address why a thing humans have done once is a thing humans will never ever do again (especially given how easy that thing is to do – easier than wholesale genocide, for example, which has many instances on record).

I think Spock could calculate those odds!

https://youtu.be/SQEh9gm2xcs

An accidental release of a nuclear missile towards the United States is a possibility, without a return to Jesus by our country….only God knows how far His mercy extends.

God bless, C-Marie

Actually, wars do follow a mathematical pattern, but it’s a power-law distribution, so familiar concepts like the mean and variance do not necessarily apply (for heavy enough tails they do not even exist).

Read this, and be very afraid:

https://blog.jim.com/culture/taleb-refutes-pinker-on-war/

https://en.wikipedia.org/wiki/Power_law

The term “homogeneous Poisson process” means that recent history does not predict the likelihood of wars in the near future. Combatants might be exhausted from their recent engagement, or they might be yearning for revenge, and these two effects seem to cancel out.

If Taleb’s model is correct, you are more likely to die in a thermonuclear extinction event than to be murdered. That is, “N times the expected number of wars that kill at least N people” increases with N, from N=1 (a single murder) to N = the entire population of the Earth.

Well, religion and the whole global competition over whose god has a bigger dick could be the catalyst for nuclear war. I’ve never believed in God or religion. Science answers all the big questions as far as I am concerned. The idea of a supreme being that creates a world, only to let it go to shit because not enough people “believe in him” makes no sense at all.

Doesn’t the very concept of free will preclude the possibility of a calculus of free will?

Bedarz,

Exactly.

Another dogmatic frequentist who insists that probabilities are long-run frequencies, ONLY long-run frequencies, and can never be anything else. You need to read about Bayesian statistics. Epistemic probabilities — which measure what degree of confidence in a proposition is warranted by the information you have available — are an entirely valid concept. They obey the usual rules of probability.

In fact, there are various theoretical results showing that using probabilities to quantify uncertainty is the only logically coherent approach to reasoning in the face of uncertainty. Here is a paper that shows that if you try to extend classical propositional logic to deal with degrees of uncertainty, and want to retain certain properties that classical propositional logic already has, you unavoidably end up using probabilities to measure uncertainty and the laws of probability as your logic. There are two things to note about the theorem:

* It does NOT assume a numerical measure of uncertainty; rather, it proves that you unavoidably end up with one.

* It gives you an actual formula so that, if you have propositions A1, …, An as your known information, you can calculate the probability of any other proposition.

I omitted the link to the paper in my previous comment. Here it is:

From propositional logic to plausible reasoning: A uniqueness theorem.

https://www.sciencedirect.com/science/article/abs/pii/S0888613X16302249

https://arxiv.org/abs/1706.05261
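The flavor of that “actual formula” can be shown in miniature. Here is a Python sketch of epistemic probability by brute enumeration, assuming a finite set of possible worlds weighted equally (the uniform weighting is my simplifying assumption for illustration; the paper’s theorem is far more general):

```python
from itertools import product

def prob(hypothesis, evidence, atoms):
    """P(hypothesis | evidence) = fraction of evidence-consistent
    'possible worlds' (truth assignments to the atoms) in which the
    hypothesis holds, under uniform weighting."""
    worlds = [dict(zip(atoms, vals))
              for vals in product([True, False], repeat=len(atoms))]
    consistent = [w for w in worlds if evidence(w)]
    favorable = [w for w in consistent if hypothesis(w)]
    return len(favorable) / len(consistent)

# Example: three atomic propositions p, q, r.
# Known information: "p or q" and "not (p and r)".
atoms = ["p", "q", "r"]
evidence = lambda w: (w["p"] or w["q"]) and not (w["p"] and w["r"])
print(prob(lambda w: w["p"], evidence, atoms))  # -> 0.5
print(prob(lambda w: w["q"], evidence, atoms))  # -> 0.75
```

The probabilities fall out of the logic plus the evidence; no long-run frequency is anywhere in sight.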

There is a high probability that America may Nuke itself or the Southern Border if the Democrats keep insisting on bringing so many people here!

Bedarz, Perfect!!!

Kroplinski, The choice was either to create robotic creatures with no will of their own and thus no ability to choose and make choices…… so why make them at all…..or create creatures with free will who could and do choose….which choices make the troubles in the world……do you really prefer science as science is only discovering what already is…..science can only work with what is to make different mixtures….which comes to why create at all…..God did so because He is Love….so all of this brings us to His Son’s crucifixion in our place…..and His Resurrection…..and with our faith placed in Christ….to Eternal life….forever…..

God bless, C-Marie

Kevin,

And while you’re at it, check out my Books, Probability Articles & Free Class Page.

Should I be worried about a nuclear war between the USA and Russia?

You should worry about nuclear war no matter who it is. India and Pakistan: a conflict between those two could precipitate a nuclear war. North Korea is another example. Yes, we should all be worried about that.

The biggest question is why are we going down this path? I keep asking myself why such enthusiasm for nuclear war? Can anyone answer me that, do you have an idea of who it might be? Is it an entity, or aliens, the 666 or is it some sort of genetic defect? I mean, they do say that between male and female, men are the superior of the species, at least that’s what they have been telling themselves. If Men are so damn Superior, why is the world so f’d up?

Should you worry about nuclear war? YES. But don’t worry too much about it. There’s no way that you can stop it; it’s an inevitability. Perhaps it’s genetically programmed into male DNA, the hunter-gatherer instinct.

Yes, men are intelligent tool users, but the species in general has to evolve to a higher consciousness.

Don’t Worry Be Happy.

—-

Another dogmatic frequentist who insists that probabilities are long-run frequencies, ONLY long-run frequencies, and can never be anything else.

—-

I don’t believe the outcomes of a roulette wheel depend on who is spinning, so I’m going to have to disagree. That frequentist is not being dogmatic, but just properly defining the area of study (generally repeating events, limiting behavior, irrelevance of place selection). That frequentist would probably say that “uncertainty” or “chance” used to describe other things is fine to use and study, just to reserve the word “probability” for the frequentist meaning. I’d also add that the long run can inform us about the current short-run state, meaning that frequentists can speak about the short term.

—-

You need to read about Bayesian statistics. Epistemic probabilities — which measure what degree of confidence in a proposition is warranted by the information you have available — are an entirely valid concept. They obey the usual rules of probability.

—-

Just obeying rules isn’t enough IMO; it has to be more closely tied to the real world (empirical) to be useful. For example, I can fit the Empire State Building into the diagonal of an n-dimensional unit hypercube, given large enough n. Well, just using math anyway. I cannot do that in reality because a) I cannot move the Empire State Building, b) I don’t know if n dimensions exist, and c) hypercubes are mathematical constructs.
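The hypercube claim is easy arithmetic to check: a unit n-cube’s main diagonal has length sqrt(n), so it is long enough once n exceeds the building’s height squared. A quick Python check (the 443.2 m antenna-tip height is an approximate figure I am assuming):

```python
import math

HEIGHT_M = 443.2   # Empire State Building to antenna tip (approx.)

# The main diagonal of an n-dimensional unit cube has length sqrt(n),
# so we need the smallest integer n with sqrt(n) >= HEIGHT_M.
n = math.ceil(HEIGHT_M ** 2)
print(n, math.sqrt(n) >= HEIGHT_M)
```

So roughly 200,000 one-metre dimensions suffice on paper, which is exactly the commenter’s point: the math works even where the reality cannot.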

One can see a difference between flipping a coin 10,000 times and noticing the limiting behavior of the relative frequency of Heads, versus believing the probability is .5 because the coin has two sides, while someone else believes it is .2 because they are certain the coin is a magician’s trick coin, and someone else believes it is .9 because they trust the testimony of 5 people who told them it is .9. The fact that the beliefs of .5, .2, and .9 (all based on the best information available, mind you), and any prior belief, will get ‘washed out’ by the actual data (“likelihood swamps the prior”) tells us that data is ultimately more important than beliefs in determining “probability”. For large n this is true; for small n none of us can be certain of much, no matter what definition we prefer.
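The “likelihood swamps the prior” point is simple to demonstrate with conjugate Beta-Bernoulli updating. A Python sketch (the specific Beta(a, b) priors are my rough encodings of the .5, .2 and .9 beliefs):

```python
import random

random.seed(1)

# Simulate 10,000 flips of a coin with true heads rate 0.5, then
# update three very different Beta priors; the data swamps them all.
flips = [random.random() < 0.5 for _ in range(10_000)]
heads = sum(flips)

# Beta(a, b) priors roughly encoding prior beliefs of .5, .2 and .9.
priors = {".5 prior": (5, 5), ".2 prior": (2, 8), ".9 prior": (9, 1)}
for name, (a, b) in priors.items():
    # Beta-Bernoulli conjugacy: posterior mean = (a + heads) / (a + b + n)
    posterior_mean = (a + heads) / (a + b + len(flips))
    print(f"{name}: posterior mean = {posterior_mean:.3f}")
```

All three posterior means land within a whisker of .5, however confident the .2 and .9 believers started out.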

It is kind of like these ‘proofs of God’ relying on Bayes’ rule. It becomes like a ‘Drake Equation’ where you get whatever output you’d like (usually one supporting your beliefs) based on your prior. Bayes’ rule is sound of course, but the outcome might not be when using subjective inputs.

Justin