William M. Briggs

Statistician to the Stars!


Explanation Vs Prediction

The IPCC, hard at work on another forecast.

Introduction

There isn’t as much space between explanation and prediction as you’d think; both are had from the same elements of the problem at hand.

Here’s how it all works. I’ll illustrate a statistical (or probability) model, though there really is no such thing; which is to say, there is no difference in meaning or interpretation between a probability and a physical or other kind of mathematical model. There is a practical difference: probability models express uncertainty natively, while (oftentimes) physical models do not mention it, though it is there, lurking below the equations.

Let’s use regression, because it is ubiquitous and easy. But remember, everything said goes for all other models, probability or physical. Plus, I’m discussing how things should work, not how they’re actually done (which is very often badly; not your models, Dear Reader: of course, not yours).

We start by wanting to quantify the uncertainty in some observable y, and believe we have collected some “variables” x which are probative of y. Suppose y is (some operationally defined) global average temperature. The x may be anything we like: CO2 levels, population size, solar insolation, grant dollars awarded, whatever. The choice is entirely up to us.

Now regression, like any model, has a certain form. It says the central parameter of the normal distribution representing uncertainty in y is a linear function of the x (y and x may be plural, i.e. vectors). This model structure is almost never deduced (in the strict sense of the word) but is assumed as a premise. This is not necessarily a bad thing. All models have a list of premises which describe the structure of the model. Indeed, that is what being a model means.
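
To make that structure concrete, here is a minimal sketch (my invented variable names, assuming the usual normal-linear form; nothing here is the only way to write it):

```python
from scipy import stats

# A sketch of the model-form premise (made-up names, purely illustrative):
# the central parameter of the normal representing uncertainty in y is a
# straight-line function of the x's; b0, b1, b2 and sigma are the unknowns
# the observed data will inform.
def regression_premise(co2, population, b0, b1, b2, sigma):
    mu = b0 + b1 * co2 + b2 * population    # linear central parameter
    return stats.norm(loc=mu, scale=sigma)  # uncertainty in y, given these premises
```

Note that nothing causal is asserted by this premise; it only says how the uncertainty in y is represented given values of the x.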

Another set of premises are the data we observe. Premises? Yes, sir: premises. The x we pick and then observe take the form of propositions, e.g. “The CO2 observed at time 1 was c1”, “The CO2 observed at time 2 was c2,” etc.

Observed data are premises because it is we who pick them. Data are not Heaven sent. They are chosen and characterized by us. Yes, the amount of—let us call it—cherishing that takes place over data is astonishing. Skip it. Data are premises, no different in character than other assumptions.

Explanation

Here is what explanation is (read: should be). Given the model-building premises (which here specify a regression) and the observed data (both y and x), we specify some proposition of interest about y and then specify propositions about the (already observed) x. Explanation is how much the probability of the proposition about y (call it Y) changes.

That’s too telegraphic, so here’s an example. Pick a level for each of the observed x: “The CO2 observed is c1”, “The population is p”, “The grant dollars is g”, etc. Then compute the probability Y is true given this x and given the model and other observed data premises.

Step two: pick another level for each of the x. This may be exactly the same everywhere, except for just one component, say, “The CO2 observed is c2”. Recompute the probability of Y, given the new x and other premises.

Step three: compare how much the probability of Y (given the stated premises) changed. If not at all, then, given the other values of x and the model and data premises, CO2 has little, and maybe even nothing, to do with y.
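
To make those three steps concrete, here is a minimal numerical sketch. The data are invented, and the fit is a plug-in least-squares fit, which is only a crude stand-in for the fully predictive calculation the post has in mind (it ignores the parameter uncertainty a full treatment would carry along); the proposition Y = “the anomaly exceeds 1” is likewise invented:

```python
import numpy as np
from scipy import stats

# Observed-data premises (invented numbers, purely illustrative)
co2 = np.array([350.0, 360.0, 370.0, 380.0, 390.0])
pop = np.array([5.0, 5.6, 5.9, 6.5, 7.2])   # "population", in made-up units
y   = np.array([0.2, 0.3, 0.5, 0.6, 0.8])   # "temperature anomaly"
X   = np.column_stack([np.ones_like(co2), co2, pop])

# Fit the regression premise with plain least squares (plug-in point estimates)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
sigma = np.std(y - X @ beta, ddof=X.shape[1])

def pr_Y(x_row, threshold=1.0):
    """Pr(Y | x, data, model), where Y = 'the anomaly exceeds threshold'."""
    mu = x_row @ beta
    return 1 - stats.norm.cdf(threshold, loc=mu, scale=sigma)

# Steps one and two: identical x everywhere except the CO2 component
p1 = pr_Y(np.array([1.0, 400.0, 7.0]))   # "The CO2 observed is 400"
p2 = pr_Y(np.array([1.0, 500.0, 7.0]))   # "The CO2 observed is 500"

# Step three: how much did the probability of Y change?
print(p1, p2, p2 - p1)
```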

Of course, there are other values of the other x that might be important, in conjunction with CO2 and y, so we can’t dismiss CO2 yet. We have a lot of hard work to do to step through how all the other x, and how this x (CO2), change this proposition (Y) about y. And then there are other propositions about y that might be of more interest. CO2 might be important for them. Who knows?

Hey, how much change in the probability of any Y is “enough”? I have no idea. It depends. It depends on what you want to use the model for, what decisions you want to make with it, what costs await incorrect decisions, what rewards await correct ones, all of which might be unquantifiable. There is and should be NO preset level which says “Probability changes by at least p are ‘important’ explanations.” Lord forbid it.

A word about causality: none. There is no causality in a regression model. It is a model of how changing CO2 changes our UNCERTAINTY in various propositions about y, and NOT in changes in y itself.¹

Explanation is brutal hard labor.

Prediction

Here is what prediction is (should be). Same as explanation. Except we wait to see whether Y is true or false. The (conditional) prediction gave us its probability, and we can compare this probability to the eventual truth or falsity of Y to see how good the model is (using proper scores).

Details. We have the previously observed y and x, and the model premises. We condition on these and then suppose new x (call them w) and ask what is the probability of new propositions about y (call them Z). Notationally, Pr(Z | w, y, x, M), where M are the model-form premises. These probabilities are compared against the eventual truth or falsity of Z.

“Close” predictions means good models. “Distant” ones mean bad models. There are formal ways of defining these terms, of course. But what we’d hate is if any measure of distance became standard. The best scores to use are those tied intimately with the decisions made with the models.
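
For concreteness only, and not as a recommended standard, here is a sketch of two common proper scores, the Brier score and the logarithmic score; either rewards probabilities that end up close to what actually happened:

```python
import numpy as np

def brier_score(probs, outcomes):
    """Mean squared gap between predicted Pr(Z) and whether Z turned out true (1) or false (0)."""
    probs, outcomes = np.asarray(probs), np.asarray(outcomes)
    return np.mean((probs - outcomes) ** 2)

def log_score(probs, outcomes):
    """Negative log probability assigned to what actually happened; lower is better for both scores."""
    probs, outcomes = np.asarray(probs), np.asarray(outcomes)
    return -np.mean(np.log(np.where(outcomes == 1, probs, 1 - probs)))
```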

And there is also the idea of skill. The simplest regression is a “null x”, i.e. no x. All that remains is the premises which say the uncertainty in y is represented by some normal distribution (where the central parameter is not a function of anything). Now if your expert model, loaded with x, cannot beat this naive or null model, your model has no skill. Skill is thus a relative measure.

For time series models, e.g. GCMs, one natural “null” model is the null regression, which is also called “climate” (akin to long-term averages, but taking into account the full uncertainty of these averages). Another is “persistence”, which is the causal-like model y_{t+1} = y_t + fuzz. Again, sophisticated models which cannot “beat” persistence have no skill and should not be used. Like GCMs.
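
A sketch of skill as a relative measure, using whichever proper score you have settled on and persistence as the reference (the function names are mine; nothing standard is implied):

```python
import numpy as np

def skill_score(model_score, reference_score):
    """Positive means the model beats the reference; zero or below means no skill."""
    return 1 - model_score / reference_score

# Persistence reference for a time series: tomorrow's forecast is today's value,
# i.e. y_{t+1} = y_t (the "fuzz" is whatever spread you put around it).
def persistence_forecast(y):
    y = np.asarray(y)
    return y[:-1]   # forecasts for y[1:], compare these against the later observations
```

If the skill score comes out at or below zero, the expert model cannot beat the naive reference and should not be preferred to it.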

More…

This is only a sketch. Books have been written on these subjects. I’ve compressed them all into 1,100 words.

———————————————————————————-

¹A simple causal model: y = x. It says y will be the value of x, that x makes y what it is. But even these models, though written mathematically like causality, are not treated that way. Fuzz is added to them mentally, so that if x = 7 and y = 9, the model won’t be abandoned.

Philosophic Issues in Cosmology V: What Measurements Tell Us—Guest Post by Bob Kurland

Distance Ladder from Skynet University (UNC)

Bob Kurland is a retired, cranky, old physicist, and convert to Catholicism. He shows that there is no contradiction between what science tells us about the world and our Catholic faith.

Read Part IV.

When you can measure what you are speaking about, and express it in numbers, you know something about it; when you cannot express it in numbers, your knowledge is of a meager and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely in your thoughts advanced to the stage of science.—Lord Kelvin.

The following types of data are primary: positions and luminosities of stars and galaxies (including x-ray, UV, visible, IR, microwave and radio-frequency radiation); wavelengths of spectral lines from these objects; Doppler shifts of such wavelengths (shifts in the wavelength that depend on the velocity of the object emitting the radiation); frequencies, intensities and polarizations of the microwave cosmic background radiation (CBR).

It’s important to realize that there is a “ladder” of inferences of secondary data from these primary data. For example, the distances of nearby stars (10-100 light years or so distant from us) can be estimated relatively accurately by parallax measurements. From the intensity of light observed, one can then estimate accurately the intrinsic brightness of these stars. One can then use other properties, at known distances, to set up what are called “standard candles”: properties that relate to the intrinsic brightness, so that the intrinsic brightness can be inferred and, from the observed intensity, a distance.
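
As a toy illustration of that last step (assuming only the simple inverse-square relation between intrinsic luminosity L, observed flux F, and distance d, and ignoring extinction, redshift corrections and the rest):

```python
import math

def distance_from_standard_candle(luminosity_watts, observed_flux_w_per_m2):
    """Invert F = L / (4 * pi * d**2) to get d; a toy calculation, not a working pipeline."""
    return math.sqrt(luminosity_watts / (4 * math.pi * observed_flux_w_per_m2))
```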

Mass Density and Curvature of Space

Various standard candles are used at various distances, ranging from Cepheid variables to supernovae and the gravitational lensing of quasars. One of the first standard candles was the intrinsic brightness of the Cepheid variables. Hubble used these to estimate the distance of stellar objects and to construct his plot of red shift versus distance, which was the basis for the expanding universe theory. Since that time more accurate measures have given a very good linear relation between red shift (velocity moving away from us) and distance from us.

One can also count the number of objects within the field of view and from this count make an estimate of the total number of objects to be seen, and thus infer the total (baryonic, i.e. ordinary) mass. From this astronomical data one can infer the ratio of the actual matter density to a critical value; this ratio is designated Omega_0 (with an uppercase Greek letter). If Omega_0 > 1, space-time is positively curved (like a sphere) and the universe expansion will eventually turn into a collapse, a “big crunch”; if Omega_0 = 1, space-time is flat and the universe will expand in a uniform way; if Omega_0 < 1, space-time is negatively curved (as in a saddle surface) and the universe will expand indefinitely.
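
For a rough idea of the critical value in question, here is a sketch using the standard formula rho_c = 3 H0^2 / (8 pi G) and an assumed Hubble constant of about 70 km/s/Mpc (the precise value does not matter for the illustration):

```python
import math

G = 6.674e-11              # gravitational constant, m^3 kg^-1 s^-2
H0 = 70 * 1000 / 3.086e22  # ~70 km/s/Mpc converted to 1/s

rho_critical = 3 * H0**2 / (8 * math.pi * G)  # ~9e-27 kg/m^3, a few hydrogen atoms per cubic metre
# Omega_0 = (inferred mean density) / rho_critical; > 1 closed, = 1 flat, < 1 open
```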

Dark Energy

Observations of red shifts from distant supernovae and from temperature anisotropies in the cosmic background radiation suggest that there is a “dark energy”, a pressure (as in the “lambda” constant in Einstein’s original formulation) that makes the expansion of the universe accelerate. (What this is saying is the expansion rate is slower for older, more distant objects, faster for more recent, closer objects, so there is an acceleration of the rate.)

Evidence for an Expanding Universe

The following observations, in addition to the red shift, confirm the picture of a universe expanding from a hot big bang: the cosmic background radiation, the relative abundance of hydrogen to helium in the universe (about 3/1) and the lack of heavy elements in far distant galaxies. The cosmic background radiation is like the embers of a burnt-out fire, the embers of the hot “Big Bang” spread evenly throughout the universe. The small irregularities in the cosmic background radiation indicate the fluctuations that grew into stars and then galaxies. The relative abundance of hydrogen to helium is consistent with models of element formation that took place at an early, high temperature stage of the universe. Far distant galaxies (say, 10 billion light years away) are seen at an early stage of development (remember, looking out in distance is also looking back in time), and therefore heavy elements have not yet formed in them by the collapse of red giant stars.

Ellis lists (among others) the following common misconceptions about the expanding universe:

  • Misconception 1: The universe is expanding into something. It is not, as it is all there is. It is just getting bigger, while always remaining all that is.
  • Misconception 2: The universe expands from a specific point, which is the centre of the expansion. All spatial points are equivalent in these universes, and the universe expands equally about all of them. Every observer sees exactly the same thing in an exact RW (Robertson-Walker) geometry. There is no centre to an FL (Friedmann-Lemaître) universe.
  • Misconception 3: Matter cannot recede from us faster than light. It can, at an instant; two distantly separated fundamental observers in a surface {t = const} can have a relative velocity greater than c if their spatial separation is large enough. No violation of special relativity is implied, as this is not a local velocity difference, and no information is transferred between distant galaxies moving apart at these speeds. For example, there is presently a sphere around us of matter receding from us at the speed of light (a quick calculation of its radius is sketched after this list); matter beyond this sphere is moving away from us at a speed greater than the speed of light. The matter that emitted the CBR was moving away from us at a speed of about 61c when it did so.
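
As a back-of-the-envelope illustration of that sphere (promised under Misconception 3), using the linear relation v = H0 * d and the same assumed Hubble constant of roughly 70 km/s/Mpc:

```python
import math

c = 2.998e8                 # speed of light, m/s
H0 = 70 * 1000 / 3.086e22   # Hubble constant in 1/s
hubble_radius_m = c / H0    # distance at which the recession speed v = H0 * d equals c
hubble_radius_ly = hubble_radius_m / 9.461e15
print(round(hubble_radius_ly / 1e9, 1), "billion light years")  # roughly 14
```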

The next in this series will deal with the Anthropic Principle.

Summary Against Modern Thought: God Has No Passive Potentiality

This may be proved in three ways. The first…

See the first post in this series for an explanation and guide of our tour of Summa Contra Gentiles. All posts are under the category SAMT.

Previous post.

We started with in-depth proofs that there must exist, for anything to change, an Unchanging Changer, an Unmoved Mover. We call this “entity” God. Why? To merely label this Primary Force (to speak physically) “God” felt like cheating. Why does a physical force have to be called God? Isn’t that topping it high? Those objections arise because we don’t yet know the logical implications of the foregoing proof, which are what insist this force is God. So far, we know the force must be eternal, i.e. outside of time. Today, we see that it must be without potentiality. Still not enough to come to God, as He is usually understood—but we have many chapters to go! Today’s proofs are so succinct and clear they need little annotation.

Chapter 16: That in God there is no passive potentiality

1 NOW if God is eternal, it follows of necessity that He is not in potentiality.i

2 For everything in whose substance there is an admixture of potentiality, is possibly non-existent as regards whatever it has of potentiality, for that which may possibly be may possibly not be. Now God in Himself cannot not be, since He is eternal. Therefore in God there is no potentiality to be.ii

3 Again. Although that which is sometimes potential and sometimes actual, is in point of time potential before being actual, nevertheless actuality is simply before potentiality: because potentiality does not bring itself into actuality, but needs to be brought into actuality by something actual. Therefore whatever is in any way potential has something previous to it. Now God is the first being and the first cause, as stated above.[1] Therefore in Him there is no admixture of potentiality.iii

4 Again. That which of itself must necessarily be, can nowise be possibly, since what of itself must be necessarily, has no cause, whereas whatever can be possibly, has a cause, as proved above.[2]iv Now God, in Himself, must necessarily be. Therefore nowise can He be possibly. Therefore no potentiality is to be found in His essence.

5 Again. Everything acts according as it is actual. Wherefore that which is not wholly actual acts, not by its whole self, but by part of itself. Now that which does not act by its whole self is not the first agent, since it acts by participation of something and not by its essence. Therefore the first agent, which is God, has no admixture of potentiality, but is pure act.v

6 Moreover. Just as it is natural that a thing should act in so far as it is actual, so is it natural for it to be passive in so far as it is in potentiality, for movement is the act of that which is in potentiality.[3] Now God is altogether impassible and immovable, as stated above.[4] Therefore in Him there is no potentiality, namely that which is passive.

7 Further. We notice in the world something that passes from potentiality to actuality. Now it does not reduce itself from potentiality to actuality, because that which is potential is not yet, wherefore neither can it act. Therefore it must be preceded by something else whereby it can be brought from potentiality to actuality. And if this again passes from potentiality to actuality, it must be preceded by something else, whereby it can be brought from potentiality to actuality. But we cannot go on thus to infinity. Therefore we must come to something that is wholly actual and nowise potential. And this we call God.vi

————————————————————————–

iRecall that to be in potentiality means possessing the capability of change, but as was proved over the course of many weeks, God does not change. He is the Unmoved Mover.

iiThis metaphysical truth is a hammer. Note very carefully that we move from this truth to God. We do not start with belief. Speaking very loosely, God is a theorem. And I only mention this to counter the frequent, and really quite ridiculous, charge that our knowledge of God is entirely “made up” (of beliefs).

iiiAin’t that a lovely point? Remember: it is not the potential of you being in Cleveland that actually moves you there. Something actual must do that. Actualities fulfill potentialities.

ivLinger over this one, dear reader. What is necessary must be.

vThis follows from God being unchanging.

viI adore these kinds of proofs. Once you understand what an infinite regression truly implies, understanding dawns brightly. The “base” of all must be actual and not in potential. Must be. St Thomas calls this “base” God. We still haven’t felt why he does this, but we’re getting closer.

Next installment.

[1] Ch. xiii.
[2] Ch. xv.
[3] 3 Phys. i. 6.
[4] Ch. xiii.

Gibbon (And O’Brian) On Too Many Lawyers


This is Gibbon, quoted in Patrick O’Brian’s The Reverse of the Medal by the character Dr Stephen Maturin, who then speaks:

‘”It is dangerous to entrust the conduct of nations to men who have learned from their profession to consider reason as the instrument of dispute, and to interpret the laws according to the dictates of private interest; and the mischief has been felt, even in countries where the practice of the bar may deserve to be considered as a liberal occupation.”

‘He thought—and he was a very intelligent man, of prodigious reading—that the fall of the Empire was caused at least in part by the prevalence of lawyers. Men who are accustomed over a long series of years to supposing that whatever can somehow be squared with the law is right—or if not right then allowable—are not useful members of society; and when they reach positions of power in the state they are noxious. They are people for whom ethics can be summed up by the collected statutes.’

Gibbon would have agreed that “lawyers” include regulators and modern-day bureaucrats (many of whom are trained lawyers). The Authoritarian (these days read: progressive, leftist) believes that the law and morality are one, an ancient and diseased fallacy as ineradicable and as harmful as rats. This is why she seeks to enlarge the law to encompass all manner of activity, and of thought. Her well-known slogan is “Whatever is not mandatory is forbidden!”

Help me. What group is it that constantly, loudly, nervously, and boorishly insists, at every opportunity, on its collective rationality and reason?

Skip it. Nobody needs another lesson on the left’s zeal for shackling, but what is less known is how progressive policy drives excesses on the right. Men who understand that the law is everything, and who know no other morals, will push that law to its extreme. This causes a natural reaction and encourages a greater tightening of the bonds. The process is iterative and ends only when the knots become so burdensome that life is strangled.

The law does not forbid a man from maximizing short-term profit by firing large swaths of employees who only yesterday he called “family.” It is natural to pity the dispossessed and to despise the (family) man, but the inclination to force the State (with the help of lawyers) to punish the man causes more harm than good.

The man is punished, but he feels aggrieved more than shamed, and thus seeks (with the help of lawyers) to further test the limits of the law, which causes more excess. And so on.

Through it all the State is seen as Arbiter, the Supreme Entity. This belief is encouraged by both sides. But people forget the State is made of people, especially those people who falsely believe in the equality of law and morality. Like lawyers.

Solution? A fundamental change in how we view the world. How do we bring it about? Don’t know. Blog posts? Your ideas?

Missing Global Warming Close To A Solution?

I can’t tell you where I obtained a photocopy of the note below. I can tell you that I am in Washington DC (at the lovely and recommended Morrison Clark Inn).

Look for this news to bust wide open once the “New Weathermen” are identified. A source (I cannot be more specific without betraying a confidence) hinted that the offspring of certain individuals want to carry on the “family business.”

My guess is the IPCC will pay. The ransom is peanuts next to the monies government expects to generate with Global Warming regulations. Your thoughts?

The ransom note.
