William M. Briggs

Statistician to the Stars!

Category: Statistics

The general theory, methods, and philosophy of the Science of Guessing What Is.

Uncertainty’s A Hit!

It hasn’t been a year, but Springer issued its annual report, and, in my eyes anyway, Uncertainty is a hit!

From 30 June 2016 through the first two weeks of April 2017, 821 copies were sold. And more were downloaded—but I don’t know how many. Springer has a licensing deal with many major universities and institutions, which provides on-demand ebooks for members. Uncertainty is one of these books. I have had reports from a good handful of people that they have downloaded it, but that’s all I know. (I don’t get separate royalties from downloaded copies.)

Do not forget Uncertainty’s main page where posts related to the book are indexed (like with most things, I am running behind on this).

I’ve had conversations with many about various points in the book. I haven’t yet found anywhere I changed my mind. But I’ve seen many places where I could have added more explanation, spent more time editing and writing, and I have been told I need to add discussions of several crucial matters. Sample size, for instance (relate it to real, not statistical, control). Factor analysis (don’t do it). And so on. Keep the questions coming and I’ll get to them—eventually.

Review

Uncertainty’s been reviewed at The Philosopher by Thomas Scarborough. Bad news first:

I have one demurral to make. In places, the style seems unnecessarily to get in the way of the content. In particular, outbursts such as ‘Die, p-value, die, die, die!’ or ‘p-values, God rot them!’, while they are certainly memorable, do not seem to serve the book well as the serious academic work that it is.

I agree. The “die die die” was a joke whose origin is now too obscure (think back to Usenet days and you’ll have it). So I wish it were gone. There are an appalling number of typos, too, which is surprising not because of me (I’m famous for them), but because the book was edited by Springer. I will correct these for any second edition.

Now for better bits.

[Uncertainty], writes the author, is not answered by grasping for equations, let alone models. It requires ‘slow, maturing thought’. It is more a matter of philosophy than of mathematics. Yet people shun the effort. Instead, they grasp at pre-packaged probability theory, which is far too easily applied without further thought. In fact, the author sketches a situation of crisis proportions. There is altogether too much that we get wrong…

The publisher describes this work as a textbook. It begins with what one might call a componential analysis of probability. It carefully examines such concepts as truth, induction, chance — and many besides. Then it applies these observations to the field of modelling. While the mathematics are complicated, this is compensated for by the author’s gift of explanation.

The book really brightens up when one reaches worked examples of what can and does go wrong, and how probability calculations for the self same situations may easily turn out to be quite different. The examples are generalised, too, so as to be meaningful beyond specific contexts. Some particularly illuminating sections of the book include a series of graphs and equations in which the quantification of GPAs, the probabilities of developing cancer, or how one might validate homophobia, are discussed…

All in all, if the author is right, then our world has strayed down a path which is dangerously simplistic — and this tendency towards simplistic thinking has much to do with how we think about uncertainty. One might go so far as to say: that we have misapplied, and continue to misapply, theory which has to do with things of critical importance, including the very future of humanity.

This Scarborough fellow is on to something. Better buy the book to discover what.

I’ll leave the last word to commenter Keith:

William Briggs’s book sounds intriguing, and an important read; there’s so much to the story of probability and chaos. Most people are familiar with the frequently recounted story involving the mathematician-cum-meteorologist Edward Lorenz, who used computer models to predict weather — and in the process serendipitously contributed to the development of chaos theory, and how scientists look at such exquisitely nonlinear systems as the weather. Back in the early Sixties, he decided to rerun one of his weather simulations. However, not thinking it mattered, Lorenz decided to begin the simulation in the middle, using numbers (for the ‘initial conditions’) from the first run. Much to his astonishment, the new virtual weather pattern, which he expected to follow the first run of the model, dramatically deviated from it. What he subsequently realized is that whereas the computer had stored in its memory the first run’s results to six decimal places, the printout, from which he reentered the numbers, had truncated the numbers to just three decimal places, to save space. As for predictions, modeling, nonlinearity, probability, initial conditions, uncertainty, controls, chaos, and outcomes, the rest is history. I look forward to reading this book.

Brain Damage Increases Religious Fundamentalism—Or Scientific Hubris?

Stream: Brain Damage Increases Religious Fundamentalism—Or Scientific Hubris?

You didn’t hear it coming. You didn’t even feel it. Yet there you were on Hamburger Hill, 12 May 1969, praying you’d come through the battle, when a piece of shrapnel dug into your skull.

It’s still there today. Doctors couldn’t, didn’t dare, take it out. Maybe it doesn’t hurt; the doctors said it shouldn’t. But you swear you can feel it in there.

Suppose this permanently wounded Vietnam veteran was you, dear reader. Now I ask you the obvious questions: How does this make you feel? Would this injury—just perhaps—incline you to deepen your religious faith?

If you answered that question—no matter how you answered it—you’re one up on the scientific researchers Wanting Zhong, Irene Cristofori, and three others who studied the religious commitment of Vietnam vets with brain injuries. These scientists thought brain injuries caused vets to become more religious, not because of the introspection that harrowing, life-threatening experiences like that imbue, but because the scientists thought the injuries themselves caused the vets’ brains to, in effect, misfire and induce these unfortunate men to become more fundamentalist in their religious beliefs.

Don’t scoff. This was peer-reviewed research in the journal Neuropsychologia, published in the article “Biological and cognitive underpinnings of religious fundamentalism”.

What’s this about religion? The authors say “Religious beliefs are socially transmitted mental representations that may include supernatural or supernormal episodes that are assumed to be real.” That they might even be real did not enter the authors’ minds as a possibility. Never mind. The real object is religious fundamentalism, which they say “embodies adherence to a set of firm religious beliefs advocating unassailable truths about human existence”. Unassailable truths like the scientific method?

“Fundamentalism requires a departure from ordinary empirical inquiry: it reflects a rigid cognitive strategy that fixes beliefs and amplifies within-group commitment and out-group bias”. If that’s not bad enough, “Recent studies have linked religious fundamentalism to violence [and] denial of scientific progress”.

To these authors, that the brain is responsible for religious fundamentalism is a given. “Evolutionary psychology explains the appeal of religious fundamentalism in terms of social functional behavior”, they say. Yet the “neurological systems that enable such inflexible, non-disastrous beliefs [such as fundamentalism] remain poorly understood.” So they studied it.

But if evolution made the brain cause religious belief, did evolution cause the authors’ brains to believe religion can be explained by the brain? What part of the brain is responsible for bad science?

It is an old observation, but a good one, that if the brain is causing our thoughts, then it cannot be trusted, because what guarantee is there that if it misleads us in one area it is not misleading us in another? There is none. If the brain is causing spurious religious beliefs, it could also cause spurious science beliefs. And there is no way to tell the difference.


If you’re not brain damaged, go there to read the rest.

Since this critique appeared at Stream and not here, I went light on the details. If there is interest, and if I have time, I can expand criticisms here. There’s not much need, though, since we have seen this kind of paper come and go hundreds of times.

The Weakening Of College Credentials

Picture from The Audacious Epigone

The picture above is the result of a 10-question quiz given to (purported) college graduates from 1974 until last year, taken from the site The Audacious Epigone. That gentleman’s description and analysis of the plot should be read by all.

The GSS vocabulary test may be the one discussed at the Inductivist. It is a series of 10 questions, each multiple choice, in which one has to know the definitions of these words:

Level 1: Edible (96.2%), Broaden (96.1%).

Level 2: Space (84.4%), Pact (84%), Accustom (82.1%), Animosity (77.9%).

Level 3: Cloistered (38.6%), Caprice (35.3%).

Level 4: Emanate (28.7%), Allusion (25.7%).

This was taken from the comments of somebody calling himself Jason Malloy. The percentages are the average correct responses, perhaps over the entire public and not just the graduates.

Now I take it as a given that every regular reader of this site would score 100% on this quiz, and would not be taxed unduly in the effort. Indeed, I take it that knowledge of these words (and the words used in the multiple choices) would be the bare minimum, and really below that minimum, for a college graduate. If you can graduate “college” and not know what allusion means, the degree awarded has little value.

There is a fuzziness to the plot not shown, which must be accounted for when we want to apply the results to the public as a whole. But we’ll take it rough and ready, and not read too much into it.

When we’re asked by some social service worker, or whomever, “Here, take this vocabulary quiz”, the average, and indeed the more intelligent, person might become bored and rush through. Mistakes not related to knowledge happen. Which is why the Epigone plotted nine or ten correct.

Four decades ago, 12% of the population had degrees. Today, 33% does. If, in the early seventies, that 12% roughly corresponded with the top 12% of the IQ distribution, then the 6% of the population that aced the Wordsum test would comprise 1 in 2 of those grads. If today that 33% roughly corresponds with the top 33% of the IQ distribution, then the 6% of the population acing the Wordsum test would be a bit more than 1 in 6 of today’s grads.
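The back-of-envelope arithmetic above can be checked directly. A minimal sketch, assuming (as in the text) that everyone who aces the Wordsum test holds a degree; the helper name is mine:

```python
# Rough check of the graduates-vs-Wordsum arithmetic, using the
# figures quoted above: 12% of the population held degrees in the
# early seventies, 33% today, and 6% of the population ace the test.

def acers_per_grad(grad_share, acer_share=0.06):
    """Fraction of degree holders who ace the Wordsum test,
    assuming all acers hold degrees (top of the IQ distribution)."""
    return acer_share / grad_share

early_1970s = acers_per_grad(0.12)  # 0.06 / 0.12 = 0.50, i.e. 1 in 2
today = acers_per_grad(0.33)        # 0.06 / 0.33 ~ 0.18, a bit more than 1 in 6

print(f"Early 1970s: {early_1970s:.2f} of grads ace it")
print(f"Today:       {today:.2f} of grads ace it")
```

The dilution is mechanical: hold the acing share of the population fixed and grow the degreed share, and the acers’ share of graduates must fall.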

More or less, plus or minus. What’s more concrete is the increase in the public who have “degrees”. The Census Bureau estimates that in 1940 about 5% had a Bachelor’s degree or higher, rising to about 25% in 2009. High school was completed by about 25% (by age 25) in 1940, jumping to somewhere north of 80% in 2009.

In 1940, one had to come from a non-poor family or be of high intelligence to graduate college. And we have all seen the tests from the days of yore and know that the material graduates were expected to know then was of much higher difficulty than today. Here is a test from 1912 for eighth-graders, containing a spelling test much more taxing than the GSS WORDSUM.

It is obvious that the material from 1912, even suitably updated for our current year (“Which president was impeached, and on what charge?”), would be too much for many enrolled college students today. The score for college graduates on the WORDSUM should be 100%, or very close to it, recalling the ambiguity in identifying college degree holders and the impatience with taking a boring test and so on. But in 1974 it was only 50%. And today it is about 15%.

Obviously, 1974 is after the turmoil of the ’60s, when the inflation in education set in hard. We can’t look to dates before this because of the GSS’s inception, but I can’t see anybody seriously arguing that something well north of (say) 95% of college students graduating in 1940 would not have aced the test. There would have been a tailing off after World War II, as we might expect, because of the GI Bill, but the sharpest descent began around 1968.

Just think:
the “first accredited women’s studies course was held in 1969 at Cornell” (Wiki). It was straight downhill after that.

The conclusion is obvious: the majority of college degrees are of little value in judging a person’s intellectual capacity. Inflation has set in with a vengeance. Given the push by our leaders for more to enter college—to get a “degree” and not to gain an education—the value of the degree will continue to decline.

In the limit, and if our masters have their way, everybody by age 25 (or whatever) will possess a Bachelor’s degree. At that point, the degree is of no value in discriminating intellectual ability. How could it? Everybody, by definition, has one.

Why I’m Unconvinced By Penrose’s Entropy/Anthropic Argument

There are a number of “constants” used in physics, such as the speed of light, the Planck constant, elementary charge and so forth. Some of these constants are bare, meaning they do not have a dependency on other constants, and some are derived from other constants, like vacuum permittivity (which is an exact function of the speed of light and the vacuum permeability, the latter being dependent on the definition of the ampere).
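The vacuum-permittivity example can be made concrete. A minimal sketch, using the pre-2019 SI convention in which the speed of light is exact by definition and the vacuum permeability is fixed exactly by the definition of the ampere:

```python
import math

# Pre-2019 SI definitions: both values below are exact by convention.
c = 299_792_458            # speed of light, m/s
mu0 = 4 * math.pi * 1e-7   # vacuum permeability, N/A^2 (via the ampere)

# Vacuum permittivity is a derived constant, not a bare one:
# it follows exactly from c and mu0.
eps0 = 1 / (mu0 * c**2)    # F/m

print(f"epsilon_0 = {eps0:.6e} F/m")  # approximately 8.854e-12
```

Nothing here is measured; given the two definitional values, the third falls out by formula, which is exactly the bare-versus-derived distinction.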

Now these constants appear in certain formulas, and these formulas are derived by arguments, the lists of premises of which are very long and contain both observation and (ultimately) metaphysical premises. For instance, all use math, which is not observational. The constants “fall out” of these formulas, and are estimated via experiment. Their values are not deduced directly as, say, the value of π is in mathematics (there are many formulas for calculating the value of π, all based on argument).
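The contrast with π can also be made concrete. One of the many formulas alluded to is Machin’s, which deduces π from argument alone, with no experiment anywhere in sight; a minimal sketch:

```python
import math

# Machin's 1706 formula: pi = 16*arctan(1/5) - 4*arctan(1/239).
# The value is deduced from the formula itself; no measurement is involved,
# unlike the estimation of physical constants by experiment.
pi_machin = 16 * math.atan(1 / 5) - 4 * math.atan(1 / 239)

print(pi_machin)  # agrees with math.pi to machine precision
```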

If in any of these physics formulas a constant’s value can be derived, it is no longer really a constant, but an assumed-true (given the prior argument) value.

Though experiment can assist in estimating constants, conditional on the arguments which imply the constants’ existence, it follows that nobody knows why these constants take the values they do (nobody knows why π takes the value it does, either, though we can compute its value). Since the constants are not deduced, it could be that they are not real, in the sense that they are not really part of the universe. It may be that they estimate or summarize groups of effects; because the formulas which imply them might be incomplete, the constants may be only parameterizations, in the same sense as parameters in probability models, or encapsulations of more fundamental processes as yet unknown. Or it could be that they are real, in which case their values might be deduced. But it is only “might”, because it does not follow that we will ever know the right and true premises which lead to their deduction.

If the constants are only parameterizations, then arguments based on “choosing” constants, as in Penrose’s anthropic-like entropy argument, rely on false premises.

But, like most physicists, I think the constants are real: they are the Way Things Are. And that means they were caused to be the way they are. They were made to take the values they did. The question then becomes why these values and not others.

It turns out, physicists like Penrose say, that, assuming the arguments in which the constants appear are true, if the values of these constants were to vary in only a minuscule way, the universe would look far different than it does now, even to the extent that life like us could not possibly exist. These non-life-universe arguments appear sound, remembering they are all conditional on assumed physical theory.

It turns out that only an exceedingly narrow range of the standard constants allows a universe anything like this one. Using an entropy argument well summarized in Robert Spitzer’s New Proofs for the Existence of God (pp 52-59; we’ll be going through this whole book), Penrose shows the creation of the constants had to have the “accuracy of one part in 10^(10^123)”, which is mighty precise! (In case the notation is unclear, that’s 10 to the 10 to the 123rd power.)

This argument is not only premised on assumed physical theory, which is uncontroversial, but it also assumes there was a choice possible in the value of the constants.

There is no way to know or prove this choice existed, even for God. It could be, do not forget, that one or more of the values of the constants might be deducible; we just do not yet know how. We might some day discover how. In that case, we will have proven this constant had to have this particular value, with no choice about it. Penrose’s number would be reduced by some amount for each new deduction. If we could deduce all the constants, then it would appear the universe was inevitable, under Penrose’s interpretation.

But this is all doing it the hard way. That the universe exists at all, rather than nothing, is sufficient proof of the existence of God.

One possible line of escape, used by some and also summarized by Spitzer, is to assume all the “universes” which had the “allowable” values of constants really do or did exist; thus, they say, solving the choice problem. Or quantum mechanical arguments imply the constants are chosen “randomly”.

It should be obvious these are fallacies. The same unproven premise is there, that the constants could be different than they are, that a choice was possible. If a choice was possible, there had to be a chooser, or some simpler, more basic mechanism that led to the particular values.

But then what accounts for this constant chooser? It could be God directly, or still more fundamental physical processes. If the latter, these had to come from somewhere: they could not have come from nothing.

Every path taken by these arguments leads to the same origin, which is Ultimate Chooser, the real true and sole reason the way things are are The Way Things Are. The nature of nature has to have an explanation, and that explanation can never, not ever, be nothing. The explanation has to be something outside nature, and the only candidate for that is God.

Incidentally, I do not agree with any probabilistic argument used to prove God’s existence, e.g. this one from Swinburne.


© 2017 William M. Briggs
