Guardian’s ‘How Statistics Lost Their Power’

Been a wealth of material lately, which explains why we’re just now coming to the Guardian’s “How statistics lost their power — and why we should fear what comes next”. Now by statistics the author means the old-fashioned, and really to be preferred, definition of “measurements taken on observables”, and not models and math, or not models and math per se.

In theory, statistics should help settle arguments. They ought to provide stable reference points that everyone — no matter what their politics — can agree on. Yet in recent years, divergent levels of trust in statistics has become one of the key schisms that have opened up in western liberal democracies…

Rather than diffusing controversy and polarisation, it seems as if statistics are actually stoking them. Antipathy to statistics has become one of the hallmarks of the populist right, with statisticians and economists chief among the various “experts” that were ostensibly rejected by voters in 2016. Not only are statistics viewed by many as untrustworthy, there appears to be something almost insulting or arrogant about them. Reducing social and economic issues to numerical aggregates and averages seems to violate some people’s sense of political decency.

Not trust the ever-burgeoning bureaucracies’ official numbers? Heaven forfend! Why, bureaucrats, NGO flacks, and politicians are entirely disinterested actors, who only want what is best for one and all. Yes? They would never consider cooking the books so that things came out in favor of requiring more of their services, would they? No, sir!

After all, liars figure but figures don’t lie. Yes? To gauge unemployment, all we have to do is count those without jobs, right? And the number of folks needing a government service? Easy too, with no chance of bias. Yes?

Well, that’s counting. There is also modeling.
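
To make the counting-versus-modeling point concrete, here is a minimal sketch in Python, with invented numbers, of how the headline unemployment rate shifts depending on who the counter decides belongs in the labor force:

    # Hypothetical illustration with invented numbers: the "unemployment rate"
    # depends on who is deemed part of the labor force.
    population = {
        "employed": 150_000,
        "unemployed_searching": 8_000,   # jobless and actively looking
        "discouraged": 4_000,            # jobless, no longer looking
        "not_in_labor_force": 90_000,    # retired, students, homemakers, etc.
    }

    # Narrow definition: only active searchers count as unemployed.
    labor_force_narrow = population["employed"] + population["unemployed_searching"]
    rate_narrow = population["unemployed_searching"] / labor_force_narrow

    # Broader definition: discouraged workers count, too.
    jobless_broad = population["unemployed_searching"] + population["discouraged"]
    labor_force_broad = population["employed"] + jobless_broad
    rate_broad = jobless_broad / labor_force_broad

    print(f"Narrow rate: {rate_narrow:.1%}")   # about 5.1%
    print(f"Broad rate:  {rate_broad:.1%}")    # about 7.4%

Same people, two different official-looking numbers; the choice of definition is where the modeling sneaks in.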

The declining authority of statistics — and the experts who analyse them — is at the heart of the crisis that has become known as “post-truth” politics. And in this uncertain new world, attitudes towards quantitative expertise have become increasingly divided. From one perspective, grounding politics in statistics is elitist, undemocratic and oblivious to people’s emotional investments in their community and nation.

The Guardian, of course, is of the Left, and of the old-guard Left, a group well used to victories, having had them with only rare interruptions for the last century. Until recently. One of the explanations the Left has given itself about why it is losing is that the “other side” has abandoned “truth”. Which it has, if you define “truth” as that which accords with progressive ideology.

In Germany, for example (from where we get the term Statistik) the challenge was to map disparate customs, institutions and laws across an empire of hundreds of micro-states. What characterised this knowledge as statistical was its holistic nature: it aimed to produce a picture of the nation as a whole. Statistics would do for populations what cartography did for territory.

This is still, and still should be, the goal of official statistics. Dry facts, which are almost always accompanied by their uncertainties. God bless the statisticians who provide this wealth! Yet…

The emergence in the late 17th century of government advisers claiming scientific authority, rather than political or military acumen, represents the origins of the “expert” culture now so reviled by populists.

A concern of the author is preserving democracy, which, as I often say, is populism by definition. It’s the losing side that throws the term “populist” out as one of derision. Bad statistics have nothing to do with populism. The reason the Guardian’s enemies dislike traditional experts is because (a) they are far too often wrong, and (b) they confuse what is measured with what ought to be. For example, governments, activists, and bureaucrats have promised us we would have plunged into another ice age by now, or something like it, with bodies stacked by the side of the road like cordwood. It didn’t happen. Not only that, all the experts’ “solutions” to fix this non-problem were nuts.

And then came global warming…but that is a story for another day. Our author then admits:

Not every aspect of a given population can be captured by statistics. There is always an implicit choice in what is included and what is excluded, and this choice can become a political issue in its own right…In France, it has been illegal to collect census data on ethnicity since 1978, on the basis that such data could be used for racist political purposes.

Somebody ought to suggest the latter move here, at top levels and on campuses and in workplaces. But, nah. Proscribe asking about race, and euphemistic statistics would quickly take its place.

The potential of statistics to reveal the state of the nation was seized in post-revolutionary France. The Jacobin state set about imposing a whole new framework of national measurement and national data collection.

The potential of statistics wasn’t the only thing seized by the Jacobins.

During the 1920s, statisticians developed methods for identifying a representative sample of survey respondents, so as to glean the attitudes of the public as a whole. This breakthrough, which was first seized upon by market researchers, soon led to the birth of opinion polling.

How well did that turn out? Hate to mention it, but doesn’t this smack, just a little, of populism? I’m just asking.
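
For what it is worth, the representative-sample breakthrough in the quote reduces, in the textbook case, to one small formula. A sketch, assuming a simple random sample (an assumption real polls rarely meet), of the usual 95% margin of error for a sample proportion:

    import math

    def margin_of_error(p_hat: float, n: int, z: float = 1.96) -> float:
        """Textbook 95% margin of error for a sample proportion, assuming
        a simple random sample rather than a real-world poll."""
        return z * math.sqrt(p_hat * (1 - p_hat) / n)

    # A hypothetical poll of 1,000 respondents finding 52% support:
    p_hat, n = 0.52, 1_000
    print(f"{p_hat:.0%} +/- {margin_of_error(p_hat, n):.1%}")   # 52% +/- 3.1%

That plus-or-minus covers sampling error only; it says nothing about non-response, question wording, or respondents declining to answer honestly, which is presumably what “How well did that turn out?” is pointing at.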

We can agree with this:

Yet in recent decades, the world has changed dramatically, thanks to the cultural politics that emerged in the 1960s and the reshaping of the global economy that began soon after. It is not clear that the statisticians have always kept pace with these changes. Traditional forms of statistical classification and definition are coming under strain from more fluid identities, attitudes and economic pathways. Efforts to represent demographic, social and economic changes in terms of simple, well-recognised indicators are losing legitimacy.

Is this not admitting experts are falling more often into error? I’m just asking.

The article goes on—and on—and on—and on—even coming to the expected criticism of Steve Bannon and Donald Trump. But it’s long on wind and short on solutions.

Which are? The more open the sources, the better, and the more numbers are given with predictive uncertainties, the better, too.
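
As for what “given with predictive uncertainties” might look like in practice, here is a minimal sketch with made-up measurements: report an interval for the next observation, not just the point estimate.

    import math
    import statistics

    observed = [4.1, 3.8, 5.2, 4.6, 4.9, 4.4, 5.0, 4.2]   # made-up measurements

    mean = statistics.mean(observed)
    sd = statistics.stdev(observed)
    n = len(observed)

    # Crude normal-approximation 95% interval for a single new observation.
    # It is wider than an interval for the mean, because it must cover the
    # spread of individual values, not just the wobble in the estimated mean.
    # (With n this small, a t-based interval would be wider still.)
    half_width = 1.96 * sd * math.sqrt(1 + 1 / n)

    print(f"Point estimate: {mean:.2f}")
    print(f"95% predictive interval: ({mean - half_width:.2f}, {mean + half_width:.2f})")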

2 Comments

  1. Sheri

    We survey for unemployment instead of just using the number of claims filed per week in each state. There are a few errors in those—people who filed and were denied, people who forgot to report they now have a job and wages, and the like (I worked in the Unemployment Overpayment department of my state at one time), but by and large, this is an accurate number of how many people are seeking work. Now, for the Labor Participation rate, that’s useful if you only want to know how many people are not working—I don’t know how accurate it is. In reality, finding out how many people are retired, how many don’t work because they are independently wealthy, how many are under 18 and not students (students count separately), how many are stay-at-home moms or dads, etc. is virtually impossible, short of last year’s income tax statistics, and even then, if someone did work and made too little money to pay taxes, they may not file. No wonder people don’t trust statistics as presented today in politics and research.

    Personally, I think the use of “average” is what destroyed statistics. Using one number to describe millions of people’s height, the temperature of the planet, the amount of rain expected to fall based on past averages, any extremely varied population—any rational, sane person would scream NO and never listen to another word.

    More information provided with the statistics would help, but you have to teach people math first. Americans are lousy at math and fall for anything sounding authoritatively mathematical.
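
Sheri’s point about averages is easy to illustrate. A toy example, with invented temperature series, of two places sharing the same mean while looking nothing alike:

    import statistics

    coastal     = [14.9, 15.0, 15.1, 15.0, 15.0, 14.9, 15.1]   # invented
    continental = [-5.0,  2.0, 15.0, 25.0, 33.0, 20.0, 15.0]   # invented

    for name, temps in (("coastal", coastal), ("continental", continental)):
        print(f"{name:12s} mean={statistics.mean(temps):5.1f}  "
              f"range=({min(temps)}, {max(temps)})  "
              f"stdev={statistics.stdev(temps):.1f}")
    # Both means come out to 15.0, but one series swings across nearly 40 degrees.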
