Confirmation Bias Happens To The Other Guy

Yet another entry in an endless series of experiments showing that people often see what they want to see. The twist in this new one is that people who aren’t too good at math stay not too good, and that people who are good sometimes forget to be good.

Paper is “Motivated Numeracy and Enlightened Self-Government” by Dan M. Kahan and some others, available at the Social Science Research Network. Kahan and pals use theory piled atop theory to explain their motivation and results, but we can skip over all of that and focus on their experiment. (The authors also created a wonderfully intricate regression model which I deemed best to draw a veil over.)

Authors found about a thousand folks who looked more or less like a sample from the US would, and then asked them to solve story problems. This provided a “numeracy” score from one to nine for each individual. Everybody was also classified as on the side of the angels (Conservative Republican) or the Other (Liberal Democrat) based on self-report.

Then people saw these:

Figure 1

The first setup told folks that an experiment had been run in which some people used a new skin cream and some didn’t, and here were the results, shown either as the top-left 2×2 table (A) or the top-right 2×2 table (B). People were asked to consider these results and then say whether outcomes were better with the skin cream or without it, under (A) or (B).

Regardless of (A) or (B), only about half the people could figure it out. But three-quarters (not 100%) of those with “numeracy” scores seven or better nailed it. Any professor who has ever administered a Statistics 101 exam will be familiar with results like this. Nothing exciting so far.
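The trap in tables like these is comparing raw counts instead of rates. A minimal sketch of the comparison a correct answer requires (the counts below are made up for illustration, not the paper’s actual figures):

```python
def better_outcome(treat_improved, treat_worsened, ctrl_improved, ctrl_worsened):
    """Compare improvement *rates* between groups of unequal size.

    Returns 'treatment' if the treated group improved at a higher rate,
    else 'control'. Raw counts alone cannot answer the question.
    """
    treat_rate = treat_improved / (treat_improved + treat_worsened)
    ctrl_rate = ctrl_improved / (ctrl_improved + ctrl_worsened)
    return "treatment" if treat_rate > ctrl_rate else "control"

# A large raw count of improvers can still mean a lower rate:
# treatment: 200/300 improve (~67%); control: 80/100 improve (80%).
print(better_outcome(200, 100, 80, 20))  # prints "control"
```

The point of the setup is exactly this: the group with more improvers in absolute terms can be the worse performer once you divide by group size, which is the step about half the subjects skipped.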

The next experiment showed people the same tables, except now supposedly derived from data in cities which did or did not ban carrying handguns. Same idea as before, with two different outcomes (C) or (D). Similar question, too: was the ban of handguns associated with an increase or decrease in crime?

Again, only about half could figure it out. The small “Aha” came from splitting conservatives and liberals. The numerate (story-problem scores 7-9) lefties did well on the quiz when crime decreased, but fared just as poorly as innumerates when crime increased. And, as you saw coming, the numerate righties did well on the quiz when crime increased, but fared just as poorly as innumerates when it decreased.

Interpretation? Given the authors did not present raw numbers but only (unnecessarily and overly) massaged curves and my analysis is derived from those, it’s safe to say people roughly saw what they expected to see even when they had the facility to work out the right answer. And when they didn’t (have the facility) they didn’t use what they didn’t have.

Like I said, no surprise. But it still bears repeating that confirmation bias is prevalent and that even you can suffer from it. Yes, even you, even me, and even Chris Mooney who wrote about this paper at Mother Jones under the headline Science Confirms: Politics Wrecks Your Ability to Do Math.

Science (who could not be reached for comment) confirms no such thing. Although it is possible politics wrecks people’s mathematical ability, far more likely is that people, especially smart people, skip the math when they think they already know the answer. Or that they see what they want to see, as is common.

Chris Mooney, for your edification, wrote the books The Republican War On Science and The Republican Brain: The Science of Why They Deny Science. He also wrote the articles “Diagnosing the Republican Brain”, “The Republican Brain on the Republican Brain”, and “Why the GOP distrusts science”.

One suspects a theme.

—————————————————————————–

Thanks to Roger McDermott for finding this paper.

16 Comments

  1. Tom Galli

    I am astounded that hundreds of past “studies” have confirmed the existence of human bias. Yet, we need more to confirm what we already know….

  2. Scotian

    Charlie Martin at PJ Media has covered this as well.

    http://pjmedia.com/lifestyle/2013/09/26/why-are-science-and-politics-so-hard/?singlepage=true

    I like his interpretation, which is that the people who get it wrong are not necessarily wrong. When you mix politics with math, or even science with math, badly, there is a tendency to forget that the reader has other sources of information in addition to the desired statistical result in front of them. Therefore it may be rational to assume that the data under consideration is wrong. It is sort of like the badly worded multiple-choice question on an exam where the student is faced with the choice: do I give the answer which is expected or the correct answer? This is why Lewis Carroll liked to use nonsense syllogisms, since they separated out prior knowledge to focus on the matter at hand. I notice that the venerable Briggs has also, on occasion, used this technique.

    In fact I, myself, in my latest assignment in statistical mechanics, am using a similar technique to test whether my students really believe in the atomic theory of matter – believe deep down in the inner deeps of their souls. Here I wish to thank Briggs for his philosophic approach to statistics. It has revived my spirits and I have brought new vigor to a course that had begun to grow stale over the decades.

  3. Sheri

    I get hit with these studies all the time on my climate change blog–it’s always the skeptics who suffer from confirmation bias, not those who believe the science. Thanks for the interesting write-up.

  4. Luis

    I am extremely biased for the notion that humans are extremely biased.

    Then again, if we are all “extremely biased”, then isn’t that the norm, and we should otherwise call it “normally biased”? By definition it’s obviously true. And now I’ve run out of lapalice shenanigans to say!

  5. MattS

    Another dispatch from the office of Captain Obvious.

    Briggs,

    Keep pointing these out; they can be quite funny. What interests me is how much taxpayer money this bunch wasted on re-inventing the wheel.

  6. Bob Koss

    A few weeks ago I pointed out on Kahan’s blog a flaw in the skin cream experiment serious enough in my mind to junk the paper. I imagine the paper will still be cited by many. I didn’t bother with the rest of the paper. My comment follows.

    Looking at the skin cream experiment I’m curious about your claim that the correct answers are not the same. Frankly, if you showed me those two tables I would have to answer the cream causes the rash to increase for both tables (A, B).

    The information available is not only the numbers in the tables, but also other information such as the starting groups are of equal size. That is implicitly stated when telling the subjects “Because patients do not always complete studies, the total number of patients in each of the two groups is not exactly the same…”.

    The subjects are also told one group of patients did not receive the cream. No mention is made of fooling them with a placebo in hopes they might come back even if cured. Just no treatment at all. No incentive to go back if it cleared up by itself.

    Of those who returned in two weeks there are 298 who received the cream and only 128 who did not. A discrepancy of 170 patients. Before answering whether the cream works I am going to have to consider why so many of the untreated didn’t return. In table (A) where 21 improved and 107 got worse I would expect the majority of the other 170 probably got better with no treatment and simply didn’t go back. Why go back if you no longer need treatment? This would seriously increase the percentage who got better with no cream, making it a higher percentage than those who received the cream. So my answer would be cream increases rash. Similar thinking would apply to table (B) and would affect the percentage, but would still result in cream increases rash.

    I really don’t see how having subjects score a study of patients with free will, who may avoid providing their data, can be related to having them score a study of cities, which always provide their data.

  7. anona

    I knew what this post was going to be about even before I read it 😉 Call me biased!

  8. Timotheos

    Actually, the headline was 100% correct, it just wasn’t talking about the test subjects. 😉

  9. Don Jackson

    My “take away” point from this paper doesn’t conflict with yours much, Briggs. (I skipped the “statistical” analysis — considered asking you; then figured, someone else would. :)) But, for a considerable number of people I usually disagree with, it matters.
    Or, at least, it should!
    While this “study” is only “suggestive”, it presents what should be obvious to everyone: Motivated reasoning is equivalent to motivated unreasoning. Expertise, in terms of education and/or intelligence, doesn’t obviate bias and, in fact, may exacerbate it…

    I’d not be so dismissive of Kahan.

  10. JH

    …far more likely is that people, especially smart people, skip the math when they think they already know the answer.

    Really? Are you speaking from your experience?

  11. Briggs

    Don,

    I don’t think we’re far apart at all: “Although it is possible politics wrecks people’s mathematical ability, far more likely is that people, especially smart people, skip the math when they think they already know the answer.” All I lack is the theory (and the unnecessarily complicated statistics).

  12. Joseph

    This is an example of a common line of psychological research:

    1) Researchers give experimental subjects made-up data.
    2) The subjects recognize the data as fabricated and ignore it.
    3) The researchers cite this as evidence of irrationality.

    It looks like cognitive scientists have defined rationality to mean “agrees with anything you are told.”

  13. Sheri

    Cognitive scientists in some areas do seem to define rationality as “agrees with anything you are told”–especially politics and climate change. This is actually true in much of life right now–doctors, teachers, virtually all “experts” expect people to assume that having a degree or a position of power makes them automatically right. Argument from “appropriate” authority– a clever reworking of a logical fallacy to aid them in their efforts. As Joseph noted, failure to go along with the scheme gets one’s behaviour labelled as irrational, the second step in attempting to force your agreement.
