FAIR USE NOTICE

A BEAR MARKET ECONOMICS BLOG

OCCUPY MADNESS AND DYSFUNCTION

This site may contain copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available in an effort to advance understanding of environmental, political, human rights, economic, democracy, scientific, and social justice issues, etc. We believe this constitutes a ‘fair use’ of any such copyrighted material as provided for in section 107 of the US Copyright Law.

In accordance with Title 17 U.S.C. Section 107, the material on this site is distributed without profit to those who have expressed a prior interest in receiving the included information for research and educational purposes. For more information go to: http://www.law.cornell.edu/uscode/17/107.shtml

If you wish to use copyrighted material from this site for purposes of your own that go beyond ‘fair use’, you must obtain permission from the copyright owner.

All Blogs licensed under Creative Commons Attribution 3.0

Tuesday, September 17, 2013

Science confirms: Politics wrecks your ability to do math

 

grist | A BEACON IN THE SMOG

[Image: “math is hard” (Shutterstock)]
Everybody knows that our political views can sometimes get in the way of thinking clearly. But perhaps we don’t realize how bad the problem actually is. According to a new psychology paper, our political passions can even undermine our very basic reasoning skills. More specifically, the study finds that people who are otherwise very good at math may totally flunk a problem that they would probably be able to solve, simply because giving the right answer goes against their political beliefs.
The study, by Yale law professor Dan Kahan and his colleagues, has an ingenious design. At the outset, 1,111 study participants were asked about their political views and also asked a series of questions designed to gauge their “numeracy,” that is, their mathematical reasoning ability. Participants were then asked to solve a fairly difficult problem that involved interpreting the results of a (fake) scientific study. But here was the trick: While the fake study data that they were supposed to assess remained the same, sometimes the study was described as measuring the effectiveness of a “new cream for treating skin rashes.” But in other cases, the study was described as involving the effectiveness of “a law banning private citizens from carrying concealed handguns in public.”

The result? Survey respondents performed wildly differently on what was in essence the same basic problem, simply depending upon whether they had been told that it involved guns or whether they had been told that it involved a new skin cream. What’s more, it turns out that highly numerate liberals and conservatives were even more, not less, susceptible to letting politics skew their reasoning than were those with less mathematical ability.

But we’re getting a little ahead of ourselves — to fully grasp the Enlightenment-destroying nature of these results, we first need to explore the tricky problem that the study presented in a little bit more detail.

Let’s start with the “skin cream” version of this brain twister. You can peruse the image below to see exactly what research subjects read (and try out your own skill at solving it), or skip on for a brief explanation:

[Image: Full text of one version of the study’s “skin cream” problem. Source: Dan Kahan]
As you can see above, the survey respondents were presented with a fictional study purporting to assess the effectiveness of a new skin cream, and informed at the outset that “new treatments often work but sometimes make rashes worse” and that “even when treatments don’t work, skin rashes sometimes get better and sometimes get worse on their own.” They were then presented with a table of experimental results, and asked whether the data showed that the new skin cream “is likely to make the skin condition better or worse.”

So do the data suggest that the skin cream works? The correct answer in the scenario above is actually that patients who used the skin cream were “more likely to get worse than those who didn’t.” That’s because the ratio of those who saw their rash improve to those whose rash got worse is roughly 3 to 1 in the “skin cream” group, but roughly 5 to 1 in the control group — which means that if you want your rash to get better, you are better off not using the skin cream at all. (For half of study subjects asked to solve the skin cream problem, the data were reversed and presented in such a way that they did actually suggest that the skin cream works.)

This is no easy problem for most people to solve: Across all conditions of the study, 59 percent of respondents got the answer wrong. That is, in significant part, because trying to intuit the right answer by quickly comparing two numbers will lead you astray; you have to take the time to compute the ratios.
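
To make the ratio comparison concrete, here is a minimal Python sketch of the arithmetic, using illustrative cell counts consistent with the roughly 3-to-1 and 5-to-1 ratios described above (these particular numbers are an assumption for demonstration, not a reproduction of the exact table shown to participants):

    # Illustrative 2x2 table for the "skin cream" version of the problem.
    # These counts are assumed; they match the ~3:1 and ~5:1 ratios described
    # in the article, not necessarily the study's exact figures.
    treated_better, treated_worse = 223, 75    # patients who used the cream
    control_better, control_worse = 107, 21    # patients who did not

    # The tempting shortcut: compare raw "got better" counts across groups.
    # The treated group has more improvers (223 > 107), which feels like success.
    naive_says_cream_works = treated_better > control_better   # True, but misleading

    # The correct approach: compare the better-to-worse ratio within each group.
    treated_ratio = treated_better / treated_worse   # ~2.97, roughly 3 to 1
    control_ratio = control_better / control_worse   # ~5.10, roughly 5 to 1

    cream_helps = treated_ratio > control_ratio      # False: cream group fares worse
    print(f"Treated ratio: {treated_ratio:.2f}, control ratio: {control_ratio:.2f}")
    print("Cream appears to help" if cream_helps else "Cream appears to make things worse")

Comparing the raw counts gives the quick, intuitive answer; only computing the within-group ratios gives the correct one.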

Not surprisingly, Kahan’s study found that the more numerate you are, the more likely you are to get the answer to this “skin cream” problem right. Moreover, it found no substantial difference between highly numerate Democrats and highly numerate Republicans in this regard. The better members of both political groups were at math, the better they were at solving the skin cream problem.

But now take the same basic study design and data, and simply label it differently. Rather than reading about a skin cream study, half of Kahan’s research subjects were asked to determine the effectiveness of laws “banning private citizens from carrying concealed handguns in public.” Accordingly, these respondents were presented not with data about rashes and whether they got better or worse, but rather with data about cities that had or hadn’t passed concealed carry bans, and whether crime in these cities had or had not decreased.

Overall, then, study respondents were presented with one of four possible scenarios, depicted below with the correct answer in bold:

[Image: The four problem scenarios from the study (each respondent received just one of these). Source: Dan Kahan]
So how did people fare on the handgun version of the problem? They performed quite differently than on the skin cream version, and strong political patterns emerged in the results — especially among people who are good at mathematical reasoning. Most strikingly, highly numerate liberal Democrats did almost perfectly when the right answer was that the concealed weapons ban does indeed work to decrease crime (version C of the experiment) — an outcome that favors their pro-gun-control predilections. But they did much worse when the correct answer was that crime increases in cities that enact the ban (version D of the experiment).

The opposite was true for highly numerate conservative Republicans: They did just great when the right answer was that the ban didn’t work (version D), but poorly when the right answer was that it did (version C).

Here are the results overall, comparing subjects’ performances on the “skin cream” versions of the problem (above) and the “gun ban” versions of the problem (below), and relating this performance to their political affiliations and numeracy scores:

[Image: Full study results comparing subjects’ performance on the skin cream problem with their performance on the gun ban problem. Vertical axes plot response accuracy; horizontal axes show mathematical reasoning ability. Source: Dan Kahan]
For study author Kahan, these results are a fairly strong refutation of what is called the “deficit model” in the field of science and technology studies — the idea that if people just had more knowledge, or more reasoning ability, then they would be better able to come to consensus with scientists and experts on issues like climate change, evolution, the safety of vaccines, and pretty much anything else involving science or data (for instance, whether concealed weapons bans work). Kahan’s data suggest the opposite — that political biases skew our reasoning abilities, and this problem seems to be worse for people with advanced capacities like scientific literacy and numeracy. “If the people who have the greatest capacities are the ones most prone to this, that’s reason to believe that the problem isn’t some kind of deficit in comprehension,” Kahan explained in an interview.

So what are smart, numerate liberals and conservatives actually doing in the gun control version of the study, leading them to give such disparate answers? It’s kind of tricky, but here’s what Kahan thinks is happening.

Our first instinct, in all versions of the study, is to leap to the wrong conclusion. If you just compare which number is bigger in the first column, for instance, you’ll be quickly led astray. But more numerate people, when they sense an apparently wrong answer that offends their political sensibilities, are both motivated and equipped to dig deeper, think harder, and even start performing some calculations — which in this case would have led to a more accurate response.

“If the wrong answer is contrary to their ideological positions, we hypothesize that that is going to create the incentive to scrutinize that information and figure out another way to understand it,” says Kahan. In other words, more numerate people perform better when identifying study results that support their views — but may have a big blind spot when it comes to identifying results that undermine those views.

What’s happening when highly numerate liberals and conservatives actually get it wrong? Either they’re intuiting an incorrect answer that is politically convenient and feels right to them, leading them to inquire no further — or else they’re stopping to calculate the correct answer, but then refusing to accept it and coming up with some elaborate reason why 1 + 1 doesn’t equal 2 in this particular instance. (Kahan suspects it’s mostly the former, rather than the latter.)

The Scottish Enlightenment philosopher David Hume famously described reason as a “slave of the passions.” Today’s political scientists and political psychologists, like Kahan, are now affirming Hume’s statement with reams of new data. This new study is just one out of many in this respect, but it provides perhaps the most striking demonstration yet of just how motivated, just how biased, reasoning can be, especially about politics.



This story was produced as part of the Climate Desk collaboration.

Chris Mooney is host of the Point of Inquiry podcast and the author of four books, including The Republican War on Science and The Republican Brain: The Science of Why They Deny Science and Reality.
