Saturday, June 23, 2012

As long as groupthink encourages people to approach reality "independent of accuracy," is there any hope for reasoned discourse? (A: Nope)


On RT's Alyona Show, Chris Hayes talks about the present extreme distrust of authorities he describes in his new book Twilight of the Elites. He asks (at 3:38), "Under this environment, in which we can't find anything that we have consensus trust in, can we produce the level of social consensus necessary to make the scale of change that we need?" Here's my answer, now that I've been introduced to the concept of "motivated reasoning": not a chance in hell. (Note: Chris's conversation with Alyona continues here.)

"Once group loyalties are engaged, you can't change people's minds by utterly refuting their arguments. Thinking is mostly just rationalization, mostly just a search for supporting evidence."
-- NYU psychology professor Jonathan Haidt, to Ezra Klein, in his New Yorker piece "Unpopular Mandate"

by Ken

Does the phrase "motivated reasoning" set off any alarm bells with you? It sure didn't with me, and I had to remind myself, every time I encountered it in Ezra Klein's invaluable New Yorker piece "Unpopular Mandate," that it is in fact psychological/social-science jargon for a process that, as far as I'm concerned, makes human civilization a virtual impossibility.

And I'm afraid I did a poor job of explaining that in my post last night, "The rapid reverse polarizing of the public view of the individual mandate can tell us something deeply important and troubling about our politics." What I really wanted to write about was a piece whose title had caught my eye as passed on by Nation of Change: "Why Did Occupy Protesters Spend So Much More Time in Prison Than Tea Partiers?" I didn't know that Occupy protesters are "spend[ing] so much more time in prison than Tea Partiers," but I was certainly prepared to believe it, which I suppose is itself an illustration of the workings of "motivated reasoning" on the left side of the political spectrum. And the joke was on me, I find now, since there doesn't seem to be an actual piece here, just the nevertheless-interesting extended "infographic" I've reproduced so poorly here. By all means check it out onsite, either at Nation of Change or on Upworthy, where I gather it originated.

Certainly the title, this business about Occupy protesters being handled far more harshly by the criminal-justice system than Teabaggers, shouted out to me about the country's pervasive tilt to the right -- no, not tilt, we're way beyond a "tilt," we're perched atop a precipitous Right-plunging vertical. And in this context, I was sitting on Ezra Klein's New Yorker piece, which I knew I wanted to write about eventually, because it opened a huge window for me onto why people "think" ("think" emphatically in quotes) the way they do, based heavily on an apparent human genius for persuading ourselves, when it's convenient to do so, that we believe what our group believes, no matter how blatantly the belief may conflict with readily observable reality.

So I couldn't help bringing in a couple of especially pregnant portions of that piece, which wound up taking over without really conveying what had excited me about the piece -- and, of course, horribly depressed me. Which is the price one pays for doing things in such a half-assed way.

A lot of my problem came back to that term "motivated reasoning," which sounds so innocent but is anything but. First off, the use of that word reasoning suggests a decision-making process that involves some sort of, you know, reasoning, as in the faculty of reason. But no, all it seems to mean to the fancy-pants intellectuals who use the term is any process by which people arrive at decisions -- in other words, "reason" in the sense of the reason why people do stuff, no matter how contrary to reason. So you see, "reasoning" in this sense can mean the exact opposite of what sane people understand by it. Frankly, I'm still stunned to learn that any responsible person, let alone these large cohorts of professors, could (mis)use the word in such corrupt fashion.

Then there's the "motivated" part of "motivated reasoning." Nothing suspicious about "motivation," is there? Motivation is a good thing, no? Who isn't always looking for, hoping for, motivation? It's a pretty neutral term, right? Certainly doesn't convey anything dangerous, right? Except that in this particular usage, it means arriving at a conclusion by any means other than reality. Apparently, as we'll see in a moment, the preferred term for what un-"motivated" reasoning would aim at is "accuracy," a curiously dumbed-down-sounding version of what's apparently meant: reality. In this sense, the "motivation" not only can but does extend to any kind of lies, manipulation, and any other damned bullshit.

Put it all together, and "motivated reasoning" means "arriving at beliefs that may be partly or entirely at odds with reality simply because . . . well, because whatever, because I feel like it, what's it to you?" And Ezra, in this New Yorker piece, provides a vivid outline of the way in which "motivated reasoning" has come to dominate our political reality, and since it's a process that defies any connection to actual "reason," it appears to be beyond realistic reach.


With that background, I offer you the portion of Ezra's piece that deals with what I consider the huge subject of "motivated reasoning," with just a driblet of highlighting from me.
Jonathan Haidt, a professor of psychology at New York University's business school, argues in a new book, "The Righteous Mind," that to understand human beings, and their politics, you need to understand that we are descended from ancestors who would not have survived if they hadn't been very good at belonging to groups. He writes that "our minds contain a variety of mental mechanisms that make us adept at promoting our group's interests, in competition with other groups. We are not saints, but we are sometimes good team players."

One of those mechanisms is figuring out how to believe what the group believes. Haidt sees the role that reason plays as akin to the job of the White House press secretary. He writes, "No matter how bad the policy, the secretary will find some way to praise or defend it. Sometimes you'll hear an awkward pause as the secretary searches for the right words, but what you'll never hear is: ‘Hey, that's a great point! Maybe we should rethink this policy.' Press secretaries can't say that because they have no power to make or revise policy. They're told what the policy is, and their job is to find evidence and arguments that will justify the policy to the public." For that reason, Haidt told me, "once group loyalties are engaged, you can't change people's minds by utterly refuting their arguments. Thinking is mostly just rationalization, mostly just a search for supporting evidence."

Psychologists have a term for this: "motivated reasoning," which Dan Kahan, a professor of law and psychology at Yale, defines as "when a person is conforming their assessments of information to some interest or goal that is independent of accuracy" -- an interest or goal such as remaining a well-regarded member of his political party, or winning the next election, or even just winning an argument. Geoffrey Cohen, a professor of psychology at Stanford, has shown how motivated reasoning can drive even the opinions of engaged partisans. In 2003, when he was an assistant professor at Yale, Cohen asked a group of undergraduates, who had previously described their political views as either very liberal or very conservative, to participate in a test to study, they were told, their "memory of everyday current events."

The students were shown two articles: one was a generic news story; the other described a proposed welfare policy. The first article was a decoy; it was the students' reactions to the second that interested Cohen. He was actually testing whether party identifications influence voters when they evaluate new policies. To find out, he produced multiple versions of the welfare article. Some students read about a program that was extremely generous -- more generous, in fact, than any welfare policy that has ever existed in the United States -- while others were presented with a very stingy proposal. But there was a twist: some versions of the article about the generous proposal portrayed it as being endorsed by Republican Party leaders; and some versions of the article about the meagre program described it as having Democratic support. The results showed that, "for both liberal and conservative participants, the effect of reference group information overrode that of policy content. If their party endorsed it, liberals supported even a harsh welfare program, and conservatives supported even a lavish one."

In a subsequent study involving just self-described liberal students, Cohen gave half the group news stories that had accompanying Democratic endorsements and the other half news stories that did not. The students who didn't get the endorsements preferred a more generous program. When they did get the endorsements, they went with their party, even if this meant embracing a meaner option.

This kind of thinking is, according to psychologists, unsurprising. Each of us can have firsthand knowledge of just a small number of topics -- our jobs, our studies, our personal experiences. But as citizens -- and as elected officials -- we are routinely asked to make judgments on issues as diverse and as complex as the Iranian nuclear program, the environmental impact of an international oil pipeline, and the likely outcomes of branding China a "currency manipulator."

According to the political-science literature, one of the key roles that political parties play is helping us navigate these decisions. In theory, we join parties because they share our values and our goals -- values and goals that may have been passed on to us by the most important groups in our lives, such as our families and our communities -- and so we trust that their policy judgments will match the ones we would come up with if we had unlimited time to study the issues. But parties, though based on a set of principles, aren't disinterested teachers in search of truth. They're organized groups looking to increase their power. Or, as the psychologists would put it, their reasoning may be motivated by something other than accuracy. And you can see the results among voters who pay the closest attention to the issues.

In a 2006 paper, "It Feels Like We're Thinking," the political scientists Christopher Achen and Larry Bartels looked at a National Election Study, a poll supported by the National Science Foundation, from 1996. One of the questions asked whether "the size of the yearly budget deficit increased, decreased, or stayed about the same during Clinton's time as President." The correct answer is that it decreased, dramatically. Achen and Bartels categorize the respondents according to how politically informed they were. Among the least-informed respondents, Democrats and Republicans picked the wrong answer in roughly equal numbers. But among better-informed voters the story was different. Republicans who were in the fiftieth percentile gave the right answer more often than those in the ninety-fifth percentile. Bartels found a similar effect in a previous survey, in which well-informed Democrats were asked whether inflation had gone down during Ronald Reagan's Presidency. It had, but many of those Democrats said that it hadn't. The more information people had, it seemed, the better they were at arranging it to fit what they wanted to believe. As Bartels told me, "If I'm a Republican and an enthusiastic supporter of lower tax rates, it is uncomfortable to recognize that President Obama has reduced most Americans' taxes -- and I can find plenty of conservative information sources that deny or ignore the fact that he has."

Recently, Bartels noticed a similar polarization in attitudes toward the health-care law and the Supreme Court. Using YouGov polling data, he found that less-informed voters who supported the law and less-informed voters who opposed it were equally likely to say that "the Supreme Court should be able to throw out any law it finds unconstitutional." But, among better-informed voters, those who opposed the law were thirty per cent more likely than those who supported it to cede that power to the Court. In other words, well-informed opponents realized that they needed an activist Supreme Court that was willing to aggressively overturn laws if they were to have any hope of invalidating the Affordable Care Act.

Orin Kerr ["a George Washington University professor who had clerked for Justice Anthony Kennedy," who we learned in the opening paragraph said in 2010, "There is a less than one-per-cent chance that the courts will invalidate the individual mandate"] says that, in the two years since he gave the individual mandate only a one-per-cent chance of being overturned, three key things have happened. First, congressional Republicans made the argument against the mandate a Republican position. Then it became a standard conservative-media position. "That legitimized the argument in a way we haven't really seen before," Kerr said. "We haven't seen the media pick up a legal argument and make the argument mainstream by virtue of media coverage." Finally, he says, "there were two conservative district judges who agreed with the argument, largely echoing the Republican position and the media coverage. And, once you had all that, it really became a ballgame."

Which is more or less where I picked up last night.


I probably still haven't communicated what seem to me the overwhelming implications of the application of "motivated reasoning" to our national political discourse. It simplifies my life in one way, though. For a couple of weeks now I've been trying to find a way to write about an absorbing piece that David Swanson posted on the "War Is a Crime" (formerly "AfterDowningStreet") blog: "Can We Get Along Without Authorities?" David was writing with enthusiasm about Chris Hayes's new book, Twilight of the Elites.
Hayes charts a growing disillusionment with authorities of all variety: government, media, doctors, lawyers, bankers. We've learned that no group can be blindly trusted. "The cascade of elite failure," writes Hayes, "has discredited not only elites and our central institutions, but the very mental habits we use to form our beliefs about the world. At the same time, the Internet has produced an unprecedented amount of information to sort through and radically expanded the arduous task of figuring out just whom to trust." Hayes calls this "disorienting."

David did have a reservation, though. There's no reason to feel disoriented, he suggested, unless you've fallen into the habit of trusting these so-called authorities. And he proposes some common-sense ways of screening such authorities for credibility, ways that call for further inquiry. And he fixes on the obviously important subject of climate change.
I think people are listening to authorities -- they're just the wrong ones. Rather than listening to scientists about science, they're listening to jackasses with radio shows who know nothing about science. Back in the heyday of belief in authorities (whenever that was) people didn't always listen to scientists about evolution; many preferred to listen to charismatic charlatans rejecting evolution. Perhaps as much as restoring a willingness to trust authorities, we need to instill a desire to learn from would-be authorities enough to judge which ones deserve our trust on matters beyond our own comprehension or direct knowledge.

Climate change is not theoretical. There is evidence that can be shown to people, if they can get beyond the rejection of pointy-headed scientists that Hayes notes, and if they can also get beyond the religious belief that humans couldn't harm the earth if they tried, not to mention the religious belief that harming the earth is unimportant or desirable.

Climate change increasingly can be shown to people up-close-and-personal. And when it can't, the magic of video and photography can show it to us from elsewhere on the planet. Learning to look beyond the borders of the United States would do as much for our society as trusting intellectuals would.

And then he brings the climate-change example back to the problem of war, which, he says, resembles climate change.
Either can destroy us. Either can be better understood by looking outside the United States. Hayes, in describing well the gap between those in power in Washington and the other 99% of us, describes in particular the gap between the war deciders and that sliver of the U.S. population that actually fights the wars. But what about the gap between us and the victims of our wars? What about the power of patriotic flag-waving to overcome concern for troops, even among the troops themselves, who could and should refuse illegal orders?

This is really, really smart stuff, and it leads to this:
I suspect that the key to avoiding disorientation is to expect shortcomings. We probably shouldn't worry as much as Hayes does about whether baseball players use steroids. We should probably expect a church that lies about the finality of death to lie about other things, including child abuse. It's not all the members of the church that did that; it's a small group of very powerful people at the top. In politics too, we should recognize the corrupting influence of power. But we shouldn't fault humanity because presidents are murderous thugs. We should recognize the elite, as Hayes defines them, as elite. We should be aware of their patterns of wrong-doing, rather than fantasizing that half of them, belonging to one of the two big political parties, are our close friends and role models. That way lies disillusionment and disorientation.

We don't need to get the mechanics right. We won't fix our government by ending the filibuster or by amending the Constitution to point out that corporations aren't people and buying stuff isn't protected speech. I'm in favor of those things, but fundamentally we need to change our culture, create and follow better models, develop social capital of trust and community.

And yet, we'll need to get some of the mechanics of government right too. The founders of the United States, for all they got wrong, got power right in many ways. Presidents were denied the power to launch wars. The people's representatives were given the power to impeach presidents. The rule of law was placed above the law of rulers. We need to recover all of that. And doing so will require -- as Hayes recommends -- placing the power of the people to control the elite higher on our agenda than cheerleading for the half of the elite belonging to one of the two parties.

Hayes' book -- as is fairly typical of political books -- has a title that at first sounds optimistic, but 215 of its 240 pages are devoted to describing the disaster underway, while the last 25 pages are set aside for the topic of what we might do about it. Basically, Hayes recommends that we build a movement for progressive taxation by joining forces with upper-middle-class rightwingers. This might not be as crazy as it sounds. We don't need to find rightwingers who favor progressive taxation, but we do need to create them. No doubt that sounds extremely "elitist," but we can create such people out of their own beliefs.

This is not just smart but exciting stuff, and as I said, I've been trying to find a way to write about it. It's still an absorbing piece, and I can't encourage you strongly enough to read it. However, just now it seems beside the point. "Fundamentally," David says, "we need to change our culture, create and follow better models, develop social capital of trust and community." Oh, really?

In a culture saturated with "motivated reasoning," what's the chance of any of that happening? I would answer this the same way I answered Chris Hayes's question up top: not a chance in hell.



At 7:48 PM, Blogger Dan Lynch said...

Good post.

My response: leadership matters, and we don't have any.

At 9:41 PM, Blogger KenInNY said...

That sure doesn't help, Dan, does it?


At 4:15 PM, Anonymous Alan said...

Was interesting. If I remember correctly from my reading of the research liberals are a lot less likely to do "motivated reasoning" than conservatives. Which makes sense since a large part of conservative morality is obedience to the group. Much less so for liberals. This, of course, was omitted from the NY'er article, probably to avoid getting hate mail from all the conservatives, or perhaps because the NY'er was performing some motivated reasoning of their own.

