Wednesday, August 24, 2016

Are Our Brains Already Pre-Wired To Determine The 2016 Election?-- A Guest Post By Daniel Levitin

Our guest blogger today is Dr. Daniel J. Levitin, a neuroscientist who has written for us a few times over the years. Dan and I have been close friends for over three decades. He is perhaps best known as the author of the mega-bestsellers This Is Your Brain On Music (which is required reading at Harvard for all incoming freshmen) and The Organized Mind. His new book is timed for the start of this election season and is a primer on critical thinking for everyone. It's called A Field Guide to Lies, and I urge you to buy it-- I read it in galleys and it will change the way you read the news, I guarantee it. You can pre-order it here. Dan gave me two signed advance copies of the book, and I'll send them out as a thank-you from Blue America today to the two most generous contributors to either Grayson on this page.

Recently he told me that "in 1996, five Mt. Everest hikers lost their lives because they did not allow new and relevant information to alter their views about the safety of proceeding. The 2007 global financial crisis has been traced to belief perseverance, when financial experts stuck with the status quo in spite of new evidence of weakening financial instruments." We talked about how the brain's workings in this regard shape politics, and he added that there's a "need to nudge ourselves to think about how things might turn out if they don't go the way we think they will-- to imagine extreme, but still realistic, scenarios. Use pre-mortems-- think ahead to all the things that could go wrong, and what might be the effects of these failures. Part of practicing the pre-mortem is recognizing that we are fallible and will make mistakes, succumbing to biases. What if a short-tempered candidate said the wrong thing to the wrong people at the wrong time? What if a candidate who wants to expand social support systems is confronted instead with a dire financial crisis? Which of these scenarios would yield the worst outcome? Then identify what you would need to know in order to endorse or reject a candidate-- what are the real deal-breakers and deal-makers for you? Finally, take a dispassionate look at what the candidates really-and-truly stand for, not just what you've assumed they do."




Why It's So Hard To Learn Anything New About The Candidates
-by Daniel Levitin


Many of us decided months ago which of the two presumptive presidential nominees, Donald Trump or Hillary Clinton, we would support. The contrast between them is pronounced-- it has been called the starkest in the 52 years since the LBJ/Goldwater contest of 1964.

As new information comes in from journalists about the candidates' trustworthiness, rationality, attitudes about race, and views on the responsibilities and functions of a free press, most of us will tend to ignore it. This is because of confirmation bias and belief perseverance, cognitive shortcuts that cause us to discard information that contradicts things we think we already know. Our brains can be lazy, and don't want to keep revisiting old decisions. That saves precious neural resources, but it can lead to devastatingly bad decisions.
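To see how little contradictory evidence can move a made-up mind, here is a toy simulation (my own illustrative sketch with arbitrary numbers-- not a model from the research Dan cites). A voter's leaning is tracked as a log-odds score, and the "biased" voter shrinks any piece of news that cuts against that leaning:

    import math

    def sigmoid(x):
        # Convert a log-odds score into a probability between 0 and 1.
        return 1 / (1 + math.exp(-x))

    def update(logit, evidence, discount):
        # One belief update. evidence is +1 (favorable news) or -1 (unfavorable).
        # A discount below 1 shrinks evidence that contradicts the current leaning;
        # discount = 1 means all evidence is weighed equally.
        weight = discount if evidence * logit < 0 else 1.0
        return logit + weight * evidence

    news = [-1, -1, +1, -1, -1, -1]   # a mostly unfavorable stream of reports

    open_minded = biased = 2.0        # both start strongly pro-candidate (p ~ 0.88)
    for e in news:
        open_minded = update(open_minded, e, discount=1.0)
        biased = update(biased, e, discount=0.2)

    print(f"open-minded voter: p = {sigmoid(open_minded):.2f}")   # ~0.12
    print(f"biased voter:      p = {sigmoid(biased):.2f}")        # ~0.88

After six reports, five of them unfavorable, the open-minded voter has largely changed their mind, while the biased voter ends up almost exactly where they started. The 0.2 discount is an arbitrary assumption; the point is only that down-weighting disconfirming evidence makes a belief nearly immovable.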

Every time the press reports something negative about Donald Trump, his supporters will tend to dismiss it as biased or irrelevant. After all, their candidate is "an outsider" who "bucks the establishment," while "the media want to keep the status quo and are against someone who tells it like it is." Every time the press reports something negative about Hillary Clinton, her supporters will tend to dismiss it, believing that their candidate is the victim of a "Clinton-bashing media" who have had it in for her and her husband for more than three decades.

Those initial decisions are often based on emotion and intuition, and we then cherry-pick evidence that allows us to maintain our view. When a candidate is shown to have committed some unsavory act, we find ourselves saying, "Yeah, but (s)he's still the best person for the job." Drew Westen at Emory University found that people making decisions in this mode during the 2004 presidential election failed to use the parts of their brains associated with deliberate, logical thinking (in the prefrontal cortex), and instead engaged brain regions associated with sympathy. This in turn leads voters to give the benefit of the doubt to their preferred candidate but not to the opponent.

What kind of evidence would it take to unseat these biases? How strong an argument would have to be presented to make us change our minds? More than you'd think, and more than would seem rational or adaptive.

In a classic study, students at Stanford University were shown photos of people while hearing what they thought was a playback of their own heartbeat-- the heartbeat would speed up at points randomly determined by the experimenter. The students were thus led to believe that they felt more strongly about some individuals than others. After all, physiology doesn't lie.

At the end of the experiment, the researchers explained that the heartbeat had been computer manipulated and didn't correspond to their true judgments at all. Asked to rate whom they felt the strongest about, the students were overwhelmingly biased toward the photo that had been accompanied by the accelerated heart rate.

Think about this: The only evidence the students had for which person they felt strongest about had just been removed, yet they tenaciously clung to their initial belief. Social scientist Emily Thorson of George Washington University calls these "belief echoes," and her research confirms that exposure to political information continues to shape attitudes long after that information has been discredited.

Trial attorneys know this well: they will often make a defamatory remark about a witness or defendant, knowing it will draw an objection that the judge will sustain. But if the remark has caused the jury (and possibly the judge) to form a negative impression, that impression can take hold and govern the outcome of the trial, even after the remark has been shown to be false.


Politics, like high school, is partly (maybe largely) tribal, ideological, and emotional. But it doesn't have to be. In this increasingly interconnected global economy, each of us has a stake in how our leaders act, on our behalf, toward us, our neighbors, our trade partners, and even our enemies. Emotions are most useful when they motivate us to act, but those actions should be based on reason.

You wouldn't board an airplane that was designed by someone using emotion as their sole blueprint-- and you shouldn't want a country governed by someone the voters have not properly vetted. Political parties and candidates are not like a hometown sports team that you stick with through thick and thin-- political issues are complex, and many of us find that we agree with some of the things being said by candidates in both parties.

Overcoming these biases is necessary to being informed, and an informed electorate is fundamental to any democracy. Thomas Jefferson is often credited with the view that democracy (or a democratic republic) "rests on the foundation of an educated electorate," and another remark commonly attributed to him warns that "Democracy is nothing more than mob rule, where fifty-one percent of the people may take away the rights of the other forty-nine percent." The whole point of democracy is to prevent the more powerful from dominating the less powerful. Our best defense against that-- against devolving into a mob-- is to apply educated reason, arrived at through an analysis of facts.

Sure, a good candidate should stir emotion and patriotism, and should inspire. But to what end? Overcoming our biases requires three steps. First, we need to be aware of them: in a McKinsey study, investors who were made aware of cognitive biases were able to overcome them and increase their profits by 7%. Next, we need to make an extra effort to slow down, evaluate information from multiple sides objectively, and be ready to change our views. Psychologists Philip Tetlock and Jennifer Lerner call this the shift from confirmatory thought to exploratory thought. Finally, as my colleague Daniel Kahneman notes, there's a better chance of overcoming biases when discussion of them is widespread. So discuss your views with others-- and not only with people who agree with you, since you don't learn anything from those who already share your views. The press is helping us wade through all of the conflicting claims by fact-checking and contextualizing them. Listening with an open mind is up to us.




UPDATE:

It looks like Dan Piraro was inspired by his old friend Dan Levitin:



1 Comment:

At 10:28 AM, Anonymous said...

Do I detect a distinct bias in favor of TWO parties, only?

John Puma

 
