Curiosity killed the cat, but it might save the human

Recently I shared some thoughts on Facebook with a friend whose political views differ from mine. Reading back through the thread hours later, I noticed something missing from our conversation: I hadn’t asked my friend a single question. My words were kind, but I was laser-focused on giving my two cents and a dollar-bill explanation. The instinct to ask a question had been swallowed by the drive to assert the rightness of my position.

Turns out I’m exactly where evolution has put me, and all of us. Behavioral studies over the last several decades suggest the human brain is wired to win. What does this mean? In debates with family, friends, acquaintances, strangers in person, or strangers on the internet, our brains are not very skilled at taking in new data and making revisions. When our views are challenged, we’re physiologically inclined to “dig in,” ignore contrary data, and stay true to our current beliefs.

Knowledge is power, and awareness of our limitations offers the chance to correct for them. There’s a lot up for debate these days. Take climate change: scientific studies point emphatically toward an earth that is warming due to human activity, and if we don’t start rethinking the way we harvest energy, the consequences will grow increasingly uncomfortable. The scientific process is our best ally in correcting the innate biases that limit our ability to reason logically. Yet despite good evidence, and lots of it, that something icky is happening in the atmosphere, a large number of people in the world say nope, not happening. And not because there’s an equally overwhelming body of evidence to the contrary. Instead of sniffing the climate thing out with further research, we cut funding, shut down government webpages on climate science, and put scientists on Twitter gag orders.

Whether the topic is climate change or anything else, a void of curiosity often accompanies a non-expert’s strong point of view. I used climate change as an example because it’s one of the more extreme cases, and the evidence is measurable in a concrete way. But even in debates where the issue at hand doesn’t rest on such a data-centric premise, we often enter with our minds made up, wanting nothing more than to poke holes in the other person’s argument (while ignoring the holes in our own). Guilty as charged.

Since Donald Trump came onto the scene in 2016 and stirred the nation’s attention, controversies have come frequently, and each one kicks up a storm of public opinion. In the middle of it all, has anyone else noticed the reluctance, in ourselves and in others, to acknowledge the shortcomings of our own team? Take a known fact: one person will highlight it in bold, and another will strike it through twice. The truth, arguably, is buried somewhere in the middle.

Watching these tendencies repeat across party lines, I’ve wondered what psychological force could explain the bizarre relationship humans have with the truth. I can boil my questions down to two.

  • Why do humans often dismiss evidence as false if it’s contrary to what they believe?
  • Where does our conviction come from, and is it logical?

Several articles in print over the last year touch on these questions. The first comes from the February 27, 2017 issue of The New Yorker. “Why Facts Don’t Change Our Minds,” by journalist and author Elizabeth Kolbert, is a review of “The Enigma of Reason,” a book by cognitive scientists Hugo Mercier and Dan Sperber. Mercier and Sperber argue that human reason evolved in response to the challenges of living in communities. The ability to collaborate was “a distinct advantage of humans, and on the savannahs in Africa, social standing in the group mattered.” Our reasoning skills evolved in the context of gaining advantage within a group, and this group-versus-individual context has implications for our ability to reason toward truth in society today.

Experiments show – 1. Logic doesn’t rule

In her article, Kolbert references a study by Mercier and his European colleagues whose results suggest logical reasoning is not a dominant human instinct.

In the first phase of the experiment, participants were asked to answer a series of simple reasoning problems.

  • Participants had to explain their answers and were given the opportunity to modify them.
  • Fewer than 15% chose to modify their answers, indicating most were satisfied with their original responses.

In the second phase, participants were shown one of the same reasoning problems, along with their own answer and the answer of someone who had reached a different conclusion. The catch: the answers were swapped, so that each participant’s own answer was presented as someone else’s.

  • ~50% of the participants caught on to the trick.
  • Of the other half, nearly 60% rejected the answers they had previously reviewed and approved.

In short, more than half of the remaining participants rejected their own answers when those answers were presented as someone else’s, roughly 30% of everyone in the study.

Experiments show – 2. It’s hard for us to change our minds

The results of an experiment conducted at Stanford in 1975, also noted by Kolbert, suggest that once a person believes something to be true, it’s difficult to change their mind.

In the first part of the experiment, undergraduate students were asked to distinguish real suicide notes from fake ones.

  • Some students were told they scored high for accuracy, while another group was told they had guessed more than half incorrectly.
  • These scores were completely fake.
  • After the exercise, the students were told the scores were fake.

In part two, the students were asked to estimate how many notes they had actually identified correctly, and how many they thought an average student would get right.

  • The students in the group who were told they scored high for accuracy thought they had done significantly better than the average student would.
  • Students in the group who were told they scored low for accuracy thought they had done significantly worse than the average student.

The researchers noted, “Even after evidence for their beliefs is totally refuted, people fail to make appropriate revisions in those beliefs.” In an age of fake news, Twitter, and fabricated studies, this is one of many cases in which the environment has changed too quickly for natural selection to catch up.

Experiments show – 3. We think we know more than we actually do

Kolbert references yet another factor complicating our ability to reason logically. In their book “The Knowledge Illusion: Why We Never Think Alone,” cognitive scientists Steven Sloman and Philip Fernbach postulate that humans do not think as individuals but as members of a group, drawing on a shared “community of knowledge.” We depend on other people’s knowledge to help us meet our needs. The consequence is what’s referred to as the “illusion of explanatory depth”: it’s challenging for a person to perceive where their own knowledge ends and someone else’s begins.

Sloman and Fernbach’s book references a study done at Yale where participants were tested on their knowledge of the workings of everyday items.

  • Participants were asked to rate their understanding of devices like zippers and toilets, then write out detailed, step-by-step explanations of how they work.
  • After writing the steps, they rated their understanding of the devices again.
  • Self-assessments dropped once participants had been forced to spell out, in discrete steps, how each device operates.

From Kolbert, “In a study conducted in 2012, they (Sloman and Fernbach) asked people for their stance on questions like: Should there be a single-payer health-care system? Or merit-based pay for teachers? Participants were asked to rate their positions depending on how strongly they agreed or disagreed with the proposals. Next, they were instructed to explain, in as much detail as they could, the impacts of implementing each one. Most people at this point ran into trouble. Asked once again to rate their views, they ratcheted down the intensity, so that they either agreed or disagreed less vehemently. They argue that if we, or our friends, or the pundits on CNN spent less time pontificating and more time trying to work through the implications of policy proposals, we’d realize how clueless we are and try to moderate our views.”

“As a rule, strong feelings about issues do not emerge from deep understanding,” Sloman and Fernbach write. “This is how a community of knowledge can become dangerous.”

In an article for USA Today, journalist Alia E. Dastagir adds a personal dimension to why humans resist facts: “People reject science when asked to believe something that conflicts with a deeply held view.”

Christopher Graves, president and founder of the Ogilvy Center for Behavioral Science, explains to Dastagir: “A human being cannot grasp something as a fact if it in any way undermines their identity. And that is an immutable human foible. These things have always been there, but not at scale,” he states. “You have a basic psychological tendency to perpetuate your own beliefs, to discount anything that runs against your own prior views.”

Furthermore, explains Brendan Nyhan, a political scientist at Dartmouth College, “Once we’ve convinced ourselves of something, research suggests facts don’t appeal to us. When you encounter facts that don’t support your idea, your belief in that area actually grows stronger.”

What we can do

Even though the human mind is fallible, we are capable of taking stock, reflecting, and adapting to solve problems. Being aware of the evolutionary limits on human reasoning can be the first step toward a more peaceful internet, neighborhood, and family. The flash point comes deep in a conversation, when emotions run strong: do we have the self-control to take a long view of what we’re saying and why? Are our words formed and carried by the need to be right, by the need to preserve our sense of identity, by hubris? Or are we open to accepting contrary information, and even more radically, to changing our minds?

Another avenue to progress, according to scientists, is through curiosity. Curiosity is the antidote to circular arguments, dead-end debates, and meanness. It shifts the focus off our own primal agenda to win and onto a path of resolution.

Exercising curiosity is an exercise in humility. Acknowledging that we don’t have it all figured out is implicit in debating curiously. And in another great irony of the universe, the more readily we admit the gaps in our knowledge, the closer we can get to the truth.

 

References:

Kolbert, Elizabeth. “Why Facts Don’t Change Our Minds.” The New Yorker, February 27, 2017. http://www.newyorker.com/magazine/2017/02/27/why-facts-dont-change-our-minds

Dastagir, Alia E. USA Today, April 20, 2017. https://www.usatoday.com/story/news/2017/04/20/science-march-war-truth-political-polarization/100636124/

 
