
Tuesday, September 26, 2017

Right in the gut

I know I've said it before, but it bears saying again: the strength of science lies in its reliance on hard evidence as the sine qua non of understanding.

I've tried to embrace this outlook myself, insofar as a fallible and biased human can do so.  Okay, so every day I poke fun at all sorts of odd beliefs, sometimes pissing people off.  But you know what?  You want to convince me, show me some reliable evidence.  For any of the claims I've scoffed at.  Bigfoot.  Ghosts.  ESP.  Astrology.  Tarot divination.  Homeopathy.

Even the existence of god.

I'm convinceable.  All you have to do is show me one piece of irrefutable, incontrovertible evidence, and I'm sold.

The problem is, to my unending frustration and complete bafflement, most people don't approach the world that way.  Instead, they rely on their gut -- which seems to me to be a really good way to get fooled.  I'm a pretty emotional guy, and I know my gut is unreliable.

Plus, science just doesn't seem to obey common sense at times.  As an example, consider the Theory of Relativity.  Among its predictions:
  • The speed of light is the ultimate universal speed limit.
  • Light moves at the same speed in every reference frame (i.e., your own speed relative to the beam of light doesn't matter; you'll still measure it as traveling at 300,000,000 meters per second).
  • When you move, time slows down.  The faster you move, the slower time goes.  So if you took a round trip to Alpha Centauri in a rocket ship traveling at 95% of the speed of light, you'd find when you got back that while only about three years had passed for you, a bit over nine years would have passed on Earth (see the quick calculation below).
  • When you move, to a stationary observer your mass increases and your length in the direction of motion contracts.  The faster you move, the more pronounced these effects become.
And so on.  But the kicker: all of these predictions of the Theory of Relativity have been experimentally verified.  As counterintuitive as this might be, that's how the world is.  (In fact, relativistic effects have to be taken into account for GPS to be accurate.)
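To put numbers on that time-dilation bullet: the Lorentz factor is gamma = 1/sqrt(1 - v^2/c^2).  Here's a minimal Python sketch (mine, not from the study or anyone else) that applies it to the Alpha Centauri trip, assuming a distance of about 4.37 light-years and a constant cruising speed of 0.95c, and ignoring acceleration and deceleration.

    import math

    def lorentz_factor(beta):
        """Return gamma = 1 / sqrt(1 - beta**2), where beta is speed as a fraction of c."""
        return 1.0 / math.sqrt(1.0 - beta ** 2)

    # Illustrative assumptions: Alpha Centauri is ~4.37 light-years away,
    # and the ship cruises at a constant 0.95c the whole way there and back.
    distance_ly = 4.37
    beta = 0.95

    gamma = lorentz_factor(beta)              # ~3.2
    earth_years = 2 * distance_ly / beta      # round-trip time measured on Earth, ~9.2 years
    ship_years = earth_years / gamma          # proper time experienced on the ship, ~2.9 years

    print(f"gamma: {gamma:.2f}, Earth frame: {earth_years:.1f} yr, ship frame: {ship_years:.1f} yr")

Run it and you get roughly three years of ship time against nine years of Earth time, which is where the figures in the list above come from.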

None of which we would know now if people relied solely on their gut to tell them how things work.

Despite all this, there are people who still rely on impulse and intuition to tell them what's true and what's not.  And now a study jointly conducted by researchers at Ohio State University and the University of Michigan has shown that if you do this, you are more prone to being wrong.

[image courtesy of the Wikimedia Commons]

Kelly Garrett and Brian Weeks decided to look into the connection between how people view evidence and their likelihood of falling for incorrect information.  They looked at survey data from almost 3,000 people, focusing in particular on whether or not the respondents agreed with the following statements:
  • I trust my gut to tell me what’s true and what’s not. 
  • Evidence is more important than whether something feels true.
  • Facts are dictated by those in power.
They then correlated the responses with the participants' likelihood of believing a variety of conspiracy theories.  Unsurprisingly, they found that the people who relied on gut feelings and emotions to determine the truth were far more likely to fall for conspiracies and outright untruths.

"Misperceptions don’t always arise because people are blinded by what their party or favorite news outlet is telling them," Weeks said.  "While trusting your gut may be beneficial in some situations, it turns out that putting faith in intuition over evidence leaves us susceptible to misinformation."

"People sometimes say that it’s too hard to know what’s true anymore," Garrett said.  "That’s just not true.  These results suggest that if you pay attention to evidence you’re less likely to hold beliefs that aren’t correct...  This isn’t a panacea – there will always be people who believe conspiracies and unsubstantiated claims – but it can make a difference."

I'd say it makes all the difference.  And in the current political environment -- where accusations of "fake news" are thrown around right and left, and what people consider to be the truth depends more on political affiliation than on rational fact -- it's more essential than ever.
