Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, January 28, 2017

Locking yourself into error

I got in a rather interesting -- well, I suppose you could call it a "discussion" -- with a Trump supporter yesterday.

It came about because of recent posts here at Skeptophilia that have been pretty critical of the president, his appointees, and their decisions.  After a few minutes of the usual greetings and pleasantries ("You're a liberal lackey who sucks up what the lying mainstream media says without question!", stuff like that), I asked her what to me is the only pertinent question in such situations:

"What would it take to convince you that you are wrong?"

"I'm not wrong," she said.

"That's not what I asked," I responded.  "I asked what would it take to convince you that you are wrong.  About Donald Trump.  Or about anything."

"What would it take to convince you?" she shot back.

"Facts and evidence that my opinion was in error.  Or at least a good logical argument."

"People like you would never believe it anyway.  You're swallowing the lies from the media.  Thank God Donald Trump was elected despite people like you and your friends in the MSM."

"And you still haven't answered my question."

At that point, she terminated the conversation and blocked me.

Couple that with a second comment from a different person -- one I elected not to respond to, because eventually I do learn not to take the bait -- saying that of course I have a liberal bias "since I get my information from CNN," and you can see that the fan mail just keeps rolling in.

Of course, the question I asked the first individual isn't original to me; it lay at the heart of the single most pivotal moment in the never-to-be-forgotten debate between Ken Ham and Bill Nye over the theory of evolution in February of 2014, when the moderator asked each man what, if anything, would change his mind.  Nye said:
We would need just one piece of evidence.  We would need the fossil that swam from one layer to another.  We would need evidence that the universe is not expanding.  We would need evidence that the stars appear to be far away but are not.  We would need evidence that rock layers could somehow form in just 4,000 years…  We would need evidence that somehow you can reset atomic clocks and keep neutrons from becoming protons.  Bring on any of those things and you would change me immediately.
Ham, on the other hand, gave a long, rambling response that can be summed up as "Nothing would change my mind.  No evidence, no logic, nothing."

The whole thing dovetails perfectly with a paper released just two days ago in the journal Political Psychology.  The paper, entitled "Science Curiosity and Political Information Processing," by Dan M. Kahan, Asheley Landrum, Katie Carpenter, Laura Helft, and Kathleen Hall Jamieson, looks at the connection between scientific curiosity and a willingness to consider information that runs counter to one's own political biases and preconceived notions.  The authors write:
[S]ubjects high in science curiosity display a marked preference for surprising information—that is, information contrary to their expectations about the current state of the best available evidence—even when that evidence disappoints rather than gratifies their political predispositions.  This is in marked contrast, too, to the usual style of information-search associated with [politically-motivated reasoning], in which partisans avoid predisposition-threatening in favor of predisposition-affirming evidence. 
Together these two forms of evidence paint a picture—a flattering one indeed—of individuals of high science curiosity. In this view, individuals who have an appetite to be surprised by scientific information—who find it pleasurable to discover that the world does not work as they expected—do not turn this feature of their personality off when they engage political information but rather indulge it in that setting as well, exposing themselves more readily to information that defies their expectations about facts on contested issues.  The result is that these citizens, unlike their less curious counterparts, react more open-mindedly and respond more uniformly across the political spectrum to the best available evidence.
And maybe that's what's at the heart of all this.  I've always thought that the opposite of curiosity is fear -- those of us who are scientifically curious (and I will engage in a bit of self-congratulation and include myself in this group) tend to be less afraid of being found wrong, and more concerned with making sure we have all our facts straight.

[image courtesy of the Wikimedia Commons]

So I'll reiterate my question, aimed not only toward Trump supporters, but to everyone: what would it take to convince you that you are wrong?  About your political beliefs, religious beliefs, moral stances, anything?  It's a question we should keep in the forefront of our minds all the time.

Because once you answer that question with a defiant "nothing could convince me," you have effectively locked yourself into whatever errors you may have made, and insulated yourself from facts, logic, evidence -- and the truth.

4 comments:

  1. This is good, but I wonder if there are some... positions? that are axiomatic enough that they can't be interrogated in this way. Like, what would it take to convince me that, after all, yes, you really should be mean to people? I don't even know what that would be like.

  2. You are getting to the core consideration in this post. Reminds me of one of my favorite refrigerator bumper stickers: "It is easier to fool people than it is to convince them that they have been fooled." -- Mark Twain

  3. Matthew E: " I wonder if there are some... positions? that are axiomatic enough that they can't be interrogated in this way."

    I find that an excellent point, and one that we discuss in decision science--any system, logical systems included, must start with basic axioms that are accepted rather than proved, since proving them would require still more axioms, ad infinitum--turtles all the way down. ;)

    In science, one of the axioms is "Objective reality, fact, and reason exist, no matter how hard they are to see, prove, and test."

    For subjectivists it is "There is no such thing as objective reality, truth, and reason, everything is subjective, and just a matter of one's point of view."

    I cannot see any way to convince subjectivists that they are wrong. Can you?

    Thanks for a thoughtful article on an important topic, Gordon.

  4. Perhaps objective reality, fact, and reason exist, but no human is capable of approaching them without some level of bias and subjectivity. We can do our best to make sure our conclusions fit all available evidence as well as possible, while acknowledging that another person's point of view might lead her to draw different conclusions or emphasise evidence that we may have found less important. This approach gives us a basis for discounting views that are not based in evidence at all or on a carefully preselected set of evidence. But it acknowledges that what appears to be objectivity and reason has often been merely conformity with the status quo of a dominant class or culture.
