Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Saturday, April 29, 2017

Awoo

Yesterday I was asked by one of my Critical Thinking students if I'd ever heard of Florida Swamp Apes.  After a brief moment in which I wondered if he were asking about a sports team, I answered in the negative.  He brought out his cellphone, on which he had downloaded an admittedly creepy image, which I include below:



Imagine my surprise when I found out that there's a whole site devoted to this odd beast, also called the "Florida Skunk Ape" for its strong smell.  Considered to be the "southernmost Bigfoot species in the United States," the Florida Skunk Ape has been sighted all over southern Florida, but most commonly in the Everglades region.

As with most of these alleged animals, the claims of sightings are numerous and varied, and the hard evidence essentially non-existent.  There are a lot of photographs, but to borrow a line from the astronomer Neil deGrasse Tyson, there probably is an "Add Bigfoot" button in Photoshop, so we shouldn't consider the photographic evidence to be evidence at all.  Also on the website is an audio clip of a Skunk Ape's howls, which to my ear sounded more like a distant dog, or possibly a guy going "Awoo."  We also have an interview with Dave Shealy, who seems to be one of the people responsible for the whole Skunk Ape phenomenon (he is the director of the Skunk Ape Research Center of Ochopee, Florida, open 7 AM to 7 PM, admission $5, which I am definitely going to visit next time I'm in Florida).  Lastly, we are informed that Skulls Unlimited, a company which sells a virtually unlimited number of skulls (thus the name), is now offering resin models of Bigfoot skulls.  One has to wonder what they cast the mold from, but in the field of cryptozoology it is sometimes best not to ask too many questions.

I thought I had heard of most of the cryptozoological claims from the United States, but this one was new to me.  Of course, the Sasquatch of the Pacific Northwest is so familiar by now as to elicit yawns, and many of us know of the Boggy Creek Monster of Fouke, Arkansas, which generated not one, nor two, but five truly dreadful movies.  There's Mothman and the Flatwoods Monster in West Virginia, the Dover Demon of Massachusetts, the Enfield Monster of Illinois, Goatman of Maryland, and dozens of others.  But the Skunk Ape is one I'd never heard of before, and I'm finding myself wondering how I missed it.  It did cross my mind briefly that perhaps the Skunk Ape sightings were merely elderly Bigfoots from the north who had moved to Florida when they retired, but apparently this is incorrect, as one site talks about a sighting of a "young and vigorous animal, probably an adolescent" and another refers to "Skunk Ape mating season"  (May, if you're curious; but you might want to refrain from wandering around the swamps of Florida in May, because two female European tourists tell the story of being chased by a "huge male Skunk Ape with an erection."  They got away, fortunately.)

"Not everyone who sees a Skunk Ape reports it," Dave Shealy says.  "They don't want people to poke fun at 'em, or to tell 'em they're crazy. That's not the exception; that's pretty much the rule...  There's never been a documented case of anyone ever being physically attacked by a Skunk Ape.  But also, there's a lot of people that go into the Everglades that never come out." 

Which really isn't all that reassuring.

In any case, the Florida Skunk Ape gives us yet another line in the ledger of Extraordinary Claims Requiring Extraordinary Evidence Of Which There Seems To Be None.  It's just as well, because it's the last week of April, so Skunk Ape mating season is almost upon us, and if there really was evidence that this thing exists I would feel duty-bound to go investigate, and the last thing I want is to be chased around in some godforsaken swamp by a Bigfoot with a boner.  So I think I'll give this one a pass.  

Friday, April 28, 2017

Playing on the heartstrings

I'm a pretty emotional guy, and one of the things that never fails to get me is music.  Some musical moments always grab me by the feels and swing me around, sometimes to the point of tears.
Then there are others that send chills up my spine.
I've always been fascinated by this capacity for music to induce emotion.  Such a response is nearly universal, although which music causes tears or that little frisson up the spine varies greatly from person to person.  Most of Mozart's music (with the exception of the Requiem and a couple of other pieces) really doesn't do much for me.  It's pleasant to listen to, but doesn't evoke much in me other than that.  I actively dislike Chopin, Brahms, and Mahler, and I know people for whom those are the absolute pinnacle of emotional depth in music.

[image courtesy of the Wikimedia Commons]

In a paper released just last week in Scientific Reports, neurophysiologists Kazuma Mori and Makoto Iwanaga of Osaka University looked into an explanation for how this phenomenon happens, if not exactly why it happens.  Their paper, "Two Types of Peak Emotional Responses to Music: The Psychophysiology of Chills and Tears," describes experiments they ran in which they had test subjects listen to music while monitoring their reactions, not only via subjective description but by physiological measures such as skin conductance (a common index of arousal).

And what happened was pretty cool.  They found that (as I have done above) strongly evocative pieces of music tended to fall into two categories, ones that elicit tears and ones that elicit chills.  The authors write:
The present study investigated the psychophysiological responses of two types of peak emotions: chills and tears.  We used music as the stimuli because the chills response has been confirmed in music and emotion studies... The chills and tears responses were measured by self-report sensations during song listening.  We conducted an experiment measuring subjective emotions and autonomic nervous system activity.  The hypothesis was that tears would be different from chills in terms of both psychological and physiological responses.  With respect to psychophysiological responses, we predicted that chills would induce subjective pleasure, subjective arousal, and physiological arousal whereas tears would induce subjective pleasure, relaxation, and physiological calming.  In addition, we asked participants to rate song expression in terms of happiness, sadness, calm, and fear in order to understand the emotional property of chills-inducing songs and tear-inducing songs...  [The] results show that tears involve pleasure from sadness and that they are psychophysiologically calming; thus, psychophysiological responses permit the distinction between chills and tears.  Because tears may have a cathartic effect, the functional significance of chills and tears seems to be different.
Which supports the contention that my experience of bawling the first time I listened to Ralph Vaughan Williams's Fantasia on a Theme by Thomas Tallis served the purpose of emotional catharsis.  I know my mood was better after the last chords died out, even if I felt a little like a wrung-out dishrag; and despite the fact that I don't exactly like crying, I listen to these tear-evoking pieces of music over and over.  So there must be something there I'm seeking, and I don't think it's pure masochism.  The authors write:
The current results found that the mixed emotion of chills was simultaneous pleasure, happiness, and sadness.  This finding means that chills provide mainly a positive experience but the sadness factor is necessary even though a favourite song is the elicitor.  Given that music chills activate reward-related brain regions, such an emotional property could make chills a unique experience and separate chills from other mixed emotional experiences.  Furthermore, as the mixed emotion of tears was simultaneous pleasure and sadness, it was different from the mixed emotion of chills.  The tears response contributes to the understanding of the pleasure of sad music.  As people generally feel displeasure for sad things, this is a unique mixed emotional response with regard to music.  Although previous studies showed that sad music induced relatively weak pleasure, the current tears’ results showed that sad songs induced strong pleasure.  It is difficult to account for why people feel sad music as pleasurable; however, the current results suggested that the benefit of cathartic tears might have a key role in the pleasure generated by sad music.  Therefore, the two types of peak emotional responses may uniquely support knowledge of mixed emotion.
So that's pretty awesome, and it's nice to know that I'm not alone in my sometimes overwhelming response to music.  And now I think I'll go listen to Shostakovich's Symphony #5 and have a nice long cry.  I know I'll feel better afterwards.

Thursday, April 27, 2017

Going to the dogs

I am the proud owner of two dogs, both rescues, who are at this point basically members of the family whose contributions to the household consist of barking at the UPS guy, sleeping most of the day, getting hair all over everything, and making sure that we get our money's worth out of the carpet steamer we bought five years ago.

First, there's Lena the Wonder-Hound:


And her comical sidekick, Grendel:


Both of them are sweet and affectionate and spoiled absolutely rotten.  Lena's ancestry is pretty clear -- she's 100% hound, probably mostly Bluetick Coonhound, Redbone, and Beagle -- but Grendel's a bit of a mystery.  Besides his square face and coloration, other significant features are: (1) a curly tail; (2) a thick undercoat; and (3) a tendency to snore.  This last has made us wonder if he has some Pug or Bulldog in his background somewhere, but that's only speculation.

This all comes up because of a recent delightful study in one of my favorite fields, cladistics.  The idea of cladistics is to create a tree of descent for groups of species based on most recent common ancestry, as discerned from overlap in DNA sequences.  And a group of researchers -- Heidi G. Parker, Dayna L. Dreger, Maud Rimbault, Brian W. Davis, Alexandra B. Mullen, Gretchen Carpintero-Ramirez, and Elaine A. Ostrander of the Comparative Genomics Branch of the National Human Genome Research Institute -- have done this for 161 breeds of dog.
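If you want a feel for how this kind of tree-building works under the hood, here's a minimal sketch in Python using a handful of made-up pairwise genetic distances.  The real study worked from genome-wide marker data and proper phylogenetic methods, so treat this purely as an illustration of the clustering idea, not a reconstruction of their analysis:

```python
# Toy illustration of building a tree from pairwise genetic distances.
# The breeds are real, but the distance values are invented for the sake
# of the example; the actual study used genome-wide marker data.
from scipy.cluster.hierarchy import dendrogram, linkage
from scipy.spatial.distance import squareform

breeds = ["Bull Terrier", "Bulldog", "Mastiff", "Scottish Terrier", "Basenji"]

# Symmetric matrix of hypothetical distances (smaller = more closely related)
dist_matrix = [
    [0.00, 0.15, 0.18, 0.40, 0.60],
    [0.15, 0.00, 0.12, 0.42, 0.61],
    [0.18, 0.12, 0.00, 0.43, 0.62],
    [0.40, 0.42, 0.43, 0.00, 0.58],
    [0.60, 0.61, 0.62, 0.58, 0.00],
]

# Condense the square matrix and cluster by average linkage (UPGMA-style)
tree = linkage(squareform(dist_matrix), method="average")

# Recover the leaf ordering without plotting anything
leaves = dendrogram(tree, labels=breeds, no_plot=True)["ivl"]
print(" | ".join(leaves))  # the Basenji ends up at one end, as the outgroup
```

With these invented numbers, the bull-and-mastiff types cluster together, the Scottish Terrier sits farther out, and the Basenji lands on its own branch, which is roughly the pattern the discussion below describes.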

The authors write:
The cladogram of 161 breeds presented here represents the most diverse dataset of domestic dog breeds analyzed to date, displaying 23 well-supported clades of breeds representing breed types that existed before the advent of breed clubs and registries.  While the addition of more rare or niche breeds will produce a denser tree, the results here address many unanswered questions regarding the origins of breeds.  We show that many traits such as herding, coursing, and intimidating size, which are associated with specific canine occupations, have likely been developed more than once in different geographical locales during the history of modern dog.  These data also show that extensive haplotype sharing across clades is a likely indicator of recent admixture that took place in the time since the advent of breed registries, thus leading to the creation of most of the modern breeds.  However, the primary breed types were developed well before this time, indicating selection and segregation of dog populations in the absence of formal breed recognition.  Breed prototypes have been forming through selective pressures since ancient times depending on the job they were most required to perform.  A second round of hybridization and selection has been applied within the last 200 years to create the many unique combinations of traits that modern breeds display.  By combining genetic distance relationships with patterns of haplotype sharing, we can now elucidate the complex makeup of modern dogs breeds and guide the search for genetic variants important to canine breed development, morphology, behavior, and disease.
Which is pretty cool.  What I found most interesting about the cladogram (which you can see for yourself if you go to the link provided above) is that breeds that are often clustered together, and known by the same common name -- such as "terrier" -- aren't necessarily closely related.  This shouldn't be a surprise, of course; all you have to do is look at the relationships between birds called "buntings" or "sparrows" or "tanagers" to realize that common names tell you diddly-squat about actual genetic distance.  But it was still surprising to find that (for example) Bull Terriers and Staffordshire Terriers are more closely related to Bulldogs and Mastiffs than they are to Scottish Terriers; that Corgis are actually related to Greyhounds; and that Schnauzers, Pugs, Pomeranians, and Schipperkes are all on the same clade.  The outgroup (the most distantly related branch) of the entire tree is the peculiar Basenji, a Central African breed with a strange, yodel-like bark, a curly tail, and pointed ears, whose image has been recorded almost unchanged all the way back to the time of the ancient Egyptians.

Anyhow, it's an elegant bit of research, and sure to be of interest to any other dog owners in the studio audience.  Me, I'm wondering where Grendel fits into the cladogram.  Considering his peculiar set of traits, he might have a branch all his own, and give the Basenji a run for its money as the oddest breed out there.

Wednesday, April 26, 2017

In your right mind

Another peculiarity of the human brain is lateralization, which is the tendency of the brain to have a dominant side.  It's most clearly reflected in hand dominance; because of the cross-wiring of the brain, people who are right-handed have a tendency to be left brain dominant, and vice versa.  (There's more to it than that, as some people who are right handed are, for example, left eye dominant, but handedness is the most familiar manifestation of brain lateralization.)

It bears mention at this juncture that the common folk wisdom that brain lateralization has an influence on your personality -- that, for instance, left brain dominant people are sequential, mathematical, and logical, and right brain dominant people are creative, artistic, and holistic -- is complete nonsense.  That myth has been around for a long while, and has been roundly debunked, but still persists for some reason.

I first was introduced to the concept of brain dominance when I was in eighth grade.  I was having some difficulty reading, and my English teacher, Mrs. Gates, told me she thought I was mixed-brain dominant -- that I didn't have a strongly lateralized brain -- and that this often led to processing disorders like dyslexia.  (She was right, but they still don't know why that connection exists.)  It made sense.  When I was in kindergarten, I switched back and forth between writing with my right and left hand about five times until my teacher got fed up and told me to simmer down and pick one.  I picked my right hand, and have stuck with it ever since, but I still have a lot of lefty characteristics.  I tend to pick up a drinking glass with my left hand, and I'm strongly left eye dominant, for example.

Anyhow, Mrs. Gates identified my mixed-brain dominance and its effect on my reading, but she also told me that there was one thing that mixed-brain people can learn faster than anyone else.  Because of our nearly-equal control from both sides of the brain, we can do a cool thing, which Mrs. Gates taught me and I learned in fifteen seconds flat.  I can write, in cursive, forward with my right hand while I'm writing the same thing backwards with my left.  (Because it's me, they're both pretty illegible, but it's still kind of a fun party trick.)

[image courtesy of the Wikimedia Commons]

Fast forward to today.  Some recent research has begun to elucidate the evolutionary reasons behind lateralization.  It's been known for years that lots of animals are lateralized, so it stands to reason that it must confer some kind of evolutionary advantage, but what that might be was unclear... until now.

Research by a team led by Onur Güntürkün, of the Institute of Cognitive Neuroscience at Ruhr-University Bochum, in Germany, has looked at lateralization in animals from cockatoos to zebra fish to humans, and has described the possible evolutionary rationale for having a dominant side of the brain.

"What you do with your hands is a miracle of biological evolution," Güntürkün says.  "We are the master of our hands, and by funneling this training to one hemisphere of our brains, we can become more proficient at that kind of dexterity.  Natural selection likely provided an advantage that resulted in a proportion of the population -- about 10% -- favoring the opposite hand. The thing that connects the two is parallel processing, which enables us to do two things that use different parts of the brain at the same time."

Additionally, Güntürkün says, our perceptual systems have also evolved that kind of division of labor.  Both left and right brain have visual recognition centers, but in humans the one on the right side is more devoted to image recognition, and the one on the left to word and symbol recognition.  And this is apparently a very old evolutionary innovation, long predating our use of language; even pigeons have a split perceptual function between the two sides of the brain (and therefore between their eyes).  They tend to tilt their heads so their left eye is scanning the ground for food while their right one scans the sky for predators.

So what might seem to be a bad idea -- ceding more control to one side of the brain than the other, making one hand more nimble than the other -- turns out to have a distinct advantage.  And if you'll indulge me in a little bit of linguistics geekery, for good measure, even our word "dexterous" reflects this phenomenon.  "Dexter" is Latin for "right," and reflects the commonness of right-handers, who were considered to be more skillful.  (And when you find out that the Latin word for "left" is "sinister," you get a rather unfortunate lens into attitudes toward southpaws.)

Anyhow, there you have it; another interesting feature of our brain physiology explained, and one that has a lot of potential for increasing our understanding of neural development.  "Studying asymmetry can provide the most basic blueprints for how the brain is organized," Güntürkün says.  "It gives us an unprecedented window into the wiring of the early, developing brain that ultimately determines the fate of the adult brain.  Because asymmetry is not limited to human brains, a number of animal models have emerged that can help unravel both the genetic and epigenetic foundations for the phenomenon of lateralization."

Tuesday, April 25, 2017

Thanks for the memories

I've always been fascinated with memory.  From the "tip of the tongue" phenomenon, to the peculiar (and unexplained) experience of déjà vu, to why some people have odd abilities (or inabilities) to remember certain types of information, to caprices of the brain such as its capacity for recalling a forgotten item once you stop thinking about it -- the way the brain handles storage and retrieval of memories is a curious and complex subject.

Two pieces of recent research have given us a window into how the brain organizes memories, and their connection to emotion.  In the first, a team at Dartmouth and Princeton Universities came up with a protocol to induce test subjects to forget certain things intentionally.  While this may seem like a counterproductive ability -- most of us struggle far harder to recall memories than to forget them deliberately -- consider the applicability of this research to debilitating conditions such as post-traumatic stress disorder.

In the study, test subjects were shown images of outdoor scenes as they studied two successive lists of words.  In one case, the test subjects were told to forget the first list once they received the second; in the other, they were instructed to try to remember both.

"Our hope was the scene images would bias the background, or contextual, thoughts that people had as they studied the words to include scene-related thoughts," said Jeremy Manning, an assistant professor of psychological and brain sciences at Dartmouth, who was lead author of the study.  "We used fMRI to track how much people were thinking of scene-related things at each moment during our experiment.  That allowed us to track, on a moment-by-moment basis, how those scene or context representations faded in and out of people's thoughts over time."

What was most interesting about the results is that when the test subjects were told to forget the first list, the brain apparently purged its memory of the specifics of the accompanying outdoor scene images as well.  When subjects were told to remember both lists, they retained the scene images associated with both.

"[M]emory studies are often concerned with how we remember rather than how we forget, and forgetting is typically viewed as a 'failure' in some sense, but sometimes forgetting can be beneficial, too," Manning said.  "For example, we might want to forget a traumatic event, such as soldiers with PTSD.  Or we might want to get old information 'out of our head,' so we can focus on learning new material.  Our study identified one mechanism that supports these processes."

What's even cooler is that because the study was done with subjects connected to an fMRI, the scientists were able to see what contextual forgetting looks like in terms of brain firing patterns.   "It's very difficult to specifically identify the neural representations of contextual information," Manning said.  "If you consider the context you experience something in, we're really referring to the enormously complex, seemingly random thoughts you had during that experience.  Those thoughts are presumably idiosyncratic to you as an individual, and they're also potentially unique to that specific moment.  So, tracking the neural representations of these things is extremely challenging because we only ever have one measurement of a particular context.  Therefore, you can't directly train a computer to recognize what context 'looks like' in the brain because context is a continually moving and evolving target.  In our study, we sidestepped this issue using a novel experimental manipulation -- we biased people to incorporate those scene images into the thoughts they had when they studied new words.  Since those scenes were common across people and over time, we were able to use fMRI to track the associated mental representations from moment to moment."

In the second study, a team at UCLA looked at what happens when a memory is connected to an emotional state -- especially an unpleasant one.  What I find wryly amusing about this study is that the researchers chose as their source of unpleasant emotion the stress one feels in taking a difficult math class.

I chuckled grimly when I read this, because I had the experience of completely running into the wall, vis-à-vis mathematics, when I was in college.  I actually was a pretty good math student.  I breezed through high school math, barely opening a book or spending any time outside of class studying.  In fact, even my first two semesters of calculus in college, if not exactly a breeze, at least made good sense to me and resulted in solid A grades.

Then I took Calc 3.

I'm not entirely sure what happened, but when I hit three-dimensional representations of graphs, and double and triple integrals, and calculating the volume of the intersection of four different solid objects, my brain just couldn't handle it.  I got a C in Calc 3 largely because the professor didn't want to have to deal with me again.  After that, I sort of never recovered.  I had a good experience with Differential Equations (mostly because of a stupendous teacher), but the rest of my mathematical career was pretty much a flop.

And the worst part is that I still have stress dreams about math classes.  I'm back at college, and I realize that (1) I have a major exam in math that day, and (2) I have no idea how to do what I'll be tested on, and furthermore (3) I haven't attended class for weeks.  Sometimes the dream involves homework I'm supposed to turn in but don't have the first clue about how to do.

Keep in mind that this is 35 years after my last-ever math class.  And I'm still having anxiety dreams about it.


What the researchers at UCLA did was to follow students in an advanced calculus class, keeping track of both their grades and their self-reported levels of stress surrounding the course.  Their final exam grades were recorded -- and then, two weeks after the final, they were given a retest over the same material.

The fascinating result is that stress was unrelated to students' scores on the actual final exam, but the students who reported the most stress did significantly worse on the retest.  The researchers call this "motivated forgetting" -- the brain ridding itself of memories associated with unpleasant emotions, perhaps to preserve the person's sense of being intelligent and competent.

"Students who found the course very stressful and difficult might have given in to the motivation to forget as a way to protect their identity as being good at math," said study lead author Gerardo Ramirez.  "We tend to forget unpleasant experiences and memories that threaten our self-image as a way to preserve our psychological well-being.  And 'math people' whose identity is threatened by their previous stressful course experience may actively work to forget what they learned."

So that's today's journey through the recesses of the human mind.  It's a fascinating and complex place, never failing to surprise us, and how amazing it is that we are beginning to understand how it works.  As my dear friend, Professor Emeritus Rita Calvo, Cornell University teacher and researcher in Human Genetics, put it: "The twentieth century was the century of the gene.  The twenty-first will be the century of the brain.  With respect to neuroscience, we are right now about where genetics was in 1917 -- we know a lot of the descriptive features of the brain, some of the underlying biochemistry, and other than that, some rather sketchy details about this and that.  We don't yet have a coherent picture of how the brain works.

"But we're heading that direction.  It is only a matter of time till we have a working model of the mind.  How tremendously exciting!"

Monday, April 24, 2017

Reality blindness

I read an article on CNN yesterday that really pissed me off, something that seems to be happening more and more lately.

The article, entitled "Denying Climate Change As the Seas Around Them Rise" (by Ed Lavandera and Jason Morris), describes the effects of climate change in my home state of Louisiana, which include the loss of entire communities to rising seas and coastal erosion.  An example is the village of Isle de Jean Charles, mostly inhabited by members of the Biloxi-Chitimacha tribe, which has basically ceased to exist in the last ten years.

But there are people who will deny what is right in front of their faces, and they include one Leo Dotson of Cameron Parish.  Dotson, a fisherman and owner of a seafood company, "turned red in the face" when the reporters from CNN asked him about climate change.  Dotson said:
I work outside in the weather on a boat, and it's all pretty much been the same for me.  The climate is exactly the same as when I was a kid.  Summers hot, winters cold...  [Climate change] doesn't concern me...  What is science?  Science is an educated guess.  What if they guess wrong?  There's just as much chance for them to be wrong as there is for them to be right.  If [a scientist] was 500 years old, and he told me it's changed, I would probably believe him.  But in my lifetime, I didn't see any change.
Well, you know what, Mr. Dotson?  I'm kind of red in the face right now, myself.  Because your statements go way past ignorance.  Ignorance can be forgiven, and it can be cured.  What you've said falls into the category of what my dad -- also a fisherman, and also a native and life-long resident of Louisiana -- called "just plain stupid."

Science is not an educated guess, and there is not "just as much chance for them to be wrong as there is for them to be right."  Climate scientists are not "guessing" about climate change.  Because of the controversy, the claim has been tested every which way from Sunday, and every scrap of evidence we have -- sea level rise, Arctic and Antarctic ice melt, earlier migration times for birds, earlier flowering times for plants, more extreme weather events including droughts, heat waves, and storms -- supports the conclusion that the climate is shifting dramatically, and that we've only seen the beginning.


At this point, the more educated science deniers usually bring up the fact that there have been times when the scientific establishment got it wrong and wasn't shown to be wrong until years later.  Here are a few examples:
  1. Darwin's theory of evolution, which overturned our understanding of how species change over time.
  2. Mendel's experiments in genetics, later bolstered by the discovery of the role of DNA and chromosomes in heredity.  Prior to Mendel's time, our understanding of heredity was goofy at best (consider the idea, still prevalent in fairy tales, of "royal blood" and the capacity for ruling being inheritable, which you'd think that any number of monarchs who were stupid, incompetent, insane, or all three would have been sufficient to put to rest).
  3. Alfred Wegener's postulation of "continental drift" in 1912, which was originally ridiculed so much that poor Wegener was forced to retreat in disarray.  The fact that he was right wasn't demonstrated for another forty years, through the work of such luminaries in geology as Harry Hess, Tuzo Wilson, Fred Vine, Drummond Matthews, and others.
  4. The "germ theory of disease," proposed by Marcus von Plenciz in 1762, and which wasn't widely accepted until the work of Robert Koch and Louis Pasteur in the 1870s.
  5. Big Bang cosmology, which grew out of the work of astronomers Georges Lemaître and Edwin Hubble.
  6. Albert Einstein's discovery of relativity, and everything that came from it -- the speed of light as an ultimate universal speed limit, time dilation, and the relativity of simultaneity.
  7. The structure of the atom, a more-or-less correct model of which was first described by Niels Bohr, and later refined considerably by the development of quantum mechanics.
There.  Have I forgotten any major ones?  My point is that yes, prior to each of these, people (including scientists) believed some silly and/or wrong ideas about how the world works, and that there was considerable resistance in the scientific community to accepting what we now consider theory so solidly supported that it might as well be regarded as fact.  But you know why these stand out?

Because they're so infrequent.  If you count the start of the scientific view of the world as beginning some time during the Enlightenment -- say, 1750 or so -- that's 267 years in which a major model of the universe has been overturned and replaced by a new paradigm only seven times.  Mostly what science has done is to amass evidence supporting the theories we have -- genetics supporting evolution, the elucidation of DNA's structure by Franklin, Crick, and Watson supporting Mendel, the discovery of the 3K cosmic microwave background radiation by Arno Penzias and Robert Wilson supporting the Big Bang.

So don't blather at me about how "science gets it wrong as often as it gets it right."  That's bullshit.  If you honestly believe that, you better give up modern medicine and diagnostics, airplanes, the internal combustion engine, microwaves, the electricity production system, and the industrial processes that create damn near every product we use.

But you know what?  I don't think Dotson and other climate change deniers actually do believe that.  I doubt seriously whether Dotson would go in to his doctor for an x-ray, and when he gets the results say, "Oh, well.  It's equally likely that I have a broken arm or not, so what the hell?  Might as well not get a cast."  He doesn't honestly think that when he pulls the cord to start his boat motor, it's equally likely to start, not start, or explode.

No, he doesn't believe in climate change because it would require him to do something he doesn't want to do.  Maybe move.  Maybe change his job.  Maybe vote for someone other than the clods who currently are in charge of damn near every branch of government.  So because the result is unpleasant, it's easier for him to say, "ain't happening," and turn red in the face.

But the universe is under no obligation to conform to our desires.  Hell, if it was, I'd have a magic wand and a hoverboard.  It's just that I'm smart enough and mature enough to accept what's happening even if I don't like it, and people like Dotson -- and Lamar Smith, and Dana Rohrabacher, and James "Snowball" Inhofe, and Scott Pruitt, and Donald Trump -- apparently are not.

The problem is, there's not much we can do to fix this other than wait till Leo Dotson's house floats away.  Once people like him have convinced themselves of something, there's no changing it.

I just have to hope that our government officials aren't quite so intransigent.  It'd be nice to see them wake up to reality before the damage done to our planet is irrevocable.

Saturday, April 22, 2017

Poll avoidance

I'm lucky, being an outspoken atheist, that I live where I do.  The people in my area of upstate New York are generally pretty accepting of folks who are outside of the mainstream (although even we've got significant room for improvement).  The amount of harassment I've gotten over my lack of religion has, really, been pretty minimal, and mostly centered around my teaching of evolution in school and not my unbelief per se.

It's not like that everywhere.  In a lot of parts of the United States, religiosity in general, and Christianity in particular, are so ubiquitous that they're taken for granted.  In my home town of Lafayette, Louisiana, the question never was "do you go to church?", it was "what church do you go to?"  The couple of times I answered that with "I don't," I was met with a combination of bafflement and an immediate distancing, a cooling of the emotional temperature, a sense of "Oh -- you're not one of us."

So it's no wonder that so many atheists are "still in the closet."  The reactions of friends, family, and community are simply not worth it, even though the alternative is keeping a deeply important part of yourself hidden from the people in your life.  This leads, of course, to a more general problem -- the consistent undercounting of how many people actually are atheists, with the result that those of us who are feel even more isolated and alone than we otherwise would.

[image courtesy of creator Jack Ryan and the Wikimedia Commons]

Current estimates from polls are that 3% of Americans self-identify as atheists, but there's reason to believe that this is a significant underestimate -- in other words, people are being untruthful to the pollsters about their own disbelief.  You might wonder why an anonymous poll conducted by a total stranger would still result in people lying about who they are, but it does.  Jesse Singal, over at The Science of Us, writes:
So if you’re an atheist and don’t live in one of America’s atheist-friendly enclaves, it might not be something you want to talk about — in fact you may have trained yourself to avoid those sorts of conversations altogether.  Now imagine a stranger calls you up out of the blue, says they’re from a polling organization, and asks about your religious beliefs.  Would you tell them you don’t have any?  There’s a lot of research suggesting you might not.  The so-called social-desirability bias, for example, is an idea that suggests that in polling contexts, people might not reveal things — racist beliefs are one of the more commonly studied examples — that might make them look bad in the eyes of others, even if others refers to only a single random person on the other end of the phone line.
As Singal points out, however, a new study by Will Gervais and Maxine B. Najle of the University of Kentucky might have found a way around that.  Gervais and Najle devised an interesting protocol for estimating the number of atheists without ever asking the question directly.  They gave one of two different questionnaires to 2,000 people.  Each had a list of statements that could be answered "true" or "false" -- all the respondents had to do was to tell the researcher how many of the statements were true, not which specific ones, thus (one would presume) removing a lot of the anxiety over admitting outright something that could be perceived negatively.  The first questionnaire was the control, and had statements like "I own a dog" and "I am a vegetarian."  The second had the same statements, plus one more: "I believe in God."  Since in any sufficiently large random sample the same proportion of people should endorse any given baseline statement, any increase in the average count of "true" statements in the second group has to come from the added belief statement -- and whatever falls short of 100% gives an indirect estimate of how many atheists there are.
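The arithmetic behind that indirect estimate is simple enough to sketch.  Here's a minimal toy version in Python with invented group means, chosen only so the answer lines up with the roughly 26% figure quoted below -- these are not the study's actual numbers:

```python
# Toy sketch of the unmatched-count logic described above.
# The means below are invented for illustration; they are not Gervais
# and Najle's actual data.

mean_true_control = 4.20   # average count of "true" statements, baseline list only
mean_true_belief = 4.94    # same list plus the added "I believe in God" item

# Any excess in the second group's average count must come from the added
# belief statement, so the difference estimates the proportion of believers.
believer_rate = mean_true_belief - mean_true_control
atheist_rate = 1.0 - believer_rate

print(f"Estimated believers: {believer_rate:.0%}")   # ~74%
print(f"Estimated atheists:  {atheist_rate:.0%}")    # ~26%
```

The appeal of the design is that no individual respondent ever has to reveal which statements were true for them, only the total count, which is what (in principle) takes the social-desirability pressure off.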

And there was a difference.  A significant one.  The authors write:
Widely-cited telephone polls (e.g., Gallup, Pew) suggest USA atheist prevalence of only 3-11%.  In contrast, our most credible indirect estimate is 26% (albeit with considerable estimate and method uncertainty).  Our data and model predict that atheist prevalence exceeds 11% with greater than .99 probability, and exceeds 20% with roughly .8 probability.  Prevalence estimates of 11% were even less credible than estimates of 40%, and all intermediate estimates were more credible.
So it looks like there are a lot more of us out there than anyone would have thought.  I, for one, find that simultaneously comforting and distressing.  Isn't it sad that we still live in a world where belonging to a stigmatized group -- being LGBT, being a minority, being atheist -- is looked upon so negatively that so many people feel they need to hide?  I'm not in any way criticizing the decision to stay in the closet; were I still living in the town where I was raised, I might well have made the same choice, and I realize every day how lucky I am to live in a place where people (for the most part) accept who I am.

But perhaps this study will be a first step toward atheists feeling more empowered to speak up.  There's something to the "safety in numbers" principle.  It'd be nice if people would just be kind and non-judgmental regardless, even to people who are different, but when I look at the news I realize how idealistic that is.  Better, I suppose, to convince people of the truth that we're more numerous than you'd think -- and not willing to pretend any more to a belief system we don't share.