Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Friday, July 1, 2016

Vandals for god

When I wrote last year about ISIS's systematic destruction of the archeological treasure-trove in Palmyra, Syria, one of my readers stated, "And there you have one more difference between the Muslims and the Christians; you don't see Christians participating in this kind of wanton destruction."

After a brief exchange in which I at least got her to add the words "... any more" in the interest of historical accuracy, I let the matter drop.  Whether the aggressive proselytizing by Christian missionaries in the 20th century, especially in Africa, accomplished much the same thing was a direction I didn't have the inclination to take the conversation in.  In any case, arguments about religion rarely accomplish anything but hard feelings on both sides.

And to be fair, the scale on which ISIS and the Taliban have destroyed the cultural heritage of Syria and Afghanistan is far beyond anything similar we've seen recently.  But that doesn't mean the same sort of impulses don't drive some of the Christian fringe.  They just have less opportunity to exercise their crazy ideas.

Enter the Christians who are working in Mexico to destroy the iconic Mexican pyramids because they're "used for devil worship."

Authorities in the village of San Bartolo Tutotepec are investigating damage to the 7,000-year-old pyramid of the same name, after Jehovah's Witnesses took credit last month for similar damage to the Makonikha sanctuary in Hidalgo.

"We are following the word of God," a spokesperson for the church told Mexican news source Telesur after the first incident.  "We believe that the sites are still used for traditional rituals that are not Christian and may involve devil worship."

The pyramid of San Bartolo Tutotepec [image courtesy of the Wikimedia Commons]

In fact, San Bartolo is used for religious observances by the Otomi people to worship and give sacrifices to their deities of earth and water.  But apparently there are those who follow the general scheme of "if it's different from what I believe, it's devilry," and think this equates to "worshiping Satan."

Mexican authorities have stated that it may not be the same people who participated in the earlier incident, and are also investigating some local villagers who have voiced the same opinions.  But whoever turns out to be responsible, the motive is the same: the idea that it's somehow virtuous to be vandals for god.

I think what galls me most about this is that these people, and others like them, think that the beliefs of others don't matter.  They are not content to follow the tenets of their religion insofar as it guides their own ethics and morals; they take the further step of claiming that god is directing them to control how everyone else lives.


Plus, of course, there's the tragedy of irreparable damage to a beautiful structure that has withstood 7,000 years of human use and natural wear and tear.  The idea that these people would think they had the right to walk in and vandalize it makes me feel a little sick.

I'm also amazed at how far some people will go to give the vandals a bye simply because they're Christians.  I was discussing this with a friend a couple of days ago, and he said, "You can see why they think it's devil worship.  You do know that those temples were once used for human sacrifice, don't you?"

Yes, well, they aren't now, are they?  If we were to tear down every structure that had ever been devoted to torture, execution, and man's general inhumanity to man, we wouldn't have much left.  Starting with the Tower of London and working our way outwards from there.

So I can't find any mitigating circumstances here.  I hope that the vandals get caught before they can do any further damage, and that "wait, it's my religion" isn't pulled out as some kind of Get-Out-Of-Jail-Free card.

Thursday, June 30, 2016

Viral stupidity

My dad used to say that ignorance was only skin deep, but stupid goes all the way to the bone.

There's a lot to that.  Ignorance can be cured; after all, the word comes from the Latin in- (not) and gnarus (knowing).  There are plenty of things I'm ignorant about, but I'm always willing to cure that ignorance by working at understanding.

Stupidity, on the other hand, is a different matter.  There's something willful about stupidity.  There's a stubborn sense of "I don't know and I don't care," leading to my dad's wise assessment that on some level stupidity is a choice.  Stupidity is not simply ignorance; it's ignorance plus the decision that ignorance is good enough.

What my dad may not have realized, though, is that there's a third circle of hell, one step down even from stupidity.  Science historian Robert Proctor of Stanford University has made this his area of study, a field he has christened agnotology -- the "study of culturally constructed ignorance."

Proctor is interested in something that makes stupidity look positively innocent: the deliberate cultivation of stupidity by people who are actually intelligent.  This happens when special interest groups foster confusion among laypeople for their own malign purposes, and see to it that such misinformation goes viral.  For example, this is clearly what is happening with respect to anthropogenic climate change.  There are plenty of people in the petroleum industry who are smart enough to read and understand scientific papers, who can evaluate data and evidence, who can follow a rational argument.  That they do so, and still claim to be unconvinced, is stupidity.

That they then lie and misrepresent the science in order to cast doubt in the minds of less well-informed people in order to push a corporate agenda is one step worse.

"People always assume that if someone doesn't know something, it's because they haven't paid attention or haven't yet figured it out," Proctor says.  "But ignorance also comes from people literally suppressing truth—or drowning it out—or trying to make it so confusing that people stop caring about what's true and what's not."

[image courtesy of Nevit Dilmen and the Wikimedia Commons]

The same sort of thing accounts for the continuing claims that President Obama is a secret Muslim, that Hillary Clinton was personally responsible for the Benghazi attacks, that jet impacts were insufficient to bring down the Twin Towers on 9/11 so it must have been an "inside job."  Proctor says the phenomenon is even responsible for the spread of creationism -- although I would argue that this isn't quite the same thing.  Most of the people pushing creationism are, I think, true believers, not cynical hucksters who know perfectly well that what they're saying isn't true and are only spreading the message to bamboozle the masses.  (Although I have to admit that the "why are there still monkeys?" and "the Big Bang means that nothing exploded and made everything" arguments are themselves beginning to seem one step lower than stupidity, given how many times these objections have been answered.)

"Ignorance is not just the not-yet-known, it’s also a political ploy, a deliberate creation by powerful agents who want you 'not to know'," Proctor says.  "We live in a world of radical ignorance, and the marvel is that any kind of truth cuts through the noise.  Even though knowledge is accessible, it does not mean it is accessed."

David Dunning of Cornell University, who gave his name to the Dunning-Kruger effect (the finding that the least competent people tend to most overestimate their own knowledge), agrees with Proctor.  "While some smart people will profit from all the information now just a click away, many will be misled into a false sense of expertise," Dunning says.  "My worry is not that we are losing the ability to make up our own minds, but that it’s becoming too easy to do so.  We should consult with others much more than we imagine.  Other people may be imperfect as well, but often their opinions go a long way toward correcting our own imperfections, as our own imperfect expertise helps to correct their errors."

All of which, it must be said, is fairly depressing.  That we can have more information at our fingertips than ever before in history, and still be making the same damned misjudgments, is a dismal conclusion.  It is worse still that there are people who are taking advantage of this willful ignorance to push popular opinion around for their own gain.

So my dad is right; ignorance is curable, stupidity reaches the bone.  And what Proctor and Dunning study, I think, goes past the bone, all the way to the heart.

Wednesday, June 29, 2016

Hotfoot

I am often stunned by the level of credulity exhibited by some folks.

Take, for example, the incident that occurred a few days ago at a four-day motivational seminar called "Unleashing the Power Within" hosted by speaker Tony Robbins.  According to the article, Robbins's seminars cost between $1,000 and $3,000 to attend, and the high point of the thing is that you get to walk barefoot on red-hot coals.

[image courtesy of photographer Jens Buurgaard Nielsen and the Wikimedia Commons]

Me, I'd pay $1,000 to avoid having to walk on red-hot coals.  But these people evidently thought this was a great idea.  And to be fair, apparently there are circumstances in which you can walk on coals and not get burned -- and a good, physics-based explanation of how that can happen.

The problem is, it doesn't always work out that way, and when it doesn't, major ouchies occur.  Which is what happened last week in Dallas, Texas...

... to thirty seminar participants.

Now I can see how one person could get burned, or two, or maybe even three.  But you'd think that when the 23rd person shrieked "HOLY FUCK MY FEET ARE BURNING OFF" that the remaining participants would go, "Okay, maybe not."  What did Robbins do, line the participants up in decreasing order of intelligence, or something?

So Dallas Fire Rescue was called in, and thirty people were treated for injuries.

"It felt like someone had taken a hot iron and pressed it against my feet " said seminar participant Paul Gold of West Palm Beach, Florida, who suffered second-degree burns on both feet.  "In hindsight, jumping off would have been a fantastic idea.  But when you're in the spirit of the moment, you're kinda focused on one task."

I dunno, I think I'd have to be pretty damn focused not to think of getting off a bed of hot coals when my feet are about to burst into flame.

Gold added that he thought he'd signed a hold-harmless waiver before participating.  He signed something, he was certain about that, but wasn't sure what it said.

Which supports my contention that the firewalkers weren't chosen for their critical thinking ability.

Another participant, Jacqueline Luxemberg, said that part of the problem was that a lot of the participants weren't following the leaders' directions, but were concentrating more on taking selfies and videos.  So look for a rash of Facebook photos with captions like, "This is me just before my lower legs caught on fire."

Look, I'm all for facing your fears.  There is something pretty empowering about facing down something you thought you couldn't handle, achieving a goal you were sure you would never manage.  But there are far better ways to do it than tromping across a bed of red-hot charcoal briquets.  For one thing, whether you get burned or not has nothing to do with your mental state -- it's physics, pure and simple.  Second, there's a decent chance you'll end up with blisters all over the soles of your feet, which has got to make walking uncomfortable for a week or two thereafter.

And third, you're putting thousands of dollars into the hands of people who are trying to convince you that walking on hot coals is a great idea.  Myself, I can think of lots of other uses for a thousand bucks than giving it to Tony Robbins.  Add to that the woo-woo mystical trappings a lot of those people weave into their presentations, and I'll get my motivation elsewhere, thanks.

Tuesday, June 28, 2016

Taking offense

A few days ago, Neil Gaiman wrote the following perceptive words:
I was reading a book (about interjections, oddly enough) yesterday which included the phrase “In these days of political correctness…” talking about no longer making jokes that denigrated people for their culture or for the colour of their skin.  And I thought, “That’s not actually anything to do with ‘political correctness’.  That’s just treating other people with respect.” 
Which made me oddly happy. I started imagining a world in which we replaced the phrase “politically correct” wherever we could with “treating other people with respect”, and it made me smile.

You should try it.  It’s peculiarly enlightening. 
I know what you’re thinking now.  You’re thinking “Oh my god, that’s treating other people with respect gone mad!”
Which I agree with, for the most part.  Gaiman is right that people often use "political correctness" as a catchall to cover their own asses, to excuse themselves for holding opinions that are bigoted or narrow-minded.  To me, the phrase has come to be almost as much of a red flag as when someone starts a conversation with, "I don't mean to sound racist/sexist/homophobic, but..."

On the other hand, there is an undeniable tendency in our culture to equate "offensiveness" with "having our opinions challenged."  Witness, for example, the professors at the University of Northern Colorado who are being investigated for offending their students -- by presenting, and asking students to consider, opposing viewpoints.

One professor was reported for asking students to think and write about conflicting views of homosexuality in our society.  As part of the assignment, the professor had asked students to consider the following:  "GodHatesFags.com: Is this harmful?  Is this acceptable?  Is it legal?  Is this Christianity?  And gay marriage: Should it be legal?  Is homosexuality immoral as Christians suggest?"

Note that the professor wasn't saying that homosexuality is immoral, or that the answer to any of the other questions posed above was "yes"; (s)he was asking the students to consider the claims and to create evidence-based arguments for or against them.  The student filing the complaint didn't see it that way.

"I do not believe that students should be required to listen to their own rights and personhood debated," the student wrote.  "[This professor] should remove these topics from the list of debate topics.  Debating the personhood of an entire minority demographic should not be a classroom exercise, as the classroom should not be an actively hostile space for people with underprivileged identities."

Because learning how to counter fallacious arguments with facts, and answer loaded questions rationally, somehow creates an "actively hostile space."

[image courtesy of photographer Fredler Brave and the Wikimedia Commons]

The second professor's case is even more telling, as it came about because (s)he had assigned students to read the famous article by Greg Lukianoff and Jonathan Haidt called "The Coddling of the American Mind," which addresses precisely the problem I'm writing about in this post.  After reading the paper, the professor asked the students to consider the questions raised by the article, specifically the issues of "trigger warnings" for minorities such as homosexuals and transgender individuals in reading controversial material.

"I would just like the professor to be educated about what trans is and how what he said is not okay because as someone who truly identifies as a transwomen [sic] I was very offended and hurt by this," one student wrote in the complaint.

The university complaints office backed the student.  The professor was instructed not to interject opinions into his/her lessons -- including those of the authors who wrote the article.

So there's something to be gained by having students avoid all opinions that they disagree with?  If they think they're not going to run into those once they leave college, they're fooling themselves -- and if they haven't been pushed into thinking through how to respond to bigots and people who are simply ignorant, they're basically choosing to be intellectually disarmed adults.

Students should be forced to consider all sorts of viewpoints.  Not to change their minds, necessarily, but to allow them to think through their own beliefs.  I tell my Critical Thinking students on the first day of class, "You might well leave this class at the end of the semester with your beliefs unchanged. You will not leave with your beliefs unchallenged."

Now, note that I am not in any way trying to excuse teachers (on any level) who try to use their classrooms as a field for proselytizing.  I only have the one source for the incidents at the University of Northern Colorado, and there might be more to the story than I've read.  If these professors were using their positions of authority to press their own bigoted viewpoints about gender and sexual identity on their students, they deserve censure.

But I suspect that's not what's going on, here.  We've become a polarized society, with one half lambasting the political correctness movement while feeling that the right to free speech makes it acceptable to offend, and the other half afraid to voice an opinion for fear of treading on some hypersensitive individual's toes.  What's lost is the opportunity for civil discourse -- which, after all, is one of the best and most reliable pathways toward learning and understanding.

Monday, June 27, 2016

Score card

It's the last week of June, and I just wrapped up another school year.  My 29th overall, which still seems kind of impossible to me until I realize that a child of a former student graduated from high school this year.  Then it seems pretty real, along with a realization of "Good lord, I'm getting old."

So I've been at this for a long time, and with, I think, some measure of success.  Which is why I read my letter from the school district awarding me my numerical grade for the school year with a mixture of amusement and irritation.

I won't leave you hanging; I got an 81.  I got an 84 last year and a 91 the year before that, so according to the state rating scale, I'm becoming incrementally less competent.  It can't, of course, be because the metric is flawed, that the three grades are comparing different assessments of different students put together in different ways.  No, in the minds of the geniuses at NYSED, this number means something fundamental about my effectiveness as a teacher.

In fact, that's what an 81 gets you: a designation of "Effective."  You have to have a 92 to be "Highly Effective."  If you're below 75, you're "Developing."  I'm glad I didn't land in that category.  If after 29 years at this game, I'm not "Developed," I don't hold out much hope.

What amused me most about all of this nonsense was the paragraph that said, and I quote:
Please remember that your scores are confidential and should not be shared in any way.  In accordance with state regulations, the parent of a child in your class may request your composite score and rating as well as that of the principal.  For your own protection, teachers are strongly discouraged from sharing their own scores outside of the district process.
Which is a recommendation I'm happy to toss to the wind (along with the aforementioned letter).  If we keep our scores and the way they were generated under wraps, it allows the statistics gurus at the State Education Department to keep everyone under the impression that they actually know what they're doing.

[image courtesy of the Wikimedia Commons]

Let me get specific, here.  My 91 two years ago was based upon the scores of my Critical Thinking classes and my AP Biology class.  Critical Thinking is an elective, and while the day-to-day work is difficult (requiring a lot of thinking, surprisingly) the material that is suitable for an exam at the end of the year is actually quite easy.  So my students performed brilliantly, as I would expect.  Additionally, that year's AP class was an extremely talented group who knocked the final exam clear out of the park.

Fast forward to last year.  My score last year was based on a combination of my Regents (Introductory) Biology class and my AP Biology classes.  Because of a strange policy of piling students who are classified as learning disabled into the same class, last year's Regents Biology was half composed of students who have been identified with learning disabilities.  Many of these students were hard-working and wonderful to teach, but it's unsurprising that that part of my grade went down.  My two AP classes last year were a friendly, cheerful lot who also happened to be somewhat motivationally challenged, and who by the end of the school year were far more invested in playing Cards Against Humanity than they were in studying for my final.  So that accounts for the remainder of the decline in my score.

This year, my score was a composite once again between Regents and AP Biology, but this time my Regents classes were among the most talented, hardest-working freshmen and sophomores I've ever had.  My AP class was small but outstanding, but because of the way the scoring is done, each student would have to score higher on my (very difficult) final exam than a target determined by his or her score on the (far easier) Regents Biology exam for that student's score to count in my favor.  On the part of my assessment that came from my AP class, I got a grand total of three points out of a possible twenty -- mostly because of students who got an 81 or 82 on an exam where their target was 85.

So my three scores in three consecutive years have absolutely nothing to do with one another, and (I would argue) nothing whatsoever to do with my competence as a teacher.  But because there's no idea that is so stupid that someone can't tinker with it to make it even stupider, next year the State Department of Education has informed us that we'll be assessed a different way.  Our joy at hearing this pronouncement was short-lived, because once we heard how they're going to score us, we all rolled our eyes so hard it looked like the email was inducing grand mal seizures.

Next year, unless over half of your students are in classes that take a mandated state exam at the end of the year, 50% of your score will be based on an average of the "Big Five" exams, the ones that all students have to take to graduate -- English, US History, Algebra I, Global History, and Biology.   (The other half, fortunately, will be based on evaluation by an administrator.)  If you think you can't have read that correctly, you did: half of the high school band teacher's grade (for example) will come from students' scores on exams she had absolutely nothing to do with.  Even for me, who teaches one of the "Big Five" -- fewer than half of my students next year will be in Regents Biology, so I'll be getting the composite score, too.

But don't worry!  Because students mostly score pretty well on these exams, and the score will be calculated using the time-honored statistical technique of averaging averages, we'll all look like we're brilliant.  So in effect, they took an evaluation metric that was almost completely meaningless, and changed it so as to make it completely meaningless.
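To see why an average of averages is such a dubious statistic, here is a toy illustration with made-up numbers (nothing to do with any real exam data): when the groups being averaged differ in size, a small high-scoring group pulls the average of the group means well away from what the students as a whole actually scored.

```python
# Hypothetical exam scores: a tiny high-scoring class and a large
# lower-scoring one.  The numbers are invented for illustration only.
scores = {
    "small class": [95, 96, 94],    # 3 students, mean 95
    "large class": [70] * 27,       # 27 students, mean 70
}

# Per-class means, then the "average of averages."
class_means = {name: sum(s) / len(s) for name, s in scores.items()}
average_of_averages = sum(class_means.values()) / len(class_means)

# The pooled mean weights every student equally.
all_scores = [x for s in scores.values() for x in s]
pooled_mean = sum(all_scores) / len(all_scores)

print(average_of_averages)  # 82.5
print(pooled_mean)          # 72.5
```

Ten points of difference, from the same thirty students, depending only on how you choose to aggregate -- which is exactly the kind of slack that lets a meaningless metric look flattering.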

Because that's clearly how you want an evaluation system to work.

All of this, it must be said, comes from the drive toward "data-driven instruction" -- converting every damn thing we do into numbers.  Couple this with a push toward tying those numbers to tenure, retention, and merit pay, along with a fundamental distrust of the teachers themselves, and we now have a system that is so far removed from any measure of reliability that it's almost funny.

Almost.  Because NYSED, and other state educational agencies, look upon all of this as being deadly serious.  It's all very well for me -- a veteran teacher of nearly three decades who is looking to retire in the next few years -- to laugh about this.  I wouldn't be laughing if I were a new teacher, however, and I'd be laughing even less if I were a college student considering education as a profession.

In fact, it'd make me look closely at what other career options I had.

Saturday, June 25, 2016

Take the wheel

One of my favorite units to teach in my Critical Thinking classes is ethics.

I'm no expert in the topic; not only do I not have a degree in philosophy, I have a way of seeing everything in so many different shades of gray that most of the time it's hard for me to make a decision regarding my own ethical standards.  I still love the topic for a number of reasons -- because it brings up issues that the students themselves often haven't considered, because it provokes fantastic class discussions, and because it appeals to the risk-taker in me.  I seldom know where the discussion is going to go ahead of time.

We usually start the unit with some exercises in ethical decision-making, presented through a list of (admittedly contrived) scenarios that force the students into thinking about such issues as relative worth.  Examples:  there are two individuals who are dying of a terminal illness, and you have one dose of medicine that can save one of them.  Who do you save?  What if it's two strangers -- what more would you need to know to make the decision?  A stranger and a family member?  (This one results in nearly 100% consensus, unsurprisingly.)  A stranger and your beloved dog?  (Are bonds of love more important, or is human life always more valuable than the life of a non-human animal?)  And for the students who say they'd always choose a human life over their dog's... what if the human was a serial killer?

Some students are frustrated by the hypothetical nature of these questions, although the majority see the point of considering such issues.  And there are situations in which such decisions need to be thought through beforehand -- such as in the case of self-driving cars.

Self-driving cars are an up-and-coming technology, designed to eliminate cases of human-caused automobile accidents (caused by fatigue, impairment, loss of attention, or simply poor driving skills).  And while a well-designed self-driving car would probably eliminate the majority of accidents, it does bring up an interesting ethical dilemma with respect to how they should be programmed in the case of an unavoidable accident.

Suppose, for example, there are three pedestrians in the road at night, and a self-driving car is programmed to swerve to miss them -- but swerving takes the car into a wall, killing the driver.  In another scenario, a truck is in the lane of an oncoming self-driving car, and in order to miss colliding with the truck, the car has to cut into the bike lane -- striking and killing a cyclist.  How do you program the car to make such decisions?
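The dilemma comes down to which objective function you hand the car.  Here is a deliberately simplified sketch (not any manufacturer's actual logic -- the function and the casualty estimates are hypothetical) contrasting the two philosophies: minimize total casualties, or protect the occupant first.

```python
# Each option carries the expected deaths it would cause.
# These numbers are invented to mirror the wall/pedestrians scenario.
def choose_action(options, policy):
    """Pick a maneuver according to the programmed ethical policy."""
    if policy == "minimize_casualties":
        # Utilitarian: fewest total deaths wins, whoever they are.
        return min(options, key=lambda o: o["occupant_deaths"]
                                          + o["pedestrian_deaths"])
    elif policy == "protect_occupant":
        # Self-protective: occupant safety first, total deaths second.
        return min(options, key=lambda o: (o["occupant_deaths"],
                                           o["pedestrian_deaths"]))
    raise ValueError(f"unknown policy: {policy}")

# Swerve into the wall (driver dies) vs. stay on course (three pedestrians die).
options = [
    {"name": "swerve", "occupant_deaths": 1, "pedestrian_deaths": 0},
    {"name": "stay",   "occupant_deaths": 0, "pedestrian_deaths": 3},
]

print(choose_action(options, "minimize_casualties")["name"])  # swerve
print(choose_action(options, "protect_occupant")["name"])     # stay
```

The same situation, two defensible programs, two opposite outcomes -- which is precisely why the choice of policy can't be left as an engineering afterthought.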

Google's Lexus RX 450h Self-Driving Car [image courtesy of the Wikimedia Commons]

This was the subject of a paper in Science this week by a team led by Jean-François Bonnefon at the University of Toulouse Capitole in France.  They created a survey that described the problem, and asked the following question: should self-driving cars be programmed to minimize casualties at all costs, even if it meant sacrificing the driver's life?  Or should they be programmed to save the driver at all costs, even if it meant putting others' lives at risk?

The results were fascinating, and illustrative of basic human nature.  Over 75% of the respondents said that a self-driving car should be programmed to minimize casualties, even if it meant that the driver died as a result.  It's a variant of the trolley problem -- more lives saved is always better than fewer lives saved.  But the interesting part came when the researchers asked respondents if they themselves would prefer to have a car that was so programmed, or one that protected the driver's life first -- and the vast majority said they'd want a car that protected them rather than some random pedestrians.

In other words, saving lives is good, provided that one of the lives saved is mine.

"Most people want to live a world in which everybody owns driverless cars that minimize casualties," says Iyad Rahwan, a computer scientist at MIT who co-authored the paper along with Bonnefon and Azim Shariff of the University of Oregon, "but they want their own car to protect them at all costs...  These cars have the potential to revolutionize transportation, eliminating the majority of deaths on the road (that's over a million global deaths annually) but as we work on making the technology safer we need to recognize the psychological and social challenges they pose too."

You have to wonder how all of this will be settled.  While driverless cars have the potential to reduce overall accidents and automobile fatalities, the programming still requires that some protocol be determined for decision-making when accidents are unavoidable.  Myself, I wouldn't want to be the one to make that call.  I have a hard enough time making decisions that don't involve life and death.

But it does give me one more interesting ethical conundrum to discuss with my Critical Thinking classes next year.

Friday, June 24, 2016

Risk and brain amoebas

We humans are poor at assessing risk.

It's something I've commented upon before; we tend to vastly overestimate the likelihood of being harmed by something gruesome and unusual (such as a shark attack), while vastly underestimating the likelihood of being harmed by something commonplace (such as smoking).  This leads to missed opportunities and unnecessary anxiety in the first case, and ignoring truly dangerous behaviors in the second.

This comes up because of an article I've seen posted now several times, about an Ohio teenager who died from an infection by the "brain-eating amoeba" Naegleria fowleri.  The 18-year-old victim appears to have been infected while on a whitewater rafting trip near Charlotte, North Carolina, and several days later came down with the fever, chills, and headache associated with primary amoebic meningoencephalitis, which is as horrifying as it sounds.  The microorganism gets into your system through inhaled water, and it travels through the olfactory nerves to the brain.  There it turns from eating its usual food source, bacterial films in freshwater sediments, to consuming your brain cells.  The disease has a 97% mortality rate.

Naegleria fowleri [image courtesy of the CDC]

Unfortunately, the story (although correctly reported, for the most part) is inducing widespread hysteria from people who evidently missed the following line: "The CDC reported 37 infections in the 10 years from 2006 to 2015."  Let me put that statistic a different way: given the current population of the United States (318 million), that amounts to about one death per hundred million people per year.  Even if there were three times as many cases that go unreported -- unlikely, given the severity of the symptoms and the likelihood of dying as a result -- it's still a tiny, tiny risk.

Interestingly, those numbers are roughly a tenth of the likelihood of your being crushed to death by a piece of your own furniture (303 deaths in the last ten years).
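The arithmetic behind those comparisons is simple enough to check on the back of an envelope (the figures are the ones quoted above; the ratio works out to roughly eight, which I'm rounding to ten):

```python
# Back-of-the-envelope check of the risk figures quoted above.
population = 318_000_000   # approximate U.S. population
years = 10

amoeba_deaths = 37         # CDC-reported N. fowleri infections, 2006-2015
furniture_deaths = 303     # deaths from toppling furniture, same decade

amoeba_rate = amoeba_deaths / years / population
furniture_rate = furniture_deaths / years / population

print(f"{amoeba_rate * 1e8:.1f} amoeba deaths per 100 million people per year")
print(f"furniture is {furniture_deaths / amoeba_deaths:.1f}x more likely")
```

About one death per hundred million people per year -- a risk so small that almost anything you did today was more dangerous.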

So here are a few of the comments I've seen posted in the last couple of days, edited to reflect the far more likely scenario of your being killed by a falling television cabinet.  I've inserted "television watching" and equivalent phrases for "swimming" and "hard hat" for "nose plug."
  • I wish I hadn't read about this!!!  I'm never sitting in front of an unsecured television cabinet again.
  • Just in time for summer.  So much for television watching.
  • They should post warning signs on television cabinets!  It could have prevented this tragedy.
  • Every time I'm sitting in front of the television, I'm gonna think about this.
  • I'm protecting my kids from this.  They'll never watch television again without wearing a hard hat.
There.  I hope that sounded as ridiculous to you as it did to me.  And remember; there is ten times the justification for making those statements as there is for making equivalent statements about brain-eating amoebas.

Note that I'm not trying to minimize the tragedy of what happened.  A young life cut short is always sad, especially given how unlikely an occurrence it was.  What is completely unjustified is the panic that these sorts of stories always induce, even in people who should know better.  The U.S. National Whitewater Center, where the young woman is thought to have been infected, has responded by hyperchlorinating their well water, and health officials in North Carolina have recommended "holding your head above water when taking part in warm freshwater activities" and "avoid(ing) water-related activities in warm freshwater during periods of high water temperature and low water levels."

So when are you supposed to go swimming?  January?

The bottom line is that everything you do is a risk.  Most of the risks are quite small, and chances are that you do several things every day without a thought that are orders of magnitude riskier than your being killed by brain amoebas.  If you really want to lower your risk of illness and death, quit smoking, eat a healthy diet, drive carefully, find ways to reduce your stress levels, and get enough exercise.

And keep an eye on any unsecured television cabinets.  They're just waiting for an opportunity to strike.