We’ve mentioned logical fallacies on the site before. It turns out that the human brain is not the most reliable machine, at least when it comes to being rational/logical. After all, our brains evolved to help us find better food sources and communicate about the dangers (and discomforts) of eating poison ivy or being attacked by saber-toothed tigers, not to help us analyze the finer points of morality or the inner workings of the cosmos. Functionally, they are just constructions of chemical and electrical signals that react to various stimuli.

Some fallacies are more common than others. The slippery slope fallacy, for instance, which I see very frequently, especially in news media and political debates: if this thing happens, then that thing will happen, and if that thing happens, then we’re all going to die! Or what I like to call the cockroach fallacy: we are more apt to remember the negative than the positive (I call it that because of the oft-cited “cockroach in a bowl of cherries” example: you probably won’t remember that those were the sweetest, juiciest cherries you’ve ever eaten if you bit into one and it crunched, turning out to be a cockroach). Or there’s the one that I have been seeing repeated over and over again in all kinds of contexts: exceptionalism.

There are all kinds of lists of logical fallacies out there, often with different names and different definitions, so “exceptionalism” may not be expressed in the same terms on whatever list you happen to favor, but it will likely be there in one guise or another. This one in particular sets my engineer’s brain to twitching, because of a little design principle referred to as an edge case, but that’s getting ahead of ourselves. First, we need to define this fallacy of exceptionalism.

My personal definition of the exceptionalism fallacy: an instance in which a pattern, precedent, norm, rule, moral, or assumption is held to apply in general, but not in this particular case. Note that it is necessary to guard against all fallacies simultaneously, as overzealously guarding against exceptionalism could, for instance, lead you into the slippery slope fallacy. It’s likely impossible to be consciously aware of all of the fallacies constantly, but knowing they exist will help you identify them in the more egregious cases, and allow you to better analyze your thinking when it is most important.

Ironically enough, this post was prompted by the same event that prompted last week’s post, Don’t Trust the Science: Scientific American’s decision to endorse a presidential candidate for the first time in its 175-year history. I’ve been reading Scientific American’s core publication since at least the seventh grade, and their articles played no small part in my decision to study astronautical engineering and get involved in the world of space; it was terribly dismaying to me to read that they had decided to take a political stance. It doesn’t matter to me what that political stance happens to be: I consider the fact that they have asserted a political position to be irredeemably damaging to the integrity of their publication. All of which is somewhat secondary to the point, and I will refrain from going into a rant on journalistic and scientific integrity with regard to the “discipline of skepticism.” Instead, I want to talk about “historic times.”

In its endorsement, Scientific American claims that they were “compelled” to buck 175 years of tradition, and that it was not a political matter, but rather a “matter of life and death.” The endorsement also reveals just how susceptible even these “scientific gatekeepers” are to logical fallacies. Calling an election a matter of life and death is certainly slippery-slope thinking, but in this context I do not think it is as dangerous as the exceptionalism that riddles the entire decision.

It is easy to look in the limited context of a generation or two, and declare that something is “without precedent.” It is also easy to assume that the time we are living in is different from any other – after all, we’re living in it. All other times by definition are other, and we do not feel as intimately connected to or familiar with them. This kind of exceptionalist thinking riddles academic circles today. Physicists, for instance, are often quoted as claiming that we are “very close to a grand unified theory,” and that we understand so much more about the universe than our ancestors did. The latter may be true, but it does not mean that we are close to understanding everything. The things we take for granted today are the very things that may have to be revised in the future, and the more tightly we cling to something that is true today, the more likely we are to find ourselves on the right side of the present, not the right side of history. This is why I so strongly support the idea that “there are no right answers, only wrong answers and less wrong answers, and our goal is to constantly find less wrong answers.”

The danger of exceptionalism is what it can be used to justify. If this is an extraordinary circumstance, then surely extraordinary actions are reasonable in response, like breaking with a core, 175-year precedent to endorse a political candidate. This is not to say that there are never times when exceptions do need to be made, but if you find yourself making an exception, you need to stop and think very, very carefully about it. Look at history, search for analogies, and weigh the consequences of the exception. People called President Trump’s election in 2016 “unprecedented,” and have used it to justify all kinds of exceptions, yet it, and the popular reaction to it, was actually quite similar to President Andrew Jackson’s election in the early nineteenth century.

Any time you make an exception, it opens a crack in whatever the exception was made to. If it’s an exception to a standard of behavior, then standards of behavior will no longer be as standard. If it’s an exception to a long-standing precedent, then the precedent becomes that much easier to break in the future. If it’s an exception to a rule or policy, then that rule or policy becomes essentially meaningless, unless the exception itself is codified in turn, which is how rules and regulations become so astronomically complicated. Think of it like a loose thread on a knit sweater. It may not seem like much, but inevitably that thread, that exception, will be pulled, and the whole sweater might unravel.

I’ve used Scientific American’s exception to endorse a political candidate as my example here, because I am still dismayed by that decision and find it a powerful, personal warning against this kind of thinking, but the hazards of exceptionalist thinking apply in all kinds of fields and disciplines: science, politics, even everyday life. Perhaps especially everyday life. If you find yourself justifying something based on exceptionalism, you should start treading very, very carefully.
