
When I couldn’t decide what I wanted to read next from my reading list, I decided to instead spend some time rereading books that I’ve been wanting to post reviews for on the site. For the next few weeks, therefore, you can anticipate reviews for books that I read before I started doing regular book reviews on the site. This is not entirely unusual for me; I will periodically start to consider that I don’t remember some of the books that I’ve read as well as I should, and go into bouts of rereading in order to compensate for that perceived shortcoming. Although I realize that, when you’re reading as much as I am, it’s probably inevitable to forget some books, it still bothers me, and is one of the reasons that I like owning books, rather than renting them. It’s also why I would continue to do these book reviews, even if no one were reading them.
This example, The Art of Thinking Clearly, is something that I’ve been meaning to post a review for on the site for quite some time now, mostly because of how often I reference logical fallacies. Whatever else this book might be, and it certainly has its flaws, it is a short, approachable compendium of common logical fallacies. While it is probably impossible to hold every fallacy in one’s head simultaneously, being aware of them, and periodically refreshing that awareness, can be immensely beneficial in improving the rigor of your thinking. By the way, I will be posting a list of all of the fallacies covered in the book at the end of this post, along with a very brief sentence or two of my own thoughts on each.
That being said, this book has its flaws. Partly, this is a matter of packaging. Although it is presented as a commercial, semi-scientific self-help book, it is really the author’s personal notes on logical fallacies compiled into a slightly more coherent form for sale to the general public. Some of the chapters themselves exhibit fallacies described in previous chapters, the anecdotes and examples are not always relevant or even interesting, and it can easily be argued that there are better sources out there that describe these concepts with far more scientific backing and rigor. Most of the examples are heavily focused on economics, and the author is deeply skeptical of anything approaching belief.
Speaking of science, I will allow myself a brief digression to express my frustrations with the scientific discipline of psychology, which, like certain other fields, is prone to woefully lax experimental design and to confusing correlation with causation. I do not blame the experimenters for the former flaw, as it is almost impossible to design experiments that are sufficiently controlled to generate a clear, unimpeded, and unequivocal signal, but I do hold them to account for playing fast and loose with correlation and causation. Without going into detail on the problems with specific experiments, which I will reserve for such a time as I decide to review my textbook on classic experiments in psychology and behavioral science, suffice it to say that most psychological experiments are prone to affecting their own outcomes. This leads me to question some of the logical fallacies that have been identified only recently through such experimentation. With that, I will desist from further offending any psychologists in the audience.
I would like to mention one amusing anecdote relating to “chauffeur” knowledge, in which Max Planck was on a lecture tour, and his chauffeur observed that he had heard the lecture so many times that even he could give it. So they switched outfits, and the chauffeur gave Planck’s presentation. When the chauffeur was asked a question, he replied that the answer was so simple that he would have his chauffeur answer it. While I have no idea if this particular story is true or not, and it was about as relevant in the book as it is in this post, I find it quite amusing, and I really hope that Planck and his driver did something like that at some point.
With all of that in mind, you might wonder why I, or anyone else, should bother reading this book. There are other lists of logical fallacies out there, after all, that are more comprehensive, more rigorous, more scientific, and probably even better written. To be honest, if I were setting out today to find a list of logical fallacies, I probably wouldn’t choose this book. However, I already have it, I know both its flaws and its uses, and so it serves my purposes well. It is approachable, with each fallacy covered by a short chapter, and if some of the chapters serve little purpose save to expose the author’s own lack of understanding of the fallacy being described, they are still useful for example purposes, and as a starting point from which to improve your own thinking. Don’t think of this book as a destination: think of it as a starting point. If you keep that in mind, I think that you will find it worth your time.
Logical Fallacies Listed in The Art of Thinking Clearly
- Survivorship Bias: Not so much a fallacy as a failure of research. Since we don’t hear from those who fail to survive (fail to become successful), we don’t think about the proportions.
- Swimmer’s Body Illusion: This is really just a misunderstanding of causation versus correlation, which is a fallacy covered elsewhere on this list.
- Clustering Illusion: We are naturally inclined to associate things, even if they should not be associated.
- Social Proof: Don’t jump off a bridge just because your friends are all doing it.
- Sunk Cost: This one is quite pernicious. If you have invested in something (money, time, whatever), you will be inclined to continue that investment even as evidence builds that it is a poor one. Related to the Endowment Effect.
- Reciprocity: Again, I’m not sure that this is really a fallacy. Yes, we will tend to try to reciprocate in many situations, but I don’t think it is a failure of thought. If you are reciprocating without thought, then I suppose this would be relevant.
- Confirmation Bias: We look for information that supports our preconceived notions. This is one of the major critiques of hypothesis-driven science, and very difficult to counter, mostly because doing so takes a great deal of both awareness and effort.
- Authority Bias: We tend to defer to authority. Much like Social Proof, which is really just deferring to the authority of the masses. I call this a symptom of mental laziness.
- Contrast Effect: Things look better or worse in comparison to their immediate company. It’s why we’re terrible at perceiving change over long timescales – there isn’t sufficient contrast moment to moment.
- Availability Bias: A fancy name for the “logical fallacy” of laziness. We will tend to opt for that which is most readily accessible, and may not be inclined to investigate further.
- It’ll Get Worse Before It Gets Better: This should not be on the list. There are so many other factors that go into whether this is true or false about a given situation that it cannot be described as a flaw in human reason. The emphasis on temporal limits, however, is useful.
- Story Bias: We like stories. They’re much easier to digest and understand than numbers. That can make us blind to what is really going on. Plus, there is always another perspective that is not being presented, more sides to the story. As much as I advocate for thinking of everyone as the protagonists of their own stories, using stories to understand specific events is a terribly flawed practice.
- Hindsight Bias: Things always seem obvious in retrospect. This is why reading history can be tricky, and why we always think that we are living in exceptional and exceptionally uncertain times.
- Overconfidence Effect: We think that we know more than we really do. As Socrates said, the beginning of wisdom is in knowing you know nothing. As I revise Socrates, the beginning of wisdom is in knowing that you most likely know nothing, or at least a very minimal amount. Pride goeth before the fall.
- Chauffeur Knowledge: It is one thing to be able to recite information or to know facts; it is another to actually understand something. I’ve known teachers whose only method of instruction when a student was struggling was to repeat the exact same thing that the student had not understood in the first place. If those teachers had really understood their subjects, and were not acting as mere knowledge chauffeurs, they would have tried different explanations to help the struggling students. Not so much a logical fallacy, but be aware of when you are a true expert, versus merely a chauffeur of knowledge.
- Illusion of Control: Somewhat a case of correlation versus causation interacting with the clustering illusion. It can make us think that we are more in control of our lives and situations than we really are. A proper understanding of cause and correlation would largely render this irrelevant, so I am skeptical of it being separately included.
- Incentive Super-Response: Not really a fallacy. People respond to incentives. Period. Dot. Does more really need to be said?
- Regression to Mean: I am skeptical of this one. It asserts that most complex systems will tend to revert to arithmetic mean behavior given time (there is a short simulation of this after the list). While largely true of systems that are undisturbed, I would not qualify it as a fallacy. Again, I would slot this as a facet of correlation versus causation. So much ties back to those concepts…
- Outcome Bias: We look for neat explanations to outcomes. Very similar to the hindsight bias. Again, really a matter of misidentifying causes from correlations.
- Paradox of Choice: This is not a logical fallacy so much as a feature of human cognition. Choice is desirable, but too much choice can become overwhelming. However, the threshold can be quite different for different individuals.
- Liking Bias: If we think someone likes us, we will be more inclined to like them back, and also to cede authority to them. Definitely a valid fallacy.
- Endowment Effect: We place more value on something if we own it. Sometimes this is a fallacy, but sometimes it is a matter of practical considerations beyond the mere value of the material. Sure, my car may not be worth as much as I think it is, but some of my perception of its value is a result of incorporating the additional expenses of money, time, and effort involved in finding a new vehicle if I sold this one.
- Coincidence: I’m not sure that this really needs to be separate from the clustering illusion. Again, it is a mistake in causation versus correlation. I think most of this list could be boiled down to mistaking causation and correlation. We should probably do a Tuesday post on that concept.
- Groupthink: Most of us are familiar with this one. It’s closely related to the Authority Bias and to Social Proof.
- Neglect of Probability: We have a poor intuitive understanding of probability, and so will tend to neglect it in our decision-making. Of course, the science of probability itself has problems, which we will discuss in much more detail when I get around to reading and reviewing Bernoulli’s Fallacy.
- Scarcity Error: This is only a fallacy under certain circumstances. We perceive a paucity of a good or service as rendering it more valuable. Which is true: that’s basic economics, supply and demand. The less supply to meet the demand, the more valuable whatever is being supplied and demanded. It only becomes a fallacy if the scarcity is a matter of perception rather than reality. Is laziness a fallacy?
- Base-rate Neglect: Don’t forget to apply Occam’s Razor: the most likely solution or answer is the simplest one. That which is more common is more likely to be true (there is a short worked example of this after the list). However, over-applying knowledge of this fallacy can lead you to dismiss the improbable too readily.
- Gambler’s Fallacy: Cause versus correlation. Again.
- The Anchor: Humans are poor at functioning in the abstract. When faced with the abstract, or with unknowns, we look for reference points, even if those reference points are irrelevant to the situation at hand. Make sure that your reference points are relevant, and this can be a useful tool. If they aren’t, you can end up with grievous errors. And remember that the further from your reference you try to extrapolate, the more prone you are to accumulate errors. It’s dead-reckoning for decision-making.
- Induction: We draw conclusions about the future based on the past. While it’s true that the past does not necessarily predict the future, and that it is possible to reach terribly flawed conclusions from this mode of thinking, it is also true that induction is absolutely necessary to, well, almost everything. It’s all about making the right assumptions, but determining which are the right ones to make can be a whole different challenge.
- Loss Aversion: Loss affects us much more severely than does gain. This is true, and related to the “cockroach in the bowl of cherries” fallacy that I may have referenced before. Being aware of this kind of tendency in our natures is important to maintaining a positive outlook on life, and can even help in fighting depression and similar conditions.
- Social Loafing: In group settings, we don’t work as hard as we do on our own. Read Ayn Rand to really understand social loafing (I should add those books to my re-read list so that I can review them here on the site, and because they are definitely worth re-reading).
- Exponential Growth: Dobelli asserts that humans have a poor intuitive grasp of exponential growth (see the quick doubling-time calculation after the list). This might be true, but sufficient mathematical training can instill such an intuition, and I think he overstates his case here.
- Winner’s Curse: This may be true historically of certain situations, but I am not certain that I would call it a fallacy, or at least not an independent one. It sounds more to me like a conspiracy between the sunk cost fallacy and the endowment effect, and is not universally true.
- Fundamental Attribution Error: Again, cause versus correlation, this time specific to providing individuals with an outsized role, so maybe a bit of story bias thrown in for good measure. Yet it is no more accurate to excise the role of individuals entirely.
- False Causality: Are we really only now getting to mistaking correlation for causation? This might be the single most important and prevalent fallacy on the list, or any list.
- Halo Effect: Kind of related to the contrast effect, this one suggests that we will ignore, excuse, contextualize, or even glamourize objectively negative information in the presence of sufficiently positive information.
- Alternative Paths: As a planner by nature, I have to assume that others who do not share that inclination may find themselves subject to this tendency not to analyze the possible routes to reach a given condition. I’m not sure that I would really call it a fallacy – again, is laziness a fallacy? If so, it should go right after correlation versus causation.
- Forecast Illusion: This has elements of outcome bias, hindsight bias, overconfidence effect, and survivorship bias, especially that last one. Probably not its own fallacy. Be wary of anyone who claims to predict the future (Stormlight reference, anyone?), because the future is unknowable. See several other fallacies about probabilities, predictions, and not knowing the future based on the past.
- Conjunction: While I consider this fallacy poorly named, it is valid: it is the tendency to think more information is better, more likely, or more accurate, when in fact the opposite will tend to be the case. The examples in the book actually do a pretty good job of communicating this one.
- Framing: I wrote an entire post on framing stories some time back. This states that we react differently to how things are presented, even if the substance of what is being presented is identical. The old “a pound of feathers or a pound of bricks” trick.
- Action Bias: We prefer action to inaction when faced with unknowns. This is probably an element of the fight-flight-freeze response (humans have pretty poor natural camouflage, so freeze doesn’t work out for us most of the time). However, acting without sufficient information can be dangerous and wasteful. Of course, the alternative can also be dangerous and wasteful, so there is no easy answer.
- Omission Bias: Not at all what it sounds like. This is the exact opposite of the action bias, and refers to a subset of situations in which we are inclined to favor passivity. Note: Dobelli argues that this is not exactly the opposite of the action bias, but I disagree. The action bias states that we are inclined to take action in ambiguous situations. The omission bias states that we are inclined to remain passive in situations where we do know the outcomes.
- Self-Serving Bias: Another facet of causation versus correlation, this time in the form of finding excuses for our failures and taking responsibility for our successes.
- Hedonic Treadmill: While a legitimate effect, I again question if this qualifies as a logical fallacy. It states that we tend to be poor at estimating our own future emotions. This seems more like a lack of self-awareness than it does a logical fallacy.
- Self-Selection Bias: There is a lot to unpack in this one, and it reaches deep into concepts of philosophy and existence. The author uses it to assert that we spend too much time asking questions that don’t need answers, like philosophers marveling at the development of language or at the very fact of existence. I disagree with his sentiments on such questions and their utility, and think they are poor examples, but I agree with the fallacy’s inclusion on the list. Pay attention to his first few examples, and not the later ones. Although the telephone survey is amusing.
- Association Bias: More causation versus correlation.
- Beginner’s Luck: This chapter is full of inept application of statistics, and the author himself is noncommittal about this fallacy. Basically, it’s just overconfidence again.
- Cognitive Dissonance: A major finding of psychology, and one which I find quite applicable (despite my general skepticism of the field), this is our tendency to lie to ourselves in order to explain why we do things that are contrary to our original intentions or inclinations.
- Hyperbolic Discounting: A peculiar way of stating that humans put an inordinate value on immediacy, and struggle with delayed gratification. While true, I am again uncertain that I would call this a logical fallacy, since it is something that can be overcome, and that varies significantly from individual to individual. Some people (like me) are quite adept at delayed gratification, sometimes to extremes.
- “Because” Justification: We look for reasons, we like reasons, we want reasons. What those reasons are, whether they make sense, or whether they have any substance is all irrelevant.
- Decision Fatigue: Another one that is a true phenomenon but that I cannot comfortably accept as a logical fallacy. We tend to get worn out by making decisions (see the fallacy about choice), but is that really a flaw in our logic, or just a neurophysiological function of which to be aware?
- Contagion Bias: I’m really torn about this one. Rather than biasing you one way or another, I’ll let you read the chapter and come to your own conclusions.
- The Problem with Averages: By now, I think most of us are familiar with the problems with averages. This is not a fallacy at all, but a matter of statistical understanding.
- Motivation Crowding: Think of it like the inverse of incentive super-response. However, you should probably ignore the chapter’s focus on financial examples. Motivation crowding is a real thing, but the author’s examples have more to do with social norms and cues than with motivation crowding itself.
- Twaddle Tendency: People like to talk, a tendency that I’ve never entirely understood. I am prone to rambling at length about topics about which I am passionate, but I generally have some amount of substance to what I am saying, even if it takes me some time to express it. Again, not so much a logical fallacy as it is a proclivity of which to be aware.
- Will Rogers Phenomenon: Once again, not a logical fallacy but a statistical illusion.
- Information Bias: We like to have more information, and we think it leads us to better decisions. This may or may not be true, and is circumstantially dependent, but I do accept this as a logical fallacy. Amassing information is a very different thing from being in possession of knowledge.
- Effort Justification: As the author states, a special case of cognitive dissonance, but I will add that I think his examples are too focused on the material and fail to account for less tangible measures of value and worth.
- The Law of Small Numbers: Another statistical matter that I argue does not qualify as a fallacy.
- Expectations: I think I’ve written before about the importance of managing expectations. Our expectations affect our reactions, so being aware of this in order to control and manipulate it can be a powerful tool.
- Simple Logic: All this is saying is that the immediately obvious answer is not always the correct one, and that falling back on our intuition can get us into trouble. This would be another one that I would file under “laziness.” Maybe I should do my own post of a list of logical fallacies.
- Forer Effect: There are a few components to this one, but I would reduce it to the self-lensing effect. In other words, we tend to filter what we see through a fundamentally selfish perspective (what other perspective could we possibly have?), and therefore can see great meaning in overly general statements and claims.
- Volunteer’s Folly: You’ve probably heard about this one before, especially if you’ve taken any kind of economics course. Specialization and division of labor mean that the best way for you to contribute to a given cause is to either work more in your chosen profession and donate money, or to supply services in your chosen profession on a pro bono basis. All of that is true, and all of that is rational, and all of that is logical. It has certainly complicated my relationship with volunteering. However, there are confounding factors. Most pertinently, it depends upon what your goals are. This is another one that reading some Ayn Rand will really make you think about, and I am increasingly thinking that Atlas Shrugged might need to make an appearance on the site.
- Affect Heuristic: The author claims here that no one uses a decision matrix to make decisions. Speaking for the engineers in the audience, I beg to differ: I use a decision matrix to make most significant decisions. However, that does not invalidate the fallacy in general terms: this is one of those that is truly an innate flaw in our logic that we must make deliberate and constant efforts to overcome. Aside from the decision matrix claim, Dobelli explains this one pretty well.
- Introspection Illusion: That the human brain is a terrible tool for diagnosing the human brain is as obvious as it is rarely considered. Indeed, you could argue that all of this talk of logical fallacies is an example of the human brain struggling futilely to fix problems with the human brain.
- Inability to Close Doors: A lot of this tendency has to do with our perceptions and misperceptions of opportunity cost. Although the name is clunky, I’ll accept this is a logical fallacy, although it sort of falls again under the category of mental laziness.
- Neomania: Since this is not a universal phenomenon, I’m not certain that I would call it a logical fallacy. There are certainly some people who are prone to it, but not everyone, and I suspect that it has much to do, at a scientific level, with the activity of your amygdala, and on a human level, with how you tend to respond to change.
- Sleeper Effect: Talking specifically about propaganda in the chapter unnecessarily narrows the scope of this effect, which is really talking about how memory works. We remember substance and meaning better than we remember trivia, generally, which is why we will remember articles but not their authors or titles. This is worth taking note of, and definitely qualifies as a logical fallacy.
- Alternative Blindness: Yet another example of mental laziness that probably does not need its own category, especially since it fits in with the previous entries on decision-making. All I will add is: Kobayashi Maru.
- Social Comparison Bias: Although this is called a social comparison bias, it’s more a feature of innate human tendencies towards competition, rivalry, and survival. However, it doesn’t directly affect thinking, which makes me question its status as a logical fallacy (in case you couldn’t tell by now, I have a much tighter definition of what constitutes a logical fallacy than does the author).
- Primacy and Recency Effects: Because of how our minds process information, the data we receive first, and the data we have received most recently, will be more prominent in our thoughts when we make decisions, unless we make a specific effort to be more deliberate.
- Not-Invented-Here Syndrome: We are partial to our own ideas because we get emotionally attached to them. It is why beta and alpha readers are so important, and why I need to place temporal distance between myself and a piece of my writing before I feel that I can judge it accurately. However, whether we are partial in a positive or negative way will depend upon the person. For instance, I will tend to rate my writing poorly directly after writing it, simply because I wrote it.
- The Black Swan: This definitely does not belong in a book of logical fallacies, no matter how interesting the concept, and especially the author’s ruminations about their possibly increasing prevalence (the events referred to as black swans, that is, not the birds). If there is a fallacy here, it is our difficulty in understanding the increasingly large scales on which the modern world operates.
- Domain Dependence: I consider this chapter a giant argument in favor of the polymath approach.
- False-Consensus Effect: Technically, this one directly contradicts, and also complements, the overconfidence effect. It states that we tend to think that we are part of the majority, even if we have zero evidence to support it and it probably isn’t true.
- Falsification of History: There is no need for this to be separate from cognitive dissonance.
- In-Group Out-Group Bias: In two words: tribal tendencies. We could write a whole post on this, and I am willing to admit it being a logical fallacy. What I am not willing to admit is the author’s final claim in this chapter.
- Ambiguity Aversion: The author’s examples are questionable, but the important takeaway here is the difference between risk and uncertainty.
- Default Effect: Another one that is just an amalgamation of previous fallacies.
- Fear of Regret: This one has the same logic as the scarcity one. Are my interpretations beginning to run out of steam, or is the book? Maybe both.
- Salience Effect: Several things are going on in this one, which are a little slippery to pin down in a coherent fashion. It’s really a result of our tendency to form connections that are not appropriate or where none exist, so in a way this goes back to another correlation versus causation argument.
- House-Money Effect: Bleak House, through the character of Richard, actually explains this effect far more thoroughly than the author does.
- Procrastination: We all are familiar with procrastination, and we are probably all guilty of it. The author makes the argument that it is irrational, which I submit is situationally dependent.
- Envy: People are prone to compare themselves to others, a habit which it is probably best to fight, but I don’t know that this is a logical fallacy. It might lead you to commit errors in judgement, but that does not make it, of itself, a flaw in human reason.
- Personification: The exact same thing as story bias, and definitely did not need its own chapter.
- Illusion of Attention: Another that does not need a separate chapter, as it is part of the overconfidence effect.
- Misrepresentation: Not a fallacy at all, and not really having anything to do with thinking. This is just a cultural observation.
- Overthinking: Another one that is related to decision fatigue and so forth. It is rather out of place, because the author is essentially asserting that it is a logical fallacy not to trust our instincts…which is exactly what the rest of the book has been arguing against.
- Planning Fallacy: I contend that the author misidentifies planning as the cause of the fallacy, rather than merely being correlated with it. More accurately, this is about overconfidence.
- Deformation Professionnelle: More arguments in favor of the polymath concept.
- Zeigarnik Effect: That which we do not address tends to continue to bother us. Not really a logical fallacy, but probably a truth of human nature.
- Illusion of Skill: Sort of related to the overconfidence effect, but pertaining to how we perceive other people. All it’s really claiming is that there are more variables involved than we can readily identify and that therefore skill may not be the dominant one.
- Feature-Positive Effect: The author succinctly states “we place greater emphasis on what is present than what is absent,” and I think that sums this up perfectly. It’s also about all that needs to be said, so you could probably skip the rest of the chapter.
- Cherry Picking: Exactly what it sounds like. This is another one that may not be a logical fallacy, but is nonetheless good to keep in mind.
- Single Cause: Covered by other fallacies, but true nonetheless, and perhaps even worth including as its own line item (okay, that may be generous of me).
- Intention to Treat Error: Not so much a fallacy as an error in statistical analysis. Like in previous examples, I will treat this more thoroughly when I review Bernoulli’s Fallacy, and spare you a second essay on statistics and the potential flaws of scientific models.
- News Illusion: Yet another entry that is not a logical fallacy at all, but a social and cultural critique, this time of consumption of “news.” I think the author takes an unreasonably extreme view on this topic, but it depends greatly on the source and type of the news in question, and on what you do with the “news” once you’ve read it.
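Since several of my quibbles above come down to statistics, here are a few quick sketches using my own made-up numbers, not anything from the book. First, regression to the mean: model performance as a fixed skill plus random noise, pick the top performers in one round, and their average falls back toward the mean in the next round even though nothing about their skill has changed.

```python
import random

# A toy simulation of regression to the mean, with made-up numbers: each
# "score" is a fixed skill level plus random noise, and nothing changes
# between the two rounds.
random.seed(0)
skill = 100
noise = 15
round_one = [skill + random.gauss(0, noise) for _ in range(10_000)]
round_two = [skill + random.gauss(0, noise) for _ in range(10_000)]

# Select the top 10% of performers from round one...
cutoff = sorted(round_one)[int(0.9 * len(round_one))]
top = [i for i, score in enumerate(round_one) if score >= cutoff]

avg_one = sum(round_one[i] for i in top) / len(top)
avg_two = sum(round_two[i] for i in top) / len(top)
print(f"Top performers, round one average: {avg_one:.1f}")
print(f"Same people, round two average:    {avg_two:.1f}")
# The second average drops back toward 100, not because anyone got worse,
# but because the lucky noise that put them on top was not repeated.
```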
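Second, base-rate neglect. The screening-test numbers here are hypothetical, chosen only to make the effect of the base rate obvious; the point is that Bayes’ theorem gives a much smaller answer than intuition does.

```python
# A hypothetical screening test, used only to illustrate base-rate neglect.
# Suppose 1 in 1,000 people has a condition, the test catches 99% of real
# cases, and it falsely flags 5% of healthy people.
base_rate = 0.001       # P(condition)
sensitivity = 0.99      # P(positive | condition)
false_positive = 0.05   # P(positive | no condition)

# Bayes' theorem: P(condition | positive) =
#   P(positive | condition) * P(condition) / P(positive)
p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)
posterior = sensitivity * base_rate / p_positive

print(f"P(condition | positive test) = {posterior:.1%}")
# Roughly 1.9%. Neglect the base rate and most people will guess close to 99%.
```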
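Finally, the exponential growth chapter’s claim is easier to feel with a doubling-time calculation (the familiar “rule of 70”), again with an arbitrary growth rate of my own choosing.

```python
import math

# An illustration of the intuition Dobelli says we lack: steady compounding
# doubles surprisingly quickly. The 7% rate is arbitrary.
growth_rate = 0.07  # hypothetical 7% annual growth
doubling_time = math.log(2) / math.log(1 + growth_rate)
print(f"At {growth_rate:.0%} per year, a quantity doubles every {doubling_time:.1f} years.")
print(f"Rule-of-70 estimate: {70 / (growth_rate * 100):.1f} years.")
# Ten doublings later (about a century at this rate), it is more than 1,000x larger.
```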
While I do think that you will find this book useful, and encourage you to read it, consider my rather negative take on many of the listed fallacies a warning: the author flat-out states that he has not defined what “thinking errors” are in the entire course of the book. So take this as an introduction to the idea of logical fallacies, and a promise that it is a topic that we will explore in more detail, and more rigorously, in the future here on the site. And yes, we will definitely be defining what a logical fallacy is, and coming up with our own list.