The Art of Thinking Clearly is an introduction to the most common logical fallacies that affect people’s ability to make good decisions. Logical fallacies affect everyone and are extremely difficult to avoid. Rolf Dobelli encourages readers to improve their decisions by learning how to recognize the fallacies and how to work around them.
Dobelli warns that his book isn't a step-by-step guide to logical thinking. Rather, he takes the “via negativa” approach, explaining what prevents you from thinking logically. If you’re aware of your illogical tendencies, you can work around them and make good decisions.
In this guide, we’ll explore the most common logical fallacies, grouping them by theme. We’ll start with fallacies caused by evolutionary adaptations and move to fallacies with more varied causes. We’ll cover how these fallacies form and how to overcome them.
First, we’ll cover fallacies related to group membership. Dobelli says that one of the traits that most influences you is the desire to be in a group. For early humans, group membership was necessary for survival. Those who left the group died, while those who stayed with the group survived and reproduced. Thus, your brain is genetically wired to fit in.
(Shortform note: The physical protection groups once offered has become less important, but group membership is still valuable to modern humans: It exposes you to others’ experiences and life skills and helps you develop empathy and self-worth.)
In this section, we’ll cover some of the specific ways this desire to be in a group impacts you.
To maintain your place in a group, you’re pressured to copy other people’s behavior, especially if that person is an authority. This pressure can even convince you to ignore your morals, Dobelli says.
(Shortform note: It’s hard to resist pressure from authorities because you’re trained from childhood to obey them. You can make this pressure easier to resist by distancing yourself from the authority and forming a connection with any victims of the problematic actions you’re being pressured into.)
Another consequence of group membership is prioritizing your own group above others. Your brain focuses on similarities between you and your group members, Dobelli says, ignoring any differences. Your brain also simplifies other groups, labeling them as “other” and ignoring any similarities you share with them. Finally, you think your group is the best because you only spend time with your group and don’t hear any different opinions.
(Shortform note: You prioritize your in-group because group membership becomes part of your identity: Protecting your group becomes protecting your identity. However, pinning your identity on membership in an in-group is dangerous, because in-groups can change depending on the situation—you may find yourself no longer a member of the group you value and feel unsure of your identity. You can avert this by increasing the variety of your groups so you don't tie yourself too closely to just one. Spend time with people from different groups, focus on points of connection with others rather than differences, and recognize when your groups are formed around arbitrary criteria that could crumble.)
Now that we've covered biases relating to humanity’s desire to fit in, we’ll examine fallacies that misdirect your attention. Humans tend to pay attention to the most memorable or flashy information that comes up, rather than the most pertinent or helpful, Dobelli explains. (Shortform note: The brain has arguably evolved to do this for reasons of efficiency. Your brain rapidly stores information, and it takes it less time and energy to accept the flashiest information available than to evaluate the entire situation.)
Here are some of the ways in which your attention is drawn to the wrong things:
According to Dobelli, when the salience effect takes hold, your brain latches onto unusual or notable factors of a situation and gives them too much credit for causing the situation, ignoring any more subtle influences. (Shortform note: What you pay attention to when the effect sets in depends on your past experiences. You might notice certain details due to your career or past experiences while someone else would notice different things. Thus, surrounding yourself with people with varied experiences can give you a clearer idea of the whole situation, as you can compare notes on the different things you’ve noticed.)
Another way people’s attention is misdirected is through story bias. People prefer entertaining fiction to boring facts, Dobelli explains. Sometimes, this means following an interesting story-based tangent while ignoring the central issue; other times, it means assigning meaning to random events.
(Shortform note: Why do we do this? Possibly because stories activate your brain’s sensory processing center. This gives you a shot of positive feelings like compassion and empathy.)
How can you overcome story bias? Dobelli suggests picking stories apart rather than blindly consuming and accepting them: Consider the story-teller’s intentions and what they might be hiding.
(Shortform note: In addition, be cautious about what kind of stories you consume. The stronger and more convenient the narrative, the more suspicious you should be of it, because strong narratives can distract from logical gaps in a person’s statement.)
A final bias that Dobelli claims unproductively diverts your attention is survivorship bias: the belief that you have a better chance of succeeding than you actually do because success stories are more widely publicized than failures. Dobelli suggests researching statistics and examples of failure in the field you’re considering to gain a more accurate perspective of success. (Shortform note: Start your research by asking “What am I missing?” and looking for holes in your data. In addition, be careful when researching: If your sources are incomplete or suffering from survivorship bias themselves, they won’t help you ascertain reality.)
The next set of fallacies we'll look at revolve around the kind of thinking you use. Dobelli says there are two main kinds of thinking: Fast and instinctive, and slow and logical. (Shortform note: These types of thinking are also called hot and cold cognition. The first is influenced by instinct and emotions, while the second is based on logic and reasoning.)
Fast, instinctive thinking is useful when actions are familiar or something you’ve been evolutionarily optimized for, Dobelli says. (Shortform note: Malcolm Gladwell argues in Blink that you can use your instinctive thinking more often—for instance, even in unfamiliar situations—as long as you train it correctly to avoid coming to incorrect snap judgments. You can improve the accuracy of your instinctive thinking by expanding your worldview, so that your instincts have more experience to draw on, as well as by paying attention to context, so that you know what type of experience to draw on.)
When you’re trying something new, or you’re in a complex situation you’re not instinctually prepared for, use slow logic, Dobelli adds. (Shortform note: Trigger logical thinking by increasing your task’s difficulty. For example, making the font smaller when analyzing a news article increases the effort you put into the task and triggers logical thinking.)
One common misapplication of these thinking styles is the tendency to prefer a plausible story to a probable one, Dobelli explains. In other words, when a story makes sense to you, you’re likely to believe it even if the true probability of it occurring is low. For example, consider a girl named Katrina who loves musicals. Now consider these statements:

1. Katrina is on a stage.
2. Katrina is on a stage, performing in a musical.

Many people pick the second option as most likely because it makes a better story: Katrina loves musicals, so she’d perform in a musical. However, the first option is actually more likely in terms of probability because it’s broader: It has just one condition (Katrina being on stage) rather than two (Katrina being on stage and in a musical).
This tendency, called the conjunction fallacy, occurs when you use fast thinking instead of slow. While your logical brain is still calculating the probability of an event, your instinctive thought process makes connections to explain why the event might occur. The connections it finds often form a plausible story, so you accept the instinctive connection rather than waiting for the logical probability.
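The arithmetic behind the conjunction fallacy can be made concrete. Below is a minimal sketch with invented probabilities (the numbers are our own, not from the book); whatever values you pick, the conjunction can never be more likely than the single condition:

```python
# Toy illustration with invented probabilities (not from the book):
# the joint probability P(A and B) = P(A) * P(B | A) can never exceed
# P(A), because P(B | A) is at most 1.

p_on_stage = 0.10             # hypothetical: chance Katrina is on a stage
p_musical_given_stage = 0.60  # hypothetical: chance it's a musical, given she's on stage

p_stage_and_musical = p_on_stage * p_musical_given_stage

# The "better story" (on stage AND in a musical) is strictly less likely.
assert p_stage_and_musical <= p_on_stage
print(p_stage_and_musical)  # roughly 0.06, versus 0.10 for the broad statement
```

This is the slow, logical calculation that the fast, story-making brain skips.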
(Shortform note: While Dobelli presents the conjunction fallacy as something instinctual and common, others argue that he exaggerates the danger of said fallacy. Studies show that the conjunction fallacy isn’t as widespread as the original researchers, whose research Dobelli relies on, suggested. Only 45% of respondents succumbed to the fallacy, compared to 85% in the original study. In addition, the new researchers discovered that the conjunction fallacy can be avoided by simply discussing a situation with another person.)
Part of humanity’s fast, instinctive thinking is the affect heuristic: a mental shortcut in which your brain makes rapid subconscious judgments of like or dislike. These “affects” influence your risk-benefit analyses: If your immediate judgment is good, you’ll focus on the benefits of a situation, while if the affect is bad, you’ll focus on the risks.
(Shortform note: These instinctive judgements can help you make decisions. This was particularly helpful to early humans because it quickly provided more data for risk analysis, and early humans faced regular life-threatening risks. However, the information required for a good risk-benefit analysis in modern times is too complex for the affect heuristic to assess adequately, especially when emotions are involved. Thus, rather than being beneficial, the heuristic can lead you to make a risky decision merely because you’re excited or delay a good decision because you’re sad.)
The next set of fallacies we’ll look at revolve around complex math. Your hunter-gatherer brain isn’t designed for complex math, Dobelli says. This means you can’t instinctively grasp complex math concepts, but understanding these concepts is increasingly important for modern life.
(Shortform note: Some people argue that people’s difficulty with complex math concepts is a result of how math is viewed. People internalize the idea that math is difficult, and those who struggle with the concepts stop trying to understand them. If math were treated like a language, which takes practice but can be learned by anyone, people would learn complex math more easily.)
Here are some situations in which struggling with math negatively impacts your decisions:
Averages are one of the complex math concepts that your brain isn't evolutionarily prepared for, Dobelli explains. One of the biggest pitfalls when working with averages is ignoring the distribution: the original set of numbers used to calculate the average. Without knowing the distribution, averages are misleading because they don’t show the outliers: the extremes at either end of the distribution that drastically change the average. To get a true average, these outliers must be removed, Dobelli says. This isn’t instinctive, but it’s important for modern life because outliers are increasingly common.
Averages and Scalable Events
Dobelli’s discussion of distribution and outliers finds parallels in Nassim Nicholas Taleb’s The Black Swan. Taleb sorts events into two categories: scalable and non-scalable. Scalable events have no defined limits, while non-scalable events have defined limits. (Taleb notes that Black Swan events—events that are unpredictable yet highly influential—occur solely in scalable situations.)
Most natural events are non-scalable. For example, there’s a defined limit to how much weight a human can lift; strength and weakness revolve around an average, and there are few outliers in that average’s distribution.
On the other hand, many man-made situations and ideas are scalable, with outliers in their averages’ distribution. There’s no upper limit to wealth, for example, which allows for the existence of billionaires—outliers in the distribution of global wealth. The presence of a single billionaire can significantly raise the average wealth of a town, making this average misleading—most people will earn well below the skewed average. Thus, when dealing with scalable situations, understanding averages and their distribution is important to gain an accurate picture of the situation.
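A quick sketch with made-up incomes shows how a single outlier distorts the average in a scalable distribution. The median, which ignores the outlier's magnitude, stays representative while the mean does not:

```python
# Invented numbers: a "town" of 10 people earning around 50,000,
# plus one billionaire -- an outlier in the income distribution.
incomes = [48_000, 52_000, 45_000, 50_000, 55_000,
           47_000, 51_000, 49_000, 53_000, 50_000,
           1_000_000_000]

mean = sum(incomes) / len(incomes)
median = sorted(incomes)[len(incomes) // 2]  # middle value of 11 entries

print(f"mean:   {mean:,.0f}")   # about 91 million -- nobody actually earns this
print(f"median: {median:,}")    # 50,000 -- the typical resident
```

This is why reports of "average" wealth or income can mislead: without seeing the distribution, you can't tell whether the mean reflects typical members or a few extreme outliers.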
Statistics is another area of math that you’re not evolutionarily primed for, Dobelli says. One statistical error is self-selection bias, in which the nature of the participants in a study influences its outcome. Specifically, people only join studies they’re comfortable responding to, which skews your data, Dobelli says. Those who might provide embarrassing or somehow “undesirable” responses simply won’t take part, narrowing your study’s scope and skewing the results.
(Shortform note: The only way to completely eliminate these problems is studying people who don’t know they’re being studied. However, researchers must receive consent from participants, so this isn’t practical. That said, you can limit self-selection bias. Most studies do this by collecting demographic information: Researchers look for patterns in the demographics of people who chose to participate and alter how they weigh the responses to reduce self-selection bias. Specifically, they give greater weight to results from those less likely to self-select, and lesser weight to results from those likely to self-select.)
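The reweighting idea can be sketched in a few lines. The example below uses hypothetical response rates and survey answers (all invented for illustration): each response is weighted by the inverse of its group's response rate, so a group that rarely participates isn't drowned out by eager responders:

```python
# Toy sketch of inverse-response-rate weighting (hypothetical data).
responses = [
    # (answer_value, group)
    (7, "likely_responders"),
    (8, "likely_responders"),
    (6, "likely_responders"),
    (3, "reluctant_responders"),  # only one reluctant respondent took part
]

# Hypothetical response rates estimated from demographic data.
response_rates = {"likely_responders": 0.6, "reluctant_responders": 0.2}

values = [value for value, _ in responses]
weights = [1 / response_rates[group] for _, group in responses]

naive_mean = sum(values) / len(values)
weighted_mean = sum(v * w for v, w in zip(values, weights)) / sum(weights)

print(naive_mean)     # 6.0 -- dominated by the eager responders
print(weighted_mean)  # about 5.0 -- pulled toward the reluctant group's answer
```

The weighted mean treats the lone reluctant respondent as standing in for the many similar people who declined to participate, partially correcting the self-selection skew.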
Next, we’ll cover fallacies related to memory. People believe their memories are untouchable, stored away and recalled when needed in perfect condition. However, this isn't the case, Dobelli warns. Your memory is affected by your feelings, opinions, and situation.
(Shortform note: Your memories are affected in these ways at several points: First, whatever you were feeling in the moment is tangled up with the actual situation in your memory; later, every time you remember the situation, your current mental state further alters your memories. Thus, the more you remember a situation, the more distorted the memory becomes.)
In this section, we’ll look at some of the ways in which your memory is unreliable.
The main reason your memory is unreliable is that your brain is constantly rewriting your memories, Dobelli explains. This is called falsification of history. As your opinions and worldview change over time, your brain alters the details of your memories, making you remember the past in a way that better matches your current opinions and worldview.
(Shortform note: Your brain rewrites memories in this way to be helpful: By updating the information, your memories become more relevant to the current moment and your current decisions. However, rewriting memories also means you become overconfident in your beliefs: When you think you’ve always held the same beliefs, you won’t feel the need to challenge them.)
Your memory is also influenced by the order in which you receive information and how much time has passed since you received said information. According to Dobelli, the first information you receive is initially easier to remember than information introduced later. This is called the primacy effect. However, this only works for a short time, as the information eventually leaves your short-term memory. After that, whatever information you heard most recently is easier to remember. This is the recency effect.
(Shortform note: How do these effects work? In the case of the primacy effect, when you learn a piece of information early, your brain has more time to repeat it. This keeps it in your short-term memory for longer, until it can be transferred to long-term memory. As for the recency effect, when you learned a piece of information recently, the information is still in your short-term memory and so is easy to recall. You can manipulate these tendencies by memorizing important information first to trigger the primacy effect and reviewing information before you need it to trigger the recency effect.)
In this section, we’ll cover how misinterpreting cause and effect damages your judgment. According to Dobelli, humans struggle to interpret cause and effect because they confuse correlation and causation. When two events coincide, people assume there’s a causal relationship between the two of them, even when there’s not.
(Shortform note: How do people make these mistaken links? They take their knowledge of the effect and look for any similar events that might point to a cause, regardless of the likelihood of that similar event actually being the cause. In other words, they look for possible correlations between the events and mistake this for one causing the other.)
One type of misrepresentation of cause and effect is association bias, or the brain’s tendency to make connections where none exist. Dobelli says this distorts cause and effect by forming false knowledge: You draw a causal connection between two unrelated things.
Superstitions form this way, Dobelli explains. For example, say you bring rainboots when camping, and the weather is perfect. The next time you go camping, you leave the rainboots behind and the weather is awful. The next time you bring them, the weather is wonderful again. After a few of these experiences, your brain connects the boots and good weather, even though it’s just a coincidence that the weather improved when you brought the boots.
(Shortform note: Why does association bias occur? Dobelli doesn’t say, but others argue that association bias is a defense mechanism: Making connections helps you form “protective frames.” These are practices or support systems that let you evaluate risk (for example, the risk of it raining when you go camping). In our example, the brain mistakenly created a frame in which the presence of rainboots reduces the risk of rain.)
Another way people misrepresent cause and effect is by oversimplifying: To make a simple pattern of cause and effect, people simplify to a single cause. This mindset is dangerous because everything is affected by a complex web of causes, Dobelli states. There’s never a single cause for complex effects like crime or success. (Shortform note: Problematically, if you simplify to a single cause, you’ll also simplify to a single solution. For example, if you believe high illness rates are solely due to unaffordable healthcare, you’ll work only to make healthcare affordable. Your singular focus means you don’t realize that other factors like safe housing and income must be addressed too.)
The next set of fallacies we’ll cover revolves around probability and predictions. Dobelli says people hate uncertainty and try to predict future events to alleviate that uncertainty. However, to make accurate predictions, you must understand probability, which humans struggle with. Thus, people’s predictions are usually inaccurate.
(Shortform note: Even though humans are proven to struggle with probability, and predictions are notoriously unreliable, people still make a living estimating probability and making predictions. This is a form of authority bias: You assume that if the person is making a prediction, they must have based said prediction on experience. However, no matter how knowledgeable the person is, they’ll struggle to process the information needed to make accurate predictions.)
In this section, we’ll cover ways people misunderstand probability and make inaccurate predictions.
According to Dobelli, people struggle to make good decisions because they neglect to consider the probability or risk involved in those decisions. Logically, they should choose the option with the highest probability of going well for them and the lowest risk of going badly. However, people instead choose the option that will have the biggest positive impact on them if it occurs, regardless of how likely it is to occur.
(Shortform note: A type of neglect that Dobelli doesn’t cover is denominator neglect, which Daniel Kahneman describes in Thinking, Fast and Slow. If probability is a fraction, with the situation you’re evaluating as the numerator and the total number of possibilities as the denominator, you’ll base your judgment of risk solely on the numerator and ignore the denominator. This means you’ll regularly misinterpret probability, since the numerator depends on the denominator to accurately show probability.)
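Kahneman illustrates denominator neglect with an urn experiment; the sketch below uses the commonly cited numbers (1 winning marble out of 10 versus 8 out of 100). Many people prefer the second urn because the larger numerator looms larger, even though it offers worse odds:

```python
# Denominator neglect, sketched: which urn gives a better chance
# of drawing a red (winning) marble?
urn_a = (1, 10)    # (red marbles, total marbles)
urn_b = (8, 100)   # "more ways to win" -- but a worse ratio

p_a = urn_a[0] / urn_a[1]   # 0.10
p_b = urn_b[0] / urn_b[1]   # 0.08

# Judging by the numerator alone (8 > 1) picks the wrong urn.
assert p_a > p_b
```

The probability lives in the ratio, not the numerator; ignoring the denominator systematically misranks the options.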
The next fallacy we’ll cover is hindsight bias. Dobelli says hindsight bias makes past events seem like they should’ve been easily predictable. People see an obvious pattern of circumstances that led to a past event occurring, and they think people should have noticed that pattern and predicted the event. At the time, though, the pattern wasn’t clear, so people couldn’t use it to predict the event. It’s only with hindsight that the pattern becomes clear.
Hindsight bias encourages overconfidence, Dobelli says. You think you’re good at detecting patterns when really, you’re not: You’re only seeing them because of hindsight. You thus fail when trying to apply these pattern-spotting “skills” to predicting the future.
(Shortform note: Past events seem obvious because of how your brain predicts: When shown two possibilities, your brain creates reasons why both are possible. However, once Possibility A is proven, your brain doesn’t need to retain information about Possibility B. It forgets that information, making you believe Possibility A was obvious all along. This altered memory also creates overconfidence. You forget any prior uncertainty or incorrect predictions, which reinforces your overconfidence about your pattern-finding and prediction abilities.)
In this final section, we’ll cover fallacies that affect how you value things. According to Dobelli, humans tend to put value in a person, situation, or item for arbitrary and illogical reasons.
One illogical shift in your valuation of an item is that when you own an item, you subconsciously overinflate its value simply because it's yours, Dobelli explains. This is called the endowment effect.
(Shortform note: This effect stems from loss aversion. Once something is in your possession, you fear losing it, which makes you value the item more. You can avoid this fallacy by avoiding personal connections to items. However, doing so may harm your well-being: Valued belongings become an extension of your identity and a way to express your personality, and preventing those connections from forming can make you feel stifled and unable to be yourself.)
Liking bias also affects how you value people, specifically. The more you like someone, the more value you put on their opinions and desires, Dobelli says. This means you’re more likely to do something for an individual you like, even if doing so goes against your own interests.
(Shortform note: Dobelli doesn’t say why liking someone makes you value them more. Some experts say you value people you like more because when you like someone, you form an alliance with them. Having a common goal (friendship) unites you and the other person, making you more likely to value them and fulfill their desires.)
Another error in thinking that affects how you value things is the sunk cost fallacy. According to Dobelli, the more time, effort, or resources you invest in something, the higher you value that thing. You'll also be more resistant to parting with it, even if keeping it means losing more time, effort, or resources in the future.
(Shortform note: This fallacy stems from a fear of waste: Most people try not to waste time, money, or effort, and letting go of something you’ve invested resources in feels like wasting those resources. While this is technically true—it is a waste of time, money, or effort—continuing to invest resources only creates more waste.)
You can overcome this fallacy by focusing on whether something is serving you in the present and will continue to do so in the future, rather than focusing on what you’ve invested in the past. (Shortform note: Dobelli’s suggestion to focus on the future doesn’t mean ignoring the past: Consider the past to make good decisions based on all the data you’ve collected, but don’t let past effort stop you from moving on.)
In The Art of Thinking Clearly, Rolf Dobelli breaks down the most common logical fallacies that plague humanity. Logical fallacies affect everyone, are extremely difficult to avoid, and can hinder our decision-making ability. Dobelli encourages readers to improve their decisions by learning how to recognize these fallacies and how to work around them.
Rolf Dobelli is a Swiss writer and entrepreneur. Born in Lucerne, Switzerland, in 1966, Dobelli studied philosophy and business administration at college. He graduated with an MBA and a PhD in philosophy from the University of St. Gallen, Switzerland, in 1995.
Dobelli primarily wrote novels before stepping into non-fiction in 2011. The Art of Thinking Clearly was his first non-fiction book, and he hadn’t always intended to publish it. The book was originally a private list of fallacies he’d curated to help himself act logically. It was only after friends expressed interest in the list that he started publishing it in newspaper columns and eventually as a book.
Once it was published, The Art of Thinking Clearly catapulted Dobelli to international success. Since then, he’s written four other non-fiction books. The Art of Thinking Clearly remains his most well-known publication.
Dobelli connects to readers through his simple and anecdotal writing style. He cautions in the book’s introduction that he’s not an expert in his subject matter, though he’s spent years studying with experts. Dobelli instead views himself as a translator, taking complex psychological ideas and making them accessible to the general public.
In addition to writing, Dobelli co-founded getAbstract, a company that summarizes business content. He also founded WORLD.MINDS, a non-profit that unites science, business, and culture by inviting experts in these fields to share their ideas.
The Art of Thinking Clearly was originally published in German (under the name Die Kunst des klaren Denkens) in 2011 by Carl Hanser Verlag. It was translated into English in 2013 through Scepter, an imprint of Hodder & Stoughton. In 2014, a second edition was published through Harper Paperbacks, an imprint of HarperCollins, including an extensive bibliography and citations. For this guide, we’re using the 2014 edition.
The Art of Thinking Clearly was published in 2011, on the heels of an unprecedented period of expansion in psychological study. Since the 1980s, interest in psychology had grown steadily. The founding of groups like the Association for Psychological Science had allowed scientists to expand their borders by collaborating with fellow researchers in different fields and locations. Some of the fallacies Dobelli discusses were discovered during this time, and many more were expanded upon. The Art of Thinking Clearly made this information accessible to laypeople, capitalizing on the public’s curiosity and the pace of new discoveries in the field.
Dobelli wasn’t the only one to take advantage of the aforementioned increased interest in psychology. In fact, Dobelli was inspired to create his own list of fallacies after reading an early draft of Nassim Nicholas Taleb’s Antifragile, which was published in 2012. Other books, such as Thinking, Fast and Slow by Daniel Kahneman, published in 2011, also explored psychological fallacies to great success. However, Dobelli is unique in the breadth of his content. Most of his contemporaries focused on a few fallacies with a common root, while Dobelli explored 99 of the most common fallacies, including how they form and how to overcome them.
Dobelli admits in his introduction that the ideas contained in the book are not his own; rather, as we’ve noted, he acts as a translator between scientist and layperson. However, some people believed he drew too much from others’ ideas without giving adequate credit. In 2013, Taleb accused Dobelli of plagiarizing Antifragile, an early draft of which he’d given Dobelli several years before. While Dobelli argued that he’d referenced Taleb fairly, he updated later editions of the book with citations and a bibliography.
The Art of Thinking Clearly was an instant success, entering Germany's Der Spiegel bestseller list at number one and maintaining the position for 30 weeks. It was the overall bestselling non-fiction book in Germany and Switzerland in 2012. After the 2013 English translation was published, it became a top-10 bestseller in many other countries, including the UK, South Korea, and India.
Many readers consider The Art of Thinking Clearly to be required reading for those seeking to make logical decisions. They enjoy Dobelli’s straightforward approach and anecdotal style that make complex psychological ideas accessible to the layperson. Many readers also appreciated the book’s insights on overcoming the fallacies and said the information helped them avoid illogical thinking.
However, not all readers appreciated the book’s simplistic style. Some readers argued that the book was too simplified and didn’t provide enough nuance to give readers a firm understanding of the fallacies. In addition, some pointed to Dobelli’s alleged plagiarism to argue that he doesn’t introduce any original ideas.
As noted, The Art of Thinking Clearly originated as a personal list of fallacies Dobelli kept to encourage his own logical thinking. Thus, the book has a matter-of-fact tone and focuses on explaining each fallacy quickly and simply. This is also why Dobelli uses the most well-known examples to illustrate the fallacies, rather than creating his own: His goal was not to introduce new ideas to the field of psychology, but rather to give a quick and easy guide to established ideas.
Throughout the book, Dobelli uses “via negativa,” a method of problem solving in which instead of introducing steps to success (how to think logically), he tells you what to avoid (the logical fallacies). Since there’s no step-by-step method of thinking logically all the time, but research shows that certain problems with logic affect everyone, this method arguably works well for Dobelli’s topic.
The Art of Thinking Clearly has 99 short chapters without any overarching organization or thematic grouping. This is probably because of the book’s history as newspaper columns: Columns don’t need to build on each other, and they have strict length limits. However, in book form, this lack of organization can make the ideas feel disjointed.
In addition, there’s a great deal of repetition throughout the book. While Dobelli warns that this is the case—logical fallacies all stem from the brain, so they’re bound to overlap—some fallacies seem contrived, forced into a new chapter to reach the 99-chapter goal when they could work better folded into another chapter.
In this guide, we’ve broken The Art of Thinking Clearly into two parts. Part 1 covers evolutionary fallacies—those stemming from humanity’s hunter-gatherer past—and Part 2 covers non-evolutionary fallacies—those stemming from other sources. Within these parts, we’ve grouped Dobelli’s fallacies by theme, combining and rearranging chapters to decrease repetition and increase logical flow.
Throughout the guide, we explore the science behind the fallacies, compare and contrast Dobelli’s ideas with other experts on psychology and logical fallacies, and provide concrete steps to overcome illogical thinking.
In The Art of Thinking Clearly, Rolf Dobelli explores 99 common fallacies that inhibit logical thinking. He explains how these fallacies form and how they can negatively affect your daily decision-making.
Dobelli warns that his book isn't a step-by-step guide to logical thinking. Rather, he takes the “via negativa” approach, explaining what prevents you from thinking logically. If you’re aware of your illogical tendencies, you can work around them and make good decisions.
In Part 1 of this guide, we’ll cover evolutionary fallacies: traits stemming from humanity’s past as hunter-gatherers. In Part 2, we’ll cover non-evolutionary fallacies: traits stemming from other sources.
According to Dobelli, evolutionary fallacies are errors in thinking that occur because of the way humans’ brains evolved over millions of years. Through evolution, humans’ thought processes were optimized for a hunter-gatherer lifestyle. As we’ll see, in the modern industrialized world, these lingering thought patterns become pitfalls rather than benefits.
Evolutionary Psychology: How It Works and Why You’re Not Evolving Anymore
Dobelli’s explanation of logical fallacies here uses evolutionary psychology. Evolutionary psychology theorizes that creatures’ traits are determined by natural selection, where individuals with maladaptive traits die and leave the gene pool. Meanwhile, those with helpful traits survive and reproduce, passing their genes on. These helpful traits strengthen over time, and new traits are introduced from genetic mutations. This process led to thought patterns useful for hunter-gatherers persisting through millions of years of humanity.
But we’re not hunter-gatherers anymore, so shouldn’t humans still be evolving to better fit their modern environment? Why do these logical fallacies persist if natural selection exists? There are three reasons:
Humanity is too spread out. When humans lived in small groups, beneficial genetic mutations could gradually expand through the group and then beyond. In the modern world, the human race is too large and spread out for genetic mutations to easily spread.
There’s a lack of environmental pressure. While global warming affects weather patterns, it isn’t extreme enough to trigger evolution. If a global disaster struck that drastically and permanently changed the weather or food supply, natural selection would become relevant again as humans would be forced to evolve to survive.
It hasn’t been long enough for us to evolve again. Humans were hunter-gatherers for almost 4 million years. It’s been only about 10,000 years since the invention of agriculture, and far less since the Industrial Revolution. Thus, human brains haven’t had time to evolve further.
Dobelli states that one of the evolutionary traits that most influences you is the desire to be in a group. For early humans, group membership was necessary for survival. Those who left the group died, while those who stayed with it survived and reproduced. Thus, your brain is genetically wired to fit in.
(Shortform note: This may have been the original evolutionary reason to be in a group, but are there any other, modern reasons? Psychology suggests that as human society evolved and the physical protection groups offered became less important, other needs came to the fore: Group membership exposes you to others’ experiences and life skills and helps you develop empathy and a sense of self-worth.)
This desire for community causes problems in the modern world. In this chapter, we’ll cover several fallacies resulting from a need to belong.
Social proof is an example of group instinct. To fit in, you copy other people’s behavior and judge that behavior according to the number of people participating in it. The more people participating, the “better” you judge the action, Dobelli says. This effect influences you at all times.
(Shortform note: While Dobelli describes social proof as constantly influencing you, others argue that it only works when you’re uncertain: If you know what to do, you don’t need other people’s guidance.)
This tendency was helpful in the past as it aided survival: If everyone else was building shelters and looking for firewood, you knew bad weather was coming and could prepare, Dobelli says. If everyone else ran away, you ran too because there was probably a predator nearby.
In modern times, though, your decisions don’t always fall into the dichotomy of “follow the group and live” or “leave the group and die,” Dobelli adds. Modern humans have more physical security, so decisions revolve around more nuanced issues like being happy. In these instances, following the group can lead you astray. For example, if everyone else pursues an office career, your brain assumes it’s safer and wants to do the same, no matter how miserable such a job might make you.
(Shortform note: As Dobelli notes, most people aren’t faced with the live-or-die dichotomy today, yet social proof and its negative consequences persist. Why? In the absence of external threats, your brain sees rejection as a danger. This fear of rejection leads you to people-please, hide your personality, and try to fit in by following the group.)
Social Proof: Collectivist vs. Individualistic Cultures
The culture you come from—collectivist or individualistic—possibly influences how susceptible you are to social proof. Collectivist cultures focus on group membership: You’re defined by your group and your desires come second to the group’s. Individualistic cultures, on the other hand, focus on the individual: You’re defined by personal autonomy and focus on your own goals.
In one study, individuals from collectivist cultures were more susceptible to social proof, while those from individualistic cultures acted more in line with their past decisions. However, more research showed that susceptibility to social proof depended more on having a personal collectivist or individualistic mindset than on a person’s overarching culture. While the culture you’re raised in affects your mindset and susceptibility to social proof, it can’t completely predict it.
Authority bias is another fallacy related to social proof, Dobelli says. However, rather than being pressured by a group, you’re pressured by an authority. An authority is anyone with power over you, whether through knowledge (they’re more knowledgeable than you) or political or social power. (Shortform note: This authority doesn’t have to be applicable to the situation. As long as you view the person as an authority in one field (say, a politician being an authority in government), you’ll listen to their advice or opinions in other fields too (such as medicine). This is because you attribute their authority to them as a person rather than to their field of expertise.)
Authority bias convinces you to act in ways you otherwise wouldn’t through respect or fear, Dobelli explains. You assume that the authority must be right because of their position of power, or you’re afraid to disobey them because of that power. (Shortform note: In addition, you’re trained throughout your life to obey authorities. You form a “heuristic,” or a shortcut in thinking: When an order is given by an authority, your brain automatically accepts it.)
To overcome authority bias, Dobelli suggests asking yourself how authorities are influencing you and if you should let them continue to do so.
How to Overcome Authority Bias
Authority bias primarily refers to actions that are morally reprehensible, as Dobelli implies. Even if you believe an action is wrong, authority bias can convince you to do it. Dobelli suggests overcoming this morally problematic bias by asking yourself how authorities are influencing you; however, this is easier said than done, as it's sometimes difficult to detect others’ influence on you.
Here are a couple of other suggestions for overcoming the bias, if Dobelli’s method doesn’t work:
Distance yourself from the authority. The pressure to obey lessens if the authority isn't physically present.
Form a moral buffer. If there’s a victim of the actions an authority pressures you into, form a connection with them. This emotional component makes it easier to ignore the authority.
The next bias we’ll discuss is what Dobelli calls the “twaddle tendency”—or, as we’ll call it for simplicity, the nonsense distraction. Dobelli notes that people distract from their ignorance, uncertainty, or laziness by speaking long-winded nonsense. This is an attempt to maintain group membership: Part of group membership is bringing some level of knowledge to the group. When people feel their membership is threatened, they pretend to have knowledge to protect their position in the group.
(Shortform note: Another word for nonsense is “pseudo-profound bullshit”: saying things that sound profound but are really meaningless. The goal isn't to educate but to disguise ignorance and to maintain verisimilitude. People do this because of a lack of self-worth: If they were confident in their self-worth, they’d feel secure enough to reveal their ignorance and learn from others.)
Anyone who’s watched a political debate has witnessed the nonsense distraction. A politician gives a rambling answer with lots of patriotic keywords, and you realize they didn't really answer the question.
Avoid this tendency by testing your ideas for logic and clarity, Dobelli advises. If you don’t know something, it’s better to admit you don’t know than to hide your ignorance.
(Shortform note: Contrary to Dobelli's suggestion, it's not always better to reveal your ignorance. If you're in a position of authority, doing so may harm your credibility. Continuing our example, voters are more likely to dismiss a politician who admits ignorance than one who disguises it. Logically, the voters might know the honest answer is better, but the nonsense distraction makes them prefer a useless answer to none.)
24-Hour News: The Downfall of Fact
One of the most pervasive examples of the nonsense distraction is 24-hour news. Back when anchors had to fit a news report into an hour or less, there was only time to present the facts. However, once the news cycle expanded to fill the entire day, news stations had too much time and not enough news. The internet exacerbated the problem, taking news to a minute-by-minute cycle.
This need to fill runtime led to sensationalism and opinion pieces. Facts are buried under overdramatized information, and instead of just sharing news, publishers tell you what to think about it. These opinion pieces use authority bias, discussed above: If the person is trusted to give an interview, you assume they’re qualified, even if they’re not.
Another fallacy related to group membership is the fundamental attribution error. According to Dobelli, people ignore outside influences on problematic situations, instead assigning blame for the problem to the most visible individual. This is because your brain is designed to pay attention to the people around you—in other words, your present “group.”
(Shortform note: The fundamental attribution error also relates to collectivist and individualist cultures, as discussed above. Individualist cultures place high emphasis on individual identity, so members of these cultures naturally fall prey to the fundamental attribution error more than their collectivist counterparts, who place greater emphasis on the group and the circumstances.)
The Neurology of the Fundamental Attribution Error
Why do people suffer from the fundamental attribution error? Dobelli doesn’t say, but scientists claim it’s because of a process called “mentalization,” which takes place in your social brain: You try to determine another person’s intentions by guessing at their thought process.
When mentalizing, you focus so much on what the other person is thinking and intending that you ignore external influences on their behavior. Thus, you’re more likely to blame them for problems than the situation they’re in.
Mentalization happens mostly unconsciously: Your social brain is part of the brain’s default processes. So, it’s hard to recognize when the fundamental attribution error occurs. The best way to avoid it is to consciously challenge your social brain: Recognize when you’re mentalizing and look at the situation as well as the individual’s actions.
Part of humanity’s desire to be in a group is prioritizing your own group above others. As Dobelli explains, this means you magnify your group’s positive traits and minimize those of other groups: a phenomenon called in-group, out-group bias. (Shortform note: Why do you prioritize your in-group? Dobelli doesn’t say, but others suggest that by categorizing yourself as a member of a group, you make that membership part of your identity and thus focus on its positive traits. In-group bias is stronger in people with low self-esteem, who rely more heavily on their group for a positive sense of identity.)
Dobelli notes that there are three components to believing your group is better than others:
1. Groups frequently form around unimportant things, but your brain locks onto those similarities and prioritizes them above any differences. (Shortform note: These prioritized in-groups can change depending on the situation. If you’re driving and a pedestrian blocks the road, you’ll be annoyed at pedestrians. Your in-group is “drivers,” while “pedestrians” is an out-group. However, if you’re later a pedestrian and a car honks at you, you’ll be annoyed at drivers. Your in-group shifted to “pedestrians” according to the situation, and your feelings likewise shifted.)
2. When you’re in a group, your brain views the members of other groups as more similar than they are. Your brain wants others to fit neatly into their opposing box, so it ignores any conflicting information. (Shortform note: Research confirms that people can identify minor differences between in-group faces while missing differences between out-group faces. Your brain recognizes when individuals aren’t in your in-group and puts less effort into processing or remembering their faces, meaning it doesn’t register differences between these faces.)
3. Groups usually form around people’s opinions, which means you spend a lot of time with other people who hold the same opinions as you. Because everyone around you agrees, you believe your group must be right. (Shortform note: On social media, this kind of group is called an echo chamber. Social media algorithms direct you to people with similar beliefs, which reinforces your underlying beliefs. In turn, interacting with those people reinforces the algorithm.)
In the modern day, Dobelli argues, group bias is a dangerous phenomenon that blinds you to the facts of a situation and enforces prejudice. (Shortform note: To counter this bias, spend time with members of out-groups. Focus on points of connection, rather than differences, and recognize when your groups are formed around arbitrary means.)
According to Dobelli, reciprocity is another tendency that forms out of humanity’s instinctual desire to be part of a group: If someone does something for you, you’re more likely to do something for them in return. This comes from a dislike of being in other people’s debt.
(Shortform note: How specifically you react to being in someone’s debt depends on your reciprocity style: You either only help others when it benefits you, and so are less affected by being in debt; help others for the sake of helping even when it means a loss for yourself, and so are highly affected by being in debt; or help others to the same extent they help you, where you have a more neutral reaction to being in debt.)
Reciprocity is a vital part of life, Dobelli notes. It inspires cooperation, creates and strengthens communities, and helps keep the group healthy. (Shortform note: Mutual aid is a form of reciprocity that inspires these factors. Communities unite to support the group, everyone helping and being helped in turn. Poor and marginalized groups have used mutual aid throughout history, and researchers have even observed the phenomenon in animals, pointing to reciprocity’s evolutionary origins.)
Unfortunately, Dobelli concedes, this tendency can be twisted. Manipulating people into giving you something is easy if you first make them feel indebted to you. (Shortform note: Often, manipulative people offer you some small benefit before asking for a larger favor. Even though the exchange is imbalanced, you still feel compelled to agree.)
Reciprocity also has a dark side: retaliation. Dobelli points out that when others are kind to you, you want to return the favor and be nice as well, but if someone hurts you, you want to hurt them in return. This cycle only escalates the problem.
(Shortform note: Though Dobelli argues that retaliation is dangerous, others argue that retaliation—or negative reciprocity—is a necessary part of cooperation. They clarify that Dobelli refers to private retaliation, where an injured party carries out retaliation without supervision and frequently over-retaliates, prompting the initial aggressor to retaliate in turn. However, public retaliation is the basis of the legal system: A proportionate form of retaliation is objectively determined and taken against the initial aggressor. Given this definition, retaliation is an important part of society, so long as it's objectively proportionate.)
Another fallacy that forms from group membership is social comparison bias, Dobelli explains. This involves refusing to help someone out of fear that they’ll take your spot in the group. This is a defense mechanism: You don’t want to lose your position in the group, so you won’t help others even if doing so would help the group.
Social comparison bias is especially prevalent in the business world, Dobelli says. People hire below their own skill level so they don’t feel threatened. By hiring less qualified people, however, you ensure your business won’t excel: It’ll be limited by your abilities and the lesser abilities of your employees. Instead, improve your company by hiring the best, even if they’re better than you.
(Shortform note: A barrier to hiring the best people might be worry about how to manage people who are smarter than you. In this situation, focus on what you bring to the team and how to create an environment where your employees can best use their expertise. In addition, acknowledge the areas in which you fall short and learn from your more experienced employees.)
Upward and Downward Social Comparison
While Dobelli defines social comparison bias as refusing to help others out of fear of being outclassed, other sources define it more generally as a sense of competitiveness and dislike stemming from comparison. They break social comparison into two categories: upward and downward.
Upward social comparison means comparing yourself to those who are better than you. This can either motivate you to improve or lower your self-esteem. Meanwhile, downward social comparison means comparing yourself to those who are somehow worse than you. This either boosts your self-esteem or leads to the problems Dobelli explores above.
So how can you get the benefits of healthy competition without damaging your self-esteem or ruining your company?
Be humble. It’s difficult to admit that someone else is better than you, but doing so helps you in the long run.
Approach upward social comparison with curiosity rather than jealousy. Look at how others became successful and how you can use that knowledge.
Seek out your betters, rather than running from them, so you can learn from them. This gives you an advantage over your competition and helps you excel.
In this final section about group biases, we’ll discuss social loafing: When added to a group, individuals lessen the effort they exert, both physically and mentally. This allows them to save energy. Dobelli says this is a feature of groups, not a bug: Early humans formed groups because the responsibility for survival was spread out rather than concentrated on each individual.
When a member puts too little effort into the group’s actions, however, they are looked down upon and sometimes expelled from the group. Thus, Dobelli notes, social loafing is a careful balancing act, with each team member putting in enough effort that their slacking isn’t noticed.
While social loafing is rational behavior on the part of the worker, it's a problem for employers, Dobelli argues. Social loafing also means teams are willing to take bigger risks than the individuals who compose them: The shared responsibility involved in group membership means no one will be the sole focus of the blame if things go wrong.
(Shortform note: Interestingly, most definitions of social loafing don’t include the idea that groups make riskier decisions than individuals. This is considered a separate phenomenon called “risky shift.” However, Dobelli’s inclusion of risky shift as an element of social loafing makes sense, as diffused responsibility is the cause of both phenomena.)
The best way to combat social loafing is to make sure that the group can trace everyone’s individual contributions back to them. This increases personal responsibility and requires all the team members to put their full effort forward, Dobelli says. (Shortform note: You can also manipulate in-group bias, discussed above, to avoid this fallacy. When people feel a sense of kinship with their group, they’ll work harder to benefit said group.)
Social Loafing: How It Forms and How to Prevent It
While Dobelli stresses the evolutionary causes of social loafing, there are a couple of other factors that influence it:
Your expectations. If you expect your coworkers to lower their effort or feel that your contributions to the group won’t be important, you’ll lower your own effort because you don’t want to work hard by yourself or for something unimportant.
How you value the goal. If you’re passionate about the project, you’ll work harder. If you only care about the reward, which will be shared among the group, you’ll work less.
Prevent social loafing by encouraging peer accountability, as Dobelli discusses. When you make your group members more aware of when someone isn’t pulling their weight and empower them to hold that person accountable, social loafing lessens.
According to Dobelli, social proof pressures you to copy other people’s behavior. This is a defense mechanism to maintain your group membership, but it can lead you to make bad decisions. Every person has unique wants and needs, and following the status quo might keep you from happiness.
Think of the last time you did something because you felt pressured to by your group. (This group could be family members, friends, religious or political groups, and so on.) What happened? Did your actions make you happy or unhappy, and why?
Next, describe how you might have acted differently without that group pressure. How might you have been happier if you’d chosen your own path? (Be realistic: Making your own path can have negative consequences as well as benefits.)
Finally, write out a plan for the next time you encounter social proof. (For instance, you might take some time for yourself away from your group and make a pros and cons list to figure out what you really want.) Be specific, with actionable steps.
Now that we've covered biases relating to humanity’s desire to fit in, we’ll examine the next set of evolutionary fallacies: those in which you pay attention to the wrong things. Humans have evolved to pay attention to the most memorable or flashy information that comes up, rather than the most pertinent or helpful, Dobelli explains. The more conspicuous and repeated a piece of information is, the more you’ll believe it.
(Shortform note: The brain has arguably evolved to do this for reasons of efficiency. Your brain rapidly stores information, and it takes it less time and energy to accept the flashiest information available than to evaluate the entire situation. This is especially true in the modern day, when you’re constantly bombarded with information from, for instance, social media.)
In this chapter, we’ll cover several fallacies in which your attention gravitates toward the wrong information.
The first attention-affecting fallacy we’ll look at is the salience effect. Dobelli notes that your brain latches onto unusual or notable factors of a situation and gives these factors too much credit for causing the situation. (Shortform note: The salience effect relies on your past experiences. You might notice certain unusual details due to, for instance, your career or past experiences, while someone else would notice different details.)
Popular attention influences which factors the salience effect focuses on, Dobelli adds. For example, many people talk about the importance of going to college to get a good job. If you meet a person whose business failed and they don’t have a college degree, your mind will latch onto that salient factor because it's at the forefront of your mind, and you’ll attribute the failed business to the person’s lack of a college degree. Meanwhile, the owner’s education might have had no impact on the failure of the business.
(Shortform note: Businesses try to draw popular attention to their products by standing out among their competitors. This is increasingly difficult as advertisements infiltrate all of modern life and people become adept at ignoring them. However, there’s a form of popular attention that is very common in modern life: star reviews. Whether it’s movies, restaurants, or an online purchase, people rely on this modern version of word of mouth when making decisions.)
Dobelli recommends avoiding the salience effect by ignoring the easiest or most obvious information and looking for the less flashy causes behind situations.
How to Overcome the Salience Effect
Dobelli’s suggestion to look for the less obvious causes behind situations can be difficult to put into action, since you’re competing directly against the powerful salience effect. Here are a few tips to make this easier:
1. Look for the less obvious causes of a situation by asking a lot of “why”s, slowing down to consider the answers to these “why”s, and being critical about how much you really understand about the potential causes.
2. Surround yourself with people from diverse backgrounds. As discussed, past experience determines which facts you focus on. Knowing people who can bring different experiences to a situation means you can combine your assessments to see the whole picture.
3. Provide real-time feedback. If you want to combat the salient traits you initially notice, provide other salient information: For instance, focus on the possible consequences of your decision as you’re making it. If you’re buying a car, seeing how fuel costs affect your budget can help you look past the impressive paint job.
Story bias is another attention-affecting fallacy: People prefer entertaining fiction to boring facts, Dobelli explains. Sometimes, this means following an interesting, story-based tangent while ignoring the central, factual issue; other times, it means incorrectly assigning a story and meaning to random events. (Shortform note: Why do we do this? Possibly because stories activate your brain’s sensory processing center. When your sensory center activates, it gives you a shot of oxytocin, which generates positive feelings of compassion, empathy, and trust. Meanwhile, facts activate the data processing center, which doesn’t provide this boost in positive feelings.)
Stories are easier to remember than other types of information, Dobelli adds, which is why we rely on them instead of facts. Humans have told stories to each other for entertainment and to explain life for millions of years, so our brains are wired to identify and store them.
How can you overcome story bias? Dobelli suggests picking stories apart rather than blindly consuming and accepting them: Consider the story-teller’s intentions and what they might be hiding. (Shortform note: In addition, be cautious about what kind of stories you consume. The stronger and more convenient the narrative, the more suspicious you should be of it, because strong narratives can distract from logical gaps in a person’s statement.)
The Psychology of Story Bias
Why do people remember stories better than facts? Dobelli suggests an evolutionary explanation, but others argue it’s because humans naturally compress and consolidate information to make it more manageable. Your senses provide too much information to process, and to handle this workload, you simplify the information by creating relationships or patterns—in other words, stories. This is why songs are helpful memorization tools and smells can trigger memory: You remember things as a compressed, connected unit, rather than separate pieces of information.
While this compression is an important tool that allows your brain to function, it presents a problem. Any information that disturbs your attempt at compression because it doesn’t fit a pattern makes it harder for your brain to work, so when compressing, you discard any facts that don’t fit the story, and may thus fail to retain important information.
Next, let’s discuss survivorship bias: the belief that you have a better chance of succeeding than you actually do because success stories are more widely publicized than failures. Widely publicized information is easier to access, triggering the salience effect discussed above, Dobelli says: You latch on to this information when making judgments about your chance of success, ignoring the many invisible stories of failure.
Survivorship bias is found in most fields, especially those in which success brings fame. For example, you might see seemingly inexperienced teenagers competing in the Olympics and therefore think it’s easy to qualify for the team. Meanwhile, the vast majority of athletes never qualify.
Survivorship bias doesn’t just affect people trying to succeed, Dobelli notes. When you’ve already succeeded, you’re more likely to attribute your success to traits you share with other successful people, ignoring that those same traits can be found in many of the people who failed.
How can you avoid survivorship bias? Dobelli says to research statistics and examples of failure in the field you’re considering. This’ll give you a better idea of your chances of success. (Shortform note: Start your research by asking “What am I missing?” and looking for holes in your data. In addition, be careful when researching: If your sources are incomplete or suffering from survivorship bias themselves, they won’t help you ascertain reality.)
Survivorship Bias and Story Bias
Dobelli attributes survivorship bias to the ease of finding information about success, especially when success leads to fame. However, research suggests a deeper reason for this bias: humanity’s desire for storytelling, as discussed in the last section. You want a satisfying, logical narrative that explains how people found success so you can do the same. To create this narrative, you ignore the probability of failure, only taking information about success into account. You search for concrete explanations for other people’s success, even though success is often determined by luck.
This also explains why survivorship bias affects you even when you’ve achieved success: Your desire for a story (this time about how you succeeded) doesn’t change. In addition, self-serving bias plays a role: When good things happen, you credit yourself for it. Your brain doesn’t want to credit your success to chance, and that, combined with your desire for story, pushes you toward perpetuating survivorship bias by attributing success to certain traits you share with other successful people.
Next, we’ll discuss the false knowledge fallacy. According to Dobelli, while real knowledge belongs to experts who work for their understanding, false knowledge belongs to people who don’t know what they’re talking about but make it look like they do.
TV hosts frequently use false knowledge, Dobelli says, as they simply read information others have gathered from scripts. However, because they’re the figurehead providing the information to the audience, they receive a high level of trust and respect. Rather than looking for the actual experts behind the knowledge, the audience attributes the knowledge to the person in front of them. This is dangerous because the hosts can exploit this unearned trust: They can share incorrect or harmful information that inspires bad decisions.
(Shortform note: While false knowledge can still fool viewers into trusting TV hosts, it’s a much less common phenomenon in recent years. Trust in news reporters hit an all-time low in 2015, and while it’s slowly increasing again, it still hasn’t fully recovered. This trust disappeared because the public realized that TV hosts not only use false knowledge but also blur the lines between fact, opinion, and entertainment.)
It’s easy to fall into sharing false knowledge yourself, Dobelli warns. People overestimate their knowledge and expertise. To avoid this, examine your knowledge honestly and draw a hard line between what you really do know and what you’re not sure about. If you couldn’t discuss the subject easily and factually with another expert, be careful discussing it at all.
False Knowledge: Information vs. Wisdom
Dobelli treats false and real knowledge as polar opposites, but knowledge is arguably more of a sliding scale. Some people claim that there are four stages of knowledge:
Information. This stage is the same as Dobelli’s false knowledge: You have some information, but that information isn't being internalized or examined.
Knowledge. In this stage, you have a basic understanding of the information and can use it helpfully.
Insight. In this stage, you deeply understand the information and can work with it creatively to problem solve.
Wisdom. This stage is the same as Dobelli’s real knowledge: Information, knowledge, and insight come together with experience to provide overarching understanding.
While wisdom has never been easy to attain, in the internet age, it’s becoming even rarer. People consider themselves experts because they have easy access to information, but they never gain a deeper understanding of it. Furthermore, basic information lends itself to flashy graphs and shocking numbers, letting people rile up their social media followers without understanding the deeper meaning of the information they share.
Taylor’s advice for avoiding this trap echoes Dobelli’s: Before considering yourself an expert, look at the stages of knowledge. Only when you’ve reached the insight or wisdom stages can you be confident in your understanding. Finally, don’t be afraid to admit when you don’t know something: Doing so encourages your curiosity and lets you see the situation in new ways.
Another attention-affecting fallacy is the feature-positive effect. Because you pay attention to the flashiest, most ostentatious information around you, you’re terrible at noticing when things aren’t present, Dobelli says. For example, if you’re distracted by being late for work, you might not notice your phone isn’t in your pocket. (Shortform note: The exception is when you’re an expert in your field. Experts notice excluded information based on past experience of similar situations, while novices and moderately knowledgeable individuals don’t.)
According to Dobelli, realizing this tendency can help you give better advice. People accept and implement advice better when it’s framed positively rather than negatively. Here, “positive” doesn’t mean complimentary: It means telling someone to do something is more effective than telling them to stop doing something. For example, “Eat more vegetables” is more effective advice than “Stop eating sugar” because the inclusion of one food group is easier for your brain to process than the absence of another.
Positive vs. Constructive Feedback
Dobelli’s definition of positive feedback is unique. Most people define it as complimenting someone, but Dobelli defines it as discussing things that should be done, rather than things that shouldn’t be done.
Interestingly, constructive criticism is often (although not always) positive in the way Dobelli defines: Good constructive criticism is specific and actionable, which lends itself to describing what should be done. To illustrate this, let’s look at an example of non-constructive, negative feedback and an example of constructive, positive feedback:
“Stop dragging your heels with the filing” isn't specific because it uses a metaphor, rather than describing a specific behavior. It’s not actionable because no specific action has been outlined, and it’s not positive because it tells you to stop doing something rather than telling you to do something.
On the other hand, “Only reply to your emails in the morning so you can file five cases a day” is specific because it gives a particular behavior (replying to emails) and a reason for implementing it (so you can file five cases a day). It’s actionable because it provides a specific action (replying in the morning), and it’s positive because it talks about what should be done.
The final attention-affecting fallacy we’ll look at is the illusion of attention. Dobelli says the illusion of attention means humans tend to zone in on specific elements of a situation, focusing only on the details they deem important. Once in this focused state, you might not notice even obvious distractions: Your brain filters them out, deeming them less important.
However, the biggest problem with the illusion of attention is that you believe your observation skills are great, Dobelli argues. Because your brain doesn’t retain any information about “unimportant” observations, you think you’re noticing everything in your area and overestimate your abilities. This overconfidence means you don’t question your true abilities or challenge yourself to notice more.
The illusion of attention isn't a problem when you’re focusing on one activity and the rest of your surroundings continue as usual, such as reading a book while walking in a familiar area. In this case, your attention can split to handle the tasks of reading and walking, Dobelli explains.
However, once an unusual event is introduced, your brain can’t handle all its tasks at once and starts to make mistakes due to your overconfidence, Dobelli says. For example, if your dog walks in front of you while you’re reading, you might step on its tail. Because you believe your observation skills are stronger than they are, you aren’t careful enough to avoid stepping on it.
Why the Illusion of Attention Occurs and How to Combat It
The illusion of attention occurs because your brain has a limited amount of sensory information it can process, called cognitive capacity. Once you’ve reached cognitive capacity, your brain can’t accept new information, even if the information is obvious or important. This inability to process is called inattentional blindness.
The senses most strongly connected to inattentional blindness are sight and sound: These senses take in so much information that they easily overpower your cognitive capacity. This is why you turn down your music when searching for a street: Your brain could handle the tasks of driving and listening to music, but adding the effort of hunting for a street pushes you over cognitive capacity, and you must lessen one sensory input to process another.
However, most of the time, you don’t realize you need to lessen one sensory input to process another. This is because, as noted above, your brain filters out the new input: Its lack of cognitive capacity means it doesn’t notice the new input at all. Because you don’t realize that information didn’t register, you become overconfident in your observational abilities, thinking you caught everything. The only way to combat this overconfidence is to realize how much you don’t notice by getting other people’s input on situations and challenging your expectations and assumptions.
The illusion of attention makes people think they’re paying attention to everything around them, when really, their brain discards many details it deems unimportant. You can overcome this illusion by challenging your expectations and assumptions of your situation.
Describe a situation where you thought you were paying attention to everything but were later surprised by something you missed. (This could be someone reacting differently than you anticipated or a decision having unexpected consequences.)
Next, describe your expectations and assumptions going into the situation. What did you think was going to happen? Why were you so sure? Be specific.
Next, consider what you missed that caused the situation to surprise you. Why didn’t you notice it?
Finally, make a plan for how you can reduce the illusion of attention in the future and catch all important details about a situation. (For instance, you might discuss your plans with other people so they can tell you if you’ve missed anything or reduce sensory input so you can register all important information.)
The next set of fallacies we'll look at revolve around using the wrong type of thinking. There are two main kinds of thinking: fast and instinctive, and slow and logical. Both play important roles, Dobelli explains, but both cause logical fallacies when used in the wrong situation. (Shortform note: These types of thinking are also called hot and cold cognition. Instinct and emotion dictate hot cognition, while logic and reasoning control cold cognition. Hot cognition uses little mental energy, doesn’t turn off, and inspires impulsive decisions, while cold cognition uses a lot of mental energy and can overwhelm you with information.)
So when is it better to rely on fast, instinctive thinking? When actions are familiar or something you’ve been evolutionarily optimized for, Dobelli says. If you’ve ever thought too hard about the way you’re breathing, you’re using slow thinking when instinctive thinking is better. When you’re trying something new, or are in a situation you’re not instinctually prepared for, use slow logic, Dobelli concludes.
Other Perspectives on When and How to Think Instinctively
In Blink, Malcolm Gladwell argues that you can use your instinctive thinking more often than Dobelli suggests—for instance, even in unfamiliar situations—as long as you train it correctly to avoid coming to incorrect snap judgments. You can improve the accuracy of your instinctive thinking by expanding your worldview, so that your instincts have more experience to draw on, as well as by paying attention to context, so that you know what type of experience to draw on.
That said, thinking instinctively may feel uncomfortable to you—Gladwell notes that people prefer instinctive thinking to logic, which makes it hard to engage your logical thinking. In Thinking, Fast and Slow, Daniel Kahneman notes that you can encourage logical thinking by increasing the difficulty of your task. For example, if you’re reading an article, making the font smaller increases the effort you put into the task, triggering your logical thinking and helping you analyze the article.
In this chapter, we’ll look at three logical errors that come from using the wrong kind of thinking: the conjunction fallacy, the affect heuristic, and hyperbolic discounting.
The first incorrect use of thinking we’ll explore is the conjunction fallacy: the human tendency to prefer a plausible story to a probable one. In other words, Dobelli explains, when a story makes sense to you, you’re likely to believe it even if the true probability of it occurring is low.
For example, consider a girl named Katrina who loves musicals. Now consider two statements about her: first, that Katrina is on stage; second, that Katrina is on stage performing in a musical. Which is more likely?
Many people pick the second option as most likely because it makes a better story: Katrina loves musicals, so she’d perform in a musical. However, the first option is actually more likely in terms of probability because it’s broader: It has just one condition (Katrina being on stage) rather than two (Katrina being on stage and in a musical).
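The one-condition versus two-condition point can be checked with a few lines of arithmetic. The probabilities below are invented purely for illustration; the only thing that matters is that a conjunction multiplies probabilities together, so it can never be more likely than either condition alone:

```python
from fractions import Fraction

# Invented probabilities, for illustration only:
p_on_stage = Fraction(1, 10)             # Katrina is on stage at all
p_musical_given_stage = Fraction(7, 10)  # if on stage, it's a musical

# The conjunction "on stage AND in a musical" multiplies the two,
# so it can never exceed the probability of "on stage" alone.
p_both = p_on_stage * p_musical_given_stage

print(p_on_stage)  # 1/10
print(p_both)      # 7/100 -- lower, despite making a better story
```

However high you set the second probability (short of certainty), the conjunction stays strictly less likely than the broad statement.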
The conjunction fallacy occurs when you use fast thinking instead of slow, Dobelli says. While your logical brain is still calculating the probability of an event, your instinctive thought process makes connections to explain why the event might occur. The connections it finds often form a plausible story, so you accept the instinctive connection rather than waiting for the logical probability.
Is the Conjunction Fallacy Really as Problematic as Dobelli Claims?
While Dobelli presents the conjunction fallacy as common, others argue that he exaggerates its danger. Dobelli based his understanding of the conjunction fallacy on the initial study that defined it. However, other studies were unable to replicate this initial study’s results—they found that the conjunction fallacy isn’t as common as the original researchers suggested: Only 45% of respondents in the later studies succumbed to the fallacy, compared to 85% in the original study.
In addition, while Dobelli doesn’t offer a solution to the conjunction fallacy, this later research does: Discussing a situation with other people drastically reduces the likelihood of the conjunction fallacy taking root. Thus, it’s possible that overreliance on quick, instinctive cognition—which Dobelli credits with causing the conjunction fallacy, as discussed above—can easily be overcome.
In this section, we’ll discuss the affect heuristic: part of humanity’s fast, instinctive thinking. Dobelli says the affect heuristic is a mental shortcut in which your brain makes rapid subconscious judgments of like or dislike. These “affects” influence your risk-benefit analyses: If your immediate judgment is good, you’ll focus on the benefits of a situation, while if the affect is bad, you’ll focus on the risks.
(Shortform note: Heuristics like the affect heuristic aren’t necessarily negative. They keep your brain working efficiently and help you make decisions. The affect heuristic was particularly helpful to early humans because it quickly provides more data for risk analysis, and early humans faced regular life-threatening risks—for instance, being attacked by wild predators. However, the information required for a good risk-benefit analysis in modern times is too complex for the affect heuristic to assess adequately, especially when emotions are involved. Rather than being beneficial, the heuristic can lead you to make a risky decision merely because you’re excited or delay a good decision because you’re sad.)
Allowing affects to control your behavior is dangerous because they’re easily manipulated, Dobelli warns. Images and situations alter your affects, changing the way you judge items or situations. For instance, marketers use cheerful pictures to give you a positive affect toward their product.
(Shortform note: You might assume that only people evoke affects based on appearance, but any object can evoke them. Intimidating-looking machines, like nuclear power plants, make people wary even if their benefits outweigh their risks. On the other hand, people downplay the risks of enticing items, like motorcycles, even though they’re extremely dangerous.)
Another incorrect use of cognition is hyperbolic discounting: the tendency to crave immediate gratification and the willingness to sacrifice an unreasonable amount of money, time, or effort for that immediacy. It’s difficult for people to wait for things, Dobelli says, a tendency that intensifies the closer you come to attaining gratification.
(Shortform note: While Dobelli doesn’t define it as such, this is an example of overused hot cognition and underused cold cognition. People who can delay gratification have better-developed prefrontal cortexes, the part of the brain that handles cold cognition. Thus, they delay gratification by using their prefrontal cortexes to engage cold cognition. Meanwhile, those with less developed prefrontal cortexes are more susceptible to the instinctual desire for instant gratification.)
Dobelli says this is an instinctive trait from times when humans were more animalistic, noting that animals display this trait too. Consider wolves, who eat almost half their body weight at once. They can’t delay gratification because they don’t know when they’ll have another meal. However, humans no longer need instant gratification to survive, so hyperbolic discounting just inspires bad decisions, like paying a high amount to get something quicker.
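The intensifying pull of immediacy can be sketched with the standard hyperbolic discount formula, value = reward / (1 + k × delay). The rate k below is an arbitrary assumption, not a figure from the book:

```python
K = 0.5  # assumed discount rate, for illustration only

def hyperbolic_value(reward, delay_days):
    """Subjective value of a reward received after delay_days."""
    return reward / (1 + K * delay_days)

# For a $100 reward, waiting one extra day "costs" far more subjective
# value when gratification is close than when it's far away:
near = hyperbolic_value(100, 0) - hyperbolic_value(100, 1)    # ~33.3
far = hyperbolic_value(100, 10) - hyperbolic_value(100, 11)   # ~1.3

print(round(near, 1), round(far, 1))
```

That steep drop right at the start of the curve is why a delivery fee to get something tomorrow instead of today feels worth paying, while the same one-day difference weeks out feels irrelevant.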
How to Avoid Hyperbolic Discounting
How can you avoid hyperbolic discounting (and, therefore, bad decisions)? Dobelli doesn’t say, but one solution is priming. Priming means exposing yourself to certain stimuli that encourage your brain to respond to later, different stimuli in a desired way. In the case of hyperbolic discounting, you’d expose yourself to stimuli that encourage you to delay gratification the next time you encounter a choice between delaying and immediacy.
There are two ways to prime yourself to delay gratification:
Talk about your future plans. Using the words “future” and “self-control” can prime you to delay gratification later on. Combining these words with a broader discussion of your plans reinforces your goals and makes you more likely to carry them out.
Imagine talking to your future self. This exercise lets you see a possible result of your decisions. Focus on avoiding instant gratification in the long-term by imagining a positive future for yourself if you make good long-term decisions. Imagining a negative future that comes from instant gratification also works.
Affects are immediate judgments of like or dislike. Your brain makes these very quickly and relies on these affects rather than logical risk-benefit analyses. It’s important to overcome the affect heuristic because these instinctive judgments aren’t always accurate.
Identify three instinctive judgments of objects or actions that influence your behavior. (Common affects include seeing sugary treats as better than healthy food, seeing intimidating items as more dangerous than they are, such as nuclear power plants, or seeing attractive items as more beneficial than they are, such as motorcycles.)
Next, make a table with two columns: one for “Risks” and one for “Benefits” of the thing you’ve judged. Fill in the columns for each affect. (Be as objective as you can and don’t be afraid to research: The goal is to see where your affects may be misleading you.)
Next, reflect on your opinions from the first question. Have they changed at all since completing the above exercise? Why or why not?
Finally, make a plan for the next time an affect impacts your decisions. (For instance, you might take some time to do a risk-benefit analysis like above or talk to someone with different opinions than you about the thing you’ve judged.)
The next set of fallacies we’ll look at revolve around complex math. Your hunter-gatherer brain isn’t designed for complex math, Dobelli says: Because your ancestors didn’t need this math to survive, you can’t instinctively grasp concepts like averages, statistics, and exponential growth. However, understanding these concepts is important for modern life.
(Shortform note: Some people argue that people’s difficulty with complex math concepts isn’t a biological limitation but a result of how math is viewed. People internalize the idea that math is difficult and define themselves as “a math person” or “not a math person.” Thus, those who struggle with math concepts stop trying to understand them. If people treated math like a second language, which takes a lot of practice and repetition to learn but can be learned by anyone, they would pick up complex math more easily.)
In this chapter, we’ll look at three complex math concepts and how misunderstanding them causes logical errors: averages, statistics, and exponential growth.
Averages are one of the complex math concepts your brain isn’t evolutionarily prepared for, Dobelli explains. When averaging, you add together a group of numbers and then divide by how many numbers there are. For example, to find the average of 5, 9, 12, and 14, you add them together to get 40 and then divide by four to get 10.
One of the biggest pitfalls when working with averages is ignoring the distribution: the original numbers used to calculate the average. The distribution matters because averages can be misleading: Your brain focuses on the average and doesn’t consider possible outliers in the distribution. Outliers are the extremes at either end of the scale that drastically change the average, Dobelli adds.
For example, a waiter could serve an average of 20 tables a day. Your brain instinctively assumes the actual number of tables served each day stays close to 20, say between 15 and 25. However, there might be slow days where the waiter serves only 5 tables, or busy days where they serve 40. The average hides these outliers, so if you ignore the distribution, you don’t have a true idea of the waiter’s workload.
To get a true idea of the workload, you must remove the outliers, Dobelli says. This isn’t instinctive, but it’s important for modern life because outliers are increasingly common.
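The waiter example can be worked out numerically. The daily counts below are invented; note how the plain average looks calm while the extremes it hides only appear once you inspect the distribution:

```python
# Hypothetical tables served per day over ten shifts: mostly ~20,
# plus one slow day (5) and one rush day (40).
tables = [18, 20, 22, 19, 21, 5, 40, 20, 18, 22]

mean = sum(tables) / len(tables)            # 20.5 -- looks uneventful

# Removing the outliers (drop the single slowest and busiest day)
# shows the typical workload; comparing it with min/max exposes
# the extremes the raw average conceals:
trimmed = sorted(tables)[1:-1]
trimmed_mean = sum(trimmed) / len(trimmed)  # 20.0

print(mean, trimmed_mean)
print(min(tables), max(tables))             # 5 and 40: the hidden extremes
```

Here the average barely moves when the outliers are dropped, but the range (5 to 40) tells a very different story about the waiter’s actual days than “about 20 tables” does.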
Averages and Scalable Events
Dobelli’s discussion of distribution and outliers finds parallels in Nassim Nicholas Taleb’s The Black Swan. Taleb sorts events into two categories: scalable and non-scalable. Scalable events have no defined limits, while non-scalable events have defined limits. (Taleb notes that Black Swan events—events that are unpredictable yet highly influential—occur solely in scalable situations.)
Most natural events are non-scalable. For example, there’s a defined limit to how much weight a human can lift; strength and weakness revolve around an average, and there are few outliers in that average’s distribution.
On the other hand, many man-made situations and ideas are scalable, with outliers in their averages’ distribution. There’s no upper limit to wealth, for example, which allows for the existence of billionaires—outliers in the distribution of global wealth. The presence of a single billionaire can significantly raise the average wealth of a town, making this average misleading—most people will earn well below the skewed average. Thus, when dealing with scalable situations, understanding averages and their distribution is important to gain an accurate picture of the situation.
Statistics is another area of complex math that you’re not primed for, Dobelli says. One common statistical error is self-selection bias, in which the nature of the participants in a study influences its outcome. Specifically, people only join studies they’re comfortable responding to, which alters your data, Dobelli says. Those who might provide embarrassing or somehow “undesirable” responses simply won’t take part, narrowing your study’s scope and skewing the results.
Say, for example, a school does a survey on alcohol and drug use. Only students who are comfortable discussing their usage will respond. Students who use alcohol and drugs more are less likely to participate either out of fear of penalization or because they’re ashamed of their usage. This means your results will skew toward lower alcohol and drug usage and not reflect the true usage at the school.
Working Around Self-Selection Bias
How can you avoid self-selection bias? Dobelli doesn’t say, but others say the only way to completely eliminate self-selection bias is studying people who don’t know they’re being studied. However, uninformed research is unethical. Both the U.S. Food and Drug Administration and the American Psychological Association require researchers to receive informed consent from all study participants.
However, you can limit self-selection bias. Most studies do this by collecting demographic information: Researchers look for patterns in the demographics of people who chose to participate, then adjust how they weight the responses to reduce self-selection bias. Specifically, they give greater weight to results from those less likely to self-select, and lesser weight to results from those more likely to self-select.
Returning to our earlier example, if many A- and B-students respond to your survey, reporting low drug and alcohol use, but only a few C- and D-students respond, reporting higher use, you might weight those C- and D-student responses more heavily than the A- and B-student responses. The C- and D-students would be less likely to self-select into the study due to shame around their grades and substance use. Weighting the responses of those who do respond more heavily counteracts the skew this reluctance creates.
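A minimal sketch of that reweighting idea (statisticians call it post-stratification). All numbers are invented: here A/B students make up 60% of the school but dominate the responses, so the naive average understates real usage:

```python
# Invented survey data: average reported drinks per week by group.
groups = {
    "ab_students": {"responders": 80, "avg_use": 1.0, "school_share": 0.6},
    "cd_students": {"responders": 10, "avg_use": 4.0, "school_share": 0.4},
}

# Naive average: every response counts equally, so the overrepresented
# A/B group swamps the result.
n = sum(g["responders"] for g in groups.values())
naive = sum(g["responders"] * g["avg_use"] for g in groups.values()) / n

# Reweighted average: each group counts by its share of the student
# body, not by how many of its members chose to respond.
weighted = sum(g["school_share"] * g["avg_use"] for g in groups.values())

print(round(naive, 2))     # 1.33 -- skewed low by self-selection
print(round(weighted, 2))  # 2.2  -- closer to the school's true usage
```

The reweighted figure still depends on the few C- and D-students who did respond being representative of their group, which is why weighting limits self-selection bias rather than eliminating it.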
Exponential growth is another complex math concept that people don't instinctively understand, Dobelli says. Most things in nature increase linearly, where a number increases by the same amount at regular intervals. For example, if you add one rock to a pile every day, on the 15th day, you’ll have 15 rocks. This is a basic concept that early humans encountered, so your brain naturally understands this kind of growth.
However, exponential growth means that a number doubles at regular intervals, Dobelli explains. For example, you add one rock to a pile on day one, two rocks the next day, four rocks the next day, and so on. On day 15 alone you’ll add 16,384 rocks, for a pile of 32,767. This is not a concept that early humans encountered, so your brain struggles to grasp it. However, it’s increasingly common in the modern world and thus crucial to understand: Everything from investments to product sales can grow exponentially.
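The two rock piles can be compared with a short loop, which shows why intuition fails here: after the same 15 days, the doubling pile is thousands of times larger than the linear one.

```python
DAYS = 15

# Linear growth: add one rock per day.
linear_total = DAYS  # 15 rocks after 15 days

# Exponential growth: the number of rocks added doubles each day.
exp_total = 0
added = 1
for _ in range(DAYS):
    exp_total += added
    added *= 2

print(linear_total)  # 15
print(added // 2)    # 16384 rocks added on day 15 alone
print(exp_total)     # 32767 rocks in the pile (2**15 - 1)
```

Note that more than half the pile arrives on the final day, which is the pattern that makes exponential processes so easy to underestimate early on.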
Exponential Growth in Nature and Modern Life
While Dobelli is correct that exponential growth was rarely observed by early humans and is now much more observable in modern fields such as finance, exponential growth is still possible in the natural world that early humans would have experienced. For example, if two dogs have six puppies, and each puppy also has six puppies, the dog population would experience exponential growth. However, natural exponential growth is limited by resources: The dog population can only grow exponentially until it reaches “carrying capacity,” the maximum number of dogs the environment can support. Thus, while natural exponential growth is possible, it's rare.
Furthermore, early humans may not have identified natural exponential growth because it’s only noticeable if you keep track of all members of a species, which wasn’t possible millions of years ago. However, it’s easier with modern technology. Once you’re tracking the species, you can spot exponential growth by finding the “doubling time”: If the population of a species doubles within a week, for example, see if it doubles again next week. If the population continues to double within the same time (or faster), it’s experiencing exponential growth.
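The doubling-time check described above amounts to a one-line test over a series of counts. The weekly census below is invented for illustration:

```python
# Invented weekly population counts for a tracked species.
weekly_counts = [120, 250, 510, 1_040]

# Exponential growth: each count at least doubles the previous one
# within the same fixed interval (here, one week).
is_exponential = all(later >= 2 * earlier
                     for earlier, later in zip(weekly_counts, weekly_counts[1:]))

print(is_exponential)  # True
```

A linearly growing series, such as one gaining a fixed 100 members per week, would fail this check after the first interval, since its doubling time keeps stretching rather than staying constant.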
Now that we’ve covered the main groups of evolutionary fallacies, we’ll look at three final, miscellaneous fallacies of this kind: ambiguity aversion, action bias, and loss aversion.
According to Dobelli, ambiguity aversion is the human tendency to dislike uncertainty. Uncertainty is a lack of concrete facts about an outcome. For early humans, lacking facts was deadly: If you didn’t know which plants were edible or where predators lived, you died. Thus, humans evolved to avoid uncertainty.
(Shortform note: Ambiguity aversion is the root of the colloquialism “Better the devil you know.” Even if your situation is bad, you’d rather stay in that situation than face the uncertainty of leaving. For example, say you’re stuck in a job you hate and are offered a new job in a field you know nothing about but might like—you’re not quite sure. Despite not enjoying your current job, you’ll probably choose to stay: Your fear of uncertainty over the new job makes you choose the unhappiness you know.)
However, uncertainty is very prevalent and thus unavoidable in modern society. While the ideal approach to this rampant uncertainty would be to learn to accept and work with it, Dobelli argues, this is near impossible: Your tolerance for uncertainty is pre-determined by how your amygdala, which processes emotions, forms. (Shortform note: Dobelli claims uncertainty resilience is biologically determined. However, others argue that you can increase resilience through “flooding,” during which you expose yourself to increasing amounts of uncertainty. After the initial exposure period, your brain becomes bored with the uncertainty and accepts it.)
Another miscellaneous evolutionary fallacy is action bias: the tendency to take action rather than waiting for a better opportunity or more information, Dobelli explains. For early humans, delaying action could mean death (for instance, if a predator approached). While this is much less often the case for modern humans, we still instinctively want to act in all situations. However, this instinct causes problems in today’s complex world, when impulsive actions are more likely to cause problems than waiting and thinking things through. For example, impulsively deciding to invest in a fund without waiting and researching its likelihood of growth may lead to you losing money.
While the need for patience has grown, society's tolerance of it hasn’t, Dobelli says. People praise those who take action, even if that action was reckless or hasty, while accusing those who choose to wait and gather information of cowardice. (Shortform note: Others argue that Dobelli is only partially right. Studies show that people praise leaders who quickly make positive decisions (like promoting someone) and see them as more trustworthy than those who wait. However, they see leaders who quickly make negative decisions (like cutting someone’s pay) as less trustworthy. Thus, society’s tolerance for quick decisions is situation-specific, rather than uniform as Dobelli implies.)
Action Bias and Manipulation
While Dobelli acknowledges that action bias causes problems in the modern world, he doesn’t cover a specific issue that can have major consequences: People, or even governments, manipulating action bias to fulfill their own goals.
In his book Factfulness, Hans Rosling argues that salespeople, politicians, and activists manipulate action bias by using time limits and worst-case scenarios to spark panic. In the wake of this panic, they encourage people to take drastic actions rather than waiting to gather more information—in other words, to give in to action bias. Simultaneously, they withhold information from the public because it’s easier to inspire panic when the public has little information or experience. Panicking and ill-informed, the public makes hasty decisions that can be disastrous.
For example, say a politician’s running for president. He encourages people to vote for him by saying war with another country is inevitable—a worst-case scenario—and he can mitigate the damage by attacking first. He presents the other country as highly dangerous and emphasizes the need for immediate response. He also exaggerates the consequences of not acting—for instance, that their country’s way of life will be destroyed. This sparks panic among the public. The politician then limits the distribution of any opposing information, so the public can’t allay this panic through research. Thus, the panicking public votes the politician into office, no matter how unjustified or disastrous doing so actually is.
No matter the source, overcoming action bias is important. Dobelli doesn’t explain how to do so, but Rosling suggests the following tips:
Collect information. Examine all facts and evidence before jumping to conclusions.
Don’t panic. Remember that most problems don’t have a “now or never” deadline.
Mind probability. Focus on the most probable scenario instead of the worst-case scenario.
Loss aversion is another miscellaneous evolutionary fallacy in which people are affected far more strongly by losses than by gains. They consequently prioritize not losing things over gaining new things, Dobelli states. Even when a loss and a gain have the same value, you react to the loss twice as intensely as to the gain. (Shortform note: Some people think the more you have, the less this tendency affects you because the loss isn’t as important. This isn’t the case: Rich people are just as anxious over losing their wealth or belongings as poor people—maybe more so, as they have more to lose.)
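The “twice as intensely” claim can be expressed as a toy utility function. The 2.0 multiplier is the rough factor cited above, treated here as an exact constant purely for illustration:

```python
LOSS_MULTIPLIER = 2.0  # losses hurt roughly 2x as much as equal gains

def felt_impact(change):
    """Subjective impact of gaining (positive) or losing (negative) value."""
    return change if change >= 0 else LOSS_MULTIPLIER * change

print(felt_impact(100))   # 100: gaining $100 feels like +100
print(felt_impact(-100))  # -200.0: losing $100 feels twice as bad
```

The kink at zero is the whole effect: a coin flip that wins or loses $100 with equal odds nets out to a felt value of −50, which is why people refuse objectively fair gambles.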
Loss aversion is an evolutionary trait passed on from early humanity, when losing something—be it belongings, weapons, or food supply—was often fatal, Dobelli explains. Being cautious and avoiding loss kept early humans alive and able to procreate, meaning their cautious genes survived to the current day.
(Shortform note: Evolutionary instincts also determine which losses matter more to you. Losing something you physically had hurts more than losing something you had in a more abstract way. For example, losing $100 cash hurts more than losing $150 electronically, even though you lose more money electronically. Your brain sees the cash as more “real” and thus worse to lose because it hasn’t evolved to deal with abstract, non-tangible items.)
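The asymmetry Dobelli describes can be sketched as a simple weighting function. This is a rough illustration only: the 2x multiplier comes from the text above, and `felt_value` is a hypothetical helper, not an established psychological model.

```python
# Rough sketch of loss aversion: losses are weighted about twice as
# heavily as equivalent gains (the 2x figure comes from the text).
LOSS_WEIGHT = 2.0  # assumed multiplier

def felt_value(change):
    """Subjective impact of gaining (positive) or losing (negative) an amount."""
    if change >= 0:
        return change
    return LOSS_WEIGHT * change  # losses hurt roughly twice as much

print(felt_value(100))   # gaining $100 registers as +100
print(felt_value(-100))  # losing $100 registers as -200.0
```

Under this weighting, a gamble with an equal chance of winning or losing $100 feels like a net loss, which helps explain why people refuse even bets.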
People can use loss aversion to their advantage by framing their pitches negatively rather than positively, Dobelli says. If an ad says “Act now or lose these savings forever!” it's twice as effective as an ad that says, “Act now and get these savings!” (Shortform note: Negative pitches work by building anticipatory regret: They make you imagine a loss and the feeling of regret that comes with that loss. These imagined feelings are strong enough to trigger loss aversion. Anticipatory regret is more effective at boosting sales than excitement or curiosity, making it a valuable tool for marketers.)
Now that we’ve covered evolutionary fallacies, we’ll look at non-evolutionary fallacies. These are fallacies that Dobelli doesn’t classify as having an evolutionary basis: Rather, they have other causes, such as past experience, limited perspective, or information overload.
In this chapter, we’ll cover how misinterpreting cause and effect damages your judgment. According to Dobelli, humans struggle to interpret cause and effect because they confuse correlation and causation. When two events coincide, people assume there’s a causal relationship between the two of them, even when there’s not. For example, if a person gets the flu after they start taking vitamins, they might assume a causal relationship—taking vitamins gave them the flu—simply because the timing coincides.
(Shortform note: How do people make these mistaken links? They take their knowledge of the effect and look for any similar events that might point to a cause, regardless of the likelihood of that similar event actually being the cause. In other words, they look for possible correlations between the events and mistake this for one causing the other. For example, both vitamins and the flu are related to your health, so they’re correlated. Because the timing coincided between taking the vitamins and getting the flu, you confuse the correlation for causation.)
We’ll cover the following fallacies that arise from misinterpreting cause and effect:
One instance in which people confuse the cause-and-effect cycle is the result illusion, which Dobelli calls the “swimmer’s body illusion.” In this bias, you look at the traits of people who do a certain activity and think the activity causes those traits. In reality, the people participate in the activity because they already have those traits. The traits are the cause, not the effect. For example, you might look at a professional swimmer’s toned body and think you can gain that same appearance by swimming. However, these individuals are so good at swimming because they already had that kind of body.
Dobelli suggests using your knowledge of the result illusion to set realistic goals. If you confuse the cause (swimmers’ natural body type) with the effect (their swimming prowess), you might set unrealistic goals, like swimming to gain a different body type. If you realize the swimmers were born with that body type, you can set healthier exercise goals that fit your own body type.
The Result Illusion: Nature vs. Nurture
The result illusion is part of the nature vs. nurture debate: It’s hard to determine whether people’s genetics or their upbringing dictates their characteristics. When you fall for the result illusion, you confuse the person’s nature—natural traits—with how they’ve been nurtured—traits they gained through their activities.
Dobelli implies that people’s traits are primarily assigned by nature, and thus that nurture doesn’t play as large a role. However, others argue that the split between nature and nurture is more even: While it’s true that some people naturally have a body optimized for swimming, you can also gain some of those traits by training.
For example, height and long limbs are traits that are dictated by nature. People with these traits are better swimmers due to a natural advantage. However, other traits such as broad shoulders or strong legs are the result of training. Anyone can gain these traits through long hours of swimming.
You can still use this knowledge to set realistic goals, as Dobelli suggests. Consider the traits you want to have and whether they’re dictated by nature or nurture. Let go of impossible goals revolving around natural traits, but don’t give up on those surrounding things you can change.
Another way people misinterpret the cause-and-effect cycle is by ignoring coincidence, Dobelli states. People attribute strange events to the supernatural—for example, believing someone getting struck by lightning twice is a message from God or the universe—because they think such events are impossible and can’t be the result of coincidence.
In reality, these strange events are possible, though unlikely. People can’t accurately estimate probability (which we’ll discuss further in Chapter 8), so they think strange events are less likely than they actually are. Continuing our example, someone getting struck by lightning twice seems impossible, but with 7.7 billion people on earth and 1.4 billion lightning strikes per year, it’s bound to happen eventually.
Thus, don’t get too excited when something unusual happens, Dobelli suggests. Hardly anything is impossible, and the probability of any event occurring is likely higher than you think.
Coincidence and the Law of Very Large Numbers
Dobelli’s explanation that coincidences are more likely than you think is supported by the law of very large numbers: When your sample size is very large, “impossible” events are inevitable. For example, if the chance of being struck by lightning twice is one in 3 million and there are 7.7 billion people on earth, more than 2,000 people can expect to be struck twice by sheer coincidence.
So why don’t people take the law of very large numbers into account? It’s because humans can’t visualize very large numbers. Since early humans didn’t need to understand such large numbers, your brain didn’t evolve the instinctive ability to do so. Humans’ ability to understand just how large a number is starts to degrade after 100, and it breaks down completely for numbers above a million.
Thus, when people try to estimate probabilities that involve very large numbers—for instance, the size of the human race—they underestimate their sample size and don’t take the law of very large numbers into account. For example, you can’t visualize how big 7.7 billion or 3 million are. You think those numbers are closer in size than they really are, so you assume far fewer than 2,000 people will be struck by lightning twice. Thus, when it happens, you search for another explanation, attributing it to supernatural intervention.
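As a sanity check on the example’s figures (7.7 billion people, a one-in-3-million chance), the expected number of double-strike victims can be computed directly. This is just the arithmetic implied by the text’s illustrative numbers, not verified real-world odds:

```python
# Expected number of people struck by lightning twice, using the
# illustrative figures from the text (not verified real-world odds).
population = 7_700_000_000
p_double_strike = 1 / 3_000_000

expected_victims = population * p_double_strike
print(round(expected_victims))  # prints 2567
```

Even at one-in-3-million odds, a sample of billions makes thousands of “impossible” coincidences a statistical certainty.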
In this section, we’ll discuss another misrepresentation of cause and effect: association bias, or the brain’s tendency to make connections where none exist. Dobelli says this misrepresents cause and effect by forming false knowledge: You draw a causal connection between two unrelated things.
Superstitions form this way, Dobelli explains. For example, say you bring rainboots when camping, and the weather is perfect. The next time you go camping, you leave the rainboots behind and the weather is awful. The next time you bring them, the weather is wonderful again. After a few of these experiences, your brain connects the boots and good weather, even though it’s just a coincidence that the weather improved when you brought the boots.
Association bias is also involved in traumatic experiences, Dobelli adds. Say you step in a hole and break your leg while playing soccer. Your brain makes a connection between soccer and pain, and you may refuse to play soccer again. Because these connections have stronger inciting incidents—breaking a leg is more memorable than bringing boots—they can be very strong even after a single incident, while more innocuous connections must be reinforced over time.
Association Bias: Why It Happens and How to Overcome Unhealthy Connections
Why does association bias occur? Dobelli doesn’t say, but others argue that association bias is a defense mechanism: Making connections helps you form “protective frames.” These are practices or support systems that let you evaluate risk (for example, the risk of it raining when you go camping). Most of the time, these frames are helpful or harmless, like our rainboot example. However, connections made through traumatic experiences are maladaptive and grow stronger over time. For example, your fear of soccer may expand to fear of walking on any grass as you think it’ll put you at risk of more pain.
Though these connections grow stronger over time, you can change them. One method of changing the connections your brain makes is through exposure therapy, where you gradually expose yourself to the object of your negative connection. This replaces the connection between the object and your negative experience with a positive or neutral experience. For example, you might watch a soccer game or walk around on a soccer field. Your brain takes those positive experiences and forms a new connection to soccer which overrides the negative experience and the feeling of risk.
In this section, we’ll cover the fallacy of the single cause: To make a simple pattern of cause and effect, people oversimplify chains of events, Dobelli says. They do this because, as previously mentioned, humans dislike uncertainty, including uncertainty over how events transpire. Looking for simple patterns reduces uncertainty because the fewer elements involved in a situation, the easier it is to understand the chain of events.
(Shortform note: As Dobelli implies, a single-cause pattern provides a sense of understanding. When people understand a situation, not only are they less uncertain, but they can also better withstand any uncertainty they do feel. In addition, understanding something increases confidence and positive feelings, which further encourages people to simplify the cause-and-effect cycle.)
This mindset is dangerous because everything is affected by a complex web of influences and causes, Dobelli adds. There’s never a single cause for large, complex effects like crime or success. Trying to find one leads people to pin responsibility on a single person or group. Blaming a single person or group alleviates everyone else's guilt without addressing systemic problems, which lets the problems continue unhindered.
You can use the knowledge of this complex web of causes to your benefit, Dobelli says. When trying to determine how to make a project successful, don’t fixate on a single cause of success. Rather, examine all potential influences on your project, focus on those you can control, and experiment with each one. Through these experiments, you can determine which factors matter most and use them to be more successful. For example, if you run a pizzeria, you can influence the quality of the food and the staff you hire. By experimenting with these factors, you might learn that a skilled cook matters more to customer satisfaction than ingredients do, so you focus more effort and money on hiring good cooks.
(Shortform note: These experiments are important even if you’re already experiencing success. People who experience success usually fail later because they attribute that success to their own actions without looking deeper. They don’t account for chance or evaluate the factors they influenced in gaining that success, which means they can’t replicate it. The only way to maintain success is to discern its cause and develop your ability to control said cause.)
Occam’s Razor and the Fallacy of the Single Cause
While Dobelli cautions against oversimplifying a situation, others warn against overcomplicating it. For instance, Occam’s Razor states that when presented with two explanations, the simpler one is better. In this case, “simpler” means whichever explanation requires fewer assumptions.
For example, consider these statements: “I have a headache because I didn’t drink water” and “I have a headache because I have a deadly disease.” The former requires two assumptions: You didn’t drink water and not drinking water causes headaches. The latter requires three assumptions: You have a disease, the disease is deadly, and the disease causes headaches. The first statement is simpler and therefore better.
Doesn’t this contradict Dobelli’s point? If simple answers are better, isn’t simplifying logical? No, because there’s a difference between “better” and “accurate.” Simpler explanations are better, in that they are easier to understand, good for getting a basic understanding of complicated matters, and more probable. However, simpler explanations are not always accurate: Sometimes the complicated explanation is correct, if less likely.
By this logic, ignoring information to create single-cause patterns trades accuracy for simplicity. As Dobelli says, this leads people to assign too much responsibility to an individual or group; it also leads them to place too much faith in single solutions. If you believe that high illness rates are due to unaffordable healthcare, you’ll work to make healthcare affordable. This singular focus means you won’t realize that you must address other factors, like safe housing and income, too.
The fallacy of the single cause is the desire to simplify a situation until it has only one cause. This is problematic because everything is caused by a complex web of influences. You can use this knowledge of the web of influences to overcome the fallacy.
Describe a current project. Make a list of all potential influences on the success of that project.
Next, break those influences into two groups: “I can influence” and “I can’t influence.” (Be realistic. Influences like the location of a store you run might technically be under your control, but realistically, you can only change that if you have the funds to move.)
Finally, make a plan to experiment with each item under the “I can influence” group. For instance, if you’re running a store, you might change the frequency of your social media posts or the quantity of your stock and see how it affects sales. (Schedule these experiments one at a time so you can isolate how each influence affects your success.)
In this chapter, we’ll look at fallacies related to memory. People believe their memories are untouchable, stored away and recalled when needed in perfect condition. However, this isn't the case, Dobelli warns. Your memory is affected by your feelings, opinions, and situation.
(Shortform note: Your memories are affected in these ways at several points: First, whatever you were feeling in a moment is tangled up with the details of the situation in your memory. Later, every time you remember the situation, your current mental state further alters your memories. Thus, the more you remember a situation, the more distorted the memory becomes.)
We’ll look at the following fallacies and situations in which your memory is unreliable:
The first and most serious situation that affects your memory is falsification of history. Your brain is constantly rewriting your memories, Dobelli explains. As your opinions and worldview change over time, your brain alters the details of your memories, making you remember the past in a way that better matches your current opinions and worldview.
(Shortform note: Your brain rewrites memories to be helpful: By updating the information, your memories become more relevant to the current moment and your current decisions. However, rewriting memories also means you become overconfident in your beliefs: When you think you’ve always held the same beliefs, you won’t feel the need to challenge them, and you’ll be skeptical of other people’s ability to change their own beliefs. On the other hand, if you remember how your beliefs have changed over time, you’ll be more open to adjusting them again and more welcoming to others who are challenging their beliefs.)
Even people's strongest memories, usually made in joyous or traumatic situations, are dramatically altered as time goes by, Dobelli adds. (Shortform note: Many people believe these “flashbulb” moments remain untouched and perfectly accurate in your memory. However, traumatic memories are often less accurate than other kinds of memories, not more. Sometimes, traumatic memories are almost entirely fabricated because your ability to form memories is hampered in traumatic situations, not enhanced.)
Falsification of history is often triggered by cognitive dissonance, or discomfort when your beliefs or desires conflict with your actions, Dobelli says. To alleviate this discomfort, you deny or rationalize your actions until your memories of the situation change: You deny that you ever acted against your beliefs.
For example, if Ann believes stealing is wrong but contradicts her beliefs by shoplifting, she’ll experience cognitive dissonance. Her brain will rationalize her actions to minimize the conflict or even change her memories so she forgets she ever shoplifted, depending on how severe the dissonance is.
(Shortform note: Cognitive dissonance occurs because people have an innate desire for consistency: You want your decisions to be consistent with your beliefs. This is partly why changing your beliefs and worldview is so difficult, as it triggers cognitive dissonance. However, you can train yourself into new beliefs by consistently behaving in line with those new beliefs: As your new pattern of behavior is reinforced, cognitive dissonance will decrease.)
Another reason why your memory is unreliable is the Zeigarnik effect: Before a task is completed, your brain keeps it at the forefront of your memory. However, once the task is completed, you immediately forget about it.
This makes your brain efficient, Dobelli explains. Your brain holds information as long as necessary, but once that information is deemed unimportant, your brain forgets it to free up mental space for the next, important piece of information. Completed tasks are considered unimportant and thus discarded.
There’s one exception to this rule, Dobelli adds. If you have a number of tasks to complete, making a concrete plan to deal with them can signal your brain to forget the tasks before you finish them. (Shortform note: Making a plan could have the same effect as actually completing the task because of imagination’s effect on the brain. When you imagine something, your brain reacts as if you’re actually experiencing it. So when you imagine completing a task, your brain feels like you really have.)
How Does the Zeigarnik Effect Work?
How does the Zeigarnik effect work, neurologically? Dobelli doesn’t say, but others argue that it comes down to repetition and the different kinds of memory. There are three kinds of memory: sensory, short-term, and long-term. Information first arrives in sensory memory, after which it’s either forgotten or moved to short-term memory. Information generally lasts less than a minute in short-term memory, after which it’s either forgotten or moved to long-term memory.
You can extend a piece of information’s stay in short-term memory through repetition, and this is how the Zeigarnik effect works. When you leave a task uncompleted, your brain repeats the knowledge for you, subconsciously keeping the information in your short-term memory. The effort that goes into this repetition causes cognitive tension, which results in anxiety and an inability to focus. Once the task is completed, your brain stops the repetition and releases the cognitive tension. Depending on how important the information is, it may move to long-term memory or be forgotten entirely.
The final memory-impacting fallacies we’ll look at are the primacy and recency effects. According to Dobelli, the primacy effect means that the first piece of information you’re introduced to is easier to remember than information introduced later. This is because your brain latches onto that first information and holds it in your short-term memory for longer than usual. This extended stay in short-term memory allows the information to move to long-term memory.
While the primacy effect can extend information’s stay in your short-term memory, it can’t keep it there forever, Dobelli explains. Humans have small short-term memories, so when a new piece of information enters, an older piece of information has to leave. When enough time has passed that the primacy effect stops working and the first piece of information you hear leaves your short-term memory, the recency effect takes over. This means that whatever information you heard most recently is easier to remember. This effect operates entirely in your short-term memory.
(Shortform note: As long as the information is available in your short-term memory, it's easier to recall. However, because short-term memory generally holds information for less than a minute, the recency effect is only effective for brief periods of time.)
How do you overcome these effects? Dobelli suggests paying close attention to each individual piece of information you learn so that information you gain in the middle of a situation doesn't get crowded out of your memory by the first or last things you learn.
(Shortform note: While it’s a good idea to lessen the primacy and recency effects, you can also manipulate them to your benefit. When trying to memorize something, focus on the most important information first to trigger the primacy effect and transfer the information to your long-term memory. Second, review the information shortly before you need it to trigger the recency effect.)
How the Primacy and Recency Effects Work
There are two theories of how the primacy and recency effects work. As noted, Dobelli subscribes to the first of these theories, which states that the operation of short- and long-term memory causes the effects.
In the second theory, the effects rely on the frontal lobe. The frontal lobe is the section of your brain directly behind your forehead, and it's responsible for your focus, as well as forming and retrieving memories. These functions are linked: Memories form around things you focus on.
According to this theory, the primacy and recency effects occur because you’re most likely to pay attention during the beginning or end of experiences, thus activating your frontal lobe and forming memories. Meanwhile, you lose focus during the middle of an experience, which means the frontal lobe is not activated and memories don’t form.
The Zeigarnik effect impacts how your brain handles tasks, Dobelli says. Before a task is completed, it stays in the forefront of your mind, often causing stress. Once you’ve completed the task, though, you forget it. You can manipulate this tendency to clear your mind of pressing tasks and alleviate stress.
Make a list of your three most pressing unfinished tasks that are clogging your mind and causing you stress.
Next, make a plan to deal with each task on the list. Be specific, with actionable steps. (Break complicated steps down further. For example, if your task is “Clean the house,” be more specific with your steps than “Clean the living room.” Instead, you might write, “Vacuum the carpet, clean the windows, and sweep.”)
Finally, reflect on your mindset: Are you feeling more at peace, less at peace, or the same since making your lists? Why? (If you’re not feeling at peace, try adding more concrete and specific steps to your list and then come back to this question.)
The next set of fallacies we’ll cover revolves around probability and predictions. Dobelli reminds us that people hate uncertainty, and he says that they try to predict future events to reduce that uncertainty. However, to make accurate predictions, you must understand probability, which humans struggle with.
Thus, people’s predictions are usually inaccurate, even those from experts who make predictions for a living. Yet, despite their inaccuracy, people still rely on predictions to make important decisions, leading to logical errors and financial loss.
(Shortform note: As Dobelli notes, even though humans struggle with probability and predictions are unreliable, people still make a living estimating probability and making predictions. This is authority bias: Because the person is seemingly an expert at making predictions, you assume they know what they're doing. However, no matter how knowledgeable the person, they’ll still struggle to process the information needed to make accurate predictions.)
In this chapter, we’ll look at the following reasons people misunderstand probability and make inaccurate predictions:
The first fallacy we’ll cover is neglect of probability. According to Dobelli, people struggle to make good decisions because they neglect to consider the probability or risk involved in those decisions. Logically, they should choose the option with the highest probability of going well for them and the lowest risk of going badly. However, people instead choose the option that will have the biggest positive impact on them if it occurs, regardless of how likely it is to occur. For example, Dobelli argues that people are more likely to invest in a risky stock that could have a huge return on investment than a safe stock that has lower returns.
(Shortform note: Dobelli claims that investors frequently invest solely based on the possible yield of an investment, rather than its risk, and uses this example to prove the validity of his claims regarding neglect of probability. However, this example isn’t accurate: Investors use various risk management strategies when investing, including analyzing the historical value of the stock to determine the risk of its value changing in the future.)
Denominator Neglect
A type of neglect that Dobelli doesn’t cover is denominator neglect, which Daniel Kahneman describes in Thinking, Fast and Slow. If probability is a fraction, with the situation you’re evaluating on top and the total number of possibilities on bottom, people base their judgment of risk solely on the top number, or numerator, and ignore the bottom number, or denominator. This means they regularly misinterpret probability, since the numerator depends on the denominator.
For example, if you only considered the numerators when comparing a 1/100 chance of injury and a 100/10,000 chance of injury, you’d misinterpret the second scenario as much more likely to occur, even though the probability of injury is actually the same.
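The equivalence in this example can be made explicit with a short sketch using Python’s standard `fractions` module (the numbers are the ones from the example above):

```python
# Denominator neglect: judging risk by the numerator alone makes
# 100/10,000 look far riskier than 1/100, yet they're identical.
from fractions import Fraction

risk_a = Fraction(1, 100)       # 1 injury per 100 people
risk_b = Fraction(100, 10_000)  # 100 injuries per 10,000 people

print(risk_a == risk_b)  # prints True: both reduce to a 1% chance
```

Comparing probabilities as reduced fractions (or as percentages) forces the denominator back into the judgment.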
Another probability-related fallacy is hindsight bias. Dobelli says hindsight bias makes unpredictable past events seem like they should have been easily predictable: People see an obvious pattern of circumstances that led to a past event occurring, and they think people should have noticed that pattern and predicted the event. At the time, though, the pattern wasn’t clear, so people couldn’t use it to predict the event. It’s only with hindsight that the pattern becomes clear.
For example, take the sinking of the Titanic. Looking back, there’s a pattern leading to the disaster, including weak construction and a shortage of lifesaving equipment. At the time, however, the pattern wasn’t clear. The construction seemed standard because no one anticipated extreme collisions, and the ship lacked equipment because the officer with the key to the storage room transferred off the ship. On top of these human influences, bad weather made conditions more dangerous. Thus, chance played a major role in the disaster. The “obvious” pattern of events people point to today is an example of hindsight bias.
Hindsight bias encourages overconfidence, Dobelli says. You think you’re good at detecting patterns when really, you’re not: You’re only seeing them because of hindsight. When you try to apply these pattern-spotting “skills” to predicting the future, you fail. (Shortform note: This overconfidence is dangerous when your inaccurate predictions affect your livelihood. For example, you might think you see successful patterns and invest in a company that fails. Because you were overconfident, you didn’t notice any warning signs that the investment wouldn’t work out.)
To avoid overconfidence, Dobelli recommends journaling your expectations and comparing them to reality. Because your expectations will often be incorrect, having a record of them forces you to accept your own inaccuracy. (Shortform note: Journaling also forces you to accept uncertainty by making you realize that you can’t really predict things. The more accepting you are of uncertainty, the less hindsight bias affects you.)
Hindsight Bias and Memory
Dobelli credits hindsight bias to people’s pattern-finding tendencies, but memory plays an important role in this fallacy as well. Hindsight bias changes your memories, making you think you always knew what was going to happen.
Your brain can alter your memories because of how it handles predictions: When shown two possibilities, your brain creates reasons why both are possible. Once Possibility A occurs, your brain doesn’t need to remember Possibility B. It forgets that information, making you believe Possibility A was obvious all along.
This altered memory also creates overconfidence. You forget any prior uncertainty or incorrect predictions, which reinforces your overconfidence about your pattern-finding and prediction abilities.
In this chapter, we’ll cover fallacies that affect how you value things. According to Dobelli, humans tend to value a person, situation, or item for arbitrary and illogical reasons. We’ll look at the following arbitrary standards of value:
The endowment effect is the first illogical shift in your valuation of an item. When you own an item, you subconsciously increase its value simply because it's yours, Dobelli explains. (Shortform note: Dobelli doesn’t explain why the endowment effect occurs, but others argue that it stems from loss aversion, discussed in Chapter 5. Once something is in your possession, you fear losing it, which makes you value the item more.)
The endowment effect makes buying and selling things difficult, Dobelli adds, because you inflate the value of an item to be higher than others value it. You’ll even turn down generous offers for the item because you’ve mistakenly increased its value so much.
The endowment effect also influences people who don’t own an item yet, Dobelli says. Even the anticipation of owning an item can make you subconsciously increase its value, especially if you’re in competition for the item. (Shortform note: This tendency is called “virtual ownership.” It’s manipulated through advertisements, which let you imagine ownership of an item by showing you what it’s like to own it, and trial periods, which let you own an item for a limited time.)
How can you overcome the endowment effect? Dobelli suggests thinking of your possessions as things you only have temporarily. This lessens your sentimental attachment to the items.
Overcoming the Endowment Effect: Think Like a Vendor?
While Dobelli believes the endowment effect influences anyone who owns an item, others argue that it only activates when you plan to use the item. If you have an item for the sole purpose of exchanging it, the endowment effect doesn’t apply. This is why vendors are immune to the endowment effect and can sell goods at fair prices.
Since accurately valuing and selling items is difficult only when they’re personal items, if you avoid attachment by viewing your belongings like a vendor views their stock, you’ll have a more accurate idea of their value. However, avoiding connection to items in this way may harm your well-being. Valued belongings become an extension of your identity and a way to express your personality, and preventing those connections from forming can make you feel stifled and unable to be yourself.
Liking bias affects how you value people specifically. The more you like someone, the more value you place on their opinions and desires, Dobelli says. This means you’re more likely to do something for an individual you like.
This bias is frequently exploited in marketing. According to Dobelli, there are three main components to liking someone that are manipulated in marketing:
1. Physical attractiveness. You’re more likely to see attractive people as likable, which is why only attractive people appear in advertisements. (Shortform note: Attractiveness is subconsciously associated with good traits like kindness and generosity, which is why you’re more prone to like attractive people. However, the definition of “attractive” changes over time, meaning this subconscious standard of goodness—and the advertisements that take advantage of it—change just as quickly.)
2. Similarity. You find people more likable when they look like you, sound like you, or share your experiences. Marketers manipulate this through “mirroring”: purposefully mimicking other people’s behavior and speech patterns so that the other person will like them and be more likely to purchase their products. (Shortform note: Mirroring might be manipulated by marketers, but it’s also used by minorities. For instance, black people often feel pressured to mirror their white counterparts to avoid prejudice. Studies show that minorities who change the way they speak, how they wear their hair, and sometimes even their names to fit in with their white counterparts are viewed as more professional and likable.)
3. They already like you. If someone is being friendly and makes it clear that they like you, you’re more likely to like them back. Marketers manipulate this tendency through flattery. (Shortform note: It’s important to recognize when you’re being flattered because flattery’s goal is to control your behavior, not give you an honest compliment. You can identify flattery by paying attention to the circumstances in which someone compliments you. Do they only compliment you when they want something or when they know you can hear them? If so, they’re probably using flattery. Lessen flattery’s control over you by weighing the sincerity of someone’s compliments rather than how they make you feel.)
Dobelli suggests avoiding liking bias by separating the seller from the product. Would you purchase the product if the person selling it to you wasn’t there, or if they were unlikeable?
Liking Bias and Alliances
While Dobelli explains how people manipulate liking bias, he doesn’t say why liking someone makes you value them more. Some experts say it’s because when you like someone, you form an alliance with them. Having a common goal (friendship) unites you and the other person, making you more likely to value them and, as Dobelli says, fulfill their desires.
The three methods Dobelli discusses all manipulate liking bias with the goal of forming alliances: Physical attractiveness is associated with goodness, as discussed above, and you want to form alliances with attractive, “good” people. Similarity takes advantage of you already being in one alliance with the other person—whether preferring the same sports team, being a member of the same race, or coming from the same city—to encourage you to form a different alliance with them. Flattery emphasizes that the other person wants to be in your group, rather than making you join theirs, giving you a self-esteem boost and making you more likely to agree to the alliance.
Finally, Dobelli’s advice to avoid liking bias also operates on the alliance principle: Separating the seller from the product breaks the alliance.
Another error in thinking that affects how you value things is the sunk cost fallacy. According to Dobelli, the more time, effort, or resources you invest in something, the higher you value that thing. You'll also be more resistant to parting with it, even if keeping it means losing more time, effort, or resources in the future. (Shortform note: The sunk cost fallacy stems from a fear of waste: Most people try not to waste time, money, or effort, and letting go of something you’ve invested resources in feels like wasting those resources. While this is technically true—it is a waste of time, money, or effort—continuing to invest resources only creates more waste.)
People fall into the sunk cost fallacy because they don’t like admitting when they’re wrong, Dobelli adds. When they’ve invested resources into something that starts failing, people hold tighter to that thing instead of admitting their mistake, hoping the situation will improve and justify their decision.
(Shortform note: People want to justify their decisions because they’re afraid their mistake could define their life. However, the opposite is true: If you accept your mistake, cut your losses, and move on, your mistake will be forgotten as you move forward and make better decisions. On the other hand, clinging to a mistake lets it define you.)
Dobelli suggests overcoming the sunk cost fallacy by focusing on whether something is serving you in the present and will continue to do so in the future, rather than on what you’ve invested in the past. (Shortform note: This doesn’t mean ignoring the past entirely: It specifically refers to ignoring past investment. In other words, consider the past to make good decisions about the future based on all the data you’ve collected, but don’t let past effort stop you from moving on.)
The contrast effect is another fallacy that changes how you value things. Humans aren’t good at objective judgments, Dobelli says, so when something is presented as the better of two options, you see it as more valuable than if it's presented alone.
For example, say a salesperson shows you two TVs: one that’s $1,000 over your budget and another that’s $500 over. The cheaper TV isn’t a good deal on its own because it’s still over your budget, but it looks like a good deal in comparison to the more expensive option. Thus, instead of looking for a TV within your budget, you’ll probably accept the one that’s $500 over.
The Contrast Effect as a Heuristic
Why do people fall for the contrast effect? Dobelli doesn’t say, but others argue that it’s because the contrast effect is a heuristic: It helps you make judgments faster while expending less energy. Your brain lessens energy usage whenever possible, so it prioritizes swift judgments between two options, even if those options aren’t the best available.
In fact, your brain prioritizes swift comparison so much that when choosing between two items that are similar and a third that’s different, your brain dismisses the third option because it’s not comparable. Salespeople can manipulate this tendency, as well. Returning to our example, if the salesperson wants you to purchase a flatscreen but you’re considering a projector as an alternative, they’ll show you Flatscreen A, Flatscreen B, and Projector. Because Flatscreen A and B are similar and thus comparable, your brain dismisses Projector as an option, guiding you to the salesperson’s goal.
The sunk cost fallacy means that the more you invest in something, the more you value that thing. This means you’ll hold onto things you’ve heavily invested in beyond their usefulness. You can overcome the sunk cost fallacy by focusing on the future benefits or consequences keeping that activity or item would bring.
Describe something you’ve been investing in. (It could be a hobby, a business, stocks, or anything that takes time, money, or effort.)
Next, describe the potential benefits and consequences of keeping that activity or item. How likely is it that those benefits and consequences will occur? (Be realistic: The goal is to get an objective view of the situation.)
Now, describe the potential benefits and consequences of discarding that activity or item and their likelihood of occurring.
Finally, weigh the potential benefits and consequences of each option. Does it make more sense to keep or discard the activity or item? Explain your reasoning.
Our next set of fallacies revolves around situations in which you have too much of a good thing. Often, you make bad decisions because you don’t have enough options, information, or experience to make logical ones. However, too many options or too much information or experience can also inspire irrational decisions, Dobelli says.
In this chapter, we’ll look at three situations where excess causes irrationality: having too many options, having too much information, and having too much experience.
The first situation we’ll cover is having too many options. Most people think that having more options to choose from is better than having fewer, Dobelli says. However, having too many options can be just as bad as not having enough for three reasons:
1. Having too many options paralyzes you. A wide range of options makes you so afraid of choosing the wrong one that you avoid any decision, Dobelli says. (Shortform note: Some people tie this paralysis to loss aversion, discussed in Chapter 5. The more options available, the more options you lose by choosing. You fear losing the other options more than you want to gain a single option by choosing, so you don’t choose.)
2. Having too many options lowers your standards. Lower standards make your decision easier by lessening the analysis needed for each choice, Dobelli adds. For example, if you’re overwhelmed by options for a new dishwasher, you won’t look for an affordable, quiet model that’s the right size with a good warranty. Instead, you’ll pick whichever one is the right size, regardless of other features.
(Shortform note: You make your choice easier by lowering your standards because you’re suffering from decision fatigue. Analyzing your options drains your willpower, leaving you without any willpower to make the final decision. Therefore, to make a decision, you dismiss your standards, focusing your remaining willpower on a single feature.)
3. Having too many options inspires uncertainty. After making your decision, you’ll be unhappy because you’ll never be sure it was the right choice, Dobelli says. With so many other options available, how do you know a different dishwasher wasn’t the better choice after all?
(Shortform note: This uncertainty stems from buyer’s remorse. You narrow your choice criteria, as discussed above, but you don’t forget about all the other options and features you’re sacrificing. You ignore those options to make your choice easier, but once you’ve made your decision, the knowledge of what you ignored returns, inspiring remorse and uncertainty.)
How can you avoid being overwhelmed by too many options? Dobelli suggests writing down the qualities that are important to you before evaluating your options. This stops you from becoming overwhelmed because it lets you immediately dismiss options that don’t fit your criteria.
(Shortform note: When writing down the qualities you want, be realistic about what you can actually gain. For example, if you can’t afford the dishwasher that meets all your criteria, then you can’t gain that dishwasher and need to broaden your scope. This mindset also applies when you’re making life-changing decisions. While “You can do anything” is a nice sentiment, it’s not actually true: What you can do is based on your past experience and skills. Thus, objectively evaluate your experience, skills, and options to focus on the options you can actually achieve. Doing so will ease decision anxiety and make you happier.)
People also assume that having more information is better than having less, but an excess of information causes problems, too. Dobelli says it does so in two ways: by burying basic facts and by wasting time. When you have too much information, much of it is irrelevant to your situation. This irrelevant information makes it harder to access the useful, basic facts about your situation, and you then have to waste time unearthing those facts.
For example, consider an expert teaching a freshman physics class. The expert has so much information on atomic physics that he struggles to simplify his course and focus on the information the freshmen need to know: The basic facts are buried by the teacher’s knowledge. In addition, the teacher wastes time discussing information beyond the freshmen’s level. The students have too much complex information and not enough time to understand the simpler information they actually need.
Information Overload in the Modern World
The excess in information that Dobelli discusses is also called “information overload.” When experiencing information overload, your extensive knowledge is confusing rather than helpful. You can’t sort the information by relevance, so distracting, unimportant information intrudes on your thought processes.
The dangers of information overload that Dobelli discusses have grown increasingly severe in the modern world because there’s so much information available. Approximately 2.5 quintillion bytes of information are put online every day, and social media algorithms and 24-hour news cycles continually expose you to that information. Rather than helping people to stay updated with world events, this inundation of information makes it almost impossible to find the basic facts about certain situations because they’re buried by opinion pieces and analysis. This increases the time you spend trying to understand these situations.
Modern information overload arguably causes even more problems than Dobelli outlines: A lot of the information in the news is negative, and a constant influx of bad news sparks anxiety and depression. This is why many people take “information detoxes” or breaks from social media: When experiencing depressing information overload, ignorance really is bliss.
Finally, having too much experience, which Dobelli calls “déformation professionnelle,” can also cause problems. When you have a lot of experience in a particular field, you form a rubric based on your experience that you follow when identifying and fixing problems. This is an asset when dealing with problems in your field of expertise. However, you apply this rubric when dealing with problems outside your field of expertise, too. This leads you to misinterpret situations or use your knowledge in harmful ways.
For example, soldiers are experts in emotional regulation and efficiency. These rubrics of problem solving are essential to how soldiers think during conflict because they allow the soldiers to act quickly in high-stress situations. However, when they return home, the same rubrics that helped soldiers in the military can damage their relationships. Trying to deal with a crying toddler through emotional regulation and efficiency won’t end well.
To overcome this issue, Dobelli recommends expanding your knowledge to give yourself as many rubrics of behavior as possible. This means you’ll know the best ways to behave in a variety of fields and situations, not just the one you’re an expert in. It's especially important to compensate for any areas your existing rubrics and expertise leave undeveloped. The soldier, for example, might learn more about the benefits of emotions or how to accept people being less efficient.
Over-Experience, Common Sense, and Morality
As discussed, Dobelli calls having too much experience “déformation professionnelle,” a play on “formation professionnelle,” the French phrase for professional training. The idea is that after a certain point, professional training stagnates your growth instead of helping you.
While most people agree with this definition of déformation professionnelle, some take it a step further. They argue that déformation professionnelle erodes your humanity as well as your adaptability. They say the modern focus on professional development means people discard their morals, compassion, and wisdom in favor of advancing their careers.
Even if you subscribe to the second definition, Dobelli’s recommendation to expand your collection of rubrics arguably fits. Forming other rubrics prevents you from focusing solely on professional development, protecting your humanity.
However, not everyone agrees with Dobelli’s recommendation. Some argue that you only need one rubric: common sense. They argue that if you step away from your rubric of expertise and look at your project with common sense, you’ll catch problems that your expertise hid.
That said, making this shift to common sense is very difficult. If you struggle to shift to your common sense rubric, you can also ask a non-expert for help. Chances are, they’ll be able to see a solution since they’re not blinded by expertise.
Now that we’ve looked at the major types of non-evolutionary fallacies, we’ll close with a final set of miscellaneous fallacies: confirmation bias, omission bias, alternative bias, incentive manipulation, and the planning fallacy.
When you suffer from confirmation bias, you retain information that reinforces your underlying beliefs or desired conclusions while ignoring contradicting evidence. Everyone has a tendency toward confirmation bias, Dobelli warns.
(Shortform note: Here, Dobelli references both underlying beliefs and desired conclusions. While confirmation bias can reinforce both, your underlying beliefs ultimately overpower your desires. For example, an anxious individual might desire confidence in their friends, but they believe their friends dislike them. Their confirmation bias takes neutral interactions and uses them as evidence to reinforce this belief, ignoring their desire.)
Some people manipulate others’ natural tendency toward confirmation bias, Dobelli adds. Fortune tellers, for example, give vague statements, trusting that your confirmation bias will latch onto the most fitting interpretation and reinforce your belief. (Shortform note: Avoid this manipulation by evaluating your sources’ reputation and whether they benefit from reinforcing your beliefs—fortune tellers earn money by making you believe their abilities, for example.)
Dobelli recommends avoiding confirmation bias by writing down your beliefs and finding evidence that disproves those beliefs. (Shortform note: Others are more specific, suggesting finding a minimum of four arguments against your position. These arguments must be convincing: Using weak arguments is falling for confirmation bias, not combating it.)
Another non-evolutionary fallacy is omission bias: When both acting and not acting have negative results, you’re prone to not acting. This bias causes problems when acting could at least mitigate the negative results, Dobelli explains. In other words, both Option A (active) and Option P (passive) cause negative result X to occur. Even though taking Option A means X will be less serious, you’ll choose Option P because of omission bias.
For example, if Option A was "act to close a school" and Option P was "passively let the school slowly fail," many would choose Option P. This is illogical, as letting the school fail is a waste of time and money, the school’s educational standards will drop over time, and the students will probably suffer more than if you closed the school. If you took Option A, while the students would temporarily be stressed and their education disrupted, they’d quickly find a new school, likely one with no chance of closing and more consistent educational standards.
Omission Bias and Morality
Why does omission bias occur? Dobelli doesn’t say, but others argue it’s because you feel guilty when your action causes negative results, but less guilty when your inaction causes negative results. Your brain alleviates mental strain by generalizing moral questions: Instead of having to assess the morality of every action and inaction, you classify inaction as better than action, making your decisions easier and alleviating guilt.
However, this heuristic is flawed because sometimes, inaction is worse than action. In fact, several major religions reject the idea that inaction is better than action. For example, Catholicism believes that failing to act necessitates forgiveness, and Buddhism says that sometimes even violent action is morally superior to inaction.
Alternative bias is another non-evolutionary fallacy: People home in on two options—where Option A is “standard” and Option B is terrible—to the exclusion of every other option. This bias can be used to manipulate you into following other people’s plans, Dobelli explains. It makes Option A seem like your only choice, when a different option might actually suit you better.
To counter alternative bias, Dobelli recommends looking at underrepresented options. Often, a more moderate third option may fit your situation better than the extremes of Options A and B.
For example, many people feel pressured to attend college. The situation is presented as Option A: Go to college and get a good job, vs. Option B: Don’t go to college and don’t get a good job. However, there are other options, including Option C: Go to trade school and get a good job, or Option D: Complete an apprenticeship and get a good job. Option A is considered “standard,” but high tuition can make college a hindrance rather than a help. Looking at the underrepresented options can provide a more fulfilling and profitable experience than falling for alternative bias.
Overcoming Alternative Bias Through Perspective
Dobelli presents the solution to alternative bias as needing to expand your horizons and look to less conventional options. However, others argue that looking at more options isn’t enough to combat the bias. You can feel just as trapped choosing between three or four options as two, they point out.
To overcome this bias, instead of merely increasing the number of options available to you, consider both broadening your horizons and changing your perspective to assess each option logically. Here are a few ways to change your perspective:
Consider what you have. For instance, Option A: College assumes you can afford tuition. Challenge those assumptions; see where you stand and how that affects your choice.
Consider the gains and losses. A loss in one option (for instance, trade school doesn’t have great job mobility) may be a gain in the other (college degrees provide greater job mobility).
Consider what’s most important to you. All decisions are somewhat uncertain. Determine what’s important enough that it must be certain, and consider that first when deciding.
Another miscellaneous non-evolutionary fallacy is what Dobelli calls “incentive super-response theory,” which we’ll call incentive manipulation for simplicity. People offer incentives to inspire hard work or better cooperation, Dobelli says. However, they don’t always consider the unintended consequences of incentives: Others will work hard for incentives, but they’ll pursue those incentives in whatever way suits them best, even if their actions contradict the principle behind the incentive. Thus, the people setting the incentives can hurt their own cause if they’re not careful.
For example, a mother tells her son that when he cleans up his toys, he can have a cookie. She wants to encourage responsible behavior and sees the cookie as a reward that will encourage that responsible behavior. However, the son doesn’t care about learning responsibility: He just wants more cookies. Thus, the son manipulates the situation and makes more messes so that he can tidy up more often and earn more cookies, contradicting his mother’s original goal of teaching responsibility. The mother’s incentive didn’t consider the potential for manipulation, so it ultimately harmed her attempts at teaching responsibility.
Forming Good Incentives
How can you avoid manipulation and set effective incentives? Dobelli doesn’t say, but others suggest that effective incentives harness other people’s intrinsic motivation. Having intrinsic motivation means you do things because they’re personally rewarding, in contrast to having extrinsic motivation, which means you do things because of an outside incentive.
Here are some tips for using people’s intrinsic motivation in incentives:
1. Give them tasks they’re passionate about. The extrinsic motivation of an incentive can encourage people to work on tasks they dislike, but this leads to burnout and loses effectiveness over time. On the other hand, the intrinsic motivation of a task someone’s passionate about is fulfilling and long-lasting.
2. Provide positive feedback. Positive feedback encourages people and increases their pride in their tasks, which in turn increases intrinsic motivation.
3. Give them tools to complete their tasks. The more prepared people are for a task, the more confident and willing they are to complete that task, improving intrinsic motivation.
For our example, the mother might make her son more excited about cleaning by turning it into a game, providing plenty of praise, and making sure he knows where everything belongs. If she encourages this intrinsic motivation, she soon won’t have to provide any incentives at all, as her son will clean his toys unprompted.
Finally, we’ll cover the planning fallacy: People make plans that fail because they believe themselves to be more capable than they really are. They overestimate the benefits of their actions and minimize the risk and cost involved in their plans, Dobelli says. (Shortform note: Dobelli’s definition of the planning fallacy is unique because he includes factors like risk and cost in this overconfidence. In contrast, most definitions only include people’s overconfidence in estimating the time required to execute a plan.)
People never get better at realistic planning, Dobelli adds. One of the reasons you don’t improve is self-esteem: You want to feel good about yourself, so you exaggerate your abilities. (Shortform note: Basing self-esteem on your abilities means you’ll exaggerate when you need a boost, but you’ll feel worse when you struggle to meet those exaggerated standards. Instead, boost your self-esteem healthily by challenging self-criticism and forming relationships with positive people.)
Another reason you don’t improve at planning is life’s unpredictability. You can’t account for surprise events like bad weather or car accidents when planning. (Shortform note: Surprise events are called "Black Swans," and they can be positive or negative. While you can’t predict Black Swans, in The Black Swan, Nassim Nicholas Taleb says you can prepare for them by managing risk and maximizing your chances of positive Black Swans. When making an investment, for example, minimize risk for 90% of the investment, then take high risks with the remaining 10% to increase your chance of a positive Black Swan.)
Dobelli recommends mitigating the planning fallacy by estimating risks and costs higher than feels necessary and benefits lower than feels necessary. Also, avoid the impulse to plan in detail. Narrowing your focus on details makes you more likely to be surprised by unexpected events, exacerbating the fallacy. (Shortform note: Dobelli claims detailed planning exacerbates the planning fallacy, but others disagree. People are better at planning for small goals, so breaking a big project into its composite parts makes plans more accurate. In addition, setting intentions like the location and timescale in which to complete goals increases planning accuracy.)