If you think you understand why you make the choices you do, you’re probably wrong. The human mind is far more prone to error than we’d like to believe, which we know thanks to the groundbreaking research by two Israeli psychologists, Daniel Kahneman and Amos Tversky. Their collaboration produced ideas that reshaped the way psychology understands the mind.
The story of Tversky and Kahneman’s lives is as riveting as the truths they uncovered. On the surface, the two men couldn’t have been more different. Kahneman was reserved and given to self-doubt, while Tversky was brash, outgoing, and magnetic. Yet both had a deep interest in the workings of the mind and were insightful, inquisitive thinkers with a knack for seeing flaws in ideas that everyone else took as given. They identified systematic ways the mind fools itself, why we make irrational decisions, and how we reshape reality when the truth is too painful to bear. By making us aware of our mental fallibility, Kahneman and Tversky gave us the tools to sharpen our thinking and avoid the pitfalls of unconscious bias.
The author, Michael Lewis, first rose to popularity with his book Moneyball, about the use of statistical analysis in Major League Baseball. While that book was a hit, Richard H. Thaler and Cass R. Sunstein (co-authors of Nudge) pointed out that the science Lewis referenced came out of the work of Kahneman and Tversky. When Lewis looked into the psychologists in question, he found a whole new story that needed to be told.
The first part of this guide will examine the collaboration of Tversky and Kahneman, with a focus on how their personal experiences led to the paths they took in their research. In the second part, we’ll explore the three major theories formulated from their work: heuristics, prospect theory, and the concept of undoing. In addition, this guide will look at the historical context that shaped Kahneman and Tversky’s lives. We’ll expand on the implications of their work, how it applies to a variety of disciplines, and how their discoveries have been built upon over time.
Over the years, there have been collaborations, such as the Wright brothers or Steve Jobs and Steve Wozniak, whose combined achievements far exceeded what either partner would have managed alone. In psychology, Michael Lewis suggests that the quintessential duo was Daniel Kahneman and Amos Tversky. As individuals, they were unconventional thinkers, but together they amplified each other’s strengths, allowing them to upend behavioral psychology and change the way we understand the foibles of the human mind. In this section, we’ll examine Kahneman’s upbringing and early work, followed by Tversky’s parallel path, the beginning of their collaborative process, and the differences that eventually drove them apart.
According to Lewis, the more reserved of the two was Daniel Kahneman, a Holocaust survivor who, from an early age, developed an interest in human behavior, particularly in our capacity for error. He wondered what his own mental errors could teach him about the human mind. From his childhood in German-occupied France to his education in war-torn Israel, Kahneman would grow from a bright, young student to an insightful researcher with an eye for faulty thinking.
Though born in Tel Aviv in 1934, Kahneman grew up in Paris and was there for the outbreak of World War II. His father, like many others, misjudged how far the Germans would penetrate into France. Later in life, Kahneman would explain this error as a function of the availability heuristic, covered in detail elsewhere in this guide. (Shortform note: This was the cognitive error that set Kahneman on his eventual path. France was not militarily unprepared; rather, the breakdown was one of intelligence: French commanders enacted battle plans devised after World War I, with little flexibility for new situations.)
Kahneman and his family fled to the south, where they hid their existence as Jews. It was there, Lewis says, that young Kahneman became a student of other people’s behavior, which he quietly watched as an outsider. His father died before the war ended, and Kahneman’s family moved with many others to the future state of Israel in 1946.
(Shortform note: In the aftermath of World War II, the Allies repatriated many Holocaust survivors to their homelands, but nearly two million refused due to violent antisemitism. A long-brewing movement to establish a Jewish nation grew in strength and, as Lewis mentions, the United Nations voted to partition Palestine into separate Jewish and Arab states in 1947.)
Lewis writes that Kahneman learned Hebrew quickly, but unlike other students, he didn’t assimilate into Israeli culture. A vocational test pointed him toward psychology, and while service in the Israeli army was mandatory, Kahneman’s grades were high enough that in 1951 he was allowed to go directly to college before having to join the armed forces.
Kahneman went to Hebrew University, the only college accessible to him at the time. Because the school was still piecing together its psychology curriculum, Lewis claims that Kahneman essentially had to educate himself. (Shortform note: Hebrew University first opened to students in 1925. It was forced to suspend classes during the 1948 Arab-Israeli War, but reopened in 1949. Despite the following decades of turmoil, Israel’s university system grew into one of international renown and led Israel to become one of the most educated countries in the world.)
According to Lewis, the leading field of psychology at that time was Behaviorism, the study of mental conditioning through stimuli and rewards. Kahneman, on the other hand, was drawn to Gestalt psychology, which suggested that the mind acted as an interpreter between external stimuli and internal perceptions. The question at the center of Gestalt psychology was how the mind used the various senses to create a comprehensible picture of the world.
(Shortform note: Behaviorism was popularized by Ivan Pavlov and B.F. Skinner, who studied the physical behavior of living things without reference to internal thought processes. Gestalt psychology, introduced by Max Wertheimer, views the mind’s inner workings and our external behavior as individual parts of a larger whole.)
Unlike countries in which psychology was purely academic, Lewis suggests that Israel was interested in its practical applications. Upon graduation in 1954, Kahneman joined the army and, as the military’s only psychologist at the time, was assigned to evaluate new enlistees. He noted that the people interviewing recruits were falling victim to the “halo effect,” in which their first impressions colored all following judgments of a candidate.
(Shortform note: The halo effect is normally associated with positive associations, such as when marketers use celebrities and models in advertising. However, on a darker note, Kahneman points out in Thinking, Fast and Slow that the halo effect also applies to negative judgments. For example, one study reveals that jurors are more lenient to attractive defendants in court cases.)
Kahneman determined that the predictions he and other evaluators made about recruits did not bear out when compared to their actual performance. According to Lewis, Kahneman decided to remove human judgment entirely from the process, developing an evaluative “Kahneman score” based on new recruits’ documented behavior rather than faulty human perceptions. He found that his department’s evaluations improved once “gut feelings” were removed from the equation.
(Shortform note: Not everyone agrees with Kahneman’s disdain for gut feelings. Journalist Malcolm Gladwell has widely touted the benefits of instinctive judgments in his popular book Blink. However, even Gladwell admits that gut decisions can be influenced by personal bias, and that the unconscious mind needs to be trained in order to make quick decisions without error. Because Kahneman’s method for minimizing emotional error was so effective, it has also been used in the job-hiring process—the Israeli army continued to use a version of Kahneman’s scoring system until replacing it with a more inclusive model in 2015.)
In 1961, Kahneman earned his doctorate from UC Berkeley and began his postgraduate research at the University of Michigan. There, Lewis says, he discovered a conflict between the thinking mind and sensory perception. Returning to Hebrew University, he continued his research into human vision by focusing on what it failed to perceive and how it could be tricked.
(Shortform note: The study of optical illusions was especially important to Kahneman because it shows that people can be persistently fooled, even those who know the illusions exist. Research on visual illusions has shown that most illusions occur in the brain, not the eye, while revealing how the brain translates sensory impulses into a cohesive, yet flawed, picture of the world.)
Amos Tversky couldn’t have been more different from Kahneman. Whereas Kahneman was insecure and sensitive to criticism, Lewis relates that Tversky was energetic, outgoing, and self-assured, with an approach to science characterized by seeing problems from wholly novel angles. Between growing up in Israel, attending school in the US, and returning to his homeland, his academic studies were bracketed by stints of wartime military service.
Tversky was born in Haifa in 1937. Lewis says that as a boy, Tversky was an athletic thrill seeker, and though he had a natural gift for mathematics, he felt more strongly drawn toward the humanities. He was also brave; in between high school and college, he volunteered as a paratrooper and fought in the war of 1956. (Shortform note: The war of 1956, also known as the Sinai War, the Suez Crisis, and the Tripartite Aggression, began when Egypt took control of the Suez Canal in response to a lack of British and US economic aid. In an effort to reclaim the canal, Britain and France enlisted Israel’s aid in taking control of the Sinai Peninsula.)
Once he was able to attend Hebrew University, Tversky studied psychology, though according to Lewis, he began to feel that much of its research was unscientific. However, his interest was sparked by a research paper that put forward the process of making decisions as a field for study. Intrigued, in 1961 Tversky enrolled in the Ph.D. program at the University of Michigan, which boasted the world’s premier psychology program. (Shortform note: The University of Michigan’s psychology department grew out of the school’s philosophy program in the early part of the 20th century and is still ranked as one of the top psychology schools in the country.)
Lewis points out that until then, decision-making had been studied by economists who assumed that people made choices rationally. An implication of this belief was the transitive nature of preference. For example, if a person prefers steak to chicken, and chicken to fish, then they clearly prefer steak to fish. (In math terms, if A>B and B>C, then A>C.) However, Tversky’s research showed that preferences can be intransitive: sometimes that same person would choose fish over steak (C>A). (Shortform note: Despite the evidence supporting Tversky’s view that preference is intransitive, there’s still a body of research in favor of the older principle.)
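The intransitivity Tversky observed can be sketched in a few lines of Python. Everything here is invented for illustration (the foods, features, and scores are hypothetical, not from Tversky’s actual experiments): because each choice compares only the feature that happens to be salient in context, pairwise preferences don’t have to chain together.

```python
# Hypothetical illustration (foods, features, and scores are invented):
# each choice compares only the feature that's salient in context, so
# pairwise preferences don't have to chain together transitively.

FEATURE_SCORES = {
    "steak":   {"richness": 9, "lightness": 2},
    "chicken": {"richness": 5, "lightness": 5},
    "fish":    {"richness": 2, "lightness": 9},
}

def prefer(a, b, salient_feature):
    """Return whichever option scores higher on the feature the chooser
    happens to notice in this context."""
    if FEATURE_SCORES[a][salient_feature] >= FEATURE_SCORES[b][salient_feature]:
        return a
    return b

# Steakhouse context, "richness" salient: steak > chicken > fish (A>B>C).
assert prefer("steak", "chicken", "richness") == "steak"
assert prefer("chicken", "fish", "richness") == "chicken"

# Seaside context, "lightness" salient: the chain breaks, fish > steak (C>A).
assert prefer("fish", "steak", "lightness") == "fish"
```

Change which feature is salient and the "same person" reverses their ranking, which is exactly the behavior that strict transitivity forbids.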
Tversky studied under Clyde Coombs, who believed that people evaluating any given choice compare it to an internal idealized image—the perfect steak, or the perfect chicken—and decide based on how closely reality matches their imagined ideal. (Shortform note: Some psychologists believe that similarity judgments based on an idealized image govern our choice of romantic partners. In Getting the Love You Want, Harville Hendrix and Helen LaKelly Hunt argue that people fall in love when they meet someone who closely resembles their imagined ideal of their childhood caregivers.)
Tversky hypothesized that people make judgments based on “features of similarity,” only taking into account the features that they happen to notice. Lewis points out that Tversky’s approach solved the intransitivity problem by asserting that the features people use to make decisions are dependent on context. For instance, the hypothetical person above might prefer steak if they’re drinking red wine, or they might prefer fish when visiting a place known for its seafood. Tversky’s ideas also predict what’s known as the framing effect—the principle that decisions can be manipulated by reframing the context in which they’re made.
(Shortform note: The framing effect has been shown to influence individual judgment and public opinion. While some people view framing as a useful leadership tool, others question whether its use in the political landscape leads to a poorly informed and easily manipulated populace.)
This work began the pattern that Lewis asserts would define Tversky’s career—he would examine widely held scientific beliefs, find chinks in their armor, and take them apart. (Shortform note: Tversky’s 1996 obituary notes his thought-changing impact on the fields of economics, business, philosophy, and medicine, while praising his intellectual rigor and gifts as a storyteller.)
While pursuing his research, Tversky married psychologist Barbara Gans in 1963. He earned his doctorate in 1965 and returned with his wife to teach at Hebrew University, but shortly after, he was called back to duty to command an infantry unit in the war of 1967. (Shortform note: The Six-Day War of 1967 between Israel, Syria, Jordan, and Egypt erupted after years of tension along the countries’ borders and within the West Bank. Israel achieved victory through air superiority and tripled in size from lands captured from its neighbors. The shock of their defeat reshaped the Arab political landscape and paved the way for the rise of Islamism.)
Though their lives had not yet intersected, when Kahneman and Tversky did finally meet, they sparked in each other an intense collaboration that magnified their individual insights. According to Lewis, Kahneman’s understanding of the fallibility of perception, combined with Tversky’s persistent questioning and mathematical rigor, let them demolish the common assumption that human thought is based upon reason.
Though they were colleagues at Hebrew University, their first real interaction took place in 1969 when Kahneman invited Tversky to speak at one of his classes. Tversky discussed research that showed that humans were unconscious statisticians—that guesses people made roughly matched up with mathematical statistical predictions. (Shortform note: The research Tversky referenced related to Bayes’ Theorem, a formula for calculating probabilities under varying degrees of information. In 1968, Ward Edwards proposed that in uncertain situations, people made guesses as if they were unconsciously using Bayes’ Theorem, albeit conservatively.)
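The Bayesian updating referenced in the note above follows the formula P(H|E) = P(E|H)·P(H) / P(E). Here is a minimal Python sketch in the style of the two-urn tasks used in this era of research (the specific numbers are invented for illustration):

```python
# Bayes' Theorem: P(H|E) = P(E|H) * P(H) / P(E).
# Two-urn setup (numbers invented): an urn is either mostly red (70% red
# chips) or mostly blue (30% red), each equally likely before any draws.

def bayes_posterior(prior_h, likelihood_e_given_h, likelihood_e_given_not_h):
    """Posterior probability of hypothesis H after observing evidence E."""
    p_e = (likelihood_e_given_h * prior_h
           + likelihood_e_given_not_h * (1 - prior_h))
    return likelihood_e_given_h * prior_h / p_e

# Drawing a single red chip shifts belief toward the mostly-red urn:
posterior = bayes_posterior(prior_h=0.5,
                            likelihood_e_given_h=0.7,
                            likelihood_e_given_not_h=0.3)
# posterior = (0.7 * 0.5) / (0.7 * 0.5 + 0.3 * 0.5) = 0.7
```

Edwards’s finding was that people shift their beliefs in the right direction but by less than the theorem prescribes, hence the "conservative" qualifier in his proposal.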
Kahneman disagreed. He knew from his research that the senses could be fooled. Why not the mind as well? Lewis recounts that Kahneman argued strongly with Tversky, citing the various cognitive errors he’d seen his own students make. Tversky was troubled by the doubt Kahneman introduced, but instead of becoming antagonistic, the two psychologists chose to join forces and test whether the subconscious uses statistics to make judgments.
To investigate their problem, they devised a questionnaire that would pose an array of statistical questions purposefully designed to trip up the people being tested. Lewis writes that their collaborative style was to lock themselves in an office or classroom to hash out the details of their work. People passing by would often hear a mixture of shouting arguments and raucous laughter as the two worked out the various tricks they would embed into their questionnaire.
(Shortform note: While Kahneman and Tversky’s study may seem unorthodox, their methods weren’t. Questionnaires are a standard research tool in psychology, allowing scientists to collect information from large groups of people. However, they rely heavily on self-reporting and the subjects’ willingness to provide honest answers.)
Instead of inflicting their questionnaire on hapless students, Tversky and Kahneman took it to professional statisticians. The statisticians should have answered the questions correctly, yet they didn’t. What’s more, Lewis claims, the mistakes made by trained statisticians were consistent across the board. Chief among these was what Kahneman and Tversky dubbed “The Belief in the Law of Small Numbers,” a cognitive error in which a person assumes that a small sample of information is representative of a larger whole.
According to Lewis, Tversky and Kahneman’s paper on the subject called into question the bulk of social science research, because when conducting surveys across populations, social scientists vastly underestimated the number of participants needed to provide meaningful data. (Shortform note: In How to Lie With Statistics, Darrell Huff argues that an appropriate sample group is large and random, while those who use small sample sizes can easily distort their data to fit any outcome. In order to avoid the type of error that Kahneman and Tversky identified, researchers today can use a sample size calculator to determine how many survey respondents they need to accurately represent a population.)
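Sample size calculators like the one mentioned above typically implement a standard textbook formula for estimating a proportion. A minimal sketch, assuming a simple random sample and a normal approximation (not the specific tool any researcher uses):

```python
import math

# Standard sample-size formula for estimating a proportion:
#   n = z^2 * p * (1 - p) / e^2
# z: z-score for the confidence level (1.96 for 95%), p: expected
# proportion (0.5 is the most conservative choice), e: margin of error.

def sample_size(z=1.96, p=0.5, margin_of_error=0.05):
    return math.ceil(z ** 2 * p * (1 - p) / margin_of_error ** 2)

n_loose = sample_size()                      # 385 respondents
n_tight = sample_size(margin_of_error=0.01)  # 9604 respondents
```

With the conservative choice p = 0.5, a 95% confidence level and a 5% margin of error call for 385 respondents; tightening the margin to 1% pushes the requirement to 9,604. Intuitions calibrated on far smaller samples are exactly the "law of small numbers" error.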
Meanwhile, says Lewis, a body of evidence was growing that even expert opinions were marred by faulty thinking. Scientists at the Oregon Research Institute began to study how experts actually drew their conclusions, as opposed to how they claimed they did. In 1970, Tversky went there to further his work, while Kahneman continued his research at home in Israel.
The Oregon Research Institute
The Oregon Research Institute was founded in 1960 by Paul J. Hoffman to collect information on human behavior for use in education and mental health research. It works independently of any governing academic body, though it’s primarily funded by the National Institutes of Health and the US Department of Education.
Working at the Institute, researcher Lewis Goldberg led the above-mentioned charge to establish that so-called “expert judgments” were dubious at best. He published a report in 1968 that called into question the accuracy of clinical psychologists’ judgments regarding their patients, showing that experts inflate the accuracy of their predictions and that their accuracy doesn’t improve even in the presence of additional information.
In America and Israel, the pair of psychologists presented students and children with a battery of oddball statistical problems. Lewis explains that they expected their students to get the answers wrong; what they wanted to study was how they got them wrong. After conducting these studies, Kahneman and Tversky reconvened in Oregon in 1971 to sort their data.
They found that the mind does not calculate statistics. Instead, it applies rules of thumb that Kahneman and Tversky dubbed “heuristics”—mental models that break down in problems that contain random elements. (Shortform note: Kahneman and Tversky discovered several distinct heuristics, including the “representativeness” heuristic (described in a paper on subjective probability) and the “availability” heuristic (defined in a paper on judging frequency and probability). We will go into more detail on these heuristics in the second half of this guide.)
Lewis asserts that by studying the ways in which the mind fails, Tversky and Kahneman revealed how it works. They discovered that people, even experts in their fields, rely heavily on stereotypes and ignore statistics. (Shortform note: The way we let stereotypes shape our judgment is particularly pernicious when it comes to the subject of race. In Biased, Jennifer Eberhardt identifies unconscious racial stereotyping as a pathway to confirmation bias, in which the mind only accepts data that conforms to a predetermined narrative.)
Given how important their discoveries were, Kahneman and Tversky wanted to reach a wider audience than the limited circles of academic psychology. Lewis writes that Tversky in particular gave talks to other academic groups, to make experts more aware of their own cognitive biases. (Shortform note: Reaching from one academic discipline to another is still considered unusual. The National Academy of Sciences reports that collaborations between economists and psychologists are rare because both fields have different methods and objectives. However, it’s becoming more commonplace—interdisciplinary research is a growing field, with psychology acting as a hub for other sciences that focus on specific aspects of human behavior.)
By 1973, Kahneman and Tversky were ready to summarize their findings, but once again, war intervened in their lives. The ensuing conflict gave fresh urgency to the problem of uncovering the foibles of the mind. Lewis writes that in the years following the 1973 Yom Kippur War, Kahneman and Tversky were particularly troubled by their government’s failure to predict the attack. They shifted their focus to examine the decision-making process and gave birth to a theory that overturned our understanding of human behavior in general.
(Shortform note: While much has been written in hindsight about failures in military intelligence, the problem of accurate forecasting remains an issue to this day. In Superforecasting, Philip Tetlock and Dan Gardner emphasize that accurate prediction hinges on avoiding cognitive bias, as well as focusing on statistical probabilities and taking a wide perspective of events. However, in Fooled by Randomness, Nassim Nicholas Taleb argues that not only do chaotic situations defy prediction, but that our very predictions themselves change the future.)
The Yom Kippur War began when Syria and Egypt launched a joint attack on October 6, 1973. When the fighting began, Tversky and Kahneman were assigned to the psychology unit embedded in the Israeli army. Lewis states that their primary job was to find ways to improve troop morale. What they in fact accomplished was giving the troops a chance to open up about their experiences. What they heard was so heartbreaking that it reaffirmed the psychologists’ desire to find practical applications for their work, specifically to prevent another unexpected war.
(Shortform note: The employment of psychologists for military purposes began during the two World Wars, and not only to help soldiers deal with trauma. Military psychologists are often called upon to evaluate recruits and officer candidates, as well as to provide mental health interventions for soldiers operating under great stress.)
In order to stave off a similar surprise attack in the future, Kahneman and Tversky tried to convince the country’s policy makers to calculate probabilities and play the odds. However, when they tried to teach the government about statistical analysis, they learned that generals and leaders always favor their instincts, even after their instincts have been proven wrong. Lewis says this led Tversky and Kahneman to conclude that people need stories, not numbers. Realizing that providing leaders with information wasn’t sufficient, Kahneman and Tversky decided to examine the decision-making process itself.
(Shortform note: The idea that people value stories over data still holds true today. A 2015 report by PricewaterhouseCoopers shows that business executives place data third on the list of factors that go into decision-making, after their own intuition and the experience of others. This degree of self-confidence flies in the face of the long list of dangers that come from overvaluing intuition.)
Lewis writes that prior studies of decision-making revolved around hypothetical gambles, such as whether people would prefer an 80% chance of winning $100 or a certain gain of $80. The reigning explanation for how people made decisions was expected utility theory—the idea that people are rational actors who calculate the utility of a situation to make the optimal decision. The theory traces back to Swiss mathematician Daniel Bernoulli, who proposed that people weigh the psychological value of outcomes, not just their monetary value. Tversky and Kahneman were of the opinion that decision-making was far more chaotic.
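The arithmetic of that gamble is worth making explicit. In raw expected value the two options are identical; expected utility theory explains the common preference for the sure thing by averaging utilities rather than dollars. The square-root utility function below is a standard textbook illustration of a concave utility curve, not one any of these researchers specifically used:

```python
import math

# The classic gamble: an 80% chance of $100 versus a certain $80.
def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs."""
    return sum(p * x for p, x in outcomes)

gamble  = expected_value([(0.8, 100), (0.2, 0)])  # 80.0
certain = expected_value([(1.0, 80)])             # 80.0

# Expected utility theory applies a concave utility function before
# averaging, so the certain $80 wins even though the expected values tie.
# Illustrative utility: u(x) = sqrt(x).
eu_gamble  = 0.8 * math.sqrt(100)  # 8.0
eu_certain = math.sqrt(80)         # ~8.94
```

Because the utility curve flattens as amounts grow, the certain $80 delivers more expected utility than the risky $100, matching the risk-averse choices most people actually make in the gains domain.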
(Shortform note: In Decisive, authors Chip and Dan Heath highlight several mental fallacies that make decision-making more chaotic, namely binary thinking, confirmation bias, clinging to a status quo, and overconfidence. In Nudge, Richard Thaler and Cass Sunstein suggest that those very fallacies can be deliberately harnessed to guide people toward better decisions, an application built on the foundation Kahneman and Tversky laid.)
In order to build a more accurate model of human decisions, Tversky and Kahneman decided that emotions must be taken into account, not just probabilities and economic value. Lewis says that in working the problem, the emotion Kahneman latched onto first was regret, or more specifically, that anticipation of regret guides many decisions. Bernoulli’s risk aversion, then, can be seen as the price of avoiding regret. However, they still couldn’t understand what prompted people to make risky decisions.
(Shortform note: While Kahneman would shortly abandon his theory that regret is the primary motivator in making decisions, regret’s psychological power continues to be a subject of much study. In The Paradox of Choice, Barry Schwartz goes into detail on the psychology of regret and the negative impact it can have on our lives. However, he also describes regret as a useful emotion that forces us to take decisions seriously and reduce the harm of our past mistakes.)
A flaw Tversky and Kahneman noticed in economics research was that the decision-making gambles in expected utility theory were always framed as a choice between two gains. Lewis writes that the breakthrough came when Kahneman and Tversky turned the problem on its head and framed experiments with gambles involving loss. What they found was that people are much more open to taking risks if it means avoiding loss instead of making gains.
(Shortform note: Kahneman and Tversky approached the subject of risk from a behavioral and economics perspective, but risk-taking also involves neurological factors. The surges of adrenaline and dopamine produced while engaging in risky activities can become addictive, and some people resort to high-risk activities as a response to emotional pain and depression.)
In 1975 they worked together on creating “risk-value theory,” also known as “prospect theory,” which balanced human decisions based on gains and losses and laid bare the patterns of seemingly irrational human behavior. At the time, economics theory asserted that the irrational errors people made were random and would be corrected by the market. Prospect theory, however, showed that humans are irrational in consistent, systematic ways, and that irrationality is built into the economic market itself. In other words, according to Lewis, prospect theory was the iceberg that could sink all of the prevailing economic thought. We’ll discuss prospect theory at length later in the guide.
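The shape of prospect theory can be sketched numerically. The value function below uses the functional form and median parameter estimates (alpha = beta = 0.88, lambda = 2.25) from Tversky and Kahneman’s later 1992 work on cumulative prospect theory, so treat it as an illustrative descendant of the theory described here rather than the original formulation:

```python
# Prospect theory's value function, using the functional form and median
# parameter estimates (alpha = beta = 0.88, lambda = 2.25) from Tversky
# and Kahneman's 1992 cumulative prospect theory paper. Treat these
# numbers as illustrative, not as the original mid-1970s model.

def prospect_value(x, alpha=0.88, beta=0.88, loss_aversion=2.25):
    """Subjective value of a gain or loss x relative to a reference point of 0."""
    if x >= 0:
        return x ** alpha                     # concave: diminishing gains
    return -loss_aversion * ((-x) ** beta)    # steeper: losses loom larger

# A $100 loss hurts about 2.25 times as much as a $100 gain pleases.
ratio = abs(prospect_value(-100)) / prospect_value(100)  # 2.25
```

The loss_aversion parameter captures the systematic irrationality described above: because losses loom larger than equivalent gains, people accept risks to avoid losses that they would never accept to chase the same-sized gains.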
(Shortform note: Decades after prospect theory’s initial publication, it is still widely held as the best predictive tool for explaining how people evaluate economic risk. It has also been applied to problems in political science, organizational management, marketing, and even travel.)
While Kahneman and Tversky were knocking down the pillars of psychology and economics, they were also undergoing upheavals themselves. Lewis claims that their newfound stardom in the academic world, coupled with changes in their personal lives, put their partnership on tenuous ground that affected them both in different ways. Their relationship shifted through the 1970s and ’80s such that Tversky received more and more credit, while Kahneman began to feel resentful of his partner’s antagonistic style, until he felt they could no longer work together.
The first major change to affect their partnership came when Kahneman divorced his wife and left Israel to strike up a relationship with a colleague, Anne Treisman. When Tversky left Israel to follow Kahneman, American universities scrambled to hire him. Lewis states that many in academia gave Tversky more credit for their work, mainly because he was more outgoing and his contributions were more grounded in math. In 1978, Kahneman settled at the University of British Columbia, while Tversky took a posting at Stanford. Their partnership would now be conducted long-distance.
(Shortform note: The University of British Columbia is a well-regarded research institution, but Stanford University is considered more prestigious, counting Nobel laureates and billionaires among its faculty and alumni. While the two psychologists researched and published their work jointly, Tversky was often given singular credit, such as in this 1998 article which implies that Kahneman was merely Tversky’s coauthor. However, it’s understandable that people were drawn to the more extroverted Tversky—in Quiet, Susan Cain acknowledges that society itself values the qualities of extroverts more than those of introverts.)
Following prospect theory, Lewis says that Kahneman turned his attention back to the power of regret, but this time as it connects to grieving events in the past. He’d observed that people experiencing grief try to “undo” the pain of a tragedy by dwelling on what might have happened to prevent it (hence the term Undoing; we explore this concept further in the next part of the guide).
(Shortform note: The most common understanding of grief is based on the five stages identified by Elisabeth Kübler-Ross in On Death and Dying—denial, anger, bargaining, depression, and acceptance. However, the “five stages” concept has been criticized as unscientific. To be fair, in her book On Grief and Grieving, Kübler-Ross herself points out that she never intended for the five stages to be interpreted as a literal, linear progression.)
Because of their geographic separation, Kahneman developed most of his work on Undoing without being able to bounce ideas off his partner. Lewis notes that Tversky contributed some material to the project, but he spent much of his time giving lectures abroad. Tversky, for his part, thought they were still a team, though over the next few years he began to receive awards and recognition, while Kahneman was largely left out.
Lewis makes it clear that Tversky rebuked anyone who painted him as the more important of the pair. Nevertheless, he was the one whom other academics came to for insights on how prospect theory related to their work. Kahneman was aware of Tversky’s spotlight and recognized that he’d grown envious of the recognition his partner was receiving. (Shortform note: In academia, where reputation is valuable currency, it matters very much who’s given credit for an idea. However, reacting with anger and resentment is often counterproductive. Reclaiming credit when it’s unfairly given elsewhere requires an active effort to assert ownership of your contributions, which unfortunately ran against Kahneman’s style.)
Throughout the ’80s, they continued to publish papers jointly, though each was written mostly by one and not the other. Their work also began to be attacked by academics whose research they’d thrown into question. Some challenged prospect theory in particular, while others rejected Kahneman and Tversky’s basic premise that human beings are fundamentally irrational.
(Shortform note: Present-day opponents of prospect theory criticize it for being too mathematical, relying on restrictive assumptions, and not taking neurological processes into account. Those who dispute Tversky and Kahneman’s findings on heuristics claim that they overstate their evidence, misrepresent the complexity of intuition, and unfairly malign the benefits of heuristics.)
Lewis writes that Kahneman’s impulse was to avoid conflict, whereas Tversky wanted to go on the warpath. He pushed Kahneman to coauthor a paper that would show conclusively that the mind ignored logic. Their paper was published in 1983, but the process of writing it made Kahneman miserable; it reflected Tversky’s antagonism toward their critics more than Kahneman’s level-headed approach.
(Shortform note: In addition to personality differences, another major source of contention between research psychologists arises from the problem that many studies cannot be reproduced, to the point that researchers have hotly debated whether a study on the reproducibility problem itself is reproducible. While some claim that psychological research is inherently flawed to begin with, recent studies have continued to confirm Tversky and Kahneman’s findings.)
In 1986, Kahneman took a position at UC Berkeley, only 40 miles from where Tversky worked at Stanford, but being in such proximity made Kahneman so unhappy that in 1992 he left California for Princeton in New Jersey. Lewis says that Kahneman considered their partnership over, but Tversky reached out in 1993 to cajole him into writing one more paper responding to a critic, Gerd Gigerenzer. The process of crafting their reply was so contentious that for Kahneman it not only ended their collaboration, but it also marked the close of their friendship.
He told Tversky as much, and almost immediately afterward, Tversky was diagnosed with terminal cancer. Lewis recounts that Kahneman was one of the first people Tversky reached out to, and though Tversky faced his death with peace and stoic calm, his mortality reminded Kahneman of the value of their friendship. Tversky and Kahneman spoke nearly every day until Tversky passed away in June 1996.
The Marriage of True Minds
While many people found Kahneman and Tversky’s intellectual marriage surprising, given their different personalities, their intense relationship and subsequent breakup wouldn’t seem strange to experts on marriage. In Getting the Love You Want, Harville Hendrix and Helen Hunt explain that people are attracted to partners who fill in the missing parts of themselves, but that over time, the traits that connect them become the same ones that drive them apart.
Hendrix and Hunt contend that to manage these differences and maintain a strong bond, partners should engage in a structured dialogue that forces them to communicate rationally. This involves mirroring the other person’s statements, validating their point of view, and responding with empathy.
Lewis writes that after Tversky’s death, Kahneman finally began to receive the attention and acclaim he’d been denied, culminating in a Nobel Prize in Economics in 2002 and the publication of his seminal work, Thinking, Fast and Slow, in 2011. (Shortform note: In honor of Tversky, Stanford University held a multidisciplinary symposium to highlight his legacy. Kahneman moved on to study hedonic psychology, the science of pleasure and suffering. In addition to his Nobel Prize, in 2013 he was awarded the Presidential Medal of Freedom.)
While describing the history of Kahneman and Tversky’s collaboration, Lewis shows that between 1969 and 1979 they overturned our understanding of how we make decisions not once, but three times. The lessons from that decade of research call into question much of what we think about what we think. Tversky and Kahneman revealed that the mind relies on impressions over logic, that we can be tricked into making illogical decisions based on how choices are presented, and that given the chance, our minds rewrite reality in order to avoid the pain of regret. We’ll explore each of these ideas in detail.
Kahneman and Tversky’s first breakthrough was determining that when making judgments, the mind doesn’t unconsciously calculate statistics, as was the common belief of the time. Instead, they showed that the mind applies stories and stereotypes through processes that Tversky and Kahneman called heuristics. In short, our minds use a variety of shortcuts to make guesses when we don’t have enough information.
In their research, they identified three separate heuristics that systematically cloud human judgment—representativeness, availability, and anchoring. (Shortform note: Heuristics have become a standard psychological tool for describing the mental shortcuts the brain takes when evaluating judgments and decisions. In Algorithms to Live By, Brian Christian and Tom Griffiths argue that you can train yourself to use heuristics derived from computer programming to make better decisions and optimize your time. Algorithms, they claim, let us make more efficient use of our limited memory and attention so we can avoid analysis paralysis and decision fatigue.)
The representativeness heuristic describes the way our minds persistently compare people and events to stereotypes and other assumptions based on past interactions. Lewis points out that from an evolutionary perspective, this heuristic is a handy mental measure to speed up decision-making in crucial situations—such as determining whether that shadow up ahead is a panther about to attack you from a tree. However, Tversky and Kahneman showed that this heuristic breaks down when random elements are involved.
According to Kahneman and Tversky, the human mind has a fundamental misunderstanding of randomness, to the point that we concoct inaccurate beliefs to explain why random things happen. In particular, as Lewis explains, people find it hard to accept that randomness naturally generates clusters that look like patterns even when they’re not. Instead, we wrongly expect randomness to create an even spread. For example, we expect a series of coin flips to produce heads and tails in roughly equal numbers, though nothing guarantees an even split. To our minds, an even distribution is more “representative” of what we believe a random sample will produce, so we invent erroneous stories to explain any coincidences that naturally occur.
(Shortform note: There’s nothing special about coincidence; randomness guarantees that it will occur. In The Improbability Principle, mathematician David Hand invokes the Law of Truly Large Numbers to explain that the sheer number of possibilities and opportunities in the world makes oddball occurrences statistically inevitable. They only seem significant because our minds demand a narrative to explain why they happen.)
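(Shortform note: The clustering that Kahneman and Tversky describe is easy to verify for yourself. The following is a minimal Python sketch—not from the book—that flips a fair coin 100 times and measures the longest streak of identical results; streaks of five or more are routine in a run this long, even though our pattern-seeking minds read them as meaningful.)

```python
import random

def longest_run(flips):
    """Length of the longest streak of identical outcomes in a sequence."""
    if len(flips) < 2:
        return len(flips)
    best = current = 1
    for prev, cur in zip(flips, flips[1:]):
        # Extend the streak if this flip matches the previous one
        current = current + 1 if cur == prev else 1
        best = max(best, current)
    return best

random.seed(1)  # fixed seed so the demo is repeatable
flips = [random.choice("HT") for _ in range(100)]
print("".join(flips[:20]), "... longest streak:", longest_run(flips))
```

Running this with different seeds shows how reliably "patterns" emerge from pure chance.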
Lewis says that this cognitive error is problematic in disciplines like psychology, social science, and even medicine, where research is performed on small sample groups that may not represent the larger population because of random factors in test group selection. (Shortform note: The smaller the sample size used in a study, the larger its margin of error. In a paper published in 2005, medical professor John Ioannidis asserts that most current research findings are flawed due to the statistical limitations of the studies they’re based on.)
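(Shortform note: The relationship between sample size and margin of error follows from the standard formula for a sample proportion. This short Python sketch—an illustration, not from the book—shows that quadrupling a study's sample size only halves its margin of error, which is why small studies are so statistically fragile.)

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% confidence margin for a proportion p observed in a sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

# The margin shrinks only with the square root of the sample size:
for n in (25, 100, 400):
    print(f"n={n}: ±{margin_of_error(0.5, n):.1%}")
# n=25:  ±19.6%
# n=100: ±9.8%
# n=400: ±4.9%
```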
Availability, the second heuristic, holds that you’ll consider any given scenario more likely if you can easily recall a similar situation. Lewis states that this heuristic makes us draw conclusions based on common occurrences, recent events, or anything that’s heavy on our minds. For example, after watching a movie about a serial killer, you may suddenly be afraid of becoming a victim yourself, even though your actual risk didn’t change after seeing the film. (Shortform note: While this effect may seem innocuous, it can have a negative impact on decision-making, such as when a manager overlooks an employee’s good record in favor of one recent mistake. It can also increase your anxiety by making you dwell on unlikely events.)
Lewis says that as with representativeness, the availability heuristic makes sense from an evolutionary standpoint—scenarios that occur more frequently may indeed be more likely than others. However, on the societal level, this heuristic leads to self-reinforcing systemic bias. For an individual, it can trick you into drawing poor conclusions when the proper evidence isn’t readily available, but misleading information is.
(Shortform note: In Biased, Jennifer Eberhardt goes into detail about the self-reinforcing societal impact of both the availability and representativeness heuristics, though she does not cite the heuristics by name. Availability comes into play when negative depictions of Black people in media create more “available” memories from which people make associations. Representativeness comes into play in what Eberhardt dubs the other-race effect, in which people judge individuals based on preconceptions about the group they belong to.)
Anchoring, the third heuristic, is a phenomenon related to how the mind deals with numbers. Tversky and Kahneman found that when they asked test subjects to estimate numbers, they could manipulate the subjects’ guesses by “priming” them with irrelevant information.
For example, if students were told, “There are 14 lines in a sonnet,” then asked to guess how many countries there are in Africa, their answers would tend to be low. If another group were told, “There are 5,000 students enrolled in this college,” their guesses at the number of countries would be high. (The correct answer as of 2022 is 54.) Lewis claims that Kahneman and Tversky weren’t able to identify why the brain behaves like this, but the fact that it does reveals another way that the mind is vulnerable to error.
How Numbers Fool the Mind
Kahneman elaborates on the anchoring effect in Thinking, Fast and Slow, where he identifies two mechanisms that cause it. First, the mind uses the anchor number as an initial guess from which to adjust. Second, the mind makes an association with the anchor—high or low, big or small, long or short—that colors the narrative within which the guess is made.
Others have noted that the anchoring effect is of particular use to marketers, who influence customers’ perceptions about price by anchoring their impressions about how much a product should cost. Some research has suggested that anchoring affects moral judgments as well, such as when one person’s ethical opinion is used as an anchor for those of others.
After establishing the systematic ways in which the mind can be fooled into error, Kahneman and Tversky turned their attention to how we assess risk when making decisions. Whereas economists approached the problem strictly in terms of financial gain, Lewis writes that Tversky and Kahneman took a broader tack and discovered that people are far more influenced by the desire to avoid potential loss. Their “prospect theory” explains why we take risks, how we define loss, and how our choices can be influenced by the way risks are presented.
When Kahneman and Tversky began to explore how people reacted to possible losses rather than theoretical gains, they learned the drive to avoid loss is very strong. Lewis says that in order for subjects to accept a gamble with a potential loss, the promised payout has to far exceed the possible cost they might incur, as it does when playing the slots at a casino. Put more broadly, people dislike losing what they have far more than they enjoy getting what they want.
(Shortform note: The promise of large payouts for relatively small risks is common to both Las Vegas and Wall Street, leading to what’s known as the casino mentality of investing. This mentality contributed to the 2008 global financial crisis when lenders were encouraged to make high-risk investments disguised as low-risk opportunities. Nevertheless, there are those who still recommend gambling strategies in financial investment.)
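(Shortform note: The asymmetry between losses and gains can be made concrete. In later work—their 1992 follow-up to prospect theory, not the research described in this book—Tversky and Kahneman estimated a "value function" in which a loss weighs roughly 2.25 times as heavily as an equivalent gain. A minimal Python sketch using those published parameter estimates:)

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect theory value function (parameter estimates from
    Tversky & Kahneman's 1992 follow-up work)."""
    # Gains are discounted; losses are discounted AND amplified by lam
    return x**alpha if x >= 0 else -lam * (-x)**alpha

# Losing $100 hurts far more than gaining $100 feels good:
print(value(100))   # ≈ 57.5
print(value(-100))  # ≈ -129.5
```

The amplification factor (lam) is what makes subjects demand payouts far larger than their possible losses before accepting a gamble.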
Tversky and Kahneman also discovered that people react to changes in value, not the absolute value of their situation. For example, if Bob is demoted to a middle management position, while Lucy is promoted into middle management, Lucy is happy while Bob is upset, even though they both now hold the same job. Lewis points out that the determining factor is the change in each person’s status quo, and it matters how that status quo is framed.
(Shortform note: In framing wealth, most people perceive “the rich” to be those earning more than themselves, regardless of their personal income. Investor Jacob Schroeder says that by using the framing effect on yourself and choosing to look for potential gains even in negative situations, you can make better choices about your wealth and overall happiness. This echoes the advice of Norman Vincent Peale, who contended in The Power of Positive Thinking that happiness is rooted in a mindset you can choose to adopt by reframing your outlook on life.)
In summary, Kahneman and Tversky found that people are risk-averse when considering gains, and risk-seeking when facing a loss. Therefore, Lewis says, our judgments can be manipulated depending on whether our circumstances are framed as positive or negative in relation to our perceived status quo. For example, if Lucy expects a $5,000 bonus, but then receives only $4,000, her gain is perceived as a loss. (Shortform note: What’s key to our perception of loss vs. gain is our individual reference point from which each is measured. Business professors have used this concept in the investment market to determine whether stocks are over- or undervalued.)
“Framing,” it turns out, is a powerful tool that can be employed by businesses marketing goods, politicians swaying votes, or doctors persuading patients to have surgery. Lewis relates that Tversky and Kahneman’s results led to an axiom—we don’t choose between options, we choose between how those options are described.
(Shortform note: Being aware of the power of framing is a useful tool for reminding us that anyone presenting information may have an unspoken agenda. In Nudge, Richard Thaler and Cass Sunstein explicitly lay out a program on how to use framing to advance a particular political agenda. Tools they suggest for reframing political decisions include offering “default” choices, providing clear links between decisions and outcomes, and narrowing the number of options to choose from.)
According to Lewis, the last major project to come out of Kahneman and Tversky’s collaboration was when they pivoted from looking at how people make decisions about the future to how they deal with the past. At the root of this shift were Kahneman’s thoughts about how people avoid feelings of regret, and the mental permutations they go through in order to cope with regret in the present. In their studies, Tversky and Kahneman uncovered a fourth heuristic of the mind, one in which we create alternate realities to avoid the pain of tragedy and frustration.
Lewis says they named their new mental model the simulation heuristic, referring to the power of “what might have been” to cloud present-day judgments and decision-making. Imagining an alternate, happier life offers a temporary salve to feelings of sorrow, but it also contaminates our perception of reality by evoking feelings of envy and regret for paths not taken.
Dwelling on the Past vs. Living in the Present
Many wellness experts warn against the dangers of engaging the simulation heuristic; instead, they espouse living in the moment as a balm for anxiety and depression. In The Power of Now, Eckhart Tolle explains that ruminating on the past leads to resentment and bitterness, while accepting the highs and lows of the present allows you to face them without wishing them away.
This thought is echoed in Radical Acceptance by psychologist Tara Brach, who says the stories we create to undo our frustrations are unhealthy coping mechanisms that only leave us feeling unworthy and unhappy. She argues that accepting the present as it is allows us to recognize our reality and treat ourselves with compassion.
Kahneman and Tversky suggested that these “might have been” fantasies use counterfactual emotions to cover up uncomfortable realities. Lewis explains that the strength of those emotions depends on how close the alternate reality is to the present. For example, imagining a different career choice made 20 years ago carries less emotional weight than imagining you made a different choice yesterday. What also determines the strength of counterfactual feelings is how realistic and desirable the alternate reality seems. For example, imagining that you could have dealt with a problem at work more gracefully engenders stronger feelings than imagining you could’ve avoided the issue by spending the last 10 years as a beachcomber.
(Shortform note: The feelings aroused by missed opportunities may become even more harmful as we age. Research has shown that letting go of what might have been leads to better mental health later in life. In The Power of Positive Thinking, Norman Peale recommends creating a daily ritual to move on from whatever mistakes you might have made.)
Kahneman dubbed the process of rewriting painful events “undoing,” which he interpreted as a coping mechanism to deal with life’s infinite possibilities. Lewis states that in their research, Kahneman and Tversky established four rules the mind follows when undoing the past.
Because of the process of undoing, Tversky observed that for the mind, reality isn’t fixed. Instead, it’s a haze of possibilities. (Shortform note: Further work on the process of undoing was continued by Kahneman’s colleague Barbara Fredrickson, who revealed that positive emotions can undo the physiological effects of negative emotions. However, a later study by Melissa Falkenstern showed that the cognitive effects of negative emotions aren’t as easily undone.)
Michael Lewis contends that Daniel Kahneman and Amos Tversky accomplished more together than they would have alone. Think back on projects in your own life that involved collaboration.
In group projects, do you feel like an equal contributor, or do you take on a leadership role? Which of the two do you feel is more empowering, and why?
Have you ever participated in a group project that you could have done better on your own? If so, why did the group process break down?
Are you particularly proud of something you accomplished while working as a member of a team? If so, how much credit do you give to the group as opposed to your own contribution?
Tversky and Kahneman developed the theory that our minds use a variety of shortcuts to make guesses when we don’t have all the information. How good of a guesser are you? The answers to these questions will be listed below. (Don’t peek.)
Texas, the largest state in the contiguous US, is divided into 254 counties. Brazil is the largest country in South America. Without looking it up, estimate the number of separate states that make up the country of Brazil.
The United States suffers the second-highest number of natural disasters in the world (after China). Guess what percentage of insured homeowners in the US file claims for damage per year.
Brazil is divided into 26 states. If you guessed significantly higher, you may have fallen prey to the anchoring heuristic. In what specific examples from your life did one “anchored” value, such as the price of a purchase, influence your perception of something else’s value?
Only 6% of insured homeowners make claims in the US per year, and not all of those are due to natural causes. If your guess was significantly higher, you may have been affected by the availability heuristic. In this case, you were cued to think about natural disasters, which may have made damage to homes seem more probable. What instances have occurred in your life when something weighing on your mind made it feel more likely?