Every decision, no matter how big or small, carries risk.
We don’t always think in terms of risk, and maybe that’s why we’re prone to making decisions in irrational ways.
But what if we worked on keeping risk at the forefront of our minds when we made decisions? One way to do that is by thinking of every decision as a bet.
In Thinking in Bets, author Annie Duke discusses how her decades-long poker career helped her refine her approach to decision-making and develop strategies that can be applied to day-to-day life and business settings. Playing poker enabled her to observe how other people make decisions in a setting where every choice leads to one of two clear outcomes—winning or losing money—and those outcomes serve as feedback about the quality of your decisions. Her strategies will help you become more consistently rational and intellectually flexible.
When you think in bets, you start with a foundation of well-informed beliefs; you become adept at learning from the outcomes of your past decisions; and you find a group of people who can help stop you from slipping into ineffective habits. These three building blocks—beliefs, outcomes, and groups—are the core of Duke’s philosophy.
So let’s break down what “thinking in bets” actually entails.
The decisions you make are ultimately driven by beliefs. You believe that job will be more fulfilling than this one. You believe you'll like living in a big city better than your small town. You believe cardio is good for your heart and you believe that you'll live happily ever after with your partner. Some of these beliefs are based on our direct experiences. But you might be surprised by how often false beliefs creep into our heads and stick there. Making a habit of interrogating your beliefs is essential to making good decisions.
A series of experiments by a Harvard psychology professor showed that we tend to process information as true even if it is explicitly presented as false, especially if we're under pressure. Unless we're given a reason to question those beliefs, vet them, and do the research necessary to acquire the facts, those wrong beliefs stick—and sometimes they'll stick even after they've been debunked.
From there, those old beliefs can influence how we perceive new information, feeding back into our existing biases and intensifying them. This cycle is called "motivated reasoning." That's why the phenomenon known as fake news is so insidious, and so effective. It's not converting anyone to a brand new way of thinking—it's designed to reinforce existing beliefs.
Similarly, every few years, news articles claiming there's no scientific proof that flossing is necessary go viral. Many people are eager to share and believe these articles without researching beyond the headlines. Why? Because they just don't like flossing. They're biased in favor of anything that makes them feel less guilty about not flossing often enough. Dentists do continue to recommend flossing daily, and the claims that "science" doesn't support a need to floss have many caveats attached. But the articles still get written and shared, not because they're revelatory, but because they reinforce existing feelings and beliefs.
To make good decisions, you have to fight against motivated reasoning; you have to make sure your beliefs are rooted in objectivity. How do you do that?
1. Imagine you had to place a bet on every belief you hold.
If being wrong about something suddenly came with an immediate financial penalty, you'd be a lot more likely to examine that belief, second-guess yourself, and do some fact-checking. The reality is that you won't have to pay in cold, hard cash for most of the misconceptions you might hold, but many beliefs do carry a penalty when they influence your choices: the alternative futures you might have had if you'd made a different decision.
2. Acknowledge that there's likely some uncertainty in most beliefs you hold.
The more you scrutinize your beliefs, the more you'll start to realize there aren't many things you feel confident enough about to bet money on, and that's okay. Even scientific concepts are constantly being updated or disproven. You don't have to be completely sure of every belief you hold, but it's good to think about how sure you are. There’s a range of certainty, differing levels of confidence, for every belief you hold.
Before you can get better at making decisions, you have to get better at learning from the decisions you’ve made in the past.
In a bet, you know that even if the odds are in your favor, there's a chance you could lose. Betting means knowing you don’t have all the information; you have to use your best judgment in the face of risk and uncertainty. Decisions are the same. You weigh the odds and hope for the best outcome.
Avoid black-and-white thinking—or viewing things as being either totally right or totally wrong. Most things, including the decisions you make, will be somewhere in between. In the ideal decision-making process, you evaluate your choices and pick the option with the highest chance of success. You may also take measures to reduce the possibility of failure even if you can't get rid of it completely. If the less likely, unfavorable outcome is what you get, it could mean that you got unlucky, didn't have any good choices, lacked important information, or took a risk because you thought the potential reward was worth it.
None of these factors means that your process was bad or your decision was “wrong.” And if all the choices available to you carry a degree of uncertainty, a chance of failure, you can learn to evaluate which one is the least likely to fail. By thinking about not only whether you're unsure, but also how unsure you are, your guesses become more educated.
“Resulting” is a poker term that refers to our habit of judging a decision based solely on the outcome it produced. It’s dangerous because it can lead you to believe you have to change your strategy based on one bad outcome. What if the bad outcome was due to luck, rather than the quality of your decision? In that case, changing your strategy won’t help in the long run; it’ll just make you confused and erratic.
Making good decisions isn’t just about achieving the best outcome: It’s about having a decision-making process that is sound regardless of what the final outcome is.
Before you blame a negative outcome on poor decision-making, analyze the factors that led you to make the choice you made. You can’t control your luck, but you can control your skill and the soundness of your thought process. Make sure that you considered alternatives, took steps to lower your risk, and thought through all the possible outcomes.
Being a good decision-maker means staying rational in the face of losses. You won’t always be right, but you can always be working toward objectivity and away from emotional or biased decision-making.
Before you can learn from the outcomes of your decisions, you have to figure out how much of the outcome you can attribute to skill, and how much of it was due to luck. If an outcome is the product of forces that you don’t control, like luck, then it might not be able to teach you anything. But you’ll find that most outcomes are the result of a combination of factors, some of which you can control, like skill. You can think of outcomes as existing on a spectrum between luck and skill, with the majority falling somewhere between the two extremes.
If an actor can’t land any jobs, he has to figure out if it’s because he needs more training—an issue of skill—or if he just hasn’t found the right role yet—an issue of luck. Or if it’s a little of both. To move his career forward, he’ll have to learn to tell those factors apart and address them. Duke calls this sorting process “outcome fielding.”
How you sort an outcome is a kind of bet, just like your decisions and beliefs. If you attribute an outcome to skill, you’re betting that you can learn from it and adjust future decisions (other bets) in a way that will impact future outcomes. It’s a chain reaction. That’s why it’s important to get better at outcome fielding and get it right as often as you can.
"Mental time travel" is a strategy where you consider how past decisions turned out and imagine future outcomes when making a decision in the present. You ensure that you're actively learning from your past. And you keep your long-term goals in mind even when it comes to decisions where possible benefits or consequences might not be immediately obvious, as is the case with many decisions we make, like relocating or changing careers.
You can mentally time travel into the future, the past, or a combination of the two.
Traveling into the future might involve imagining the consequences of your present decision for your future self, in detailed and specific ways. What are the positive impacts that working out regularly could have on your life? What could be the negative impacts of not staying fit?
Traveling into the past can involve calling on past regrets. Usually, regret isn’t all that helpful, because it occurs after the fact. But if you conjure up the regret of a similar past decision, it can help guide you toward making a more rational choice in the present moment.
One form of mental time travel is “scenario planning,” an exercise where you imagine all the possible outcomes of a decision—all the possible futures—and try to guess the probability of each of these outcomes occurring. Just by weighing the possibilities and confronting your uncertainty, you’ll be more likely to make a rational decision.
Self-critique is an important skill, but other people can help you see your blind spots. They bring their own unique life experiences to the table and give you the chance to view ideas from angles you hadn't considered before.
But first, you need to find someone who’s willing to have those discussions with you. Not every person has the bandwidth to provide you with serious feedback on your life choices, or wants to have their decisions or beliefs picked apart in return. That’s why forming a group around good decision-making practices is important.
What are the qualities of a good group, when it comes to learning from each other?
Once you’ve formed the group, you’ll need to have clear rules of engagement. Duke looks to sociologist Robert K. Merton's guidelines for how he believed the scientific community should function, which he described using the acronym CUDOS:
Communism: Data is commonly owned, belonging to everyone. In a decision group, sharing data means being honest about the factors that went into your decision and providing as much detail as possible, including the things you're tempted to leave out.
Universalism: All evidence must be treated in the same way, with a standard set of criteria, no matter what the source is. This doesn’t mean you can’t evaluate whether a source is credible or not—but it means you shouldn’t disregard information without a fair evaluation just because you have negative feelings about the source, like a news outlet with political leanings you don’t agree with or a colleague you find annoying.
Disinterestedness: Don't let conflicts of interest or other biases influence the work. Those “conflicts of interest” can include resulting. If knowing the outcome influences what we think of a decision, then one way to ensure your group isn’t swayed in either direction is to withhold the outcome until after the discussion happens.
Organized skepticism: All ideas are subject to scrutiny, criticism, and dissent. Organized skepticism is not argumentative or confrontational—not if everyone is willing to engage with their own uncertainty. As a group, you should reward civil dissent and debate with approval, encouraging healthy discussion and making it part of the group's norms.
In Thinking in Bets, author Annie Duke discusses how her decades-long poker career helped her refine her approach to decision-making and develop strategies that can be applied to day-to-day life and business settings.
She left her doctoral studies in cognitive psychology, took up poker to earn a little money, and ended up becoming a professional poker player whose 20-year career would garner over $4 million in earnings. The game enabled her to observe how other people make decisions in a setting where every choice leads to one of two clear outcomes—winning or losing money—and those outcomes serve as feedback about the quality of your decisions. In the early 2000s, she began sharing her strategies with professionals across such fields as finance, law, and business.
Her process boils down to “thinking in bets,” or re-framing every decision as a “bet” against yourself. By “thinking in bets,” you’ll build a foundation of well-informed beliefs, get better at learning from the outcomes of your past decisions, and form a group that helps keep your decision-making rational.
We’ve reorganized the book’s chapters for coherency in this guide.
Your beliefs play a major role in the decisions you make. This chapter goes over why we're prone to forming beliefs based on false information, how your existing beliefs affect the way you process new situations, and how you can make a habit of constantly challenging and updating your own beliefs—which in turn will make you a better decision-maker.
Betting involves a degree of risk, usually financial. There's something at stake for you. The same is true of most of the decisions you make in your life. You’re choosing between "alternative futures," as Duke puts it, and betting that you'll be happier in one future than the other.
When you decide to move to a new neighborhood, you're making a bet in which you hope to gain something (an easier commute to work, closeness to friends or relatives, a house that better suits your family’s needs). If it doesn't work out in your favor (your new neighbors are loud, the house needs more renovation than you thought, construction begins on the next street and forces you to take a different route to work), then you've lost not only those potential rewards, but also the option to stay in your old home. You could wind up better off, but you could also end up worse off.
Extend this thinking to your other day-to-day choices. A choice to exercise every week is partly about choosing a healthier future. A choice to travel around the world is partly about choosing a future of adventure and broadened horizons. A choice to get married is partly about choosing a future with one specific person over any number of potential futures with other people.
But you're not only choosing a different future. You're choosing a different version of yourself: the self that you could become if you advance your career, start a family, or leave behind your material possessions and spend a year globe-trotting. Your life could branch out in any number of directions, but when you make one choice, you leave behind others.
The decisions you make are ultimately driven by beliefs. You believe that job will be more fulfilling than this one. You believe you'll like living in a big city better than your small town. You believe cardio is good for your heart and you believe that you'll live happily ever after with your partner.
Some of these beliefs are based on our direct experiences. But you might be surprised by how often false beliefs creep into our heads and stick there. Making a habit of interrogating your beliefs is essential to making good decisions.
False information seems to catch on and spread like wildfire, so much so that you could spend your whole life trying to combat the most commonly held misconceptions and likely never run out of material. Think of the popularity of MythBusters (a television program that ran for over a decade where popular “myths” were tested and frequently debunked through experimentation) or Snopes.com (a fact-checking website). Why do misconceptions spread so quickly and linger even when the information needed to disprove them is readily available by doing a simple Google search?
It comes down to how we’re wired. Our early ancestors had to trust their senses in order to survive. Our belief-forming apparatus is predisposed, then, toward credulity rather than doubt.
With language, humans developed the ability to form beliefs with second-hand knowledge, based on things other people tell us. But we still lean toward accepting new information as true right away, rather than doubting it, even though it's being delivered by something other than our own senses. A series of experiments by a Harvard psychology professor showed that we tend to process information as true even if it is explicitly presented as false, especially if we're under pressure.
Unless we're given a reason to question those beliefs, vet them, and do the research necessary to acquire the facts, those wrong beliefs stick—and sometimes they'll stick even after they've been debunked.
Our beliefs can even affect how we perceive shared experiences. That effect can happen on a small scale—how many times have you viewed another person’s innocuous comments or actions in a negative light simply because you already dislike them based on an unrelated, past experience?
But your beliefs can also influence how you process objective facts on a larger scale, as a pair of psychology professors from Dartmouth and Princeton observed in a study of how students from each of their schools viewed a football game between their teams. Because of the deeply entrenched rivalry between them, emotions ran high, and sharply different narratives emerged from each set of students (as well as the school newspapers that covered the game). Students noticed and remembered more penalties committed by the opposite team, and fewer committed by their own. They projected aggression and pinned blame on the opposition, and downplayed the infractions committed by the team they supported. They’d all watched the same game, but they came away with conflicting narratives about what had happened.
And that's just college football. Consider the ways that this belief-based processing of what should be objective information affects how we view things like news events and politics.
When old beliefs color our perception of new information, that feeds back into our existing biases and intensifies them. This cycle is called "motivated reasoning." That's why the phenomenon known as fake news is so insidious, and so effective. It's not converting anyone to a brand new way of thinking—it's designed to reinforce existing beliefs. If those beliefs were suppressed or buried, that extra fuel can bring them to the surface. And the algorithms in our social media feeds and search engines, designed to cater to our existing preferences, ensure that we're fed a stream of information that reinforces the way we already think and feel rather than challenging it.
Similarly, every few years, news articles claiming there's no scientific proof that flossing is necessary go viral. Many people are eager to share and believe these articles without researching beyond the headlines. Why? Because they just don't like flossing. They're biased in favor of anything that makes them feel less guilty about not flossing often enough. Dentists do continue to recommend flossing daily, and the claims that "science" doesn't support a need to floss have many caveats attached. But the articles still get written and shared, not because they're revelatory, but because they reinforce existing feelings and beliefs.
With our own natures and the very technology that’s supposed to make our lives easier working against us in this way, we have to make a conscious choice to check ourselves and the beliefs that we’re internalizing.
(Shortform note: For more on how bias can impact your life, read our guide to Biased.)
It's easy to become defensive when our beliefs are challenged—even minor beliefs—because we don't like to feel like we're wrong. That's why hearing that a belief we hold is incorrect can feel like a personal attack. But there are strategies you can use to get into the habit of vetting your own beliefs and being intellectually flexible.
1. Imagine you had to place a bet on every belief you hold. If being wrong about something suddenly came with an immediate financial penalty, you'd be a lot more likely to examine that belief, second-guess yourself, and do some fact-checking. The reality is that you won't have to pay in cold, hard cash for most of the misconceptions you might hold, but many beliefs do carry a penalty when they influence your choices: the alternative futures you might have had if you'd made a different decision.
2. Acknowledge that there's likely some uncertainty in most beliefs you hold. The more you scrutinize your beliefs, the more you'll start to realize there aren't many things you feel confident enough about to bet money on, and that's okay. Even scientific concepts are constantly being updated or disproven. You don't have to be completely sure of every belief you hold, but it's good to think about how sure you are. There can be a range of certainty, differing levels of confidence, for every belief you hold.
There are concrete benefits to allowing room for uncertainty in your beliefs.
The Reproducibility Project: Psychology is a study that showed how thinking in bets can produce more effective decisions.
The team behind the project asked scientists to bet on the likelihood that the results of an experiment could be replicated. They provided those scientists money to use in a betting market, and then they attempted to recreate some of the studies (and the results produced) found in top psychology journals.
When betting, those scientists were right 71% of the time about whether or not the replicated experiments would produce the same result, in comparison to the 58% accuracy found when experts gave their opinions via peer review. The discrepancy can be chalked up to the fact that even scientists can fall prey to resulting and motivated reasoning, and they can be biased against or in favor of a study depending on whether they are ideologically aligned with its findings. By betting, they were forced to be more critical and more objective.
It's not that those scientists who participated in the Reproducibility Project cared more because money was at stake; it's that the project highlighted the risk involved in getting something wrong. There is always risk in peer review—risk to the reviewers’ reputations, as well as the risk of accidentally spreading false information that could affect policy decisions and, ultimately, people’s lives. The Reproducibility Project just put that risk at the front of their minds.
Try Googling “list of common myths” right now. Your search results should turn up any number of articles on the topic. Click one of them, and see how far you have to scroll before you find that something you believed to be true is actually false.
What was the myth that you’d believed? Why did you believe it?
If you’d had to bet on that belief before looking it up, how much would you have bet? (In other words, how sure were you that it was true?)
What beliefs do you hold that you would feel confident betting on?
What steps can you take to reduce your likelihood of internalizing false information in the future?
We’ve discussed the role of uncertainty and bias in the formation of your beliefs, and how interrogating your beliefs can help you build a stronger foundation on which to base your decisions.
Now we’ll examine the decision-making process itself, and how you can reframe your usual ways of thinking about it. The goal is to create a mindset that moves away from absolutes and toward intellectual flexibility, objectivity, and rationality.
We’ll cover the role uncertainty plays in our decisions, the need to judge our decisions based on our reasoning and not just the results we achieved, and the importance of thinking in shades of grey as opposed to extremes of “right or wrong.”
Before you can get better at making decisions, you have to get better at learning from the decisions you’ve made in the past. The key to that is making sure that you’re actually learning from the decisions themselves, and not thinking solely in terms of results or outcomes.
“Resulting” is a poker term that refers to our habit of judging a decision based solely on the outcome it produced. It’s dangerous because it can lead you to believe you have to change your decision strategy based on one bad outcome. What if the bad outcome was due to luck, rather than the quality of your decision? In that case, changing your approach won’t help in the long run; it’ll just make you confused and erratic.
Making good decisions isn’t just about achieving the best outcome: It’s about having a decision-making process that is sound regardless of what the final outcome is.
For example, imagine that you have an appointment coming up. It’s raining and you guess that traffic will be slower, so you leave early. But an accident on the highway, caused by the slippery roads, leads to a traffic jam; you get stuck there for an hour and miss your appointment. While you’re waiting, the storm clears up. You realize that if you hadn’t bothered leaving early, you’d have missed the storm altogether. Or if you’d left even earlier, you might not have gotten stuck behind the accident.
But that doesn’t mean you made a bad decision. You paid attention to conditions on the road, predicted that the commute would take you longer, and took action to avoid being late. You didn’t know there would be an accident or that the weather would clear up soon, and you can’t make a decision based on information you don’t have. In general, leaving early to make your appointments is probably a practice that will serve you well, and not something you should abandon because of one bad outcome.
Before you blame a negative outcome on poor decision-making, analyze the factors that led you to make the choice you made. You can’t control your luck, but you can control your skill and the soundness of your thought process. You can make sure that you considered alternatives, took steps to lower your risk, and thought through all the possible outcomes.
Being a good decision-maker means staying rational in the face of losses, because some losses are inevitable. You won’t always be right, but you can always be working toward objectivity and away from emotional or biased decision-making.
Consider the reverse: bad decisions that lead to positive outcomes. Picture a college student deciding to skip an early morning class, only to belatedly remember that they have an exam that day. They race to the lecture hall, rehearsing excuses the whole way there—and find it empty. After quickly checking their email, they find out that class has been unexpectedly canceled and the exam delayed. The outcome was good, but their decision to skip class was still bad.
If we can separate positive outcomes from negative decisions, we should be able to do the same with negative outcomes that result from what could have been good decisions. Avoid falling into the trap of “hindsight bias”—which is how, once we’ve learned the outcome of a decision, we’re prone to thinking it was obvious and we should’ve predicted it beforehand. It only seems obvious because, as the saying goes, “hindsight is 20/20.” But in the moment, things are never that simple.
(Shortform note: To learn more about how to avoid hindsight bias, read our guide to Thinking, Fast and Slow.)
Duke cites psychologist Gary Marcus's labels for the different types of brain functions that are responsible for decision-making: the "reflexive mind" and the "deliberative mind." The reflexive mind makes rapid, instinctive decisions. The deliberative mind, which is associated with the prefrontal cortex, is where logic and careful consideration lie.
It would be nice if we could get our deliberative mind to take point on all our decisions, but that's not possible. In fact, the reverse is true. It's actually our reflexive mind that handles the bulk of our decisions.
There's an evolutionary reason for that: The snap decision to run at the sound of a strange noise in the wilderness may have saved our early ancestors from predators or other dangers, ensuring the survival of our species. In other words, irrational decision-making is in our DNA. By understanding our limitations, though, it's possible to find ways to work around them.
(Shortform note: To learn more about how to use your deliberative mind to mitigate the errors of your reflexive mind, read our guide to Thinking, Fast and Slow. For a different perspective, read our guide to Blink and learn why Malcolm Gladwell thinks reflexive decisions can lead to better outcomes than deliberate decisions.)
Poker forces you to balance your reflexive and deliberative thinking, which is one of the reasons why the game can teach us valuable lessons about rational decision-making. Players have to make fast decisions under sometimes immense pressure, depending on how much money is at stake. This means they must find ways to reconcile the instantaneous decision-making of their reflexive mind with the deliberation required to strategize in the midst of a game. To get really good at poker, you also have to be able to reflect on all those fast, in-game decisions later and figure out if a loss came down to a bad move, or just bad luck despite perfectly sound reasoning.
As you start to break down all the factors that go into making a particular decision, you’ll probably be forced to confront the element of uncertainty. Accept that uncertainty, and use it to change the way you think about “right and wrong” in the context of decision-making.
Game theory is a field of study that involves using mathematical models to analyze decision-making. There's more to it than that—it's a complex field that examines the interaction between multiple decision-makers, taking into account chance and other factors. But we’re going to focus on the fact that scientist and mathematician John von Neumann, whom Duke names as "the father of game theory," modeled it after poker. That's because poker—unlike games such as chess—involves luck, hidden or incomplete information, and players who deliberately try to deceive each other. Uncertainty is part of the game.
In chess, all the information is available on the board, and the result of the game comes down to the moves that you and the other player make. Strategy, not luck. Life is a lot more similar to poker than chess in that you can do everything right and lose anyway if luck is working against you. Accepting that harsh reality will get you a step closer to being able to evaluate your decisions fairly and objectively, both before and after the outcome.
Black-and-white thinking—when you view things as being either totally right or totally wrong—can sabotage your decision-making ability. Taking uncertainty into account can help you avoid that trap.
In a bet, you know that even if the odds are in your favor, there's a chance you could lose. Betting means knowing you don’t have all the information; you have to use your best judgment in the face of risk and uncertainty.
Decisions are the same; you weigh the odds and hope for the best outcome. In the ideal decision-making process, you evaluate your choices and pick the option with the highest chance of success. You may also take measures to reduce the possibility of failure even if you can't get rid of it completely.
If the less likely, unfavorable outcome is what you get, it could mean that you got unlucky, didn’t have any good choices, lacked important information, or took a risk because you thought the potential reward was worth it. It could also mean that you didn't make the best possible choice, but you didn't make the worst choice, either—you picked something in between for reasons that made sense to you at the time.
None of these factors means that your process was bad or your decision was “wrong.”
Just as it's important not to immediately blame yourself for being "wrong," it's also good practice not to give yourself too much credit for the times when you're right. Even if you achieved the desired outcome, you still benefit from evaluating your decision-making process. How much of your success was due to luck? Is there something you could do differently in the future to avert disasters, limit the influence of whatever isn't in your control, and make your success more of a certainty?
Going back to an earlier example about skipping class: Even though the student didn’t miss their exam, they probably recognize that they won’t get that lucky every time they unthinkingly decide to sit out a lecture. They might think twice before skipping again, or at the very least, the near-disaster might inspire them to start keeping a calendar with all tests and other important dates marked.
It can be difficult to build a habit of self-evaluating a positive outcome because research shows that failure carries more weight for us than success. We feel the sting of a loss more strongly than the high of a victory. That's why you're better off freeing yourself from the concept of winning or losing altogether, when possible, and embracing the fact that most decisions and outcomes fall somewhere in between. And if all the choices available to you carry a degree of uncertainty, a chance of failure, you can learn to evaluate which one is the least likely to fail.
You do this mental math every time you choose between two different jobs, two different homes, two different strategies in a poker game. Each option carries the chance that something will go wrong; you pick the choice you think will turn out the best, even though you can’t be sure.
For example, maybe you choose the better-paying job out of two options. After weighing the pros and cons, you determined that being able to save up for your long-term goals was a priority. You’re aware that the hours will be longer and the responsibilities heavier, but that’s a trade-off you’re prepared for. You’re not guaranteed to succeed at the job or enjoy it. There’s a chance you’ll wind up hating the company culture and leaving in a few months, or that you’ll get fired. But you can’t know that will happen until you’ve tried. All you can do is pick the choice that you think is more likely to work out for you.
Measuring your uncertainty can make you a better decision-maker because you'll get better at guessing the odds. When you think about not only whether you're unsure, but also how unsure you are, your guesses become more educated.
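One concrete way to practice this is to state beliefs as probabilities rather than yes/no claims, then score how well-calibrated those guesses turned out to be. The sketch below uses the Brier score (the mean squared error between your stated probability and what actually happened); this scoring method and the sample numbers are our illustration, not something Duke prescribes.

```python
def brier_score(forecasts):
    """Score a track record of probabilistic guesses.

    forecasts: list of (probability, outcome) pairs, where probability
    is your stated confidence (0.0 to 1.0) and outcome is 1 if the
    event happened, 0 if it didn't. Lower scores mean better calibration.
    """
    return sum((p - o) ** 2 for p, o in forecasts) / len(forecasts)

# Hypothetical track record: an honest "70% sure" that misses once
# scores better than claiming certainty and being wrong once.
hedged = [(0.7, 1), (0.7, 1), (0.7, 0)]         # "70% sure," right twice
overconfident = [(1.0, 1), (1.0, 1), (1.0, 0)]  # "certain," wrong once

print(round(brier_score(hedged), 3))         # → 0.223
print(round(brier_score(overconfident), 3))  # → 0.333
```

The design point: a calibration score only punishes you for the gap between your confidence and reality, so admitting "how unsure you are" is rewarded rather than penalized.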
(Shortform note: For more strategies for dealing with uncertainty, read our guide to Fooled By Randomness.)
Practice breaking down your decisions and see if you gain some insight into your thought process.
Think about an important decision you made in the last few months. What factors did you consider before you made your choice?
What steps did you take (or could you have taken) to lessen any risks?
If you didn't achieve your desired outcome, was it due to bad decision-making or something else (for instance, bad luck, bad options, or incomplete information)? If you did achieve your desired outcome, how much of your achievement was due to your decision-making process, and how much to luck?
What could you do differently in the future to better manage risk, limit the influence of what's out of your control, and make your success more of a certainty?
We’ve learned about how engaging with our own uncertainty can lead us to beliefs and decisions that are more balanced and well-thought-out. Now we’re going to address the role of uncertainty in how we analyze our outcomes and learn from them.
Before you can learn from the outcomes of your decisions, you have to figure out how much of the outcome can be attributed to skill, and how much of it was due to luck. We'll discuss some of the obstacles that prevent us from accurately sorting outcomes into the "luck" or "skill" category, and how you can work around those obstacles by building more effective thinking habits.
To improve at anything, you have to update your beliefs and change your behavior based on feedback. When it comes to decision-making, you recalibrate based on your outcomes. We’ve all heard the advice to “learn from your mistakes,” but how do you actually do that? (Bear in mind that you can learn from your successes, too.)
First, you have to figure out whether an outcome is something you can learn from; not all outcomes provide useful information. If an outcome is the product of luck or other forces that you have no control over, then it might not be able to teach you anything. But you’ll find that most outcomes are the result of a combination of factors, some of which you can control, like skill.
You can think of outcomes as existing on a spectrum between luck and skill, with the majority falling somewhere in the middle of the two extremes.
When you attribute an outcome to skill, you’re taking the credit or the blame for it, and you can learn from it by adjusting your behavior going forward. When you recognize that an outcome came down to luck—that it wasn’t in your control—you can set it aside rather than changing your behavior based on what happened.
In poker, a loss could come down to the fact that you made the wrong moves—an issue of skill. Or it could be because another player drew better cards—an issue of luck. The ability to tell those two situations apart differentiates good players from bad ones. Duke calls this sorting process “outcome fielding.”
How you sort an outcome is a kind of bet, just like your decisions and beliefs. If you attribute an outcome to skill, you’re betting you can learn from it and thereby adjusting future decisions (other bets) in a way that will impact future outcomes. It’s a chain reaction. That’s why it’s important to get better at outcome fielding and get it right as often as you can.
Let’s break down some of the hurdles we have to overcome for more accurate outcome fielding.
It’s difficult to work backward from a negative outcome and pinpoint exactly what went wrong. Different causes can produce the same effect. Multiple causes can work together, each with a different degree of influence, to produce a single effect.
For example, people who achieve great success or fame in the arts often attribute some degree of that success to luck. You probably have to have some level of talent and understanding of craft to make it as an actor. But you also have to hope that a casting director takes a chance on you when you’re starting out, that you form the right connections, that roles are created that you’re a good fit for, and that the projects you get cast in are well-received by the public. And a variety of factors other than the quality of the work can influence public reception, like current events or whether the release date of your film coincides with (and has to compete against) the latest installment of a popular franchise film.
When you’re analyzing a complicated situation or outcome—such as the trajectory of a career—it might be impossible to draw clear conclusions about how much luck played a role as opposed to skill.
We're prone to "self-serving bias"—we own the positive outcomes, attributing them to skill, and brush off the negative ones as bad luck. We don't want bad things to be our fault. It's tempting to give in to self-serving bias even if we've accepted that most outcomes result from a combination of luck and skill.
This self-serving bias can make us overlook the learning opportunity in a mistake, or double down on beliefs that are holding us back. An actor who can’t find work might tell himself that he’s just unlucky or he hasn’t come across the right role yet. He’s always been praised for his talent; he knows he’s good enough. And he might be right that it’s just bad luck stopping him. But maybe he needs to try auditioning with a different monologue or learn new techniques. His certainty about his own talent might be preventing him from experimenting with new, potentially more effective, approaches.
(Shortform note: For more on bias and the other ways our innate irrationality throws off our decision-making, read our guide to Predictably Irrational.)
You can learn from watching people just as much as you can learn from gaining your own experiences. You watch others play a game and observe the strategies they use. You watch a parent or instructor drive when you're first getting your license. You watch how a colleague delivers a presentation and then apply their tactics to become a better speaker and presenter yourself.
When you watch other people make decisions—make bets—you can learn without having to risk anything yourself. But the same kind of thinking that produces self-serving bias exists when we watch others, too—only in reverse. In the same way that we instinctively want to attribute our successes to skill and our failures to bad luck, we're tempted to attribute other people's successes to luck and their failures to a lack of skill. This is especially true if it's a person we have an existing opinion about—a belief that affects our perception of them.
Competition is in our nature—competition for resources, for jobs, for partners. For victory in a poker game. Downplaying other people's skills is a way of downplaying the competition. Even when we're not competing, we're comparing. Research shows that a significant factor in how happy we feel with our lives is where we stand in comparison to other people. That’s why status symbols like designer clothing or luxury cars are so popular. A logo or a higher price tag doesn’t necessarily make a product better. The appeal is in how that product makes you appear to others, and how their perception of you lifts your self-perception.
But if you don't keep that instinct in check, it'll hinder your ability to learn from other people. You'll miss your chance to pick up on strategies you could've put to use in your own life. More importantly, you might fail to treat others with the compassion you'd wish for yourself if you were in their shoes.
These flaws in our thinking are habits, and habits can be broken. There are three parts to a habit: a cue, a routine, and a reward.
If you have a habit of biting your nails, the cue might be stress. Nail-biting is the routine that helps you cope. The reward is a brief respite, or at least a distraction, from that stress.
The trick to changing a habit is to replace the unwanted routine with something else. You can't get rid of the cue (stress). And you don't want to get rid of the reward (less stress). You just need a different routine to bridge the two—like going for a walk or squeezing a stress ball when you get the urge to bite your nails, until eventually, the urge goes away altogether. You can deal with the habit of self-serving bias, toward yourself and others, in the same way.
(Shortform note: For more on how to understand and change your habits, read our guide to The Power of Habit.)
The routine of blaming your negative outcomes on bad luck and other people's negative outcomes on a lack of skill rewards you with a more positive view of yourself. So find ways to uplift your self-perception without that biased thinking.
If taking credit for a success makes you feel good, learn to derive pleasure from interrogating those successes and identifying things you could have done better—because that will set you up for more success later on, and isn't that a good thing? If being better than others makes us feel good about ourselves, what if we focus on being better at judging people fairly, better at giving them credit where it's due and compassion where it's needed?
Self-serving bias and motivated reasoning are routines that you can swap out for better ones, like self-reflection and the ability to view situations from multiple perspectives. Brushing off someone else's success as a lucky break isn't just a throwaway thought you can forget about; it's a bad habit that determines how fast and how much you learn, and therefore how well you'll decide down the line. Once you've confronted that fact, changing the habit becomes more important. More learning, better decisions, and ultimately more success are rewards, so it's possible to build habits that produce those rewards more consistently. Like any routine, it takes practice to master.
One strategy when observing someone else's outcome is to ask yourself what you'd think in their shoes—if it was your outcome. Is it easier to see, then, what they did right? Is it more obvious what was out of their control?
You can take your own mistakes and successes and view them as if you were an outsider, too. When you picture someone else making all the same choices you did, can you see where there’s room for improvement? Can you see where you got lucky?
Although training yourself to change the way you think can be hard, remember that you don't have to be perfect. You don't have to completely eliminate bias from your life; in fact, that's probably not possible. But small improvements can have a snowball effect that produces significant changes to your life over time. Learning a little bit more every week might mean learning a lot more in a year. And being a faster, better learner gets you closer to pretty much any goal you can think of.
Practice outcome fielding by applying it, right now, to events in your life. Start by thinking about something good that you experienced recently. It can be anything—a promotion, a fun night out with friends, a new favorite movie.
List one aspect of that good thing that was out of your control—something you can attribute to luck or other forces.
Then, name one thing about that experience that you did have power over—something you can attribute to skill. (For example: Maybe you had time to watch that amazing new movie because you stayed caught up on your work all week.)
Now, think of a negative event or result that you experienced recently, and repeat the exercise. What about it was completely out of your hands?
And what was a mistake you made or something you could have done differently?
We’ve discussed two ways to analyze your decisions: by guarding against resulting, and by outcome fielding. We still have one more strategy to cover. In this chapter, Duke discusses how thinking about past errors and future goals can aid your truthseeking endeavors, thereby enabling you to make better bets.
"Mental time travel" is a strategy where you consider how past decisions turned out and imagine future outcomes when making a decision in the present. You ensure that you're actively learning from your past. And you keep your long-term goals in mind even when it comes to decisions where possible benefits or consequences might not be immediately obvious, as is the case with many decisions we make.
We have a tendency to prioritize our immediate desires—our present-day self—over the well-being of our future self. That’s why we tend to do things like skip workouts or let the laundry pile up a little too long.
Research shows that we make more rational decisions when forced to mentally time travel. That rationality arises in part because the areas of the brain involved in recalling the past or envisioning the future are the same areas associated with deliberative thinking. You can mentally time travel into the future, the past, or a combination of the two.
Traveling into the future might involve imagining the consequences of your present decision for your future self, but in detailed and specific ways. What are the positive impacts that working out regularly could have on your life? What could be the negative impacts of not staying fit?
Traveling into the past can involve calling on regrets. Usually, regret isn’t all that helpful, because it occurs after the fact. But if you conjure up the regret of a similar past decision, it can help guide you toward making a more rational choice in the present moment. If you recall what a hassle it was to have to do three loads of laundry in one day because you let it pile up, you’re more likely to get to it sooner this time around.
Mental time travel helps put your feelings in the moment—which can feel intense and all-encompassing—into perspective. When you're having a bad day at work, the frustration and misery can consume you and drive irrational decisions, like arguing with a colleague over a minor issue that you normally would’ve ignored. If you recall similar events in your past, you'll likely find that they didn't have a huge impact on your life over time; the emotions are a lot less significant, in retrospect, than they were in the moment.
Rather than focusing on the immediate emotions, keep a big-picture view of your life. What decision or series of decisions will make you happier over time? Is that argument with your coworker actually a discussion worth having? If so, are you in the right frame of mind to handle it productively and in a way that won’t permanently damage your relationship?
Another problem is that, in the moment, the way we feel about a situation (or outcome) is influenced by how we got there. For example, if you regularly stay at work for an extra hour or two because you like to stay ahead on various projects, you won’t be upset that you have less time to cook dinner or watch your favorite shows when you get home. But in the rare instance when you have to stay at work for an extra hour because you made an error you have to correct, that lost time weighs on you a lot more, even though ultimately it’s not much of a change from the norm. That negative outcome might put you in a terrible mood, affecting how you interact with your family when you do get home or even making you more short-tempered with your colleagues the next day. In other words, it’ll put you on tilt.
Poker players use the term "tilt" to describe the state of mind where you make irrational, emotionally driven decisions because of a previous outcome (whether that outcome was good or bad, such as a crushing loss or a euphoric win). Being on tilt can cause you to make decisions you wouldn't have made if you'd taken a step back and gotten some perspective.
You can often tell when you're on tilt: Are you experiencing intense emotions? Is your heart racing? Are you snapping at people? Then you might not be in the best place to make a decision.
Once you notice you're on tilt, you can use mental time travel to gain the perspective you're lacking in the moment. How much will the present event matter later on? If it's going to matter a lot, then maybe you shouldn't make a decision until you've calmed down. If it won't matter all that much, then realizing that can help you get your emotions in check more quickly.
You might not be able to make rational decisions 100% of the time, but the goal is to reduce the number of emotionally driven decisions—to get better at long-term decision-making.
Just as there are emotional and physical signs of tilt, you might start to notice behaviors and thinking patterns that can indicate that you're not making a good decision, or that you’re drifting away from objectivity.
When you’re engaging with other people, these signs might include dismissing them with insults and ignoring the points they’re making—for example, with ad hominems or straw-man arguments, which are used to attack the speaker or misrepresent their viewpoint. Or you might begin to notice when you’re trying to influence them toward passive belief, maybe by speaking in absolutes (“That strategy always succeeds”; “You should never do that”) or subtly discouraging feedback by not acknowledging your uncertainty about the topic being discussed.
When you’re alone, a red flag might be when you realize you’re beating yourself up over something that happened, as opposed to constructively focusing on how you could do better in the future. Conversely, you might notice that you’re prone to blaming everything negative that happens to you on bad luck, or being overconfident about your skills, which might be a clue that your outcome fielding needs work.
Once you’ve practiced identifying the signs that you’re being influenced by your biases, you can get better at overcoming them.
You can use what's known as a "Ulysses contract" to ensure that your past or future self has input into a present-day decision. The term comes from a story in Homer's Odyssey, in which the hero Odysseus (or Ulysses) had his crew tie him up as they sailed past the island of the Sirens so that he wouldn't steer the ship there and doom them all. He knew that the Sirens' song would affect his ability to think rationally, so he used his past self to keep his present self in check.
Make commitments in advance that will place barriers on your future self, reducing the chance that you'll make an irrational decision. For example, imagine that you have trouble waking up when your alarm goes off. When you’re half-asleep, you reach over, hit the snooze button, and doze off for a few more minutes. You do this again and again until you have to rush to make it to work on time, which is stressful.
So you decide to put your alarm on the other side of the room. Now, to turn it off, you have to get out of bed and walk over to it. You could still crawl back into bed afterwards—but, by adding a barrier, you’ve reduced the likelihood of making that choice. You're forcing yourself to take an extra step before you can get to that bad decision. In other words, you're giving your deliberative mind a chance to take charge.
Another strategy is “scenario planning,” an exercise where you imagine all the possible outcomes of a decision—all the possible futures—and try to guess the probability of each of these outcomes occurring. Just by weighing the possibilities and confronting your uncertainty, you’ll be more likely to make a more rational decision. You'll have a chance to prepare for negative outcomes, rather than being taken by surprise if they happen. You can set up barriers to your own irrational tendencies (like moving your alarm across the room). Your emotional response to either a failure or a success will be moderated—you're less likely to self-flagellate if things don't go your way, because you braced yourself for that result in advance. You’ll avoid the trap of resulting, or judging decisions based on the outcome rather than the thought process—because you'll have weighed the merits of the decision well before the outcome happened.
Scenario planning allows you to consider not only what the possible outcomes are, but also which outcomes to prioritize. If there's a slim chance of getting your best-case scenario, but a very high chance of getting your second choice option, then the second choice probably has greater value to you. You can also consider the consequences that could follow each outcome, which could reveal pros or cons to each that you hadn't previously considered.
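The weighing described above—a slim chance at your best case versus a very high chance at your second choice—is expected value in miniature: each outcome's value multiplied by its probability, summed per option. Here's a minimal sketch of that arithmetic; the options, values, and probabilities are made-up numbers for illustration, not from the book.

```python
def expected_value(outcomes):
    """Sum each outcome's value weighted by its probability.

    outcomes: list of (probability, value) pairs for one option;
    the probabilities for an option should sum to 1.
    """
    return sum(p * v for p, v in outcomes)

# Hypothetical options, scored on a 0-100 "how happy am I" scale.
long_shot = [(0.10, 100), (0.90, 20)]  # slim chance at the best case
safe_bet = [(0.75, 80), (0.25, 40)]    # likely to land a solid second choice

print(expected_value(long_shot))  # → 28.0
print(expected_value(safe_bet))   # → 70.0
```

Even though the long shot's best case is the single best outcome on the table, the probability-weighted view shows the reliable second choice is worth far more on average, which is exactly the prioritization scenario planning is meant to surface.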
For example, imagine that you forgot to prepare for a big holiday dinner you’re hosting tomorrow. Your two options are:
1. Order the food from your local Italian restaurant.
2. Cook everything yourself.
Let’s say you lean toward option #1. But then you start scenario planning: What are the ways that option #1 could go wrong? “Wrong” in this case means that you don’t get the result you want: delicious food and happy guests.
What are the ways that option #2 could go wrong?
By scenario planning, you realize that if option #1 goes wrong, it will be hard to fix. You won’t be able to whip up enough food at the last minute to feed everyone. If option #2 goes wrong, you could order a rush delivery from the restaurant, but as scenario planning showed you, that’s not a foolproof plan either. You also have more control over option #2, whereas the success of option #1 relies mostly on other people’s actions. So how confident are you about your cooking? Conversely, how much faith do you have in your local Italian place?
You settle on a third option, a happy medium: You’ll prepare a few crowd-pleasing recipes you’ve made before and you feel confident about pulling off, but you’ll order a few items from the restaurant. Your bases are covered; if something goes wrong, it won’t be a complete disaster.
Another approach to scenario planning is to start with the goal you're trying to achieve and then consider the different routes to getting there. You can do this from a positive standpoint, imagining that you’ve achieved the goal and focusing on what you did to make it happen, or a negative one, imagining you failed and considering all the factors that may have contributed to that result. What are all the things that can go wrong? What precautionary steps can you take to avoid that?
Research by a psychology professor at NYU showed that people who imagine negative scenarios related to their goals are more likely to achieve those goals. In imagining how they could fail, they do a better job of taking steps to avoid that failure.
By planning for both negative and positive futures, you end up with a clearer picture of all the possibilities. That will help you be more accurate when guessing at the probability of each outcome, which will give you a clearer sense of what to prioritize, which all adds up to making better decisions.
Scenario planning can protect you from hindsight bias. If you've considered all the different outcomes, the one that actually happens won't seem like it was inevitable—especially if the outcome that happens is one that you'd guessed had a low probability of occurring. That awareness will make you a more accurate judge of your own decisions, able to have compassion for yourself and others when things go wrong, able to keep from becoming overconfident about a success, and ready to make similar decisions going forward.
Use scenario planning to envision different outcomes related to your goals, and map out all the routes that could lead to those outcomes.
Picture one of your goals for the next five years. This could be a career goal, a fitness goal, a personal goal. It could involve anything from getting a promotion to crossing an item off your bucket list. Imagine a future where you’ve achieved that goal.
What are three steps that your future self might have taken to make that outcome happen?
Now, imagine a future where you haven’t achieved that goal. What are three barriers that might have prevented you from getting where you were hoping to go?
Finally, think of one way you could overcome each of those three barriers or at least reduce the risk they pose.
We’ve covered the key components of thinking in bets on an individual level. This chapter discusses why it’s also important to have a group of people who help you make better decisions. Duke cites such a group as a key part of her growth as a poker player, especially when she was first starting out. She also addresses how you find such a group and how you set up the rules of engagement, mutually agreeing to interrogate your own and each other's decisions in a way you wouldn’t in most social settings.
Self-critique is an important skill, but other people can help you see your blind spots. They bring their own unique life experiences to the table and give you the chance to view ideas from angles you hadn't considered before.
But first, you need to find someone who’s willing to have those discussions with you. Not every person wants to have their decisions or beliefs picked apart—to be asked to bet on their level of certainty. If you try to engage in that kind of “betting” with someone who hasn't agreed to it, all you're likely to do is create an awkward situation.
Think about it: If a friend loses a game and complains about their bad luck after, they’re more likely to be looking for sympathy than for someone to ask, “Well, what moves did you make? Where do you think you went wrong?” And if they come to you boasting about a win, chances are they’re looking for a pat on the back and not a back-and-forth about what they could have done better.
You can’t force someone to participate; they have to volunteer for it. People have to want to learn. That’s why forming a group around good decision-making practices is important.
Duke calls the group of people who help you vet your decisions a "truthseeking pod" or "decision pod." You can find these people among family, friends, or colleagues, or seek them out in professional groups or organizations.
This all sounds very formal. It doesn’t have to be. It can be as simple as an experienced poker player telling Duke, when she was first starting out, that she was free to discuss strategy with him but he wouldn’t indulge self-pity about bad luck. This was the same sentiment she found among a group of players her older brother (who also went on to be a professional with millions of dollars in poker earnings) introduced her to. Members of that group had a shared understanding that being good players meant continually examining their strategies, their decisions, and their outcomes—and helping each other do the same.
In a group, everyone is a willing participant and expects certain behaviors from the others. You can establish a norm of challenging each other’s beliefs and biases, and make helping each other improve a shared goal. Whether the group forms organically or is deliberately set up with written rules, there needs to be an explicit agreement about how you engage with one another. You’re interacting in ways that fall outside typical social norms, so expectations need to be clear. But if you can gather or join that group of like-minded people, you can help each other grow much faster than any of you would have managed on your own.
What are the qualities of a good group when it comes to learning from each other? How do you avoid ending up in an echo chamber, with people who just reinforce each other's existing ideas—or, alternately, people so opposed to each other’s viewpoints that they’ll dissolve into pointless arguments? The qualities of a productive group are:
- A focus on accuracy over confirmation
- Accountability
- Openness to diverse viewpoints
Let’s break down these qualities in more detail.
We discussed the importance of fighting against your own biases and considering ideas from multiple perspectives. In a way, it’s about challenging yourself. For a group to help you make better decisions, they have to be willing to challenge you, too.
This willingness to dissent arises in groups that value objectivity over polite acquiescence. In that kind of setting, challenge and discussion are encouraged. Disagreement can feel impolite, but if it’s part of the group’s norms, then people are more willing to share their views and contradict each other.
By being part of a group that values accuracy, you can earn approval as a reward for behaviors like outcome fielding and analyzing your decision-making process—which creates and reinforces good habits.
Members of the group also expect to be held accountable themselves. They understand that they're not in a setting where their audience will passively accept whatever they say. If reward encourages certain behaviors, accountability can be used to discourage you from succumbing to your more irrational or self-destructive impulses.
The thought of having to explain to someone—a group of friends or peers, a professor you respect, or your parents—that you missed a major exam because you just forgot about it can be a much more effective barrier against poor decision-making than trying to resist those impulses by yourself.
Having different perspectives in the group is important for generating new ideas and helping each other see what you'd have otherwise missed.
Imagine you're watching a scene in a movie that's shot using one camera that remains in the same angle the entire time. What might you miss? The expressions or body language of a character that's facing the other way? A looming threat just outside the frame? The text on the maps pinned to the opposite wall, too far away to see clearly? Just as most movies tell a story using many different camera angles, we're more likely to get a full and complete view of the truth by consulting a diversity of perspectives.
Being able to draw on the experiences of a group of people gives you a greater bank of knowledge than you would have if you were limited to only your own lived experiences. Along with access to more knowledge, people with different viewpoints can point out evidence you'd missed, expose you to new information sources, or walk you through lines of reasoning you hadn't considered. Giving thought to different viewpoints helps you test what you believe and gets you closer to objectivity. It also helps you see yourself more clearly, as other people can point out biases you might not yet recognize in yourself.
(Shortform note: For more on the pitfalls of ideological polarization, read our guide to The Righteous Mind.)
Once you’ve formed the group, you’ll need clear rules of engagement. Duke looks to sociologist Robert K. Merton's guidelines for how he believed the scientific community should function, which he summarized with the acronym CUDOS: Communism (of data), Universalism, Disinterestedness, and Organized Skepticism.
Communism: Data is commonly owned, belonging to everyone. This means that scientists can't, for example, publish the results of an experiment without also sharing the underlying data. That's part of how reviewers can evaluate whether the results are sound.
In a decision group, “sharing data” means being honest about the factors that went into your decision—the data—and providing as much detail as possible, omitting nothing, especially the things you're tempted to leave out. If you're reluctant to share a piece of information, it could be because deep down, you think that detail will make you look bad or will run counter to a narrative you're trying to build—but it's just that kind of detail that warrants more investigation and critique.
Encourage others to do the same by rewarding them with approval when they share as full and complete a version of their story as they can.
Universalism: All evidence must be treated the same way, judged against a standard set of criteria, no matter its source.
This doesn’t mean you can’t evaluate whether a source is credible—but it does mean you shouldn’t disregard information without a fair evaluation just because you have negative feelings about its source. Likewise, you have to vet information that comes from sources you feel positively about: just because you like a person or outlet doesn’t mean they can’t be wrong or make mistakes. Just as you separate decisions from their outcomes, you must separate information from its source and judge the information on its own merits.
There are a few exercises you can do to start picking up this skill.
1. Before rushing to dismiss someone, make an effort to find one thing they're right about or that you agree with. For Duke, this meant studying her opponents more carefully before deciding they were bad players simply because she wasn't familiar with their strategies. It can also mean occasionally reading news from sources that hold different political or ideological beliefs from your own and trying to find common ground.
2. When judging a piece of information, imagine if it had come from a different source. If someone you dislike shared that information, would you be less likely to accept it? What if it came from someone you hold in high regard? Pay attention to how much your personal feelings about the source are influencing your judgment of the information.
Disinterestedness: Don't let conflicts of interest or other biases influence the work.
A conflict of interest can be as obvious as a corporation paying a scientist to produce a study supporting the health benefits of their product. Or it can be as subtle as our tendency toward resulting. If knowing the outcome influences what we think of a decision, then one way to ensure your group isn’t swayed in either direction is to withhold the outcome until after the discussion happens. Don’t tell them if you failed or succeeded; see what they think of your choice regardless of how it turned out.
Organized skepticism: All ideas are subject to scrutiny, criticism, and dissent. Skepticism of this kind isn't argumentative or confrontational, as long as everyone is willing to engage with their own uncertainty. You can express doubt and offer a dissenting opinion in ways that acknowledge that you, too, could be wrong.
As a group, you should reward civil dissent and debate with approval, encouraging healthy discussion and making it part of the group's norms. For example, professional organizations can set up ways for members to anonymously make suggestions or offer criticism, and reward them by actively addressing the topics that get brought up. And if you’re asking someone for advice, you can make it clear that you’re looking for feedback (and not just support)—and express gratitude when they give it to you.
Although we've established that people have to consent to “thinking in bets” before you can engage with them on that level, it’s still possible to maintain some truthseeking behaviors around people outside your group.