Fooled by Randomness is the first work in a five-book series exploring randomness by bestselling author and former options trader Nassim Nicholas Taleb. In this book, Taleb examines the outsized role that luck plays in success, how and why people generally misunderstand luck, and how we can accommodate randomness in our lives once we’re aware of it.
While Taleb primarily focuses on examples from the world of investing, his principles are applicable to any field ruled by unpredictability (such as economics and politics) and demonstrate how we’re fooled by randomness in many aspects of our lives.
(Shortform note: To further explore Taleb’s thoughts on randomness, read our summary of his second book in the series, The Black Swan.)
Usually, when we see someone who’s become fabulously successful, we attribute her success to a combination of skill and hard work. When we see someone who’s never had such riches, or who’s had them and lost them, we consider her simply not as capable.
But people don’t typically realize how much of an influence luck has on success or failure. Skill and hard work will generally only earn a person moderate success. Wild success, the kind that comes with many millions of dollars and lasting fame, is most often due to luck: a fortunate rare event plus a lack of negative rare events.
This is more true in industries that rely heavily on chance, such as investing, than it is for professions like carpentry or medicine that are built more on perseverance and skill. Often, though, people don’t recognize how fundamentally different chance-based professions and skill-driven professions are in how they each produce success. This misunderstanding, and the tendency to credit success to skill instead of luck, can lead people to make poor decisions.
Some ways in which we misunderstand the effect of luck on success are explored in the following sections.
People tend to see examples of enormous success as representative of the kind of success any person can expect in that industry. This is called the “survivorship bias,” by which we see only the people who have “survived,” that is, thrived in any given situation, and we extrapolate lessons from their survival: mainly, that wild success can be reasonably expected.
For example, when an investor strikes it rich or a writer lands a multi-million-dollar movie contract, we often internalize those successes as a likely possibility for anyone in those fields. However, to accurately evaluate the potential for success in any venture, you must consider not only the observable results but also the invisible alternatives: the possibility that those successful people might have failed had they experienced unluckier circumstances.
So, to properly determine the likelihood of getting rich as a trader, you must take into account the many people who’ve attempted it and failed, not just those who’ve made a fortune. To properly judge the likelihood of getting rich through writing, you must consider all the authors who couldn’t find a publisher, or those whose published book garnered few sales, not just those who penned bestsellers.
Looking at invisible alternatives allows you to better see the role luck plays in your successes and failures. What might have happened if the market hadn’t jumped 500 points that morning? What if that large bet you made on that particular stock hadn’t worked out?
Getting into this habit can take practice. It’s easy to spot the chance inherent in a clearly-defined, hypothetical situation: for example, a game of Russian roulette in which you are offered fifty million dollars to shoot a gun loaded with one bullet and five empty chambers at your head.
It’s less easy to spot the random nature of professions that hew closer to mainstream sources of income and have less-clearly-defined parameters, like trading. Because there are so many elements involved in trading, the randomness driving it is better hidden, and consequently, people tend to be blind to it; they then assume that success in such a field is due to skill in the same way that success in a non-luck-driven field (such as teaching) would be.
When we speak of luck or randomness determining success or failure, we are talking specifically about rare events and their outsized influences on any particular path. Rare events are infrequent and usually unpredictable events that bring with them either a huge payoff or a devastating wipeout. Very often a person who’s had wild success has merely been lucky enough, so far, to either take advantage of a positive rare event or to elude a negative and devastating one.
History is littered with rare events, but it’s impossible to know exactly when and where they are going to hit. This is why people tend to ignore the probability of rare events; it is hard to plan for something you can’t predict, and it’s hard to understand something that doesn’t follow the rules.
However, when you don’t understand how rare events work and how significant their influence is, you are unable to properly assess risk and opportunities, or to clearly see what’s shaping the world around you—including your success.
Because of the nature of randomness and rare events, given a large starting set of people, a certain percentage will inevitably end up wildly fortunate, regardless of their skill or competence. For example, imagine a set of 10,000 traders, each with a 50 percent chance of either making money or losing money every year. Assume that if a trader loses money, she’s out of the game. After the first year, there will be 5,000 traders left. After five years, there’ll be just over 300 traders left. The continued success of these 300 traders might be due entirely to luck, but in the real world, each of these traders would be lauded for her skill.
Because of this, it is impossible to judge a person—particularly in a luck-based field—based on her track record, which might indicate nothing more than a lucky streak. You should be wary of anyone touting their excellent track record if they work in a profession that relies on chance.
Because of the nature of randomness, life can be unfair. The most capable companies and the most skilled people are not always the ones who achieve success. Consider that if we ran the simulation above so that each trader only had a 45 percent chance of making money in a year, at the end of five years we would still have almost 200 traders in the game. These traders would be celebrated as extraordinary even though they were slightly incompetent (as illustrated by their greater-than-even chances of loss).
People find it hard to properly assess the likelihood, or probability, of rare, random events for several reasons: we don’t learn well from the past, the noise of the present obscures what matters, our own predictions of the future can change it, and our brains rely on shortcuts that distort probability. These concepts are explored further in the following sections.
One reason we’re bad at assessing and preparing for the risk of rare, random events is because we are not good at learning from the past. We mistakenly believe that because something has never happened before, it can’t happen now.
Further, even when we do remember a past rare event, we tend to believe we now understand the events that led up to it, and we therefore think we can “predict” it; that is, if it were to happen again, we wouldn’t be taken by surprise: We’d see it coming and be able to minimize our losses. However, this is rarely the case; when something is happening in real time, we are too close to judge which events will end up being consequential and which will not. We can typically only have clarity on an event when some time has passed. For example, we can look back on the stock market crash of 1929 now and clearly see the warning signs; at the time, though, those signs would have been harder to spot.
The past is a poor teacher in another way, too: When we look back on it, we only see what actually happened. We stay blind to the possible events that might have happened.
This is the classic problem with “induction”: the method of understanding the world by making observations and drawing conclusions from them. When your theories of the world are based on only what you can personally observe, you’ll miss rare events.
For example, you may look at thousands of swans, see that each one is white, and conclude that all swans are white. However, some swans are black. They are rare, and if you don’t live in Australia you might never see one, but they do exist and they disprove your conclusion.
You can use observations to disprove a theory, but you can’t use them alone to prove one. Just one counterexample is enough to disprove a theory based on millions of observations.
Even when we can see the past clearly, it’s still hard to apply those lessons to the future because it’s difficult to break through the noise of the present and know in real time which events are going to change the course of history in a significant way. This makes it hard for us to see a rare event coming.
“Noise” is the overwhelming deluge of facts that bombards us from newspapers, television, online outlets, and so on. It includes up-to-the-minute stock fluctuations, daily explanations of market moves, and endless analyses of companies, most of which will be out of business within a decade. Small changes in the market are random, and paying close attention can lead you astray, convincing you that unimportant things have larger consequences than they actually do. The passage of time generally filters out noise, allowing you to see what ended up mattering.
Avoid getting caught up in noise by limiting your exposure to it. Ignore the day-to-day headlines and allow yourself to see events from a distance of time. That way, when something of significance happens, such as an important IPO or market movement, you will be better able to recognize it and act on it.
The future eludes us in another way, too. The future is affected not just by outside influences but also by our own understanding of those influences. Our predictions of the future can themselves change it. For example, if traders as a whole notice that the market always rises in March, they would all buy stocks in February in anticipation of the rise. Consequently, the market would no longer rise in March; it would rise in February.
Because of this, if everyone fully and properly understood the past, their predictions would no longer hold true because the preparations they’d make for the future event would prevent the event from happening as expected. The tendency of the future to repeat the past depends upon our being driven by the same invisible forces that drove past events. Thus, rare events like stock sell-offs exist because they are unexpected; if they were expected, people would prepare for them and the sudden sell-off wouldn’t happen.
Our brains have developed shortcuts of thinking that allow us to react quickly and decisively to threats. We’ve evolved these shortcuts to save ourselves time and mental energy; if we were to stop and think thoroughly about each interaction we have throughout the day, we would either miss opportunities or succumb to threats.
Unfortunately, because these shortcuts lead us to believe many things without fully thinking them through, our views of the world are often based on misunderstandings and biases we unwittingly hold. This prevents us from correctly evaluating probabilities and the likelihood of a rare event affecting us. A few of these misunderstandings and biases are outlined below.
We make decisions and evaluate risk almost entirely through our emotions, and we only use our rational brain afterward to justify our decisions. For example, you might fall in love with a particular car, purchase it, then justify the decision afterward based on gas mileage and interest rates.
Without any emotions, we can’t make decisions at all (as evidenced by brain-trauma victims who lose the ability to feel and end up unable to choose between rational options), but the flip side is that emotions can steer us wrong by blocking out our rational thinking and leading us to misjudge risk.
To better identify risk, the primitive and emotional parts of our psyche have evolved to quickly scan the environment for threats. Because of this, we don’t like complexity. We respond best to simple concepts that are easily understood and quickly summed up. We therefore tend to gloss over the finer points of probabilities, which are not only difficult but are often also counter-intuitive.
The primitive and emotional sections of our brain also pay much closer attention to surprises than to run-of-the-mill news. We attach greater significance to shocking events even if they are not ultimately important, and we tend to believe events that are more easily brought to mind are more likely to occur. We therefore overestimate the risk of unlikely events while ignoring the risk of more likely ones. We can see this in how the media covers bizarre but relatively unthreatening news (like mad cow disease) while ignoring much more common, and more likely, threats (like car accidents).
For most of human history, people had little need to calculate probabilities; we lived simple lives and dealt with tangible threats. Our brains thus evolved to better understand concrete ideas rather than abstract ones, and consequently, we have trouble assessing the risks of abstract circumstances. Studies have shown that when presented with two sets of risks, people will be more concerned about the one that describes specific threats even if the more general threats would also include those specific threats. For example, travelers are more likely to insure against death from a terrorist attack during their trip than against death from any cause (including, but not specifying, terrorism).
We dismiss the risk of rare events when we consider the likelihood of them happening in any one specific situation or in one specific way. We then underestimate the larger risk of a rare event happening in general. To illustrate, your odds of winning the next lottery may be one in 25 million. But the odds of someone winning are one in one: one hundred percent. In the same way, the odds of a specific market correction happening on a specific date might be small; the likelihood of any market correction happening on any date is large.
We often misunderstand opportunities, risks, and probabilities because we evaluate them from too small of a sample size. We extrapolate lessons from just a few examples and apply them to wider situations, resulting in misguided strategies. For example, if a trader makes one right call, people may believe she’ll make more right calls in the future. If she makes one wrong call, they may question her reputation as a skilled trader. Neither assessment may be correct: More evidence is needed to prove either claim.
Our brains are wired to look for meaning, and this often causes us to find meaning where there is none.
People see meaning in intelligent-sounding but empty sentences. When investment funds or advisors publish grandiose statements filled with industry buzzwords, they project an aura of expertise that they may not actually have. Investors often base decisions on such advice or trust those advisors with their money, not recognizing the lack of meaning obscured by the fancy phrases.
People try to find rational explanations for circumstances of luck. When a trader makes a lot of money, people look for reasons for her success; they’ll often credit her intelligence or market-savvy. If she later loses money and is forced out of the game, people will again look for reasons; they might point to relaxed work ethics, for example. In reality, her success may be entirely down to luck.
People look for patterns that might reveal some hidden winning code. Though you can find meaning in any random data points (like constellations of stars), most market movements are random noise and not worth studying for meaning.
In judging whether or not a strategy is smart, we tend to focus on whether or not there is a high probability of winning, but ignore the more important aspect of how much we might win or lose. Because rare events have an outsized influence, frequent, typical events do not matter as much as infrequent, rare events do. For example, someone who makes her money in large but infrequent bursts, by anticipating rare events, often ends up wealthier than someone who makes her money slowly and steadily but ignores rare events (and thus either misses rare opportunities or succumbs to rare collapses).
Eventually, luck gives way. While short-term success is often driven by luck, long-term success is more often determined by skill, as good or bad luck eventually runs out and outliers of either success or failure eventually earn outcomes more closely matching their skills. This is called “reverting to the mean.” Traders who make bad trades will blow up. Waiters who win the lottery will likely not win it again. People who experience bad luck despite hard work and skill will eventually rise in their station.
Therefore, to avoid failure in our random world, remember that success built on hard work and smart choices is more lasting than that built on luck. Avoid chasing success that relies on luck, such as that born of risky trades. Be aware of the ways your instincts and emotions blind you to the roulette wheel. Prepare for risks and don’t expose yourself to losses you wouldn’t be able to handle. Some specific thoughts on how to develop these sorts of mindsets follow.
As humans, we are pre-programmed to react emotionally to stimuli—and, as previously noted, our emotions can cloud our judgment. Instead of fighting your emotional instincts, eliminate their triggers. For instance, people instinctively react emotionally to news of market changes, so eliminate such noise from your daily life. Don’t read the market reports, don’t watch television, and don’t scroll through the financial analysis online. Set your alerts so that you are only notified should the market, for example, jump 2 percent.
Don’t focus on whether the market will rise (as in a bullish market) or fall (as in a bearish one); focus instead on the impact that either movement will have on your results. Correctly predicting small market movements might earn you small gains, but hedging against large drops or rises can be much more profitable.
Aim to make money from rare events rather than from slow and steady bets; rare events tend to be more profitable in the long run. Short the market (bet that it will decrease) when you determine that doing so would be significantly more profitable than betting on an increase, even if an increase is more likely.
Traders who defend their chosen positions long past the point where they’re helpful end up following those losing bets to failure, unwilling to admit errors of judgment in their original positions. Traders who find long-term success are willing to change their positions at any moment as they see conditions change in the market.
To achieve long-term success, acknowledge you are fallible and prone to mistakes and be willing to abandon poor decisions—for instance, an unfavorable position—when it becomes clear they are no longer wise. Then, incorporate the lessons you learn from those mistakes into your future decisions.
We don’t have control over the random events that happen to us. The best we can do is plan for them and react appropriately when they inevitably happen. Anticipate the risks you can predict, and allow room in your plans for those you can’t.
Keep in mind that no matter how much you plan, randomness will always be ready to strike. Bear this in mind and react with dignity to the rare events that inevitably hit you:
Without realizing it, people often attribute luck to skill, randomness to determinism, and coincidence to causality. We confuse noise with signals, forecasts with prophecies, and supposition with certainty. We consider the lucky idiot a skilled investor without understanding the games of chance that created her success. Consequently, we leave ourselves vulnerable to risks and randomness that we should have anticipated, but didn’t.
In his book Fooled by Randomness, bestselling author and former options trader Nassim Nicholas Taleb examines the outsized role that luck plays in success, how and why people don’t generally understand luck, and how we can accommodate randomness in our lives once we’re aware of it. The book is the first in a five-book series entitled Incerto that examines many different aspects of randomness.
While Taleb primarily focuses here on examples from the world of investing, his principles are applicable to any field ruled by unpredictability (such as economics and politics) and demonstrate how we’re fooled by randomness in many aspects of our lives.
(Shortform note: To further explore Taleb’s thoughts on randomness, read our summary of his second book in the series, The Black Swan.)
We’ll divide our summary of this book into four parts: Part 1 examines how people mistake luck for skill and how rare events influence success and failure; Part 2 explores the ways we fail to understand and anticipate randomness; Part 3 looks at why our brains have such trouble comprehending probability; and Part 4 offers ways to accommodate randomness in our lives.
(Shortform note: We have significantly reorganized the book from its original structure to add clarity.)
When we talk about luck, we are talking about randomness—more specifically, “rare events”: infrequent and usually unpredictable events that bring with them huge payoffs or devastating wipeouts. Rare events have outsized and uneven effects on achievement, occasionally bestowing success on less competent people and at other times taking it away from those who’ve had long winning streaks.
This is more true in industries that rely heavily on chance, such as investing, than it is for professions like carpentry or medicine that are built more on perseverance and skill. Often, though, people don’t recognize how fundamentally different chance-based professions and skill-driven professions are in how they each produce success. This misunderstanding, and the tendency to credit success to skill instead of luck, can lead people to make poor decisions.
In Part 1, we’ll examine how people mistake randomness for skill and how rare events can influence success and failure.
When we see someone who’s had incredible success, we often credit that success to a combination of skill, hard work, intelligence, and perhaps some other mysterious traits that create millionaires. However, skill and hard work will generally only earn a person moderate success. Wild success, the kind that comes with millions of dollars and lasting fame, is usually due to luck: a positive rare event plus a lack of negative rare events.
This is not to say that luck is the only ingredient in a successful working life. To take advantage of luck, you must have a base of preparation that includes skill, experience, and presentation. Showing up on time, wearing appropriate clothing, and working hard are all necessary first steps. But this doesn’t explain why some people find runaway career success while others don’t, any more than the act of buying a lottery ticket, while a necessary step, can explain why one person wins over another.
When people mix up luck with skill, they are confusing what is necessary with what is causal. It might be necessary for you to wear a clean shirt to work, but that didn’t cause you to handsomely profit off that last trade. Millionaires might necessarily work hard and take risks; this does not mean all hardworking risk-takers are millionaires.
History is littered with examples that illustrate this. Though Julius Caesar was surely intelligent, noble, and brave, so were many others who never rose to his heights. His personal characteristics were necessary for his achievements but are not enough to explain his lasting fame, which was more likely due to him repeatedly being in the right place at the right time.
One reason people attribute luck to skill is the “survivorship bias.” We typically see only the people who have “survived,” or thrived in any given situation, and we extrapolate lessons from their survival: mainly, that wild success can be reasonably expected from this particular industry or venture. We fall prey to the survivorship bias partly because the wildly successful examples are simply more visible; the failures tend to slink away into obscurity and remain unnoticed. When we don’t see them, we forget they’re possible.
This bias causes people to see examples of enormous success as representative of the kind of success any person can expect in that industry. For example, people see a fabulously wealthy stockbroker and think, “Trading is very profitable.” Or they see a bestselling author and think, “Writing is a great way to get rich.”
However, to accurately evaluate the potential for success in any venture, you must consider not only the observable results but also the invisible alternatives: the possible failures had the person’s luck been worse and success not been achieved.
For example, you can’t properly determine the likelihood of getting rich as a trader without accounting for the many people who have attempted it and failed. You can’t judge your chances of getting rich through writing without considering all those who couldn’t find a publisher or whose published book garnered few sales.
When you think about these invisible failures, you are thinking through “alternative histories” or “possible worlds”: other ways your path might have played out if your luck had been different. For instance, what might have happened if the market hadn’t jumped 500 points that morning? What if that large bet you made on that particular stock hadn’t worked out?
The idea of possible worlds has been promoted in various branches of thought.
In philosophy, possible-world theories consider whether or not God produced an infinite number of possible worlds and then followed through on just one.
In physics, a many-world theory of quantum mechanics posits that the universe branches out, tree-like, at every change; we are living in just one of these many worlds.
In economics, the “state-space” method looks at economic uncertainty by examining the “what-ifs” that might happen under different markets or world conditions.
In mathematics, a field often applied to the study of investing, so-called Monte Carlo methods of thinking help produce a hypothetical set of alternative paths: successions of events that might have happened after an initial, defined event. Mathematicians even have a computer program called a Monte Carlo generator that produces enormous sets of these possible paths, allowing the user to examine thousands of possible outcomes given a certain set of conditions. These simulations have been used in everything from war preparations to financial markets to help policymakers and businesspeople calculate odds without the aid of complicated mathematical formulas.
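For illustration, here is a minimal sketch in Python of what such a generator does. All of the parameters (starting price, drift, volatility, number of paths) are illustrative assumptions, not figures from the book; the point is simply that one set of starting conditions fans out into thousands of alternative histories.

```python
import random

def monte_carlo_paths(n_paths=1000, n_steps=250, start=100.0,
                      drift=0.0005, volatility=0.01, seed=42):
    """Generate alternative price histories from one starting point.

    Each path is one "possible world": a sequence of random daily
    moves. All parameters here are illustrative assumptions.
    """
    rng = random.Random(seed)
    paths = []
    for _ in range(n_paths):
        price = start
        path = [price]
        for _ in range(n_steps):
            # One day's move: a small drift plus Gaussian noise.
            price *= 1 + rng.gauss(drift, volatility)
            path.append(price)
        paths.append(path)
    return paths

finals = sorted(path[-1] for path in monte_carlo_paths())
print(f"median outcome:  {finals[500]:.2f}")
print(f"5th percentile:  {finals[50]:.2f}")
print(f"95th percentile: {finals[950]:.2f}")
```

Sorting the final values shows the spread of outcomes that the same starting conditions can produce, which is exactly the "alternative histories" view described above.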
Of course, you don’t need an actual Monte Carlo generator to think about possible worlds. When presented with a choice, imagine the possible outcomes and then go a few steps further to imagine the possibilities that might arise from each outcome. In this way, you can start to consciously see that the outcome you’re hoping for is only one of many possible outcomes.
It’s easy to spot the random nature of the fortune produced by a hypothetical game of, say, Russian roulette, in which you are offered fifty million dollars to fire a gun loaded with one bullet and five empty chambers at your own head. These conditions are simple, limited, well-defined, and unarguably random.
It’s less easy to spot randomness in professions that hew closer to a mainstream expectation of income and that have fewer clearly-defined parameters, like trading. But the random nature of these professions is just as influential. This has been illustrated by studies showing that stocks chosen by throwing darts at a list are as likely to succeed as those chosen by seasoned professionals.
The difficulty of recognizing random influences makes it even more important to consciously look for them. Because the fatal “bullet” is more infrequent in the real world, and because its risks are vaguer and harder to spot, people end up playing Russian roulette without consciously realizing it; that is, they end up playing a game based on luck while thinking it’s based on skill. Then, when the beneficiaries of this random luck become role models, others are lured into the game thinking their odds of success are greater than they actually are.
Get into the habit of thinking about possible worlds and alternate paths and you will gain the ability to see possibilities every time you are presented with a choice. This will help you “learn from the future” so you can better assess risk, and more readily resist the pull of the roulette wheel.
When we speak of luck determining success or failure, we are talking specifically about rare events and their outsized influences on any particular path. A person can be vaulted to great success if she catches a positive rare event, such as an unlikely and highly profitable trade. Conversely, she can spend every day making winning bets but can lose everything in a few minutes when a rare bad bet wipes out all her winnings up to that point.
(When a trader makes a trade that wipes out her capital and results in her getting fired or leaving investing altogether, it is called “blowing up.” A blow up is more than just losing money; it is losing more than the trader was expecting or was prepared for, and is usually enough to end her career. Most traders blow up at some point. In fact, the 10-year survival rate for traders is under 10 percent.)
History is littered with rare events, but it’s impossible to know exactly when and where they are going to hit. This is why people tend to ignore the probability of rare events; it is hard to plan for something you can’t predict, and it’s hard to understand something that doesn’t follow the rules.
However, when you don’t understand how rare events work and how significant their influence is, you are unable to properly assess risks and opportunities, or to clearly see what’s shaping the world around you. Rare events and randomness make track records unreliable, sometimes reward the less skilled, compound early advantages into lasting ones, and make luck-based success too fragile to be worth chasing. These ideas are explored further below.
A solid track record does not indicate future success in a luck-based profession like investing. Because of the nature of randomness and rare events, given a large starting set of people, a certain percentage will end up wildly fortunate and thus have an excellent track record, regardless of their skill or competence.
For example, imagine we run a Monte Carlo simulation in which 10,000 traders each have a 50 percent chance of making money or losing money every year. Assume that if a trader loses money, she’s out of the game. After the first year, there will be 5,000 traders left. After five years, there’ll be just over 300 traders left. The continued success of these 300 traders might be due entirely to luck, but in the real world, each of these traders would be lauded for her skill.
Thus, to properly judge the value of someone’s track record, you need to know the size of the initial group she came from. Out of a large starting set, at least some people will dodge negative rare events for longer and will end up randomly successful, but from a small starting set, it is more likely that everyone will succumb to bad luck sooner. Therefore, if a person is successful coming from a small group, her success is more likely due to skill than luck. To illustrate, if we run the Monte Carlo simulation above with a group of only 10 traders, all of them will likely be wiped out within three or four years. Thus, if one of those people is successful after five or more years, it more likely indicates skill than luck.
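As a rough check on these figures, the thinning can be simulated directly. The sketch below (not from the book) replays the coin-flip setup, including the small-group case and the 45 percent variant discussed a little further below.

```python
import random

def average_survivors(n_traders, p_win, n_years, trials=100, seed=7):
    """Average count of traders who post a winning year every year."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        alive = n_traders
        for _ in range(n_years):
            # Each still-standing trader wins the year with chance
            # p_win; a losing year knocks her out of the game.
            alive = sum(1 for _ in range(alive) if rng.random() < p_win)
        total += alive
    return total / trials

print(average_survivors(10_000, 0.50, 5))  # ~312 lucky survivors
print(average_survivors(10, 0.50, 5))      # ~0.3: small groups die out
print(average_survivors(10_000, 0.45, 5))  # ~184 "celebrated" traders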
To press the point further, let’s examine a scam that takes advantage of the numbers in the example above to manufacture a winning track record. Imagine someone sends letters to 10,000 randomly selected people in January. Half of these letters predict the market will rise in the next month and half predict it will decrease. At the end of the month, 5,000 letters have correctly predicted the market. This person then sends letters to just those 5,000 that were correct—again, with half these new letters predicting an increase in the next month and half predicting a decrease.
At the end of each month, she repeats: sending new letters to only the people who’d received the correct predictions. By June she’s left with only about 300 people, but these 300 are likely to be amazed at her powers of prediction, not realizing the randomness responsible for her winning track record. They may, therefore, be more likely to make a poor decision—in this case, they might entrust her with their money.
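The bookkeeping behind the scam is nothing more than repeated halving, as this quick sketch of the summary’s hypothetical numbers shows.

```python
recipients = 10_000
for month in range(1, 6):     # five monthly predictions
    recipients //= 2          # half the letters turn out correct
    print(f"after month {month}: {recipients:,} unbroken streaks")
# By June, ~312 people have each seen five straight correct calls,
# manufactured without any forecasting skill at all.
```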
Because of the nature of randomness, life can be unfair: The most capable companies and the most skilled people are not always the ones who achieve success. For example, consider if we ran the simulation above so that each trader only had a 45 percent chance of making money in a year. Even though these traders would be slightly incompetent (as evidenced by their greater-than-even chances of loss), at the end of five years we would still have almost 200 traders in the game, each of whom would be celebrated.
This possibility of luck favoring the less-skilled is compounded by the fact that in the real world, unlike in hypothetical situations, early success helps determine subsequent success. In a hypothetical situation, if you were to flip a coin, your chances of guessing heads or tails would be the same for each successive flip. However, the real world often functions so that if you win an early random advantage, you are better placed to win subsequent random advantages, and vice versa: Early random losses lead to subsequent random losses.
We see this when the paths of two companies, both well-qualified, diverge because one had early chance meetings with key executives or unexpected early product orders. The success of Microsoft is an example of this. Though Bill Gates was undeniably hard-working and intelligent, so were many of his rivals, and few would argue that his software took off because it was the best available. It is more likely that he got an early advantage through chance events and early positive feedback.
This is called a “path dependent outcome” and it explains why we can see a few astonishing successes but many more failures. The QWERTY keyboard is another illustration of this effect. It did not become the standard (and thus a success) because it was the most efficient layout of letters; in fact, it was developed purposefully to slow typists down with the difficulty of its design. However, it was the initial design introduced on a commercial scale, and once people got used to it, it became impossible to change—despite many attempts—simply because it had been the first, not because it was the best.
People also misunderstand the way evolution works in either the natural or business worlds, believing that it steadily and irreversibly improves both animals and companies little by little. However, this belief ignores the way that rare events affect evolution.
Darwinian evolution describes the tendency of a species overall to survive in the long run. Often, though, in short-term stretches, the path is not smooth or linear. Sometimes animals evolve unhelpful genetic mutations, which might not harm them in the short term (and may even help them) but might weaken them in the long term. Such mutations will persist as long as the animals don’t encounter a rare event that the mutation leaves them unable to handle. As soon as the animals encounter this kind of event, though, their unhelpful traits will cause them to fail. In this way, after a few generations, this “genetic noise” typically gets filtered out.
In the same way, behaviors and strategies that benefit traders under certain lucky conditions might wipe them out under others, and someone who does well when markets behave predictably may not do well when they don’t. At any given time in the market, the most successful traders are those whose qualities or strategies best fit the current environment. For example, traders who tend to buy during dips do well when those dips later turn into rallies, but that same strategy will sink them if the market continues to plummet. In a real-life example, traders who bought foreign currencies in the 1980s when the US dollar was overpriced went bust; traders who did the same thing a few years later, when the dollar was priced lower, got rich.
In sum, short-term survival can often be attributed to lucky conditions, but long-term survival does not favor those who depend on luck for their success.
(Shortform note: For more discussion on our misconceptions of evolution, read our summary of The Selfish Gene.)
The idea that success built on luck is short-lived brings us to our next point: Success built on luck is not worth the risk. Luck-based success will always disappear at some point, and when that happens, a person can lose everything she’s won up to that point. In the end, it doesn’t matter how frequently a bet succeeds if its costs of failure are too high to bear. In the long run, it is better to have smaller successes and protect yourself against large failures than to have large success but be vulnerable to even bigger failures that will take it all away.
For example, in that hypothetical game of Russian roulette described above, there are six possible outcomes; in five of them, you emerge fabulously rich. But in one, you die. Though the likelihood of success is greater, the cost of failure is too high to make this a wise way to make money. Similarly, if you are making bets that earn you money every time they go through but leave you vulnerable to a loss that wipes you out, you will not have longevity. Sooner or later, the lesser-likely event will happen; at that time, all of your previous wins count for nothing.
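A quick expected-value sketch makes this asymmetry explicit. The payoff figures below are illustrative assumptions; the point is that a bet that usually wins can still lose overall, and vice versa.

```python
def expected_value(outcomes):
    """outcomes: (probability, payoff) pairs for a single bet."""
    return sum(p * payoff for p, payoff in outcomes)

# Wins small and often, but is exposed to a rare, ruinous loss.
steady_winner = [(0.99, 1_000), (0.01, -500_000)]
# Loses small and often, but profits hugely from the rare event.
rare_event_bet = [(0.99, -1_000), (0.01, 500_000)]

print(expected_value(steady_winner))   # -4010.0: a losing bet overall
print(expected_value(rare_event_bet))  # +4010.0: a winning bet overall
```

The steady bettor wins 99 times out of 100 yet still loses in expectation, because the rare loss outweighs all the accumulated small wins.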
In contrast, moderate success built without the aid of luck is less vulnerable to the whims of randomness. While the rewards of this success may be smaller, the stability it provides is greater. For example, if you are building success with persistence and intelligence in a field like, say, cardiology, you may not end up fabulously rich in any of your possible worlds, but in most of them you’ll end up at least moderately successful. In contrast, traders who make bad trades will blow up, and waiters who win the lottery will likely not win it again. Over the long run, success tends to be determined by a person’s inner qualities rather than long streaks of good fortune.
People often underestimate the influence of luck on success. We very often attribute luck to skill and randomness to determinism.
Look back on your own career progression. What were the major turning points along the way—the pivotal moments where you were either faced with a choice or an opportunity found you?
Do you think these pivotal moments were due either in part or in whole to chance? Why or why not?
Describe a time along the way when a choice you made regarding your career—or a gamble you took—might have turned out differently.
Had you made this choice differently, how might that have affected your career path?
Though hard work, skill, and intelligence are often necessary first steps toward success, they rarely account for runaway success, which is far more often due to luck: a positive rare event plus a lack of negative rare events.
Think back to your time at school, your early adulthood, or your early days in a new company. List two or three people you considered at that time to be somewhat equally talented and intelligent.
Compare their career paths since that time. Did any of them achieve significantly more success than the others?
Consider the choices these people made—was their success or lack of it an inevitable result of those choices? Can you see where luck might have played a part anywhere along the line?
Now that we’ve explored some of the ways in which randomness affects success and failure, we’ll start to examine our difficulty in understanding and anticipating it. In Part 2, we’ll explore three concepts that reflect this difficulty: our failure to learn from the past, the problem of induction, and the difficulty of predicting future rare events.
One reason we’re bad at assessing and preparing for risk and random events is that we are not good at learning from the past. We mistakenly believe that because something has never happened before, it can’t happen now. We then defend our lack of planning accordingly: “That had never happened before!”
A longer-term examination of history shows that rare events of all kinds do, indeed, happen. The very definition of a rare event is its unpredictability, and history is littered with events that had never happened before. If the past brought surprises to the people who lived through it, why shouldn’t our future bring surprises to us?
Even when we do remember a past rare event, we tend to falsely believe that we now understand the events that led up to it, and we therefore think we can “predict” it; that is, if it were to happen again, we wouldn’t be taken by surprise. We’d be more prepared for, and therefore less exposed to, any negative fallout from a similar rare event.
We also tend to falsely believe that mistakes of the past that led to these events have been resolved, making it even more unlikely that they would happen again. For example, people know that 1929 proved that stock markets can crash, but they often chalk that up to specific causes of that time. They believe, in other words, that the event is contained and non-repeatable.
We thus like to imagine that if we were to live through certain historical events, such as the stock market crash of 1929, we would recognize the signs and wouldn’t be taken by surprise in the way that people at the time were. This is the “hindsight bias,” otherwise known as the “I knew it all along” claim. However, seeing something clearly after the fact is much easier than seeing it clearly in real time.
In the same way, a manager taking over a trading department might do an analysis and find that only a small percentage of the trades made that past year were profitable. She might then point out that the solution is to simply make more of the profitable trades and fewer of the losers. Unfortunately, such a statement of the obvious doesn’t provide any usable guidance for future trading decisions.
One reason it’s hard for us to understand the risk of rare events is that when we examine the past, we are looking at a specific set of events that actually did happen, while ignoring the possible events that might have happened. This points to a fundamental problem with inductive reasoning: While deductive reasoning tries to make sense of the world by coming up with a theory and then finding data to support it, inductive reasoning works in the opposite direction: You observe data, detect patterns, and formulate a theory based on those observations.
The problem with induction is that when your theories of the world are based on only what you can personally observe, you’ll miss rare events. For example, you may look at thousands of swans, see that each one is white, and conclude that all swans are white. However, some swans are black. They are rare, and if you don’t live in Australia you might never see one, but they do exist and they disprove your conclusion.
You can use observations to disprove a theory, but you can’t use them alone to prove one. Just one counterexample is enough to disprove a theory based on millions of observations. The same thing holds for observations of history. You can point to historical events to refute a conclusion, but you can’t make a sound conclusion based on a lack of historical events. For example, before September 11, 2001, you might have said that airplanes are never purposefully flown into skyscrapers. Your theory, based on a lack of comparable events, would have been disproven by the events of that one date.
Therefore, if you base your understanding of the market on what you have observed, you do not allow for the rare event, and you leave yourself vulnerable to it. You might say, “The market never drops by thirty percent in any four-month period,” and point to the fact that it’s never happened to prove your theory. But “It has never gone down” is different from “It never goes down.”
One reason we misconstrue inductive reasoning is that we confuse absence of evidence with evidence of absence. This faulty reasoning shows up across all industries. For example, imagine a drug trial shows improved outcomes of 2 percent, but the researcher decides that her sample size was too small and the improvements she found were too slight to be decisive. She might report that she has found no evidence as of yet of improved outcomes. Other doctors might then read that and decide she means, “This medicine does not help.”
In the same way, looking at history and saying “this hasn’t happened” is often interpreted as “this cannot happen,” an attitude that blinds people to potential risks.
Not only do we incorrectly interpret the past, but we also have trouble using its lessons to predict future rare events, no matter how hard we try, for three primary reasons: in real time, noise obscures which events are consequential; the world changes so fundamentally that past lessons may not apply; and our own predictions of the future can change it. These three reasons are explored further below.
When people think they could have correctly predicted past rare events, they think that they can also correctly predict contemporary or future rare events. However, we can typically only have clarity on an event through the lens of time passed; in real time, there’s usually too much “noise” to be able to judge what’s consequential and what’s not.
“Noise” is the overwhelming deluge of facts that bombards us from newspapers, television, online outlets, and so on. It includes up-to-the-minute stock fluctuations, daily explanations of market moves, and endless analyses of companies—most of which will be out of business within a decade.
Small changes in the market are most likely random, and paying close attention to them can lead you astray, convincing you that unimportant things have larger consequences than they actually do. This means that noise is not just useless but can be actively harmful if it leads you to make bad decisions, such as abruptly selling a stock because of minor movements when it would have been wiser to hang onto it. Only the passage of time filters out the inconsequential changes, revealing which events ultimately prove unimportant and which change the direction of history.
This is especially true for stock prices. Looking at the minutiae of constant price changes means you are focusing on the variance but not the returns. It’s unhelpful, and the negative moments may even convince you to act prematurely. Imagine an investor is sitting on a portfolio that has a 90 percent chance of increasing over the course of a year. If she checks the stock prices every minute, she might experience 250 happy minutes each day as prices rise and 240 unhappy minutes as prices fall.
Because a person reacts more strongly to negative news than positive, she’ll end each day exhausted. She’ll also have 240 instances during which she’ll question her strategy and consider changing course. However, if she only checks her balance yearly, she’ll experience 9 happy moments for every unhappy one, since that 90 percent chance of an increase means she’ll have good news in 9 out of 10 years. Time will have filtered out the unhelpful noise.
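As a rough sketch of why checking frequency matters, assume portfolio returns follow a simple random walk calibrated so that the chance of being up over a full year is 90 percent, as in the example above (the 250-day, 490-minute trading calendar is also an assumption). Under a random walk, the chance of seeing a gain over a horizon shrinks toward a coin flip as the horizon shortens:

```python
from math import erf, sqrt

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

def inv_phi(p):
    """Invert phi by bisection (accurate enough for a sketch)."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if phi(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Calibrate the walk so the portfolio has a 90 percent chance of
# being up over one full year, as in the example above.
z_year = inv_phi(0.90)

# The chance of being up over a horizon t (in years) scales with
# the square root of t.
minutes_per_year = 250 * 490  # assumed trading days x minutes per day
for label, t in [("1 year", 1.0),
                 ("1 day", 1 / 250),
                 ("1 minute", 1 / minutes_per_year)]:
    print(f"{label:>8}: {phi(z_year * sqrt(t)):.2%} chance of a gain")
```

At the one-minute scale the odds are nearly a coin flip (about 50.1 percent), which is why the investor’s day splits into roughly equal happy and unhappy minutes even though her year is almost certainly a good one.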
This idea also holds for world events. The daily news covers all events, consequential or not, while historians can see, with the benefit of hindsight, which events turned out to be transformative.
Often when studying history to apply it to today, we look at one or two narrowly-framed past events and believe their lessons apply to the future as a whole—at least as it relates to that general type of event—without taking into account all the ways in which the world constantly changes on a fundamental level.
So many details change in the way things work that it calls into question the usefulness of studying history at all—except, of course, to acknowledge its ability to serve up surprises. Lessons from previous eras may not apply to today. For example, the Asian markets of the 1990s bear little resemblance to the ones of today now that the structure of the world economy has changed so dramatically, so market strategies that worked back then might not work now.
Likewise, because the structure of the past can be so different from the structure of the future, we can only see similarities between the two from a distance, after the events have passed. In the moment, we’re too close to judge. This is true for connections between specific past and future events as well as between broader past and future landscapes.
The future eludes us in another way, too: It is affected not just by outside influences but also by our own understanding of those influences. Our predictions of the future can themselves change it.
For example, if traders as a whole notice that the market always rises in March, they would all buy stocks in February in anticipation of the rise. Consequently, the market would no longer rise in March; it would rise in February.
Because of this, even if we could fully and properly understand the past, if everyone did so, our predictions would no longer hold true because the preparations people would make for a future event would prevent the event from happening as expected. The tendency of the future to repeat the past depends upon our being driven by the same invisible forces that drove past events. Thus, rare events like stock sell-offs exist because they are unexpected; if they were expected, people would prepare for them and consequently, they wouldn’t happen.
When people misunderstand past rare events, they misunderstand the likelihood of future rare events and consequently, they don’t plan for risk appropriately. This mistake is apparent in how people feel about buying insurance.
Often, people resist getting insurance for things that are highly unlikely. If they do get the insurance, and the unlikely event never happens, they often feel upset that they shelled out “useless” money. When people do this, they conflate “forecast” with “prophecy,” and chastise a person for not prophesying correctly, instead of for accurately forecasting risk.
This can frequently be seen in reactions to warnings against risk in the stock market. For example, journalist George Will once interviewed Robert Shiller, author of Irrational Exuberance, a book about the mathematical randomness of the stock market. Will pointed out that had investors listened to Shiller’s earlier warnings that the market was overpriced, they would have lost money, since the market had actually risen in the meantime. Will did not understand that being wrong in one particular call did not mean Shiller’s caution was unwarranted overall. But it’s easy to pick out wrong calls by looking back on them.
It would be the same as chastising someone for not playing Russian roulette, if, in hindsight, the people who did play the game got lucky and won the 50 million dollars. Over time, the people who keep an eye on risk end up better off because they’re better prepared for rare events.
The difficulty of predicting the future, and the tendency to dismiss warnings about risks that never materialized, makes for an uneasy existence for the risk managers employed by investment funds. Their job is to identify potential catastrophic risks to investors’ portfolios, but they must balance advising investors to avoid certain risks against the inevitable blowback they’ll receive if a risk doesn’t materialize and the investor loses out on profit.
As a result of this rock-and-a-hard-place situation, most risk managers end up merely pointing out potentially risky moves without going so far as to warn against them. Consequently, risk managers exist more to give an impression of risk reduction than to actually reduce it. This is an example of “epiphenomena”: ambiguous links between cause and effect, as when people feel that merely watching risks is the same as reducing them.
When people misunderstand past rare events, they misunderstand the likelihood of future rare events and consequently don’t plan for risk appropriately.
Describe a time when you were surprised by an event, either in business, the news, or your personal life.
If you had been able to predict it, how might you have prepared for this event?
Knowing that you can’t predict the details of rare events, how can you generalize your plan of preparation for similar rare events in the future? For example, you couldn’t have predicted the events of September 11, 2001, but you could have had a pre-established escape plan from the city if you lived in New York.
In Part 1, we examined why and how rare events affect success and failure. In Part 2, we discussed some ways in which we fail to understand and anticipate randomness. We’ll now look at why we have such trouble comprehending randomness, and how our brain’s wiring makes it difficult for us to understand probability.
Overall, these reasons include the shortcuts and biases of our primitive brains and our lack of an intuitive grasp of how probabilities work. Each is explored below.
The first reason we fail to properly anticipate randomness is that we are guided by our primitive brain. To aid our survival, our brains evolved shortcuts of thinking that allow us to react quickly and decisively to threats. We’ve evolved these shortcuts to save ourselves time and mental energy; if we were to stop and think thoroughly about each interaction we have throughout the day, we would either miss opportunities or succumb to threats.
For most of our history, this system worked: Our lives were localized and simple, and we could optimize our survival without accounting for rare events. Unfortunately, in today’s complex world, we need to calculate probabilities, and our brains’ shortcuts lead us to believe many things without fully thinking them through. Consequently, we find ourselves equipped with primitive tools to face our contemporary challenges, and our views of the world are often based on misunderstandings and biases we unwittingly hold. A few of these are outlined below.
Neurologists observe that the human brain has developed into three general parts: the primitive brain, the emotional brain, and the rational brain. The rational brain acts as an advisor, but it’s the other two parts—primitive and emotional—that are responsible for decision-making.
This is not inherently a bad thing. Our thoughts can advise us, but without a feeling to direct us toward one option or the other, we get caught in endless rational deliberations as to what’s the best course of action. This can be seen in patients who’ve had brain trauma that destroyed their ability to feel emotions but left them intelligent, making them completely rational beings. People with this sort of brain damage cannot make decisions even as simple as whether or not to get out of bed in the morning.
The negative side of this, of course, is that emotions can steer us wrong and cause us to make mistakes. Emotions can cloud our judgment by blocking out rational thinking and causing us to wrongly assess risk, thereby leading us to make poor decisions. For example, we might buy a particular stock because we love the company and get emotionally invested in its future, though it may not be financially wise to do so.
Feelings also steer us wrong because people are more emotionally impacted by negative events than positive ones. This means they also view volatility much more starkly when it involves lower prices than when it involves higher ones. Likewise, volatility during negative world events is seen as worse than volatility in peaceful times. For example, in the eighteen months leading up to September 11, 2001, the market was more volatile than in the same period after, yet the later volatility received far more media attention. As a result, people are more likely to make moves during times of stress, even if those moves are not strategically wise.
To better identify risk, the primitive and emotional parts of our psyche have evolved to prioritize speed when scanning the environment for threats. Because of this, we don’t like complexity. We respond best to simple concepts that are easily understood and quickly summed up. Often we regard complex ideas with suspicion, assuming ill intent or falsehood.
Because of this, we tend to avoid concepts that feel difficult to explain, even when those concepts are more enlightening than simpler ones. We therefore tend to gloss over the finer points of probabilities, which are not only difficult to understand but are often also counter-intuitive.
For example, a study of how medical professionals interpret probabilities shed light on how often people who are supposed to know better, don’t. Doctors were asked this question: A disease affects one in 1,000 people in a given population. People are tested for it randomly with a test that has a 5 percent false positive rate and no false negatives. If someone tests positive, what is the percentage likelihood that she has the disease?
Most doctors responded by saying she’d be 95 percent likely to have it (since the test has a 95 percent accuracy rate). However, a person testing positive under these conditions would in fact be only 2 percent likely to be sick. (If 1,000 people are tested, only one will be sick, but an additional 50 will test falsely positive, for a total of 51 positive tests and only 1 actual illness. One divided by 51 is about 2 percent.) Fewer than one in five respondents answered correctly, as the right answer feels counter-intuitive.
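The arithmetic behind the 2 percent answer is a direct application of Bayes’ rule, as this sketch shows using only the numbers given in the question.

```python
def p_disease_given_positive(prevalence, false_positive_rate,
                             false_negative_rate=0.0):
    """Bayes' rule: P(disease | positive test result)."""
    true_positives = prevalence * (1 - false_negative_rate)
    false_positives = (1 - prevalence) * false_positive_rate
    return true_positives / (true_positives + false_positives)

# 1-in-1,000 prevalence, 5 percent false positives, no false negatives.
print(f"{p_disease_given_positive(1 / 1000, 0.05):.1%}")  # ~2.0%
```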
(Shortform note: This does not mean that people are getting regularly treated for diseases they don’t have. The scenario doesn’t account for the human element of testing: Most people only get tested for a disease when they have symptoms of something, which increases the likelihood that a positive result does indicate sickness.
But the math holds true in real life for diseases that are uncommon but for which asymptomatic people get regularly tested—for example, breast cancer. There is a fairly high rate of false positives for mammograms, and the vast majority of those who test positive do not turn out to be sick. These false alarms are weeded out through further testing.)
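The arithmetic is easier to see laid out. Here is a minimal Python sketch using the numbers from the example above (1,000 people tested, one actual case, a 5 percent false positive rate):

```python
# Out of 1,000 people tested, how many positive results are real?
prevalence = 1 / 1000        # one actual case per 1,000 people
false_positive_rate = 0.05   # 5% of healthy people test positive anyway

tested = 1000
sick = tested * prevalence   # 1 person, always detected (no false negatives)
false_positives = tested * (1 - prevalence) * false_positive_rate  # ~50 people

p_sick_given_positive = sick / (sick + false_positives)
print(f"{p_sick_given_positive:.1%}")  # ~2.0%, not 95%
```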
The primitive and emotional sections of our brain also pay much closer attention to surprises than to run-of-the-mill news. We attach greater significance to shocking events even if they are not ultimately important, and tend to believe events that are more easily recalled are more likely to occur. We therefore overestimate the risk of unlikely events while ignoring the risk of more likely ones.
We can see this in how the media covers bizarre but relatively unthreatening news while ignoring much more common—and more likely—threats. For example, in the 1990s, mad cow disease received feverish media coverage but killed only a few hundred people over the course of a decade. You were far more likely to be killed in a car accident on the way to a restaurant than by the tainted meat you might eat there. But due to the skewed media focus, people became more frightened of the (unlikely) threat of mad cow disease than of threats they were far more likely to face.
Because for most of human history people faced tangible threats rather than theoretical probabilities, our brains evolved to better understand concrete ideas rather than abstract ones, and consequently, we have trouble assessing the risks of abstract circumstances. Studies have shown that when presented with two sets of risks, people will be more concerned about the one that describes specific threats even if the more general threats would also include those specific threats.
For example, travelers are more likely to insure against a death from a terrorist threat on their trip than death from any reason (including, but not specifying, terrorism). In another example, a study found that people predicted an earthquake in California was more likely than an earthquake in North America (again, including but not specifying California).
The second reason we fail to anticipate randomness is that we don’t inherently understand how probabilities—the likelihood of rare events—work. Researchers have found that people have a lot of difficulty comprehending concepts that feel counterintuitive. Mathematical probabilities and random outcomes frequently fall into this category.
Our inability to correctly judge probabilities shows up in many ways: We underestimate how often rare events occur, we misjudge how probabilities compound, we extrapolate from samples that are too small, we ignore skewness, we miss nonlinear effects, and we confuse unconditional probabilities with conditional ones.
These ideas are explored further in the sections below.
Rare events happen infrequently enough that we are sometimes lulled into believing they're rarer than they actually are, so when they do happen, we are more surprised than we should be.
This happens because when we think about rare events, we evaluate their probability based on the likelihood of them happening in one particular way—for example, we think about the risk of a market correction of a certain kind within a certain time period. However, this mindset causes us to miss the likelihood of any random event happening in any way—a market event of any kind over the next decade, for instance.
To illustrate: You have a one in 365 chance of sharing your birthday with someone you meet randomly. In a room of twenty-two other people, you’d have a 22 in 365 chance that you’d share a birthday with any of them—still a relatively small chance. But the chance that any two people in that room will share any birthday is about fifty percent. This is because you are not limiting the rare event to one person and one date. However, should such a matching date be discovered, it feels like a highly unlikely occurrence and will most likely result in exclamations along the lines of “What a small world!”
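A minimal Python sketch of the exact calculation (assuming 365 equally likely birthdays and ignoring leap years):

```python
# Probability that at least two people in a room of n share a birthday:
# compute the chance that all n birthdays are distinct, then take the complement.
def p_shared_birthday(n: int) -> float:
    p_all_distinct = 1.0
    for i in range(n):
        p_all_distinct *= (365 - i) / 365
    return 1 - p_all_distinct

print(f"{p_shared_birthday(23):.1%}")  # ~50.7% with 23 people in the room
```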
In another analogy, your odds of winning the next lottery may be one in 25 million. But the odds of someone winning are one in one: one hundred percent.
Conversely, we sometimes overestimate the likelihood of a rare event if it appears in conjunction with another probability, and therefore might prepare for risks that have an exceedingly small chance of happening. When two or more independent probabilities combine, the individual probabilities are multiplied. For example, the probability of your being diagnosed with a particular rare disease in any given year might be one in 1,000,000. The probability of your being in a plane crash in any given year might also be one in 1,000,000. The probability of both happening in the same year is the product of the two: one in 1,000,000,000,000.
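A quick sketch of the difference between "both" and "either," assuming the two events are independent:

```python
# Two independent rare events, each one-in-a-million per year.
p_disease = 1 / 1_000_000
p_crash = 1 / 1_000_000

p_both = p_disease * p_crash              # one in a trillion
p_either = p_disease + p_crash - p_both   # still about one in 500,000

print(f"both: {p_both:.2e}, either: {p_either:.2e}")
```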
Though it's easy to accept that in theory, in practice we often focus only on the individual probabilities and not their compounded likelihood. People's inability to properly weigh compounding probabilities was demonstrated during the O.J. Simpson trial of 1995.
Simpson's lawyers argued that the DNA evidence was irrelevant because there might be four other people in Los Angeles with his same DNA characteristics. Though this might be technically true, the likelihood of Simpson being innocent despite the blood evidence (a 1 in 500,000 chance), compounded with the fact that he was the victim's ex-husband and with the various additional evidence, puts the combined likelihood of his innocence at somewhere around one in several trillion. Despite this almost-infinitesimal chance that he was innocent, the jury weighed the individual probabilities in isolation—like the blood evidence—and acquitted him.
We often misunderstand opportunities, risks, and probabilities because we evaluate them from too small a sample. We tend to extrapolate lessons from just a few examples and apply them to wider situations, which usually results in misguided strategies.
Authors of self-help books purporting to reveal secrets of millionaires are often guilty of this. They’ll study a small sample of millionaires, come up with some traits they share, and declare that these are the characteristics you need to also get rich. Their sampling problem is twofold: First, they only look at a small number of millionaires, and second, they don’t look at the wider set of people who also have these traits but are not millionaires.
For example, the authors of one bestselling book advise their readers to accumulate investments because the millionaires in their study do. However, their sample size is too small to show that all millionaires collect investments. Further, many people who never become millionaires also accumulate investments, but in the wrong things: stocks of companies that soon go belly-up or foreign currencies that devalue.
Such books also might look at a very narrow time period—for example, the 1980s and 1990s, when the average stock grew almost twenty-fold. People who invested during those years were far more likely to get wealthy than at other times, for no reason other than timing. Lessons derived from this period are virtually useless; the best advice based on this sample would be, "buy a time machine and invest in the late twentieth century."
Similarly, if a trader makes one right call, people may believe she'll make more right calls in the future. If she makes one wrong call, they may question her reputation as a skilled trader. This can be seen when journalists press successful traders to predict the market on any given afternoon. If the traders make a wrong call, even once, the journalists will often cast doubt on their entire careers, even though they're considering only a tiny sample of the traders' calls.
Sometimes, we make this sampling mistake because we see only the data that is nearest to us and therefore most visible. For example, a lawyer making $500,000 a year might be out-earning 99 percent of Americans, but if she lives in a swanky neighborhood in an expensive city, surrounded by multimillionaires several times over, she might feel that she's not actually very successful at all. This is due not only to the small sample size of her comparisons but also to the fact that the sample is self-selecting: Anyone who is not fabulously wealthy cannot afford to live in her building, so she sees only the super-winners in her sample set.
In judging whether a strategy is smart, people tend to focus on whether there is a high probability of winning, but they ignore the more important question of how much they stand to win or lose. Such lopsided payoffs are described by the term "skewness": for example, a large chance of a small win paired with a small chance of a large loss.
People often value a strategy that has frequent small wins, even if a rare large loss might wipe out all those gains. Their emotional response to winning makes them focus on the frequency of the wins and the infrequency of the losses instead of striving to optimize the overall result. However, frequent, typical events do not matter as much as infrequent, rare events do because the consequences of rare events can be much more substantial.
For example, a trader might be happy if her portfolio gains 1,000 dollars for eleven months in a row but then loses 15,000 dollars in the last month. She might focus not on the end-of-year result (a loss of 4,000 dollars) but on the eleven months when she gained.
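The underlying arithmetic is expected value: how often an outcome happens times how large it is. A minimal sketch using the trader's numbers above:

```python
# Expected value weighs frequency against magnitude.
# Eleven months of small gains, then one month with a large loss.
monthly_results = [1_000] * 11 + [-15_000]

wins = sum(1 for r in monthly_results if r > 0)
print(f"win rate: {wins / len(monthly_results):.0%}")  # 92% of months are winners
print(f"year total: {sum(monthly_results)}")           # -4000: a losing year anyway
```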
Someone who makes her money in large but infrequent bursts, by anticipating rare events (usually negative ones), often ends up wealthier than someone who makes her money slowly and steadily but ignores rare events. These types of traders are called “crisis hunters.” They may lose money frequently, but only in small amounts. When they make money, it’s infrequent but in large doses. This strategy can be seen outside of trading as well: Television production companies and book publishers produce a high volume of work that’s only expected to be slightly profitable or even slightly unprofitable, while holding out for the occasional blockbuster.
In our everyday lives, outside of business, we can often better judge risks with high skewness because they are less abstract. For example, imagine you are packing for a week-long trip to the mountains, where you are told the weather will be about 65 degrees but might swing 30 degrees in either direction. Here, you'd pack for the variance as much as for the expected temperature: You'd bring both light and heavy clothing, anticipating the risk in either direction.
Approach investing with the same mindset: Plan for the most likely outcomes, but also for deviations from them.
Our blindness to skewness is illustrated by the typical trader’s reluctance to buy options. In buying an option, a trader pays a relatively small amount for the option to buy a stock at a certain price by a deadline. If she does not exercise that option by the deadline, she does not buy the stock, and she is out the money she paid for the option.
For example, say stock XYZ currently trades at 39 dollars. A trader pays 1 dollar for an option to purchase XYZ's stock at 40 dollars by the end of the month. If the stock rises to 50 dollars, she will exercise that option, purchasing the stock at 40 dollars and immediately selling it at 50 dollars. Because she paid 1 dollar for the option, she nets a profit of 9 dollars. However, if the stock doesn't rise, she won't exercise the option; she'll let it expire and lose only the 1 dollar.
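A minimal sketch of the buyer's profit at expiration, using the example's numbers (a strike of 40 dollars and a 1-dollar premium; transaction costs ignored):

```python
# Net profit to the buyer of a call option at expiration.
def call_profit(stock_price: float, strike: float = 40.0, premium: float = 1.0) -> float:
    exercise_value = max(stock_price - strike, 0.0)  # worthless unless the stock ends above the strike
    return exercise_value - premium

print(call_profit(50.0))  # 9.0: buy at 40, sell at 50, minus the 1-dollar premium
print(call_profit(39.0))  # -1.0: the option expires; only the premium is lost
```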
Because people don't like to lose money even in small amounts, they often undervalue options. A person who loses 1 dollar on a regular basis by letting her options expire, but occasionally gains 9 dollars, might end up profitable overall. However, the pain of the regular small losses keeps many people from making this bet; they let emotion prevent them from making a potentially profitable trade.
Our misunderstanding of skewness also shows up in our difficulty understanding the concepts of “mean” and “median.” Because of this misunderstanding, we often don’t properly understand real-life risks as presented to us.
The mean is the average of a set of numbers, but it's often mistakenly thought of as the middle number. Within a set of numbers, a single outlier can skew the average so far that most of the other numbers fall on one side of the mean. For example, if nine people each earn one hundred dollars but a tenth earns one thousand dollars, the mean income of all ten is 190 dollars, and ninety percent of the people fall below it.
Similarly, if you are told the average investor earns one thousand dollars a month with a certain fund, you need to find out more about how that number was arrived at: If it turns out that most people lose money but a few gained an enormous amount, you might decide not to invest in that fund.
The median is the middle number in a set, but it says nothing about the range those numbers span. The median of the set "1, 3, and 47,000" is 3. If you calculate risk based on the median alone, you might fail to account for the outlier of 47,000.
For example, after a cancer diagnosis, you may be told that you might be expected to live another six months, based on the median survival rate of patients with this type of cancer. However, on further investigation, this median number might show that half of all such patients die within six months but the other half live for many more decades. Your end-of-life decisions may be markedly different once you better understand the numbers.
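A quick sketch of both pitfalls, using the numbers from the examples above:

```python
import statistics

incomes = [100] * 9 + [1000]
print(statistics.mean(incomes))    # 190: nine of the ten people fall below the "average"

payoffs = [1, 3, 47000]
print(statistics.median(payoffs))  # 3: the median is silent about the 47,000 outlier
```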
Skewness also shows up in the nonlinear (uneven) way that randomness works, so that small influences make a big difference: The addition of a single grain of sand can topple a sandcastle.
This is the territory of chaos theory, which describes how a small change can make an outsized difference: for example, when a population explosion can be traced to a tiny change in conditions at some starting point.
We are not wired to anticipate nonlinearity, either with our thoughts or our emotions. We tend to think that if you increase one variable, the other variable that’s linked to it will increase correspondingly. This is not always the case. For example, you may practice an instrument regularly for a long time without making much progress, but at some point, it clicks and your skill suddenly leaps forward. If you’d given up in frustration before that point, you would not have reaped the benefit of this nonlinear development.
Most routes to success will involve some kind of nonlinearity like this, but very often, people don’t have the stamina to stick with the venture through the slow times. The positive rare event benefits those who give it time to happen.
Another reason people have trouble understanding probabilities is that a probability can be true at one point in time but not true as time passes. When a probability changes like this, it switches from an unconditional probability (true overall) to a conditional one (true under new circumstances).
For example, a financial planner might proclaim that because the life expectancy of an American is 75 years, if you are 69, you should plan for 6 additional years. This common misunderstanding confuses your life expectancy at birth (unconditional) with your life expectancy as you age (conditional). An expectancy of 75 at birth includes those who die by age 5, age 20, age 60, and so on. But if you’ve reached age 69, you might now be expected to live another 10 years. And if you then reach 79, you might still be expected to live another 5 years.
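A sketch of how conditioning shifts the expectation. The ages at death below are invented purely for illustration; they are not real life-table figures:

```python
import statistics

# Hypothetical ages at death for ten people (illustrative only).
ages_at_death = [2, 20, 55, 68, 74, 77, 80, 82, 85, 92]

def life_expectancy(given_age: int) -> float:
    # Average lifespan among those who survived at least to `given_age`.
    survivors = [a for a in ages_at_death if a >= given_age]
    return statistics.mean(survivors)

print(life_expectancy(0))   # 63.5: "at birth," dragged down by the early deaths
print(life_expectancy(69))  # ~81.7: conditional on having already reached 69
```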
The third reason we fail to anticipate randomness is that we tend to automatically ignore it and search for patterns and meaning instead, even where there is none. Our brains are wired to look for meaning; it's an evolutionary adaptation that aids our survival, but it misleads us when we see meaning in randomness and then use that perceived meaning to guide our decisions. In particular, we tend to read meaning into complex-sounding language, into random events, and into coincidental correlations.
All three mistakes can lead us to make poor decisions and are further explored below.
Fancy phrasing can make people think a piece of communication is significant when it in fact is nonsense. Feed a sentence-generating computer program phrases like “shareholder value,” “position in the market,” and “committed to customer satisfaction,” and you’ll get a paragraph that sounds like it has meaning but doesn’t. This kind of language can be found frequently in corporate and investment fund communications.
When you have experts in a field using industry-specific buzzwords and convoluted sentences, they project an aura of expertise that is often unwarranted. All too often, people buy into this aura, don’t properly question the advisors’ policies, and end up losing money on poor investments.
The experiences of some investors in the late 1990s illustrate this. Economists at the International Monetary Fund (IMF) at that time misunderstood the true risk of default by the Russian government but sounded like they knew what they were talking about. Many emerging-market traders invested deeply in Russian principal bonds on the advice of these IMF experts and lost hundreds of millions of dollars each.
Our tendency to look for meaning leads us to see meaning in random events, and to find rational explanations for circumstances of luck. For instance, when a trader makes a lot of money, people look for reasons for her success: They’ll often credit intelligence or market savvy. If she then loses money and is forced out of the game, people will again look for reasons: They might point to relaxed work ethics, for example.
(Interestingly, we usually view our own misfortune through a different lens than we use for others'. We credit other people's failures to a lack of skill; we blame our own failures on bad luck. This is called attribution bias: the faulty thinking we use to explain our own and others' behaviors.)
Chances are that if you observe enough data, you'll find correlations that seem relevant even when they aren't. When we do this, we confuse correlation with causation. Correlation is when two events happen at the same time, either through coincidence or because both are results of another, unseen cause; causation is when two events happen together because one makes the other happen.
“Data mining” illustrates this in action. Data mining is the process of looking for patterns in large quantities of data. Although it’s an important tool in industries from insurance to health care, it can be misused to create a sense of meaning in otherwise meaningless data. You can always find some detectable pattern in a random series of events if you look hard enough.
For example, bestselling books have been published that examine irregularities in the Bible and use them to show how the Bible “predicted” events; these powers of prediction are, of course, improved by the fact that the events have already passed and can be matched to their “predictions” through the lens of certainty.
The same can happen with rules of investing: You can take a database of historical stock prices and sift through it, applying various rules until you find one that works for that dataset. Investors often do this hoping to find a “magic” rule that will allow them to predict future price fluctuations. However, the same laws of randomness apply to rules as they do to events: If enough rules are tried against a large enough data set, some correlations will emerge. But a rule that describes what happened in the past will not necessarily predict what will happen in the future.
For example, a trader might examine what would have happened had she bought stocks that closed 2 percent higher than their price the previous week. If that rule doesn't produce a winning formula, she might re-run the experiment using 1.8 percent as the benchmark. Continuing this way, she may eventually hit upon a threshold that, under specific conditions, would have produced winning results. But applying those parameters to future trades rarely reproduces them.
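A minimal sketch of the trap, run on pure noise. The "rule" (buy after any week that gains more than some threshold) and its parameters are invented for illustration:

```python
import random

random.seed(1)

# A "market" made of pure noise: 520 weekly returns with no signal in them.
returns = [random.gauss(0, 0.02) for _ in range(520)]
in_sample, out_of_sample = returns[:260], returns[260:]

def rule_profit(data, threshold):
    # Buy after any week that gains more than `threshold`; tally the next week's return.
    return sum(data[i + 1] for i in range(len(data) - 1) if data[i] > threshold)

# Sift through thresholds until one looks like a winning formula in-sample.
thresholds = [t / 1000 for t in range(0, 40)]
best = max(thresholds, key=lambda t: rule_profit(in_sample, t))

print(rule_profit(in_sample, best))      # the best-fitting rule "works" on past data
print(rule_profit(out_of_sample, best))  # on fresh noise, there's no reason it should
```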
In fact, truly random data will not look random: It will inevitably contain some clusters and apparent patterns, while data scrubbed of every pattern would look manufactured. Consider a painting of a night sky. To look real, the stars must be clustered here and there in ways that might suggest constellations. A sky of evenly spaced stars, though more definably "meaningless," is recognizably unnatural.
Likewise, we also see meaning in random noise. Small changes do not warrant explanation; they are likely random, with no discernible causes. Yet people get caught up trying to parse them for meaning. For instance, if the stock market moves, there might be any number of reasons—or a combination of reasons. Listening to a pundit try to explain shifts in either direction, especially small ones, is usually a waste of time.
It would be as if you watched a marathon in which one runner crossed the finish line one second before another. Such a small difference doesn't warrant examination: It's unlikely to stem from a meaningful difference in diet or training; more likely, it's a random shift of wind at some point during the previous 26.2 miles.
Further, it can be difficult to determine a single cause for a small event because there may be many possibilities. The dollar can react against the euro, the euro against the yen, the market against interest rates, interest rates against inflation, inflation against OPEC, and on and on. Isolating one cause among all the possible influences, particularly to explain small, frequent shifts in the market, is impossible.
(Shortform note: For a more in-depth exploration of cognitive biases, read our summary of Thinking, Fast and Slow.)
The primitive and emotional sections of our brain pay much closer attention to surprises than to run-of-the-mill news, and we attach greater significance to shocking events even if they are not ultimately important. This can lead us to prepare for the wrong things.
Describe a time you started to prepare for something unlikely because it was in the news cycle. What was the event or circumstance that you felt compelled to prepare for?
How did you determine the event or circumstance was something to worry about? How did it turn out to fulfill your expectations? (For instance, was it more or less of a risk than you had prepared for?)
Next time you encounter startling news, what can you do to ensure you are assessing the risk properly? How can you research it more fully?
Now that you have a fuller understanding of how and why randomness has such an outsized effect on success, and how and why people generally misunderstand its effects, let’s explore how you can approach risk with these concepts in mind. We’ll examine characteristics of people who fail to anticipate risk properly and we’ll discuss specific things you can do to anticipate risk mindfully.
Traders—and others in fields affected greatly by randomness, such as economics or politics—often share a set of qualities that lead them to misevaluate risk and make poor decisions: They overestimate the accuracy of their beliefs, they get married to their positions, they change their story after losses, and they have no plan in advance for what to do if they turn out to be wrong.
To avoid developing such misguided beliefs, and thus avoid failure, remember that success built on hard work and smart choices lasts longer than success built on luck. Try not to build a career on lucky streaks. Be aware of the ways your instincts and emotions blind you to the roulette wheel underlying your profession. Prepare for risks, and don't expose yourself to losses you wouldn't be able to handle. Some specific thoughts on how to develop this mindset follow.
It is hard to act rationally when irrational biases are driving us, even when we know better. We often know how we're supposed to act but act impulsively anyway. The problem is not lack of knowledge but poor execution.
As humans, we are pre-programmed to react emotionally to stimuli. The best—and often only—way to prevent our emotions from clouding our judgment is to eliminate the triggers that activate them. For example, if you can’t control your cravings for chocolate, your best strategy would be to simply not keep it in your desk. In the same way, limit your exposure to market triggers.
To prevent unnecessary emotional reactions to noise, limit your exposure to the daily barrage of information available to you: Don’t read the market reports, don’t watch television, and don’t scroll through online finance analyses. Insulate yourself from small shifts that might grab you emotionally and cause you to make rash moves.
Set parameters for your data so that you are alerted only when the market makes relatively large moves, of a size you specify in advance. (Keep in mind that changes are not linear in their significance: A 2 percent change is not merely twice as significant as a 1 percent change; it is more like 4 to 10 times as significant, depending on the specifics. Determine your threshold with this in mind.)
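One way to see why: Under the simplifying assumption that daily moves are normally distributed with a 1 percent standard deviation, a 2 percent move is roughly seven times rarer than a 1 percent move:

```python
from statistics import NormalDist

# Assume daily returns ~ Normal(0, 1%) purely for illustration.
moves = NormalDist(mu=0.0, sigma=0.01)

p1 = 2 * (1 - moves.cdf(0.01))  # P(|move| > 1%), about 0.32
p2 = 2 * (1 - moves.cdf(0.02))  # P(|move| > 2%), about 0.046
print(p1 / p2)                  # about 7: a 2% move is roughly seven times rarer
```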
Essentially, filter noise the way an engineer filters background static from a phone call. A phone’s technology detects sounds that have small changes in amplitude, pegs them as noise, and filters them out so that only sounds with significant changes in amplitude are delivered to the listener. Without such technological help, our ears have trouble picking out the voices from the noise. In the same way, our brains have trouble picking out important changes from noise, especially when we are also being bombarded with “expert” advice.
Don’t focus on whether the market will rise (as in a bullish market) or fall (as in a bearish one); focus instead on the impact that either movement will have on your results. Correctly predicting small market movements might earn you small gains, but hedging against large drops or rises can be much more profitable.
Aim to make money from rare events rather than from slow and steady bets; rare events—either positive or negative—tend to be more profitable in the long run. Act like a crisis hunter: someone who stays alert for unusual opportunities or unlikely threats and plans to profit from either. Crisis hunters are the ones shorting the market in anticipation of a large drop, or buying options in the hope of an unlikely price jump. Even if those scenarios are less likely than market stability, they can pay much more handsomely and are therefore worth seeking out.
For example, suppose you judge that the market has an 80 percent chance of rising, but that a rise would be only 2 percent while a fall would be 20 percent. The expected outcome of betting on the rise is (0.8 × 2 percent) - (0.2 × 20 percent) = -2.4 percent, so you're better off shorting the market and betting on the fall, even though the fall is the less likely event.
Superstitions are an innate part of the human psyche: People instinctively look for connections between unrelated objects, events, or patterns and tend to be more primed to accept a hypothesis than to reject it. Therefore, if we are presented with a hypothetical possibility that our good fortune is caused by something like what we ate for breakfast that morning or a quirky market strategy we used, we are more likely to consider it than to dismiss it outright.
Though you may associate superstitions with cultural myths passed down from your grandparents, most people harbor at least a few superstitions that lead them to find causal links between two unrelated phenomena—like a certain pair of shoes bringing you a good trading day. These are sometimes referred to as “gambler’s ticks.”
Avoid being driven by such meaningless superstitions. Don’t look for connections where there are none—accept the random nature of your profession and resist attaching importance to unimportant things. Examine your attachment to your lucky pair of shoes or to an odd strategy, neither of which is actually lucky.
One trait common in traders who stay successful over the course of years is their willingness to change their positions at any moment. George Soros is renowned for it; he’ll talk bearishly about the market and then suddenly buy bullishly.
We referred to “path dependence” earlier when discussing how early success affects later success. To refresh, path dependence describes when a series of events unfolds in response to an initial, dominant event. In practice, it means that unless we consciously choose otherwise, we tend to be driven by the decisions we’ve already made, such as a decision to take a particular trading position.
This trait probably evolved to ensure a continuous society instead of one where people wake up every day and choose a different spouse, job, and kids. The downside, though, is that a person may not question her emotional investment in a decision even when the evidence shows she should.
This problem appears well beyond investing. For example, a scholar rarely suddenly contradicts all of her previous research. In fact, we say she "defends" her thesis; it would be unusual for her to change her position in the middle of her doctoral interview.
In the world of trading, this shows up when traders double down on losing positions, insisting that their strategy will one day be proven correct. It often leads to a blow-up and an exit from the industry.
Instead of admitting she was wrong and incorporating the lesson into her understanding of the world, such a trader blames market forces. It's hard to admit that everything you've believed in and worked for up to this point has been wrong. But a better strategy is to acknowledge that you sometimes make mistakes and to fold the lessons from those mistakes into your future decisions. Once you become comfortable accepting your fallibility, you'll be better able to change your position when signs indicate it's the wrong one.
Sometimes randomness is an acceptable—even desirable—part of the human experience. Outside of matters of survival, such as making a living through investing or other means, randomness can enhance a person’s life. For example, poetry is constructed of random-sounding phrases, and in fact, a computer-generated poem can often sound just as lyrical as a human-generated one. But within the world of business, minimizing—or on occasion, harnessing—randomness is a crucial element to building lasting success.
We don't have control over the random events that happen to us. The best we can do is plan for them and react appropriately when they inevitably occur. Anticipate the risks you can predict, and leave room in your plans for those you can't. No matter how much you plan, randomness will always be ready to strike; bear this in mind, and react with dignity to the rare events that inevitably hit you.
Ignoring “noise,” or the endless minutiae detailing small changes, can help you see large, important trends more clearly, and can prevent you from getting caught up in—and reacting to—inconsequential triggers.
Describe a kind of “noise” that you frequently come across in your daily life. (It might be related to business, current events, or something else. For instance, do you find yourself bombarded with news about small changes or endless analysis?)
How does this noise affect how you feel or how you act?
How might you limit your exposure to this noise? For example, can you unsubscribe to certain news sites or limit how often you check the value of your investments?
How might limiting your exposure change your behavior and your emotions? (For example, would you decrease the number of trades you make each day? Would you feel less stressed? Or would you feel more anxious about missing out on information?)
People often underestimate the influence of luck on success. We very often mistake luck for skill and randomness for determinism.
Describe a time when you stuck with a position long after it was wise to do so. What were the results? Did you experience any negative consequences because you didn't change your strategy?
What misunderstandings or biases did you hold that blinded you to the need to abandon that stance?
What steps might you take so that in the future, you are more willing to abandon an unwise strategy or position? (For example, could you think more thoroughly through the different possible worlds that might result if you either hold onto or let go of your position?)