1-Page Summary

The world is full of evil—every day, people inflict unimaginable pain and suffering on one another. Our history, too, is bloody: We don’t need to look far into the past to find genocides, murderous wars of conquest, and systematic torture. It’s difficult to imagine what kind of individual could willingly participate in evil like this. Surely, no one we know could ever murder or torture an innocent person, right?

The Lucifer Effect was written by Philip Zimbardo, the psychologist famous for running the notorious 1971 Stanford Prison Experiment, in which undergraduate students acted as prisoners and guards in a mock prison and quickly lost themselves in their roles. He published the book in 2007, intending it to be a retrospective of everything he learned from the experiment. In it, Zimbardo argues that most of us drastically misunderstand evil: We underestimate the potential for the circumstances of a given time and place to transform ordinary people into heartless killers.

In this guide, we’ll first detail two competing theories of evil—one that blames specific people for being evil, and another that blames the circumstances of a given situation for causing people to do evil. Then, we’ll dive into the circumstantial factors that Zimbardo believes can influence anyone to do evil, illustrating these principles by explaining how they played into Zimbardo’s famous Stanford Prison Experiment. Next, we’ll investigate the institutions responsible for establishing evil-inducing circumstances. Finally, we’ll explain Zimbardo’s tips on how to resist the influences that tempt us to do evil.

The Nature of Human Evil: Two Opposing Theories

Zimbardo defines evil as any intentional harm done to innocent people. Most people assume that if someone does something truly evil—on the level of murder, torture, or rape—there must be something uniquely twisted about them. They’re an “evil person,” and it would be difficult or impossible to rehabilitate them. (We’ll call this the “theory of fixed morals,” as it implies that each individual’s moral compass is relatively fixed: slow or even impossible to change.)

Zimbardo offers an opposing theory: He argues that any one of us could commit the worst evil imaginable if we found ourselves in the wrong situation. (We’ll call this the “theory of circumstantial morals” because it implies that each individual’s moral compass can change rapidly in response to new circumstances.)

In other words, Zimbardo argues that evildoers are ordinary people who find themselves in situations that cause them to disengage their normal sense of morality. Interviews have confirmed that many people who do great evil—terrorists, torturers, those who facilitate genocide—are otherwise psychologically healthy and rational. They’re just like any of us.

The Knobe Effect: Situational Biases Toward Each Moral Theory

Arguably, Zimbardo’s view of the two theories of morality is incomplete. He implies that each person tends to believe one theory or the other—either the theory of fixed morals or the theory of circumstantial morals. However, a psychological bias known as the “Knobe effect” reveals that we tend to apply the two theories in different situations.

The Knobe effect states that when someone’s intentions are ambiguous, if the outcome is bad, we believe they intended it, and if the outcome is good, we believe they stumbled into it by accident. In other words, we assume people do evil intentionally and do good accidentally.

By this logic, we’re more likely to believe the theory of fixed morals when people do evil. This is the point Zimbardo makes when he claims that most of us assume that all murderers are inherently twisted and broken people.

However, this logic implies that the reverse is also true: We are more likely to believe the theory of circumstantial morals when people do good. Zimbardo doesn’t make this point, but intuitively, it feels accurate—imagine someone whose friends all plan to volunteer at a soup kitchen for a night. If that person decides to tag along, we likely wouldn’t call them a particularly virtuous or heroic person—the circumstances just happened to influence them to do good. In contrast, if that person’s friends instead all plan to graffiti offensive slurs on the walls of the soup kitchen and they decide to tag along, we would be more likely to use the theory of fixed morals and judge them to be an immoral, selfish person.

We Are Biased to Believe the Theory of Fixed Morals

The theory of fixed morals isn’t just a popular idea—we’re hardwired to believe it because the human brain is biased against the theory of circumstantial morals. In psychology, applying the theory of fixed morals to the behavior of others (by blaming their immoral behavior on their character rather than the environment) is known as the fundamental attribution error. Why do we hold this bias so deeply?

First, Zimbardo explains that we believe this theory because it’s simpler—viewing people as either good or bad makes the world much easier to understand, even if it’s not true. Second, this bias protects our ego by identifying evil as something separate from ourselves. The theory of fixed morals allows us to blame all of the world’s evil on the few villains and criminals who are directly involved in it. On the other hand, if we embrace the theory of circumstantial morals, we may have to accept responsibility for creating or prolonging the circumstances that influence people to do evil.

Zimbardo makes it clear that the theory of circumstantial morals does not excuse evil—he still believes that we should hold people accountable for their harmful actions. However, he claims that it should influence us to punish them less severely and extend justice to those responsible for creating the broader circumstances in which the evil took place. In other words, Zimbardo asserts that by overcoming the fundamental attribution error, we can more accurately identify the source of evil and discover what to do to prevent it.

The Theory of Fixed Morals May Encourage Virtuous Action

While Zimbardo argues that the theory of fixed morals is an irrational bias and a harmful oversimplification, some make the case that when we apply it to ourselves, the theory of fixed morals can motivate us to do good. If we truly believe we’re inherently moral—fundamentally different from those who do evil—we’ll want to avoid contradicting that belief with our actions. For example, Calvinists believe that God has already chosen them to be righteous people and they’re predestined to go to heaven. At first, it doesn’t seem like this would motivate them to do good—if they’re predestined for salvation, why wouldn’t they do whatever they want?—but in practice, Calvinists want to behave in accordance with their self-identity, so they follow their doctrine’s moral laws.

However, you could still argue that the opposite is true: If we believe we’re inherently good, we’ll deny the flaws in our nature and never take steps to fix them. One example of this kind of self-blindness is Zimbardo’s point that the theory of fixed morals motivates us to deflect blame for the world’s evil away from ourselves.

Three Philosophies of Punishment

Zimbardo asserts that punishment is still justified in light of the theory of circumstantial morals, but he doesn’t clarify why he believes punishment is justified. There are three schools of thought about the ideal purpose of punishment. First, retributive theories argue that punishment is a moral good: It establishes justice by inflicting pain on those who cause others pain. Second, deterrence theories argue that the goal of punishment is to discourage people from committing crimes in the first place. Third, rehabilitative theories argue that punishment is a flawed deterrent and that healing and educating prisoners is the best way to reduce crime.

Zimbardo’s rationale seems to be a mix of retributive and rehabilitative theory. His statement that we should not excuse evil is tinged with the moral judgment of retributive theory. However, he seems doubtful that the threat of punishment can scare people into resisting powerful circumstantial variables. Instead, since he believes that most people behave virtuously in the right circumstances, Zimbardo would likely argue that rehabilitation is a realistic possibility for most criminals.

Circumstantial Variables That Can Corrupt Morality

If Zimbardo’s theory of circumstantial morals is true and the wrong situation can turn anyone evil, then what circumstances determine whether we do good or evil? In this section, we’ll discuss three main situational variables that can transform any upstanding citizen into a cold-blooded killer.

Zimbardo illustrates these circumstantial variables with a case study: his famous Stanford Prison Experiment. We’ll use this same example to help explain them.

Variable #1: Identity Cues

One type of situational variable that can cause a drastic shift in morals is the identity cue. Zimbardo explains that identity cues are aspects of the environment that we draw on when shaping our self-image and determining how to act. These can include specific locations, clothing and other props, and the expectations of others.

We typically assume that our personal identities are fixed, but to a certain degree, it’s normal to adapt our identities to specific circumstances. Zimbardo states that we all take on roles, or temporary identities, in various areas of life. For example, at work, we take on the role of “employee” and act in ways we wouldn’t at home. However, in extreme cases, if we’re ensnared in enough environmental cues placing us in a given role, that role can take hold of our emotions and influence us to do things we never would have rationally agreed to do.

Accept That No Single Role Defines You

Zimbardo asserts that while most people assume their personalities are fixed and unchanging, in reality our identities are largely dependent on our immediate circumstances. In The Subtle Art of Not Giving a F*ck, Mark Manson uses this assumption as the basis of a further argument: Many of us fail to realize how variable our identities are, instead clinging to the notion that we are a specific kind of person and suffering a fearful identity crisis when that belief is challenged.

Clinging to the idea of our fixed identity makes us suffer because it discourages us from positively changing. The wrong role can corrupt our moral compass (as Zimbardo notes), and we can easily get stuck in these roles because we’re scared of changing who we are. For example, a violent gang member may resist leaving a life of crime because they’re scared of losing their identity as a successful gangster.

Manson claims that letting go of your identity is a key step in personal growth. He advises choosing a simple identity for yourself and holding it loosely: Accept that none of the roles you play define you as an individual—it’s always possible to change roles and become someone totally new. To continue our example, instead of merging with the role of “tough gangster,” our gang member could simply label himself “tough” and build self-esteem by persevering through a challenging job search.

Introduction to the Stanford Prison Experiment

In 1971, Zimbardo turned the basement of Stanford University’s psychology building into a simulated prison, paying undergraduate male volunteers to act as prisoners and guards. Decades later, psychologists still discuss the Stanford Prison Experiment because of its shocking findings about human nature and the unethical treatment of its test subjects.

While he planned to run the experiment for two weeks, Zimbardo ended it after just six days because his “prisoners” were suffering far more than he intended, largely due to their guards’ extreme abusive behavior. For example, the guards refused to let the prisoners sleep, constantly harassed them with insults and arbitrary demands, and punished them by making them exercise until they dropped.

The Stanford Prison Experiment Wasn’t Really an Experiment

Ever since it was publicized, experts have criticized the Stanford Prison Experiment for its lack of scientific rigor. The “SPE” isn’t even technically an experiment, as it had no control group or independent variable—Zimbardo admits he intended it more as a “demonstration” than an experiment. Instead of publishing his findings in a peer-reviewed journal, Zimbardo took his project directly to The New York Times for the mainstream media buzz.

This move away from traditional scientific publication channels paid off for Zimbardo: The experiment’s shocking account of abuse gripped the nation, and its notoriety earned the psychologist numerous prestigious opportunities. He made a successful film about the experiment, authored one of the most popular psychology textbooks of the era, and hosted the 1990 television series Discovering Psychology.

Identity Cues in the Stanford Prison Experiment

Zimbardo states that by forcing the students into the simulated roles of “prisoner” and “guard” during the Stanford Prison Experiment and surrounding them with identity cues that reinforced this power dynamic, he created the circumstances necessary to disengage the guards’ sense of morality. After they had donned identical uniforms and spent enough time in the lifelike prison environment, the role of “guard” overwhelmed the volunteers’ normal personalities.

While they were initially merely pretending to be fearsome and domineering, within days the guards internalized the role, developing genuine disgust toward the prisoners and escalating their cruelty far beyond what Zimbardo asked of them. The guards were most sadistic toward the prisoners when they felt they were not being watched—they would insult and punish the prisoners more on the night shift than during the day and shove prisoners into the urinals during the presumably unobserved toilet runs. This is the opposite of what we would expect if they were just playacting for the cameras.

This situation’s identity cues transformed the prisoners’ personalities, too. After a couple of days of arbitrary punishment for breaking senseless rules, the college-age volunteers became mindlessly obedient to the guards. Their personalities from the outside world disappeared. Rationally, they knew they were volunteers who could quit at any time, but they embraced their roles as reality so deeply that none of them seriously tried to leave—they accepted their fate as helpless prisoners.

Evidence Against the Power of Roles in the SPE

Zimbardo asserts that the Stanford Prison Experiment proves the power of situational identity cues to overpower your everyday personality. However, interviews with the experiment’s participants and recently uncovered records call this conclusion into question.

Zimbardo argues that the guards’ cruelty escalated as they slipped more into their roles, losing touch with their existing personal identities. However, in an interview, the guards’ brutal leader Dave Eshelman (renamed as “Hellmann” for privacy’s sake in The Lucifer Effect) states that he remained detached from the guard identity during the experiment. Eshelman claims he saw the work as a mere theatrical exercise, drawing from his experience studying acting. Judging from the rest of his interview, Eshelman identified more as Zimbardo’s partner in experimentation than as a real prison guard. Therefore, we can perhaps blame his cruelties more on Zimbardo’s coaching and direction than the corrupting influence of the “guard” role.

The prisoners’ roles may not have overtaken their personalities as much as Zimbardo claims, either. Zimbardo argues that the prisoners obeyed the guards even though they knew they could have left at any time because they identified so deeply with the “prisoner” role. However, a recently uncovered recording from the experiment shows Zimbardo admitting that he told the prisoners they really couldn’t leave—meaning that their obedient behavior was indicative of true helplessness, not role identification. In a later interview, Zimbardo claims he gave the prisoners a “safe phrase”: If they said the exact words, “I quit the experiment,” he would release them. However, the consent forms the prisoners signed before participating make no reference to this safe phrase.

Variable #2: Social Pressures

Zimbardo explains that social pressures are another major circumstantial variable with the power to influence us to act immorally. When the people around us want us to do something evil, it becomes significantly more difficult for us to resist. These social pressures come in two main forms that often overlap: group pressure and authoritative pressure.

Group Pressure

Zimbardo asserts that we all have a basic need to feel accepted by those around us. For this reason, when we find ourselves in new situations, we observe those around us to determine what behavior is appropriate. In this way, our morals often shift to match those of the people around us.

Zimbardo notes that if we perceive a group to be prestigious and exclusive, the pressure it exerts on us is even more powerful. Our human need to belong becomes stronger when combined with our desire for status. For example, a schoolchild who wants to be accepted by the “cool kids” may make fun of kids they would normally be friends with.

The Rationale Behind Group Pressure

In Influence, Robert Cialdini points out that we allow group pressure to shape our behavior for a good reason: Most of the time, when the majority of a group is doing something, it is the right thing to do. For example, if we’re walking down the sidewalk and see several people running in a panic in the same direction, it would be reasonable to follow suit, since they probably have a good reason to run—perhaps an armed robbery is taking place nearby.

Since this instinct to imitate those around us is hardwired in the human brain (as Zimbardo points out), we can assume that it has helped us survive since the early days of humanity. Our tendency to imitate those in prestigious groups makes sense through this lens, too—emulating those at the top of the dominance hierarchy would presumably increase our chances of becoming high-status ourselves.

Authoritative Pressure

Authoritative pressure occurs when an individual or group in power directs you to do something. While group pressure is indirect and sometimes accidental, authoritative pressure is a direct attempt to control your behavior. Zimbardo asserts that, generally speaking, we’re far more likely to commit evil if we’re “just following orders.”

(Shortform note: Contradicting Zimbardo’s theory of circumstantial morals, research shows that some people are more likely than others to resist authority on moral grounds. Surprisingly, these “moral rebels” report lower levels of self-esteem than the average person. Perhaps this grounded self-image helps moral rebels second-guess their behavior before complying with an immoral authority.)

Research has shown that we’re far more compliant with authority than we believe ourselves to be. To support this point, Zimbardo describes the most famous experiment on the subject: the Milgram experiment, published by Yale psychologist Stanley Milgram in 1963. In it, an assistant in an authoritative-looking lab coat ordered volunteers to administer increasingly severe shocks to a fellow volunteer, ostensibly for a study on memory. Unbeknownst to them, this second volunteer was an actor who pretended to be in incrementally greater pain until they screamed in agony, begged the volunteer to stop, and finally pretended to lose consciousness.

A group of psychiatrists predicted that fewer than 1% of volunteers would follow orders and administer the most severe shock level, but in reality, 65% of people did. Milgram presented this as evidence of the extreme power of authoritative pressure.

(Shortform note: Like that of the Stanford Prison Experiment, the validity of the Milgram experiment is the subject of debate. Detractors argue that Milgram misrepresented his data: While he claimed that the authoritative assistant used the same four prewritten commands to get participants to continue, the experiment’s archived audio reveals that the assistant heavily improvised, offering any argument they could to keep participants administering shocks. This lack of standardization threatens the objectivity of Milgram’s 65% compliance rate. There’s also evidence that some participants saw through the charade and continued only because they believed no one was really getting hurt.)

Social Pressures in the Stanford Prison Experiment

Zimbardo recounts how group pressure influenced the Stanford Prison Experiment guards to be crueler to the prisoners. In every shift, one guard would take the lead in abusing the prisoners, and at least one other would imitate him. Tormenting the prisoners quickly became the norm, and guards who didn’t actively participate stuck out. Many of the guards who initially didn’t want to hurt the prisoners eventually did so to fit in. No guard ever stood up to the group consensus and demanded that the others tone down the abuse.

The power of authoritative pressure in the Stanford Prison Experiment is best seen in the prisoners. The guards frequently used their authority to get the prisoners to degrade and harm themselves and one another, and for most of the experiment, the prisoners complied. The guards ordered the prisoners to sing songs for them, insult one another, and perform sexual pantomimes on one another. In retrospect, the guards reported being shocked by how readily the prisoners conformed to their extreme commands. They continually expected the prisoners to eventually stand up for themselves and refuse to play along, but they never did.

A Replication of the Experiment Highlights the Power of Social Pressure

Recreations of the Stanford Prison Experiment may indicate that social pressures have a far greater influence on behavior than any other circumstantial variable. The most famous recreation of the experiment is the 2001 BBC Prison Study, in which different social pressures led the experiment to a completely different outcome, even though the identity cues and other circumstantial variables were all the same as in Zimbardo’s experiment.

Zimbardo freely admits that in the original Stanford Prison Experiment, he instructed the guards to behave cruelly to the prisoners, exerting his social (authoritative) pressure and corrupting their morality. In the BBC Prison Study, in which the researchers did not exert authoritative pressure on the volunteers, the same extreme level of abuse was not replicated: The guards failed to set a cohesive norm for the group or build any sense of camaraderie. Furthermore, instead of bowing to the guards’ authoritative pressure, the prisoners banded together and overthrew the guards, setting their own rules and turning the prison into a “commune.”

Defending his experiment, Zimbardo criticizes the 2001 BBC Prison Study, calling it a fake experiment, as it was produced for reality TV and involved far more experimenter intervention than the Stanford Prison Experiment.

Variable #3: Awareness of Individuality

The final morality-shifting circumstantial variable that we’ll discuss is awareness of individuality. Zimbardo explains that people disengage their sense of morality when they lose the sense that they and the people they are mistreating are unique individuals. This lack of awareness comes in two forms: anonymity and dehumanization.

Anonymity

The more mundane form of this loss of individuality is anonymity: Zimbardo claims that if we think we’re unseen and unlikely to be identified, we’re more likely to do evil. This anonymity could take the form of a lack of witnesses, or a simple mask or disguise.

Anonymity is powerful because it lessens our personal accountability. When we know we can’t be identified (and therefore we can’t be punished or shamed for our actions), we’re far more likely to commit evil acts. Directly removing personal accountability has the same effect: When someone else volunteers to take responsibility for an evil action, we will readily carry it out ourselves.

(Shortform note: In Skin in the Game, Nassim Nicholas Taleb incorporates the idea of anonymity encouraging immoral action into a broader concept he calls “skin in the game.” In Taleb’s eyes, an action can only be ethical if the one doing it has skin in the game—if they will lose something if the action hurts someone. As Zimbardo points out, anonymity removes skin in the game by reducing personal accountability, and thus, it encourages evil. However, Taleb would argue that anonymity reduces other benefits of skin in the game, too—anonymous people will learn less from the consequences of their actions and feel less motivated to do good work.)

Dehumanization

The more intense form of this loss of individuality is dehumanization: According to Zimbardo, when we perceive others as less than human—typically a category of people different from us, such as members of another race—it becomes far easier for us to treat them with cruelty.

Those who dehumanize others also often fall into the mental habit of dehumanizing themselves: They perceive themselves as an object or force instead of a human being with morals. This encourages them to continue hurting others. Physical disguises that promote anonymity have a self-dehumanizing effect, too. For this reason, warriors have traditionally painted their faces or donned uniforms before going into battle—dehumanizing themselves makes it easier to kill.

Dehumanization as the Only Intolerable Language

Author Brené Brown treats dehumanization as the line that divides tolerable language from intolerable language. She states that, generally, we should tolerate the behavior of those around us, showing them as much kindness as possible, even if they anger or offend us. However, we need to set boundaries and make clear what behavior is too hurtful to tolerate. Brown acknowledges that it’s difficult to know where to draw this line—we shouldn’t refuse to interact with someone just because they say something that offends us or challenges our point of view.

However, Brown asserts that calling people “pigs,” “rats,” or other, more vulgar dehumanizing labels is the first step toward viewing others as less than human and rationalizing physical harm; she argues that all of history’s genocides began this way. In Brown’s eyes, this kind of dehumanizing language should never be tolerated.

Furthermore, Brown agrees with Zimbardo that dehumanizing others dehumanizes ourselves. She frames this as inherently immoral: We should avoid dehumanizing others not only because we don’t want to hurt them, but also because we shouldn’t “desecrate our divinity” by diminishing the humanity we share with others.

Loss of Individuality in the Stanford Prison Experiment

Zimbardo recounts that the guards of the Stanford Prison Experiment wore identical uniforms, masked themselves with reflective sunglasses, and forced the prisoners to refer to them by title instead of name, all of which contributed to their anonymity and self-dehumanization, disengaging their sense of morality.

A number of factors contributed to the prisoners’ dehumanization as well. The guards referred to the prisoners only by the numbers on their jumpsuits and forbade them from using their real names. The guards also prohibited the prisoners from openly or honestly expressing their emotions, causing them to feel (and appear) less human. For these reasons, the guards reported seeing the prisoners as animals and losing their feelings of empathy for them.

When Individuality Fuels Violence

Some argue that extreme acts of cruelty are not fueled by the loss of individuality, but instead moral righteousness—the desire to do good through violence. For example, a religious extremist may think they’re improving the world by hurting someone as punishment for doing something immoral. Contrary to Zimbardo’s argument, this kind of “moral” violence requires perpetrators to see their victim as a fully human individual with the ability to make moral choices.

This kind of humanity-focused violence may have been present in the Stanford Prison Experiment—just because the guards donned identical uniforms and kept the prisoners in dehumanizing conditions doesn’t necessarily mean that these cues caused the abusive behavior. On the contrary, there is evidence that the guards justified their abuse on moral terms: According to Zimbardo, the guards reported punishing the prisoners because they felt they deserved to be punished.

Powerful Institutions Are at Fault

If the people who do evil are largely at the mercy of circumstances outside of their control, who is ultimately to blame for the evil in the world? Zimbardo argues that we should blame the institutions in power that establish specific circumstances. In his eyes, to effectively curb evil, we have to change the systems that give rise to the situations that encourage it.

According to Zimbardo, these institutions gain power by influencing enough people to accept their ideology—a belief system centered around a highest value that must be achieved by any means necessary. These institutions attempt to perpetuate their ideology and stay in power by relentlessly pursuing their highest value, creating the circumstances conducive to great evil in the process.

For example, Zimbardo extensively criticizes the Bush administration for its role in creating the circumstances that gave rise to the American torture and abuse of Iraqi prisoners of war at the Abu Ghraib prison. He claims that the Bush administration used the War on Terror and the ideology of “national security above all else” to convince the American population to keep them in power. In their pursuit of national security (brutally interrogating prisoners for information that would help win the Iraq War and prevent future terrorist attacks), they established circumstantial factors that directly led to torture and abuse: recreating, to an extent, the conditions of the Stanford Prison Experiment in the real prisons of Iraq and Afghanistan.

Is Ideology Really Imposed From Above?

Zimbardo argues that institutions create ideologies and gain power by persuading the general public to buy into them. In contrast, some make the argument that this exchange goes the other way around—the public develops an ideology, and institutions profit by following it.

For example, the news media is often accused of manipulating the public by drawing their attention to specific issues and ignoring others. However, the public largely ignores most of the stories in the news. Only a small fraction of stories “go viral” when the public responds enthusiastically to them, causing the news media to report more on the topic to attract an audience. In this way, the audience’s ideology shapes the news media, not the other way around.

From this perspective, the Bush administration didn’t create the ideology of national security, as Zimbardo argues. Instead, they made decisions based on the security-obsessed ideology that was already popular among the public, likely due in large part to the 9/11 terrorist attacks.

Survey data supports this theory, revealing that the majority of Americans already supported the Iraq War before the Bush administration began rallying support for it. Public approval of an invasion of Iraq peaked just after 9/11, two months before Bush declared Iraq part of the “Axis of Evil” in his 2002 State of the Union Address. Support for the Iraq War only dwindled from that point on, indicating that the administration’s war-focused rhetoric had little effect on public opinion.

How Can We Resist Circumstantial Influence?

Now that we’ve fully explained Zimbardo’s view of how circumstances and institutions contribute to human evil, we’ll conclude by discussing his tips for navigating this evil-filled world.

On the bright side, Zimbardo notes that not only does everyone have the potential to do something unspeakably evil, but everyone also has the potential to do something remarkably noble.

(Shortform note: Some point out a contradiction in Zimbardo’s argument here: If circumstances are largely responsible for whether or not you do something evil, they should also be responsible for whether or not you do something heroic. By this logic, whether or not you behave heroically is out of your control, and these “tips” won’t help.)

Follow these three tips to increase your odds of becoming a hero yourself.

Tip #1: Overcompensate When Tightening Your Morals

Zimbardo explains that most people have a self-serving bias: We understand how circumstances impact human behavior, but we assume that we’re too clever and self-aware to make the same mistakes. This overconfidence leaves us vulnerable to circumstantial influence, so Zimbardo advises tightening your morals more than you feel is necessary to prevent yourself from unwittingly doing evil.

One way Zimbardo suggests doing this is by taking full responsibility for your actions. Never blame someone else for making you do something—for example, if a friend were to convince you to help them rob a convenience store, or if your boss told you to hide the evidence of their embezzlement, you should view these misdeeds as if you did them alone. This habit will help you think twice before social pressures influence you to become complicit in someone else’s evil.

(Shortform note: In Extreme Ownership, Jocko Willink and Leif Babin take Zimbardo’s argument further: They argue that instead of just taking responsibility for your own actions, you should take responsibility for the actions of everyone on your team, even things that no one would reasonably blame you for. Even though Willink and Babin offer this advice in the context of goal-oriented leadership, it potentially applies to everyday morals as well—if you see yourself as responsible for the misdeeds of everyone around you and do everything you can to discourage people from harming others, you can be sure that you’ll never fall prey to the self-serving bias and unknowingly condone evil.)

Tip #2: Be Critical of Your Group

Zimbardo advises that you turn a critical eye to whatever groups you belong to—always be ready to take a firm moral stance against them. Recall that social pressures are one of the main circumstantial factors that can corrupt your sense of morality. There’s no need to deny your need to be liked and accepted, but you won’t be happy if you sacrifice your individual moral values to avoid being rejected. Instead, seek acceptance from groups that share your morals and accept rejection from those who don’t.

(Shortform note: Zimbardo claims that group acceptance is a universal human need, but in The Courage to Be Disliked, Ichiro Kishimi and Fumitake Koga argue that this is a popular misconception. On the contrary, they claim that seeking the approval of others is the root of unhappiness. Like Zimbardo, Kishimi and Koga recommend connecting with groups that share your moral values, but only for the intrinsic joy of acting in accordance with those values—not for the temporary high of external approval.)

We’re biased to disrespect and even dehumanize those outside of our groups, writes Zimbardo. Try to celebrate your group’s good qualities without disparaging the different qualities of other groups. If you can, look past groups entirely—instead of defining yourself and others by the groups they belong to, work to see everyone as a unique individual.

(Shortform note: It can be challenging to see people as individuals because this in-group bias often occurs on the unconscious neurological level. For example, as Jennifer Eberhardt explains in Biased, our brains are wired to more easily recognize members of our race than members of other races. For this reason, we’re more likely to see members of our race as individuals, and members of other races as homogenous group members.)

Tip #3: Oppose Unjust Institutions

To overcome our bias to obey authority, Zimbardo suggests being skeptical of all institutions in power. Obey authorities when they act in alignment with your values, but resist any authority that tries to coerce you into doing something immoral, no matter how large or powerful they are. When we encounter immoral groups and institutions, we often assume that we’re powerless to change them. However, all it takes is one person to start a movement, attract followers, and enact positive change.

(Shortform note: In David and Goliath, Malcolm Gladwell offers a more extreme version of Zimbardo’s argument: Gladwell not only claims that it’s possible for individuals and small groups to change immoral institutions, but he also asserts that small groups have a number of advantages over larger, more powerful institutions. For instance, since small groups can’t hope to win in a “fair fight,” they come up with original, unexpected tactics to achieve victory. Large institutions aren’t prepared to combat these unexpected tactics, putting them at a disadvantage.)

According to Zimbardo, it’s vital to resist authority this way because doing nothing to avert evil is evil in itself. Scandals of abuse, unjust wars, and even genocides happen with the support of countless silent observers. Many people make half-hearted attempts to resist evil with empty words that accomplish nothing. Zimbardo makes it clear that such words don’t matter—only actions that push back against evil are truly virtuous.

(Shortform note: Omission bias, the human tendency to judge harmful actions as worse than harmful inaction, makes it difficult for us to actively oppose evil as Zimbardo encourages. For example, imagine you know that the brakes in someone’s car are broken, but you say nothing, allowing them to get injured in a car crash. Due to omission bias, we would typically judge this to be less immoral than if you had gone in and cut the brakes yourself, even though both actions have the same outcome. For this reason, we may find ourselves tempted to commit the former sin, even if we would never commit the latter. To overcome omission bias, we would have to judge both situations as unequivocally immoral and unacceptable.)

Exercise: Try Out the Theory of Circumstantial Morals

Zimbardo’s theory of circumstantial morals is very different from the way most of us judge the behavior of others. See what it’s like to intentionally apply this perspective to the people in your life.

Exercise: Articulate Your Opinion of Human Morality

As a final reflection, figure out exactly what you believe about the human capacity for evil.