Why don’t we pull the trolley lever?

The Omission Bias, explained.

What is the Omission Bias?

The omission bias refers to our tendency to judge harmful actions as worse than harmful inactions, even if they result in similar consequences.

Where it occurs

Imagine the following scenario.

You are on a walk when you see a runaway trolley car barreling down the railroad tracks. Five people are in the path of the trolley and are unable to move out of the way in time to escape. You see a lever close to you that can switch the trolley onto another set of tracks. However, you notice one man standing on the other tracks who would also be unable to escape if you pulled the lever.

You find yourself in a moral dilemma with two options. You can A) do nothing and have the trolley kill five people or B) pull the lever and kill one person in order to save five. What is the right thing to do?

While neither option is optimal, most people would agree that option B is the more morally sound choice. However, you might feel like the action of pulling the lever and killing one person would instill more guilt than the inaction that results in the death of five people. Even though the consequences of choosing option A are worse, our desire to abstain from any harmful action (and the subsequent blame) can override the more ethical choice. This famous thought experiment, dubbed “the Trolley Problem”, demonstrates the omission bias in action.1

Individual effects

Generally, most people want to do good and avoid causing harm in their everyday lives. We like to feel altruistic and compassionate. Although there is often a gray area, we try to listen to our internal moral barometer and act accordingly. Yet sometimes the moral judgments we make are grounded in biased thinking. The omission bias causes us to view actions as worse than omissions (cases where someone fails to take action) in situations where both have adverse consequences and similar intentions.

However, many philosophers believe that the distinction between omission and action is more arbitrary than we like to think. Philosopher Jonathan Bennett even argues that there are many more possible ways to execute an omission than an action.2 According to Bennett, there are many ways we could avoid pulling the trolley lever and far fewer ways we could actually pull it to save the five people. Thus, Bennett argues, the moral differences we attribute to action versus omission are not so definite.

As a result, the omission bias can distort our judgments of others and enable our own negative behaviors. When we assess the integrity of others, the omission bias can cause us to mentally underplay the insidiousness of inaction in certain situations. By internalizing these judgments, we often feel morally protected in our own omissions and can rationalize harm by saying, “Well… I didn’t do anything!” In the Trolley Problem, we might think, “It wasn’t our fault! Plus, we couldn’t harm the one man on the other set of tracks!” and fail to reflect on our own moral inconsistencies.

Systemic effects

Individual judgments of morality and assessments of harm amplify at the group level, especially within the justice system. Under tort law, victims can file suits against injurers in order to be compensated for their losses. A study by behavioral science researchers Jonathan Baron and Ilana Ritov found significant evidence of the omission bias in subjects’ judgments of compensation and penalties.3 For example, they proposed a case in which a woman becomes sterile from taking birth control. They then proposed alternative scenarios: A) the injury was caused as a side effect of the birth control itself, or B) the injury occurred because a company did not release a safer birth control that was deemed less profitable. A significant number of participants asserted that the woman should receive greater compensation for being a victim of a harmful action (scenario A) than of a harmful omission (scenario B). Baron and Ritov argue that these findings reflect biases in the tort law system as a whole.

The omission bias also has major impacts within the field of medicine. For example, we can look at how organ donation rates are influenced by the omission bias.4 In the United States, you must “opt in” to become an organ donor. Under this system, over 60,000 Americans were waiting for an organ transplant in the year 2000. For those in the US, the harms caused by omission (not opting in) can seem “less blameworthy”. In contrast, countries such as Belgium, Austria, and Brazil have presumed consent, or an “opt-out” policy. Under this policy, countries typically have organ donation rates of around 86% to 100%. In these countries, actively opting out feels like an act of harm, which makes people less likely to do so. This example also demonstrates the power of framing on our decision-making, a phenomenon otherwise known as the framing effect.

Why it happens

There are frequently situations in which actions actually are more harmful than omissions. In those cases, our judgment is unbiased and our moral compass points in the right direction. So what throws our moral compass off course, and why?

We overgeneralize

As previously stated, there are many cases where our judgment that actions are worse than inactions is correct. This becomes a heuristic, or cognitive “shortcut”, that we use to assess the morality of others and guide our own actions. It is when we are confronted with scenarios in which harmful actions and inactions share the same outcome and the same intent, yet we continue to treat them differently, that this heuristic becomes overgeneralized and detrimental.5 Overgeneralizing a heuristic can be likened to the “inappropriate transfer of mathematical rules”, like using the Pythagorean theorem to determine the length of a rectangle.5

Philosopher and ethicist Peter Singer also suggests that the omission bias allows us to impose a limit on our moral responsibilities.6 If harmful actions carry greater moral weight, we can feel unbothered by the harms inflicted by our omissions.

We are averse to loss

Another explanation for the omission bias is that we weight losses more heavily than gains of the same size, a tendency otherwise known as loss aversion. If we fail to act and it results in a bad outcome, we can think of it as a missed opportunity for gain. If we act and it results in a bad outcome, we think of it as a loss.4

For example, say one investor has shares in stock A and thinks of switching to stock B, but decides not to. If stock B skyrockets, the investor will probably kick herself a bit, but it won’t feel like a loss. If another investor has shares in stock B and sells them for shares in stock A, when stock B skyrockets it will feel like a major loss.7 This investor would probably be more upset with himself or be perceived as more foolish by others.

Here we can see how we tend to judge a person more negatively when their actions result in a loss, as opposed to when their inactions forgo a gain. Our aversion to losses is powerful and often blinding. If we view actions and omissions within a framework of losses and gains, we can deepen our understanding of the omission bias.
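
To make this framing concrete, prospect theory models loss aversion with a value function that weights losses more heavily than equivalent gains. The sketch below is a minimal illustration under stated assumptions, not a model taken from the sources cited here: the piecewise-linear value function, the loss-aversion coefficient of 2.25 (a common estimate from the prospect-theory literature), and the dollar amounts are all chosen for demonstration.

```python
# Minimal sketch (assumptions, not the article's model): a piecewise-linear,
# prospect-theory-style value function with an illustrative loss-aversion
# coefficient of 2.25 and invented dollar amounts.

LOSS_AVERSION = 2.25  # losses are felt roughly 2.25x as strongly as equal gains

def felt_value(change_from_reference: float) -> float:
    """Subjective value of an outcome relative to one's reference point."""
    if change_from_reference >= 0:
        return change_from_reference
    return LOSS_AVERSION * change_from_reference

# Investor who stayed put (omission): her reference point never moved, so the
# $1,000 that stock B gained registers as a neutral non-event, not a loss.
omission_feeling = felt_value(0)

# Investor who switched (action): the same $1,000 swing registers as a realized
# loss relative to the shares he gave up, and is amplified by loss aversion.
action_feeling = felt_value(-1_000)

print(f"Omission (forgone gain) feels like: {omission_feeling}")  # 0
print(f"Action (realized loss) feels like: {action_feeling}")     # -2250.0
```

Under this framing, the actor’s bad outcome is coded as a loss and amplified, while the omitter’s identical shortfall barely registers, which is one way to see why harmful actions attract harsher judgments than harmful omissions.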

Why it is important

The omission bias is embedded in our societal framework, from legal to medical practices.4 In our personal relationships, we can fall victim to the omission bias and feel justified in omitting the truth because we consider it better than lying. Assessing the ‘goodness’ of an action isn’t always black and white, so it makes sense that we would want to make it easier on ourselves and take a shortcut. Yet overgeneralizing that shortcut can have costly impacts that should not be ignored.

However, as we saw in the example of organ donation, policymakers have immense power in how they frame the decisions we make as individuals. By understanding the mechanisms behind the omission bias, policymakers have the opportunity to harness this for the public good. Imagine if certain clean energy components were part of an opt-out system rather than opt-in. As with organ donations, this would most likely result in major changes in carbon emission levels.

How to avoid it

Avoiding our biases can be complicated, as they are so deeply ingrained in our thinking. Earlier we talked about how the omission bias can occur because of overgeneralization.

Sometimes this overgeneralization occurs because we don’t even realize that we are using a heuristic to assess morality. This prevents us from thinking critically about the situations in which it may be incorrectly applied and results in biased thinking.5 So, a good place to start is reflecting on the ways in which we favor omissions over actions in our everyday lives. Think about the cases where this heuristic is well grounded, and the cases where it might not fit. Moving forward, we can try to think about the consequences of our inactions, rather than treating them as inconsequential.

Just like policymakers, we can frame things in our own lives to work with our omission bias and make better decisions. For example, we can mirror the opt-out method when preparing for an exam by making a plan with classmates to meet at the library every night at 7. This way, even if we are really not in the mood to study, it would take the action of canceling to avoid it. On the other hand, if we make no commitments, we are using an opt-in method. This gives us the opportunity to avoid studying through omission, without feeling like our actions are resulting in negative consequences.

How it all started

The omission bias was first studied by behavioral science researchers Mark Spranca, Elisa Minsk, and Jonathan Baron of the University of Pennsylvania in the early 1990s.5 In one overarching study, Spranca, Minsk, and Baron administered a series of experiments through paid questionnaires asking participants to make moral judgments of actors in various scenarios. For example, in their first experiment, they proposed the following case:

John is a tennis player at a tennis club. He is the best in the club, but not good enough to play professionally. Every year at John’s club there is a tournament with a prize of $20,000, which sometimes attracts major players. John makes it to the finals but is now up against tennis pro Ivan Lendl for the prize. On the eve of the finals at dinner, John remembers that Ivan is allergic to cayenne pepper and the club serves a house salad dressing with cayenne. John knows Ivan will have stomach issues that interfere with his performance if he eats this.

The participants were then asked to rate John’s morality in a series of possible endings:

  • John recommends the house dressing before Ivan orders
  • John says nothing when Ivan orders the house dressing
  • John recommends changing to the house dressing after Ivan orders Italian dressing

The results showed that 65% of participants exhibited the omission bias, rating the ending in which John says nothing as less immoral than the other options, even though the outcome was the same. They also found that participants felt John deserved a greater penalty in the endings where he recommended the dressing. The majority of participants who showed the omission bias rationalized their moral judgments by saying John “had a greater causal role”.

This work by Spranca, Minsk, and Baron contributed crucial findings to omission bias research, demonstrating the bias’s prevalence with experimental data and dissecting the cognitive mechanisms at play.

Example 1 – Anti-vaxxers

A 1994 study by David Asch and his colleagues explored how the omission bias affects parents’ decisions about whether to vaccinate their kids.8 Some parents choose not to have their children vaccinated for pertussis (also known as ‘whooping cough’) because of “fears that reaction to the vaccine itself may lead to death or serious injury”. Medical data shows these risks to be negligible. In 1970s Britain, a decline in pertussis vaccinations resulted in a major increase in cases and pertussis-related deaths. The researchers therefore used the real-life example of the pertussis vaccine to examine these decisions with historical relevance.

Asch and his team administered a questionnaire to parents about the vaccine, along with various questions testing for bias. Their results showed that respondents who reported they would not vaccinate their kids were “more likely to believe that vaccinating was more dangerous than not vaccinating” and were “more likely to exhibit omission bias”. Even though vaccinating had a much lower probability of causing harm than not vaccinating, parents exhibiting the omission bias favored inaction over action.

Example 2 – Professional sports

In their book Scorecasting, Tobias Moskowitz and L. John Wertheim discuss how biases impact professional sports.9 For example, they explore how the omission bias causes referees to avoid making calls that will determine game outcomes.

In baseball, a batter walks to first base if the umpire calls four ‘balls’. According to Moskowitz and Wertheim, umpires have an error rate of 12.2% on pitches outside the strike zone. When there are already three balls in the count, however, that error rate rises to 20%, which allows umpires to avoid sending the batter to first base.

Further, in professional basketball, there is evidence that referees call fewer fouls at the end of tight games. Data shows that they are especially unlikely to call fouls that are “more at the discretion of the referee” as a close game winds down. The omission bias can lead referees to let actual fouls go uncalled rather than risk making a call that alters the game.

Summary

What it is

The omission bias refers to our tendency to judge harmful inactions as less morally objectionable than harmful actions with similar consequences.

Why it happens

The omission bias occurs because we overgeneralize the belief that actions cause more harm than omissions. Additionally, when we act and cause negative outcomes, we view that as a greater loss than when we fail to act and cause negative outcomes.

Example 1 – How the omission bias influences the case for anti-vaccination

A study by David Asch and colleagues found that parents who refused to vaccinate their children for pertussis showed the omission bias. These parents saw not vaccinating as the safer option, even though the probability of harm was greater without the vaccine.

Example 2 – How the omission bias impacts professional sports

Tobias Moskowitz and L. John Wertheim report that professional sports referees tend to avoid making game-altering calls due to the omission bias. In baseball, this manifests in umpires avoiding calling a fourth ball. In basketball, the omission bias causes referees to avoid calling fouls toward the end of tight games.

How to avoid it

We can reflect on how the omission bias skews our perception and actions. We can remind ourselves to consider the consequences of our omissions. Also, we can learn to harness our omission bias through changes in framing.

Related TDL articles

CO2 Out Of Sight, Not Out Of Mind: Carbon Capture and Storage Risks

This article discusses the process of carbon capture and storage (CCS), which traps carbon dioxide and contains it for removal from our atmosphere, in the wake of the global climate crisis. The author discusses the risks of CCS and breaks down how our biases and beliefs intersect with this proposed climate solution.

Framing effect – Biases & Heuristics

This article explores how the way information is presented can influence our decision making. The author uses examples in the legal and medical system to illustrate how this bias reverberates on the societal level, and provides tools on how to make better choices in light of the framing effect.

Sources

  1. Thomson, J. J. (1976). Killing, letting die, and the trolley problem. The Monist, 59(2), 204–217. JSTOR.
  2. Bennett, J. (1981). Morality and consequences (S. McMurrin, Ed.; Vol. 2). University of Utah Press.
  3. Baron, J., & Ritov, I. (1993). Intuitions about penalties and compensation in the context of tort law. Journal of Risk and Uncertainty, 7(1), 17–33. https://doi.org/10.1007/BF01065312
  4. Baron, J., Bazerman, M. H., & Shonk, K. (2006). Enlarging the societal pie through wise legislation: A psychological perspective. Perspectives on Psychological Science, 1(2), 123–132. https://doi.org/10.1111/j.1745-6916.2006.00009.x
  5. Spranca, M., Minsk, E., & Baron, J. (1991). Omission and commission in judgment and choice. Journal of Experimental Social Psychology, 27(1), 76–105. https://doi.org/10.1016/0022-1031(91)90011-T
  6. Singer, P. (2011). Practical ethics. Cambridge University Press.
  7. Kahneman, D. (2013). Thinking, fast and slow (1st ed.). Farrar, Straus and Giroux.
  8. Asch, D. A., Baron, J., Hershey, J. C., Kunreuther, H., Meszaros, J., Ritov, I., & Spranca, M. (1994). Omission bias and pertussis vaccination. Medical Decision Making, 14(2), 118–123. https://doi.org/10.1177/0272989X9401400204
  9. Moskowitz, T., & Wertheim, L. J. (2012). Scorecasting: The hidden influences behind how sports are played and games are won. Three Rivers Press.

About the Authors

Dan Pilat

Dan is a Co-Founder and Managing Director at The Decision Lab. He is a bestselling author of Intention - a book he wrote with Wiley on the mindful application of behavioral science in organizations. Dan has a background in organizational decision making, with a BComm in Decision & Information Systems from McGill University. He has worked on enterprise-level behavioral architecture at TD Securities and BMO Capital Markets, where he advised management on the implementation of systems processing billions of dollars per week. Driven by an appetite for the latest in technology, Dan created a course on business intelligence and lectured at McGill University, and has applied behavioral science to topics such as augmented and virtual reality.

Dr. Sekoul Krastev

Sekoul is a Co-Founder and Managing Director at The Decision Lab. He is a bestselling author of Intention - a book he wrote with Wiley on the mindful application of behavioral science in organizations. A decision scientist with a PhD in Decision Neuroscience from McGill University, Sekoul's work has been featured in peer-reviewed journals and has been presented at conferences around the world. Sekoul previously advised management on innovation and engagement strategy at The Boston Consulting Group as well as on online media strategy at Google. He has a deep interest in the applications of behavioral science to new technology and has published on these topics in places such as the Huffington Post and Strategy & Business.
