
Does Anchoring Work In The Courtroom?


May 07, 2020

Anchoring is one of the most prevalent and enduring heuristics that decision-makers encounter in their daily lives, and it is particularly powerful in decisions made under uncertainty. An anchor is an initial reference point that has an outsized impact on how decision-makers interpret and encode subsequent information on a topic. Most strikingly, we struggle to overcome an anchor’s effect even when given incentives to do so, or when made conscious of the resulting bias.

Anchoring: A Simple Example

Take, for example, a study in which participants were asked to recall the last two digits of their Social Security number and then asked to price a bottle of wine. You can already guess the outcome: those with Social Security numbers ending in high digits (think 70s, 80s, or 90s) were willing to pay more for the wine than those with numbers ending in lower digits.

A cartoon titled "Anchoring Effect" shows two mugs, each with a price tag of $300. The left mug has only the $300 price tag, while the right mug's tag has $1,000 crossed out, showing $300 as the new price. A stick figure below points to the right mug, saying, "That's like free money!"

But does anchoring work in the courtroom too?

Anchoring Effect & Juries

Jurors are just regular people without specialized legal expertise, so you’d expect that, just as they can be influenced by anchors when pricing wine, they might also be influenced by anchors in the courtroom.

And you would be right. In one study, 56 mock jurors were presented with a hypothetical case in which the plaintiff argued that her birth control pill had caused her ovarian cancer. She was suing the Health Maintenance Organization (HMO) that had prescribed her the pill. In the high anchor group, the plaintiff asked for $5 million in damages; in the low anchor group, she asked for only $20,000. The question was: would this anchor affect the jurors’ perception of causation?

The answer, predictably, was yes. Jurors in the low anchor condition were 26.4% confident that the HMO caused the injury, whereas jurors in the high anchor condition were 43.9% confident that the HMO caused the plaintiff’s injury.


Anchoring Effect & Judges

Judges, unlike juries, are subject-matter experts. Can they really be susceptible to anchoring as well?

According to some preliminary research, the answer is probably yes. Once an anchor is set, research suggests that a judge is more likely to interpret subsequent information in relation to that anchor, even if the anchor is totally irrelevant.

In one study, judges were presented with a hypothetical case involving a shoplifter who had just been caught for the 12th time. The judges were asked to sentence the shoplifter, but only after the prosecutor had made a sentencing demand. And here’s the twist: the judges were told ahead of time that the prosecutor’s demand was totally arbitrary and random, and therefore contained no useful information.

Even so, the judges who received the low anchor (i.e., the prosecutor demanding a shorter sentence) landed on a shorter average sentence than the judges in the high anchor condition.

Bar chart titled 'The Prosecutor’s Random Anchor' showing two bars comparing low and high anchors for months of imprisonment. The low anchor results in about 4 months of imprisonment, while the high anchor results in around 6 months, even though judges knew the anchors were random.

[image adapted from https://www.thelawproject.com.au/insights/anchoring-bias-in-the-courtroom]

OK, so the judges assigned weight to a prosecutor’s (random) sentencing demand. But does this really demonstrate the anchoring effect in judges?


So, to quash any remaining doubts about the anchoring effect, the same group of researchers set about designing the most absurd scenario possible. A group of judges were given a hypothetical case in which prosecutors were charging a defendant with theft. Instead of being given the prosecutor’s sentencing demand, the judges were told to determine the sentencing demand by rolling a pair of dice. (Yes, really.) The dice were rigged to land on high numbers for one group of judges and low numbers for the rest.

And … somewhat unbelievably, the outcome of the dice-rolling exercise influenced the judges’ sentencing decisions.

Bar chart titled 'Judges Throwing Dice' showing the difference in years of imprisonment based on low and high anchors. The low anchor results in about 5 years of imprisonment, while the high anchor results in around 8 years. The chart highlights that the judges themselves threw the dice to determine the anchors.

[image from https://www.thelawproject.com.au/insights/anchoring-bias-in-the-courtroom]

OK, so even if the results of this study hold, judges don’t roll dice before they make sentencing decisions, and prosecutors don’t make random sentencing requests. So does anchoring really affect judges’ decision-making on the bench?

Well, it might. Englich et al. explain:

“Even though judges typically do not throw dice before making sentencing decisions, they are still constantly exposed to potential sentences and anchors during sentencing decisions. The mass media, visitors to the court hearings, the private opinion of the judge’s partner, family, or neighbors are all possible sources of sentencing demands that should not influence a given sentencing decision.”
[Playing Dice With Criminal Sentences: The Influence of Irrelevant Anchors on Experts’ Judicial Decision Making (2006) by Birte Englich, Thomas Mussweiler, & Fritz Strack]

References

“Coherent Arbitrariness”: Stable Demand Curves without Stable Preferences (2003) by Dan Ariely, George Loewenstein, & Drazen Prelec

The More You Ask for, the More You Get: Anchoring in Personal Injury Verdicts (1996) by Gretchen B. Chapman & Brian H. Bornstein

Playing Dice With Criminal Sentences: The Influence of Irrelevant Anchors on Experts’ Judicial Decision Making (2006) by Birte Englich, Thomas Mussweiler, & Fritz Strack

The Anchoring Bias and Its Effect on Judges by Rod Hollier, https://www.thelawproject.com.au/insights/anchoring-bias-in-the-courtroom

About the Author

A young man smiles while standing on a brick sidewalk lined with trees, cars, and American flags, adjacent to a row of townhouses.

Tom Spiegler

Georgetown

Tom is a Co-Founder and Managing Director at The Decision Lab. He is interested in the intersection of decision-science and the law, with a focus on leveraging behavioral research to shape more effective public and legal policy. Tom graduated from Georgetown Law with honors. Prior to law school, Tom attended McGill University where he graduated with First Class Honors with majors in Philosophy and Psychology.

