Does Anchoring Work In The Courtroom?
Anchoring is one of the most prevalent and enduring heuristics that decision-makers encounter in their daily lives, and it is particularly powerful when decisions are made under uncertainty. An anchor is an initial reference point that has an outsized impact on how decision-makers interpret and encode subsequent information on the topic. Most strikingly, we struggle to overcome an anchor's effect even when given incentives to do so or when made conscious of the resulting bias.
Anchoring: A Simple Example
Take, for example, a study where participants were asked to recall the last two digits of their Social Security number and were then asked to price a bottle of wine. You can already guess the outcome: those with Social Security numbers ending in high digits (think 70s, 80s, or 90s) were willing to pay more for the wine than those with numbers ending in lower digits.
But does anchoring work in the courtroom too?
Anchoring Effect & Juries
Juries are just regular people without any specialized legal expertise, so you’d expect that just as regular people are influenced by anchors in pricing wine, they might also be influenced by anchors in the courtroom.
And you would be right. In one study, 56 mock jurors were presented with a hypothetical case in which the plaintiff argued that her birth control pill had caused her ovarian cancer. She was suing the Health Maintenance Organization (HMO) for prescribing her the pill. In the high anchor group, the plaintiff asked for $5 million in damages; in the low anchor group, she asked for only $20,000. The question was: would this anchor affect the jurors' perception of causation?
The answer, predictably, was yes. Jurors in the low anchor condition were 26.4% confident that the HMO had caused the plaintiff's injury, whereas jurors in the high anchor condition were 43.9% confident.
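If you want to see what a result like this looks like in numbers, here is a small, purely illustrative Python sketch. It does not use data from the study; the baseline confidence, the size of the anchor's bump, and the noise level are all made-up values, chosen only to show how a two-condition anchoring experiment can be simulated and summarized.

```python
import random
import statistics

random.seed(42)  # make the illustration reproducible

def simulate_condition(anchor_bump, n=28):
    """Simulate causation-confidence ratings (0-100%) for one anchor condition.

    anchor_bump is a hypothetical shift in mean confidence produced by the
    anchor; the baseline and noise values below are invented for illustration.
    """
    baseline = 26.0   # hypothetical mean confidence under a low anchor
    noise_sd = 12.0   # hypothetical juror-to-juror variability
    ratings = [random.gauss(baseline + anchor_bump, noise_sd) for _ in range(n)]
    return [min(100.0, max(0.0, r)) for r in ratings]  # clamp to 0-100%

low_anchor = simulate_condition(anchor_bump=0.0)    # e.g., a $20,000 demand
high_anchor = simulate_condition(anchor_bump=18.0)  # e.g., a $5 million demand

print(f"Mean confidence, low anchor:  {statistics.mean(low_anchor):.1f}%")
print(f"Mean confidence, high anchor: {statistics.mean(high_anchor):.1f}%")
```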
Anchoring Effect & Judges
Judges, unlike juries, are subject-matter experts. Can they really be susceptible to anchoring as well?
According to some preliminary research, the answer is probably yes. Once an anchor is set, a judge appears more likely to interpret subsequent information around that anchor, even when the anchor is totally irrelevant.
In one study, judges were presented with a hypothetical case involving a shoplifter who had just been caught for the 12th time. The judges were asked to sentence the shoplifter, but only after the prosecutor made a sentencing demand. And here's the twist: the judges were told ahead of time that the prosecutor's demand was totally arbitrary and random, and therefore contained no useful information.
Even so, the judges who received the low anchor (i.e., the prosecutor demanding a shorter sentence) landed on a shorter average sentence than the judges in the high anchor condition.
[image adapted from https://www.thelawproject.com.au/insights/anchoring-bias-in-the-courtroom]
Ok, so the judges assigned weight to a prosecutor’s (random) sentencing demand. But does this really prove the anchoring effect in judges?
So, to squash any doubts about the anchoring effect, the same group of researchers went about designing the most absurd scenario possible. Here, a group of judges were given a hypothetical case in which prosecutors were charging a defendant with theft. Instead of being given the prosecutor's sentencing demand, the judges were told to determine the sentencing demand by rolling dice. (Yes, really.) The dice were rigged so as to land on high numbers for one group of judges and low numbers for the rest.
And … somewhat unbelievably, the outcome of the dice-rolling exercise influenced the judges’ sentencing decisions.
[image from https://www.thelawproject.com.au/insights/anchoring-bias-in-the-courtroom]
Ok, so even if the results of this study hold, judges don't roll dice before they make sentencing decisions, and prosecutors don't make random sentencing requests. So does anchoring really affect judges' decision-making on the bench?
Well, it might. Englich et al. explain:
“Even though judges typically do not throw dice before making sentencing decisions, they are still constantly exposed to potential sentences and anchors during sentencing decisions. The mass media, visitors to the court hearings, the private opinion of the judge’s partner, family, or neighbors are all possible sources of sentencing demands that should not influence a given sentencing decision.”
[Playing Dice With Criminal Sentences: The Influence of Irrelevant Anchors on Experts’ Judicial Decision Making (2006) by Birte Englich, Thomas Mussweiler, & Fritz Strack]
References
“Coherent Arbitrariness”: Stable Demand Curves without Stable Preferences (2003) by Dan Ariely, George Loewenstein and Drazen Prelec
The More You Ask for, the More You Get: Anchoring in Personal Injury Verdicts (1996) by Gretchen B. Chapman & Brian H. Bornstein
Playing Dice With Criminal Sentences: The Influence of Irrelevant Anchors on Experts’ Judicial Decision Making (2006) by Birte Englich, Thomas Mussweiler, & Fritz Strack
The Anchoring Bias and Its Effect on Judges by Rod Hollier, https://www.thelawproject.com.au/insights/anchoring-bias-in-the-courtroom
About the Author
Tom Spiegler
Tom is a Co-Founder and Managing Director at The Decision Lab. He is interested in the intersection of decision science and the law, with a focus on leveraging behavioral research to shape more effective public and legal policy. Tom graduated from Georgetown Law with honors. Prior to law school, Tom attended McGill University, where he graduated with First Class Honors with majors in Philosophy and Psychology.