Imagine you’re driving along the highway and see an electric sign saying “79 traffic deaths this year.” Would seeing the sign make you less likely to crash your car shortly afterward? Or perhaps you think it would have no effect?
Neither is true. According to a recent peer-reviewed study1 in the journal Science, one of the world’s top academic publications, you would be more likely to crash, not less. Talk about unintended consequences!
Where this road safety campaign went off course
The study examined seven years of data from 880 electric highway signs that, as part of a safety campaign, displayed the year-to-date number of traffic deaths for one week each month. The researchers found that crashes increased by 1.52% within three miles of the signs during these campaign weeks, compared to the other weeks of the month when the signs did not show fatality information.
That’s about the same impact as raising the speed limit by four miles per hour or cutting the number of highway troopers by 10%. The scientists calculated that the social costs of these fatality messages amount to $377 million per year, with 2,600 additional crashes and 16 deaths.
The cause? Distracted driving.2 These “in-your-face” messages, the study finds, grab your attention and undermine your driving — the same reason you shouldn’t text and drive.
Supporting their hypothesis, the scientists discovered that the increase in crashes is larger when the reported death toll is higher: as the number displayed on the signs climbs over the course of the year, so does the percentage increase in crashes. And it’s not the weather: the effect of showing the fatality messages decreased by 11% between January and February, as the displayed number of deaths resets for the year. They also found that the increase in crashes is largest on more complex road segments, which demand more focus from the driver.
Their findings also align with other studies. One3 found that increasing people’s anxiety causes them to drive worse. Another4 showed drivers fatality messages in a laboratory setting and found that doing so increased their cognitive load, distracting them from the driving task.
If the authorities had paid attention to cognitive science research, they would never have launched these fatality message campaigns. Instead, they relied on armchair psychology, following their gut intuitions about what should work rather than measuring what does work.5 The result was what scholars call a boomerang effect,6 an intervention that produces an effect opposite to the one intended.
How bad policy decisions may have contributed to youth drug use
Unfortunately, such boomerang effects happen all too often. Consider another safety campaign, the National Youth Anti-Drug Media Campaign, which the US Congress funded between 1998 and 2004 to the tune of $1 billion. Using professional advertising and public relations firms, the campaign created comprehensive marketing efforts that targeted youths aged 9 to 18 with anti-drug messaging, focusing on marijuana. The messages spread through television, radio, websites, magazines, movie theaters, and other venues, and through partnerships with civic, professional, and community groups, with the goal that youths would see two to three ads per week.
A 2008 study7 funded by the National Institutes of Health found that youths were indeed exposed to two to three ads per week. On the whole, however, the campaign didn’t reduce marijuana use. Worse, the researchers found that for some survey participants, exposure to the ads predicted an increased likelihood of using marijuana.
Why? The authors found evidence that youths who saw more ads came away with a stronger impression that marijuana use was widespread among their peers, and this belief made them more likely to start using marijuana themselves. Talk about a boomerang effect!
How behavioral design can shape better policy decisions
We know that message campaigns, whether on electric signs or through advertisements, can have a substantial effect. That fits extensive cognitive science research on nudges: non-coercive efforts to shape the environment so as to influence people’s behavior in a predictable manner.
Those with authority — whether in government or business8 — frequently attempt to nudge other people based on their mental model of how others should behave. Unfortunately, their mental models are often fundamentally flawed, due to dangerous judgment errors called cognitive biases.9 These mental blindspots10 impact decision-making in all life areas, from business5 to relationships.11 Fortunately, recent research has shown effective strategies to defeat these dangerous judgment errors, such as by constraining our choices to best practices and measuring the impact of our interventions.
Unfortunately, such reliance on best practices and measurement of interventions happens far too rarely. Fatality signage campaigns ran for many years without assessment, and the federal government ran the anti-drug campaign from 1998 to 2004, with the study measuring its impact appearing only in 2008.
Instead, the authorities need to consult cognitive and behavioral science experts on nudges before launching their interventions. And what the experts will tell them is that it’s critical to evaluate the impact of proposed nudges in small-scale experiments. That’s because, while extensive research shows nudges do work,12 only 62% have13 a statistically significant impact, and up to 15%14 of well-intentioned interventions may backfire.
The importance of monitoring outcomes
Fortunately, running a small-scale study is quite doable in most cases. Returning to the road sign example, authorities could pilot, say, 100 electric signs in a diversity of settings and evaluate their impact over three months. Similarly, ads can run for a few months in a variety of nationally representative markets before their effectiveness is assessed.
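To make the idea concrete, here is a minimal sketch of how such a pilot comparison might be evaluated, using a standard two-proportion z-test on crash rates during message weeks versus control weeks. The function name and all numbers below are hypothetical illustrations, not figures from the study:

```python
from math import erf, sqrt

def two_proportion_z_test(crashes_a, exposure_a, crashes_b, exposure_b):
    """Compare crash rates between two conditions (e.g., message weeks
    vs. control weeks) with a pooled two-proportion z-test. `exposure`
    counts comparable observation units, such as vehicle passes
    recorded near the piloted signs."""
    rate_a = crashes_a / exposure_a
    rate_b = crashes_b / exposure_b
    pooled = (crashes_a + crashes_b) / (exposure_a + exposure_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / exposure_a + 1 / exposure_b))
    z = (rate_a - rate_b) / std_err
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical pilot data: crashes per million vehicle passes near
# 100 signs, message weeks vs. non-message weeks over three months.
z, p = two_proportion_z_test(1520, 1_000_000, 1400, 1_000_000)
```

If a pilot like this showed a statistically significant increase in crashes during message weeks, the campaign could be halted before a full rollout, which is exactly the kind of check the fatality signs never received.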
Behavioral science is critical here: when road signs are tested by those without expertise in how our minds work, the results are often counterproductive. For example, a group of engineers at Virginia Tech did a study4 of road signs that used humor, popular culture, sports, and other nontraditional themes with the goal of provoking an emotional response. They measured the neuro-cognitive response of participants who read the signs and found that “messages with humor, and messages that use wordplay and rhyme elicit significantly higher levels of cognitive activation in the brain… an increase in cognitive activation is a proxy for increased attention.” The researchers concluded that because drivers paid more attention, the signs worked.
Guess what? By that definition, the fatality signs worked, too! They drew drivers’ attention to the fatality numbers, and thereby distracted them from the road. That’s an example of how NOT to design a study: the outcome measure for testing road signs should be the resulting number of crashes, not whether a sign emotionally arouses and cognitively loads the driver.
Yet that’s not what the authorities chose to do. And based on past experiences, the future has many more boomerang effects in store that have the potential to harm our society and organizations, unless those with power make a commitment to following the science.15
- Hall, J. D., & Madsen, J. (2020). Can Behavioral Interventions Be Too Salient? Evidence From Traffic Safety Messages. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.3633014
- Strayer, D.L., Cooper, J.M., Turrill, J., Coleman, J.R., Medeiros-Ward, N. & Biondi, F. (2013). Measuring Cognitive Distraction in the Automobile (Technical Report). Washington, D.C.: AAA Foundation for Traffic Safety.
- Roidl, E., Frehse, B., & Höger, R. (2014). Emotional states of drivers and the impact on speed, acceleration and traffic violations—A simulator study. Accident Analysis & Prevention, 70, 282–292. https://doi.org/10.1016/j.aap.2014.04.010
- Shealy, T., Kryschtal, P., Franczek, K., & Katz, B. J. (2021). Driver Response to Dynamic Message Sign Safety Campaign Messages (FHWA/VTRC 20-R16). https://www.virginiadot.org/vtrc/main/online_reports/pdf/20-r16.pdf
- Tsipursky, G., & Howard, J. R. (2019). Never Go With Your Gut: How Pioneering Leaders Make the Best Decisions and Avoid Business Disasters. Career Press.
- APA Dictionary of Psychology. (n.d.). American Psychological Association. https://dictionary.apa.org/boomerang-effect
- Hornik, R., Jacobsohn, L., Orwin, R., Piesse, A., & Kalton, G. (2008). Effects of the National Youth Anti-Drug Media Campaign on Youths. American Journal of Public Health, 98(12), 2229–2236. https://doi.org/10.2105/ajph.2007.125849
- Güntner, A., Lucks, K., & Sperling-Magro, J. (2021, March 1). Lessons from the front line of corporate nudging. McKinsey & Company. https://www.mckinsey.com/business-functions/people-and-organizational-performance/our-insights/lessons-from-the-front-line-of-corporate-nudging
- Kahneman, D., & Tversky, A. (1996). On the reality of cognitive illusions. Psychological Review, 103(3), 582–591. https://doi.org/10.1037/0033-295x.103.3.582
- Tsipursky, G., & McRaney, D. (2020). The Blindspots Between Us: How to Overcome Unconscious Cognitive Bias and Build Better Relationships. New Harbinger Publications.
- Tsipursky, G. (2022, April 3). 10 Science-Based Tips to Avoid Dating Disasters. Top10.com. https://www.top10.com/dating/10-science-tips-to-avoid-dating-disaster
- Mertens, S., Herberz, M., Hahnel, U. J. J., & Brosch, T. (2022). The effectiveness of nudging: A meta-analysis of choice architecture interventions across behavioral domains. Proceedings of the National Academy of Sciences, 119(19). https://doi.org/10.1073/pnas.2204059119
- Hummel, D., & Maedche, A. (2019). How effective is nudging? A quantitative review on the effect sizes and limits of empirical nudging studies. Journal of Behavioral and Experimental Economics, 80, 47–58. https://doi.org/10.1016/j.socec.2019.03.005
- BehavioralEconomics.com. (2022, February 17). How Well Do Nudges Work? https://www.behavioraleconomics.com/how-well-do-nudges-work/
- Tsipursky, G. (2020, May 4). 12 Mental Skills to Defeat Cognitive Biases. Disaster Avoidance Experts. https://disasteravoidanceexperts.com/12-mental-skills-to-defeat-cognitive-biases/
About the Author
Dr. Gleb Tsipursky is a behavioral economist, cognitive neuroscientist, and bestselling author of several books on decision-making and cognitive biases. His newest book is Pro Truth: A Pragmatic Plan to Put Truth Back Into Politics (Changemakers Books, 2020). Dr. Tsipursky is on a mission to protect people from dangerous judgment errors through his cutting-edge expertise in disaster avoidance, decision making, social and emotional intelligence, and risk management. He founded Disaster Avoidance Experts, a behavioral economics consulting firm that empowers leaders and organizations to avoid business disasters. His thought leadership has been featured in over 500 articles he has published and 450 interviews he has given to popular venues such as CBS News, Scientific American, Psychology Today, and Fast Company, among others. Dr. Tsipursky earned his PhD in the History of Behavioral Science at the University of North Carolina at Chapel Hill, his M.A. at Harvard University, and his B.A. at New York University.