Why do we accept the first plausible AI solution and stop searching?

The Automation Bias, explained.

What is Automation Bias?

Automation bias describes our tendency to accept and favor answers from automated decision-making systems, such as large language models like ChatGPT, even when we encounter contradictory information. We often trust the output of automated systems without critically evaluating it, even when our own judgment suggests otherwise.

Where this bias occurs

Imagine that you are flying a plane (don’t worry, you have your pilot’s license!) with advanced autopilot and flight management systems. When it comes time to land, you program the systems for an automatic landing. As you begin your descent, you notice that the runway lights appear higher than usual, which can be a sign that the plane is too low—but since the navigation guide display shows that you’re perfectly lined up for landing, you ignore it. A few moments later, air traffic control calls in and warns you that the plane is too low for landing. Luckily, you have time to adjust and land safely, but that was a close one!

In this scenario, a key factor at play is automation bias. You trusted the navigation system more than your own judgment, causing you to ignore what you saw with your own eyes. As automated systems have become more advanced, automation bias has become more likely to occur, since we tend to perceive technology as more reliable than human judgment. Unfortunately, automation bias has contributed to multiple real-world crashes. It’s also led to poor outcomes in healthcare, finance, and military defense.1

Although automated tools can help us complete tasks, scenarios like this demonstrate the importance of continuing to apply our critical thinking skills to evaluate their outputs rather than blindly accepting them.

Sources

  1. Hoffman, B. (2024, March 10). Automation bias: What it is and how to overcome it. Forbes. https://www.forbes.com/sites/brycehoffman/2024/03/10/automation-bias-what-it-is-and-how-to-overcome-it/
  2. Pacific Lutheran University. (n.d.). The most Googled health questions and symptoms of 2024 in every state. https://absn.plu.edu/most-googled-health-questions/
  3. Abdelwanis, M., Alarafati, H. K., Tammam, M. M. S., & Simsekler, M. C. E. (2024). Exploring the risks of automation bias in healthcare artificial intelligence applications: A Bowtie analysis. Journal of Safety Science and Resilience, 5(4), 460–469. https://doi.org/10.1016/j.jnlssr.2024.06.001
  4. Deloitte UK. (2023, January 20). Automation bias: What happens when trust goes too far? Deloitte UK. https://www.deloitte.com/uk/en/services/consulting/research/automation-bias.html
  5. Khan, A. (2023, January 6). Military investigation reveals how the U.S. botched a drone strike in Kabul. The New York Times. https://www.nytimes.com/2023/01/06/us/politics/drone-civilian-deaths-afghanistan.html#:~:text=63-,Military%20Investigation%20Reveals%20How%20the%20U.S.%20Botched%20a%20Drone%20Strike,their%20assessment%20of%20civilian%20casualties
  6. American Civil Liberties Union. (n.d.). Williams v. City of Detroit — face‑recognition false arrest. https://www.aclu.org/cases/williams-v-city-of-detroit-face-recognition-false-arrest
  7. Phoenix Strategy Group. (2025, March 4). AI bias in financial forecasting: Risks and solutions. https://www.phoenixstrategy.group/blog/ai-bias-in-financial-forecasting-risks-and-solutions
  8. Lumenova AI. (2024, August 27). Overreliance on AI: Addressing automation bias today. https://www.lumenova.ai/blog/overreliance-on-ai-adressing-automation-bias-today/
  9. Skitka, L. J., Mosier, K. L., & Burdick, M. (1999). Does automation bias decision-making? International Journal of Human-Computer Studies, 51(5), 991–1006. https://doi.org/10.1006/ijhc.1999.0252
  10. Paula, D., Bauder, M., Pfeilschifter, C., Petermeier, F., Kubjatko, T., Böhm, K., Riener, A., & Schweiger, H.-G. (2023). Impact of Partially Automated Driving Functions on Forensic Accident Reconstruction: A Simulator Study on Driver Reaction Behavior in the Event of a Malfunctioning System Behavior. Sensors, 23(24), 9785. https://doi.org/10.3390/s23249785
  11. Wesley, D., & Dau, L. A. (2016). Complacency and automation bias in the Enbridge pipeline disaster. Ergonomics in Design: The Quarterly of Human Factors Applications, 25(1), 17–22. https://doi.org/10.1177/1064804616652269

About the Author

Emilie Rose Jones

Emilie currently works in Marketing & Communications for a non-profit organization based in Toronto, Ontario. She completed her Master’s in English Literature at UBC in 2021, where she focused on Indigenous and Canadian literature. Emilie has a passion for writing and behavioural psychology and is always looking for opportunities to make knowledge more accessible.
