Lowering the Human Cost of Drone Warfare
If you had a 10% success rate at work for five months, how much longer do you think your employer would keep you around? Most of us wouldn’t be allowed to fail at that rate for five months, let alone receive additional time to make up for those shortcomings. And we generally hold those who are responsible for human life, such as medical professionals and law enforcement officials, to even higher standards in their work.
Yet, as revealed in The Drone Papers last year, America’s drone warfare programs have at times operated at this low level of success. During one five-month period between January 2012 and February 2013, almost 90% of those killed in American drone strikes were not the intended targets. This amount of collateral damage – of innocent lives lost – should not be acceptable. Nevertheless, neither substantive changes to drone programs nor individual punishments for lethal errors have occurred in the wake of these reports.
There are many factors that shape America’s drone policy, most notably political and economic concerns, but a social psychological perspective offers another way to understand and potentially address some of the failures of drone policy. For instance, considering two psychological phenomena, the bystander effect and moral disengagement, and how they may function within our drone bureaucracies reveals ways that policymakers can improve the current infrastructure and save lives.
The Bystander Effect
The bystander effect, defined as “the phenomenon that an individual’s likelihood of helping decreases when passive bystanders are present in a critical situation,” has been studied in laboratory and naturalistic settings for over 40 years (see Fischer et al., 2011, for an examination of over 50 studies). Three psychological processes have been highlighted as the main contributors to this effect: diffusion of responsibility, evaluation apprehension, and pluralistic ignorance. Fischer et al. (2011) defined each of these terms as follows:
Diffusion of Responsibility
“The tendency to subjectively divide the personal responsibility to help by the number of bystanders. The more bystanders there are, the less personal responsibility any individual bystander will feel.”
Evaluation Apprehension
“The fear of being judged by others when acting publicly. In other words, individuals fear to make mistakes or act inadequately when they feel observed, which makes them more reluctant to intervene in critical situations.”
Pluralistic Ignorance
“The tendency to rely on the overt reactions of others when defining an ambiguous situation. A maximum bystander effect occurs when no one intervenes because everyone believes that no one else perceives an emergency.”
Essentially, these findings boil down to this: in a critical situation, we are less likely to assist a person in need when we are uncertain about the appropriate action to take, are afraid of making a mistake or looking foolish, and see that others are not helping.
Much of the literature on the bystander effect focuses on how individuals react in everyday situations, and a litany of YouTube videos demonstrates how we can pass by strangers in public who need assistance. However, much less attention has been paid to the impact the bystander effect may have on everyday operations within organizations like the military.
Bystander Effect in Drone Policy Implementation
While it is important to note that the effects of diffusion of responsibility, evaluation apprehension, and pluralistic ignorance have not been empirically studied within drone operation units, what we know about how these psychological processes inhibit people from taking action should make us concerned about the depersonalized and dehumanizing components of drone warfare.
Diffusion of responsibility
Even before the Drone Papers, investigations of the situational and psychological pressures affecting teams of drone operators found that “it is most appropriate to approach [drone warfare] as a form of killing that has an elaborate and intentional bureaucratized structure” (Asaro, 2013). Such bureaucratic chains of command literally diffuse responsibility throughout the social system and away from any individual operator, making it difficult to hold any one person responsible for a mistake.
For instance, when the LA Times covered the unreleased Pentagon review of a friendly-fire drone incident that resulted in the deaths of two enlisted service members in Afghanistan, it found that no individuals involved were held culpable for the killings. Instead, the report concluded that the incident resulted from a “fatal mix of poor communications, faulty assumptions and ‘a lack of overall common situational awareness.’”
Furthermore, accounts from former drone operators attest to the difficulties in trying to report concerns about fellow operators or mission objectives to their superiors. Thus, it seems likely that evaluation apprehension, in the form of anxiety or reluctance to voice alternative plans of action, factored into these friendly-fire deaths.
Feeling unable to voice dissent likely contributed to the pluralistic ignorance displayed in this incident as well: the poor communication among those involved in these missions probably arose because many of them “believed that they had come to the same conclusions” despite having reservations about the selected targets (Asaro, 2013).
These three psychological processes seem to have contributed to bystander effects that allowed drone teams to make fatal mistakes, but there are additional psychological factors contributing to unintended deaths in drone operations.
About the Author
Jared is a PhD student in social psychology and a National Science Foundation Graduate Research Fellow at the University of California, Irvine. He studies political and moral decision-making and believes that psychological insights can help improve political discourse and policymaking.