Tracing the Origins of the Anti-Vaccine Movement

The COVID-19 pandemic was accompanied by a surge in conspiracy theories that blunted global efforts to stop the spread of SARS-CoV-2. At the time of writing, more than three and a quarter million people have died of COVID-19.1 It’s likely that many of these deaths could have been prevented but for the proliferation of conspiracy theories that reduced the public’s trust in medical experts and government officials.

These conspiracy theories are driven by the unintentional and intentional dissemination of false information—misinformation and disinformation, respectively.2 Disinformation is particularly harmful as it is designed to damage social institutions.

In February 2020, the WHO identified the danger that the proliferation of misinformation and disinformation posed to efforts to combat COVID-19.3 The Director-General of the WHO and the Secretary-General of the UN both characterized this as an infodemic.4,5 One year later, empirical evidence indicates that existing strategies for mitigating this infodemic are inadequate.6 Policymakers must use these data to design effective strategies that counter the growing public health threat posed by novel SARS-CoV-2 variants. Central to this goal is identifying the target for these intervention strategies.

Vaccination efforts are impeded by misinformation and disinformation

In an article for TDL, Sanketh Andhavarapu identified vaccine hesitancy and the anti-vax movement as the greatest challenge to controlling and ending the COVID-19 pandemic. Vaccines provide the greatest measure of protection against the SARS-CoV-2 virus, and the risk of vaccine side effects is far less than the risk of complications or death from COVID-19. However, a large fraction of the U.S. population is reluctant to be vaccinated. This hesitancy stems from the spread of vaccine misinformation and disinformation, particularly through social media.7

The spread of anti-vax conspiracy theories must be stopped if vaccines are to achieve their full potential in ending the global pandemic. Behavioral interventions aimed at promoting vaccination campaigns must target online misinformation and disinformation.

The design of effective interventions requires an understanding of the origin of the modern anti-vax movement. That movement began nearly a quarter-century ago with the publication of a clinical study linking the measles, mumps, and rubella (MMR) vaccine to autism spectrum disorder (ASD). This case also serves to highlight lessons learned by the scientific and clinical research communities that will strengthen efforts to stop the dissemination and proliferation of anti-vax ideology.

Vaccines, autism, and the clinical study that changed the world

The anti-vax movement has existed since Edward Jenner established the use of vaccines to prevent smallpox.8,9,10 Early resistance to vaccination was grounded in civil liberty and in religious objections to the injection of non-human substances. Although these concerns also lie at the heart of the modern anti-vax movement, Victorian vaccine hesitancy was rooted in the era's class divide, a poorly regulated medical community, and inadequate public education efforts.11

In contrast to Victorian times, modern vaccination efforts in the information era are bolstered by free public access to more knowledge than has ever been available at any other time in human history. Unfortunately, this has not dampened the deeply emotional and politicized opposition to vaccines that scientists, healthcare workers, and policymakers face today.12 This is due in large part to a failure of the scientific community to address the anti-vax movement when it was catalyzed by a 1998 publication in the peer-reviewed medical journal, The Lancet.13

This study reported the onset of regressive autism in 12 patients within two weeks of receiving an MMR vaccine. (The authors also linked vaccination with bowel disease, but this is not often mentioned by anti-vaxxers.)14 The possibility that vaccines could cause neurodevelopmental disorders in previously healthy individuals appropriately received great attention in the academic and public stakeholder communities. Its publication in The Lancet, one of the world's most influential peer-reviewed clinical research journals, gave it immediate credibility. However, it was soon noted that the causal link between MMR vaccination and regressive autism implied by Wakefield and colleagues rested on shoddy evidence.15

The publication of these unsubstantiated claims was immediately criticized and followed by studies refuting any causal association between vaccines and developmental disorders.16,17 Investigations by Sunday Times journalist Brian Deer built on this critique and culminated in a 2004 complaint to The Lancet editors alleging possible research misconduct by Wakefield and colleagues.18 Presented with credible evidence of misconduct, the editors were ethically obliged to investigate.

The editor of The Lancet, Richard Horton, published a response stating that there was no basis for Deer's allegations.19 The Lancet also allowed Wakefield and senior co-authors to publish a mild correction of interpretation20 and to flatly deny Deer's allegations of misconduct without providing any evidence to support their position.21,22 In further contempt of scientific ethics, a complaint was filed against the investigative journalist.

Deer was not deterred. His investigations exposed extensive fraud committed by Wakefield and colleagues, including:

  • The selective exclusion of specific traits in patients that did not fit the article’s conclusions;
  • Failure to report that 5 of the 12 patients had been previously diagnosed with developmental abnormalities at the time of recruitment into the study;
  • The labeling of all 12 patients as “healthy,” when in reality all had pre-existing conditions that were relevant to the study;
  • Failure to disclose that patients were recruited to the study by an anti-vax organization; and
  • Failure to report that the study was initiated and funded by lawyers planning litigation against vaccine manufacturers, and that Wakefield received payment from this source.23,24,25

In 2010, increasing pressure led the editors of The Lancet to quietly issue a full retraction notice for the Wakefield article.26 Wakefield remains adamant that his work linking an MMR vaccine to the development of autism is based on ethical and replicable clinical research. The Wakefield case is now condemned by the academic community as one of the greatest frauds of the 20th century, as is best exemplified by the 2011 article published by the editors of the British Medical Journal aptly entitled “Wakefield’s article linking MMR vaccine and autism was fraudulent.”27

Unfortunately, the belated response by The Lancet editors to the Wakefield case did very little to undo the damage caused by the persistence of this work in the public record for 12 years.

The utility of scientific research depends upon public trust

The damage caused by the Wakefield article is evident in vaccination trends following his 1998 publication. In the U.K., MMR vaccination coverage decreased from 92% in 1996 to 84% in 2002, and by 2003 had fallen below the level necessary to prevent a measles outbreak in London.28 Measles outbreaks have since been reported around the world, causing deaths that vaccines would likely have prevented.


The persistent consequences of the Wakefield case emphasize several important lessons for scientists, physicians, and policymakers who face the daunting task of addressing vaccine hesitancy and denial during the COVID-19 pandemic.

First, the academic community failed public stakeholders. It was the work of investigative journalists, not scientists and physicians, that exposed the fraudulent link between vaccines and autism. The failure of the academic community to respond ethically to Wakefield's fraudulent research was the catalyst for a movement that now promotes the erroneous beliefs that vaccines cause autism and that scientists cannot be trusted. This damage must be undone if minimum vaccination goals are to be reached. Public trust in the work of scientists and physicians must be restored.

Second, the destructive consequences of the Wakefield fraud would have been mitigated or prevented if the concerns raised by a small number of scientists, and the evidence brought forth by Deer in 2004, had been acted upon ethically by the academic community. External oversight of the research community would help restore public trust in science and promote scientific progress by preventing fraud. Unfortunately, this need for oversight went unmet, leaving a void that was filled with misinformation and exploited by purveyors of disinformation.

Third, the spread of misinformation must be stopped, and more should be done to detect and shut down disinformation campaigns. Physicians and scientists increasingly view the loss of public trust in scientific research as the greatest future threat to healthcare and social stability.29 During the ongoing pandemic, non-experts can say anything they want and are treated as trustworthy sources by public stakeholders who are now unwilling to trust scientists, physicians, and policymakers. Scientists and physicians can earn back public trust, but only by reaching out to communicate in accessible language.30 Similarly, policymakers can encourage this process by communicating research in a non-partisan manner.31

The key to ending the spread of misinformation, disinformation, and conspiracy theories is increased access to reliable information and scientific literacy among public stakeholders.7 The reason for this need and the means to achieving this end are one and the same: social media.

The weaponization of social media

Social media-driven disinformation campaigns are targeted towards specific nations and subpopulations in order to disrupt social stability by manipulating the behavior of the public. The COVID-19 pandemic—specifically, the speed with which the SARS-CoV-2 virus was able to spread around the world, and the scale of the devastation it wrought—has renewed fears that in the not-so-distant future, such techniques could represent a new frontier in biowarfare.32 The combined effect of disinformation campaigns and naturally occurring health pandemics has the potential to be as effective as biological weapons at destabilizing societies.

Now is the time to turn COVID-19 into an opportunity to develop effective behavioral interventions that counter disinformation campaigns targeting vulnerable populations. During future pandemics, this work could be essential for saving lives and avoiding an even greater disaster than what we’ve seen over the past year.

We must address the role of social media in perpetuating and exacerbating the damage the Wakefield fraud has caused to public perceptions of vaccine safety. Since the outbreak of COVID-19, the role of social media in disseminating misinformation and disinformation has been tracked and reported in peer-reviewed journals. This research has identified behavioral interventions targeting social media use that can increase public confidence in vaccines and science.

This will be accomplished through the development of novel interventions and the refinement of existing strategies that strengthen cooperation between scientists and stakeholders.32 One solution is to invest in existing organizations that provide a social media platform translating primary research into an accessible format.33 Public confidence in these organizations will be greater if they are independent of government influence.34

This strategy must be alert to changing trends in social media misinformation movements and disinformation campaigns, and be highly responsive to these changes by posting facts supported with valid primary source references.35,36 The means to fact-check disinformation in real time will likely be provided by machine learning technology.37
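To make the machine learning idea above concrete, the toy sketch below shows one common approach: a simple text classifier that scores incoming posts and flags likely misinformation for human fact-checkers. Everything in it is hypothetical and invented for illustration (the example posts, labels, and flagging threshold); a real system would need large, carefully curated datasets and human review, and this is only a minimal sketch of the technique, not a production design.

```python
# Toy sketch of a misinformation-flagging classifier.
# All posts, labels, and the threshold below are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled training data: 1 = likely misinformation, 0 = reliable.
posts = [
    "vaccines cause autism doctors hide the truth",
    "miracle cure suppressed by big pharma share before deleted",
    "health officials report vaccine side effects are rare and mild",
    "peer reviewed trial shows vaccine reduces severe illness",
]
labels = [1, 1, 0, 0]

# TF-IDF turns each post into a weighted word-frequency vector;
# logistic regression then learns which words signal each class.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

# Score a new post and flag it for human review if it looks risky.
new_post = "secret document proves vaccines cause autism"
risk = model.predict_proba([new_post])[0][1]  # probability of class 1
flagged = risk > 0.5
```

In practice the model's output would only prioritize posts for human fact-checkers, not remove content automatically, since classifiers of this kind make errors and can be gamed by rewording.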

Success will follow from cooperation and vigilance

Disinformation campaigns exploit existing social divisions and disparities, and often use hate speech to target specific audiences. Their operators can adapt to counter efforts to expose existing campaigns, and those efforts are futile if the public is not inclined to consult verified sources first and to examine sensational information in a neutral, dispassionate manner.

It all comes down to the public's perception of a source's credibility. This is the major challenge facing efforts to raise public awareness of the need to identify fake news, and to increase the public's willingness to turn to verified, apolitical sources for information. Improved trust and cooperation between public stakeholders and scientific, medical, and government officials will improve health outcomes and vaccination efforts. We can only hope that the ongoing devastation of COVID-19 and the inevitability of future pandemics will drive innovations that heal the infodemic.
