
The Fatal Consequences of COVID-19 Misinformation

Nov 05, 2020

There is a significant link between the mainstream media sources we rely on and our real-world behavior. This is the driving insight of a new study on viewers of Tucker Carlson’s and Sean Hannity’s Fox News shows, which found that the hosts’ divergent handling of COVID-19 (mis)information directly affected their viewers’ likelihood of contracting and even dying from the virus.1

Misinformation around COVID has spread widely since the outset of the pandemic through drivers such as talk show hosts, politicians, and myriad other online influencers.2,3 Researchers previously suspected that such misinformation leads to adverse public health outcomes; this new study clearly demonstrates it.

The propensity to believe misinformation is, of course, intricately tied to our broader worldviews and to the content of the information itself. Moreover, in the case of COVID-19 risk perception, a number of cognitive biases (such as overconfidence or present bias) may factor into how seriously we take our risk of adverse health outcomes. Fortunately, peer-reviewed research has recently revealed effective behavioral science-based interventions that address these cognitive biases and prevent people from both falling for and spreading misinformation. Even newer research has demonstrated that such evidence-based approaches are effective against COVID-19 misinformation in particular, leaving some cause for hope as a pronounced second (or third) wave looms.4


COVID-19 as covered by Sean Hannity and Tucker Carlson

Hannity and Tucker Carlson Tonight are two very popular opinion shows on Fox News, and the hosts share similar ideological profiles and audience demographics (typically older, conservative adults).

Yet, despite their similarities, Hannity and Carlson covered the early pandemic in February and early March in very different ways. The health consequences of this difference were the focus of a study by researchers at the Becker Friedman Institute for Economics at the University of Chicago.1

Carlson treated COVID-19 seriously from the outset, and was well ahead of most other mainstream media outlets when he started highlighting the dangers of a pandemic on his show on January 28. These warnings continued through the following month, and on February 25 Carlson told his viewers that more than a million people would die in the United States due to COVID-19.5

On the other hand, Hannity downplayed COVID-19 from the start. He claimed the risk of a US-based outbreak was low, spread the discredited view that the virus was no deadlier than the average flu, and accused Democrats of weaponizing and politicizing the virus. On February 27, he said that many Democrats wanted COVID-19 to “wreak havoc” in the country, just to score political points.6

This eventually changed when President Donald Trump declared COVID-19 a national emergency in mid-March. Hannity and other Fox News hosts began to acknowledge the gravity of the situation, and Fox’s coverage converged around Carlson’s early narrative. Yet, as the study below demonstrates, the damage had already been done.

Behavioral and health impacts of coverage differences

Researchers at the Becker Friedman Institute studied how each host’s coverage of COVID-19 affected viewer behavior. To do so, they surveyed over 1,000 people who watch Fox News at least once a week, evaluating whether the respondents modified their behavior in response to the pandemic and whether they adopted recommended practices such as social distancing and improved hygiene.

The researchers next compared behavior changes to viewing patterns. They found that on average, Hannity viewers changed their behavior five days later than viewers of other shows. Meanwhile, Tucker Carlson Tonight viewers changed their behavior three days earlier than viewers of other shows.

To estimate the impact of each information source on actual health outcomes, researchers compared the shows’ popularity in specific US counties to data on COVID-19 infections and deaths. After they controlled for a variety of possible confounding variables, they discovered that the US counties where Hannity had higher viewership had more COVID-19 cases and deaths two weeks later (roughly the length of time it takes after initial infection for the virus’ community impact to be seen).

According to these academics, the “effects [from news sources] on cases start to rise in late February and peak in mid-to-late March before starting to decline, consistent with the convergence in coronavirus coverage between Hannity and Carlson. A one standard deviation greater viewership difference is associated with approximately 2 percent more cases on March 7, 5 percent more cases on March 14, and 10 percent more cases on March 21. Deaths follow a similar trajectory on a two-week lag.”
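To make the study’s basic design concrete, here is a minimal, purely illustrative sketch of a county-level regression of the kind the researchers describe: cases roughly two weeks out regressed on relative viewership plus controls. The data, column names, and coefficients below are hypothetical stand-ins, not the Becker Friedman Institute’s actual dataset, variables, or code.

```python
# Illustrative sketch only: synthetic data and hypothetical column names,
# not the actual study's dataset or analysis code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_counties = 500

# Hypothetical county-level data: standardized difference in Hannity vs.
# Carlson viewership, plus a couple of simple demographic controls.
df = pd.DataFrame({
    "viewership_diff_std": rng.normal(size=n_counties),  # Hannity minus Carlson, in SD units
    "median_age": rng.normal(40, 5, n_counties),
    "pop_density": rng.lognormal(5, 1, n_counties),
})

# Simulate roughly 5% more cases per SD of viewership difference
# (the mid-March figure quoted above), measured two weeks later.
df["log_cases_lag14"] = (
    0.05 * df["viewership_diff_std"]
    + 0.01 * df["median_age"]
    + 0.10 * np.log(df["pop_density"])
    + rng.normal(scale=0.3, size=n_counties)
)

# County-level regression with controls, mirroring the study's basic idea:
# lagged cases regressed on relative viewership plus confounders.
model = smf.ols(
    "log_cases_lag14 ~ viewership_diff_std + median_age + np.log(pop_density)",
    data=df,
).fit()

print(model.params["viewership_diff_std"])  # ~0.05, i.e. ~5% more cases per SD
```

The actual paper uses far richer controls than this; the sketch only illustrates the lagged, county-level comparison in its simplest form.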

Based on their findings, the researchers concluded that misinformation during the early stages of a pandemic can have significant consequences for health outcomes.

Role of cognitive biases in misinformation

While political persuasions will no doubt influence how people view the study’s results, I would argue that the Becker Friedman Institute researchers were not looking to score any political points, particularly considering that Hannity and Carlson (and their audiences) tend to be ideologically aligned. The authors simply studied the effect that accurate and inaccurate information about COVID-19 had on viewers’ behavior and health outcomes. It was clear from the findings that misinformation had a deadly effect.

These kinds of results are due in part to the authority bias, which is the excessive trust and obedience that we tend to give those we perceive as authority figures, such as media figures that we follow.7

When Hannity dismissed concerns about COVID-19 and shrugged it off as just the flu, his viewers appear to have delayed adopting safety measures by about five days compared with viewers of other shows, a delay that translated into more COVID-19 infections and deaths.

By contrast, when Carlson sounded the alarm and treated the virus seriously, his viewers took the necessary precautions, changing their behavior three days earlier than viewers of other shows.

A related mental pattern we need to take note of is emotional contagion, where people are unwittingly infected with the emotions of those they perceive as leaders.8 Emotions can be shared and propel action even if the perceived leader does not have formal authority. This is particularly crucial for those with informal authority, such as popular news hosts Carlson and Hannity. 

Thus, when Carlson focused on the deadly nature and disastrous effects of COVID-19, he motivated his viewers to take the right actions. And when Hannity told his viewers that Democrats were politicizing and weaponizing COVID-19, his viewers, in turn, dismissed fears about the pandemic, even though fear of COVID-19 and the consequent behavioral changes were the correct response to the virus.

The spread of COVID-19 misinformation and how to address it

We now know not only that spreading COVID-19 misinformation can kill, but also that underlying mental blindspots make us vulnerable to such falsehoods. How does such fake news spread and how can we combat it?

One recent study analyzed 225 pieces of misinformation identified by quality fact-checkers, and then evaluated how this misinformation spread on social media. A notable finding: top politicians and other prominent public figures shared only 20% of the falsehoods, but their posts accounted for 69% of all social media engagement.2 This finding aligns with the previous insights about the importance of authority bias and emotional contagion in driving the spread of fake news.

The researchers also analyzed how social media companies responded to this potentially deadly misinformation. While all of the companies took some steps to address such fake news, whether by taking down posts or attaching warning labels, the effectiveness of their actions varied. On Twitter, for example, 59% of the COVID-19 misinformation remained up at the conclusion of the study without any warning labels; on YouTube the figure was 27%, and on Facebook 24%. Any misinformation that remains up after fact-checkers have identified it as false can have fatal consequences for the users of these platforms, but it is important to give YouTube and Facebook credit for their more active steps in this area.

How can such misinformation be addressed? Some behavioral nudges have proven effective. A study of 1,700 people examined how to prevent people from sharing COVID-19 misinformation. It found that people spread falsehoods in part because they don’t stop to think about whether the content is accurate before sharing it. The researchers compared how well participants judged the truthfulness of content when deciding whether to share it on social media versus when asked directly about its accuracy. The result? Participants were much worse at discerning accuracy when simply sharing than when directly asked.4

The implication is that many people don’t feel much of a commitment to ensuring that what they share on social media is true: if they did, they would take the time to think about what they share and would be much more likely to avoid passing along misinformation. Indeed, the researchers also found that reminding participants about the importance of accuracy greatly improved how carefully they evaluated the accuracy of what they intended to share.

Fortunately, mechanisms exist to get people more committed to truth-oriented behaviors. These include nudges such as the Pro-Truth Pledge, which asks signers to make a commitment to 12 truth-oriented behaviors, ranging from following best fact-checking practices to discouraging allies from sharing questionable information.9 Peer-reviewed research has demonstrated the effectiveness of the pledge in preventing people from sharing misinformation, and this commitment device is the foundation of the Pro Truth movement, which I describe in more depth in my book. 

Your health, and that of many others, depends on as many people as possible feeling a personal commitment to truth-oriented behaviors. Though COVID-19 continues to spread, we hope you’ll help stop misinformation from doing the same. It could save a life.

References

  1. Bursztyn, L., Rao, A., Roth, C., & Yanagizawa-Drott, D. (2020). Misinformation during a pandemic. University of Chicago, Becker Friedman Institute for Economics Working Paper, (2020-44).
  2. Brennen, J. S., Simon, F., Howard, P. N., & Nielsen, R. K. (2020). Types, sources, and claims of COVID-19 misinformation. Reuters Institute, 7, 3-1.
  3. Evanega, S., Lynas, M., Adams, J., Smolenyak, K., & Insights, C. G. (2020). Coronavirus misinformation: quantifying sources and themes in the COVID-19 ‘infodemic’.
  4. Pennycook, G., McPhetres, J., Zhang, Y., Lu, J. G., & Rand, D. G. (2020). Fighting COVID-19 misinformation on social media: Experimental evidence for a scalable accuracy-nudge intervention. Psychological Science, 31(7), 770-780.
  5. Garcia, V. (2020, February 25). Tucker Carlson sounds the alarm: ‘America is not ready’ for the coronavirus. Fox News. https://www.foxnews.com/media/tucker-carlson-america-not-ready-for-cornavirus
  6. Halon, Y. (2020, February 27). Sean Hannity accuses Democrats of ‘weaponizing’ coronavirus ‘to score cheap, repulsive political points’. Fox News. https://www.foxnews.com/media/sean-hannity-democrats-weaponizing-coronavirus-trump
  7. Hinnosaar, M., & Hinnosaar, T. (2012). Authority Bias.
  8. Hatfield, E., Cacioppo, J. T., & Rapson, R. L. (1993). Emotional contagion. Current Directions in Psychological Science, 2(3), 96-100.
  9. Straight, W. (2017, August 13). Threading the fact-checking needle. Pro-Truth Pledge. https://www.protruthpledge.org/threading-fact-checking-needle/

About the Author

Gleb Tsipursky

Disaster Avoidance Experts

Dr. Gleb Tsipursky is a behavioral economist, cognitive neuroscientist, and bestselling author of several books on decision-making and cognitive biases. His newest book is Pro Truth: A Pragmatic Plan to Put Truth Back Into Politics (Changemakers Books, 2020). Dr. Tsipursky is on a mission to protect people from dangerous judgment errors through his cutting-edge expertise in disaster avoidance, decision-making, social and emotional intelligence, and risk management. He founded Disaster Avoidance Experts, a behavioral economics consulting firm that empowers leaders and organizations to avoid business disasters. His thought leadership has been featured in over 500 articles he has published, as well as 450 interviews he has given to popular venues such as CBS News, Scientific American, Psychology Today, and Fast Company, among others. Dr. Tsipursky earned his PhD in the History of Behavioral Science at the University of North Carolina at Chapel Hill, his M.A. at Harvard University, and his B.A. at New York University.
