Remember that rumor on your newsfeed about masks making a COVID infection more likely? The one you weren't quite sure about. It sounded remotely plausible—but should you share it? Or do you go and check a reliable second source first?
Such choices are ubiquitous: Do we trust an intuitive judgment or do we seek more information?
In a new paper published in PNAS, my colleagues Max Rollwage, Ray Dolan, Steve Fleming and I explored how we deal with such information-seeking decisions. We were particularly interested in how these choices differ across people who are more or less dogmatic. Dogmatic people believe that their worldview reflects an absolute truth, which often stifles debates and drives us apart.
However, it’s unclear what sort of cognitive processes drive this outlook on life. We believe that understanding how dogmatic people search for information would be a good starting point.
People who think dogmatically often appear uninterested in novel information that could change their mind. One reason for this is what’s known as motivated search. In other words, more dogmatic people might be particularly enamored with their opinions: Why hear what the other candidate has to say when my own view is better anyway?
We all share this bias, and it may be inflated in more dogmatic people. However, there's a catch: Motivated search is tied to our specific group membership or opinion. If you're a Republican, your bias is likely red; if you're a Democrat, your bias is likely blue. This made us wonder: Is it the dogmatic individual's specific opinions that make them seek less information? Or is their lowered search driven by something that transcends particular views?
Odds you double-check: Dogmatism and information-seeking
To answer these questions, we asked over 700 US adults to play a simple computer game that was completely unrelated to their personal values: They saw two black boxes and had to decide which contained more flickering dots (imagine comparing two old TVs without a signal). They would be paid for a correct decision, so they had an incentive to choose carefully.
But the important part came before they gave us their final judgment. Our participants could choose between two options: They could either decide that the first display was enough to make their final choice, or they could pay a small fee to see another, clearer display of dots, which would help them make a more informed decision.
We borrowed this set-up from cognitive neuroscience. In that field, researchers have long used simple tasks to get at people's basic thought processes. Although such experiments might seem simple, they mirror everyday information-seeking scenarios; luckily for us, however, they carry none of the political baggage.
After the task, participants filled out several questionnaires. They told us about their political preferences and how strongly they believed in their worldview. The latter allowed us to measure dogmatism. We found that both extremes of the political spectrum tended to be more dogmatic than those in the political center, although dogmatism was slightly higher on the conservative end of the spectrum.
In our dot task, more dogmatic participants made as many mistakes and were as confident as their less dogmatic peers. That meant we could be sure that they didn't hold these simple judgments dearer than their peers did, as they might with their partisanship.
However, we found a striking difference when we looked at how often they purchased the additional information: More dogmatic participants were less willing to ask for helpful information. This reluctance to seek information also didn’t pay off: Their reduced search led more dogmatic thinkers to make less accurate judgments. In the end, they lost money.
The difference between more and less dogmatic people was particularly strong when the participants had little confidence in their initial decisions. In other words, dogmatists were more willing to forgo helpful information—especially when they weren't quite sure whether their initial judgment was correct.
Our findings are especially concerning today: What we read and hear is, more than ever, in our own hands. At the same time, unfiltered tweets and posts are often our first contact with news stories—not a carefully vetted report. So, even if a correction is published somewhere, we might never read it unless we care to look for it. Our study suggests that dogmatism predisposes some of us to skip that extra check more often.
The fact that we find this lowered search in a simple game also shows that it isn't just a feature of specific opinions but may stem from more fundamental cognitive characteristics.
Importantly, the differences between more and less dogmatic participants in our study are subtle. We also do not know how information seeking with personally relevant material, like news stories, might alter our results. Lastly, it remains unclear which comes first: dogmatism or reduced information-seeking.
Regardless, our research tells a cautionary tale, whether we consider ourselves to be dogmatic or not. When we are unsure about something, we shouldn’t just run with it. Rather, we are often better off checking a reliable source.