To Be Right or Liked? Evaluating Political Decision-Making
In the era of Twitter mobs and polarizing pundits, it seems like we care a lot about figuring out the truth and expressing it – resoundingly. Ideological battles are constantly being waged in the halls of Congress, on our news channels, and in our Facebook feeds. We can share our worldview like never before, yet we often feel worlds apart when assessing our shared reality.
But if we care so much about being right, why do we argue so much when facts about contentious topics are readily available? If the data are gathered, shouldn’t we all be reaching the same conclusions?
Recognizing biases in how we attend to information
Unfortunately, we don’t care about the truth as much as we typically think. A large body of evidence suggests that people attend to political information in remarkably biased ways. Our highly social brains make discerning truth more difficult than we might hope, because we often protect our prior beliefs rather than face inconvenient truths.
For instance, Kahan et al. (2013) tasked a nationally representative sample of participants with a difficult numeracy problem. The problem was designed so that a quick glance at the data would lead to the intuitive but wrong interpretation; reaching the correct answer required participants to think carefully about the numbers. Interestingly, people were less accurate in interpreting the same data when they believed the information came from a study on gun control than when they believed it concerned the efficacy of a new skin cream.
(From Kahan et al., 2013)
In the skin cream conditions, accuracy was best predicted by participants’ previously established quantitative abilities: those with better numeracy skills were more likely to interpret the results correctly. In the gun control conditions, however, accuracy was significantly predicted by whether the data affirmed participants’ prior beliefs. Conservatives were more accurate when the correct interpretation suggested that banning concealed guns increased crime, and liberals were more accurate when the correct interpretation suggested that banning concealed guns decreased crime.
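To see concretely why the intuitive reading of this kind of result misleads, here is a minimal sketch (the counts below are illustrative assumptions, not the figures Kahan et al. actually used): the group with more raw “improved” cases can still have the lower improvement rate, so getting the right answer means comparing proportions rather than counts.

```python
# Illustrative 2x2 outcome table in the style of Kahan et al.'s task.
# The counts are made up for demonstration; they are not the study's stimuli.
treatment = {"improved": 200, "got_worse": 100}   # e.g., used the skin cream
control   = {"improved": 80,  "got_worse": 20}    # e.g., did not use the cream

def improvement_rate(group):
    """Share of cases in a group that improved."""
    total = group["improved"] + group["got_worse"]
    return group["improved"] / total

# The intuitive (wrong) move: compare raw counts of improvement.
print("Raw improved counts:", treatment["improved"], "vs", control["improved"])

# The correct move: compare improvement rates within each group.
print(f"Treatment rate: {improvement_rate(treatment):.0%}")  # 67%
print(f"Control rate:   {improvement_rate(control):.0%}")    # 80%
# Despite more raw improvements, the treatment group actually fared worse.
```

With these illustrative numbers, the skin cream “wins” on raw counts but loses on rates, which is exactly the kind of trap that separates careful reasoners from intuitive ones.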
About the Author
Jared Celniker
Jared is a PhD student in social psychology and a National Science Foundation Graduate Research Fellow at the University of California, Irvine. He studies political and moral decision-making and believes that psychological insights can help improve political discourse and policymaking.