Learning Within Limits: How Curated Content Affects Education

Feb 01, 2018

We are increasingly living in realities that we construct ourselves, with adverts for things we've already bought and opinions we already hold reinforced through social media and advertising. As the content presented to us is refined to reflect our previous choices, more of our reality is tailored to our existing preferences before we've even experienced it. As a result, the behavioral science concepts of framing and confirmation bias (amongst other cognitive biases) can be seen in the daily decisions that shape our worlds.

In the midst of this drive towards self-centered content came the advent of trigger warnings: statements that alert people to the fact that a piece of writing, video, or talk contains potentially distressing material. It is important to note that these were introduced with noble intentions, for example to prevent a victim of sexual violence from accidentally viewing content that could trigger a flashback. Used in this way, trigger warnings can have a valid and valuable role in daily life. But they can also be used to cocoon those who are distressed by views contrary to their own, and so serve simply to reinforce existing biases. An example would be a religious fundamentalist who refuses to take a course discussing evolutionary theory.

Trigger warnings have now been introduced into education at institutions such as Cornell and Oxford, amongst others, with the result that even the learning process is censored to what individuals feel comfortable with, and learning institutions provide a 'safe space' rather than a space where open discussion and differing opinions are celebrated. In light of this, we should consider our obligation to build into the education system some mandatory learning about the effects of such hyper-curated content, particularly in the use of social media.

According to prospect theory (Kahneman and Tversky, 1979), from which framing is derived, decisions are made in two phases. The first involves editing the available options, which are coded as gains or losses. Applied to trigger warnings, the student decides, based on the warning, whether attending a class is likely to be a gain or a loss: whether they will feel something positive or negative as a result of going. The second phase involves evaluation (McElroy and Seta, 2003). If the class is evaluated as a loss, with a high likelihood of feeling stressed or upset, the student will be less likely to choose to attend.
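
To make this concrete, here is a minimal sketch of the prospect-theory value function that underlies this two-phase account. The parameter values (a curvature of 0.88 and a loss-aversion coefficient of 2.25) are illustrative estimates from later work by Tversky and Kahneman rather than figures from the 1979 paper, and the "class attendance" payoffs are invented purely for illustration.

```python
# Minimal sketch of the prospect-theory value function. Parameters (alpha =
# beta = 0.88, loss_aversion = 2.25) are illustrative estimates from later
# work by Tversky and Kahneman; the class "payoffs" are hypothetical and only
# show how loss framing depresses the subjective value of the same lecture.

def prospect_value(x, alpha=0.88, beta=0.88, loss_aversion=2.25):
    """Subjective value of an outcome x coded as a gain (x >= 0) or a loss (x < 0)."""
    if x >= 0:
        return x ** alpha                     # gains are valued concavely
    return -loss_aversion * ((-x) ** beta)    # losses loom larger than equivalent gains

# Hypothetical editing phase: the same lecture coded two different ways.
framed_as_gain = prospect_value(+10)   # "I might learn something valuable"
framed_as_loss = prospect_value(-10)   # "I might feel distressed"

print(round(framed_as_gain, 2))   # ~  7.59
print(round(framed_as_loss, 2))   # ~ -17.07  (the loss framing weighs more than twice as much)
```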

Confirmation bias rests on a heuristic called confirmatory hypothesis testing: a tendency to search for and overweight information that confirms an existing belief rather than attending to disconfirming evidence (Jones and Sugden, 2001). It is evident when students use trigger warnings to avoid exposing themselves to ideas that might be upsetting. Such students look for ideas and groups that confirm their current way of thinking, rather than anything that disturbs it or offers contradictory views.
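
As a toy illustration (not a model taken from Jones and Sugden), the sketch below shows how systematically down-weighting disconfirming evidence lets a belief drift toward certainty even when the evidence itself is perfectly balanced. The update rule and its numbers are invented for the example.

```python
# Toy illustration of confirmatory hypothesis testing: a biased reasoner
# updates on the same mixed evidence as a neutral one, but shrinks the weight
# of anything that contradicts the prior belief.

def update_belief(belief, evidence, disconfirm_weight=1.0):
    """Nudge a belief (0..1) along a stream of +1 (confirming) / -1 (disconfirming) evidence."""
    for e in evidence:
        step = 0.05 * e
        if e < 0:
            step *= disconfirm_weight          # the biased reasoner discounts this step
        belief = min(1.0, max(0.0, belief + step))
    return belief

mixed_evidence = [+1, -1] * 20                 # perfectly balanced evidence

print(update_belief(0.7, mixed_evidence, disconfirm_weight=1.0))  # stays near 0.7
print(update_belief(0.7, mixed_evidence, disconfirm_weight=0.3))  # drifts up and caps at 1.0
```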

Broadening one's mind and challenging one's ideas are key goals of learning in general. This is far more likely to happen when one is exposed to ideas that have the potential to upset, that force one to question one's way of thinking and thereby to evaluate one's opinions more deeply. When a limited way of thinking is introduced in an educational context, a person's pre-existing ideas are more likely to be perpetuated, giving them more power to curate their own reality. We all do this to a certain extent anyway, but it makes staying in our comfort zones easier than ever before.

While it is okay to have your own opinion, is it not healthier to have considered the merits and downfalls of both your own and others' opinions before reaching a decision? Furthermore, are our decisions not sounder if we seek out and consider disconfirming evidence when it appears? Building on the earlier work of Kahneman and Tversky, Stanovich and West (2000) propose a two-system approach to decision-making. They argue that we have two main types of cognitive process: system one is heuristic-based, automatic and subconscious (leading to an automatic contextualization of problems), while system two is conscious, measured and reflective. When system one leads to sub-optimal choices, system two is supposed to override it. When this fails to happen on a regular basis, the heuristics of system one become ingrained biases, built into the way we think about certain things.

In the case of trigger warnings, students are making snap decisions based on potentially little information about the context of the content, allowing the mental shortcuts used to reach the decision (in this case, not to attend a class) to become entrenched as biases. We end up avoiding anything associated with a topic or phrase not to our liking, and our decisions become ever more influenced by cognitive biases and by the way situations are framed. One way to interpret this is that people who use trigger warnings become more likely to be influenced by biases. An 'I didn't do it because they warned me' justification, one that never engages system two in the decision of whether or not to go ahead with a course, is not the effect that trigger warnings should have.

However, with more consideration, system two can override system one. If we pay more attention to the wider context of our decisions, and to the potential benefits (or damage) of shutting ourselves off from certain arguments, we may overcome this rush to prejudgment. Failing this, our heuristics and biases will reinforce the idea that it is okay not to consider other opinions, and that never questioning your beliefs, however fundamental they might be, is an acceptable way to exit the education system.

In a world in which boundaries of all kinds are becoming increasingly blurred, with integration happening through travel, business, and other channels, the reality is that most people who participate actively in the global economy will have their culture and beliefs questioned at some point, and they need to be prepared for that. Allowing yourself to sit with uncomfortable situations is a key learning experience, and those who avoid it at university should at least be made aware of what that avoidance can cost.

Moreover, increasingly intelligent machines are only likely to compound our one-sidedness. The algorithms that drive our social media feeds and product advertisements rely on their own heuristics to decide what is presented to us, heuristics directed by our past behavior: clicks, likes and searches. In this way, technology conspires with our inherent biases to present us with a world curated to our individual belief system. Our worldview gets confirmed and reconfirmed to us, because that is how these algorithms learn: by giving us back more of what we have already looked at.
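
A simplified sketch of that feedback loop is below. The scoring rule, which simply ranks items by how often the user has already engaged with their topic, is invented for illustration and is not any platform's actual recommendation algorithm.

```python
# Simplified sketch of an engagement-driven feedback loop. The scoring rule is
# invented for illustration; it is not any platform's actual algorithm.

from collections import Counter

def recommend(click_history, catalogue, k=3):
    """Rank catalogue items by how often the user has already engaged with their topic."""
    topic_counts = Counter(item["topic"] for item in click_history)
    return sorted(catalogue, key=lambda item: topic_counts[item["topic"]], reverse=True)[:k]

catalogue = [
    {"title": "Article A", "topic": "politics_left"},
    {"title": "Article B", "topic": "politics_right"},
    {"title": "Article C", "topic": "sport"},
]

history = [{"topic": "politics_left"}] * 5   # the user has only ever clicked one kind of story

# Each round, the top recommendation is clicked and fed back into the history,
# so the same topic keeps rising to the top: the filter bubble tightens itself.
for _ in range(3):
    top = recommend(history, catalogue)[0]
    history.append({"topic": top["topic"]})
    print(top["title"])                      # prints "Article A" every round
```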

References

Jones, Martin, and Robert Sugden. Positive confirmation bias in the acquisition of information. Theory and Decision, 50, no. 1 (2001), 59–99. doi:10.1023/a:1005296023424.

Kahneman, Daniel, and Amos Tversky. Prospect theory: An analysis of decision under risk. Econometrica, 47, no. 2 (1979), 263–291. doi:10.2307/1914185.

McElroy, Todd, and John J. Seta. Framing effects: An analytic–holistic perspective. Journal of Experimental Social Psychology, 39, no. 6 (2003), 610–617. doi:10.1016/s0022-1031(03)00036-2.

Stanovich, Keith E., and Richard F. West. Individual differences in reasoning: Implications for the rationality debate? Behavioral and Brain Sciences, 23, no. 5 (2000), 645–665. doi:10.1017/s0140525x00003435.

About the Author

Katharine Sephton

Stellenbosch University

Katharine graduated from Stellenbosch University in South Africa with an MPhil in Organisational Decision-Making and Knowledge Management. She has worked in aviation, retail and market research in South Africa and Qatar, and is currently working in the finance sector in Switzerland, observing the application of behavioural science in investing.
