Do I Agree? Cognitive Bias and Terms of Service

We have all gone through this at least once in our digital lives: as we sign up for a service, we fill in some personal information only to face an extensive list of every imaginable term of service. We then quickly scroll down, click “I agree,” and start using the service, unaware of the nuances of its data and usage conditions.

Have you ever wondered how long the terms and policies of leading online services really are? Artist Dima Yarovinsky did, and even transformed them into an impactful art project.1 The result: the technicolored scrolls pictured below, reaching down the wall and (in some cases) sprawling onto the floor.

Printed on A4 sheets in a standard font size, this visualization is both impressive and paradoxical. Everything is there—from personal data storage to advertising strategies and permissions to use any content you share, even in channels that do not yet exist. But precisely because everything is there, we end up rushing to the bottom of the screen and agreeing to terms we did not bother to read. It’s all out in the open—and that’s why it’s so difficult to process.

“I agree,” art created by Dima Yarovinsky as part of the project “Visualizing Knowledge” (2018). Source: https://www.designboom.com/readers/dima-yarovinsky-visualizes-facebook-instagram-snapchat-terms-of-service-05-07-2018/

This issue is not exclusive to the online universe: terms of use for popular computer software, insurance policies, mortgage contracts, and car owner’s manuals have all joined this (not-so-inspired) genre of technical literature. A recent study from Bristol Street Motors, for example, found that the Audi A3 handbook is 167,699 words long.2 At an adult’s average reading speed, this represents 11 hours and 45 minutes of reading—49 minutes longer than it would take to read The Lord of the Rings: The Two Towers by J. R. R. Tolkien. This is an alarming figure when compared to the 16 minutes per week the average American spends reading, according to the American Academy of Arts and Sciences.3
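The arithmetic behind these reading-time figures is easy to reproduce. The sketch below assumes an average adult silent-reading speed of roughly 238 words per minute—an assumption on my part, though it is the rate implied by the article’s own numbers (167,699 words ÷ 11 h 45 min ≈ 238 wpm).

```python
def reading_time(word_count: int, wpm: int = 238) -> tuple[int, int]:
    """Estimate reading time as (hours, minutes) at `wpm` words per minute."""
    total_minutes = round(word_count / wpm)
    return divmod(total_minutes, 60)

# The Audi A3 handbook cited above
hours, minutes = reading_time(167_699)
print(f"{hours} h {minutes} min")  # → 11 h 45 min
```

At 16 minutes of reading per week, working through the handbook alone would take the average American about 44 weeks.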

Cognitive bias and information avoidance

Professor Donald O. Case and colleagues have stated that “Many early studies of communication (…) have assumed that individuals seek, or at least pay some attention to, sources of information. This assumption is deeply embedded in Western culture, at least as far back as Aristotle’s statement that ‘all men, by nature, desire to know’ (circa 330 BC).” From this perspective, information-seeking could be interpreted as a natural aspect of human nature.4,5

However, the evidence doesn’t necessarily support the idea that humans always seek out information when it’s beneficial to them. A review by Golman, Hagmann, and Loewenstein found that people often avoid information, even when “it is useful, free, and independent of strategic considerations.”6 Abraham Maslow, best known for his hierarchy of needs, also wrote about this point: “We can seek knowledge in order to reduce anxiety and we can also avoid knowing in order to reduce anxiety.”7 This phenomenon is known simply as information avoidance.

In an experiment with participants from six different countries who maintained detailed diaries of information-related thoughts and activities, Bhuva Narayan and colleagues pointed out that even in information-rich economies, “sometimes people avoid information if paying attention to it will cause mental discomfort and cognitive dissonance, or increase uncertainty, irrespective of the utility of the information.”8 Thus, understanding why people seek some types of information and avoid others can help governments and companies design better information systems and improve overall customer experience.

TMI: Information overload

In 2012, a project called “Terms of Service; Didn’t Read” was created to fix what its founders called the “biggest lie on the web”: the fact that almost no one ever reads the terms of service they agree to. This user-rights initiative analyzes the terms of use and privacy policies of various websites, grading them from A (very good) to E (very bad).9

But why exactly do we fail to read these documents when they could help us understand the conditions and characteristics of the services and products we are about to use? After all, most companies do state them explicitly.

The psychologist Barry Schwartz raised an interesting point: in a 2005 TED Talk, he explained that information overload often brings paralysis instead of freedom of choice. Drawing on several experiments, he argued that although people generally welcome more options and details, they also want to simplify their lives. More choices thus have a cumulative effect on decision-making that produces anxiety and distress.10

A classic experiment by Sheena Iyengar and Mark Lepper tested the effects of choice overload. The researchers set up jam displays in grocery stores, offering either 6 or 24 different flavors, and observed how many people would stop, taste, and purchase the product. The extensive display did attract more shoppers (60% stopped). However, only 3% of them bought jam, compared to nearly 30% of those in the limited-choice condition. The results suggest that a large array of options may be more appealing at first, but it can also reduce the intrinsic motivation for subsequent action.11

Feeling overwhelmed by information is a primary reason that virtually no one reads the terms of service. One personal example: my smartphone recently (and automatically) updated its core download manager. When I opened it, a pop-up informed me that the privacy policy had changed, asking me to read it again. Fine. Well… until I realized the document ran 19 pages (8,246 words, or 27 minutes of reading) with no indication of which clauses had actually changed.

Even though this document provided essential information regarding data privacy, it was difficult to find the motivation to go through its contents. But when it comes to user agreements, the consequences of not reading can vary from personal data being quietly shared with third parties, to copyright issues related to stored files, to lawsuits.

Therefore, when companies provide huge amounts of information—especially information that is not categorized and fails to highlight the main ideas—they burden users with details they may not be willing to absorb, whether at that moment or at any moment at all.12,13

Fighting sludge: How to fix the terms of service

Richard Thaler and Cass Sunstein, the influential behavioral economists behind nudge theory, have emphasized how this kind of intervention can help people make better decisions.14 Nudges are cheap and simple interventions designed to support decision-making in contexts where biases, habits, and mental shortcuts may lead us to outcomes not in our self-declared best interests. They are based on behavioral principles and do not rely on significant financial incentives. Default enrollment of workers in 401(k) programs, reminders for medical appointments, and footprint stickers showing the path to the nearest trash bin are all kinds of nudges; a fine for throwing garbage on the street is not.

But alongside nudges, researchers have identified an opposite phenomenon: “(…) situations where these contextual variables actively impede activities that are in the consumers’ best interest, resulting in a reduction of welfare. These are known as sludge.”15 In the words of Sunstein, “Consumers, employees, students and others are often subjected to sludge: excessive or unjustified friction such as paperwork burdens that cost time or money that make life difficult to navigate, that may be frustrating, stigmatizing or humiliating and that might end up depriving people of access to important goods, opportunities and services.”16

Some sludge may be placed deliberately to cause confusion and ambiguity, or to lead consumers to choices that are not in their best interests. Other sludge may be created unintentionally, either because the development team is too close to its product or service to notice friction points, or because the business has not thoroughly analyzed all the interactions involved. Either way, the typical user agreement or terms of service can be seen as sludge: the design of these documents makes it excessively difficult for users to learn important information, undermining their autonomy and overall well-being.

Fortunately, behavioral scientists concerned about this problem have begun to develop solutions. In a recent report titled “Seeing Sludge,” behavioral economist Dilip Soman and colleagues argue that “organizations should keep in mind that they are designing for human beings who are cognitively lazy, forgetful, emotional and myopic.” Accordingly, they have designed a dashboard to help companies review processes, communications, and inclusivity (PCI) and maximize effectiveness from the end user’s perspective, simplifying their journey.15

The tool contains verification blocks for each of the three aspects (PCI) and helps companies identify and address potential friction points. For example:

  • Are the channels to accomplish the task easy to use, or do they require multiple interfaces and multiple interactions with service personnel?
  • How many unique activities or steps are required to complete a task?
  • How many distinct entities or touch points does the end user need to interact with to complete the task?
  • Do some parts of the process interfere with other parts of the process?

Tools like this dashboard should be used as an initial effort to review the three spheres. The authors also recommend that organizations create dedicated teams and customize their own dashboards, thereby improving relationships with customers. After all, it is in an organization’s best interest that interactions run without friction and that users achieve their goals or complete their desired tasks as simply as possible.

This strategy can also be applied to information disclosure, such as terms of service. Simply stating all terms in a lengthy, tedious document does not help the parties align on conditions and policies. “We know that the human brain is particularly efficient at processing information that is structured, linear, and that takes the form of concrete checklists, rather than identical information that is presented in a block of text.”15 The simple act of splitting information into distinct blocks, summarizing their context, and providing easy-to-access references can increase user engagement and information awareness, and prevent misunderstandings and frustration—a win-win situation.17,18

Conclusion

Soman and colleagues highlight that “seeing and cleaning up sludge involves an appreciation of the fact that seemingly little, and seemingly irrelevant things matter. It is only if we develop habits to think small and look for the little things that might create impedance for humans, that we will be successful in developing more human-compliant organizations.” Hence, we can support our organizations with tools such as Soman’s dashboard to build a new perspective on how products and services—and the information that accompanies them—are presented, ensuring that relevant information is accessible—and actually accessed.
