Dark Patterns

The Basic Idea

Imagine this: you're scrolling through an app when an ad interrupts your browsing. You automatically tap the "X" button, but it redirects you to the advertiser's website anyway. Annoyed, you close the page, then hunt for the tiny "close" button in the corner that finally dismisses the ad. We can explain this all-too-common experience using dark patterns.

Dark patterns, also called deceptive patterns, are misleading design techniques used in websites and apps to trick users into making unintended decisions, guiding them down paths they never meant to take. These manipulative strategies range from hidden costs to misleading navigation, all crafted to benefit the service provider at the user's expense.

Think of when a website presents you with two options: accepting all cookies or selecting specific ones from a lengthy list. Pressed for time, or unsure what each category means, you often opt to accept all cookies even if you hadn't planned to. Another common scenario is when an app or website asks whether you want the service to "use your activity to provide a better experience" – which actually means it wants permission to track all your data.1
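As a rough illustration of why "accept all" wins so often, here is a minimal TypeScript sketch of how such a consent banner could be wired; the function and category names are hypothetical and not taken from any real consent library. Accepting everything takes a single click, while customizing requires the user to work through each category.

```typescript
// Hypothetical consent model: category names are illustrative only.
type ConsentCategory = "necessary" | "analytics" | "advertising" | "personalization";

type ConsentState = Record<ConsentCategory, boolean>;

// The "Accept all" button: one click, every category switched on.
function acceptAll(): ConsentState {
  return { necessary: true, analytics: true, advertising: true, personalization: true };
}

// The "Manage preferences" path: the user must review and toggle each
// non-essential category individually, often across several screens.
function buildCustomConsent(choices: Partial<ConsentState>): ConsentState {
  return {
    necessary: true, // usually cannot be refused
    analytics: choices.analytics ?? false,
    advertising: choices.advertising ?? false,
    personalization: choices.personalization ?? false,
  };
}

// The friction is asymmetric: one tap versus several deliberate decisions.
const quickPath = acceptAll();
const slowPath = buildCustomConsent({ analytics: true });
console.log(quickPath, slowPath);
```

The asymmetry in effort, rather than any explicit lie, is what steers time-pressed users toward the all-permissive option.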

Understanding dark patterns is crucial not just for consumers looking to protect their information online but also for designers committed to fostering trust and transparency in digital spaces. After all, dark patterns are made to benefit the organization or company rather than the user or client.2

"Lead generators must be honest about who they are and why they are collecting consumer information."

United States Federal Trade Commission's Staff Report, September 2022.3

Key Terms

Below are some examples of different types of dark or deceptive patterns:

Confirmshaming: This type of design aims to make users feel guilty about declining an offer. The platform exploits users' emotions to nudge them toward the desired action, usually when trying to convince them to subscribe to a service or newsletter. Instead of a neutral decline option such as "I don't wish to subscribe," the link might read "I'm not interested in being informed," "I don't wish to make a change," or "I want to lose this amazing opportunity."3,4

Forced Registration: To gain access to certain services or content, users may be required to create an account or subscribe to a newsletter. For example, you might not be able to complete a purchase unless you create an account; by doing so, the provider gains access to your data.2

Hidden Costs: This one is especially common in apps, particularly games. The service might appear free when you first download it, but once you start using it, the provider quickly asks you to pay to continue. In games, this prompt often pops up once you're invested or have reached a certain level, making you more likely to give in.5

Preselection: This one convinces the user that the service is "doing them a favor" by selecting certain options in advance. For example, when you book a flight, the site might preselect an option to subscribe to its newsletter, donate to a charity, contribute funds to offset CO2 emissions, or redirect you to book a car or hotel. This is a clear example of status quo bias: it exploits the user's tendency to stick with the way things already are (the preselected options) rather than take actions that require change.
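To make the preselection mechanism concrete, here is a short TypeScript sketch of a hypothetical flight-booking form (none of these names come from any real airline's code). The deceptive version initializes every optional extra to true, so doing nothing means opting in; a more transparent design starts everything optional at false.

```typescript
// Hypothetical add-on options for a flight booking form; all names are illustrative.
interface BookingExtras {
  newsletter: boolean;
  charityDonation: boolean;
  carbonOffset: boolean;
  carRental: boolean;
}

// Dark pattern: extras are preselected, so a user who clicks "Continue"
// without reviewing the form opts into all of them by default.
const preselectedExtras: BookingExtras = {
  newsletter: true,
  charityDonation: true,
  carbonOffset: true,
  carRental: true,
};

// More transparent alternative: nothing optional is selected until the
// user actively chooses it.
const neutralExtras: BookingExtras = {
  newsletter: false,
  charityDonation: false,
  carbonOffset: false,
  carRental: false,
};

// Status quo bias in action: the outcome depends entirely on the shipped defaults.
function submitBooking(extras: BookingExtras): string[] {
  return Object.entries(extras)
    .filter(([, selected]) => selected)
    .map(([name]) => name);
}

console.log(submitBooking(preselectedExtras)); // every extra "chosen" by inaction
console.log(submitBooking(neutralExtras));     // nothing added unless opted in
```

The only difference between the two forms is the default state, which is exactly the lever that status quo bias supplies.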

History

The term "dark patterns" was coined in 2010 by Harry Brignull, a user experience (UX) expert, to identify and categorize manipulative design practices that were becoming increasingly prevalent across the internet. Their detrimental impact on users inspired Brignull to start a campaign to expose these unethical practices, educate the public, and encourage a more transparent digital environment. Since then, awareness and criticism of dark patterns have grown, leading to calls for regulation and ethical design standards.6

For example, the concepts and terms outlined on Brignull's website "Deceptive Patterns" have been influential in shaping new laws and regulations such as the EU Digital Services Act (DSA), the Digital Markets Act (DMA), and the California Privacy Rights Act (CPRA). In general, these legislative measures are designed to enhance digital transparency, protect online consumer rights, and ensure fair competition in the digital marketplace.6

Given the website's impact on user experience and privacy laws worldwide, the name was changed from "dark patterns" to "deceptive patterns" to avoid words that could unintentionally evoke negative connotations or harmful stereotypes.6

These new regulations have prompted companies such as Apple to require a transparent pop-up in apps asking whether you allow the app to track your data. In other words, they restrict apps from using deceptive designs to track data without consent.7

People

Harry Brignull

User experience (UX) director and founder of the Deceptive Patterns initiative, Brignull has been a pivotal figure in bringing attention to dark patterns. His work laid the foundation for ongoing discussions and efforts to combat unethical design practices, and he also provides expert testimony in cases involving deceptive practices.6

Mark Leiser

Member of the Deceptive Patterns initiative and a professor of Digital, Internet, and Platform Regulation in Transnational Legal Studies at Vrije Universiteit Amsterdam. Leiser is a well-respected expert on the legal repercussions of deceptive design, focusing on how it affects fundamental rights, e-commerce, platform regulation, security, privacy, freedom of speech, and cybercrime.8

Cristiana Santos

Also part of the Deceptive Patterns initiative. Santos is an Assistant Professor of Privacy and Data Protection Law at Utrecht University, focusing mainly on the enforcement of regulations surrounding dark patterns.9

Consequences

The widespread use of dark patterns undermines user trust and can lead to significant privacy, financial, and emotional consequences. It also contributes to a cluttered and confusing digital environment. Users may not always recognize dark patterns as deliberate manipulation, categorizing them as annoyances rather than breaches of privacy. After all, it feels so normal now to accept all cookies, create new accounts, and receive dozens of emails every day from accidental subscriptions.

However, there is growing public concern about when, and with whom, we have accidentally consented to share our data. Although it's difficult to find upsides to dark patterns, one silver lining is that they have pushed the industry toward more ethical design practices, encouraging transparency and respect for users in the digital world.

Unfortunately, the drive towards ethical design currently relies on local regulations, which introduces political complications and creates social disparities and inequities around the globe. With the rapid advancement of AI technology, there’s also a growing concern that regulations and laws may not be able to keep up as new deceptive strategies are devised.

Controversies

Much of the controversy centers on the fine line between persuasive and manipulative design. Are providers simply "being smart," or do they act with bad intentions? Some argue that certain tactics are simply part of a competitive business strategy, while others view them as inherently unethical.

These debates often delve into the ethics of user engagement and the responsibility of designers to create environments that prioritize user welfare over business gains. The controversy intensifies with the realization that what might be considered a clever marketing strategy by some could potentially exploit users' psychological vulnerabilities, leading to unintended harmful outcomes. 

This raises important questions about the balance between innovation and ethical standards in design, challenging the industry to redefine the boundaries of acceptable persuasion. Ultimately, the distinction between smart design and manipulative tactics hinges on the transparency of intentions and the respect for user autonomy.

Case Study: The Legal Battle Against Dark Patterns – A Publishers Clearing House Lawsuit

Publishers Clearing House (PCH), a well-known marketing company famous for its sweepstakes and prize-based games, came under legal scrutiny in 2023 for allegedly using dark patterns to deceive consumers. The company allegedly tricked customers into believing that buying certain products would boost their chances of winning, or even that purchasing was a necessary step. It was also accused of charging hidden fees and sending misleading emails about the use of customers' data.

The lawsuit resulted in an $18.5 million settlement, which was used to refund customers, most of whom were lower-income consumers.10 This case shows that companies are starting to face consequences for using deceptive design and, hopefully, brings us one step closer to an honest online world.

Related TDL Content

The Personalization Paradox: Balancing Convenience and Privacy 

This piece highlights how the manipulation of user data for personalized experiences can sometimes cross ethical boundaries, leveraging user psychology in ways that might compromise privacy and autonomy.

Confirmation Bias

This article delves into the human propensity to seek, interpret, and remember information that confirms pre-existing beliefs, while often ignoring contradictory evidence. It also provides strategies for mitigating confirmation bias and underscores its impact on decision-making and critical thinking.

References

  1. Schiffer, Z. (2021, April 8). How 'dark patterns' influence travel bookings. Vox. Retrieved from https://www.vox.com/recode/22351108/dark-patterns-ui-web-design-privacy
  2. Pham, H. (2021, January 5). Dark Pattern: The Dark Side of UX. UX Design. Retrieved from https://uxdesign.cc/dark-pattern-the-dark-side-of-ux-6a40dea32715
  3. Federal Trade Commission. (2022, September 14). Bringing Dark Patterns to Light: An FTC Report. Retrieved from https://www.ftc.gov/system/files/ftc_gov/pdf/P214800%20Dark%20Patterns%20Report%209.14.2022%20-%20FINAL.pdf
  4. Deceptive Design. (n.d.). Confirmshaming. Retrieved from https://www.deceptive.design/types/confirmshaming
  5. Deceptive Design. (n.d.). Hidden Costs. Retrieved from https://www.deceptive.design/types/hidden-costs
  6. Deceptive Design. (n.d.). About Us. Retrieved from https://www.deceptive.design/about-us
  7. Çağlarca, S. (n.d.). Four Common Dark UX Patterns: Would it be ethical if everyone did it? Medium. Retrieved from https://sercan-caglarca.medium.com/four-common-dark-ux-patterns-would-it-be-ethical-if-everyone-did-it-c4ac58690e94
  8. Vrije Universiteit Amsterdam. (n.d.). Mark Leiser - Researcher. Retrieved from https://research.vu.nl/en/persons/mark-leiser
  9. Universiteit Utrecht. (n.d.). Dr. C. Teixeira Santos - Staff Member. Retrieved from https://www.uu.nl/staff/CTeixeiraSantos
  10. New York Times. (2023, June 26). Publishers Clearing House accused of using 'dark patterns' in a lawsuit. Retrieved from https://www.nytimes.com/2023/06/26/business/publishers-clearing-house-dark-patterns-lawsuit.html

About the Author

Mariana Ontañón

Mariana holds a BSc in Pharmaceutical Biological Chemistry and an MSc in Women's Health. She's passionate about understanding human behavior in a holistic way. Mariana combines her knowledge of health sciences with a keen interest in how societal factors influence individual behaviors. Her writing bridges the gap between intricate scientific information and everyday understanding, aiming to foster informed decisions.
