Bounded Rationality

The Basic Idea

As hard as it is to believe, the average person makes about 35,000 decisions every day. Surely, not every one of these decisions takes intense thought and deliberation—if they did, we’d never get anything done! For most decisions, we are bound by the resources and information available to us at a given time.

Instead of scouring every possible lunch option by consulting all the restaurants and grocery stores in your city, you simply open your fridge and see what is available. You may outsource some decisions by going to a nearby cafe or looking up a restaurant online, but generally, you work within a framework of options set by your own mind and shaped by the cognitive and informational resources at your fingertips. In other words, you operate from a standpoint of bounded rationality, using a limited set of information and resources to make your daily decisions.

 

Broadly stated, the task is to replace the global rationality of economic man with a kind of rational behavior that is compatible with the access to information and the computational capacities that are actually possessed by organisms, including man, in the kinds of environments in which such organisms exist.

– Herbert Simon

Key Terms

Informational limit

The idea that not all information is available to us at any given time.

Cognitive limit

The idea that each person’s mind can only handle so much before burning out. Daniel Kahneman’s experiments on cognitive limits have shown that trying to resist temptation or make many decisions at once can consume cognitive resources, leaving us worse at basic cognitive tasks later on. When we’ve already made many decisions in a day, our minds may be burnt out, a state known as cognitive fatigue, leading to worse decisions.

Satisficing

Satisficing is essentially our way of saying “good enough.” We satisfice when we accept that better options may exist out there, but that given the time, information, and resources available to us, it is in our best interest to settle for this one.
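The contrast between satisficing and full optimization can be sketched in a few lines of code. This is only an illustration, not Simon’s own formalism; the options, scores, and aspiration level are all made up. The key difference is that the satisficer stops at the first option that clears its “good enough” threshold, while the optimizer must examine every option.

```python
# A minimal sketch of satisficing vs. optimizing. All names and
# numbers here are illustrative.

def optimize(options, score):
    """Examine every option and return the best one (costly)."""
    return max(options, key=score)

def satisfice(options, score, aspiration):
    """Return the first option whose score meets the aspiration level."""
    for option in options:
        if score(option) >= aspiration:
            return option
    return None  # nothing was good enough

lunches = ["leftovers", "instant noodles", "salad", "homemade curry"]
tastiness = {"leftovers": 6, "instant noodles": 4,
             "salad": 7, "homemade curry": 9}.get

print(optimize(lunches, tastiness))      # "homemade curry" (checked all four)
print(satisfice(lunches, tastiness, 5))  # "leftovers" (stopped after one check)
```

The optimizer always finds the best lunch, but only by scoring every candidate; the satisficer trades away a little quality for a much cheaper search, which is exactly the bargain bounded rationality describes.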

History

If you’ve read other TDL content, you might know that most of us believe we are rational beings, and that behavioral scientists are busy unpacking the complexity that comes with realizing this belief doesn’t always hold water.

Until Herbert Simon came along, economists treated people as “homo economicus” in their predictive models. They assumed that we would always make rational decisions in order to achieve financial gain. According to Simon, however, even homo economicus cannot make decisions that are always 100% rational because they too are limited in both cognitive and informational resources. In this sense, Simon realized that being economical does not always translate to acting rationally.

In his 1957 book, Models of Man, Simon pointed out that humans are only partially rational. He suggested that humans use heuristics to make quick decisions, rather than optimal ones, because decisions weigh on their mental resources. In their famous research experiments, Daniel Kahneman and Amos Tversky expanded on Simon’s ideas, discovering many of the heuristics and biases commonly known today. Simon, however, was the first to hypothesize that humans work with limited cognitive resources and information, leading to decisions that are not always optimal. These heuristics and biases, and much of Kahneman and Tversky’s research, fit under the idea of bounded rationality.

 

People


Herbert Simon

Herbert Simon was an interdisciplinary cognitive psychologist, economist, and political scientist. His research interests largely focused on decision-making within economic organizations, for which he won the Nobel Memorial Prize in Economics in 1978. Within the context of economic organizations, Simon studied how employees’ individual motivations often differed from company goals, leading to his theory of bounded rationality. Later, he became a pioneer of artificial intelligence and organizational theory. For most of his career, Simon taught at Carnegie Mellon University. He died in 2001.

Daniel Kahneman

Daniel Kahneman is widely considered the father of behavioral science, although much of his research was inspired by Simon’s idea of bounded rationality. Kahneman and his research partner Amos Tversky first became well known for their famous article “Judgment Under Uncertainty: Heuristics and Biases.” This article, which later became a book, describes the mental tricks used daily by the human mind in its decision-making process, many of which are described in detail on TDL’s website. Kahneman’s more recent book, Thinking, Fast and Slow, delves into the two systems of our mind and more recent applications of his heuristics and biases. Despite having never taken a single economics course, Kahneman earned the 2002 Nobel Memorial Prize in Economics for his application of psychological insights to economic theory, particularly within the realm of judgment and decision-making.

 

Amos Tversky

Amos Tversky worked closely with Kahneman for over a decade on findings that have provided the foundation for modern behavioral science research. Aside from their many famous biases and heuristics, the pair is also well known for developing prospect theory, a behavioral model that predicts how people make decisions involving potential loss, risk, or uncertainty, along with its central concept of loss aversion. Much of Tversky’s early research focused on cognitive science and the handling of risk, as well as the mathematical foundations of measurement. Together, the pair conducted creative and thoughtful behavioral science experiments in an effort to understand how the irrational human mind makes everyday choices.

Consequences

Chances are, you are already experiencing the consequences of bounded rationality in your everyday life.

For one thing, to understand bounded rationality is to understand that as a human, you are inherently an irrational being trying to fit into a rational box. You are constantly making choices—and likely putting pressure on yourself to make the ‘right’ one—while having to overcome a variety of biases and heuristics you are already prone to.

Of course, you are also susceptible to cognitive fatigue. After a 15-hour shift at work or the mental effort it takes to resist a third slice of pizza, you are more prone to using heuristics and more likely to make a poor decision than you might after a restful 8 hours of sleep. Although heuristics and biases are always present, they can be much more influential when your mental resources are depleted.

Also, each of your choices requires a certain amount of information before it can be made. While you know this, you also know that you have neither the time nor the money to consult all possible information on a topic before making a decision about it. You may ask one or two people who have been to different universities before making your own choice, but you’re not going to consult all existing alumni. Thus, bounded by a fixed level of cognitive resources and information at any given time, your choices can never be fully informed or fully optimized.

So far, do you feel more sure of your last choice, or less sure? On one hand, the concept of bounded rationality can be terrifying since it comes with the conclusion that you don’t have full control over your choices, and therefore can’t guarantee your desired outcomes.

On the other hand, the concept of bounded rationality is humbling, as it brings attention to the fact that there is not always a perfectly ‘right choice’ and that our decisions are the result of a variety of uncontrollable factors.

Overall, the concept of bounded rationality calls attention to the fact that we need a certain degree of irrationality in order to live a rational life. Although your choices may be partially irrational since they are based on limited resources, would it be more rational to spend days on end compiling all possible information you could find before making a simple choice? Absolutely not. In order to live a life that is rational, we need to give in to a certain level of ‘irrational’ decisions.

Controversies

While bounded rationality was certainly a provocative theory in its time, decision theorists have not necessarily adhered to Simon’s ideas.

While Simon, Kahneman and Tversky, and now many other behavioral scientists have argued that humans are irrational because of our mental limitations, the German psychologist Gerd Gigerenzer sees these findings differently. Gigerenzer looks at heuristics and biases and highlights their adaptivity: how helpful they are to us on a regular basis. In his view, a heuristic becomes rational when it adapts to its environment, at which point it becomes a useful decision-making tool.

Gigerenzer supports the research on biases and heuristics, but simply conceptualizes it from a new angle, by thinking of rationality itself as the ability to choose good heuristics for the task at hand. With his associate researchers, Gigerenzer has found certain situations in which the use of heuristics leads to better decisions with less effort; in other words, where less is more. While the traditional view, supported by Simon, Kahneman and Tversky, supposes that more information and cognitive resources will always help you make a better decision, Gigerenzer argues that relying on heuristics can often be in your best interest.
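One of the “fast and frugal” heuristics Gigerenzer’s group studies, take-the-best, can be sketched in a few lines: to choose between two options, check cues in order of validity and decide on the first cue that discriminates, ignoring everything else. The sketch below is only illustrative; the cities, cues, and their ordering are invented for the example.

```python
# A sketch of the take-the-best heuristic: decide using the single
# most valid cue that distinguishes the two options. All data here
# is illustrative.

# Cues for "which city is larger?", ordered from most to least valid.
CUES = ["has_major_airport", "is_state_capital", "has_university"]

city_facts = {
    "Springfield": {"has_major_airport": False, "is_state_capital": True,
                    "has_university": True},
    "Shelbyville": {"has_major_airport": False, "is_state_capital": False,
                    "has_university": True},
}

def take_the_best(a, b, facts, cues):
    """Pick the option favored by the first discriminating cue."""
    for cue in cues:
        if facts[a][cue] != facts[b][cue]:
            return a if facts[a][cue] else b
    return None  # no cue discriminates; the heuristic must guess

print(take_the_best("Springfield", "Shelbyville", city_facts, CUES))
# -> "Springfield": decided on the second cue, ignoring the rest
```

Note how little information the heuristic consumes: it stops at the first cue that settles the question, which is exactly the “less is more” point Gigerenzer makes.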

The argument over whether heuristics work in our favour rages on. British economist Huw Dixon suggests that analyzing the process of bounded rationality may not be necessary at all. Dixon created an equation to describe decision-making in mathematical terms. His assumption is that even if people don’t have all the information, how close they come to their optimal decision is the metric that matters. His mathematical theory has been used in computer processing and AI algorithms, leading some to believe that the rationality of a decision is determined by a computer’s ability to mimic it (its computational intelligence). In other words, if a computer would make the same choice, it is considered a good decision. Advances in technology thus help delineate what is rational from what isn’t, creating opportunities for machines to make more automated and efficient decisions for us.
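The outcome-focused idea attributed to Dixon can be sketched as a simple test, sometimes called epsilon-optimization: instead of modeling how a decision was made, ask whether its payoff lands within some tolerance epsilon of the best achievable payoff. The function and numbers below are illustrative, not Dixon’s actual equation.

```python
# A hedged sketch of outcome-based rationality: judge a decision by how
# close its payoff comes to the optimum, not by the process behind it.
# Numbers are illustrative.

def is_epsilon_optimal(payoff, best_payoff, epsilon):
    """True if the realized payoff is within epsilon of the optimum."""
    return best_payoff - payoff <= epsilon

# A heuristic chooser earns 9.2 where a perfect optimizer would earn 10.
print(is_epsilon_optimal(9.2, 10.0, epsilon=1.0))  # True: close enough to count as rational
print(is_epsilon_optimal(9.2, 10.0, epsilon=0.5))  # False: too far from the optimum
```

On this view, a quick heuristic choice and an exhaustive deliberation that land within the same tolerance are equally “rational,” which is why the decision process itself drops out of the analysis.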

Case Studies

Choice architecture

In their book Nudge: Improving Decisions about Health, Wealth, and Happiness, Cass Sunstein and Richard Thaler use the term ‘choice architecture’ to describe how corporate decision-makers nudge the public toward certain decisions, sometimes without even knowing it. Sunstein and Thaler urge choice architects like grocery store workers to nudge customers toward positive decisions like healthy food options by placing them at eye level or through targeted advertising. Because our decisions are bound by the information and cognitive resources available at any given time, we are more likely to make choices that are ‘nudged’ toward us, which offer an easy way out of a long decision-making process. In this way, choice architects of all kinds—from teachers to doctors to grocery store workers—can capitalize on our bounded rationality by nudging us toward positive choices. On the other hand, however, choice architects like marketing professionals may turn us toward negative choices by using misleading advertising techniques, like ‘sugar free’ or ‘skinny girl,’ to market unhealthy options like junk food or alcohol. In these cases, it is important that we make use of additional information, like nutrition facts or ingredient lists, before giving in to quick marketing schemes, so that we avoid making a poor decision.

Classroom teaching

Bounded rationality can also limit and thus reinforce our conceptions of others, with harmful and dangerous consequences. In a 1990 study on classroom teaching, researchers Lee and Porter found that teachers’ preconceived notions of their classes influenced how they treated individual students, thus affecting the students’ learning outcomes. While classes are certainly complex and multifaceted as a result of being composed of 20-30 students, a handful of students can paint a picture in a teacher’s mind that extends to students who do not fit the mold. When teachers conceptualized their classes as ‘enriched’ based on the majority of students, they treated students more positively, and the students, treated positively, confirmed those conceptions in return. When teachers thought of their classes as ‘problem classes,’ individuals were treated poorly, in turn creating circumstances for them to become more of a ‘problem.’ Bounded rationality makes us satisfice with the limited information and cognitive resources we have at a given time, allowing some teachers to lean back on a mental model of a classroom instead of paying attention to individual needs and personalities. In turn, this can have negative and inequitable effects on individual learners over many years, particularly racialized students.

Related TDL resources

Cognitive dissonance, explained.

Sometimes, trying to make decisions quickly and with limited information causes a disconnect between our values and our actions. In this article, you will read about cognitive dissonance and why a lack of consistency in our beliefs continues to bother us.

Nudges: Social Engineering or Sensible Policy?

This article goes deeper into the concept of choice architecture and examines some of the concerns, motives, and politics behind the hidden engineers of our decisions.

Sources

  1. Administrative behavior. (2020, November 7). Wikipedia, the free encyclopedia. Retrieved February 3, 2021, from https://en.wikipedia.org/wiki/Administrative_Behavior
  2. Barros, G. (2010, September). Herbert A. Simon and the concept of rationality: Boundaries and procedures. SciELO – Scientific Electronic Library Online. https://www.scielo.br/scielo.php?script=sci_arttext&pid=S0101-31572010000300006
  3. Bounded rationality. (2020, August 25). The Decision Lab. https://thedecisionlab.com/biases/bounded-rationality/
  4. Bounded rationality. (2002, August 11). Wikipedia, the free encyclopedia. Retrieved February 3, 2021, from https://en.wikipedia.org/wiki/Bounded_rationality
  5. Choice architecture. (2008, December 1). Wikipedia, the free encyclopedia. Retrieved February 3, 2021, from https://en.wikipedia.org/wiki/Choice_architecture
  6. Gerd Gigerenzer. (2004, August 15). Wikipedia, the free encyclopedia. Retrieved February 3, 2021, from https://en.wikipedia.org/wiki/Gerd_Gigerenzer
  7. Herbert A. Simon. (2002, February 25). Wikipedia, the free encyclopedia. Retrieved February 3, 2021, from https://en.wikipedia.org/wiki/Herbert_A._Simon#CITEREFSimon1976
  8. Hodgson, A. (2020, June 14). Bounded Rationality [Video]. YouTube. https://www.youtube.com/watch?v=rfRpLUrOVqE&ab_channel=AshleyHodgson
  9. Lee, O., & Porter, A. C. (1990). Bounded rationality in classroom teaching. Educational Psychologist, 25(2), 159–171. https://doi.org/10.1207/s15326985ep2502_4
  10. Milkman, K. (2018, December 7). What is choice architecture? [Video]. YouTube. https://www.youtube.com/watch?v=vd2WbBRCT-E&ab_channel=TheLavinAgencySpeakersBureau
