A Brain-Changer: How Stress Redesigns our Decision-Making

Right now, stress is thriving, especially in the United Kingdom. This year the Mental Health Foundation carried out the largest known study of UK stress levels to date, with more than 4,500 participants, and the results are staggering. During the past year, 74% of respondents reported feeling so stressed that they felt “overwhelmed or unable to cope” [1].

This is no small claim, and the statistic points to an unacceptable situation that more and more people are having to endure. The result highlights the increasingly urgent need to understand what is causing such overwhelming stress levels, and to recognise the potential consequences if we do not act to remedy the situation.

What is stress?

To some extent, the reaction which we might refer to as ‘stress’ on a daily basis can be unavoidable, and simply a part of normal life. However, stress is officially defined by the Health and Safety Executive (HSE) as a reaction to “excessive” pressures or demands [2], as distinguished from ordinary day-to-day pressure. This means that stress is not, and should not be treated as, the norm. On the contrary, persistent stress should be treated as a warning sign that the demands placed on the individual need to be reduced.

Certainly, acute stress has its place in evolutionary terms. In primal situations of danger, like being chased by an animal predator, stress was a useful mediator of response choices. Its function was to take the reins to promote our survival. On sight of a predator, our stress response would activate the sympathetic nervous system and catapult us straight into a fight-or-flight state. Distractions like hunger were helpfully suppressed, to prevent our energy and attention from being diverted away from the life-or-death choice between standing our ground or fleeing.

While we may no longer have to run from predators on a daily basis, our bodies continue to produce the same physical response to modern-day causes of acute stress. This is why we might lose our appetite and feel like running away before a big presentation at work.

Why does it matter?

To some extent, the evolutionary advantages of stress can translate into the modern-day workplace, channelling our focus into higher priorities, forgoing lesser ones, and therefore dealing with urgent situations more quickly. Ultimately, however, the extreme response which was necessary in the face of a hungry lion is disproportionate in non-life-threatening situations in the boardroom – and can in fact become actively harmful. When excessive stress is experienced as a part of daily life, it can become chronic, and can cause serious health problems. Stress has been specifically linked to depression [3] as well as more general changes in mood, behaviour and physical health.

Recently, a lesser-known impact of stress has become a growing focus of research: its effect on decision-making. In 2012, an experiment showed that chronic stress biases human decision-making towards habits rather than goals [4]. The study tested two groups of participants – one that had been exposed to severe stress (students preparing for a medical selection exam), and one that had not. Both groups were given the identical task of repeatedly making choices between several options on a screen. Each option resulted in the participant receiving either a food-based reward, some water (a neutral outcome), or nothing. Participants were instructed to learn, and to choose, the options which most often resulted in the food-based rewards.

During a break, participants were allowed to eat until they confirmed that they no longer wanted or ‘valued’ the food-based reward, and the experiment was then continued. Participants were informed that from this point onwards all options would now have an equal probability of delivering a food-based reward. The non-stressed group adapted their choices based on their current needs and wants (i.e. their decreased desire for food), and also based on the altered outcomes of the options. In other words, they continued to make goal-directed decisions.

However, the stressed participants continued to rely on the habitual choices they had learned earlier in the experiment, and failed to adjust these in line with their reduced hunger or the altered probabilities of reward. In other words, stressed participants could not adapt based on the outcomes of their behaviour.
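The logic of this devaluation paradigm can be sketched as a toy model. To be clear, this is a purely illustrative sketch with made-up numbers and option names, not the study’s actual task or analysis: a goal-directed agent re-evaluates options using the current value of food, while a habitual agent simply repeats whatever was reinforced earlier.

```python
# Toy model of a devaluation task (illustrative numbers only, not the study's data).

# Phase 1: option "A" delivers food 80% of the time, option "B" 20%.
p_food_phase1 = {"A": 0.8, "B": 0.2}

# Cached preferences learned in phase 1; a habit, by definition,
# is not updated when circumstances change.
habit_strength = dict(p_food_phase1)

def goal_directed_choice(p_food, food_value, threshold=0.1):
    """Choose the option with the highest expected current value,
    or decline (None) if nothing on offer is worth pursuing."""
    expected = {opt: p * food_value for opt, p in p_food.items()}
    best = max(expected, key=expected.get)
    return best if expected[best] >= threshold else None

def habitual_choice(habit_strength):
    """Repeat whichever option was reinforced most in the past."""
    return max(habit_strength, key=habit_strength.get)

# Phase 2: the participant is sated (food value drops to 0) and both
# options now deliver food with equal probability.
p_food_phase2 = {"A": 0.5, "B": 0.5}

print(goal_directed_choice(p_food_phase1, food_value=1.0))  # "A" while hungry
print(goal_directed_choice(p_food_phase2, food_value=0.0))  # None once sated
print(habitual_choice(habit_strength))                      # still "A"
```

The sketch mirrors the behavioural pattern reported above: once the reward is devalued, the goal-directed strategy stops pursuing it, while the habitual strategy keeps choosing the old favourite regardless.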

The study also made use of fMRI scans to examine what was happening in the brains of stressed participants. Structural changes were observed, and the results also showed that the neural networks which govern decision-making are activated differently under chronic stress. Activation shifted from the associative (goal-orientated) circuit to the sensorimotor (habitual) circuit. This provided a neurological map for the observed differences in decision-making behaviours between the two groups.

In some situations, evidently, habitual decision-making is helpful – when you leave the house for work, it is useful to be able to trust habit to remind you to pick up your keys, phone and bag on your way out. However, it is also necessary to be able to override habitual decisions in order to adapt to changing circumstances. For example, a change in the weather might mean that we need to add a different item into the mix, like an umbrella or a raincoat.

As the former experiment demonstrated, this ability to adapt is inhibited under chronic stress; the brain resorts to habitual decision-making because it places fewer demands on our cognitive resources. Forgetting the umbrella on a rainy day may not be a big problem, but reliance on habitual decision-making is not optimal when making complex decisions with significant life outcomes, be it deciding how to navigate our career goals or choosing healthy eating over habitual snacking.

Perhaps more worryingly, a recent study conducted in 2017 [5] suggested that it is not only the methodology of our decision-making which is altered under chronic stress, but also our ability to make a reliable cost-benefit evaluation. This study found that in rodents, chronic stress exposure resulted in an “abnormal evaluation of costs and benefits”, which translated into “non-optimal decision-making”, with an increased probability of choosing high-risk and high-reward options. In this way, chronic stress can mar the quality of our decisions and cause us to focus too much on potential rewards and too little on potential risks.
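One way to picture this shift is as a re-weighting of rewards against costs in an expected-value comparison. The following is a toy calculation with assumed numbers, not data or a model from the study:

```python
# Toy cost-benefit comparison (all numbers are illustrative assumptions).
options = {
    "safe":  {"reward": 10, "p_win": 0.80, "cost": 1},  # modest, likely payoff
    "risky": {"reward": 40, "p_win": 0.25, "cost": 6},  # large, unlikely payoff
}

def subjective_value(opt, reward_weight=1.0, cost_weight=1.0):
    """Expected reward minus cost, with weights standing in for how
    heavily rewards and costs are felt by the decision-maker."""
    return reward_weight * opt["p_win"] * opt["reward"] - cost_weight * opt["cost"]

# Balanced weighting: the safe option wins (7.0 vs 4.0).
# Overweight rewards and underweight costs, as chronic stress is suggested
# to do: the risky option now wins (12.0 vs 11.5).
for name, opt in options.items():
    print(name, subjective_value(opt), subjective_value(opt, 1.5, 0.5))
```

The point of the sketch is that nothing about the options themselves has changed; merely tilting how rewards and costs are weighed is enough to flip the preference towards the high-risk, high-reward choice.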

This becomes problematic when weighing up life-changing decisions, such as changing career or having a baby. Again, this study revealed parallel alterations in the activation of the neural circuits used for decision-making. It suggests that this functional alteration in the brain could be the underlying physical link between stress and aberrant decision-making. For individuals consistently exposed to excessive stress – in highly-pressured jobs, for example – these detrimental effects on decision-making are likely to impact negatively on life outcomes.

What is making us so stressed?

Work consistently emerges in research as one of the most prominent sources of stress in the UK. Among the most commonly cited factors in work-related stress are heavy workloads and long hours, as well as a lack of control or autonomy at work [6]. Indeed, in the OECD Better Life Index, the UK ranks only 28th out of 38 countries for work-life balance. It is clear that we have lessons to learn from those topping the ranks, with the Netherlands and Denmark emerging as the two best-rated countries.

The UK’s soaring rates of workplace stress, however, are old news. The Labour Force Survey statistics suggest that the sky-high rates of work-related stress in this country have remained relatively constant since 2003 [7]. Work-related stress should, in theory, be relatively easy to address through interventions, and health and safety legislation already holds employers legally accountable for protecting employees from health problems arising from workplace stress. Interestingly, an HSE report for 2016/17 even suggests that the UK is “better placed than many in Europe for having an action plan to prevent work-related stress, at 60%” [8]. So why does the problem remain?

It is apparent from these enduring statistics that the existing legislation and action plans are not enforced rigorously enough. In a study conducted in 2014, for example, 80% of respondents said that they would not feel able to discuss stress with their employer [9], mostly because they feared being stigmatised and were concerned that the disclosure might reduce their chances of promotion. The latter reason suggests an unspoken expectation that employees will simply accept and shoulder excessive burdens. As the statistics show, this attitude is producing an unhealthy and unhappy workforce.

So what can we do?

Evidently, a cultural shift is needed. Accountability is key: employers need to take active responsibility for reducing workplace stress levels, whether by introducing flexible hours, giving employees more control over their schedules, or simply encouraging open discussion of stress-related issues.

Real change in workplace culture is much needed in the UK, but openness to it has yet to be embraced. Other countries are already exploring a deep-seated shift in work culture. In Sweden, for example, a six-hour working day was piloted in 2015–17. While the six-hour day has not been adopted as a country-wide standard, the pilot itself showed a commendable effort to prioritise work-life balance. UK policy-makers need to mirror this openness to cultural change, and to recognise the damaging impact of chronic stress on the decisions we make every day.

Why is Changing Behavior So Hard?

My grandmother used to say “change is hard”. Two recent examples reminded me of this simple wisdom.

1) I recently received a six-month Netflix promotion with our cable provider here in Israel.  As I became more and more hooked on Netflix, I began to worry about the inevitable moment my Netflix would be cut off.  And then one day, while checking my credit card, I realized I had been charged.  “Oh!” I realized, “It must have been one of those ‘if-you-don’t-cancel-we’ll-just-assume-you’ll-continue’ things.”  The company intelligently made continuation the default, so that I would have to opt out.  And guess what?  I still haven’t bothered to exert the effort to cancel.  Doing something to change a status quo is hard.

2) I recently discovered a very good new cell phone deal with a new company.  The third, fourth, and fifth lines were free with unlimited data for the next year.  No commitment.  No change of phone number.  I was sold, and I tried to talk my in-laws into joining me.  There were simply no downsides.  They, however, had been with their old company for years.  They said “we’ll think about it” (i.e., “no”).  Here, they were continuing with the status quo, despite the clear irrationality of it.

Both of these examples, although different, remind us that we humans are not always making logical, rational decisions.  Psychologists call this tendency to stick with what we have the Default Bias or Status Quo Bias, and sure enough, research has shown my in-laws are not the only ones who will stick with a given mobile company out of habit (Khedhaouria, Thurik, Gurau, & van Heck, 2016).

A powerful illustration of this bias to avoid change was published in Science (Johnson & Goldstein, 2003).  The authors compared organ donation data by country, and found that, in countries where organ donation is the default (that is, people have to opt out of donating rather than opt in), a greater proportion of the population are organ donors.  Consider, for example, Austria and Germany, two countries that are geographically and culturally similar.  At the time of publication, Germany had an organ donation rate of 12%, compared to Austria’s rate of 99.98%.  The difference?  In Germany, you had to opt in, whereas in Austria, the default was organ donation.  In other words, similar to my Netflix promotion, you had to “contact them to cancel”.

The power of the default has been used to nudge humanity towards more environmentally conscious choices.  For example, in 2008, Rutgers University changed its printer settings to make “print on both sides” the default, and guess what?  A 44% reduction in paper use! (https://oit-nb.rutgers.edu/service/printing)

In one lab study, subjects were told to pretend they were undergoing a remodeling of their home and the contractor had outfitted their house with lighting.  All subjects were given information about the costs and benefits of this choice, and then asked if they wanted to switch, at no cost, to the alternative.  When the default outfitting was energy-efficient lightbulbs, the energy-efficient bulbs were chosen twice as often (Dinner, Johnson, Goldstein, & Liu, 2011).

Keeping with the theme of saving the environment, conference participants were given the option to add an additional cost to their plane travel with a donation to offset the carbon cost (e.g., planting trees or investing in alternative energies).  When the policy was presented as an opt-out alternative (the extra fee was already included and participants had to tick a box to opt out), more people accepted the extra cost than when they had to opt in by ticking a box.  After all, it’s somehow psychologically more palatable to accept a fee that’s already on your bill than to charge yourself extra, even though mathematically it is the same cost (Araña & León, 2013).

Defaults have also been shown to influence nutritional choices.  A recent study (van Kleef et al., 2018) found that when whole wheat bread (as opposed to white bread) was set as the default at a sandwich stand, 94% of customers stuck with this default, compared to 80% sticking with the default when white bread was the default.  Likewise, people will usually stick with whatever milk (e.g., 2%, whole, skim) is the default at their coffee shop (Colby, Li, & Chapman, 2014).

Let’s look at an example to figure out why this might be.

Imagine you are checking out of a pet store, and the cashier says “We automatically include a $1 donation to help homeless animals.  You can opt out if you’d like”. You find yourself saying “fine”.  The morality of helping animals aside, would you have been as likely to add $1 to your bill if she had framed it as “do you want to add a $1 donation to help homeless animals?”

Why does the former frame work better?

First, we are a bit lazy.  “Should I give the $1? Should I not? I don’t want to think about this!”  If I don’t want to think, I just go along with what someone else decided for me.  Less effort.  And if I have to make a call to cancel?  Or fill out a form, or even worse, stand in line and then fill out a form to opt out?  Definitely easier to just stick with the default.

Second, a default seems like the recommended choice.  If I order from a McDonald’s screen and the default is the small french fries, I assume they are suggesting that is the appropriate portion for one person.  Here that $1 donation is the default—we assume it’s the norm.

Third, defaults often represent the status quo, and change usually involves a trade-off.  Consider a new job.  There will be the cost of moving, the emotional toll of leaving coworkers and the discomfort of a new place.  We worry about what losses we may incur if we change the status quo.  This loss aversion prevents us from wanting to take the “risk” of change.  No, there isn’t much trade-off to giving $1 in this case to homeless animals, but the point is our brain might be wired to be conservative.

There are real consequences to humanity not learning how to overcome our tendency to stick with the status quo.  This is exemplified in the following case (Montpetit & Lachapelle, 2017):

Soil contamination is a serious global environmental problem.  There are estimated to be over a million sites in the U.S. and Europe alone where soil is contaminated, allowing chemicals to get into the food supply and threaten the health of populations.  A low-cost alternative to conventional (and more expensive) removal methods has emerged in the last 20 years.  Phytoremediation is the use of living green plants for the in situ removal, degradation, and containment of contaminants in soil and water.  It is a low-cost way to improve the economies of developing countries, and the plants can even be beautiful flowers that can be sold.  Why would anyone be against this?  Montpetit and Lachapelle conclude that the Status Quo Bias is the barrier to accepting this new innovation.

So, what’s the solution?  All change starts with recognizing our own limitations first.  If we know that we (and others) have a tendency to stick with the status quo, we know it’s going to take a little more effort from ourselves and others to effect change.  Indeed, Montpetit and Lachapelle highlight the importance of disseminating knowledge to those on the ground: explicit exposure to the scientific evidence increased the acceptability of the innovation.