Economist Richard Thaler and psychology professor Daniel Kahneman have something notable in common. Besides both being authors of best-selling books, they are both recipients of the Nobel Prize in Economic Sciences, reflecting their contributions to the advancement of the field. It makes sense that an esteemed economist could be recognised in this way, but how on earth does a psychology professor win a Nobel Prize in economics, and what could either of these people have to do with security?

Their contribution to the field of economics comes down, most simply, to consumer decision making. Traditionally, economists worked to the theoretical assumption that consumers make decisions by carefully weighing the pros and cons, the costs and benefits of each option, and choosing the one that maximises their chances of success. Decision making was thought to be a rational, considered process.

And then along came Daniel Kahneman who, together with his late research partner Amos Tversky, flipped this whole idea on its head. Through a series of research studies, they showed that consumers are mere humans who do not always behave in ways that reflect a rational, considered decision-making process. Rather, they often make decisions that are plainly irrational with regard to their longer term economic, financial or security interests, and they are susceptible to making errors based on cognitive or emotional responses. The good news is that we are not just stupid when it comes to decision making; we are predictably stupid.

Kahneman and Tversky’s work laid the foundations of behavioural economics, setting the stage for a huge paradigm shift and changing the way economists thought about their discipline. Thaler took the concept a step further and coined the term choice architecture to describe the process by which designers can influence decision makers by manipulating the way that choices are presented to them.

In his best-selling book Nudge, Thaler, together with co-author Cass Sunstein, explained how simple changes, called ‘nudges’, can make huge differences when it comes to purchasing behaviour. They cited an example from a school canteen seeking to address the unhealthy food choices typically made by students.

Rather than trying to convince their customers to make better meal choices for the sake of their longer term health, a hard sell to hungry teenagers in any case, canteen staff took a different approach. They made subtle changes to the way the food options were presented, like varying the order of the menu, or rearranging the display so that healthier options were positioned at eye level, to nudge customers toward making healthier choices.

In another example, Thaler and Sunstein applied Kahneman and Tversky’s concept of loss aversion (described later) to propose a better solution for weight loss than positive reinforcement. Rather than rewarding exercisers for adherence to their routine and kilograms lost per week, they devised an alternative intervention based on the understanding that the motivation to avoid a loss is often twice as strong as the motivation to gain a reward. In the new method, exercisers commit to hand over a sum of money (for example, $100) if they fail to adhere to their exercise regime and lose the specified amount of weight. Since the motivation to avoid losing the money is so strong, people are more likely to comply with their exercise programs under these conditions.

From the Nudge Unit that was initially set up as part of the Cabinet Office in the UK government, to creative marketing agencies applying the principles of behavioural economics to sell more products, nudge tactics are being used across the globe to affect the way that people make decisions.

If you think the term choice architecture is fancy and you are looking for extra titles to add to your CV, you too can quite easily become a choice architect. According to Thaler, “If anything you do influences the way people choose, then you are a choice architect.”

Why Does Our Brain Do Dumb Things?

The simple answer to why our brain makes mistakes is that it is trying to help us. It is trying to make our lives easier by taking the quickest, most efficient path to a solution. Unfortunately, this comes at the cost of our more composed and calculated cognitive processes, so we make errors. It is the equivalent of saying something offensive and unintended in the heat of the moment, and then, once we are calm, saying, “I can’t believe I said that”. We make mistakes because we are operating on a different system, and when our slower, rational system kicks in, we cannot believe how dumb we have been.

In his 2011 book Thinking, Fast and Slow, Kahneman described the two systems of decision making. System 1 is the fast, impulsive system and System 2 is the slower, more deliberative system. From a neuroscientific perspective, these systems reflect different operations of the brain. When information first enters the brain, it does so via the limbic system. The thalamus receives sensory information from the nervous system, and its job is to classify it as either safe or threatening, with the help of another brain structure called the hippocampus, whose task is to apply memory and context to identify the incoming information.

When the sensory information is perceived as threatening, System 1 is activated. We go into survival mode, blood flows away from the parts of our brain responsible for rational thinking and problem solving, and decision making is compromised. We are more susceptible to making errors because we are simply not working to our full cognitive capacity. Our brain is focused on doing what it was designed to do – protect us from harm. In its search for any information that will help it reach a solution as quickly as possible, the brain resorts to heuristics. These are shortcuts that our brains take to solve a problem, but they are not particularly trustworthy. The more uncertain the situation, the more likely we are to engage in this style of thinking.

On the other hand, once our hippocampus is able to apply context and mitigate any threat, we are free to engage in System 2 thinking. Blood flows to the prefrontal cortex and we can think clearly, logically and carefully to make smarter decisions.

With this knowledge, designers can apply the principles and theories of choice architecture not only to change the way they present their products to consumers, but also to understand why people make decisions that are, in the term given to us by Duke University Professor of Behavioural Economics Dan Ariely, predictably irrational.

How Irrational Are We?

In our everyday lives, we succumb to a range of heuristics, or cognitive biases, that inform our decision-making process. The list of heuristics is constantly growing, but here are a few of the most common ones we hear talked about in this space.

Availability bias

This refers to the tendency to believe that the more easily we can think of something happening, the more frequently it must happen. After Malaysia Airlines flight MH370 went missing, airline concerns and flight phobias were at an all-time high because people could easily recall an instance of flying gone wrong. We quickly believe that the chances of something happening to us in the air are greater than they really are because, having just happened, it is more readily available in our minds. If a close friend or colleague’s computer contracts a virus, we are more likely to take action to mitigate the risk ourselves, because we can easily recall an instance of a security breach close to home.

Social proof

Social proof draws on our innate human desire to connect to others. We are more likely to make a choice if we have seen all our friends do it – it is the herd mentality. In the absence of adequate information about a product or problem, we will likely default to whatever our friends, or people like us, are doing, because we figure they either must know something we do not, or if it turns out we are wrong, we can share the blame.

Temporal discounting

We tend to overvalue things that present an immediate solution and discount the value of things that offer a reward further into the future. Superannuation is a good example of a scheme that counteracts temporal discounting – it is very hard to convince people to put money away for their retirement when they are 30 years old and would really like to spend it on a relaxing vacation. We figure everything will be alright when we are older, so we prefer to reap the benefits now, even at the sacrifice of our future wellbeing. Anyone who has seen the famous marshmallow experiment has seen the dilemma we face when presented with one marshmallow now, or the chance to earn two marshmallows if we can wait 15 minutes. The struggle to choose rationally rather than emotionally is plain to see in that example. Taking shortcuts to ensure our immediate security at the expense of our future security is common.
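For readers who like to see the idea in numbers, here is a minimal sketch of one common way temporal discounting is modelled – a hyperbolic discount function. The discount rate used here is purely illustrative, not a figure from the studies discussed above.

```python
# A minimal sketch of hyperbolic discounting: V = A / (1 + k * D),
# where A is the reward, D is the delay, and k is an illustrative discount rate.

def discounted_value(amount, delay_days, k=0.05):
    """Subjective value of a reward received after a delay."""
    return amount / (1 + k * delay_days)

# A $100 reward today versus $150 in 90 days:
print(discounted_value(100, 0))    # 100.0 -> the immediate reward keeps its full value
print(discounted_value(150, 90))   # ~27.3 -> the larger, later reward feels much smaller
```

Under these (assumed) numbers, the objectively larger reward feels worth far less simply because it is further away – which is exactly why the vacation tends to beat the superannuation contribution.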

Choice paradox

Even though we like to feel that we can exercise our freedom of choice by being exposed to every option available, having too much choice is actually distressing: we become less likely to make a decision at all, and less satisfied with our choice if we do end up making one. Consumers need to feel that they have the opportunity to see everything on offer, but narrowing the selection for them makes it a whole lot more likely that a decision will be made.

Loss aversion

Even though we really like to win things, not losing matters even more. This is best illustrated in a study by Kahneman and Tversky, which showed that if a person loses $10, giving them back $10 does not equalise their emotional state, even though their bank balance is restored. After losing $10, it took a gain of $20 to recover the emotional loss and return the person’s mood to where it was before the loss.
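To make the asymmetry concrete, here is a minimal sketch in the spirit of Kahneman and Tversky’s prospect theory value function. The loss-aversion coefficient of 2 simply mirrors the $10-loss / $20-gain example above; the real function and its parameters are more nuanced.

```python
# A minimal, simplified sketch of loss aversion. The coefficient below is
# illustrative, chosen to mirror the $10 loss / $20 gain example in the text.

LOSS_AVERSION = 2.0  # losses are felt roughly twice as strongly as equivalent gains

def felt_value(change_in_dollars):
    """Subjective (emotional) value of a gain or loss, relative to the status quo."""
    if change_in_dollars >= 0:
        return change_in_dollars
    return LOSS_AVERSION * change_in_dollars  # losses loom larger

# Losing $10 and then winning $10 back does not feel neutral:
print(felt_value(-10) + felt_value(10))   # -10.0 -> still feels like a net loss
# It takes a $20 gain to restore the emotional balance:
print(felt_value(-10) + felt_value(20))   # 0.0
```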

Application to Security

In what may seem like a major contradiction, one signal of a healthy, functioning brain is its ability to deceive us, or to maintain what are referred to as cognitive illusions. These illusions serve to motivate us by encouraging us to believe that we are in control of things we really have no control over, and that we are happier than most other people.

When it comes to security, we are disproportionately affected by one major cognitive error: unrealistic optimism. This is the belief that more good things are likely to happen to us in the future than bad, and that bad things are more likely to happen to other people than to us.

So, when it comes to being robbed, having our passwords hacked, or having our identity stolen, we are all driven to do little by way of prevention, insisting that these things only happen to other people. Security, as an essentially fear-based industry, drives us toward making more emotional, less rational decisions; toward System 1 thinking.

Everyone thinks the likelihood of a terrorist attack is higher immediately following another terrorist attack – it is an emotionally charged event that gets plenty of media attention, so it is easily recalled and prominent in our minds in the days that follow. That is availability bias.

Letting an unauthorised person into your building via the back door so they can help you fix an IT issue, without considering the impact of a possible future security breach – that is temporal discounting.

Using the same password for all your accounts because you figure the chances of it being uncovered and used against you are very slim – that is unrealistic optimism.

We are not rational actors; we are human. And we will fall victim to cognitive biases and heuristics, often at the times when we are under the most pressure to make the ‘correct’ decision. So, what do you do about it? Armed with this knowledge, how can you become an architect of choice to improve the security and wellbeing of those around you?

Melissa Weinberg
Dr Melissa Weinberg completed her PhD in psychology in 2011 and managed the Australian Unity Wellbeing Index, a project that has tracked the subjective wellbeing of Australians since 2001. A registered psychologist, she is currently a Senior Research Fellow at the Young and Well CRC. She has appeared on TV programs including ABC News Breakfast and Weekend Sunrise, and on various national and regional radio programs. She is the Team Psychologist for the Sandringham Dragons in the TAC Cup, and for the Australian delegation to the 2017 Maccabiah Games in Israel.