Insights that won Richard Thaler a Nobel Prize in economics

Richard Thaler © Getty images
Richard Thaler factors in our human foibles

Our editor in chief, Merryn Somerset Webb, interviewed Richard Thaler back in September 2015. Watch the video of that here.

This week, Richard Thaler won the Nobel prize for economics. Thaler’s main contribution is “behavioural economics”, which aims to incorporate into economic models how real human beings (rather than idealised, rational homo economicus) think and act. As Thaler noted in 2016, there is nothing new here – it merely “returns economics to its earliest roots”. Adam Smith was hugely interested in psychology – it’s only due to the modern desire to mathematicise economics that the human element was tidied out of the science.

Of course, most non-economists already know that we don’t make perfect decisions that take account of all available data, not even with money at stake. Instead we use heuristics – rules of thumb that evolved to help us make fast, efficient decisions under pressure. But they are far less useful when it comes to making considered choices about where to invest. Indeed, the value of behavioural economics for investors lies in identifying the many “cognitive biases” (see box below) that result from these heuristics and lead us to make avoidable investment errors.

So how can we mitigate the impact of these biases? Being aware of them is a useful first step, but it’s far from sufficient – knowing what “anchoring” is won’t in itself stop you from doing it. This is where the work of another Nobel-winning behavioural economist, Daniel Kahneman, comes in. Together with his colleague Amos Tversky, Kahneman pioneered much of the research into behavioural finance. In his best-selling Thinking, Fast and Slow, Kahneman draws a distinction between “System 1” thinking – gut-driven, instinctive and reliant on heuristics – and “System 2” thinking, which is slower, more analytical and effectively involves engaging your brain. As an investor, you’ll find “System 1” kicking in regularly when what you really need is “System 2”.

How can you do that? Slow down. Avoid over-exposure to financial news – it makes you panicky. You could also consider using stop-losses, committing in advance to sell should a share price fall significantly below your purchase price. But most importantly, have a plan. Keep a journal, recording the details of each investment and your rationale for it before you act. Forcing yourself to write down your process before you invest automatically slows you down and should lead to better decisions. That doesn’t guarantee a good outcome every time – but in the long run, it should produce better returns than relying on your fickle gut.
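A stop-loss, in other words, is simply a rule fixed before emotion enters the picture – something you can write down, or even code. A minimal sketch in Python, with the 10% threshold purely as a hypothetical illustration, not a recommendation:

```python
def should_sell(purchase_price: float, current_price: float,
                stop_loss_pct: float = 0.10) -> bool:
    """Return True if the price has fallen past the pre-agreed stop-loss.

    The 10% default is a hypothetical example. The point is that the rule
    is fixed *before* you buy, so "System 1" can't renegotiate it later.
    """
    return current_price <= purchase_price * (1 - stop_loss_pct)

# A share bought at 500p that drops to 440p breaches a 10% stop...
print(should_sell(500, 440))   # True
# ...while a dip to 480p does not.
print(should_sell(500, 480))   # False
```

The discipline comes from the pre-commitment, not the particular threshold – the same logic applies whether the trigger is set by you or by your broker.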

Cognitive bias explained

We use mental shortcuts (heuristics) to make decisions rapidly. These work in many circumstances, but when it comes to investing, they can be a major handicap, giving rise to “cognitive biases”. Key examples include the following:

Anchoring: the tendency to “anchor” on a piece of data we were exposed to while making a decision, regardless of its relevance. In a famous 1974 study, Kahneman and Tversky spun a rigged wheel of fortune that stopped at either 10 or 65, then asked subjects to estimate the percentage of African countries in the United Nations. Those who saw 65 gave markedly higher estimates than those who saw 10, even though the number was plainly random. In investing, anchoring can, for example, tempt you to buy a stock that has fallen after a profit warning, because you have “anchored” on its previous price and see it as a bargain, disregarding the deterioration in its fundamentals.

Framing: this refers to the fact that people often draw different conclusions from the same underlying information, depending on how it is presented, or “framed” (entirely at odds with the “rational actor” assumption underpinning efficient-market theory). Again, this was formalised by Kahneman and Tversky in a 1979 research paper on “prospect theory”, which tries to model how people make decisions involving probability in the “real world”, as opposed to the idealised economic one.

Loss aversion: the pair also found that individuals are “loss averse” – we feel the pain of losses roughly twice as acutely as we enjoy the pleasure of equivalent gains. This goes some way to explaining why “framing” a decision in terms of risks rather than rewards can alter our stated preferences, even when the underlying data is the same.
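The “roughly twice” figure comes from the value function of prospect theory: in a 1992 follow-up, Tversky and Kahneman estimated a loss-aversion coefficient of about 2.25. A rough sketch of that function, using their published median parameter estimates (individuals vary, so treat the numbers as illustrative):

```python
def prospect_value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Prospect-theory value function with Tversky & Kahneman's 1992
    median estimates (alpha = 0.88, lambda = 2.25).

    Gains are valued as x**alpha; losses as -lam * (-x)**alpha, so a loss
    looms larger than an equal-sized gain by the factor lam.
    """
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

gain = prospect_value(100)    # subjective value of winning 100
loss = prospect_value(-100)   # subjective value of losing 100
print(abs(loss) / gain)       # ~2.25: the loss "hurts" over twice as much
```

The curvature (alpha below 1) also captures diminishing sensitivity – the difference between losing £100 and £200 feels bigger than that between losing £1,100 and £1,200.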