How Daniel Kahneman shaped our understanding of human decision making

Classical economists have long worked from the basic assumption of the rational individual: that people make decisions in their own self-interest, after chewing over all of the information available to them. Ask most psychologists, though, and they will tell you that this model of decision making rarely plays out in reality.

In fact, most of our decisions are made through speedy shortcuts, saving our time for the more important matters that we need a little longer to ponder. The man who tried to meld this psychological account of human decision-making with economics, Daniel Kahneman, passed away three weeks after his 90th birthday, on 27 March 2024. His contemporaries have labelled him ‘the grandfather of behavioural economics’, the discipline that he largely founded.

If you have ever been lucky enough to peruse the psychology or business section of an airport or train station bookshop, you are likely to have come across one of his books, Thinking, Fast and Slow: a rare mainstream success for what is a rather academically dense work. It tries to account for why we observe biases in human decision making.

He breaks down the way we make decisions into two loose categories: speedy ‘system 1’ decisions and slow ‘system 2’ decisions. The latter take longer to crunch; Prof. Kahneman gives examples of activities that demand a lot of brainpower and effort, such as parking a car or solving a tricky equation. Economists have classically assumed that we make all of our decisions in this deliberate way, but this is not generally the case.

Instead, we rely on a more automatic, instinctive ‘system 1’ pathway, in which we make decisions faster, but with the trade-off that our biases and emotions can sometimes lead us to make the wrong choice. A common example used to illustrate this is the ‘bat and ball’ thought experiment: a bat and a ball together cost £1.10. If the bat is £1 more expensive than the ball, how much is the ball? Have a think; you can find the answer at the end of this article.

These shortcuts lead to some interesting, predictable biases in decision making. One is the sunk cost fallacy, where people prefer to keep investing in a project rather than backing out, even when continuing is no longer the rational option. Another is the framing effect, where the way information is presented heavily shapes how it is understood: someone is more likely to elect to have surgery described as having a 90% survival rate than a 10% death rate. A third is the anchoring effect, where an initial, irrelevant piece of information, such as a high or low number, heavily skews people’s subsequent estimates of, for instance, ages or prices.

Although there is no unified behavioural theory of economics quite yet, Prof. Kahneman’s attempt to begin incorporating psychology into economics has been recognised. In 2002, he won the Nobel Prize in Economics “for having integrated insights from psychological research into economic science, especially concerning human judgment and decision-making under uncertainty”.

However, it is worth mentioning that the field has been acutely affected by the ‘replication crisis’ that has swept through many of the ‘soft’ sciences in the past decade. Many of the studies cited in Thinking, Fast and Slow have since been called into question, as other scientists have been unable to replicate their findings. This prompted Prof. Kahneman to concede in a comment on a blog post: “I placed too much faith in underpowered studies […] authors who review a field should be wary of using memorable results of underpowered studies as evidence for their claims.”

It may seem slightly ironic that he himself made a mistake similar to the ones he studied in other people, and to a certain extent this may further convince the sceptic of his thesis about human irrationality: even a brilliant academic can fall victim to these aspects of human nature. Oh, and the answer is that the ball costs 5p. If you guessed 10p, Prof. Kahneman would posit that you are not so rational after all.
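For anyone who wants to double-check that answer, here is the arithmetic written out as a short worked equation, with the ball’s price in pounds written as b:

\begin{align*}
b + (b + 1.00) &= 1.10 \\
2b &= 0.10 \\
b &= 0.05
\end{align*}

So the ball costs 5p and the bat £1.05, which together come to £1.10; the intuitive answer of 10p would make the bat £1.10 and the total £1.20.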

Image: Ohadinbar via Wikimedia Commons
