Richard Thaler, who was awarded the Nobel Prize in economics last month, says he has spent all his working life pleading with his fellow economists to study real people, not the idealised, so-called “rational” beings around whom much of mainstream economics revolves.
Indeed, in their work, many economists don’t even refer to people as people; they call them “economic agents”. The typical “agent” is relentlessly selfish, perfectly informed and clairvoyant; maximises gains as a consumer and profits as a producer; and operates independently of social context, as if living in a hermetically sealed container.
Professor Thaler calls this species “econs” and declares them to be “complete jerks”: They are know-it-alls who can make perfect forecasts, will steal your money if they can get away with it, will give nothing to charity and have no self-control problems. In other words, the econ – aka “homo economicus” – bears little resemblance to real people.
Prof Thaler and his fellow behavioural economists – including Daniel Kahneman, George Akerlof and Robert Shiller, among others – have produced rich insights into human economic behaviour, some of which have found their way into government policies.
One of the most far-reaching applications has been in the area of pensions. It is well established that, left to themselves, people will not save enough for retirement. The idea of cutting their spending or their take-home pay for the sake of saving for the future is unpalatable. So Prof Thaler and his collaborator Shlomo Benartzi came up with a scheme called “save more tomorrow”, under which people commit to increase their savings whenever they get a pay rise. That way, their savings go up painlessly, without their take-home pay being cut.
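The arithmetic behind the scheme can be sketched in a few lines. In this illustration the salary, the pay-rise rate and the contribution escalation are all hypothetical figures chosen for the example, not numbers from the Thaler-Benartzi study; the point is simply that when contributions rise only alongside pay rises, take-home pay never falls.

```python
# A minimal sketch of the "save more tomorrow" idea: the savings rate
# rises only when pay rises, so take-home pay never drops.
# All figures below are hypothetical illustrations.

def take_home(salary, savings_rate):
    """Pay left over after the retirement contribution."""
    return salary * (1 - savings_rate)

salary = 4000.0       # monthly salary (hypothetical)
savings_rate = 0.03   # starting contribution rate (hypothetical)
history = []

for year in range(4):
    history.append((round(salary), savings_rate,
                    round(take_home(salary, savings_rate))))
    salary *= 1.05          # a 5% pay rise...
    savings_rate += 0.01    # ...triggers a 1-point rise in savings

for pay, rate, net in history:
    print(f"salary={pay}  savings rate={rate:.0%}  take-home={net}")
```

Even though the contribution rate climbs each year, the take-home figure in each successive row is higher than the last, which is precisely why the commitment feels painless.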
Another application relates to the paying of taxes or bills. Recognising the fact that people comply with social norms if they see other people complying, psychologist Robert Cialdini came up with a novel idea to encourage people to pay taxes on time: Instead of sending them the usual stern letter threatening penalties, write to them to say that nine out of 10 people in their city pay their taxes on time. An experiment on 140,000 taxpayers in the United Kingdom found that this approach significantly improved tax compliance.
A key concept in behavioural economics pioneered by Prof Thaler is that of the “nudge”, which was also the title of one of his best-selling books. The idea here is to “nudge” people to do the right thing without coercion – for example, by putting healthier foods at eye level in supermarkets, or making the default option wiser than the alternatives, or – a more offbeat example – by etching the image of a fly near the drain on men’s urinals (because, as Prof Thaler explained, “men like to aim at targets”), which was reported to reduce spillage on the floor, and corresponding cleaning costs, by up to 80 per cent.
However, insights from behavioural economics have yet to make deep inroads into corporate life.
One example relates to what Prof Thaler calls “mental accounting”. We often practise mental accounting, creating separate buckets for different expenditures, ignoring the fact that money is fungible. So we have budgets for groceries, holidays, medical expenses, children’s schooling and so on. If we happen to get a big discount when we buy a car, we would quite likely spend the savings on something car-related, like upgrading the car’s audio system (it comes out of our “car budget”, after all), rather than adding to our medical expense budget – even though it might be stretched.
Companies also practise “mental accounting”. There are separate budgets for IT equipment, office supplies, payroll, travel, entertainment, etc. These are often fixed, and the rule in many companies is that departments must spend their budgets by a certain date. As a result, as the expiration deadline approaches, departments often indulge in last-minute splurges, sometimes on non-essentials, to use up their remaining budgets. A smarter approach would be for the budget allocators to review unused budgets at an earlier date and redeploy funds flexibly, according to where they might be most needed.
Pricing strategy is another area where companies sometimes do themselves in. Economics 101 teaches us that if demand exceeds supply, prices should go up. In principle we accept that idea, but it depends how it’s done. If a Rolex retailer initially charges $1,000 for a new model but then raises it to $1,500 because of high demand, customers complain – even though it’s “rational” economics. On the other hand, if the seller says the normal price of the model is $1,500 but offers a $500 discount – and then withdraws the discount when demand is high – that is more acceptable to consumers.
This is because of what Prof Thaler calls the “endowment effect”: People don’t like to lose what they feel they “have” – which, in the first case, is the right to pay $1,000 for the watch. In the second case they might grumble but they won’t feel cheated.
Surge pricing has also proved problematic. We have witnessed Uber and JustGrab fares go up by as much as five times after, for example, MRT disruptions. This might make textbook economic sense but, in the real world, surge pricing has met with furious responses from commuters – and not only in Singapore. People have a sense of what is right and wrong, fair and unfair, which overrides economic logic. Companies that ignore this do so at their peril.
Behavioural economists have long pointed out that individuals are not just – or sometimes not at all – “homo economicus”. They are also “homo sociologicus”; they take decisions in a social context, sometimes guided by social norms.
Government institutions have recognised this, as we saw with the example of tax payments.
Companies can do so as well. Experiments have shown encouraging results. In one illustration of the power of social interaction, a study published in 2013 showed that in India, clients of a microfinance company who were randomly assigned to meet weekly, rather than monthly, were more willing to pool risks, and were three times less likely to default on their second loan.
Companies do target individuals in clever ways, taking advantage of various behavioural failings – for example, teaser rates on mortgages which entice home buyers to buy what they actually can’t afford; auto renewals on subscriptions; and extended warranties on small appliances with low risks of repeated breakdowns, which Prof Thaler describes in his book, Nudge, as being akin to paying $20 for $2 worth of insurance. But companies have been less adept at figuring out how to benefit from the fact that people act as members of a group.
But probably the most important area in which companies ignore the insights of behavioural economics lies in how they take important decisions. Two common biases in particular are worth highlighting. The first is groupthink, compounded by overconfidence: Organisations often get caught up in an idea to such an extent that they look only for affirmation or approval. The CEO and board agree and the staff are swept along. There is pressure to go ahead and do it quickly. It could be a major project, a big investment or a corporate restructuring. If it all goes horribly wrong, there is typically a post-mortem in which executives try to analyse the failure with the benefit of hindsight and offer ex-post justifications that might be right or wrong. But it’s too late.
To reduce the odds of the disaster happening, psychologist Gary Klein came up with the idea of the “pre-mortem”, which he describes as “a sneaky way to get people to do contrarian, devil’s advocate thinking without encountering resistance”.
It’s like doing a post-mortem, but upfront. This is how it works: Suppose you have a project almost ready to be launched. You call the decision-makers into a room and say: “We have launched the project. One year has passed, and it’s a total disaster. Write down why that happened. What went wrong?” The pre-mortem forces people to suspend their enthusiasm and think of problems that haven’t been considered – which can lead to improvements before the project is actually launched.
According to Prof Thaler, “many companies that were once household names and now no longer exist might still be thriving if they had conducted a pre-mortem with the question being: It is three years from now and we are on the verge of bankruptcy. How did this happen?”
The other common bias in corporate decision-making is “hindsight bias” – aka the “I-knew-it-all-along” effect: the tendency to believe, after the fact, that whatever happened was always obvious – when frequently it was not. Hindsight bias “has huge managerial implications”, according to Prof Thaler, because when managers evaluate the decisions of their employees, they often do so with hindsight. For example, after a project fails, the reason for the failure seems “obvious” – and managers think that it’s obvious that their staff should have thought of it. To deal with hindsight bias, one suggestion by psychologists is that before big decisions, the people involved, including the boss, should agree on all the possible outcomes in advance. That way, if something unexpected happens, they can verify whether they really “knew it all along”.
Behavioural economics teaches us to accept that we are humans, not “economic agents”, that we are sometimes cognitively challenged, driven by biases we don’t know about, resort to rules of thumb that often don’t make sense, and occasionally make terrible decisions. It doesn’t have all the cures or answers, but offers some ways for us to correct some of our misbehaviour. Richard Thaler and his ilk need all the recognition and encouragement they can get. A Nobel Prize certainly helps.
The Straits Times/ANN.