By Rory Sutherland, Vice Chairman at Ogilvy & Mather Group UK
Real-life decisions are like darts: aiming for the high score brings a higher chance of disaster.
Whenever I wish to scandalise people, I have a sentence which works every time: "I would prefer my daughters took up smoking than started cycling in London."
My argument is as follows. If my daughters take up smoking and find it impossible to quit, there is a fairly high chance of a fairly bad outcome. They may die early and very unpleasantly. Perhaps at 58 rather than 85 years old. But if they take up smoking and resist the seductive lure of the bicycle, well at least they won't die at 22 beneath the wheels of a truck. The first outcome is a disaster, the second is a catastrophe.
Now I don't claim that I am right here. There are other upsides to cycling - though I suppose if we are being intellectually honest, we should accept that there may be some upsides to smoking too. But I still maintain that I might be right. There is a case to be made that, in the game of life, avoiding elimination in the early rounds is a good approach.
It always interests me that we are now more sanctimonious about tobacco than we are about drink, cycling, motorcycling and mountaineering. Almost every single person I know who has died before the age of 50 was killed by one of these four.
Economists, with their narrow focus on utility - an artificial, additive function accumulated over a series of independent transactions - fail to see that life is multiplicative, not additive. And it is path dependent.
In his excellent blog Farnam Street, Shane Parrish explains the distinction between additive and multiplicative systems as follows. Let's run through a little elementary arithmetic: what's 1,506,789 x 9,809 x 5.56 x 0? Hopefully you didn't have to whip out the old TI-84 to solve that one. It's a zero.
This leads us to a mental model called Multiplicative Systems, and understanding it gets to the heart of a lot of issues.
Suppose you were trying to become the best basketball player in the world. You've got the following things going for you:
1. God-given talent
You're 206cm tall, quick, skilful, can leap out of the building and have long been the best player in a competitive city.
2. A supportive environment
You live in a city that reveres basketball and you're raised by parents who care about your goals.
3. A proven track record
You were player of the year in a very competitive Division 1 college conference.
4. A clear path forward
You're selected as the second overall pick in the NBA Draft by the Boston Celtics.
Sounds like you have a shot. What odds would you put on this person becoming one of the better players in the world? Pretty high, right?
Let's add one more piece of information:
5. You've developed a cocaine habit
What are your odds now?
This little exercise isn't an academic one; it's the sad case of Leonard "Len" Bias, a young basketball prodigy who died of a cocaine overdose after being selected to play in the NBA for the Boston Celtics in 1986. Many call Bias the best basketball player who never played professionally.
What the story of Len Bias illustrates is the truth that anything times zero must still be zero, no matter how large the string of numbers preceding it. In some facets of life, all of your hard work, dedication to improvement and good fortune may still be worth nothing if there is a weak link in the chain.
Actually it's a bit more complicated than this. Had Bias decided to become a cocaine addict late in life, his life might have been fine. But early losses - or an early zero - have a disproportionate effect on outcomes.
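Parrish's arithmetic can be made concrete in a few lines of Python (a toy illustration, reusing the arbitrary numbers from the multiplication above as stand-in "success factors"):

```python
from functools import reduce
from operator import mul

# Arbitrary "success factors" - talent, environment, track record, draft
# position - with one fatal zero at the end of the chain.
factors = [1_506_789, 9_809, 5.56, 0]

additive = sum(factors)                # in an additive world, a zero barely matters
multiplicative = reduce(mul, factors)  # in a multiplicative world, one zero is ruin

print(additive)        # a large number, essentially unchanged by the zero
print(multiplicative)  # zero, no matter how large the other factors
```

The sum shrugs off the zero; the product is annihilated by it. That asymmetry is the whole mental model.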
Just as a great meal can be ruined by a single prong on your fork being out of alignment, a great life can be ruined by a single early mistake. In a multiplicative, path-dependent world - one in which we are in competition with others - the rules are very different to the additive rules that economists frequently impose on us with the idea that they are "rational".
Perhaps where economists go wrong is that they think decisions are like archery - where, by aiming for the bullseye, you are also minimising your chance of a zero. But real-life decisions are more like darts, where aiming for the highest score brings a higher chance of disaster.
In archery the scoring is concentric. You simply aim for the bullseye, which scores ten, and if you miss, you get nine. Miss the nine and you get eight. The only strategy is to aim for ten and hope. It is a perfectly logical scoring system, but it doesn't make for great telly. The dartboard, by contrast, is not remotely logical, but is somehow brilliant. The 20 sector sits between the dismal scores of five and one.
Most players aim for the triple-20, because that's what professionals do. However, for all but the best darts players, this is a mistake. If you are not very good at darts, your best opening approach is not to aim at triple-20 at all. Instead, aim at the south-west quadrant of the board, towards 19 and 16. You won't get 180 that way, but nor will you score three. It's a common mistake in darts to assume you should simply aim for the highest possible score. You should also consider the consequences if you miss.
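The darts claim can be checked with a toy Monte Carlo simulation - a sketch, not a rigorous model: it assumes a circular Gaussian throwing error, and the sigma chosen for a mediocre player is an illustrative guess, though the board radii are the standard measurements in millimetres:

```python
import math
import random

# Standard dartboard sector order, clockwise from the top (20 at 12 o'clock).
SECTORS = [20, 1, 18, 4, 13, 6, 10, 15, 2, 17, 3, 19, 7, 16, 8, 11, 14, 9, 12, 5]

def score(x, y):
    """Score of a dart landing at (x, y), in mm, with the board centre at the origin."""
    r = math.hypot(x, y)
    if r <= 6.35:        # inner bull
        return 50
    if r <= 15.9:        # outer bull
        return 25
    if r > 170:          # off the board entirely
        return 0
    # Angle measured clockwise from vertical, to match the SECTORS order.
    theta = math.degrees(math.atan2(x, y)) % 360
    base = SECTORS[int((theta + 9) // 18) % 20]
    if 99 <= r <= 107:   # triple ring
        return 3 * base
    if 162 <= r <= 170:  # double ring
        return 2 * base
    return base

def expected_score(aim_x, aim_y, sigma, n=100_000):
    """Average score when aiming at (aim_x, aim_y) with Gaussian wobble sigma (mm)."""
    rng = random.Random(0)  # fixed seed: both aims see the same errors
    return sum(score(aim_x + rng.gauss(0, sigma), aim_y + rng.gauss(0, sigma))
               for _ in range(n)) / n

# Triple-20 sits at the top of the board; triple-19 in the south-west quadrant.
t20 = (0.0, 103.0)
t19 = (103 * math.sin(math.radians(198)), 103 * math.cos(math.radians(198)))

sigma = 30  # illustrative wobble, in mm, for a mediocre player
print(f"aim at T20: {expected_score(*t20, sigma):.1f}")
print(f"aim at T19: {expected_score(*t19, sigma):.1f}")
```

With a wobble this size, runs of this sketch bear out the advice: the south-west aim scores more on average, because 19 is flanked by 7 and 3, while 20 is flanked by 1 and 5 - a miss to either side is punished far less.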
So it is in life. Many decisions have a scoring rubric more like darts than archery. In deciding, say, whom to marry, aiming for the best may be less important than avoiding the worst. "Satisficing", as polymath Herbert Simon named it, is the strategy whereby rather than trying to maximise an outcome, you seek a good solution with a low chance of disaster.
Perhaps we have evolved - quite sensibly - not so much to maximise anything as to minimise catastrophe. Certainly the fact that loss aversion is found not only in humans but in a variety of animals suggests it may not be safe to call loss aversion a "bias". Perhaps what we have evolved to do is not so much to aim for the triple-20 on the dartboard of life as to make sure we avoid missing the board altogether. Perhaps this is the right thing to do.
The reason this matters to me is that, once you realise that people are programmed to avoid disaster rather than to achieve perfection, certain things about consumer behaviour make sense. A preference for famous brands is a poor way of buying a perfect product, but it is an exceedingly reliable way of avoiding buying something which is awful.
Most of us, at some stage of our lives, have bought a car from a friend or neighbour. This is a ridiculous way to buy the perfect car for our money - but it is very sensible if we are keen to avoid buying a clunker: no one with a bad car to sell is going to sell it to anyone they know.
What we are doing when we buy a car from a friend is replacing a complex problem ("How good is this car?") with a simpler proxy question ("Do I trust the person who is selling it?"). Since the person selling the car knows more about it than we do, this is not an irrational solution to the problem - it is a clever one. It is only irrational if you make the assumption that we are aiming for the triple-20.
A great deal of marketing activity involves the creation of costly signals which are guarantees of the sender's long time horizons. Anything costly or difficult, which involves spending money now in order to reap the gains later, whether it is an advertising campaign or a café reupholstering its chairs, is perceived as the seller expressing faith in his own futurity.
Such spending only pays off if the seller expects to be around for the long term; it is hence a reliable signal of seller confidence, not seller desperation. The fact that we have an instinctive sensitivity to nuances like this is not evidence of irrationality. It is evidence of evolved intelligence.
Amos Tversky, one of the greats of behavioural economics, joked that there once had been humans who did not suffer from loss aversion - but they all died out.
This was first published in Wired.