Daniel Kahneman: the Nobel Prize-winner who says we’re all fools

Thursday 28 June 2012

Psychologist and Nobel Prize winner Daniel Kahneman says that, given a choice, we will usually make the wrong one.

Daniel Kahneman thinks he won the Nobel Prize for being a fool. Over lunch I judge that there is something about him that makes it unwise for me to tell him that this is not very likely. And anyway, if global prestige, the leadership of an entire field of economics and a worldwide bestselling book haven’t persuaded him, it’s unlikely that I will.

What Kahneman will accept, I think, is that he is not the only fool. I am a fool too. We’re pretty much all fools. The Princeton professor has changed our understanding of ourselves and rocked economics to its foundations. If social scientists believe that in the past 30 years they have got much nearer to the truth, then Kahneman is one of the reasons why. And if being a fool makes me an equal of Kahneman, I accept my status with equanimity.

Let’s start at another lunch. Let’s start in 1969 in the Cafe Rimon in Jerusalem. It’s the favourite haunt of junior faculty members from the Hebrew University. It’s Friday noon. The place is filling up, as it usually does at that time. And a revolution is about to start.

On one side of the table is Kahneman, a psychologist with a statistical bent, with time served in the Israeli military telling the top brass what they didn’t want to hear — that their favoured method of choosing officers was hopeless, because the test results and the achievements of the selected candidates weren’t correlated — and finding out that they ignored the evidence and ploughed on anyway. On the other side is a slightly younger man, Amos Tversky, who has been working away in Michigan on the science of decision-making. The two men have come fresh from an argument. But over lunch, says Kahneman, “we just had a grand time”.
The argument, a friendly intellectual affair, was concerned with whether most people are good instinctive statisticians. Tversky was an optimist; he thought we weren’t too bad at numbers. Kahneman disagreed, and told Tversky of his own experience. “One of my lines of research wasn’t working at all. I had adopted a rule that I would never be satisfied with one study and I would have to do the study again and get the same results before I would be sure ... I was fairly inconsistent and never got the same results.” Eventually, he realised why: his sample sizes were too small. “I was teaching statistics. This was material that should have been transparent to me. But it wasn’t.”

Was he the lone fool? Or, as he suggested to Tversky, are most people poor intuitive statisticians? It didn’t take long for Tversky to become convinced, and the two embarked on studies that showed that Kahneman was right. People trust information garnered from ridiculously small samples; they confuse correlation (two facts are related) with causation (one fact causes the other); and they are for ever seeing patterns in events and numbers that are, in fact, random.

That paper — which Kahneman now calls “a joke, a serious joke” — was just the start: the beginning of a revolution against standard economic thinking. In paper after paper that followed, Kahneman and Tversky revealed the inadequacy of the most basic assumption made by economists — that man is rational. Ultimately, this work created a new strand of economic thinking — behavioural economics — and earned Kahneman the Nobel Prize for Economics in 2002, even though he is not an economist.

Did it take economists too long to see the point? He says that Tversky always joked that economists didn’t really believe in rationality, since they thought it was true of people in general but not of their spouse or their dean. And then he adds that 30 years between the first paper and the Nobel Prize is “very, very fast”.
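Kahneman’s replication trouble is easy to reproduce in a toy simulation (the numbers below are illustrative choices of mine, not taken from his studies): with a genuine but modest effect, small samples often get even the direction of the result wrong, while larger samples settle down.

```python
import random
import statistics

random.seed(42)

def run_study(n, true_effect=0.3):
    """Simulate one study: n noisy observations whose true mean
    is `true_effect`. Return the observed mean effect."""
    return statistics.mean(random.gauss(true_effect, 1.0) for _ in range(n))

def agreement_rate(n, studies=1000):
    """Fraction of simulated studies whose observed effect has the
    correct (positive) sign."""
    return sum(run_study(n) > 0 for _ in range(studies)) / studies

# Small studies frequently point the wrong way even though the
# effect is real; large studies almost never do.
print("n=10: ", agreement_rate(10))
print("n=500:", agreement_rate(500))
```

Run repeatedly with different seeds, the small-sample rate hovers well below the large-sample one — which is exactly the inconsistency Kahneman describes in his own early research.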
Let me give you an example of the departure the work represents. You are offered a bet: a 50:50 gamble, a coin toss. Heads and you lose $100; tails and you win $150. Classic economic theory is clear about what you will do. You’ll take the bet, because the expected value is positive. But in reality? People don’t. They are so averse to losing something they already have that even a much bigger potential gain doesn’t compensate them for the risk.

Here’s another example. We react differently to the same question framed in a different way. Let’s say a doctor is asked to make a decision about two treatments for lung cancer: surgery or radiation. The five-year survival rates favour surgery, but there are short-term risks. When told that the one-month survival rate after surgery is 90 per cent, as many as 84 per cent of the doctors chose the surgical option. When the same point was put another way — there is 10 per cent mortality in the first month — only half the doctors chose surgery.

And this is just one of dozens of ways we behave irrationally. We are, for instance, prone to something called the halo effect. “If you like the President’s politics,” Kahneman has written, “you probably like his voice and appearance as well.” And we package our opinions up into neat narratives that help us form an identity, even when the logical link isn’t there. “There is,” Kahneman told me, “a very high correlation in the US between attitudes to marriage and beliefs about global warming.”

We tend to form judgments from whatever information comes most quickly to mind, producing a predictable bias. This explains how Robbie Williams came sixth in a poll to identify the most influential musicians of the past millennium, just ahead of Mozart. The list of such biases is a long one.

We have, Kahneman argues, two types of thought processes. System one: quick, intuitive, automatic, but prone to being fooled by its own mental shortcuts.
And system two: more contemplative, deeper and harder to deploy. This can correct for error, but more often acts as a lawyer and lobbyist for our emotions.

And things get worse. Kahneman doesn’t really think we can do much about these biases. Even knowing that they are there doesn’t help you overcome them. Strangely enough, if it had done, he wouldn’t have written his new book in the first place. One of our biases is that we ignore the lessons of experience. A group of people compiling a report will estimate they can do it in a year, even though every other similar report has taken comparable groups five years. Kahneman knew this, yet still wrote Thinking, Fast and Slow. “When I started the book I told Richard Thaler [the author of Nudge] that I had 18 months to finish it. He laughed hysterically and said, ‘You have written about that, haven’t you? It’s not going to work the way you expect.’ ”

How long did it take you? I ask. “Four years, and it was very painful. It’s not yet clear to me that it was a good idea to write the book in spite of its being quite successful.” I assure him, having read it, that it was indeed a good idea. “For you it’s easy,” he replies.

The book is dedicated to Tversky, and many of the ideas in it are his, but tragically he isn’t here to enjoy its reception. He died of cancer in 1996. Over our lunch his friend talks of him often. Indeed, Kahneman finds it hard to accept the praise and recognition he gets because Tversky isn’t around to share it. “For me,” he says, “winning the Nobel Prize has been a much smaller psychological event than for most other people, because I always felt that I was part of a winning team and by myself I would never have won it.”

I point out that, despite this, winning the Nobel Prize is quite cool. “Well, yeah, it’s quite good,” he eventually accepts. “For reasons that people don’t appreciate, by the way, and which took me completely by surprise. What makes it very good is the pleasure that it gives other people.
Everybody who knows you is thrilled.” Yes, I say. I told my mum I was having lunch with a Nobel Prize winner. “You know, people who wouldn’t come to your funeral nevertheless are absolutely thrilled.” I promise to come if my schedule allows.

His downbeat attitude extends to academic life. “I discouraged my daughter and son-in-law from entering academic life.” Why did you do that? “Two things. You shouldn’t be in academic life if you have a thin skin, and the other one is that you absolutely have to have the ability to exaggerate the importance of what you are doing. If you can’t do that, you can’t be an academic, because a very small problem has to look big to you, otherwise you can’t mobilise yourself to spend so much time and effort on it.”

But when I put it to him that the financial crisis vindicated his own work by showing up the irrational behaviour of bankers, he replies: “Oddly enough, not very much. Standard economics explains that very well, what happened.” The bankers were acting rationally in their own interests rather than those of their banks. “So my sense is that it is undoubtedly true that behavioural economics have gained greatly in credibility from the crisis, but I am not sure that this is for the right reason.”

All this modesty, all of it becoming, and none of it false. Yet it shouldn’t be mistaken for self-doubt. Kahneman knows what he has done and stands by his work. When I present some academic criticisms of behavioural economics — for instance, that the effects are not very large — he is quick to call the point “not very serious”. He knows, too, the impact he has had. It’s just that, I think, it’s hard to spend your life studying human foibles, to conclude that they are ineradicable, and then to take yourself too seriously. Daniel Kahneman certainly doesn’t.

© Times Newspapers Limited 2012