Copyright ©2011 NPR. For personal, noncommercial use only. See Terms of Use. For other uses, prior permission required.


This is ALL THINGS CONSIDERED from NPR News. I'm Melissa Block. Now, an interview with Daniel Kahneman. He shared the Nobel Prize for Economics in 2002, although he's not an economist. Kahneman's field is the psychology of decision-making, and my co-host, Robert Siegel, spoke with him recently about his new book called "Thinking, Fast and Slow."


Daniel Kahneman, welcome to the program.

DANIEL KAHNEMAN: My pleasure to be here.

SIEGEL: Let's start with the concept that's summed up in the title of your book, "Thinking, Fast and Slow." We have two ways of thinking, which you call system one and system two, and I wonder if you can give us a good example of some fast system one thinking.

KAHNEMAN: Well, one of my favorite examples is there is an upper-class British voice that says, I have large tattoos all down my back.

SIEGEL: Large tattoos. Yes.

KAHNEMAN: Yes. And now, you know, people who speak with an upper-class British accent don't have large tattoos down their back, so the brain brings to bear all the world knowledge that's involved and registers that there is an incongruity here within, you know, between three and four-tenths of a second. Or if a male voice says, I believe I'm pregnant, you will get an instant response, an incongruity.

So those are instances of very fast thinking, and it's all the same process: recognizing things, distinguishing the familiar from the unfamiliar, coming up with solutions that have worked in the past. That's what's called, you know, expert intuition. All these are examples of fast thinking.

SIEGEL: Fast system one thinking. Now, enter system two, slower thinking.

KAHNEMAN: System two - in the first place, I should make it clear that the systems don't really exist. It's a very convenient way of talking about them because it's easy for people to think of agents, and saying that system one and system two do things is a kind of language that we fall into very naturally.

System two is involved, we say, when people engage in orderly computations. So if I ask you to multiply 17 by 24, your system one is very likely to be quite silent, and it's system two that has to compute, step by step, according to rules. So system two deals with any kind of orderly reasoning.

And also, we attribute to system two the function of monitoring your thoughts and monitoring your actions and monitoring what you say. Obviously, we don't say everything that comes to mind, so there must be a function that inhibits things and that is system two.

SIEGEL: I think you write at some point in the book that system two is who we like to think we are, actually.

KAHNEMAN: Well, we mostly have access to system two. That is, we have access to our own reasoning self, to our own conscious self. Much of the interesting activity that we call system one activity - much of that is completely unconscious, automatic and very quick. That is, you know, you're surprised by something, but you don't really know what surprised you. You recognize someone, but you don't really know what cues caused you to recognize that person.

So system two, on the other hand, is typically conscious, deliberate, intentional, and that's who we think we are.

SIEGEL: The great takeaway from your research into how we make decisions - our misplaced faith in expertise, for example, our overconfidence - it reminds me of the exchange between Robert Redford and Paul Newman in "The Sting" when they say he's not as smart as he thinks he is. Nor are we. We're not as much in control of our decision-making. We're much more imperfect than we might think.

KAHNEMAN: Well, certainly. We're not in control because our preferences come from a lot of places that we don't know about. And, second, there are really some characteristics of the way the mind works that are incompatible with perfect decision-making. In particular, we have a very narrow view of what is going on, and with that narrow view, we take each decision as if it were the only decision we're facing. We don't see very far into the future. We are very focused on one idea at a time, one problem at a time, and all these are incompatible with full rationality as economic theory assumes it.

SIEGEL: There's one piece of research that you write about in the book that I'd like you to just mention, one that you cite to show that the phrase "mental energy" is not just a metaphor. We actually expend energy when doing a lot of thinking. It's the story of the Israeli parole court, I guess, which turns down most applications for parole, but is remarkably more positive about cases that it takes up right after a meal break.

KAHNEMAN: Yes. This was published in the Proceedings of the National Academy of Sciences this year, and it's a very, very careful study of eight parole judges in Israel and many, many cases. And as it happens, they keep an exact record of when they make each decision and also when breaks are taken.

Now it turns out that there is very large variability in their leniency - I mean, in the number of cases in which they award parole. And they're much more likely to award it immediately after a meal and much less likely a couple of hours later, when, presumably, they're hungry. But certainly, they are tired. They're depleted. And when you are depleted, you tend to fall back on default actions, and the default action, apparently, in that case is to deny parole. So, yes, these are people strongly influenced - well, very influenced - by the level of glucose in their brain, among other things.

SIEGEL: So if you could get the clerk to say, I want you to look at this case right after lunch, your chances of parole might increase...

KAHNEMAN: They do.

SIEGEL: ...significantly.

KAHNEMAN: Quite significantly.

SIEGEL: I mean, the implication of a study like that - a democratic society is based on people at all different levels making decisions, and if we assume that they're driven by the glucose content of their bloodstream at that moment, or by other odd biases that they bring to bear, it undermines the underpinnings of a democratic society.

KAHNEMAN: Well, you know, it doesn't undermine it in the sense that what are you going to do about it?


KAHNEMAN: I mean, you still need parole judges. You know, one might think that, with feedback, when they're informed of those things, they might be more careful. That is, they might be more careful when they're hungry and less prone to make quick default decisions.

Clearly, the decision-making that we rely on in society is fallible, it's highly fallible, and we should know that.

SIEGEL: Well, Daniel Kahneman, thank you very much for talking with us today.

KAHNEMAN: It was my pleasure.

SIEGEL: Daniel Kahneman is the author of the new book "Thinking, Fast and Slow."


ARETHA FRANKLIN: (Singing) You better think. Think about what you're trying to do to me. Think. Let your mind go. Let yourself be free.

BLOCK: This is NPR.


NPR transcripts are created on a rush deadline by a contractor for NPR, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of NPR's programming is the audio.
