Why It's Hard to Admit to Being Wrong

We all have a hard time admitting that we're wrong, but according to a new book about human psychology, it's not entirely our fault. Social psychologist Elliot Aronson says our brains work hard to make us think we are doing the right thing, even in the face of sometimes overwhelming evidence to the contrary.


JOE PALCA, host:

This is TALK OF THE NATION: SCIENCE FRIDAY from NPR News. I'm Joe Palca, sitting in for Ira Flatow.

This year marks the 50th anniversary of something called cognitive dissonance. If you don't know what that means, don't feel bad about that. In fact, that's what we'll be talking about this hour - why you probably won't feel bad about that. Cognitive dissonance is a term in psychology that describes the feeling of tension in your brain when you hold two conflicting beliefs.

So, you're a good person, you're a moral person, but you fudged on your income tax - maybe you forgot to record a few cash transactions. So what does your brain do? According to the cognitive dissonance theory, your brain tries to relieve that tension. It immediately starts making excuses for you. Well, everyone cheats a little bit. I probably paid too much last year. I don't like the way the government spends my money. See? You feel better already, right?

My next guest has studied cognitive dissonance for many decades and he has a new book on the subject. Elliot Aronson is the co-author, with Carol Tavris, of "Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts." Dr. Aronson is a social psychologist and he taught for many years at the University of California in Santa Cruz. His other books include "The Social Animal" and "The Jigsaw Classroom." He joins us from the campus of UC Santa Cruz. Welcome to SCIENCE FRIDAY, Elliot.

Dr. ELLIOT ARONSON (Co-author, Mistakes Were Made (But Not by Me); Social Psychologist; Professor Emeritus, Psychology, University of California Santa Cruz): Thank you, Joe. Good to be here.

PALCA: And just in the interest of full disclosure, I first bumped into Elliot Aronson probably about 30 years ago to the day, Elliot, because that's when I started graduate school at UC Santa Cruz in the Psychology Department. So I had to be nice to you then. Now, I don't have to.

Dr. ARONSON: I remember you well, Joe. You were a very nice young man in those days.

PALCA: Yes. I'm still very nice sometimes. Okay. If you want more information about what we'll be talking about this hour, go to our Web site at www.sciencefriday.com, where you'll find links to our topic. And we, of course, would like to hear your comments and questions because this is going to resonate with a lot of people. And maybe, you'll understand better why if you didn't understand cognitive dissonance at the start of the show, you certainly will by the end. Our number is 800-989-8255. That's 800-989-TALK.

And okay, Elliot, take it away. What is cognitive dissonance?

Dr. ARONSON: Well, you've already defined it. It's a drive, like hunger or thirst, and it feels uncomfortable whenever we hold two ideas or beliefs that conflict with each other and especially if the major idea is about who we are. If I think that I'm a smart, competent, moral person and I do something stupid, it creates dissonance and I try to convince myself that it was actually the smartest thing I could have done. And as a matter of fact, it was not a bad decision at all. And nobody could have done it better. And it really isn't so bad. And besides, nobody noticed anyway. And that reduces the dissonance and helps us sleep well at night.

PALCA: And so where's the problem?

Dr. ARONSON: The problem comes when we make a serious mistake. You know, a simple mistake is easy. If we spill wine all over ourselves at the dinner party, it would be nice to believe that hardly anyone noticed. And it was white wine, anyway, so that it doesn't stain. And then, we go home and we can sleep well at night. But if we make a serious blunder, then the problem is by reducing the dissonance, we don't learn from our mistakes. And therefore, we're likely to commit the same mistake over and over again.

Or like some politicians, if we dig ourselves into a hole, if we commit ourselves to a war that isn't going well, we dig ourselves deeper and deeper by justifying the initial decision to go to war, and convincing ourselves that it was the right decision and convincing - trying to convince everybody else that it's important to stay the course. That's just to take a random example.

PALCA: But then, are you saying that cognitive dissonance is so powerful that a global leader would persist in a policy simply to make, as you described it a minute ago, himself able to sleep better at night?

Dr. ARONSON: That's precisely what I'm saying. It's - cognitive dissonance is a powerful motive. And it's an unconscious motive. It percolates just below our level of awareness. We don't say, well, I think I'll reduce a little dissonance right now. We do it automatically. And we're not aware that we're doing it. We're convincing ourselves that we're right even when - as God sees it or as other people around might see it - we're wrong.

PALCA: Right. But this is the - I mean, okay, so we have two examples now that we're going to have to try to reconcile. One is taking a country into war and the other is, you know, fudging on your income tax. One may lend itself to an absolutely correct or incorrect answer. If somebody shows that in your bank account you wrote down one number and on your income tax form you wrote down another, well then, clearly, there's an error. But how do you reach that conclusion when, obviously - in this case, we're using the war as an example - it's not just the president who thinks this is the proper course? I mean, there are millions of other people who think this is the right course.

Dr. ARONSON: Right. And what I want to do is before we get to that, I want to give you an example from a piece of experimental research that will clarify it. And it's one of the first experiments ever done to test dissonance theory, and it's an experiment I conducted 50 years ago. The hypothesis was that if people go through a severe initiation - through hell and high water - in order to become a member of the group, they will like that group better than if they got into the group easily.

And so we set up an experiment where we randomly assigned people to a high initiation condition or a low initiation condition. And then they listened to a tape recording - the exact same tape recording - of the group that they had gone through the initiation in order to join, in action. And then we had them rate the group afterwards. And what happened was amazing. It was a boring group by any objective standard. The people who went through a very small initiation said, this is a boring group; I don't think I want to be a member of it. While the people who went through a severe initiation said, gee, this group is very interesting.

Some of those people, well, you know, they didn't have all the answers, but they were really smart, and I think they're really nice people, et cetera. In other words, in order to reduce the dissonance between I'm a competent, smart person, and I did a stupid thing, I went through all that hell and high water in order to get into a boring group, they downplayed the boring aspects of the group and they upplayed in their own minds the attractive aspects of the group. Now, this is real. It isn't that they were making it up. They really believed it because they took action to stay in the group.

And I would say if you're reading a CIA report that's a little bit ambiguous, that's telling you some things indicating, for example, that Saddam Hussein might have weapons of mass destruction and other things saying that maybe he doesn't - if you've already decided that you might want to go to war, when you read that report, you don't pay a lot of attention to the negative stuff. You only pay attention to the stuff that goes along with the decision you've already implicitly made. And therefore, you see it very differently than someone who's looking at it completely objectively.

PALCA: So how does a president, in this example, avoid the trap of a cognitive dissonance since, clearly, it might lead him into policies that he would not make if he were being, you know, if he were seeing the picture accurately. And by the way...

Dr. ARONSON: Good question.

PALCA: ...we're not judging this. I mean, I'm not sure...

Dr. ARONSON: Absolutely not.

PALCA: I don't want to endorse Elliot as saying he's definitely right about this point. We're just discussing this as a possible explanation.

Dr. ARONSON: It's a hypothetical...

PALCA: It's a hypothetical.

Dr. ARONSON: You know, pretend it's some other president.

PALCA: Right. Okay.

Dr. ARONSON: The best way to do it is to become aware of the process. For example, anybody who reads our book will be aware. All of the reviewers who have read the book said, gee, I wasn't aware that I was doing this. But now that I'm aware of how this happens, I'm going to scrutinize some of my important decisions to make sure that I'm really on the right track, that I'm not simply reducing dissonance. And with increased awareness, with increased scrutiny, people can make wiser decisions and avoid making the same mistake twice. The second way - if I'm a CEO or the president of a major company or of the nation, I will try to surround myself with people who don't always agree with me, and I will listen very, very carefully to them.

We have some great examples of presidents of this country who have done this. The most notable, perhaps, is Abraham Lincoln. In Doris Kearns Goodwin's book "Team of Rivals," she documents this beautifully. Abraham Lincoln, when he was elected, appointed to his cabinet some of his staunchest political opponents - including one guy, Stanton, who had called him a stupid ape to his face - and listened to them very carefully. These guys, who had been his political enemies, within a very short time came to respect him enormously because of his considered judgment.

PALCA: Right.

Dr. ARONSON: Now, all presidents, you know, some - most presidents want to surround themselves with people who agree with them; some do it to a greater extent than others.

PALCA: Right. It just strikes - it presents so many fascinating problems because now, I have to ask myself the question, well, did I make a mistake by inviting Elliot Aronson to come on TALK OF THE NATION: SCIENCE FRIDAY?

(Soundbite of laughter)

PALCA: And how are we going to find out the answer to that? And I think, in this case, we have to turn to the listeners to provide that answer. But unfortunately, before we can do that, we have to take a short break. So stick with us. We're talking with Elliot Aronson who, with Carol Tavris, has just written a provocative new book about cognitive dissonance called "Mistakes Were Made (But Not by Me)." So stick around. We'll be back after a short break.

(Soundbite of music)


PALCA: We're talking this hour about how we justify our mistakes, and my guest is Elliot Aronson, a social psychologist and the co-author of "Mistakes Were Made (But Not by Me)." And we just asked the dangerous question - which we don't want to dwell on - of whether we've made a big mistake even talking about this. But we're not going to do that. Instead, we're going to go to our listeners, and our number, if you want to join this conversation, is 800-989-8255. And let's start with Adam(ph) in Kansas City. Welcome to the program.

ADAM (Caller): Yeah, great listening to this program, Dr. Aronson. That kind of got me thinking about the political situation that we actually experience right now in this country. There has recently been a push by the candidates on both sides to expose their relationship with God - with religion. And I was curious to know how you view the, let's say, overwhelming absence of factual support for religion in the case of, let's say, Mitt Romney, who is a Mormon. It can be clearly shown, historically and archeologically, that Mormonism has no factual support. Yet somebody like Mitt Romney is gaining popularity among the Republican Party. And it's truly overwhelming evidence that his system of belief has absolutely no support.

PALCA: Well, that's an interesting assertion. But let's start with the other part of the question. Elliot, what do you make of it?

Dr. ARONSON: Well, I just think that in this country, we believe in religion. We believe in belief. And I think if an individual indicates that he believes in something sacred, something beyond himself, that's good enough. I think it would be very hard for an atheist to be elected president of this country, but I think we've gotten to the point where people of faith don't demand proof about the veracity of another person's faith so that religion is a major blanket that covers a lot of people.

PALCA: Okay.

Dr. ARONSON: You know, I got myself into this by talking about politics so quickly. Joe, I just wanted to make sure that you understand or your listeners understand that this theory is not simply about politics.

PALCA: No. Well, maybe...

Dr. ARONSON: And maybe in order to do that, I should give you one other clear example, or several. Our book is about things like: how is it that there are hundreds of people languishing in prison for major crimes like rape or murder, and then DNA evidence turns up showing - for example - that a person who was convicted of rape and has been spending the past 20 years in prison couldn't have committed the crime. And yet, more often than not, prosecuting attorneys will not want to reopen the case. And it's easy to conclude that these people are simply evil.

What I think has happened is they've convinced themselves that they couldn't possibly have made a mistake. If I'm the prosecutor and I convicted this guy and I sent him to prison, I think I'm a smart and moral person. Therefore, it would be horrendous for me to believe that somebody has been languishing in prison for 20 years because I made a blunder. Therefore, I convince myself that regardless of what the DNA evidence shows, that's the guy that did it and I'll keep him in prison for another 20 years. That's the ironic tragedy of this kind of situation.

PALCA: Let's take another call now and go to Jennifer(ph) in Cleveland, Ohio. Jennifer, welcome to SCIENCE FRIDAY.

JENNIFER (Caller): Hi. I have a question for Dr. Aronson about gender differences. I wondered if he had noticed any difference between the way that women deal with cognitive dissonance and the way that men do, given the fact that women often get some reputation for being so analytical and always thinking about conversations after they've taken place and wondering over and over if we've done something that, perhaps, was contrary to our general behavior or personality.

PALCA: Jennifer, interesting.

Dr. ARONSON: Jennifer, that's a very good question. It turns out that there aren't gender differences - women do it at about the same rate and the same level that men do. There aren't any important cultural differences either. One of the reasons we know that the brain is hard-wired for self-justification is that cognitive dissonance and dissonance reduction exist, in one form or another, in every single culture in the world where it's been studied. So there are no real gender differences, there are no significant cultural differences, and it shows up in brain scans. Drew Westen, who's become quite famous recently, did a wonderful fMRI study where it showed that when a person...

PALCA: Wait a minute, fMRI, that's a brain scan kind of a...

Dr. ARONSON: A brain scan, yeah. It showed that when a person is experiencing dissonance, his thought processes shut down. And when he starts reducing dissonance, the brain centers that show pleasure light up like a Christmas tree.

JENNIFER: That's great.

Dr. ARONSON: So it produces pleasure when you reduce the dissonance, and it seems to be universal.

PALCA: Jennifer, thanks very much for the call. Appreciate it. Let's take another call now and go to John(ph) in Davenport. John, welcome to the program.

JOHN (Caller): Hello. Are you talking to me?

PALCA: Yes, I think so.

JOHN: Does cognitive dissonance have an effect on mental illness?

PALCA: Okay. Elliot?

Dr. ARONSON: Can you elaborate on that? What kind of effect did you have in mind?

JOHN: If you wish. I'm a paranoid schizophrenic. I came up with the term cognitive dissonance about 13 years ago, then Googled it, and found out that it already existed. And I think that what happens - at least in my case with mental illness - is what you've been describing about cognitive dissonance. That it starts out with a person whether - if they're predisposed genetically to mental illness, if they have some difficulty in squaring something they've done with their conscience, that it puts them off kilter.

Dr. ARONSON: I think that's right. I think that is indeed a very good statement because I think that one of the benefits of being able to reduce dissonance is to smooth out those rough edges. And you see it - so you see that for a lot of minor things where important stakes are not involved, it's - reducing dissonance is a really good idea. But again, as I pointed out, like if you're a prosecuting attorney, you don't want to be reducing dissonance in order to protect your own ego at the expense of letting an innocent man rot in prison.

PALCA: John, thanks very much for that call. Elliot, in the story - in the book, I mean, there are descriptions of how - also you mentioned prosecutors, but also sometimes psychologists and psychiatrists will experience and behave in response to cognitive dissonance. Explain that.

Dr. ARONSON: Well, as you may recall, in the '80s and '90s there was an epidemic of women in psychotherapy - or who had read a vivid book about it - who suddenly recalled that they had been sexually abused as children or adolescents, sometimes for up to 15 or 20 years, by an adult in the family, usually their father. They didn't remember it. But, you know, someone would learn in psychotherapy that they'd been sexually abused from the time they were five until the time they were 15 and then repressed it.

Well, this was an epidemic that was caused by psychotherapists who had a theory that if a person had particular presenting symptoms - like having difficulty in her relationships with members of the opposite sex, low self-esteem, and a few other things - that meant she may have been abused sexually as a child. And if she couldn't remember it, then, oh, that's because she repressed it. And well-meaning therapists did harm in that situation: in order to follow what their theory dictated, they ended up convincing their patients that they had been abused when they really hadn't been.

As far as we know, there's no such thing as repression, that is, people who have been through terrible experiences, especially over a great length of time - repeated experiences - their problem isn't that they can't remember them, their problem is that they can't forget them.

PALCA: And so, how do you get out of this? How does a therapist - I mean, did therapists suddenly get conked on the head and say, you know, wow, we've been going down the wrong path here?

Dr. ARONSON: A great many did. But again, as you can see, the theory predicts that it would be very difficult for them to do that. So many of them began to question the data. And when their own patients began to recant and say, wait a minute, this couldn't have happened. My father didn't do it. I was sleeping in the same bedroom with three siblings. They would have noticed it and they didn't. Often, the therapist would remain convinced that abuse had occurred because they had done so much damage in breaking up families that it would be difficult for them to own up to the fact that they might have made a mistake.

PALCA: Let's take another call now and go to John(ph) in Duluth, Minnesota. John, welcome to the program.

JOHN (Caller): Hello.

PALCA: Hello.

JOHN: Yes. I want to ask the doctor about the psychological benefits of contrition, which should be sort of an opposite tack to take - when you make a big mistake - to say whoops, I'm sorry. I'm a little ashamed of myself, but own up to it and move beyond it. And it seems that - I guess, I would be an advocate for contrition versus dissonance, which seems to be a bit of a dodge.

Dr. ARONSON: Me too. I agree. If you make a mistake, the first thing you have to do is become aware of it. You see - and I hope you understand that what I'm saying here is that people are often not aware that they've made a mistake, that they've done something wrong. But to be able to become aware that you have made the mistake - and then if the mistake involves another person - for example, if you're a physician and you commit a blunder that causes pain or suffering to one of your patients, I think contrition and owning up to it is a wonderful thing to do. It's cleansing. And indeed, it turns out that when a physician owns up to having made a mistake and validates the patient's own belief, a malpractice suit is less likely than if the physician doggedly maintains that he didn't make a mistake.

PALCA: Is there not also a benefit from this kind of owning up, as you call it, that would be, perhaps, even more long-term beneficial? Or, is the short-term reward for telling yourself you didn't actually make a mistake in the first place too alluring for most people?

Dr. ARONSON: Well, the short-term benefits can be alluring. But again, if it's a big mistake and a person is really attempting to be honest and is aware of the possibility of covering up and how that takes place, it can be enormously beneficial to own up. My poster boy for this, who is a great hero of mine, is a man named Wayne Hale, who was the operations officer at NASA, who was in charge of saying yes to the launching of the Columbia space shuttle that ended up in disaster with all the astronauts onboard being killed.

Initially, when he first heard about it - these people were his friends, he knew these people - imagine being in that position where you said it's okay to go and the thing blows up, killing everybody onboard. Imagine how powerful that is. His first impulse was to say, well, you know, look. No launch is perfect. There was some problem with it, but it seemed good enough and so I gave the go-ahead. And then when he thought about it for a while, he said, no. As I look at it now, it was right there in front of me. The negatives outweighed the positives. I made a terrible mistake. I should have aborted that mission before it left the ground. Now, that took an enormous amount of courage. The end result was that Wayne Hale was promoted.

PALCA: Elliot, I have to...

Dr. ARONSON: And I think that was a wise decision.

PALCA: I have to interrupt to say you're listening to TALK OF THE NATION from NPR News. Sorry. We'll just come back on that for one second.

So in the end, that was a much more beneficial thing for him to do.

Dr. ARONSON: For him and for everybody else involved. I mean, I think it was cleansing and it didn't cost him anything in terms of his job. Now, that of course depends on how open-minded the people are around him. But I would want somebody working for me who owned up to mistakes like that, even horrendous ones, because the important thing is when you own up to a mistake, when you examine yourself and stop reducing dissonance, then you're less likely to make the same mistake again. But if you don't own up, if you keep digging yourself deeper and deeper in the hole, you'll never get out of it.

PALCA: We have time for one more quick question. Let's go to LeVonne(ph) in Madison, Wisconsin. LeVonne, you're on the air.

LEVONNE (Caller): Hi. I wanted to know about the connection between cognitive dissonance and treating depression. I have a teenage daughter who suffers. And I think that this is huge. This is like the answer to my struggles with her.

PALCA: Okay. Elliot, can you help at all?

Dr. ARONSON: In treating depression - can you tell me exactly how you think that would have - might apply to your daughter?

PALCA: No, I don't think - we had to go on to another caller. So is there any kind of approach that would work with cognitive dissonance and depression?

Dr. ARONSON: I think that, in a case of depression - the reason I asked the question is - in the case of depression, it may be caused by the person being unable to reduce dissonance over small things. And in that case, you would want to help them understand that the small things don't matter, that it's okay, that we all make small mistakes, that they shouldn't be beating themselves up over it.

PALCA: All right.

Dr. ARONSON: And if that - you know, I hate to give psychiatric advice in a vacuum...

PALCA: Right.

Dr. ARONSON: ...and that's why I asked the question.

PALCA: Right. Well, no, it's fair enough, but unfortunately, we had to drop that call because we've run out of time for this segment. So Elliot, thanks very much for coming on and sharing your thoughts with us.

Dr. ARONSON: Joe, it was a great pleasure being with you.

PALCA: Elliot Aronson is a social psychologist and co-author, with Carol Tavris, of "Mistakes Were Made (But Not by Me)." And there's an excerpt from the book on our Web site, npr.org.

We have to take a short break now. But when we come back, we'll be switching from the painful talk of psychology and cognitive dissonance - which is painful to some - to the more happy pleasure, let's say, of picking good fruit. So stick with us.

(Soundbite of music)


Copyright © 2007 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.