Copyright ©2009 NPR. For personal, noncommercial use only. See Terms of Use. For other uses, prior permission required.

JOE PALCA, host:

From NPR News, this is SCIENCE FRIDAY. I'm Joe Palca. If you visit your doctor and go home with a prescription for blood pressure medication, you can be pretty sure that the drug has been tested and proven to work for your condition. The Food and Drug Administration demands it. But can you say the same for mental health treatments, that they've been scientifically proven to work? I'm not talking about antidepressants and other medications. I'm talking about talk therapy, cognitive behavioral therapy. Should these therapies be evaluated to make sure they are, you know, in the same sense, safe and effective?

Some psychologists are saying yes, that there's just not enough science in psychotherapy today. And they're saying some therapists are actually ignoring treatments that have been proven effective in randomized trials, relying on their own experience instead. That's one topic of discussion in a recent report in the journal Psychological Science in the Public Interest.

For the rest of this hour, we're going to talk about these issues, why some psychologists think science can improve the practice and how much weight to give to the treatment versus the therapist giving it. So give us a call. Our number is 1-800-989-8255. That's 800-989-TALK.

And now, let me introduce my guests. Richard McFall is the executive director of the Psychological Clinical Science Accreditation System. He's also professor emeritus in the department of psychological and brain sciences at Indiana University in Bloomington. He's one of the authors of the recent report on this topic, and he joins us from the studios of WFIU. Welcome to SCIENCE FRIDAY, Dr. McFall.

Dr. RICHARD McFALL (Executive Director, Psychological Clinical Science Accreditation System): Thank you.

PALCA: We also are joined by Bruce Wampold. He's a professor and chair of the department of counseling psychology at the University of Wisconsin-Madison, and a clinical professor in the department of psychiatry there. He joins us from the studios of Wisconsin Public Radio. Welcome to SCIENCE FRIDAY, Dr. Wampold.

Dr. BRUCE WAMPOLD (University of Wisconsin, Madison): Good afternoon, Joe.

PALCA: And finally, Dianne Chambless is the Merriam Term Professor of Psychology at the University of Pennsylvania in Philadelphia. She's also director of clinical training in the department of psychology there. Welcome to SCIENCE FRIDAY, Dr. Chambless.

Dr. DIANNE CHAMBLESS (University of Pennsylvania, Philadelphia): Thank you.

PALCA: All right. So, Dr. McFall, let's start with you. Tell us briefly, you wrote this paper, what were you and your colleagues arguing for?

Dr. McFALL: We were arguing to do the best that we could to improve the scientific basis for psychology, for clinical psychology. We felt that in the 60-some years since clinical psychology became organized as a sub-specialty within psychology that not enough attention had been paid to the advances that had been made in the science and that we need to do whatever we can to close the gap between science and practice.

PALCA: Well, give me an example. Sorry, go ahead.

Dr. McFALL: I was just going to say that the way to do that that we proposed in the monograph was to organize a new accrediting system that would promote science-centered doctoral training in clinical psychology.

PALCA: So - but I'm unclear. You're not suggesting that a clinician do the research. You're suggesting that clinicians evaluate the research that's being done by others.

Dr. McFALL: That's right. We're saying that what the public should be receiving in the way of treatments when they go to see a psychotherapist should be based on the best science available.

PALCA: Okay. Well, Dr. Bruce Wampold, then, you've read this article, and that seems to make sense, getting your treatment based on the best science available. Is there something wrong with this argument?

Dr. WAMPOLD: Well, there is. It seems eminently logical. When we go to a doctor, we want the best possible treatment, and we expect that doctor to deliver that treatment to us or at least give us options among several effective treatments.

The difference is, in psychotherapy, what we find is that all treatments that are really intended to be therapeutic and are given by competent therapists are all about equally effective. It's different from medicine in that, as long as there's a coherent treatment and a rationale given to the patient and a reasonable set of therapeutic actions, the outcomes are comparable.

So what seems to make a difference in psychotherapy, the research shows, is the skill of the therapist giving the treatment. So within treatments, like cognitive behavioral treatment, there's some therapists that get consistently better outcomes than other therapists, but that's true of the other treatments, as well. So what's really more important is that you have a skilled therapist. And we can talk a little bit later about what those skills might be.

But it's - in that way, it's different than medicine. And so when we say we should pay attention to the science, the science is really showing that psychotherapy works somewhat differently than medicine and that we shouldn't anoint some treatments as particularly scientifically based when it hasn't been shown that those treatments are more effective than other treatments.

PALCA: Okay, and Dianne Chambless, would you agree with that, or is an individual more important than the therapy he or she employs?

Dr. CHAMBLESS: Well, I think they're both important. So I wouldn't agree with Dr. Wampold that the therapist is necessarily the only important ingredient. And I think it also depends on the kind of problem that we're talking about. I think many problems that people come into therapy for are problems in living that, if they can talk out with a person who listens well and forms a good relationship with them and helps them sort things out, that's sufficient.

My concern is more for the people who have significant psychological disorder, that in that case, I think we owe the public, we owe the people who pay for treatment, to know whether the treatment that we're going to provide works. And so I think it's important to know whether there is scientific evidence that supports whether a treatment works for a particular problem. And saying that we don't yet know if that treatment works better than every other possible imaginable treatment, that's never going to happen. We're never going to have all that evidence.

So I would put the emphasis on, do we know that a given treatment works? I would also say this is not so different from medicine in that we do know that doctor-patient relationship matters in medicine, as well, and different surgeons get different outcomes. So I think, you know, that these things really cut across health care.

PALCA: Well, I want to ask you if you can define for me what "works" means in this context. I mean, is it self-reporting from a patient saying, I feel better, or is it the doctor saying, I've made this patient better, or is there some other, you know, more measurable outcome that you can use?

Dr. CHAMBLESS: Well, you know, there are different outcomes that you look at for different disorders. So it would be typical in a study on psychotherapy outcome to have measures that the patient fills out about whether he or she, you know, has changed in treatment but also to have an independent evaluator who's not part of the treatment team and doesn't know what treatment the person has received also do interviews with that person to assess their outcomes.

And then if there are very specific behaviors involved, we can do, you know, other kinds of behavioral assessment. For example, if you know, if we were working with someone who was afraid of being contaminated by public restrooms, we could see whether that person could, in fact, go into a public restroom and, you know, wash his or her hands there and so forth. So it really kind of depends to some degree on the disorder, what the measure is that we use.

PALCA: All right. Well, let's invite our audience, our listeners, to have a say in this conversation. And let's take a call now from Phillip(ph) in Olema, California. Phillip, welcome to SCIENCE FRIDAY. You're on the air.

PHILLIP (Caller): Yeah, great, thank you for having me on the air. I was trying to draw a succinct comment, write one out here. I had some therapy through a very, very effective psychiatrist recently to help me get through a divorce process and the depression that had preceded it and was included in the process. And the thing that was so effective for me was his willingness and his understanding of things beyond what we classify as scientifically provable and talked about spirituality and psychic abilities and various other things that would seem totally discountable by science, including, you know, the idea of past lives, things like that.

But it opened up something in me that allowed me to understand where these feelings of depression were coming from and life events that also contributed. So it was my world view, as well as my life experiences. So - but it couldn't be quantified by scientific standards, I don't believe. So…

PALCA: Well, that's the interesting question. So let me put that to Dr. McFall. Here you have somebody who says, my doctor helped me. And you know, you couldn't begin to start talking spiritually to your patients in a scientific sense - or maybe you could - but what do you do when you hear this kind of a story, Dr. McFall?

Dr. McFALL: We have no boundaries on what we're willing to consider as part of an effective treatment. The question is, if a therapist makes a claim that he or she is able to help somebody, they ought to be able to say what it is that they do, and they ought to be able to point to the evidence that what they're doing is effective. That is, that it does what they claim. And they ought to be able to show that the research evidence shows that it is not harmful.

This is the basic kind of standard that the FDA has for any kind of intervention. The problem is that current practice in psychology has no accountability standard.

And when Bruce Wampold says, for example, that all treatments are equal, that's really an overstatement. The research evidence that he's referring to is a set of comparisons of particular procedures. But that doesn't mean that those are the procedures that are being implemented by practitioners in the real world. There's nobody out there holding the practitioner to any particular standard.

What we're saying then is that practitioners should be held to some sort of standard of evidence that when they claim that they're doing something beneficial, they can back that up.

PALCA: But I think this is - I mean, this seems to me to be where the key question is residing because we just heard from someone who said my practitioner helped me by understanding my problem and giving me spiritual guidance and some past lives and other things like that. How could you look into the literature and find out if that's a particular kind of therapy that would help somebody?

Dr. McFALL: Well, you can't do it on an individual basis.

PALCA: Okay.

Dr. McFALL: What you need to do is to have a sample of people who go in with comparable problems and measure the effect of any particular intervention. It could be having somebody stand on their head in the corner. We're not saying that it's limited to something that only fits some pre-existing concept.

What we're saying is that if you make a claim that you have a treatment that's effective, you ought to be able to back that up. And of course we would learn something if we found that a particular procedure was effective and we'd want to do further research to understand why it was and how it relates to the underlying problem and so forth. That's how science advances.

PALCA: Right. So let me ask, Dr. Wampold, then: if it's the practitioner that's really the key, if you're right that these treatments are comparable, how does a particular practitioner validate his or her effectiveness? I mean, presumably the patient wants to go to an effective therapist, not somebody who's not good, and you're saying it depends on the practitioner. How do you find out from a practitioner whether he's any good or not?

Dr. WAMPOLD: Well, that's a great question. And there is a movement I want to talk about just for a minute in which you actually collect evidence in your practice. And so if therapists measure the progress of patients in their care, then they could say and make the claim that what I'm doing with this particular patient and my patients in general is effective.

So when Dr. McFall says we ought to be accountable for the treatments that we give, what I would say instead is that we ought to be accountable for the outcomes that we obtain with our individual patients. So Phillip didn't get an evidence-based treatment in the way that would be defined, but he made significant progress. And we have measures that could be given by psychotherapists that would measure that progress and say, you know, I meet the benchmark of what effects are obtained in clinical trials.

We found in a study of 10,000 or 12,000 patients that clinicians, when they treat depression, are already producing effects or benefits that are comparable to what's obtained in clinical trials.

So the accountability really ought to be with the individual patients, and we want to be able to say this patient made adequate progress. If they're not making adequate progress, then this is an issue that the patient and the therapist ought to consider within the therapy.

PALCA: We're talking about the scientific basis of psychotherapy - whether that basis can be strengthened, and whether it needs to be.

I'm Joe Palca and this is SCIENCE FRIDAY from NPR News.

Dr. Chambless, what do you make of that? You're interested in evidence-based outcomes. I mean, is this enough to do it based on - does the patient get better or do you need to look more broadly than that?

Dr. CHAMBLESS: Well, I have two reactions to that. One is that what Dr. Wampold is proposing is a great step. I think that everyone should be evaluating the results of his or her own practice. And Professor Michael Lambert at Brigham Young University has done a lot of research on people evaluating clients in ordinary practice. And what he's shown is that therapists often aren't aware that their clients are either deteriorating or not improving in the way that they could be expected to improve, and that if they get feedback on the fact that people are not doing well, that helps them work better with that person. So I think that that's a great step forward.

But my second concern is I don't think it's enough, for this reason. I think we also care about not only whether the person gets better but whether the person is getting better because of what we are doing.

So one of the problems with something like depression is that depression often gets better over time. The problem is that people relapse. But it would be easy as a therapist to think that the person was getting better because of what he or she did when it was actually the passage of time, the relationships the person had outside of therapy and so forth that were accounting for the change.

So that's why we need the controlled studies as well as the kind of study that Dr. Wampold is talking about - to know whether, in fact, the causal element, what we can attribute the change to, is what's being done by the therapist with the client.

PALCA: But let me ask you just very briefly. I mean, when you talk about evaluating these things - I mean, Dr. McFall was saying, facetiously, we could test people standing on their heads as a therapy; he's not opposed to anything - but you've got so many different possible approaches to so many different problems. How are you ever going to do, you know, the kind of size of study that's going to tell you what's really working or not?

Dr. CHAMBLESS: Well, you can't possibly test them all. But there are some major approaches. And I think it's important for us to know, you know, in terms of the major approaches, how well they work for particular problems.

So for example, let's take obsessive-compulsive disorder, which is a very severe disorder that destroys people's lives. And you know, we used to think - back in the '60s or something - that we should use relaxation approaches with them because they had anxiety problems, and relaxation approaches seemed to help other patients with anxiety.

PALCA: Dr. Chambless, I'm going to have to cut you off there. We can come back to that…

Dr. CHAMBLESS: Okay.

PALCA: …about obsessive-compulsive disorder. But we do have to take a short break. We're talking about the scientific basis of clinical psychology, whether it's adequate or whether it needs to be improved. And we're taking your calls at 800-989-8255. Stay with us. We have to take a short break.

(Soundbite of music)

PALCA: From NPR News, this is SCIENCE FRIDAY. I'm Joe Palca.

Should psychology be a little more scientific? And are our psychologists using the treatments that are proven to be effective? That's what we're talking about this hour with my guests, Richard McFall; he's the executive director of the Psychological Clinical Science Accreditation System. He's also professor emeritus at the Department of Psychological and Brain Sciences at Indiana University in Bloomington. Bruce Wampold is professor and chair of the Department of Counseling Psychology at the University of Wisconsin-Madison. And Dianne Chambless is the Merriam Term Professor of Psychology at the University of Pennsylvania in Philadelphia.

And Dr. Chambless, when we went to the break, you were talking about - I was raising the question about, you know, can you in fact study all the possible therapies out there. And you were narrowing that large field down to perhaps a smaller one where you would take a particular treatment for a particularly well documented problem and see if it worked.

Dr. CHAMBLESS: That's right, and to look at some of the major approaches. So I wanted to give an example of how the science matters. And I was saying that back in the '60s, early '70s, we were treating obsessive-compulsive disorder with relaxation approaches because that seemed to help other anxiety disorders. And we weren't really getting anywhere.

And then in England people developed a new treatment called exposure and ritual prevention. And we read about this. I was in training. And we started with great excitement to try this new treatment, and it worked. So we weren't different as therapists. We were still the same caring people that we used to be. But we tried a new treatment that was based on the science of fear and learning. It worked. It meant some people who were crippled by their problems became functional people, and around 70 percent of them got very much better.

And yet, 40 years later, you know, when people have done surveys of practicing psychologists they've found, what are they doing with people with obsessive compulsive disorder? Well, more of them are using relaxation techniques than are using exposure and ritual prevention.

This is the sort of thing that we're trying to change, that we think that people in practice need to know when there are particular treatments that are helpful for particular kinds of patients. And they either ought to know how to do those treatments or how to refer to somebody who knows how to do them.

PALCA: Okay. Let's take a call now and go to Stephen(ph) in Arlington, Virginia.

Stephen, welcome to SCIENCE FRIDAY. You're on the air.

STEPHEN (Caller): Thank you for having me on.

PALCA: Sure.

STEPHEN: I'm calling as a psychologist, but also as someone who had chaired an APA-accredited doctoral program.

PALCA: Uh-huh.

STEPHEN: And part of - your speakers have talked about a kind of need for a new accreditation system, coming out of this sort of feeling that the science is not being taught. And I guess I take issue with that, because I believe that the current accreditation system for psychology is adequate. It requires students to be trained in and get exposure to evidence-based practice and empirically supported treatments.

And so I sort of object to what I think is sort of kind of a little bit of a self-serving point on their part in terms of arguing for a different accreditation system. All of the programs require that kind of exposure in terms of the empirically supported treatments that your speakers have been talking about.

PALCA: Well, let me ask Dr. McFall. What's wrong with the current accreditation process?

Dr. McFALL: Well, I'm glad that we're actually back on that point because we're not really as concerned at this point with comparisons between treatment A and B as we are with comparisons between different types of training programs, and that's what the new accreditation system is about.

Over 50 percent of all the graduates from doctoral programs in the United States today are from Psy.D. programs.

PALCA: What's that?

Dr. McFALL: These are Doctor of Psychology programs, many of which - the majority of which - are free-standing, for-profit programs with large student bodies, small faculties, and often without university facilities of their own. And the training that is received in these programs is not the kind of scientific training that we think should be promoted. We're not saying that all Psy.D. programs are not good. We're just saying that this is sort of a proxy for what's happening in the field.

We're also not saying that all PhD programs in clinical psychology are excellent. We're just saying that there is evidence to show differences in the kind of training that's being provided across the different models of training. And some of the Psy.D. programs are actually hostile to the concept of science as we know it. That is, they don't believe that empirical science is relevant to their practice of psychology. So what we're trying to do�

PALCA: But - I just want to get - I want to get to Stephen's point, which is…

STEPHEN: Can I say something as a person who actually has run one of those programs? And so I think that you're wrong on that, I beg to differ, in the sense that any APA-accredited program, whether it be Psy.D. or Ph.D., has the same standards in terms of the coverage of research methodology, statistics, in terms of empirically supported treatments. You cannot get APA - that's the American Psychological Association - accreditation without having that adequate scientific training. And so…

PALCA: Okay. Stephen, let me get Dr. McFall to respond to that. Thanks for bringing that up. Go ahead, Dr. McFall.

Dr. McFALL: Well…

PALCA: I mean, he's saying that the training is there and you're saying it's not. I mean�

Dr. McFALL: No. It isn't there. I don't know the program that this person is from or the programs that he's reviewed. But if you look at the content of these programs, the doctoral programs that are called Psy.D. programs weren't designed to train research scientists - or even people who are knowledgeable about science. They're designed to train practitioners with a very limited amount of science. And what we're interested in doing is promoting the development of very strong science training programs for the purpose of turning out graduates who not only are able to contribute to the advancement of knowledge through their own research, but who are doing research that has implications for improving treatment and improving public welfare and public health.

PALCA: So, Dr. Wampold…

Dr. McFALL: And I don't - and I find it difficult to understand what the opposition to that kind of a goal is.

PALCA: Well, I mean, stated that way - no, of course not. But go ahead, Dr. Wampold. I'm sure you…

Dr. WAMPOLD: Well, yeah. I'd like to jump in because the caller mentioned that the American Psychological Association accredits programs, and their scientific training is very central to that accreditation system. The alternative accreditation system doesn't guarantee in any way that the quality of services of the providers trained in those programs are going to be any better. So there is this presumption that if someone is trained in one of these new clinical science programs that they're actually going to be better therapists.

So becoming a therapist involves clearly being knowledgeable about what the science is, about the disorders and about treatment, but it also takes a great amount of effort and supervision to learn how to deliver these treatments. And so it may well be that the graduates of these programs actually have less therapy training and maybe wouldn't be as qualified or as effective as therapists in other programs. So there is no scientific evidence that changing the accreditation system is going to produce more effective therapists.

And I come back to the point that the treatments we're giving now are very effective. And so there is this implication that somehow, because psychologists are ignoring science, they are giving less than adequate treatment. And there just isn't any evidence to support that.

PALCA: But…

Dr. WAMPOLD: There have been studies…

PALCA: Yeah, go ahead.

Dr. WAMPOLD: Yeah, I'll go ahead. There have been studies where we've transported evidence-based treatments to clinics. And often, there are few, if any, differences in the outcomes. And often, it involves additional training and supervision of the therapists, and even then you get small effects.

So we're in some ways addressing a problem that doesn't exist, in that psychotherapists are remarkably effective. When we do studies of the size of the effects in clinical practice or in clinical trials, we find that psychotherapy is as effective as medications for most mental disorders, and longer lasting. So I don't quite see what is to be accomplished here, except to kind of promulgate one view of what the scientific evidence is.

PALCA: Right. Well, let me turn to you, Dr. Chambless, and say would a better scientifically trained psychologist be better for patients? Would that help?

Dr. CHAMBLESS: I think that the new accreditation system will be training people who will themselves be the faculty members at other universities and will be teaching students. And what I would hope is going to come out of that is that students would learn to infuse their practice with science - that is, that they would be attuned to what are the best treatments for people with particular problems, they would know how to find that information in the databases that are out there about what treatment works, and that they would care about documenting, you know, the effects of their practice, and in addition be prepared to innovate and evaluate new treatments.

So I think the new accreditation system is not predominantly about training people who would themselves be on the front line, you know, doing therapy as their primary job, but rather about training the people who are going to be the leaders in the field. And do I think that it matters what other people know about the science? Yes, I do. I think - you know, I was giving the example about what happened when we knew about the science with obsessive-compulsive disorder. That was also true with other anxiety disorders.

And, you know, I personally experienced treating people when I was just a measly graduate student who had learned these treatments relatively recently - taking people who had been housebound with agoraphobia and had had years of therapy, and being able to help them, because I read this literature and talked to the people in England when we were doing this. And so, I think, you know, when you see the difference that learning about science-based treatments can make in people's lives, it's very hard to say that this is not important.

PALCA: Doctor…

Dr. WAMPOLD: Well, one of the implications that I think is unfair here is that psychologists are now not paying attention to the evidence. And again, as Stephen, the caller, said, APA accreditation has, at its core, education in the science - psychopathology as well as treatment. So I don't think it's the case that psychologists are ignoring the evidence.

Dr. McFALL: Well, I've got some evidence.

Dr. WAMPOLD: What the issue is…

Dr. McFALL: I actually would like to take…

Dr. WAMPOLD: Well, let me just…

PALCA: Let me hold on - let me ask both of you to hold on for a second, because I need to remind people that we're talking about the scientific basis of clinical psychology. And I'm Joe Palca, and this is SCIENCE FRIDAY from NPR News.

All right, now let's see if I can referee here. Dr. Wampold, let me let Dr. McFall have a word here.

Dr. McFALL: Well, there are two points I want to make. First of all, there have been surveys of psychologists in private practice - one in 2008, for example - that found that psychologists continue to rely on their own or their colleagues' clinical experience more than on the scientific literature when selecting treatments or strategies for working with their patients. And a majority of these people have a very vague notion of what the scientific evidence suggests about treatment outcomes.

But even if we accept for the moment that two treatments are equal in outcome - and that's disputable - is there a difference in how efficient these treatments are? This is something that hasn't even been brought up. But if you can achieve the same effect in eight sessions of exposure and ritual prevention (ERP) for obsessive-compulsive disorder, for example - which Dr. Chambless was talking about - when it takes 100 sessions of talk therapy with a different orientation, then efficiency and cost-effectiveness have to be criteria for determining treatments. What we're saying is that the science and the evidence should guide what we do in practice.

PALCA: Well, all right, Doctor…

Dr. McFALL: And to do that, you need scientific training…

PALCA: Right. Well, Doctor McFall…

Dr. McFALL: …to understand how to choose.

PALCA: �I'm not sure - I mean, I think you're being a little optimistic to think that people who are scientifically trained, if you're going to argue that physicians are scientifically trained, they don't make their decisions only exclusively based on the best medical evidence, and sometimes they also say this is what my colleagues and my fellow doctors do. But never mind, I think it's a point worth discussing, but I want to get one of our callers into the conversation.

Let's go to Rebecca in San Francisco. Rebecca, you're on the air. Thanks for talking to us.

REBECCA (Caller): Sure. Well, I wanted to bring up one aspect that hasn't been discussed yet, and that is the issue of complexity within counseling and psychotherapy that is different from medicine or biochemistry - that we're trying to equate research in medicine to research in counseling psychology and psychotherapy. And that is the sociocultural variables.

And those aren't easy to measure, but they're definitely relevant - again, in terms of the treatment outcomes. And one of the difficulties is that in most studies, such as what's been suggested there, it's very difficult to take into account all of the different variables that may be relevant here - people's cultural histories, ethnic identity, their sociopolitical experiences, sexual orientation, disabilities, their client's value orientation, their understanding of the problem and how that fits (unintelligible) cultural context. And unfortunately, historically, research hasn't really addressed that very well, and hasn't included that aspect very well.

PALCA: Okay.

REBECCA: And because it is such a complex variable, it's hard to stay (unintelligible) controlled studies and figure out what works for everyone.

PALCA: Let me get the - let me get our panel to respond to that. Dr. Chambless, what about that, the cultural variability?

Dr. CHAMBLESS: Well, I think that is important, and I think there's a lot more focus on that now than there used to be. So there's a major push to determine whether treatments are variably effective according to people's, you know, ethnicity, culture, race and socioeconomic status, and I think that's very important. And so the data, you know, will be coming, but it's hard. You have to do very large-scale studies to get information of that sort. And those are very expensive.

But the other thing is, I think, that it's always curious to me if someone says, well, you haven't done a study to show whether this treatment would be effective with somebody from this particular group; therefore, I'm just going to fly by the seat of my pants. To me that doesn't make sense. To me, you generalize - you draw from the best science you have, and then you try to determine whether that's going to work for this person, you know, of this particular kind, rather than saying, well, I'll just, you know, start as if we knew nothing.

So, again, I think we need to know very much more, but I think it's also important that we draw from what we do know. And to the extent that there have been, you know, studies done, it's not like we're seeing that treatments are that variably effective for people of one ethnicity or another, for example.

PALCA: Dr. Chambless, we have to leave it there. I thank you all very much. I know we're not going to resolve this today, but I'm glad we got a chance to discuss it.

My guests were Richard McFall; he's the executive director of the Psychological Clinical Science Accreditation System. Bruce Wampold; he's a professor and chair of the Department of Counseling Psychology at the University of Wisconsin-Madison. And you just heard Dianne Chambless. She's a professor of psychology at the University of Pennsylvania in Philadelphia. Thanks to all of you.


NPR transcripts are created on a rush deadline by a contractor for NPR, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of NPR's programming is the audio.
