How Scientists Can Police Themselves

How do scientists deal with sloppy or shoddy science? A survey found that researchers were often able to deal with minor misconduct informally. Gerald Koocher, one of the scientists behind the survey and co-author of a handbook for dealing with research misconduct, explains.

JOE PALCA, host:

This is SCIENCE FRIDAY from NPR. I'm Joe Palca. Ira Flatow's away.

Later in the hour, we'll be talking about silk and viruses in your genome, but first, research ethics. Like any field of human endeavor, there are people in science who break the rules. Maybe they leave out some contradictory data or touch up an ambiguous picture to make it seem clearer or plagiarize someone else's work.

But for science, this is a special problem. There is no police squad to keep scientists honest, and sometimes it's hard to tell the difference between intentional wrongdoing and honest error.

Well, some social scientists have taken a look at this problem, and they think they have some solutions to offer, and that's what we're going to talk about.

So give us a call. Our number is 800-989-8255, that's 800-989-TALK. If you're on Twitter, you can tweet us your questions by writing the @ sign, followed by scifri. And if you want more information about what we're talking about this hour, go to our website at, where you'll find links to our topic.

Now let me introduce my guest. Gerald Koocher is a professor of psychology and associate provost at Simmons College in Boston. He's also the author of an opinion paper on this topic that appeared in last week's edition of the journal Nature. Welcome to the program, Dr. Koocher.

Dr. GERALD KOOCHER (Professor of Psychology, Associate Provost, Simmons College): Hi, Joe.

PALCA: Hi. So I guess the first question is, you know, is this a problem in - is this a solution in search of a problem, or is there a problem here with research ethics?

Dr. KOOCHER: Well, there's a very real problem. I don't want to suggest that all of science, by any means, is contaminated, but we surveyed 2,600 biological, biomedical and behavioral science researchers, and they told us about more than 3,000 incidents that they had run across in their career where someone was cheating or doing something that was likely to contaminate the research record.

PALCA: So what does - so does that mean that these - I mean, these are the people that responded to your survey. So maybe there's some, you know, bias in who decided to respond, but are you saying that this is something that at least at a low level is occurring on a rather regular basis?

Dr. KOOCHER: It occurs with some degree of regularity, and of course, there are different levels of intensity or severity of the problem.

PALCA: So what's an example of the kind of thing - I mean, obviously, if someone makes data up from whole cloth, that's terrible, that's just, that's lying. But if - what are the kinds of things that we're talking about that sort of are in between that and something that might be considered questionable?

Dr. KOOCHER: The kinds of things that the federal government focuses on for federally funded research is mostly what's called F, F and P: fabrication, which is making up data out of whole cloth; falsification, which is modifying your data to fit your needs; or plagiarism, passing off someone else's work as your own.

But we are also concerned, for example, about questionable authorship practices, where you take credit for something that someone else really did most of the work on or where you list an honorific author in the hopes that their prestige will get you published, or when you are careless, such as sloppy recordkeeping, or when you intentionally rig your samples so that - or your methods so that you bias the results; when you don't adequately supervise your research assistant so that some mistakes are made and never detected, and inappropriate data gets incorporated in the analyses.

PALCA: So what's - okay. So this is bad, and nobody likes it, but is there really a problem? I mean, what happens if this stuff gets into the literature? It's wrong, but so what?

Dr. KOOCHER: Well, people will continue to rely on it and cite it, and as I'm sure you and others know, there have been some rather sensational cases in the biomedical area where people's medical treatment was modified based on research which was inaccurate.

And sometimes because the studies that are required are expensive to conduct or lengthy to carry out, they're unlikely to be replicated, and so it may be a while or never before some of these errors get picked up.

PALCA: So is there a particular study that you're thinking about, that comes to mind, where somebody's medical treatment or some pattern of medical treatment was actually based on misunderstood or misrepresented data?

Dr. KOOCHER: Well, there are two types of studies in particular. One was a cardiology study where people were enrolled who really didn't fit the criteria of what was supposed to be the patient. And so there were some people in the sample who shouldn't have been there and biased the treatment outcome results.

There was also at least one cancer chemotherapy study which suggested that women who had breast cancer needed higher levels or longer durations of tamoxifen treatment than was actually necessary.

PALCA: Yeah, and I'm recalling there was also the one in - about bone marrow transplants for women with breast cancer that was based on one study that was found to have errors in it.

Dr. KOOCHER: Correct.

PALCA: So yes, I guess it happens. Well, we have people interested in talking about this. So let's take a call now and go to Paul(ph) in Wichita, Kansas. Paul, welcome to SCIENCE FRIDAY. You're on the air.

PAUL (Caller): Thank you. I'm currently studying for a doctoral degree in finance. It's not science-related, I suppose, but we have - we also come across the same set of problems.

And I'm wondering if the speaker can talk about, do they see the problem as being more of a moral issue, that they just want to try to get published, or is it more something that they're just sloppy? Because we get the same thing.

PALCA: Yeah, thanks for that call. Dr. Koocher, what do you think?

Dr. KOOCHER: All varieties. Certainly, there are some people who are clueless about the fact that they're doing the wrong statistical analysis or that someone's been careless. But there are also people who deliberately cheat.

They rationalize away their scientific principles, or because of some potential gain or pressure on them, they compromise them, and also they think that they're going to get away with it.

PALCA: Yeah, yeah. Well, what are your - what do you see as - you're proposing a solution here, or at least part of a solution. What do you see as - what is that?

Dr. KOOCHER: Well, my three colleagues who worked on this project together named the project Colleagues - Collegial - Colleagues as a Defense Against Bad Science. And our goal was to mobilize our colleagues to speak out when they see something happening because your lab buddy or your partner down the hall or someone at your institution may notice something that they could call out to you.

And we actually published a guide called "Responding to Research Wrongdoing," which is distributed free over our website,

PALCA: Right, but I looked at that, and here's the question. I mean, I worked in a lab myself once upon a time, and it would take a lot of nerve for me as a lab technician to go up to the head of the lab and say I think you're doing something wrong. Does that happen? Are people willing to do that?

Dr. KOOCHER: It does happen. You hit the nail on the head, though. It is far easier to speak out about someone who is lower on you than the - on the totem pole. If you, working as a student or a post-doc in the lab, were to notice something that your boss was doing, it would be much harder to speak out.

So one of the things we suggest is some gentle alternatives, ways to approach and raise the issue without being intimidating.

PALCA: Right.

Dr. KOOCHER: And yes, there are some people who will be nasty and who will be obnoxious about it, but if you use some of these gentle alternatives, you may at least chasten the person and get them to think twice.

PALCA: Right. So "you lying cheat" is probably not your opening line.

Dr. KOOCHER: Exactly correct.

PALCA: Okay. Let's take another call now and go to Flora(ph) in Glen Burnie. I guess that's Glen Burnie, Maryland.

LAURA (Caller): Yes, it's Laura from Glen Burnie, Maryland.

PALCA: Go ahead, you're on the air.

LAURA: Very interesting show. My question, I was told to keep it brief, but peer review, you know, we have allowed, you know, the medical and scientific community to police itself. And if there's a problem with this, and of course there have been famous cases of, you know, people claiming that they did things, and it turned out it was wrong, but eventually, it got revealed. But...

PALCA: Okay, well, let me ask Dr. Koocher. What about peer review? Do you think that peer review is going to catch these things or not? What is peer review? Maybe you can review that.

Dr. KOOCHER: Well, peer review refers to the idea of when you want to publish your research, it is sent by the journal editor or the publication editor out to experts who take a look at it, most of the time not knowing who wrote it, and comment and review it for scientific rigor. And that will pick up many problems, in particular carelessness or sometimes forced statistical analysis.

However, the peer review will not be able to detect something that went wrong before the manuscript was written and sent in.

PALCA: I see. I see. And so these things - as I said, they're not always, it's not always obvious that something has been tinkered with. It could just be hidden, and it could take a long time before somebody discovers it.

Dr. KOOCHER: That's right. And in some fields, particularly biomedical research, you may have tissue samples or other hard things that you can go look at. In behavioral science, mostly we might have interview data or questionnaire data, and the person who filled it out is long gone by the time the publication results.

PALCA: Right. So how are you going to know if this approach, this gentle persuasion on the part of lab members, is working?

Dr. KOOCHER: Well, there are a number of ways. One is if you see some direct effect of it. We - based on the survey we did, however, a substantial number of people were able to correct the problem with informal means.

So based on these anecdotes, we think that there is a high likelihood of being effective. At least 30 to 40 percent of the interventions seem to be effective.

PALCA: But these are - again, you're looking at what people have been telling you. You haven't actually been able to go and observe this, or you have to rely on people's reports.

Dr. KOOCHER: No, that's right. We are relying on people's responses to an anonymous survey. The people we surveyed were very senior. They were all people who were principal investigators on federally funded grants. So they weren't research novices.

PALCA: Right, and the government has this Office of Research Integrity, I think it's called now. How have they responded to this?

Dr. KOOCHER: Well, ORI was actually one of the agencies that gave us the grant to do the study, and we're grateful to them and the National Institute of Neurological Disorders and Stroke, who funded us, but ORI necessarily only focuses on the recipients of federal funds, and they are really only concerned with the F, F and P - that's fabrication, falsification and plagiarism.

So they don't concern themselves with what, according to our data, are some other very significant potential problems.

PALCA: Just one quick example of something that's not F, F, P but that's a big problem still.

Dr. KOOCHER: Well, for example carelessness. You have sloppy recordkeeping, or one of your research assistants isn't running the rats at night when they told you they were doing it.

PALCA: Okay.

Dr. KOOCHER: And you're not monitoring them.

PALCA: All right. I get the picture. Well, I'm afraid we're going to have to leave it there. Dr. Koocher, thank you very much, an interesting topic.

Dr. KOOCHER: Thank you, Joe.

PALCA: That was Dr. Gerry Koocher. He's a professor of psychology and associate provost of Simmons College in Boston. You can find out more about his work at their website, [POST-BROADCAST CORRECTION: Their website is]

Copyright © 2010 NPR. All rights reserved. Visit our website terms of use and permissions pages at for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.