Easily Accessible Info Blurs The Line Of Expertise

The Internet has made it possible for everyday people to have endless information at their fingertips. From celebrities weighing in on child vaccinations to science fiction authors disputing climate change studies, what happens to science when the line between expert and non-expert becomes fuzzy?

IRA FLATOW, host:

This is Talk of the Nation: Science Friday. I'm Ira Flatow. Would you take financial advice from your hairdresser? Maybe you do. Or for that matter, would you let your financial adviser cut your hair? I hope not, probably not.

So, why are you willing to take health advice from celebrities, even if their advice goes against that of professional medical experts? And how is it that, rather than listen to the opinion of the vast majority of scientists on global warming, some people listen to talk-show hosts or science-fiction writers when it comes to climate change?

It seems that when it comes down to science, we often don't trust our experts.

My next guest has spent a large part of his career studying scientists, those that study gravitational waves especially, and he's spent a lot of time considering what it means to be an expert. Harry Collins is distinguished research professor at the Cardiff School of Social Sciences in Cardiff, Wales.

He's also the director of the Centre for the Study of Knowledge Expertise and Science there. His most recent book, co-authored with Robert Evans, is called "Rethinking Expertise." He's on the line with us from Cardiff. Thank you for talking with us, Dr. Collins.

Dr. HARRY COLLINS (Centre for the Study of Knowledge Expertise and Science, Cardiff School of Social Sciences): My pleasure.

FLATOW: How did it come about - our distrust of scientists - when did this begin?

Dr. COLLINS: Well, from my neck of the woods, from where I work, it all began in about the 1960s, and I'm sorry to say that I was quite a big part of it. In the 1950s, the scientists had tremendous success after the Second World War.

It was said that, just as the chemists had won the First World War, the physicists won the Second World War. We had the promise of nuclear power and even some rash promises of power too cheap to meter. Well, that may be a myth, but that more or less indicated the sort of thing that people were thinking about science.

People just took it as natural that science was getting it right. And then, everything got turned upside down in the '60s. People stopped wearing suits and ties, and people - sex was invented, according to one of our most famous poets.

(Soundbite of laughter)

Dr. COLLINS: And at the same time, people like me started saying, well, maybe this story of science isn't quite right. And when you look much more closely at science, you see it isn't quite as exact a procedure, and it can't produce quite the certainty that people had thought it could.

FLATOW: And so the ambiguity, then, that scientists are used to dealing with was not comforting to the public.

Dr. COLLINS: Well, what we people in my academic neck of the woods did was to look very closely at the way scientists prove things. Let's say the normal way that we think scientists prove things is by repeating an experiment.

Let's suppose somebody does an experiment and they get a certain result, and then some other scientists say, well, we're not sure about that result, so we're going to check the experiment. And they check the experiment. And say they don't get that result. Then the textbooks will tell you, well, that result is disproved.

But I looked very closely at one of these incidents, in fact, a matter of somebody making claims about gravitational waves in the early days, in the late '60s and early '70s, a man called Joe Weber, who said he'd found gravitational waves. And most other scientists didn't believe him.

And a few others then started building similar apparatuses, similar experimental apparatuses to that built by Weber. And they said, look, we can't find the waves, so you must be wrong. But Weber turned to them and said, no, no, the reason you can't find the waves is that you haven't done the experiment well enough. You're just not as good an experimenter as I am.

And the problem is you can't really show who's the best experimenter. You see, showing who's the best experimenter is a matter of human judgment, not scientific judgment. So, Joe Weber could go on more or less indefinitely, telling these people, the reason you can't see the waves is you're no good.

And these other people could carry on telling Joe Weber, no, the reason that you can see the waves is that you haven't done the experiment right and you are no good. And in the end, it all seems to just settle out into some sort of consensus for reasons which are pretty mundane and pretty ordinary; people get fed up with Joe Weber or they don't get fed up with Joe Weber, and it just settles down into a consensus.

FLATOW: But now, we have everybody and their grandmother on talk shows, on television, pushing books, whatever, all touting themselves as experts on things.

Dr. COLLINS: Exactly, and the stuff that we did kind of set the scene. Not only us, but the whole movement known as postmodernism, which has been going on for about 30 years, set the scene for everybody and their grandmother to say, we're just as good as scientists, because all these people have shown us that science isn't really anything like as accurate or certain as it's said to be. And it really seems to have gone too far. And now, we have disastrous things happening.

So, for example, if I could give an example from my country, from the U.K., not so many years ago, a man called Andrew Wakefield said in a press conference that he thought the vaccine against mumps, measles and rubella, which was regularly used here, caused autism. Now, he had no evidence for this whatsoever.

But somehow, it got picked up in the newspapers and picked up in the rest of the media, and there was a panic. And people stopped injecting their children with the mumps, measles and rubella vaccine. And as a result, we've now got a bit of a measles epidemic in this country, which is actually killing kids, especially kids who are too ill to have a measles injection.

And so, these sorts of things have real consequences which seem to be getting out of hand. And this was - this panic was based on no evidence whatsoever, but on the feeling that parents could sense whether their children had been badly affected by the vaccine or not, I mean - and it was bound to be the case that certain children would be diagnosed with autism a little while after they've been vaccinated. That's statistically inevitable. And of course, when that happens, you're bound to think - if you're a parent - it's the vaccine that must have caused it. But it's not difficult to understand that that causal inference doesn't follow at all from the fact that every now and again, autism is going to follow a vaccination.
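
A back-of-the-envelope sketch of the coincidence point Collins is making - not part of the broadcast; the vaccination, prevalence, and timing figures below are illustrative assumptions, not real epidemiological data:

    # Rough illustration of why some autism diagnoses are bound to follow an
    # MMR shot by chance alone. All numbers are hypothetical placeholders.
    vaccinated_per_year = 700_000      # assumed annual cohort receiving the vaccine
    autism_rate = 0.01                 # assumed fraction eventually diagnosed with autism
    diagnosis_window_days = 3 * 365    # assumed window over which those diagnoses arrive
    coincidence_days = 30              # "diagnosed shortly after the shot"

    diagnosed = vaccinated_per_year * autism_rate
    # If diagnosis timing were completely unrelated to vaccination, the share of
    # diagnoses landing within 30 days of the shot is roughly the ratio of the
    # coincidence window to the whole diagnosis window.
    expected_coincidences = diagnosed * coincidence_days / diagnosis_window_days

    print(f"Expected chance coincidences per year: {expected_coincidences:.0f}")
    # roughly 190 per year under these made-up numbers, with no causal link at all

Under those assumed figures, a couple of hundred families a year would see a diagnosis arrive shortly after the shot by pure coincidence, which is the statistically inevitable pattern described above.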

FLATOW: Yeah, in this country, we've had a similar debate about vaccinations and autism, and as much emotional bloodshed, so to speak, on both sides, as I'm sure you've had over there. I think people now, with the Internet - would you not say - get access to what they think is as much data as everybody else has, and they can become as big an expert?

Dr. COLLINS: Exactly. The Internet is - can be - I mean, it's very useful. It's a great thing to read the Internet, and it's not a bad thing to go - when you go to your doctor, to just talk to him or her about what you've seen on the Internet. But to take it too seriously is to make a big mistake.

And again, taking the Internet too seriously has had really very, very tragic consequences, indeed. Thabo Mbeki, the president of South Africa, came to believe, as a result of reading the Internet, that antiretroviral drugs were possibly poisons and of no use against mother-to-child transmission of the AIDS virus.

And you can see him - you can find his speeches in parliament advising his ministers to read the Internet in order to understand that, in fact, these drugs could be dangerous and weren't effective. As a result, best estimates are that maybe some tens of thousands of South African babies have acquired this disease when they didn't need to - really tragic.

FLATOW: What do you advise people to do if they see an expert and would like to know if that's, you know, that's someone we should listen to?

Dr. COLLINS: Well, it's a complicated business. I mean, we're all used to making lay judgments about whether experts are any good or not. We do that whenever we vote for a politician. We take a look at them on the television or wherever we see them, and we see if we like their demeanor or we don't like their demeanor.

And some of that should still go on, but you see, once you get to that point, you've already made the decision that you're going to try and make an assessment about whether a person is a good expert or a bad expert. And once you've done that, you're halfway there, because at least you've decided to look at experts, rather than your own guesses or something you've read on the Internet.

Now, ordinary people, I think, can go some way to using their political nous, as it were, to judge between experts. But if you really want to get into it, it is difficult. You've got to look at people's qualifications, and most important of all, you have to look at experts' experience. Where have they been? How much time have they spent doing research on this kind of thing? Basically, you're looking to give more weight to the opinions of people who know what they're talking about, because they have some experience of it, than to those who don't, because they don't have any experience of it. And then it's your job to start summing up who's got the experience and who hasn't.

FLATOW: Sometimes you get into trouble going in the other direction, because the scientists blur the line themselves. Some scientists think, for example, that winning a Nobel Prize gives them the right to talk about anything they want, and to be considered an expert on it.

Dr. COLLINS: Absolutely correct. I mean, our analysis of expertise, in the book, we do a - quite a complicated analysis of expertise. And one of the things we point out is that scientific expertise is a narrow crevasse.

You have expertise in the narrow area of your specialism. And as soon as people start talking with too much authority outside, beyond that narrow crevasse, you need to start distrusting them, unless they have some other kind of expertise that leads you to think that they can say something about this specialist area.

FLATOW: Let me see if I can get one call in before the break.

Dr. COLLINS: Sure.

FLATOW: Jim in San Rafael, California. Hi, Jim.

JIM (Caller): Oh, hi. Oh, thanks for taking my call. I teach a course, an adult education course, called Scientific Method: How to Understand What Scientists Are Telling Us. And it's basically a light survey course that talks about what the scientific method is.

And I'm finding that people really don't have to - I mean, to understand the basic principles anyway, the average person with a decent degree of intelligence can understand it. And I'm just wondering what effort you've made to try and educate people as to what the scientific method is? The more they understand, the more likely they are to listen. And I was also curious as to whether or not your show would do a show on that particular topic, because if people understand it better, they're going to make a more informed judgment.

FLATOW: OK. Thanks for the call.

Dr. COLLINS: Is that question addressed to me or to you?

FLATOW: Well, it's - I think it's addressed to both of us. Let me take the second part, because we have to go to the break, and I'll...

Dr. COLLINS: Sure.

FLATOW: And I'll just say something from our experience on the program: lots of times we find that no matter - and we've had this experience recently talking about autism, asking someone who was not in favor of vaccinating her children - I said, is there any amount of research that I can give you, anything that the scientists can show you, that would convince you that you're wrong? And she said, no.

Dr. COLLINS: Yeah.

FLATOW: I just don't trust the scientists and I don't trust the government that funds them. And there's dual distrust there of all kinds of authorities that she was talking about. We're going to - stay with us, Harry. We have to take a break. We'll come back and talk lots more with Harry Collins, who co-authored a book with Robert Evans called "Rethinking Expertise." We'll take your calls. We'll be right back after this break.

(Soundbite of music)

FLATOW: I'm Ira Flatow. This is Talk of the Nation: Science Friday from NPR News.

(Soundbite of music)

FLATOW: You're listening to Talk of the Nation: Science Friday. I'm Ira Flatow. We're talking about expertise - how to know it, how to recognize it, and whom you should believe. Harry Collins, distinguished research professor at the Cardiff School of Social Sciences, is our guest this hour, co-author of the book "Rethinking Expertise." 1-800-989-8255.

When I last crudely interrupted Harry, we had a caller who was wondering about - I think he said he teaches a course in something like critical thinking - could we teach people to be better critical thinkers?

Dr. COLLINS: I think he said he had a course in scientific method and...

FLATOW: Right.

Dr. COLLINS: And of course, what one does as an academic is you do try and teach people to understand scientific method better. But the old version of explaining scientific method better was to teach people about how experiments are done, how scientific inference is made and maybe little bits of mathematics and statistics.

But I think still more important, and much more readily graspable by the majority of people, is a better understanding of the substance and nature of scientific proof. And we try and demonstrate this by examining some case studies of historical bits of the production of scientific evidence.

For instance, there's a famous experiment in science called the Michelson-Morley experiment, which is said to show that the speed of light is a constant. And if you read most physics books, those books will tell you this was very important in proving the theory of relativity, that the experiment was done in 1887, the result was that the speed of light was a constant, nobody understood it until Einstein's first relativity paper came along, and then suddenly everything fell into place.

But if you examine the history very, very carefully, people were arguing about the results of that experiment for about 40 years. And what hardly anybody remembers is that another experimenter, called Miller, was awarded the American Association for the Advancement of Science physics prize in 1929 for showing that the speed of light was not a constant. In fact, the speed of light is a constant, but there was no firm consensus on that for years.

Now, if there's no firm consensus about that for 40 years, it's hardly surprising that more complicated things like global warming create dissensus, and that consensus is hard to make completely uniform.

And so, you have to understand that science is always inexact. But the mistake is to think that, because science is always inexact, it's no use to us. On the contrary, we all just have to take responsibility, live with some inexactitude in our lives and get on with it. But you're still much better off taking notice of the experts than the non-experts, because the non-experts' views are virtually random, whereas the experts' views might have a better chance of being right.

FLATOW: Mm-hm. Let's - 1-800-989-8255. Let's go to Rich in Portsmouth, New Hampshire. Hi, Rich.

RICH (Caller): Hi. How are you?

FLATOW: Hi there.

RICH: Say, I'm a retired public health epidemiologist, and I've had a career where I've had to deal with scientific information on the front line - everything from AIDS to immunizations and, most notably, fluoridation of water supplies. And what I have found in my experience is that unfortunately, particularly at the local level, the media feels a need to provide what they call balance in covering these subjects.

And they tend to give junk science and the views of people who are nonscientists - particularly on issues like fluoridation - the same weight and credibility as the scientific literature on the issue, which may have 50 years' worth of scientific data to support a position. They give this equal coverage and equal balance to nonscientific views, which tends to confuse the public even more, I think.

FLATOW: Mm-hm. Harry, good point.

Dr. COLLINS: I think that's absolutely right. I mean, one of the reasons that the MMR controversy got out of control was exactly for this reason, that journalists felt they need to - needed to balance the story of one mum or dad in some little town whose child showed the symptoms of autism shortly after being vaccinated, they had to balance that with the whole huge raft of epidemiological evidence which showed that, you know, there was no measurable correlation between the MMR vaccine and autism. And it is this kind of spurious balance that can make some of these panics accelerate like forest fires.

FLATOW: Mm-hm. Harry, I want to thank you for taking time, staying up this evening, to be with us today.

Dr. COLLINS: Thanks very much.

FLATOW: Have a good evening. Harry Collins is distinguished research professor at the Cardiff School of Social Sciences and director of the Centre for the Study of Knowledge Expertise and Science at Cardiff University in Wales. His book is called "Rethinking Expertise."
