Public Health

Vaccine Scare Shows How Emotions Trump Facts

Yesterday, the medical journal the Lancet retracted a 12-year-old paper by Dr. Andrew Wakefield, which helped fan a scare about vaccines and autism.

Child receiving the MMR vaccine. The idea that vaccines could make kids sick elicits a very emotional response in many. Jeff J Mitchell/Getty Images

Now discredited, the report looked at just a dozen children who developed behavioral and intestinal problems. Eight of them had been recently vaccinated against measles, mumps and rubella.

So we turned to David Ropeik, a risk consultant and author of How Risky Is It Really? Why Our Fears Don't Always Match The Facts, for some insight.

Ropeik says that the Wakefield episode is a prime example of our potential to misjudge a situation based on false perceptions of risk. In emotional circumstances, where we lack understanding and control of the situation, our instinct to react emotionally may overwhelm our more logical side, he says.

Here are edited highlights from our conversation:

How do you view Wakefield's paper as an example of risk perception gone awry?
The Wakefield paper is interesting. Its key phrase specifically says "we did not prove an association between the MMR vaccine and the syndrome described." So the paper itself, unequivocally, claimed no link. But Wakefield hinted at one in the news conference. And the parents of kids who had autism jumped on that. Because what parent of a sick kid wouldn't want an answer that might lead to a cure? But their reaction, as understandable as it was, was emotional, not factual.

So, everything that followed, to me, is a huge lesson to society that our risk perception system can make choices that feel right but get us into trouble. In this case, the trouble is the recurrence of measles in many places in the world, in some cases killing children; a worry about other vaccines; and a shaken faith in science overall.

How much do you think the Wakefield case really shook up people's faith in science?
Wakefield's study came in the context of many other issues — things that had shaken trust in science and public health institutions in the UK already. Mad cow disease, for instance, was poorly handled by the British government in the first place. There was controversy over genetically modified food too. All of this had contributed, perhaps rightly, to the mistrust of public health organizations.

With vaccination, generally, you have a small number of people who don't like being told they have to do it — it's a government imposition. Well, Wakefield hinted that this government imposition might be making kids sick. In my opinion, that latched onto the existing undercurrent of resistance to government-imposed vaccination.

How is a lay person supposed to make sense of issues like this — when there's so much complicated or conflicting information?
That's the question. The problem for lay people, like myself, is that we don't have scientific expertise but we want our children and ourselves to survive. So we have a host of largely subconscious and instinctive ways of sensing danger. For example, if there's something we can't understand or control, evidence that it might be risky will make us more afraid. This is very natural, and we all do it. But we also have to realize that we get things wrong in ways that make things worse.

We have to recognize the potential dangers in our own perception and factor that into our thinking as well. 'Am I getting this wrong in a way that could make things worse?' is a pretty good way to check your thinking about any risk.

But if you're already lacking understanding, how can you really check for yourself?
Examples like the Wakefield case are good teaching moments. This won't be the last time a complicated issue, fraught with many emotions, comes along and we have to decide [how to react]. We're smart enough to learn from the past. And this episode is a grand teaching moment for all of us: not to leap to conclusions, and to recognize how powerful our feelings are in our interpretations of the facts. That's all I think is realistic. It's up to people to make up their own minds.

Doesn't the general public trust experts to explain these issues, though?
We only look to those we trust, in part because they're saying what we want to hear. I mean, Wakefield gave parents of autistic kids hope. The other part of it is that those experts have demonstrated their competence. Beyond the factual science, what matters a lot is the experts' reputation and point of view. And that's subjective — it depends on our own points of view. Expertise is in the eye of the beholder. It's not just how many degrees you have.

What impact do you see coming from the Lancet's official retraction?
What's happening here is a continued movement by the Lancet away from Wakefield and the whole controversy. It's about Wakefield and saving face. Despite all of the quibbles with his methodology — which is why they say they are retracting it — his paper says "we did not prove an association." They're distancing themselves from the guy and the controversy, in my opinion.

The problem is that [being in the news] may make him more of a hero to those who want to believe what he says. See, it's not just expertise, it's emotions too. Which, again, is entirely understandable and valid, even when it causes people to not pay full and fair attention to the facts.

Where's the action now? What's the burning risk question that's causing controversy?
Oh my gracious, where to begin? Cell phones and radiation. Plastics. Mercury in seafood. Climate change. Obesity. They are all bigger or smaller risks where their emotional nature is leading to responses that don't seem commensurate with the risk. We're not responding to climate change nearly as dramatically as we need to — or to obesity. And we're probably reacting more fearfully to mercury in seafood than the actual threat warrants. At our peril. That's the critical phrase here, and it's what matters. I'm not just observing that people are being irrational; if we're too afraid or not afraid enough, it can lead to choices that raise our risks.
