Risk Assessment: Balancing Safety With Cost

Guests

Susan Hough, seismologist, U.S. Geological Survey
John Sorensen, researcher, Oak Ridge National Laboratory
James Bagian, director, Center for Health Engineering and Patient Safety at University of Michigan

In the wake of the natural and nuclear disasters in Japan, many are wondering: When is safe, safe enough? Can any business, city or country be fully prepared for the unthinkable? Risk assessors talk about the choices that must be made to balance safety with practicality and cost.

Copyright © 2011 NPR. For personal, noncommercial use only. See Terms of Use. For other uses, prior permission required.

LYNN NEARY, host:

This is TALK OF THE NATION. I'm Lynn Neary in Washington, in for Neal Conan.

The nuclear reactors in Japan that are now threatening meltdown were, until recently, considered well-prepared and safe. Engineers built in safety features and backup systems, but after a massive earthquake and tsunami, none of it was enough. In the U.S., a new report out this week suggests that this country is not nearly as earthquake-ready as it should be.

But preparing for any major disaster involves tradeoffs between what is safe and what is practical and affordable. Earthquakes and accidents are difficult to predict. So how do companies and governments calculate and prepare for the worst?

We'd like to hear from those of you who plan for accidents and disasters in your factory, your restaurant, your city. How do you decide what is safe enough? Our number here in Washington is 800-989-8255. And the email address is talk@npr.org. You can also join the conversation at our website. Go to npr.org and click on TALK OF THE NATION.

Later in the hour, outraged readers blast a New York Times report for blaming the 11-year-old victim of a sex assault. But first, joining me now from member station KPCC in Pasadena, California is U.S. Geological Survey seismologist Susan Hough. She's also the author of the book "Predicting the Unpredictable: The Tumultuous Science of Earthquake Prediction." Susan, welcome to the show.

Ms. SUSAN HOUGH (Seismologist, U.S. Geological Survey): Thank you, glad to be here.

NEARY: Good to have you with us. Let's talk first about this report from the National Research Council. Is the U.S. earthquake-ready?

Ms. HOUGH: Yes and no. There have been a lot of steps taken, in some places going back to the 19th century, before the 20th century. But preparedness has definitely lagged in some parts of the U.S. And there is still, as the report says, a lot more that can be done pretty much everywhere.

NEARY: Yeah, now, a lot of people feel that in order to be ready for an earthquake, you need to know when it's going to happen. But you've written a lot about the fact that it's very hard to predict exactly when an earthquake might occur.

Ms. HOUGH: Well, we can't predict right now, within a short term, within a short window, when the next big one is going to hit. And we really can't make strong statements about earthquake likelihood within even 10 years.

And so that scientific problem has received a lot of attention. It just hasn't gained any traction in a useful way. So what we focus on instead is what we call forecasting, which is looking at the expected average rates of earthquakes, and so, you know, what are we up against on average in the next 30 years? And that's the basis for our preparedness.

NEARY: Why is it so hard to predict, first of all? Why is it so hard?

Ms. HOUGH: If you were going to predict an earthquake, you would have to basically see something from the Earth, some sort of signal that says: heads-up, a big earthquake is coming.

And a lot of signals have been looked at, investigated, proposed, and none of them have held up to closer scrutiny. So people might think they see radon fluctuations before an earthquake. An earthquake happens, and there was a radon blip.

Well, you need to test that hypothesis and look at radon systematically and look at earthquakes. They're difficult studies to do because we can't run the experiment. We just have to wait for the earthquakes to happen.

But when we do run the careful experiments, the precursors that we think we see just don't hold up. So what that basically means is that we can't - we haven't seen anything that tells us that an earthquake is on the way.

NEARY: Yeah. So then if you can't predict when an earthquake's going to happen, how do you plan for things you can't predict, when you don't know when they're going to happen?

Ms. HOUGH: OK, well, we can - there is predictability. I mean, that sounds like it's playing with words, but...

NEARY: It does.

Ms. HOUGH: But you know - yeah, no, we call it forecasting. We're pretty good with GPS measurements and recordings of earthquakes and geology. We know where the active earthquake zones are. We know how many big earthquakes are going to happen on average over a long timeframe in different areas. You know, the San Andreas is the major fault in California, but there are other faults.

And we can sort of assess the expected rates, and we cast the hazard in terms of probabilities, and so the attention tends to be focused where, obviously, the probabilities are the highest.

NEARY: And in those areas, how does one prepare? I mean, is it a matter of getting the buildings ready, getting people's exit plans ready? I mean, how do you go about it?

Ms. HOUGH: Yeah, maybe I should say first that there is one possible miscommunication that's been going on, which is that when we put together a hazard map that shows lower hazard areas, that's not saying that big earthquakes are impossible in those areas.

And I think sometimes that catches people by surprise. It's just saying that earthquakes are less likely in the regions that have lower hazard. But Earth scientists know that big earthquakes are possible, and that earthquakes like Japan's, which are relatively infrequent, in fact are going to happen.

So - but to answer your question, yeah, the preparedness efforts have focused on construction and design of safe buildings and safe infrastructure, because even if we could predict earthquakes, the - our buildings, our infrastructure would still have to withstand them. So that's really the key to safety.

NEARY: And to address your point that low-hazard areas aren't areas where an earthquake won't happen: it does seem that certain parts of the country may be more ready to accept the fact that an earthquake might occur and to prepare for that, whereas other parts of the country - I would say where I am right now, on the East Coast - tend not to think an earthquake's going to come here. So perhaps that part of the country would be less willing to put in the time, effort and cost involved in truly getting ready for an earthquake. Is that right?

Ms. HOUGH: Yeah, absolutely. And it is - it's fair to acknowledge that is a balance. It's an investment in resilience, and that has to be weighed against other needs and interests.

But the awareness of earthquake hazard and the preparedness has really been driven by the earthquakes that we've seen in this country's short history, and it's short to a geologist.

So in Northern California, where they had a quite large earthquake in 1868 on the Hayward fault, the earthquake preparedness started quite early for the U.S. The hazard in Southern California was debated through the '20s quite heatedly, until the 1933 Long Beach earthquake put the debate to rest, and the preparedness efforts got started here.

If you go to the Pacific Northwest, when I was in graduate school, which wasn't quite the Dark Ages, there was debate about whether that was an active earthquake zone, whether the subduction that you have in that region could generate large earthquakes or not. And we - since then, it's clear that it can, it does - every few hundred years you get an earthquake about as big as the recent earthquake in Japan.

If you go to other parts of the country, the earthquakes are even less frequent. People have been reminded much less of the potential. So the preparedness definitely lags.

NEARY: I'd like to bring in John Sorensen now to the conversation. He's a researcher with Oak Ridge National Laboratory, and he's an emergency management expert, and he's at member station WUOT in Knoxville, Tennessee. Good to have you with us, John.

Mr. JOHN SORENSEN (Oak Ridge National Laboratory): Hello.

NEARY: Now, I'm wondering: In terms of risk assessment, it's a risky job in the sense that it's hard to get people to listen - or is it hard to get people to listen or even to follow the advice of risk assessors like yourself?

Mr. SORENSEN: I would say that despite fairly large efforts by organizations, including FEMA and the Red Cross, to get the American public more prepared for a range of events, that we still as a country have not achieved the level of preparedness that we should be at to be more resilient to particularly large events.

There are probably several different reasons for this. One of the major ones is that people do not perceive the efficacy of spending money on things like preparing a disaster supply kit, particularly when they're economically marginal already and this represents a large expenditure compared to their daily needs.

NEARY: All right. Let's take a call now. We're going to go to Giovanni(ph), who's in Denver, Colorado. Hi, Giovanni.

GIOVANNI (Caller): Hello there.

NEARY: Go ahead.

GIOVANNI: Well, I had a comment more than anything in - just particularly in light of the last expert you had on there. I just took a training called CERT, Community Emergency Response Team, here in Denver, sponsored by the Denver Emergency Management Agency.

And what it does, it's a two-days-plus simulated exercise, and it tells you how to prepare for disasters, teaches you fire safety, minimal medical operations, light search and rescue, and sort of how to organize a group team.

And I found it amazingly useful. And I would recommend it to anybody. I know there are CERT groups in California and North Carolina. There are probably a lot more. But it was absolutely great. And again, it's Community Emergency Response Team training.

NEARY: So do you feel much better prepared now to deal with the possibility of an emergency happening in your community, that you could help out, that you could do something about it?

GIOVANNI: Yeah, absolutely, and I've always thought I was good in emergencies, but boy, the things you learn in this that you don't think about, techniques for light search and rescue, and if you come across a building that's had a fire or something like that and you're the first one on the scene - just all kinds of detailed information that you wouldn't think of otherwise.

NEARY: All right, thanks so much, Giovanni. Susan Hough, I wanted to ask you before you have to go: You know, in getting people to sort of take recommendations about getting - preparing for emergencies, I think people make a distinction between mandatory and recommended.

And you know, you can recommend that somebody retrofit their house, for example, for an earthquake. But unless it's mandatory, they may not do it, right?

Ms. HOUGH: Right, and that is - it's a tough economic problem. It's a complicated world. People have financial pressures. And so mandatory retrofitting is a difficult concept to think about. And it isn't generally done.

So there is some legislation in a limited number of cities. For the most part, it is a matter of making recommendations that property owners can act on or not.

NEARY: All right, well, thanks so much for joining us, Susan.

Ms. HOUGH: Thank you.

NEARY: Susan Hough is a seismologist at the U.S. Geological Survey, and she joined us from member station KPCC.

We're going to continue our discussion in a moment. I'm Lynn Neary. This is TALK OF THE NATION from NPR News.

(Soundbite of music)

NEARY: This is TALK OF THE NATION from NPR News. I'm Lynn Neary.

In the wake of the natural and the nuclear disasters in Japan, many people are wondering: When is safe safe enough? And can any business, city or country be fully prepared for the unthinkable?

We're talking with two people who face these decisions all the time. John Sorensen is an emergency preparedness expert and a researcher at Oak Ridge National Laboratory. And James Bagian advises on issues of risk management. He directs the Center for Health Engineering and Patient Safety at the University of Michigan. And he is a former NASA astronaut.

And we'd like to hear from those of you who plan for accidents and disasters in your factory, your restaurant, your city. How do you decide what is safe enough? Our number here in Washington, 800-989-8255. Our email address is talk@npr.org. And you can join the conversation on our website. Go to npr.org, and click on TALK OF THE NATION.

Welcome to the program, James Bagian, good to have you with us.

Mr. JAMES BAGIAN (Director, Center for Health Engineering and Patient Safety, University of Michigan): Thanks. So glad to be here, Lynn.

NEARY: So, now, when you're assessing a situation, when you're doing a risk assessment, do you come up with sort of worst possible case scenarios? Do you develop those?

Mr. BAGIAN: Yes, I think you do several things. One, you want to bound the problem. You want to look at: OK, what's the worst that can happen? But then you have to weigh into that how likely is that to occur because as you heard Susan say a little bit earlier, you know, you have to make an economic decision, a decision for preference, the opportunity cost.

If you do one thing, you may not be able to do another, and you need to weigh those things in order to finally make a decision how to allocate resources, and that's not just money. That's people's time, their attention, things like that. So that has to be weighed in coming to the decision.

NEARY: Now, I know you work in the health care area now, but you've worked with other kinds of businesses and organizations, as well. Is that right?

Mr. BAGIAN: That's correct, yes. In fact, I currently sit on the NASA Aerospace Safety Advisory Panel.

NEARY: Well, I read something that indicated that you were saying there's a big difference between the way, for instance, health professionals might approach this, as opposed to some of the other people you've dealt with. I wonder if you can explain how there might be different mindsets in different workplaces or different kinds of agencies.

Mr. BAGIAN: Sure, I'd be glad to take a crack at that. Well, as you heard at the top of the hour, you heard Susan say at one point, when you asked her could more things be done: A lot more can be done.

The question is: Should it be done? And you heard what Mr. Sorensen said about preparedness, that we're not, you know, where we should be at. Well, compared to what?

I mean, the thing is that unless you look at how likely the outcome is and your tolerance for it, it changes. So in many engineering settings, for instance in aerospace, they will look at the range of things that could go wrong, you know, the failure of a hydraulic pump or the failure of an engine or a bird strike, you know, hitting the cockpit.

And then they look at what it would cost to get that risk down - it never goes totally away, but you can get it down substantially, and in some cases you can actually remove it - versus what else you then can't do. And they'll make a decision.

Sometimes, it's fairly straightforward to calculate those probabilities. In other cases, it's not. In health care, traditionally, when decisions have been made on an organizational level, often there's not a formal risk matrix, where you grade the severity of what could happen - that is, a patient dies, or they're injured and permanently disabled, or temporarily disabled, or inconvenienced; that could be the spectrum - versus how likely that is to happen: how many times would this particular hospital see that happen within a year? And you weigh it that way. Seldom is that done.

It's usually done more on people's gut feel, their so-called expert opinion, which in many cases would comport with that kind of risk matrix. But when they don't actually use a formal, explicit prioritization technique, sometimes emotions tend to get into it.
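The severity-versus-likelihood grading Bagian describes can be made concrete in a few lines of code. This is a minimal sketch, not anything specified in the broadcast: the category names, the 1-4 weights, and the example hazards are all illustrative assumptions.

```python
# A toy version of a formal risk matrix: grade each hazard by severity
# of outcome and by expected frequency, and let the product of the two
# set the mitigation priority. Categories and weights are assumptions.

SEVERITY = {
    "inconvenience": 1,
    "temporary disability": 2,
    "permanent disability": 3,
    "death": 4,
}

LIKELIHOOD = {
    "remote": 1,      # might never be seen at this hospital
    "uncommon": 2,
    "occasional": 3,
    "frequent": 4,    # expected several times a year
}

def risk_score(severity, likelihood):
    """Combined priority score: higher means mitigate first."""
    return SEVERITY[severity] * LIKELIHOOD[likelihood]

# Hypothetical hazards graded on the two axes.
hazards = [
    ("wrong-site surgery", "death", "remote"),
    ("medication mix-up", "temporary disability", "occasional"),
    ("patient fall", "permanent disability", "uncommon"),
]

# Rank hazards so resources go to the highest combined risk first,
# rather than to whichever hazard feels most alarming.
ranked = sorted(hazards, key=lambda h: risk_score(h[1], h[2]), reverse=True)
for name, sev, lik in ranked:
    print(f"{name}: score {risk_score(sev, lik)}")
```

The point of the explicit score is exactly what Bagian says next: it replaces gut feel with a prioritization that two people can inspect and argue about.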

Like, we're watching, as you mentioned, the current issue in Japan. People have become concerned. And I think we don't know enough to make an intelligent decision about how the risk management was done in that case.

But the fact is, emotions start weighing in, instead of asking: What's the likelihood we'll see this? What do we give up? Is the juice worth the squeeze? If we made this more impervious to a problem such as this tsunami, what would that cost us, as opposed to other things we could do?

And I think you have to weigh those things in the cold light of day, and in ways that people can understand - if you just talk about percentages, most people don't understand what that means.

NEARY: All right, let's take a call. We're going to go to Jackie, who's calling from California. Hi, Jackie.

JACKIE (Caller): Hi, Lynn, thanks for taking my call.

NEARY: Go ahead.

JACKIE: I'd like to come from the perspective of a pharmaceutical company that often knows what the worst is that could happen to folks that take their approved medicines. For example, a rare side effect is defined as a side effect that will occur to approximately one out of 1,000 people. An extremely rare side effect is one that is experienced by one out of 10,000 people.

But when a drug is finally approved by the FDA and then marketed to a U.S. population of, say, 300 million people, those extremely rare side effects, all of a sudden, are being experienced by potentially thousands of individuals.

And part of the equation that is crunched by both the Food and Drug Administration and pharmaceutical manufacturers is: Does the efficacy of the drug outweigh the risk of the drug? And I think a lot of times, before a drug is approved, not enough testing is done, or the manufacturers are guilty of downplaying the extremely rare events that they saw emerge during clinical trials.
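The caller's arithmetic is worth working through: odds that sound tiny per patient add up once a drug reaches a national market. A quick sketch, where the uptake figure (10 million patients) is an illustrative assumption not taken from the broadcast:

```python
# Scale a per-patient side-effect rate up to a drug's whole market.
# The caller's definitions: "rare" = 1 in 1,000, "extremely rare" =
# 1 in 10,000. The number of patients below is a made-up assumption.

def expected_cases(one_in, exposed_population):
    """Expected number of side-effect cases for a '1 in one_in' rate."""
    return exposed_population / one_in

RARE = 1_000             # one out of 1,000 people
EXTREMELY_RARE = 10_000  # one out of 10,000 people

# Suppose 10 million people in a 300-million-person country take the drug.
takers = 10_000_000

print(expected_cases(RARE, takers))            # 10000.0 expected cases
print(expected_cases(EXTREMELY_RARE, takers))  # 1000.0 expected cases
```

Even the "extremely rare" event lands on a thousand real people under these assumptions, which is the caller's point about why the efficacy-versus-risk weighing matters.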

NEARY: All right, Jackie. Let me see. I'm going to ask James Bagian to address your question. Thanks so much for calling in. Is this the kind of thing you would do a risk assessment with, James Bagian?

Mr. BAGIAN: Yes, and, well, I think what the caller had asked or related to us was a good point, and that is the fact that a decision has to be made.

Now, I think it's not very productive to ask: Was there a case where a manufacturer downplayed or falsified information? That would be a crime. People commit crimes. That's not the real thing you should be discussing.

The real point of the discussion is: If you have criteria that say one in, you know, a thousand patients could be harmed, you weigh that against the benefit - suppose that for the one in a thousand you hurt, you saved 500 lives. Let's just suppose that was it, theoretically.

I think any rational person, if given the calculus of here is a drug, and we could cure 500 people, but we'll kill one, and if we do nothing, those 500 would die, I think any rational person would say: Yes, we're going to kill one with the drug to save 500 lives. I mean, that's the decision that needs to be made, whereas if you do nothing, you can be passive, but to not decide is to decide.

If you decide you want to do nothing because of the risk of some bad thing happening, and you thereby condemn 500 people to die, is that really a prudent choice? And I think those are the kinds of things that need to be talked about. So even if one in 1,000 die, if you save 500 lives overall - depending on how you do the numbers; I'm giving an outlandish example - that would be clear. That would be a fine decision, in my mind and I think in most rational people's minds. Does that make sense?

NEARY: Yeah, that makes sense. And I want to bring John Sorensen back into the conversation, because I think there's something that applies both to the potential for a medical disaster and to a disaster where people need to flee - a natural disaster, for instance.

And that is the psychology involved: How do you get people to really think in these terms when it just doesn't seem likely to ever happen to them, John Sorensen? The psychology of expecting the worst - it seems like part of the challenge of your job is making people understand these things really can happen, and you really do need to be ready for them.

Mr. SORENSEN: Yeah, I think it's basic human nature to say, first of all, this will never happen to me and then second of all, if it does, I can't believe it's happening to me.

NEARY: Yeah.

Mr. SORENSEN: And so we think that public education plays a secondary role to getting effective warning messages out when something is suspected - like an approaching tornado, or a suspicion that the science behind the drug testing is incorrect.

And then, you know, people become very information hungry, and they want to know details about the facts and what they should do.

NEARY: Yeah. You know, something that was interesting in Japan: that part of the world is familiar with earthquakes and even tsunamis, and to some extent people knew they needed to go to high ground. But even still, they weren't ready for what really happened.

So that psychology of truly being able to be prepared for the worst - it's a tough one to get at, James Bagian, I think.

Mr. BAGIAN: Well, I guess the question is - you know, I don't think we can say necessarily that what was done in Japan in preparation was wrong. If you look at the probabilities and you thought the forecasts were valid, it might have been a decision to say: if we get hit by the one-in-10,000-year tsunami, we're going to take that risk, because the investment we would have made otherwise in the intervening 10,000 years wouldn't be worth it.

If we, you know, diverted funds to building a huge seawall, let's say, or to moving the population elsewhere, and because we used our funds that way rather than building hospitals, feeding the hungry, et cetera, many more would die over a 10,000-year period, or whatever that period happened to be - then it might not be a rational tradeoff.
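The tradeoff Bagian sketches is an expected-value comparison: the up-front cost of mitigation versus the annual probability of the event times its loss, over some planning horizon. Every number below is an illustrative assumption, not a figure from the broadcast:

```python
# Toy expected-value comparison: build the seawall, or accept the
# rare tsunami's expected losses? All figures are made-up assumptions.

event_return_period = 10_000      # a "one-in-10,000-year" event
loss_if_it_hits = 50_000_000_000  # hypothetical damage, in dollars
horizon_years = 50                # planning horizon

# Expected loss = annual probability x loss x years in the horizon.
expected_loss = loss_if_it_hits / event_return_period * horizon_years
seawall_cost = 1_000_000_000      # hypothetical up-front mitigation cost

print(f"Expected loss over horizon: ${expected_loss:,.0f}")
print(f"Mitigation cost:            ${seawall_cost:,.0f}")
```

On these made-up numbers the seawall costs four times the expected loss it prevents over the horizon, which is exactly how accepting the risk can be the rational choice in the cold light of day - until the rare event actually arrives.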

I guess the thing I would come back to is about really being prepared. That's where the decision comes into risk: How do you communicate it to your population, to the people, in terms that they can understand?

I mean, people fly on airliners every day. Most people don't worry about crashing. People drive their cars every day and don't think they're going to be killed as they drive home. Those are risks they take.

And if you, you know, tell people, for instance - and I don't know that this is exactly the way it would come out - but if you said, you know, you could drive a car for a thousand years, and your likelihood of being hurt by a tsunami is even less than that, would people spend much time being prepared, and is it even appropriate to do that? And I think that's the kind of discussion that needs to be had in the cold light of day, before the disaster occurs, so people can make that rational decision - not when emotions are running high, when people are actually hurt.

And I think Mr. Sorensen made an excellent point about education. Education, while necessary, is far from sufficient. Many studies have shown that when you just try to educate people without then incorporating that into the design about how you design your cities or whatever the issue happens to be - a nuclear reactor, how it's built - education is not enough. You have to make actual physical, tangible changes. And if you don't, the preparation will not be that worthwhile.

NEARY: We're talking about risk assessment and preparing for the worst. We're going to go to the phones now, to James, and he's calling from San Francisco.

Hi, James.

JAMES (Caller): Hi. Thanks for taking my call.

NEARY: Go ahead.

JAMES: Yes. I work in the information technology department for a large construction company here in California. My construction company is always taking into account risk assessment, earthquakes, flood damage, so on and so forth.

One thing that I'm responsible for is their information, and the company does not seem to spend any energy or money or attention to information data retention.

And I guess my question or comment here is: Why don't people take into account their data, the information aspect? I'm expected to be able to restore information if, for example, a meteor were to hit one of our sites and all our data is just gone. But I rarely ever hear anything about that when it comes to earthquakes or the tsunami in Japan. I know a lot of data was lost, and I'm just curious to know if anyone is taking that into consideration, and if there's been a dollars-and-cents figure applied to it.

NEARY: Well, let me ask John Sorensen to respond to that. Thanks so much for your call. John Sorensen?

Mr. SORENSEN: Yeah. I think what you're talking about, overall, is the redundancy of systems - be it a redundant transportation system or a redundant data system, what do you do when one system fails, and do you have another system that can serve the same function? And I think, from a cost-benefit perspective, a lot of times companies or governments decide redundancy is just far too expensive and are willing to take the risk that something will take out one of the systems.

NEARY: All right. And you're - I want to remind you that you are listening to TALK OF THE NATION from NPR News.

All right. We're going to go to Mark in Columbus, Georgia. Hi, Mark.

MARK (Caller): Yes. Hi.

NEARY: Go ahead.

MARK: Right. I work with small and medium-sized businesses in Georgia, helping them prepare for emergencies. And one of the challenges we've seen, especially on the smaller business side, is that through the past few years, most of their focus has been simply on surviving, so they really haven't internalized the need to do this on a voluntary basis.

We do see, with the new Public Law 110-53 that came out of a recommendation from the 9/11 Commission to develop a voluntary private sector preparedness accreditation and certification, that potentially the message is going to move from more of a fear-based appeal to more of an incentive-based one. And we're already seeing some regulatory requirements - that day cares(ph) have to have an emergency plan before they are licensed - so it's moving somewhat from the voluntary to the regulatory. But, again, it's about building that awareness of the need to have that plan in place, regardless of the size of the business.

NEARY: All right. Thanks so much for your call, Mark.

James Bagian, what was your reaction to what Mark was talking about there?

Mr. BAGIAN: Well, I think, you know, any of those kinds of things, especially when they're voluntary, are just that. It's certainly worthwhile to make people aware that there are hazards out there that they might want to defend against. But, once again, it comes back to one of the things, I believe, you said at the top of the hour, and that is: How safe is safe? How safe do you want to be? And until you try to put some numbers to that, the decisions people make are very arbitrary and inconsistent.

So as much as you can - and it's not always easy - you need to try to relate it to, you know, what percent of the time, or fraction of a percent of the time, you're willing to have a bad thing happen, and what it costs to mitigate that risk versus other things.

So, an example - I think John Sorensen, or one of the callers, mentioned an asteroid hitting his data center. If you want to guard against that risk, that's something to look at. You would weigh it and say: If I can only do one, would I protect against an asteroid or a tsunami? And then you'll make a decision about which one it is. But I think often those decisions don't come up.

I would go into a slightly different area about when you do analysis of risk: they do things called failure mode and effects analysis, where you look at the various modes that could cause a failure and decide what the effect would be and whether it's worth mitigating.

So a good example is during Katrina. When Katrina hit New Orleans, many of the hospitals lost power, and they had diesel generators, diesel electric generators to supply power.

NEARY: Sadly, we're going to have - I'm afraid we're going to have to stop right there. We're out of time. But thank you so much for joining us today, Mr. Bagian.

Mr. BAGIAN: Thank you.

NEARY: James Bagian joined us from WUOM in Michigan. He's a professor of engineering at the University of Michigan. We were also joined by emergency preparedness expert, John Sorensen. I'm Lynn Neary. This is TALK OF THE NATION from NPR News.

Copyright © 2011 NPR. All rights reserved. No quotes from the materials contained herein may be used in any media without attribution to NPR. This transcript is provided for personal, noncommercial use only, pursuant to our Terms of Use. Any other use requires NPR's prior permission. Visit our permissions page for further information.

NPR transcripts are created on a rush deadline by a contractor for NPR, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of NPR's programming is the audio.
