What Can Be Done To Avoid Man-Made Disasters

Guests

William Reilly, co-chairman, BP Deepwater Horizon Oil Spill and Offshore Drilling Commission
James Bagian, chief patient safety officer, Veterans Health Administration
Beverly Sauer, consultant in strategic risk communication

A major plane crash, a bridge collapse or an oil spill — man-made disasters often share a common element: complacency. Managers who don't think things can go wrong, faulty technology and really bad luck can all lead to costly mishaps. But steps can be taken to try to prevent calamities before they happen.

Copyright © 2010 NPR. For personal, noncommercial use only. See Terms of Use. For other uses, prior permission required.

TONY COX, host:

This is TALK OF THE NATION. I'm Tony Cox in Washington. Neal Conan is away.

Man-made disasters are a fact of life, from the BP Horizon oil spill to the West Virginia mine explosion earlier this year to the levee failures during Hurricane Katrina five summers ago.

What these and other incidents have in common, experts say, is a dangerous mix of arrogance, bad luck and shortsightedness by the people in charge. Managers often operate on the edge, mistakenly thinking that technology will keep things right, but too often, they're wrong.

Today, we explore the risk-taking culture that can and does lead to disaster. We will also examine what can be done about it. How do you and your colleagues try to keep disaster from happening where you work? Tell us your story. Our number here in Washington is 800-989-8255. The email address, talk@npr.org. And to join the conversation, just go to our website. Go to npr.org, and click on TALK OF THE NATION.

Joining us now by phone from San Francisco is William Reilly. He is the co-chairman of President Obama's national oil spill commission and co-chairman of the National Commission on Energy Policy. Mr. Reilly, welcome to the show.

Mr. WILLIAM REILLY (Co-chairman, BP Deepwater Horizon Oil Spill and Offshore Drilling Commission): Pleasure to be here, thank you.

COX: Let's begin with this. Typically, commissions are developed after disasters occur, such as the one that you are working on for the BP Horizon disaster, yet we continue to have more disasters happen. How do you make sure, if it's even possible to do that, that the findings will help in preventing future problems?

Mr. REILLY: One very unusual characteristic of this commission is that we were appointed while the disaster was continuing to unfold. This was a slow-motion catastrophe that just continued - and continues today, in terms of the impacts that are still being noticed down in the Gulf.

We are focused mostly, however, on the future. I mean, we're instructed to find out the cause and the root cause of this calamity, and recommend policies to absolutely reduce the chance of it ever happening again, and then to recommend the policy for the future of offshore oil and gas.

And our emphasis will be, while we'll answer those questions about how it happened and why it happened, very much on the future.

COX: Would you expect - and I know it's early, but would you expect - that part of your commission's recommendations would be to either lower or in some way change what I will call the culture of risk-taking that so often leads to these kinds of disasters?

Mr. REILLY: The culture both of the regulatory enterprise - of MMS, which was supposed to have realistic response plans approved and realistic well designs, protective well designs - as well as the behavior of the company, which has a history of having been challenged by process safety: those are very much on our agenda, and we're getting into those as we speak.

COX: Do you think that we're too reliant on technology?

Mr. REILLY: I think we probably have a tendency, and I guess it's a very human one, to assume that if things haven't gone wrong for an extended period of time, they won't go wrong, even though we are constantly intervening to improve and make more sophisticated the interrelationships among technology and human choices and decisions.

We noticed that in the financial crisis, where the subprime problem was thought not to be possible. And we've also noticed it here, where the consensus really was that you could not have a spill like this, given the technology of today's oil and gas exploration, which is very, very sophisticated.

The response capability, however, and the containment technology were never developed in tandem with the tremendous advances made in the capacity to go deeper in the ocean. That is one reason this spill went on much longer than it should have, even after it happened.

COX: One final thing I'd like to ask you: Do you think that there is a fear of punishment on the part of those who work at places like BP and elsewhere, where these disasters occur, that prevents them or in some way affects their behavior on the job because they're so concerned that if something goes wrong that - well, we know what will happen if something goes wrong?

Mr. REILLY: I cannot really say that's true, based upon what I know now with respect to BP. I can say that the best rig practices, the best oil and gas exploration practices, which I have looked into somewhat, give everybody, down to the lowest-level worker, a stop-work capacity. And there are no repercussions if he should exercise it.

That kind of thing - a very alert sensitivity to safety, to a problem that anybody sees that could cause a difficulty like a rig failure - I think needs to be built into the system. And whether it was there with respect to the Macondo well, we will determine.

COX: William Reilly is the co-chairman of President Obama's national oil spill commission investigating the BP Horizon disaster and co-chairman of the National Commission on Energy Policy. He was also the administrator of the EPA under President George H.W. Bush. He joined us by phone from San Francisco. Mr. Reilly, thank you very much.

Mr. REILLY: Good to be with you, thank you.

COX: Thank you. Joining us now from member station WUOM in Ann Arbor, Michigan, is James Bagian. He is a former NASA astronaut who has participated in investigations of the Challenger and Columbia shuttle disasters. He is also the current director of the Veterans Administration's National Center for Patient Safety. Jim, nice to have you.

Mr. JAMES BAGIAN (Chief Patient Safety Officer, Veterans Health Administration): Same here.

COX: Let's get right into your career as an astronaut. NASA has had its problems with takeoffs and space travel. What procedures are used to prevent disasters in such a high-tech field as that?

Mr. BAGIAN: Well, I think it's not unlike many other industries. They try to initially analyze the things that are to be done and look for those steps or procedures that have potential risk or hazard associated with them, try to assess the risk.

And this is really the probability that a bad thing will happen, and then decide which risks you want to reduce and how to go about doing that, if it's cost-effective to do so, and then go through that whole litany of things that could happen and then try to anticipate and practice for them.
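(A minimal, hypothetical sketch of the kind of risk assessment Bagian describes - list the hazardous steps, estimate the probability and severity of each, and decide which risks are worth the cost of reducing. The hazards, numbers, and field names below are invented for illustration; they are not NASA data or any program's actual method.)

```python
# Hypothetical risk-prioritization sketch: rank hazards by expected loss
# (probability x severity) per unit of mitigation cost, so effort goes to
# the risks that are cheapest to buy down. All values are made up.
from dataclasses import dataclass

@dataclass
class Hazard:
    name: str
    probability: float      # estimated chance of the bad event occurring
    severity: float         # consequence on an arbitrary 1-10 scale
    mitigation_cost: float  # relative cost to reduce this risk

    @property
    def risk(self) -> float:
        # expected loss: probability of the bad event times its severity
        return self.probability * self.severity

def prioritize(hazards: list[Hazard]) -> list[Hazard]:
    """Order hazards by risk reduced per unit of mitigation cost."""
    return sorted(hazards, key=lambda h: h.risk / h.mitigation_cost, reverse=True)

if __name__ == "__main__":
    candidates = [
        Hazard("seal failure at low temperature", 0.02, 10, 3.0),
        Hazard("sensor dropout during ascent", 0.10, 4, 1.0),
        Hazard("ground-crew procedural slip", 0.05, 6, 0.5),
    ]
    for h in prioritize(candidates):
        print(f"{h.name}: risk={h.risk:.2f}, risk/cost={h.risk / h.mitigation_cost:.2f}")
```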

COX: Well, Jim, your history has shown that you have worked in a number of areas - aerospace, and you are now working in the health care industry. Life and death is a fact of life, so to speak, in both of those careers. Are you finding that there are more similarities than differences in terms of how the processes are gone through in order to try to prevent disaster?

Mr. BAGIAN: Well, I think my observation, at least, is that in the fields that are more heavily rooted in engineering - so, space flight, aviation - there tends to be a more methodical approach to what can go wrong, anticipation of that, and then to try to put robust systems in place to account for that and reduce the probability of a bad event.

In health care over the last 10 years or so, there's starting to be an awakening of how to do the same things, rather than concentrate on asking people to be more careful, try harder, don't make mistakes and be perfect. I think they're starting to understand that you can't just exhort people to be more careful, that you have to set up a whole system of things that works together to ensure that the right thing happens, not just diligence.

COX: We have been talking - with Mr. Reilly before you - about the culture of safety, and you have written about how some approach it and some don't. And you just gave us a brief description of the difference between aerospace, for example, and other industries.

How big a factor, how important a factor, Jim, is this culture that leads to the kinds of disasters that we saw in the Gulf and in other places?

Mr. BAGIAN: Well, I think culture certainly is important. I think, as you heard Mr. Reilly speak just a few moments ago, part of it is the ability - and the tendency, the desire - for people who work in a given area to be able to speak up. To stop the line is sort of what he was just talking about: anybody can say this doesn't look right, and there's a mechanism by which they can make that communication available up the chain of command.

Now, I'd hasten to add that no matter how good you make that, when you look retrospectively, when there's been a bad event, you can always say we could have communicated better, and that's always true.

I think things are relative, and you have to see: do people generally speak up when they have a concern that something's wrong - not that they're sure, but they are concerned? And then, if they raise that concern and it turns out there wasn't a problem, do they feel ridiculed, embarrassed, humiliated, or do people say, boy, I'm glad you brought that up? It wasn't anything this time, but I'd rather be safe than sorry.

And I think that's the thing - you always try to strike that happy balance, and I think you never quite do. You have to constantly be diligent and vigilant, as far as management goes, to invite people to speak and to be able to do so without fear not just of punishment but of being embarrassed and humiliated.

COX: Whenever there is a disaster - and there have been too many to enumerate - what tends to happen, and it happened again here with BP, is that management gets demonized, because it appears as if they fell asleep at the wheel or, even worse, were complacent or not caring about the, you know, the potentially disastrous effects of their activity.

My question to you is: Do you find in your work that that is a legitimate reason and an explanation for why things go wrong, that management just doesn't give a damn, so to speak?

Mr. BAGIAN: I think that's seldom ever the case. As you can just witness - and, you know, I'm not an expert on what's happening with the Gulf oil spill, but you can be sure management wishes they weren't in the position they are today. It's not good from a business perspective, it doesn't help their profitability, it doesn't help their shareholders, and certainly it doesn't help management themselves. As we've seen, there's been a change in management at BP.

So if they could have Aladdin's magic lamp and make a wish, I'm sure they would wish it didn't happen at all. So I think it's an easy thing for people to say, but whether it's management at an oil company, whether it's a physician or the management of a hospital, they don't want a bad thing to happen.

I mean, that would be very, very rare. And it's easy to draw that conclusion - erroneously so, I think - when you look back after a bad event occurs. People say, oh, something bad happened, somebody must be to blame. And it's seldom ever that simple.

And in fact, a comment that Mr. Reilly made, where he said we're looking for the root cause - I would hasten to add there isn't a root cause. It's a bad term. There are many causes and contributing factors, and to say that there's just one - I doubt you could ever show an event where there was just one cause. There might be one principal cause, but there are many that, you know, contribute and in sum total end up with a bad event. And you have to look at the myriad of things that contribute to a bad event.

COX: We're going to be joined shortly by another person to add another perspective to the discussion. And before we go to break, I want to ask you very quickly about the role that punishment plays in risk assessment and in leading up to these kinds of disasters. Is that a big factor, punishment?

Mr. BAGIAN: Well, I think the fear of punishment is. I think in general, what you're trying to do is prevent a bad event from happening. What has happened can't be undone, but you want to say, how do we get people to feel free to speak up when there's been a problem - so that, if they don't think they did anything heinous, they know they'll be treated fairly?

So you have to make criteria that are clear to people: if they cross this line, they could be punished. And we've done that in health care, to make that clear to people; that way, people can feel safe to speak up about other things.

COX: Speaking up is really a very important point, and we are going to talk about that more with Beverly Sauer, who is a consultant in strategic risk communication.

We're talking about risk management and preventing disasters. What measures does your company have in place to try and ward off accidents? Give us a call, 800-989-8255. I'm Tony Cox. It's TALK OF THE NATION from NPR News.

(Soundbite of music)

COX: This is TALK OF THE NATION. I'm Tony Cox, in Washington.

A major plane crash, a bridge collapse or an oil spill, man-made disasters often have several things in common: arrogance, bad luck and complacency among managers.

Today, we talk about the steps that can be taken to curb the risk-taking environment that leads to these kinds of tragedies. Our guests are James Bagian, a former NASA astronaut who participated in the investigations of the Challenger and Columbia shuttle disasters. And in just a few moments, we'll be joined by Beverly Sauer, a consultant in strategic risk communication who served as a member of West Virginia's Special Commission on the Sago Mine disaster back in 2006.

If you'd like to join the conversation, tell us: what systems does your company have in place to try to prevent disasters in the workplace? Tell us your story. Our number here in Washington, 800-989-8255; the email address, talk@npr.org. And to join the conversation, just go to the website npr.org, and click on TALK OF THE NATION.

Let's take a phone call first. This is David(ph) from Salt Lake City, Utah. David, welcome to TALK OF THE NATION.

DAVID (Caller): Yeah, hello, thank you for taking my call.

COX: You're welcome.

DAVID: I have I've been in the field of disaster management and relief for 12, 13 years now, and one thing that I would say is that whenever there's a major setback, such as the, you know, what we're seeing in the Gulf, there are very quickly developments - such as, I believe it was after Three Mile Island - there was the creation of what's called the joint information center, which allows the stakeholders involved in the response to streamline their messaging to the public and also coordinate their messages internally.

We've also seen, you know, the adoption of what's called the incident command system, which now goes from the federal level down to your township level, so that responders can coordinate better.

And what I would think is that one of the outcomes we might see from what's happening here, is better coordination with the private-sector stakeholders. Because even 10 years ago - I won't say it was an afterthought, but we didn't see that as much. And now I think that may bring that special coordination to the fore.

And I'll go ahead and take my response off the air.

COX: Thank you very much, David, for that, which is a good question for me to put to our next guest, who is joining us here in Studio 3A, Beverly Sauer. Beverly Sauer, as we said, is a consultant and expert in strategic risk communication; she served on West Virginia's Special Commission on the Sago Mine disaster back in 2006 and is also the author of "The Rhetoric of Risk: Technical Documentation in Hazardous Environments." Nice to have you here.

Ms. BEVERLY SAUER (Consultant in Strategic Risk Communication; Author, "The Rhetoric of Risk: Technical Documentation in Hazardous Environments"): Thank you, Tony.

COX: So communication is really big. Jim made reference to it, and so did the caller, in terms of communication between the private and government sectors. How big a factor, Beverly, is communication in terms of preventing these disasters?

Ms. SAUER: Well, Tony, without making communication the answer to every question - which is dangerous in itself, because technology is really important - you have to have communication within an organization so that the people who are on the ground, and that could be inside a coal mine, it could be on the rig, have the ability to communicate what they know to management, because they have a very special knowledge of it. It's like driving a car. You know your car. You know how it performs. And you have to trust those people to make really good decisions.

At the same time, you need a big picture overview, so that people who are focused in the BP disaster, for example, on the slurry and the mud and making changes, are not too focused on what they are doing, that they're communicating with other people who know the effects on the big system.

So it's a communication top-down, communicating that safety is important; and it's communication from the bottom up, what's happening on the ground, and how do we deal with it.

COX: Before we get to our next caller - who is going to be Ken(ph), by the way, so Ken, if you can hear me, just hold on, I'm coming to you in a second - Jim, I want to follow up on this idea of communicating between levels within a company and whether or not the people at the bottom rung are heard, or whether the decision-makers at the top rung even care to hear them.

Mr. BAGIAN: Well, I think generally, they want to hear, but you can look at many studies that are done - we've done them, others have done them - when you ask people how good communication is up and down the hierarchy.

The higher you are in an organization, the managers tend to think communication's great, but when you go to the bowels of the ship, so to speak, they don't feel it's quite so good.

So generally, the people at the top think everything is great: I wish people would talk to me. But the people at the bottom have disincentives, obstacles, may feel intimidated - you know, a whole bunch of things that can cause that, where they don't communicate. And that's where management, leadership, has to constantly offer avenues, techniques to break down those barriers, because you might not hear the things you need from the coal face.

COX: Let's hold on, Beverly. I'm going to let you respond to that, but I want to get Ken in here, because we want to hear what the listeners have to say, as well. Ken, joining us from Alford, Florida. Is that right, Alford or Alfred?

KEN (Caller): Alford.

COX: Alford. Welcome to TALK OF THE NATION.

KEN: Yes, and thank you. I've been in health care - I'm an RN - and a lot of the problems that face nurses every day are medication errors. And they can be catastrophic. The way we deal with them is not to lambaste the person who made the error, but there's a critical point here, and that is that 99.9 percent of the time, the nurse really tried not to make that error. And I think that's one of the things that's missing in corporate America today: that regard for the humans and the resources that are involved.

COX: Thank you very much, Ken, for that call. Beverly, you wanted to make a comment?

Ms. SAUER: Well, I want to say that a lot of the communication that occurs is simply different, and we have to understand that we're not always dealing with the same thing.

Engineers, who are sometimes at the top levels of management, very frequently talk in very complex ways, and they feel that if they're criticized for their communication, they're not trusted for their engineering.

So we need to create a culture where engineers are actually trained to communicate to others outside of their areas of expertise. And this was particularly critical with the blowout preventer, because there are very few engineers that I know or have talked to who actually know how it works. And then if you look at The New York Times articles on the financial issues, people will say, well, the blowout preventer didn't prevent the fire, the blowout preventer didn't prevent this - and very few people actually know what it was intended to do.

So you take that to the Minerals Management Service, and they assume that if it's a blowout preventer, it prevents blowouts. And there are not enough questions asked and not enough communication about what's really going on in some of these complex technologies.

COX: Which brings us back to the beginning, communication, once again.

Ms. SAUER: Communication.

COX: Scott(ph) is joining us from Eagle River, Arkansas. Scott, welcome or no, Alaska, Eagle River, Alaska. Welcome.

SCOTT (Caller): Hi, my name is Scott, and I've got two incidents to relate where there were problems in one spill and one potential spill, because management didn't want to listen to engineering.

In the mid-'90s, there was a (technical difficulties)...

COX: Scott, unfortunately, you are breaking up. Let me ask you to get to another phone, call us back, and we'll try to get you back on the air. Jim, you were an astronaut. Did you feel that people weren't listening to you? Have you heard something when you were, you know, about to go up into space - or you saw something, or you smelled something - that had not been in the book, so to speak?

Mr. BAGIAN: No, I think generally, you felt that when you fed back concerns, they were looked at. I won't say everybody always agreed with the answers, but as Beverly said, it's how do you communicate what level of risk you're going to take.

And I think that's important to point out. There's always risk. There's never zero risk. There's nothing that is safe - that is, it's safe as compared to what? And I think sometimes, when people are sloppy and talk about these things, that gets lost, because if you don't define what safe means, one person has the idea - say, with the shuttle - oh, it's like flying a commercial airliner to Disney World. Well, it isn't like that at all. It's far more dangerous than that.

So one person's view of safe is an airliner, and an astronaut's is it's like getting shot out of a cannon. So it's a quite different view of the world.

COX: Let's talk about something else that's related to safety in the workplace, and that's cost. Cost is always an issue, and here is an email we got: As a recent civil engineering graduate, it seems to me that there is an unwillingness to pay for the cost of designing and maintaining the structures that we rely on.

As we begin more complex designs, the cost will increase if they are to be done safely. True or not true, Beverly?

Ms. SAUER: It's absolutely true. And one of the ironies of all of this is that if you look at the Columbia disaster, NASA estimated that it cost about $6.4 billion to do the cleanup and investigation. There were figures that came out this morning that BP is about $6.1 billion. So it seems to be the cost of a disaster.

The cost of good communication and preparation is much smaller than that and needs to be taken on up front. And when you look at some of these companies - the Upper Big Branch mining disaster - there were over 1,000 violations, which were seen as the cost of doing business in mining engineering.

Of those, the first three were combustible gases, ventilation problems and arcing electrical wires. And I've often said, if you got on an airplane and you knew that there were arcing electrical wires, combustible gases and ventilation problems, you wouldn't get on. But you expect miners to go to work in conditions that are disastrous to begin with - fundamentally disastrous.

COX: Do you find that to be the issue for you in the places where you have studied and worked, Jim, that cost is a major factor with regard to the potential for safety violations, and ultimately, for disaster?

Mr. BAGIAN: I think you'll often hear that said. I'm not sure how true it is. Cost certainly is a factor. It must be considered, because it's not safety at any cost. It's safety at the appropriate cost. And you have to look at the - what is the cost to the organization.

If you spend money to make one thing safer, but then you don't have the ability, say, to buy new diagnostic or therapeutic equipment, there's a real human cost to that, as well. And you have to weigh those and decide which is the best for the population you serve, in general. And I think that's true whether you run an airline, whether you're trying to fly the Space Shuttle or drill a well.

And I think: how do you communicate that? How do you talk about what risk is acceptable? And you'll see NASA's had that problem, and so have others - before the disaster occurs, how well do you communicate to your stakeholders what the true risk of a disaster is and how much you're willing to spend to buy that risk down? And I think that often doesn't occur, in many organizations, until after the disaster occurs. And that's far too late.

COX: You know, we've been talking about space travel. We've been talking about BP. We've talked about the mine disasters. But there's another area where cost and safety are, you know, really close to all of us, and that's the food that we eat. Let me ask Phil from Sparta, New Jersey to join us. Phil, welcome. You're on TALK OF THE NATION.

PHIL (Caller): Thank you very much. I appreciate it. Yeah, you know, the food industry has had a program in place for 40 or 50 years. It was developed by Pillsbury for food for the space flight program. And, basically, what it does is identify the critical hazards and then identify control points that control those hazards. So that way, you don't waste your resources on the unimportant stuff. You focus on the important stuff.

COX: Does it work?

PHIL: It works, but people are going to say, why is all the - why do we have all these problems in the food industry?

COX: Well, I was going to ask you that. Yes. Well, what's the answer?

PHIL: It doesn't work. Oh, I'm sorry.

COX: I was going to ask you if it's working, then why do we have all of these problems in the food industry, then?

PHIL: In my 35 years' experience in the food industry, when it didn't work, it was because the resources were not given upfront to properly develop the program, or the program wasn't followed and the resources weren't given on a long-term basis to continue the program. The program itself is very rigorous. It's human error in choosing not to follow the program.

COX: All right. Phil, thank you very much for that. Once again, Beverly and Jim, back to the idea of the cost. Here's another caller. This is Mark from Grangeville, Idaho. Mark, welcome to TALK OF THE NATION.

MARK (Caller): Hi, yeah. I work in the telecommunications industry. I mean, we have to spend a significant amount of money to be prepared to respond to disasters: major power outages, you know, earthquakes, all kinds of stuff we have to be prepared to respond to. And it comes down to resources. You have to have the money available from the start, from design, from engineering, from construction, and then in maintenance.

I mean - and in my opinion, in most of these cases, it truly comes down to greed. People want to put the money in their pockets instead of investing it on safety and anti-disaster and disaster preparedness. And to say that safety at any cost isn't worth pursuing, I would vehemently disagree. Safety at any cost is what is necessary, and if you can't justify the safety expense, then we don't need to be doing that in the first place.

COX: Thank you very much for the call, Mark.

You're listening to TALK OF THE NATION, from NPR News.

We've got callers lined up. They want to talk. Let's get them in, then I'll come back to both you, Jim and Beverly, for your responses to what we are hearing. Andy from Blountville, Tennessee - welcome to TALK OF THE NATION.

ANDY (Caller): Yes, sir. Thank you. I think the point I'm going to make is probably the one the man from Alaska that you lost was going to say. Management in modern American business does not want bad news. And technical experts who are brave enough to raise alarms are perceived as not team players, and economics trumps everything.

COX: Andy, thank you for that. You know, let's go back to this communication thing, Beverly. Andy was suggesting, as others have, that the folks at the top don't want to hear what the people at the bottom are saying - certainly not if it has a price tag attached to it.

Ms. SAUER: That's very frequently the case - not always the case, but very frequently. And if you look at the BP emails, you see a moment when one of the team members actually says: I don't want to be a bad team member, but I want to make a point. So there is this discussion going on.

But one of the things that happens with emails is that they're not terribly well organized. And so we don't know what the final version of events is. We see discussions happening, people making decisions on the fly, sometimes. And these aren't saved in ways that people can get to. When there's a disaster like the Columbia disaster - when people are in the air and we need to have information and things are happening very quickly - we have emails spread all over the world and inaccessible, sometimes, on laptops. So...

COX: Jim, so that we don't end this conversation on a sour note - we've been, you know, bitching about the boss for the last 20 minutes or so, and probably rightfully so in some cases. And yet you have been researching this, working in this area, studying in this area. Are there no positive steps we can implement now, something that we can look forward to to try to prevent these kinds of disasters?

Mr. BAGIAN: Well, I think there's always risk, but I think there are a number of things. One of your callers talked about the food industry, for instance - I believe he was talking about HACCP, which was developed for food. It's very much like the failure mode and effects analysis that the engineering world uses, and that we developed in health care. And there are tools out there that help people understand how to methodically look for the flaws or vulnerabilities in their system and then prioritize them for action - which one gives you the best utility to deal with first, et cetera. Those can be used. Many industries use them. Some don't.

I think the issue about communication, how do you communicate, and what is the system for communicating concerns so that they don't get lost in emails that are not well communicated or generally available? How well do you have a separate pair of fresh eyes that is not emotionally attached to what's going on? They can act as an ombudsman, in a way. And there are systems like that that can occur, and many good organizations have just those. Are any of them foolproof? Absolutely not.

And I would add the one comment about, you know, if you can't do it safely, you shouldn't do it. The question is: What is safely? Nothing that we do is without risk. There are airliners that do crash. There are trains that do crash. And if we say there can never, ever be an incident, that means we will do nothing. We're not omniscient.

So I think we have to realize what those are, and it's always the one we seize on - here's a case, as you mentioned, the food problems. The percentage of food that is served or delivered that is defective or hurts a consumer is an infinitesimally small amount, and you can't lose sight of that. You have to look at the denominators to see what percent really causes problems, to understand that risk.
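(Bagian's mention of failure mode and effects analysis can be illustrated with a small sketch. A common FMEA convention scores each failure mode for severity, occurrence, and detectability, multiplies the three into a risk priority number, and addresses the highest-scoring modes first. The failure modes and ratings below are invented examples, not anything from the VA, NASA, or BP.)

```python
# Hypothetical FMEA-style scoring: rank failure modes by risk priority
# number (severity x occurrence x detectability). All entries are invented.
from typing import NamedTuple

class FailureMode(NamedTuple):
    description: str
    severity: int       # 1 (negligible) to 10 (catastrophic)
    occurrence: int     # 1 (rare) to 10 (frequent)
    detectability: int  # 1 (almost certain to be caught) to 10 (undetectable)

def risk_priority_number(mode: FailureMode) -> int:
    return mode.severity * mode.occurrence * mode.detectability

modes = [
    FailureMode("wrong medication dose dispensed", 9, 3, 4),
    FailureMode("backup generator fails to start", 8, 2, 6),
    FailureMode("gas sensor reads low", 10, 2, 7),
]

# Highest-priority failure modes come first; those get mitigation effort first.
for m in sorted(modes, key=risk_priority_number, reverse=True):
    print(f"RPN {risk_priority_number(m):3d}  {m.description}")
```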

COX: You know, it's been an interesting conversation, has it not, Beverly?

Ms. SAUER: Yes, it has.

COX: And our time is up, unfortunately. And the only question I can ask you that I can get a really quick answer to is: I'm assuming that we must continue to communicate.

Ms. SAUER: We must continue to communicate. And on management - there are good stories of managers who make a point that they are the leaders: they're leaders in safety, and they're leaders in a safety culture.

COX: Beverly Sauer is a consultant in strategic risk communication. She joined me here in Studio 3A today. James Bagian is a former NASA astronaut who has helped investigate the Challenger and Columbia shuttle disasters. He joined us from member station WUOM in Ann Arbor, Michigan. Thank you both.

This is TALK OF THE NATION, from NPR News.

Copyright © 2010 NPR. All rights reserved. No quotes from the materials contained herein may be used in any media without attribution to NPR. This transcript is provided for personal, noncommercial use only, pursuant to our Terms of Use. Any other use requires NPR's prior permission. Visit our permissions page for further information.

NPR transcripts are created on a rush deadline by a contractor for NPR, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of NPR's programming is the audio.
