GUY RAZ, HOST:
On the show today, ideas about the dark side of innovation - the fallout, the downstream effects and the unintended consequences of all of this technology we're creating today.
EDWARD TENNER: I'm not sure that's exactly the way I would put it.
RAZ: And then there are some of us who aren't freaking out, who actually love unintended consequences.
TENNER: What I love about them is the way that life is so unpredictable. And you really wouldn't have positive surprises unless there were also negative ones.
RAZ: This is Edward Tenner. He's a historian of technology.
TENNER: To me, the philosophy of unintended consequences really means keeping open. It means constantly observing. The people who see endless despair and suffering on one side from technology and the people who see a wonderful new world are really both ideological, and I don't think that either is wrong. I think it's important to have perspective on decisions and on history that will let us look at change with more equanimity.
RAZ: And as Edward points out, if you look at innovation throughout history, it's always better to take the long view. Here's more from Edward Tenner on the TED stage.
(SOUNDBITE OF TED TALK)
TENNER: Let's go to 10,000 years before the present to the time of the domestication of grains. What would our ancestors 10,000 years ago have said if they really had technology assessment? And I can just imagine the committees reporting back to them on where agriculture was going to take humanity, at least in the next few hundred years.
It was really bad news. First of all, worse nutrition, shorter life spans. It was simply awful for women. The skeletal remains from that period have shown that they were grinding grain morning, noon and night. And politically, it was the beginning of a much higher degree of inequality. If there had been rational technology assessment then, I think they very well might have said, let's call the whole thing off.
RAZ: Of course, this was going to be better for humans in the long term. But some bad things were going to happen as well.
TENNER: That's right. And my point there was we have to recognize the limits of technological assessment. We have to take a longer view, which means that sometimes, as I like to say, things can go right only after they've gone wrong. And if we try to prevent any new development with potentially bad consequences, we may be freezing in the bad consequences that we already have.
RAZ: Which is interesting because it reminds me of something you bring up in your TED talk - which I didn't know - the example of the Titanic. After it sank, there were all these laws passed that required ships to carry more lifeboats.
TENNER: Yes. Well, I think when we think about the Titanic, we have to disregard, for the moment, the films we've seen about the Titanic and put ourselves in the position of captains of ships on the North Atlantic at that time. Sea ice was known as a problem, but it was not known as a problem that caused massive loss of life. It could damage a ship, but there was always time for the rescue of the passengers and crew. So the Titanic was really a case where everything worked out wrong, and the rescue didn't come.
(SOUNDBITE OF TED TALK)
TENNER: The lesson of the Titanic, for a lot of the contemporaries, was that you must have enough lifeboats for everyone on the ship. And this was the result of the tragic loss of lives of people who could not get into them. However, there was another case, the Eastland, a ship that capsized in Chicago Harbor in 1915, and it killed 841 people. That was 14 more than the passenger toll of the Titanic. The reason for it, in part, was the extra lifeboats that were added that made this already unstable ship even more unstable.
And that, again, proves that when you're talking about unintended consequences, it's not that easy to know the right lessons to draw. It's really a question of the system, how the ship was loaded, the ballast and many other things.
(SOUNDBITE OF MUSIC)
RAZ: I mean, it is an amazing example of how we are kind of wired to react, right? Like, something bad happens, and then to solve it or to prevent it, the reaction or the solution that we put forward has worse consequences than doing nothing.
TENNER: Yes. Well, I think we can say that, very often, the means that you put in place after some kind of disaster will, in the long run, lead to the next disaster. For example, new bridge designs have a life span of about 30 years. There is some disaster that leads to a new type of bridge. And then engineers get more and more confident in the design. They get bolder and bolder. And then there is some kind of new catastrophe that leads to reconsidering that technology, and the cycle starts all over again.
RAZ: I mean, it sounds like what you're saying is that, look; there's no point in worrying about this stuff or bothering with this stuff because, you know, the course of history is the course of history - that we can't necessarily shape it. And I wonder whether that's true.
TENNER: I'm not saying that we should do nothing, that we shouldn't take any action. But we should also realize that, really, two things happen. That, first of all, the positive outcomes that we expect are usually not nearly as positive as we imagine them. But also, the negative things don't turn out in the same way. For example, we tend to think that what is going on is just going to go on and on and get worse and worse, or it's going to go on and on and get better and better. And reality usually has surprises for us.
RAZ: But, Edward, take something, you know, as scary as climate change, right? I mean, isn't there some value in anticipating the worst-case scenarios and then trying to prevent them?
TENNER: I think it's very important that the fear of worst-case scenarios is leading to all kinds of proposals for geoengineering, for 100 percent renewable power. I'm all for this, and I think it's very good that our fear of apocalypse is motivating that. So I don't dispute that at all.
But unless you're actually working on something concrete to deal with a problem, I don't think it's really terribly helpful to worry too much about it if there isn't something you can do.
RAZ: I don't know. I mean, I think you're right, and I feel very reassured by this. But, you know, in the middle of the night when I wake up in a cold sweat, I'm thinking we're, like, at the very edge of destroying ourselves. Like, this can be the end of our species.
TENNER: Yes, it could. Or probably, more likely, it would mean a worldwide degradation of the living conditions of humanity. But remember, there was always a positive side of these epidemics. So for example, after the Black Death in the 14th century, if you survived, it was a very good time to be a peasant. You had lower rents. There were more opportunities for people to become artisans and masters of their own workshops. So there was really a lot of opportunity if you didn't get killed by the epidemic.
RAZ: Great if you made it through, right?
TENNER: That's it. So maybe that's the one bright spot. So, you know, hope that you'll be one of the survivors.
(SOUNDBITE OF MUSIC)
RAZ: That's Edward Tenner. His latest book is called "Our Own Devices: The Past And Future Of Body Technology." You can watch his entire talk at ted.com.
(SOUNDBITE OF SONG, "APOCALYPSE DREAMS")
TAME IMPALA: (Singing) This could be the day that we push through. It could be the day that all our dreams come true for me and you.
RAZ: Hey, thanks for listening to our episode on Unintended Consequences this week. If you want to find out more about who was on it, go to ted.npr.org. To see hundreds more TED Talks, check out ted.com or the TED app.
Our production staff at NPR includes Jeff Rogers, Sanaz Meshkinpour, Jinae West, Neva Grant, Casey Herman, Rachel Faulkner and Diba Mohtasham, with help from Daniel Shukin (ph) and Megan Schellong. Our intern is Daryth Gayles. Our partners at TED are Chris Anderson, Colin Helms, Anna Phelan and Janet Lee. I'm Guy Raz, and you've been listening to ideas worth spreading, right here on the TED Radio Hour from NPR.
NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.