RENEE MONTAGNE, Host:
In a complex world, the process of trial and error is essential - so writes Tim Harford in a new book called "Adapt." And while his words might seem like common sense, the idea is remarkably hard for humans to accept, because with error comes failure. We invited Harford to talk about this. He writes the column "Undercover Economist" for The Financial Times, and he joined us from our studios in New York.
TIM HARFORD: Good morning.
MONTAGNE: Now, the subtitle of your book is "Why Success Always Starts with Failure," and for people familiar with Internet startups, that sounds pretty right, actually. It seems like every successful Internet CEO has a list of failures under his or her belt. But why is it so essential to Internet entrepreneurs, but not, perhaps, to the wider economy?
HARFORD: Yes. I wish everybody else learned a little bit more from Silicon Valley. The point I'm making is not that going through failure makes you a better person and you learn essential things. Although that might be true, my point's quite simple. Failure's inevitable. It happens all the time in a complex economy. And how did the economy produce all these amazing things that we have around us, computers and cell phones and so on? Well, the process was trial and error. There were a bunch of ideas, and the good ones grew and prospered, and the bad ones were pretty ruthlessly weeded out.
MONTAGNE: Give us an example of success that emerged after and actually even because of failure.
HARFORD: One of my favorite examples is Johannes Gutenberg, who invented the movable type printing press we celebrate today. He, in some ways, created the modern world. A lot of people don't mention that Gutenberg went bankrupt producing his most famous object, the Gutenberg Bible. He developed the technology, and then he was wiped out.
MONTAGNE: In a sense, one does wonder why learning from mistakes is so hard. Are people overwhelmed by failure?
HARFORD: Well, if you talk to a professional poker player, he or she will tell you that the most vulnerable time in a poker game is just after you have made a mistake or you've been unlucky and you've lost a lot of money. And then the risk is you do something called going on tilt. When you go on tilt, you just start chasing to try and get your money back. You start being very reckless.
This same pattern has been discovered, for instance, in the behavior of stock market investors. They will also chase losses and double down in the hope of being able to get out of the game having won everything back, so they don't have to admit that they were losers. And that's a real problem, because the whole process of learning from failure means discarding stuff that's not working. But, in fact, our natural reaction is to keep going, to throw more money behind it, to throw more emotional energy behind it, because if we keep pushing, maybe somehow this bad idea will turn into a good idea. I mean, that's the challenge that we face.
MONTAGNE: Let's talk about a huge failure that's touched virtually everyone: the financial crisis. Bankers bought complex financial derivatives aimed at reducing the risk of losing money. In fact, all these fancy risk-reducing products created an even bigger disaster. Have we learned from that financial crisis?
HARFORD: I think we're still learning. I mean, it was a tremendously traumatic experience for everybody. I think one way of looking at this is when a market is working well, it's trying out lots and lots of different ideas. So you have pluralism.
And then there's a discipline. You get rid of the ideas that are not working, and you stick with the ideas that are. And at the heart of the financial crisis, we had a big plunge into complex, subprime-related products, and there was neither pluralism nor discipline. All the banks were doing the same thing, and nobody actually knew whether these products were any good or not. So there was no discipline to the process. So viewed in that way, it's not surprising that when the system blew up, the effects were catastrophic.
What I've learned and drawn from the financial crisis - and I feel the regulators need to learn this lesson - is that there are certain systems where you need to have a very low tolerance for mistakes. A nuclear power station is an example. An oil rig is another example. We can learn a lot from the people who understand how industrial safety works, and it is all about catching those failures when they're small and isolating them. And I don't think regulators have really got that point yet.
MONTAGNE: What is an example of a company that has embraced failure, the concept of trial and error, and then been successful?
HARFORD: So, some successful firms are now experimenting with devolving responsibility to the staff who have direct contact with the customers on the principle that they can adapt and adjust to whatever the customers want, offer them the right deals. They can experiment with marketing and advertising and product placement and immediately see the results of their efforts.
The supermarket chain Whole Foods has quite a radical employee empowerment program, where employees get to decide whether another employee can work in their team or not. If they think this person's a slacker, doesn't have good ideas, they can vote and say, no, we don't want this person to be working with us on the vegetable aisle. And so that decentralization is a fascinating process. I think it's getting more and more widespread.
MONTAGNE: Thank you very much for talking with us.
HARFORD: Thank you. It's been great.
MONTAGNE: Tim Harford writes the "Undercover Economist" column for The Financial Times. His new book is called "Adapt: Why Success Always Starts with Failure."
NPR transcripts are created on a rush deadline by a contractor for NPR, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of NPR's programming is the audio.