ALEX GOLDMARK, HOST:
Look around the world right now, and you can see the same kind of drama playing out over and over again - two sides in a standoff on the brink of some kind of conflict.
ROBERT SMITH, HOST:
So we have the nuclear situation with North Korea. There is this on-again, off-again possible trade war with China.
GOLDMARK: And it might not seem like it on its face, but these global conflicts follow a pattern that we see everywhere down to the tiniest little ways in our own daily lives, like two roommates who are watching the dirty dishes pile up in the sink waiting for the other one to give up and clean them first.
SMITH: These are all problems where both sides are better off if they cooperate, but no one wants to be the sucker and cave first, so they don't. And they end up not cooperating.
GOLDMARK: This kind of standoff is so common. It comes up so often, and it is so tricky to solve. It has its own name - the prisoner's dilemma.
(SOUNDBITE OF MUSIC)
GOLDMARK: Hello, and welcome to PLANET MONEY. I'm Alex Goldmark.
SMITH: And I'm Robert Smith. The prisoner's dilemma is one of the most famous thought experiments in economics.
GOLDMARK: Also math, political science, psychology.
SMITH: It is one of the most famous thought experiments period.
GOLDMARK: More than ever, this little thought experiment is explaining so much of our world right now that we just have to tell this particular story.
SMITH: Today on the show, the story of the idealistic, glorious nerd who set out to solve one of the greatest problems of human existence.
GOLDMARK: And we'll say it right here. What he found out was that sometimes the best way to win is to lose.
(SOUNDBITE OF MUSIC)
GOLDMARK: Let's meet the man who changed the way we think about cooperation and conflict, Bob Axelrod, lifelong peacemaker.
BOB AXELROD: Yes. I'm a mediator or try to help each side understand the other.
SMITH: Bob isn't quite sure where his obsession with cooperation came from. Maybe it was growing up during the Cold War, or maybe it was the fact that he was the youngest in his family.
AXELROD: I think my motivation was more from my parents, who bickered a lot. And I thought that's a shame. And I wasn't comfortable with it. And I thought that they should be able to get along better.
GOLDMARK: Bob went on to become a professor at the University of Michigan in political science and public policy. And now, instead of worrying about his parents bickering, he's worrying about something much bigger.
SMITH: Yeah, two countries bickering - the United States and the Soviet Union. In the 1970s and 1980s, we were all worried about global thermonuclear war.
GOLDMARK: And Bob is trying to figure out how to solve this problem. And he looks around, thinking, what could I, a professor at a Midwestern university, possibly do? And he sees something that inspires him - the budding technology at the time, specifically computers that play chess.
SMITH: Going to plug this in right now.
COMPUTER-GENERATED VOICE: I am Fidelity Chess Challenger...
SMITH: This is my most beloved toy from my childhood, 1978 Fidelity Electronics Chess Challenger - computer chess.
GOLDMARK: I love that you brought this in. It's in a whole special briefcase. And it kind of looks like a calculator attached to a maybe metal chess board.
SMITH: Isn't that amazing?
COMPUTER-GENERATED VOICE: Illegal move.
SMITH: Oh, sorry, illegal move.
GOLDMARK: And so he sees these computers, these chess sets, and thinks maybe he can use computers to try and figure out the smartest strategy for global conflict.
AXELROD: I was inspired by computer chess tournaments, where obviously if you're going to write a program to play computer chess, it could be very sophisticated and handle many contingencies.
SMITH: The first step, though, is he needs to frame these political problems in a way a computer can handle. And he chooses that famous thought experiment, the prisoner's dilemma.
AXELROD: Well, the name comes from the story about police arresting two people who might have committed a crime. And they...
SMITH: You know what? The prisoner's dilemma is so important to the story that we are going to play this out as if it were real life. So I just pulled my lunch out of the fridge, and there's a mystery. My turkey sandwich is missing. Let's round up the usual suspects.
GOLDMARK: To the newsroom.
SMITH: Sarah. Sarah, Sarah, Sarah. I bet you're wondering why we brought you in here.
SARAH GONZALEZ, BYLINE: (Laughter).
SMITH: Oh, yeah, that laugh. That laugh tells me you know exactly why we brought you in here.
GONZALEZ: I don't know what you're talking about.
SMITH: You and Cardiff Garcia took the turkey sandwich out of the fridge. Oh, I'm sorry. Is this light too bright?
SMITH: Meanwhile, Alex is in the other studio with Cardiff.
GOLDMARK: Come on in here, Cardiff. Listen. We know that Robert's lunch has been stolen.
CARDIFF GARCIA, BYLINE: You don't know nothing.
GOLDMARK: Here's the deal. You've got to make a choice. You've got to decide. Are you going to keep your mouth shut, or are you going to rat Sarah out? Because if you rat Sarah out and you let her take the fall by herself, we'll let you go free.
SMITH: Now here's the twist. If he rats you out and you rat him out, then you're both going to get punished pretty harshly. If neither of you rats on the other one, we're still going to punish you.
GOLDMARK: If neither of you talk, both of you will get the lesser charge.
SMITH: Just going to suspend you for one week. So the only way you can walk free is by telling us Cardiff did it.
(SOUNDBITE OF MUSIC)
SMITH: And - scene. All right, you can leave now.
GONZALEZ: Thank you.
SMITH: I'm going to need the sandwich back.
GOLDMARK: Ask Cardiff.
SMITH: Alex, come back in. Come back in to the studio.
GOLDMARK: All right, Robert.
SMITH: So looking at this ideally - ideally, what should Cardiff and Sarah do?
GOLDMARK: If they trust each other, they should keep their mouth shut. That would overall be the best thing for both of them.
SMITH: But if they don't trust each other - and remember, they are criminals...
GOLDMARK: Why would they?
SMITH: ...Then it might be in their individual best interest to sing like a canary.
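The standoff the hosts just acted out can be sketched as a small payoff table. The numbers below are illustrative (higher is better) and are an assumption for the sketch, not values from the episode:

```python
# "C" = stay silent (cooperate with your partner), "D" = rat them out (defect).
# Illustrative payoffs only - the episode never gives numbers.
PAYOFFS = {
    ("C", "C"): (3, 3),  # both keep quiet: lesser charge for each
    ("C", "D"): (0, 5),  # you stay quiet, partner rats: you take the fall
    ("D", "C"): (5, 0),  # you rat, partner stays quiet: you walk free
    ("D", "D"): (1, 1),  # both rat: both punished harshly
}

def play(me, partner):
    """Return (my payoff, partner's payoff) for one round."""
    return PAYOFFS[(me, partner)]

# Whatever your partner does, you personally score higher by defecting...
assert play("D", "C")[0] > play("C", "C")[0]
assert play("D", "D")[0] > play("C", "D")[0]
# ...and yet mutual cooperation beats mutual defection for both of you.
assert play("C", "C")[0] > play("D", "D")[0]
```

That tension - defecting is individually tempting, but mutual cooperation is collectively better - is the whole dilemma.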
GOLDMARK: So in the real world, these kinds of situations don't just get played out once in a radio studio. These kinds of conflicts happen over and over again.
SMITH: And say you keep bringing Sarah and Cardiff in, and Sarah is thinking not just about what to do but what did Cardiff do last time? And he's thinking, well, wait, what did Sarah do last time? And so on and so on.
GOLDMARK: And this is essentially what a trade negotiation is between, say, the U.S. and China. They aren't just picking tariffs once and setting those levels. They're thinking, what did the other side do in all of the rounds up to now? And what does that tell me about what they're going to do next?
SMITH: There are literally a billion options.
GOLDMARK: And you know who loves sifting through a billion options?
COMPUTER-GENERATED VOICE: From 8-7 to 8-5...
SMITH: Computers - and Bob Axelrod.
GOLDMARK: So Bob thinks there might be a best way to play the prisoner's dilemma of all the billions of options and that he can find it.
SMITH: So he goes to the library and looks up everyone he can find who had written about the prisoner's dilemma. And he picked out the best and the brightest and the biggest brains he could get in touch with. And he got their addresses.
AXELROD: What I did is I wrote to other faculty members who had written about the prisoner's dilemma.
SMITH: Write - you mean a letter, like in the mail?
AXELROD: Yeah, snail mail. And I said, you know, I'm interested in how - what's a good way to play the game? And you've written about this. Could you pick a strategy and define it in terms of a computer program?
SMITH: And he had one of those computer terminals. And back in the 1970s, they were usually these beige boxes with bright green glowing cursors. And he takes his letters, and he somehow has to get them inside the box.
AXELROD: The first round I did all by myself.
SMITH: Oh, you were the nerd.
AXELROD: I was the nerd, right.
SMITH: Sorry. Basically, inside the computer, he designs a virtual prison with two virtual prisoners kept separate.
GONZALEZ: Player One.
GARCIA: Player Two.
GOLDMARK: Each prisoner is given a particular strategy, one of the strategies sent in by the experts.
SMITH: And they play hundreds of rounds against each other with those strategies. And then they get two new strategies.
GOLDMARK: And play again - inside the computer, of course. So, for instance, one strategy was really simple - be random, unpredictable. Sometimes you keep your mouth shut and cooperate with the other prisoner, and sometimes you rat him out. You defect.
SMITH: Another strategy was way more complicated. It was like an early version of artificial intelligence. This strategy was called lookahead.
AXELROD: It had a model of what the other player was doing. How often would it cooperate after cooperation? How often would it cooperate after defection? And then it would project into the future. If my model of the other player is correct, what's the best I can do with that kind of player?
SMITH: There was one called tranquilizer that lulled its opponent into thinking it was very cooperative, then later - bam, lights out, lots of defections.
GOLDMARK: One economist sent in an approach called grim trigger. It was just a giant complete and total threat. If you ratted against it once, it would just defect on you afterwards forever and ever. It meant to scare the opponents into cooperating.
SMITH: And there was another super-simple strategy - tit for tat. Just do what the other side does back to them. So 14 different strategies lined up ready for a tournament inside this virtual prison, inside this old-school computer.
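The tournament Bob ran can be sketched in a few lines. This toy version pits three of the strategies mentioned above - random, grim trigger, and tit for tat - against each other; the payoff numbers are the standard textbook ones, an assumption rather than anything from the episode:

```python
import random

# Standard illustrative payoffs: reward, temptation, sucker, punishment.
R, T, S, P = 3, 5, 0, 1

def score(a, b):
    """Payoffs for one round; 'C' = cooperate, 'D' = defect."""
    table = {("C", "C"): (R, R), ("C", "D"): (S, T),
             ("D", "C"): (T, S), ("D", "D"): (P, P)}
    return table[(a, b)]

# Each strategy sees the opponent's full history and returns "C" or "D".
def tit_for_tat(opp_history):
    # Cooperate first, then mirror the opponent's last move.
    return "C" if not opp_history else opp_history[-1]

def grim_trigger(opp_history):
    # Cooperate until the opponent defects once, then defect forever.
    return "D" if "D" in opp_history else "C"

def random_play(opp_history):
    return random.choice("CD")

def match(strat_a, strat_b, rounds=200):
    """Play two strategies against each other; return their total scores."""
    hist_a, hist_b = [], []
    total_a = total_b = 0
    for _ in range(rounds):
        a, b = strat_a(hist_b), strat_b(hist_a)
        pa, pb = score(a, b)
        total_a, total_b = total_a + pa, total_b + pb
        hist_a.append(a)
        hist_b.append(b)
    return total_a, total_b
```

In a field this tiny, the rankings can differ from Bob's full 14-strategy tournament; the point of the sketch is the mechanics - every strategy plays every other one for hundreds of rounds, and total score across all matches decides the winner. Notice that tit for tat can never finish a match ahead of its opponent, since it only ever defects in reply to a defection.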
GOLDMARK: Which took a while to process back in those days.
SMITH: So wait. What did you do during those two hours while it's churning away?
AXELROD: Ate lunch (laughter).
GOLDMARK: So you just set it to go on your computer in your room and you walked away?
GOLDMARK: He's a pretty chill tournament referee. So who wins, the grim trigger, the tranquilizer, that super fancy artificial intelligence one?
SMITH: You go and you look at this list, and what do you see?
AXELROD: Well, I see that the simplest of all the strategies was the one that did best.
SMITH: Which was...
AXELROD: Tit for tat.
GOLDMARK: Tit for tat - worst name, best results - this round, anyway.
SMITH: Like any good researcher, Bob was doubtful of the results. How can the simple strategy really be the best? So Bob decided to do the whole tournament again.
AXELROD: And so what I did is I sent an announcement to several computer hobby magazines saying, here's what the prisoner's dilemma is. I'm going to run a tournament, and any hobbyist, anybody who wants can send some strategy in.
GOLDMARK: So smart.
SMITH: That's amazing. And, you know, it could be a young Bill Gates, Steve Jobs there reading this in the back of the magazine.
AXELROD: And the thing about computer hobbyists is they have time to think about this, play with it.
GOLDMARK: He got 63 competitors, all of them gunning to take down simple tit for tat. And this time, it took the entire night for his computer to have all of those prisoners play each other.
SMITH: You go home overnight, get up in the morning. Did you drink coffee?
AXELROD: Look. This is 30 years ago.
SMITH: Right. OK. Fair enough.
GOLDMARK: But he does remember printing off the results.
AXELROD: And so it was a whole bunch of pages. And I had to tape them together to make this one huge spreadsheet. And lo and behold, it was the tit for tat strategy again. I thought, wow, that is really cool.
GOLDMARK: Bob's tournament was revealing something fundamental about how people can get along with each other, and it's all wrapped up in simple tit for tat.
SMITH: Here's how the tit-for-tat strategy works. The strategy says that the first time you walk in, you assume the best from the other prisoner. And you do not rat them out.
AXELROD: I use the word nice. I said a nice rule is one that won't be the first to defect. And I've found that the nice rules actually did quite well.
GOLDMARK: Right. So start nice, but you can't be nice all the time. If you're playing against the grim trigger, you have to respond.
AXELROD: So if the other side's taking advantage of you and you don't do anything about it, they're just going to try again harder. To set things straight, you have to demonstrate that there will be consequences for their bad acts.
SMITH: If your opponent is nice, you stay nice. If your opponent hits you, you hit back. The key is still to look at the other side and think ahead, think about how to know what they're going to do, how to send a message that cooperation is the best in the long run and that you will play nice.
GOLDMARK: Now, there's something very special about tit for tat - special to my heart, for real. It never tries to beat its opponent. Remember, it won't defect unless it's provoked. So in the short term, in any one round, the best it can do is tie. But in the long run, even if it loses some one-on-one matchups by a little bit, it ends up better off overall.
AXELROD: You do no better than the other player, but you do terrific. And the reason you do terrific is you elicit a lot of cooperation from the other player. To do well against tit for tat, the other player needs to cooperate a lot; otherwise, it gets punished.
GOLDMARK: I think there's something beautiful and poetic and hopeful about the idea that this strategy that does the best in the long run is the one that never actually tries to beat its opponent.
GOLDMARK: Bob Axelrod publishes his results in 1980, and he sits back and waits for peace to break out around the world.
SMITH: Did anyone from the White House call?
AXELROD: No (laughter).
AXELROD: Neither then, nor later.
GOLDMARK: But after he runs these tournaments and crowns tit for tat the two-time champion, Bob realizes there is a serious problem here. Tit for tat has a dark side in the real world. Say two countries are negotiating something important, and one of them says or does something that the other side interprets as an insult or maybe even a threat. Well, then if both sides are doing tit for tat, all it takes is one misunderstanding, then this can set off a chain reaction.
SMITH: And in a world of miscommunication, if you have two sides doing tit for tat, then you can start defecting against each other. And you defect all the way down to annihilation.
GOLDMARK: So Bob figured this out a bit too late for his tournament, but he comes up with a solution: a tiny tweak to tit for tat, something he called generous tit for tat. This version says, yes, retaliate when you are provoked, but also sometimes turn the other cheek - that simple. So when you see two countries in a standoff over a nuclear deal or trade barriers, it's easy to think that the back and forth is a petty, irrational tit for tat that's going to bring the whole thing crashing down. But look for the generous moments, where a tit doesn't earn a tat back. It might just be the optimal strategy for cooperation.
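The fix Bob describes shows up clearly in a short simulation. Here two tit-for-tat players start just after one misunderstanding - one of them has defected by mistake - and we count how many of the next thousand rounds are mutually cooperative. The 10% forgiveness rate is an illustrative assumption, not a number from the episode:

```python
import random

def tft(opp_last, forgiveness):
    # Mirror the opponent's last move, except that after a defection
    # we forgive - cooperate anyway - some fraction of the time.
    if opp_last == "D" and random.random() < forgiveness:
        return "C"  # turn the other cheek
    return opp_last

def rounds_of_peace(forgiveness, rounds=1000, seed=42):
    """Count mutually cooperative rounds after one misunderstanding."""
    random.seed(seed)
    a_last, b_last = "C", "D"  # B has just defected by accident
    mutual = 0
    for _ in range(rounds):
        a, b = tft(b_last, forgiveness), tft(a_last, forgiveness)
        if a == "C" and b == "C":
            mutual += 1
        a_last, b_last = a, b
    return mutual
```

Without forgiveness, the two players alternate retaliation forever and never again share a cooperative round; with even a small chance of turning the other cheek, cooperation snaps back almost immediately and stays locked in.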
(SOUNDBITE OF MUSIC)
GOLDMARK: Game theory heads out there, we know we had to leave out a bunch of little tidbits. So if you want to talk about it, we post a link to every episode on Facebook. Leave a comment there. We're also on Twitter and Instagram. We are @planetmoney.
SMITH: You can also email us at firstname.lastname@example.org.
GOLDMARK: We love getting your emails, especially the ones that ask us how you can help out PLANET MONEY. Here is how. Think of the one friend in your life who would most like listening to this podcast and tell them about your favorite episode - that simple.
SMITH: Today's episode was produced by Megan Tan. Our editor is Bryant Urstadt. I'm Robert Smith.
GOLDMARK: I'm Alex Goldmark. Thanks for listening.
(SOUNDBITE OF MUSIC)
GARCIA: That is quite a dilemma for me.
SMITH: What voice are you doing?
GARCIA: It's like the voice where it's like - it's like the voice in those movies where you're like, you don't got nothing on me, copper, from like the 1950s or something. So here we are.
SMITH: That is actually all we need from you. I don't know whose lunch this is. And it's disgusting.
GONZALEZ: Oh, my God, you actually stole someone's lunch?
SMITH: No, I'm going to put it back.
NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.