GUY RAZ, HOST:
It's the TED Radio Hour from NPR. I'm Guy Raz. And on the show today, ideas around Digital Manipulation.
NIR EYAL: I mean, all design is the art of manipulation. And it doesn't matter what type of design we're talking about; we're trying to engineer people's behavior - to do a particular behavior, to think a certain thought, to feel a certain feeling. All design manipulates us. And to be honest, we pay for the privilege.
RAZ: This is Nir Eyal.
EYAL: And I basically study the intersection of psychology, technology and business.
RAZ: Nir actually consults for tech companies to help them create more engaging products - products that are really good at holding our attention.
EYAL: I am what you might call a behavioral designer.
RAZ: And to understand just how we get hooked, Nir points to a series of studies conducted in the 1940s by the psychologist B.F. Skinner.
EYAL: Skinner did some fascinating experiments where he took these food pellets, and he would give them to a pigeon in a box.
(SOUNDBITE OF ARCHIVED RECORDING)
B F SKINNER: I will try to pick out some particular pattern or behavior, and...
EYAL: And every time the pigeon pecked at the disc, it would receive a reward - a little food pellet.
(SOUNDBITE OF ARCHIVED RECORDING)
SKINNER: The bird is already conditioned to eat from the magazine (ph). Sounds...
RAZ: OK, simple enough - the pigeon gets a reward.
EYAL: And Skinner very quickly realized that, as long as the pigeon was hungry, it would receive this food pellet, and very quickly, he could train it to peck the disc every time.
RAZ: But then one day, Skinner ran out of those food pellets.
EYAL: He didn't have enough, and so he could only afford to give out the food pellets every once in a while. So one time, the pigeon would peck at the disc and receive a pellet; the next time it pecked at the disc, it wouldn't receive a reward.
(SOUNDBITE OF ARCHIVED RECORDING)
SKINNER: Perhaps every 10th time, or perhaps only once every minute, or something like that.
EYAL: And when the schedule of reinforcement was variable, when there was some mystery around when the pigeon would receive the reward, the pigeon would peck at the disc more frequently; the rate of response increased.
RAZ: The more random the reward, the more the pigeon pecked at the disc.
EYAL: This is called a variable reward. And so we see this mechanic in all sorts of things that we find most engaging, most habit-forming, the things that capture our attention. So, of course, online, when you think about the Facebook news feed or the Instagram feed or the LinkedIn feed, I mean, the feed is this masterful manifestation of a variable reward, where, to get more of these rewards, you just have to keep scrolling and scrolling and scrolling, searching for more information.
RAZ: And so all those pings and dings and notifications, those are all triggers that tell us to go back for more, until we're so hooked that we don't need those reminders anymore.
EYAL: Eventually, these companies attach their product's use to an association inside our own heads - that's when the habit takes hold.
RAZ: Nir Eyal picks up this idea from the TED stage.
(SOUNDBITE OF TED TALK)
EYAL: Where do we go when we're feeling lonely? What app or website do we check? Facebook, of course. What about when we're unsure about something? Before we scan our brains to see if we know the answer, we're googling it. And what about when we're feeling bored? Well, that's when people go onto YouTube or Reddit, check stock prices, sports scores, the front-page news. And what we see happening today is that companies building habit-forming technologies are leveraging a simple insight: the easier a behavior is to do, the more likely we are to do it.
RAZ: So Nir, I mean, hearing this and, you know, knowing that you help companies design some of these products, I mean, it almost feels like these things are deliberately designed to addict us.
EYAL: So I think we do have to keep this in perspective. You know, very few people who drink alcohol get addicted to it, even though alcohol is a highly addictive substance. The problem when we pathologize this, when we call everything addictive, is that we're missing the point; you know, we're painting this tech boogeyman, and we're giving these companies more control, believe it or not, than they deserve.
RAZ: But is there an argument that some social media companies are trying to capitalize on our negative emotions to manipulate us into using their products more? I mean, would you agree that's potentially the reality?
EYAL: Guy, every human behavior is prompted by a negative emotion. Everything we do is to escape discomfort. You know, the reason we sit down to watch television or read the newspaper is also discomfort. It's worry. It's fear. It's fatigue. That's what products are designed to do - solve our problems. It's such an easy story to say it's all the companies' fault, that they're doing it to us and hijacking our brains.
But guess what - we have power over this. The fact is, sitting here and shaking our fists at these companies doesn't change anything, right? If we wait for the politicians to do something and hold our breath, we're going to suffocate.
RAZ: So just to be clear - I mean, your view on all this is that we need to take personal responsibility for the decisions we make and that we can't blame technology companies for attracting us, drawing us in, keeping us on their products. Ultimately, it's our own responsibility; we have to make that decision.
EYAL: So I think we're going through this adjustment period. Sophocles, the Greek playwright, said that nothing vast enters the life of mortals without a curse. So many new technologies have a downside; we're dealing with that downside today, and we're learning how to use them appropriately.
RAZ: Sure, sure.
EYAL: You know, when I was growing up in the 1980s, we had ashtrays all over our house. Now, my parents didn't smoke, and yet we had all these ashtrays because, back then, if you walked into somebody's house, you felt free to light up in their living room, back when smoking rates were at 60% of the adult population. Today, they're at 16%. And if somebody came to my house and lit up a cigarette in my living room, I'd kick them out, and we wouldn't be friends anymore. So the social norms changed, and we are currently learning those norms.
RAZ: The analogy with tobacco is an interesting one because other people on the program have made it. There was a transition period where we didn't really understand, you know, the effects of tobacco. And similarly, with social media, we don't fully understand it. But we regulated tobacco, so we could do the same with some of these social media companies and companies that collect vast amounts of data. Do you think that's a fair comparison?
EYAL: Well, remember - we're not freebasing Facebook; we're not injecting Instagram here. Nothing is entering into the bloodstream. These are behaviors. These are habits, and habits can change. And so I think that, you know, again, there's so much we can do right now, why would we wait? And I think there is a role for legislation for some things. I think that these companies' monopoly status needs to be looked at. I think that their use of data needs to be looked at. But for this specific problem of tech overuse, this is our problem. This is something we can do something about.
RAZ: But, I mean, if, as you say, like, we all have agency - right? - to just stop using these products, which I think a lot of people would disagree with (laughter), I mean, do you even think that it's possible for us to be manipulated?
EYAL: Absolutely. So there are two types of manipulation, right? There's persuasion and coercion. Persuasion is helping people do something they want to do; coercion is getting people to do things they don't want to do, and coercion is always unethical. Now, what's the difference between persuasion and coercion? A simple test is regret. Would the user regret using our product or service? Not only is that an ethical question - if you build a product that people regret using, they stop using your product. It's bad for business.
And so it's not for us to judge people. You know, many parents today say, oh, video games - that's a terrible waste of time - as they sit down on the couch and watch a football game. Is watching a football game morally superior to playing Candy Crush on your phone? It's difficult for me to say, and who am I to judge? If that's how you want to spend your time, there's nothing wrong with it.
RAZ: I'm not a neuroscientist or a psychologist - right? - but what I can say is that I fundamentally believe that people can make choices, but there are certainly choices that are made for people. And I think, look - I'm not denying that you have a legitimate argument - you do - I just don't think it's that clear-cut.
EYAL: Yeah. Well, I don't think it's easy, necessarily. I'm not pro-tech all the time; I'm for having a conversation so that we can use these technologies with intent, use them the way we want to, as opposed to the way these companies might want us to. If that's playing a video game, if that's spending time on Facebook, if that's listening to a radio program - wonderful. Spend that time with intent, as opposed to letting other people control your behavior.
(SOUNDBITE OF MUSIC)
RAZ: That's Nir Eyal. He is a behavioral designer. You can see his full talk at ted.com.
NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.