Finn Myrstad: What Happens When We Sign Away Our Online Privacy? Do you read the terms and conditions on your apps? Finn Myrstad explains that not only would it take you dozens of hours, but you would probably not agree with all the ways your data is being used.



It's the TED Radio Hour from NPR. I'm Guy Raz. And on the show today, ideas about digital manipulation.


RAZ: All right, let me ask you about Cayla. Tell me, what is Cayla? Tell me about Cayla.

FINN MYRSTAD: (Laughter) So Cayla, I would say, was an Internet-connected doll.


UNIDENTIFIED SINGER: (Singing) Cayla knows millions of things.

MYRSTAD: The children can talk to the doll.


UNIDENTIFIED CHILD #1: How do you make a cake?

COMPUTER-GENERATED VOICE: Mix eggs, flour, milk, butter...

MYRSTAD: And the doll would use speech-recognition technology and answer the child's questions.


UNIDENTIFIED CHILD #2: My Friend Cayla talks, listens to you, plays games with you and knows millions of things.

RAZ: And because Cayla talks to you and listens to you and knows millions of things, Finn Myrstad, who is the director of digital policy at the Norwegian Consumer Council, wanted to see how easily Cayla could be hacked.

MYRSTAD: We decided to investigate or look at this doll more closely, and what we discovered was that it had very little security. So anyone within Bluetooth range or within a certain distance could basically connect to the doll through their phones and initiate the two-way conversation.


UNIDENTIFIED PERSON #2: (As Cayla) Hi, my name is Cayla. What is yours?


RAZ: On the TED stage, you do this. You bring out a colleague, who from, like, backstage is talking to you through Cayla and presumably could talk to a child in their bedroom.


UNIDENTIFIED PERSON #2: (As Cayla) Is your mom close by?

MYRSTAD: No, she's in the store.

UNIDENTIFIED PERSON #2: (As Cayla) You want to come out and play with me?

MYRSTAD: That's a great idea.

UNIDENTIFIED PERSON #2: (As Cayla) Oh, great.


RAZ: It's totally weird.

MYRSTAD: Yeah, yeah. And it's completely, of course, unacceptable. And what made this story worse was that there was a label on the packaging saying that this was an Internet-safe doll while it really had no safety or security precautions in it whatsoever.


RAZ: Now, did Cayla, like, also listen in on the things people were telling her and then using that information?

MYRSTAD: Well, we read the terms and conditions that no one reads, and in there, the company reserved the rights to use the voice recordings of the child and use it for targeted advertisement, for example.


UNIDENTIFIED CHILD #2: Cayla knows millions of things.

UNIDENTIFIED CHILD #1: How do you make a cake?

MYRSTAD: But because we don't have any protections when it comes to data or when it comes to security, the market is flooded with insecure products and products which have a business model of selling our data. And that really decreases trust, and it leaves people completely apathetic to this.


RAZ: OK, so Cayla might sound like an extreme example of a tech company acting irresponsibly, but is it really? Is it an anomaly? Because when you unwrap Cayla and install the companion app, there is a user agreement.


MYRSTAD: So we read also the terms and conditions that no one reads.

RAZ: The terms and conditions that no one reads. Finn Myrstad continues the story from the TED stage.


MYRSTAD: Like most of you, I have dozens of apps on my phone. If used properly, they can make our lives easier, more convenient and maybe even healthier. But have we been lulled into a false sense of security? It starts simply by ticking a box. Yes, we say, I've read the terms. But have you really read the terms? Are you sure they didn't look too long, and the last time you tried, they were impossible to understand, and you needed to use the service now? And now the power imbalance is established because we have agreed to our personal information being gathered and used on a scale we could never imagine.

This is why my colleagues and I decided to take a deeper look at this. We set out to read the terms of popular apps on the average phone. And to show the world how unrealistic it is to expect consumers to actually read the terms, we printed them - more than 900 pages - and sat down in our office and read them out loud ourselves.


UNIDENTIFIED PERSON #3: Angry Birds, terms of service - the following terms of service and end-user license agreement...


MYRSTAD: Streaming the experiment live on our website, it took quite a long time. It took us 31 hours, 49 minutes and 11 seconds to read the terms on an average phone - that is longer than a movie marathon of the "Harry Potter" movies and "The Godfather" movies combined. And reading is one thing; understanding is another story - that would have taken us much, much longer.


RAZ: But this isn't just about, like, a creepy doll or, like, how incredibly frustrating and unrealistic it is to have to read hundreds of pages of user agreements. I mean, all of this data collected on us - I mean, companies can and do use it in some really, you know, sleazy ways, right?

MYRSTAD: Yeah. And what we're seeing increasingly is that this data can also be used to discriminate people. You won't see an ad because you're, for some reason, put in a high-risk category. We know that housing ads have not been shown to people of a certain ethnic minority, for example. So you can discriminate based on ethnicity. We know that job ads have not been shown to people living within a certain zip code or with a certain profile. We know that this data can be used to determine whether you're a risky consumer or not, so access to your - to credit can be a problem.

And you won't even know because it will just be a computer giving you a price or denying you access to a service, and you won't know why.


RAZ: Every single one of us - right? - we download apps, and we go to certain pages. And oftentimes, we give consent without thinking about it for a variety of reasons - 'cause we're out of time, we're in a hurry, whatever it might be. But then these technology companies - their response is, well, listen; we warned you. We gave you all of the information. You made a decision to opt in. But that's actually such an infuriatingly disingenuous response.

MYRSTAD: Yeah, I would agree to that because, for example - let's use Facebook as an example. It's the world's most popular social network. And if you want to stay in touch with your friends, for most people, that's how you do it - either through Facebook or Instagram, which is also owned by Facebook.

RAZ: Sure.

MYRSTAD: So we have to also keep in mind that most people have lots of other things to worry about. They have to go to work. They have to buy food that is safe and good, and they have to take their kids to football practice. So taking these very complicated decisions that could have detrimental effects in the long-term - in the short-term, we are not really equipped psychologically to take that into consideration.

RAZ: I can't help but think that a capitalist model makes this an impossible problem to solve because data is so valuable. It is increasingly important to these companies' bottom lines and to their market capitalization. What - why would they possibly give this up?

MYRSTAD: No. And probably they won't, unless there is actually external pressure. And I think that's probably where, you know, we the citizens can make a difference, to tell our decision-makers, our politicians that we care about this, that we don't want this to be an individual choice. And I think this should be regulated the same way we regulated the environment, the water, big oil companies, tobacco and all of these things.

RAZ: I mean, Google and Facebook - this is their bread and butter. They control a huge percentage of the digital ad market. How could they change their business model to, you know - to continue to make money if that's obviously their priority and their responsibility to their shareholders?

MYRSTAD: Well, I think there is a discussion now whether you could, for example, serve contextual ads where you actually don't need to collect any data about the user, where, based on which website you're on, you will see ads that would be relevant to users of that website.

I would happily use Facebook. And if they asked me every - beginning of every month or beginning of every year, what things are you interested in? What would you like to see of ads? And it was my choice to say, I'm interested in sports. I'm interested in news. And you don't need to track me to - in order to know that you - I could actually willingly be telling them that.

RAZ: I mean, the companies argue, look; we're targeting people because it's better for their lives. It improves their lives. It's more efficient. It gives them opportunities to buy the things they need. We need this data to make - you know, to offer a better experience for consumers. They genuinely believe that what they're doing is for the greater good.

MYRSTAD: Yeah, I've also heard that, and I also see that when I meet with these companies. It's a lot of really good and smart people working at these big tech companies. But what I feel is that they are not at all - or the corporate culture - the companies' structures are not open for scrutiny. They are not open for transparency. And they say, trust us; we will have your best interests in mind.

And I think with all these privacy scandals, people are losing trust, and these companies are becoming increasingly more unpopular. But because they have such a strong grip on the markets - I would call them monopolies - it's really hard for consumers to vote with their feet because they have nowhere else to go.

RAZ: That's Finn Myrstad. He's director of digital policy at the Norwegian Consumer Council. You can see his full talk at

Copyright © 2019 NPR. All rights reserved. Visit our website terms of use and permissions pages at for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.