Survivors Laud Apple's New Tool To Spot Child Sex Abuse But The Backlash Is Growing

Apple's new feature to fight child sexual abuse is encouraging to families of survivors. But privacy advocates are trying to convince Apple to drop its plans, fearing they could lead to surveillance.

AUDIE CORNISH, HOST:

This next story may not be suitable for all listeners. Apple says its next iPhone and iPad update will help catch child predators through a sophisticated photo matching system. Privacy advocates worry the system could create a backdoor on all Apple devices. NPR's Bobby Allyn takes a closer look. And we also want to note that Apple is among NPR's financial supporters. Here's Bobby.

BOBBY ALLYN, BYLINE: I recently had a conversation with Ann. We're only using Ann's middle name to preserve her family's privacy. About a decade ago, a family member was arrested for taking sexually abusive photos of her child and sharing them online.

ANN: Imagine the very worst thing that has ever happened to you was recorded and then it's shared repeatedly for other people's pleasure.

ALLYN: Their nightmare didn't end with the arrest. Her child's real name was used with the photos that were circulating.

ANN: Ten years later, we still have people trying to find my child, looking for images, wanting new images. It's, you know, a constant, constant battle.

ALLYN: Child safety groups have for years pressured Apple to help people like Ann. Finally, the company has done something. In an upcoming update to its iOS operating system, all iPhones and other Apple devices will be scanned for child pornography.

ANN: I can't think of a family that I know that is not a fan of companies like Apple stepping up and saying, let's help stop kids from being abused.

ALLYN: How it will work is pretty complicated, but it boils down to this. A database of known child abuse images has been distilled down into encrypted bits of code. Apple created an automated process to compare that code to everyone's photos backed up on the cloud. When there's a match, Apple will notify the National Center for Missing and Exploited Children, which works with law enforcement. Longtime tech critic John Gruber has studied Apple's plan and says as long as Apple implements the system as it says it will, it appears secure.
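For a concrete picture of what hash matching of this kind looks like, here is a minimal sketch in Python. The function names, the exact hash used and the empty database are illustrative assumptions, not Apple's implementation; Apple's actual system relies on a perceptual hash and additional cryptographic protections that this sketch does not attempt to reproduce.

import hashlib

# Hypothetical database of hashes derived from known abuse images; in the
# system described above, this is distilled from a database of known images.
KNOWN_IMAGE_HASHES: set[str] = set()


def image_hash(image_bytes: bytes) -> str:
    """Stand-in hash. A real perceptual hash would also match slightly altered copies."""
    return hashlib.sha256(image_bytes).hexdigest()


def find_matches(uploaded_photos: list[bytes]) -> list[int]:
    """Return the indexes of uploaded photos whose hash appears in the database."""
    return [i for i, photo in enumerate(uploaded_photos)
            if image_hash(photo) in KNOWN_IMAGE_HASHES]


def should_report(uploaded_photos: list[bytes]) -> bool:
    """Per the report, a match triggers a notification to the National Center
    for Missing and Exploited Children."""
    return len(find_matches(uploaded_photos)) > 0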

JOHN GRUBER: I truly believe that Apple has carved out a very carefully planned position that I think maintains the privacy that people expect from their Apple devices.

ALLYN: Gruber admits Apple, which has staked its reputation on privacy, has a lot on the line. If the scanning technology is misused, Gruber says it could be disastrous for Apple. Already, resistance is building.

INDIA MCKINNEY: What Apple is doing is they are putting a bunch of scanners in a black box onto your phone.

ALLYN: India McKinney is with the Electronic Frontier Foundation. She is among those pushing back. Some Apple employees are, too, and more than 7,000 developers and security and privacy experts have signed a petition asking Apple to drop the plan. They call it a backdoor that threatens the privacy of all users of Apple products. McKinney worries about where this could lead down the road.

MCKINNEY: How could this idea be misused by abusive governments or abusive spouses or abusive parents?

ALLYN: Critics like McKinney point out that Apple sells iPhones in Saudi Arabia without FaceTime, since local laws prohibit encrypted calls. So the fear is that Apple will make similar concessions with its photo-scanning technology. Apple says it would refuse any such demand. The company says the tool is built solely to detect images of child sex abuse. People can opt out by not using iCloud to back up their photos. Ann says she doesn't think this will keep images of her child off the internet altogether, but it will go a long way toward stopping people from seeing them.

ANN: I know that my child's images have been identified hundreds of thousands of times, so there's quite a widespread number of them out there.

ALLYN: And each time one is found by child safety groups, she's notified. She says it's overwhelming.

Bobby Allyn, NPR News, San Francisco.

Copyright © 2021 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.