Mass Shootings Renew Schools' Concerns With Protecting Students

Schools are investing in high-tech solutions to flag potential student perpetrators of violence such as mass shootings. Privacy experts and student advocates are concerned.

RACHEL MARTIN, HOST:

It is the first day of school for many students around the country. After the recent shootings, many school officials, now more than ever, are worried about protecting their students and also identifying troubled students who could be perpetrators. Many schools are investing in technologies to help them do that. NPR's Anya Kamenetz has been reporting on this and joins us now to tell us what she found. Hi, Anya.

ANYA KAMENETZ, BYLINE: Hi, Rachel.

MARTIN: So what do these technologies look like? What are they investing in?

KAMENETZ: Districts nationwide have been investing in more security technology, and this ranges. One company called Gaggle uses machine learning and human monitors to review student emails and posts made within the school software like Google Docs. Another company called Social Sentinel does a similar thing, except they monitor social media posts made by people in the school community. And so we're talking about millions of students here, billions of posts mostly reviewed with artificial intelligence.

And then coming online - newer, less common - are audio alert systems which listen essentially for yelling inside schools. And finally, one district in New York state is rolling out a facial recognition video system.

MARTIN: I mean, it's really hard to prove a negative, right? So it's hard...

KAMENETZ: Yeah.

MARTIN: ...To know if they can measure what these technologies have been able to prevent, what didn't happen. But is there any evidence that they're able to flag some troubled behavior?

KAMENETZ: Well, that really is a question, especially as schools are diverting resources sometimes to these systems. Across the board, what you see most often flagged are profanity, nudity, sexual content. When it is violence, most often it's threats of self-harm because as we know, unfortunately, suicide is a leading cause of death among teenagers.

I talked to Sarah Trimble-Oliver. She's the chief information officer of Cincinnati Public Schools, which is a customer of Gaggle, which, again, scans these internal posts. She says they have 36,000 students, and last year, they had about 90 serious incidents that came into them through Gaggle. And in one case...

SARAH TRIMBLE-OLIVER: It actually came through as an alert for self-harm. But through investigation, you know, we did find that there was some actual planning for self-harm and harm to others.

KAMENETZ: Yeah, so they were able to prevent - potentially prevent - some violence.

MARTIN: Inevitably, though, this raises privacy concerns, I imagine.

KAMENETZ: Absolutely. Groups like the ACLU and the Future of Privacy Forum are concerned that these technologies might be used to unfairly target certain students. I talked to a student privacy activist, Babou Gay, a 16-year-old rising senior at the Bronx High School of Science. And he pointed out that security technologies, no matter what kind they are, tend to be used much more aggressively against students of color like himself.

BABOU GAY: It treats students like criminals that must be invasively monitored instead of treating us like the future of this nation.

KAMENETZ: He also said that there's a big difference between these kinds of technologies and old-fashioned security technologies, you might call them, like metal detectors.

GAY: Considering it's permanent data, you know? With a password, you can't - you can change that if someone hacks into it, for example. But your face will always be your face. Your fingerprint will always be your fingerprint. Your personal data, once exposed, can never be returned.

MARTIN: I mean, this is such an age-old debate - right? - trying to strike the right balance between protecting people and violating their civil liberties.

KAMENETZ: That's absolutely true, and it certainly holds true for people in a variety of settings. When you're talking about schools, though, the student safety experts I've talked to say there's one issue that connects both of these, both security and civil liberties, and that is that what makes a safer school is trust between students and adults in the school. And so privacy and safety experts worry that surveilling students doesn't tend to build up that kind of trust that you need.

MARTIN: NPR's Anya Kamenetz, we appreciate it. Thanks, Anya.

KAMENETZ: Thank you, Rachel.

(SOUNDBITE OF LIAM THOMAS' "GO ABOVE")

Copyright © 2019 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.