Rana El Kaliouby: Will Our Screens Soon Be Able To Read Our Emotions?

Despite their powerful computing capability, our screens have no way of knowing how we feel. Computer scientist Rana el Kaliouby says that's about to change.

GUY RAZ, HOST:

So if your screen or your device is like an extension of your mind, it's also a way to convey your emotions. You can write a Facebook post about a bad day or you can put an Instagram photo of a great day, smiling. But the thing is your screen can't really understand how you feel - at least not yet.

RANA EL KALIOUBY: I think building machines and devices that can sense your emotions, I think, with that type of technology instead of people, like, emoting these social signals and then they disappear into cyberspace, we have an opportunity to capture them.

RAZ: This is Rana el Kaliouby. She's a computer scientist who spent the past few years at MIT's Media Lab. And the idea that our devices could be built to detect our emotions - it came to Rana when she moved from Cairo to Cambridge in the U.K. back in the late 1990s to work on her Ph.D.

KALIOUBY: That was the first time I was basically away from home.

RAZ: Rana was in England by herself, thousands of miles away from anyone she knew.

KALIOUBY: And I was pretty lonely. There were a lot of these days.

RAZ: And because she was in a new place where she didn't know anyone, Rana was actually spending most of her time with her laptop, which wasn't much of a friend at all.

KALIOUBY: It had absolutely no idea what my emotional state was. And I could be happy or I could be stressed or frustrated, and it would be completely oblivious.

RAZ: So Rana's sitting there in England, and she's wondering, how could I fix this?

(SOUNDBITE OF TED TALK)

KALIOUBY: What if our devices could sense how we felt and reacted accordingly, just the way an emotionally intelligent friend would?

RAZ: Today, more than 15 years after she first asked this question, Rana explained how she's getting closer to the answer. Here she is on the TED stage.

(SOUNDBITE OF TED TALK)

KALIOUBY: Those questions led me and my team to create technologies that can read and respond to our emotions, and our starting point was the human face. So our human face happens to be one of the most powerful channels that we all use to communicate social and emotional state, everything from enjoyment, surprise, empathy and curiosity. In emotion science, we call each facial muscle movement an action unit.

So for example, action unit 12, it is a lip corner pull, which is the main component of a smile. Another example is action unit 4. It's the brow furrow. It's when you draw your eyebrows together and you create all these textures and wrinkles. We don't like them, but it's a strong indicator of a negative emotion. So we have about 45 of these action units, and they combine to express hundreds of emotions. Teaching a computer to read these facial emotions is hard because these action units - they can be fast, they're subtle, and they combine in many different ways.
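The idea described above, in which individual action units combine into emotion labels, can be illustrated with a toy rule table. The two action unit numbers from the talk (AU12 = lip corner pull, AU4 = brow furrow) come from the standard Facial Action Coding System; everything else here, including the rule sets and function names, is a simplified hypothetical sketch, not Affectiva's actual classifier.

```python
# Toy sketch: combining facial Action Units (AUs) into emotion labels.
# AU12 (lip corner pull) and AU4 (brow furrow) are mentioned in the talk;
# the rule table below is a hypothetical simplification for illustration.

# Detected action units with intensities in [0, 1]
detected = {"AU12": 0.8, "AU6": 0.6}  # lip corner pull + cheek raise

# Hypothetical rules: each emotion label requires a set of active AUs
EMOTION_RULES = {
    "joy": {"AU6", "AU12"},
    "anger": {"AU4", "AU7", "AU23"},
    "surprise": {"AU1", "AU2", "AU5", "AU26"},
}

def classify(aus, threshold=0.5):
    """Return emotion labels whose required AUs are all active above threshold."""
    active = {au for au, intensity in aus.items() if intensity >= threshold}
    return [emotion for emotion, required in EMOTION_RULES.items()
            if required <= active]

print(classify(detected))  # -> ['joy']
```

A real system has to handle the subtlety and speed the talk mentions, which is why production models are trained on millions of examples rather than hand-written rules like these.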

RAZ: So initially, Rana and her team fed tens of thousands of photos of people smiling or frowning into a computer program so it could learn all these tiny little microexpressions. Today, they have 12 billion examples of those expressions, which makes the technology even better at detecting all the subtleties in our faces. And so what they've done is combine the program with a tiny camera, kind of like the one already built into your smartphone. And during her TED talk, Rana brought up a volunteer. Her name is Chloe (ph). And Rana gave Chloe an iPad to hold up in front of her face.

(SOUNDBITE OF TED TALK)

KALIOUBY: As you can see, the algorithm has essentially found Chloe's face - so it's this white bounding box.

And then it identifies the main feature points on your face.

(SOUNDBITE OF TED TALK)

KALIOUBY: Her eyebrows, her eyes, her mouth and her nose.

And it starts tracking how these facial muscles move over time. So, for example, when you smile...

(SOUNDBITE OF TED TALK)

KALIOUBY: And then, as she smiles - this is a genuine smile, that's great. So you can see the green bar go up as she smiles.

Your lip corners move upwards and outwards, and they create these wrinkles around your eyes and your cheeks.

(SOUNDBITE OF TED TALK)

KALIOUBY: Can you try, like, a subtle smile to see if the computer can recognize - it does recognize subtle smiles, as well. We've worked really hard to make that happen.

And it identifies these things and it says, oh, OK. You're smiling here. And it maps it to an emotional state.
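The demo just described follows a pipeline: find the face, locate feature points, track how the muscles move over time, and map that movement to an emotional state. A minimal sketch of the tracking step is below; the class and function names are hypothetical stand-ins, not Affectiva's API, and a single number stands in for what would really be pixel data and landmark geometry.

```python
# Hedged sketch of the demo's tracking step: a per-frame measurement of
# lip-corner movement is turned into a smile signal over time, like the
# green bar in the talk. Names and the 0..1 measurement are hypothetical.

from dataclasses import dataclass

@dataclass
class Frame:
    # In a real system this would be pixel data plus landmark positions;
    # here, just how far the lip corners have moved up and out, 0..1.
    lip_corner_raise: float

def track_smile(frames, threshold=0.5):
    """Return a per-frame label: 'smiling' once movement crosses threshold."""
    return ["smiling" if f.lip_corner_raise >= threshold else "neutral"
            for f in frames]

video = [Frame(0.1), Frame(0.4), Frame(0.7), Frame(0.9)]
print(track_smile(video))  # -> ['neutral', 'neutral', 'smiling', 'smiling']
```

Tracking over time, rather than classifying single frames, is what lets a system distinguish a sustained genuine smile from a brief muscle twitch.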

RAZ: Once Rana and her team perfect this technology, she imagines all kinds of ways other scientists and engineers could build it into our lives. For example, one day you could send an email embedded with data about how you were feeling as you wrote it. Or what if your phone or your laptop or maybe even the mirror in your bathroom read your face every day?

KALIOUBY: And imagine if all these devices talk to each other and they can capture various aspects of your day and your emotions of the day, and we have a baseline for you. So we know what's your norm, and then we also know when you deviate from that norm. So your phone can say, hey, you know, I've noticed that you haven't laughed for the past four days. What's going on? Is there something I can do to help? You know, if it knows your humor profile, for example...

(SOUNDBITE OF ARCHIVED RECORDING)

AZIZ ANSARI: And then I realize, oh, my God, 50 Cent has no idea what a grapefruit is.

KALIOUBY: ...Can find content that is really funny...

(SOUNDBITE OF ARCHIVED RECORDING)

STEVE JOBS: Stay hungry. Stay foolish.

KALIOUBY: ...Or inspiring.

(SOUNDBITE OF ARCHIVED RECORDING)

JOBS: As you graduate to begin anew, I wish that for you.

KALIOUBY: And it can suggest five minutes of meditation.

RAZ: Or dance music.

(SOUNDBITE OF MUSIC)

KALIOUBY: Or dance music (laughter).

RAZ: Yeah.

KALIOUBY: Or that, too.

(SOUNDBITE OF MUSIC)

KALIOUBY: So it can do a lot of things once it understands what your patterns are and what your emotions are over time.

(SOUNDBITE OF TED TALK)

KALIOUBY: I think in five years down the line, all our devices are going to have an emotion chip. And we won't remember what it was like when we couldn't just frown at our device and our device would say, you didn't like that, did you? Imagine if your learning app sensed that you're confused and slowed down or that you're bored so it sped up, just like a great teacher would in a classroom. Emotion-enabled wearable glasses can help individuals who are visually impaired read the faces of others, and it can help individuals on the autism spectrum interpret emotion, something that they really struggle with. What if your wristwatch tracked your mood or your car sensed that you're tired, or perhaps your fridge knows that you're stressed, so it auto-locks to prevent you from binge eating (laughter). I would like that, yeah.

(LAUGHTER)

KALIOUBY: What if when I was in Cambridge I had access to my real-time emotion stream and I could share that with my family back home in a very natural way, just like I would have if we were all in the same room together?

RAZ: How do you imagine all this could, like, change us - right? - because, I mean, I don't know, part of this makes me a little uncomfortable.

KALIOUBY: So we're very social animals and social beings. And the only thing we're changing is this paradigm in which we interact with technology, right? We are becoming more and more digital. We're surrounded by a lot more highly connected, intelligent devices. And I don't think that that's going to change anytime soon. Like, I don't see my daughter suddenly stopping texting. Like, that's going to be very hard (laughter). So I think the solution is not to curb our use of technology but instead embrace it and build empathy into that technology. Like, I think it's important that we learn, as humans, that our emotions continue to matter, even in a digital world.

RAZ: That's Rana el Kaliouby. Her company is called Affectiva. You can check out her entire talk at ted.com.

Copyright © 2015 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.