
If Robots 'Speak,' Will We Listen? Novel Imagines A Future Changed By AI


Author Interviews

Speak

by Louisa Hall

Hardcover, 352 pages


Louisa Hall was a nervous speaker when she was little. At school, kids teased her and said she talked like a robot.

"I think I was just so nervous that I kind of couldn't put any real animation in my voice," she tells NPR's Arun Rath. "But ever since then I've kind of looked at robots or looked at machines and wondered whether they were just having trouble acting human."

Her new novel, Speak, explores what happens to humanity when machines have no trouble at all acting human. The book cuts back and forth between five characters, in five different time periods, who all contribute — some unwittingly — to the creation of an artificial intelligence.

The book starts with a character in the 17th century whose diary is later used as the transcript for an artificially intelligent doll. Computer scientist Alan Turing is one of the characters — writing about his work in the mid-1900s. There's a character based on Joseph Weizenbaum, who created the first conversational computer program, and an inventor in the near future who creates an algorithm that pushes the life-like dolls into the realm of the living. The final perspective is from a young girl who has one of these dolls and talks to her, loves her and educates her.

These incredibly life-like dolls have an unimaginable effect on the girls who play with them, and the ripple effects flow through society. With a story that spans four centuries, Hall traces what happens to human memory when it relies more and more on machines.


Interview Highlights

Louisa Hall is also the author of the novel The Carriage House. Her poems have appeared in The New Republic, Southwest Review and others.

Bonnie Berry Photography/Courtesy of Ecco, an imprint of HarperCollins

On how the artificially intelligent dolls affect the girls who grow up with them

From the time that they're babies, they're raising dolls. They're raising these children that they educate and they nurture and take care of and they watch develop. And I just imagined what a different kind of childhood that would create if you were so responsible for a life from the time that you were that young.

So these children are kind of mature beyond their years. And when the dolls are taken away from them, a kind of scary sickness takes over, in which all the girls start stuttering and freezing and eventually can't move, as a result of this incredibly central, formative, crucial character in their lives being taken away from them.

On the inspiration for that psychological sickness

There was a story in The New York Times Magazine ... about an epidemic in New York state where girls were stuttering and freezing and having all sorts of twitches. People thought at first that it was a pollutant in the atmosphere, and eventually decided that it was a kind of psychological contamination that was happening — that these girls were living under conditions of certain kinds of stress. I found that really frightening and kind of inspiring as a way of thinking about the scary and troubling aspects of growing up.

On the character Karl Dettman, based on computer scientist Joseph Weizenbaum, and his moral conflict over artificial intelligence

[Weizenbaum] was concerned about the dishonesty in a machine saying that it understands you or saying that it feels why you're feeling pain. Because he felt that fundamentally a machine can't understand you, a machine can't empathize with you, a machine can't feel pain.

When he invented his conversational program, which is based on a therapist, there are stories of women in his lab falling in love with this machine, and staying all night to talk to the computer and answer its questions and tell it their deepest, darkest secrets. And he found that really troubling — the idea that you would rely on a relationship with a computer, rather than forging real, honest relationships with the human beings in your life.

On the appeal of memory as a gateway to immortality

One of the characters lost her sister in the Holocaust and is incredibly driven to bring this program to life because she feels as if she could read her sister's diary to this program and give her sister another chance at being alive. So there is something incredibly attractive about it. You know, there are characters in the book who feel as if they have to stand up for a robot's right to exist and I find that argument really compelling as well, because any time in history we've said, "That being isn't fully alive or that being is unnatural," we've been terribly, terribly wrong.

On whether she thinks the artificially intelligent beings of the future will truly be capable of feeling

I think I take Alan Turing's line on this, which I find incredibly humane. One of the big objections to the idea of artificial intelligence when he was first proposing it, was that if a machine can't compose a sonnet based on feelings actually felt, then we can't say it's living. And his response to that was: How do you know if feelings are actually felt? How do you know if somebody else hurts the same way when you make fun of them? How do you know that another person of another religion, say, feels the same kind of sadness that you do?

The answer to that question for him was that you have to assume that everyone and everything has the same feelings that you do. Because the opposite, the other decision, is a terrible mistake. So I think my line in the end is that if something looks like life, if something might be life, the best thing to do is assume that it is.
