A child can speak eloquently using just a few words. But computers tend to make clumsy communicators — even if they know the definition of every word in the dictionary.
New research suggests that's because words and the rules for using them represent just the tip of a linguistic iceberg.
"Language is inherently a social activity," says Deb Roy, a researcher at the Massachusetts Institute of Technology. "It's far more than just a vocabulary and a grammar. That's just the surface stuff."
Roy is part of a team of scientists trying to develop robots that can communicate more effectively with people. In the process, they're learning a lot about what lies beneath the surface of human language.
Ripley's Tabletop World
One of their experiments is a robot named Ripley.
Ripley doesn't know many words, because it's only learned about things in a very limited world — a small tabletop covered with a white cloth.
Ripley can see objects on the tabletop with an electronic eye and pick them up with a mechanical mouth. The robot keeps a mental image of the table in its computer brain.
Kai-Yuh Hsiao, a graduate student in Roy's MIT lab, demonstrates how Ripley interacts with its world. When he turns on the robot, it begins to look around the tabletop.
"Right now it's looking at this blue beanbag and this red ball," Hsiao says.
Hsiao speaks into a microphone connected to Ripley's computer brain. "Where is the blue one?" he asks.
Ripley answers: "At the center."
Ripley learns about the objects on its tabletop the same way a child does — by looking at them, touching them, and moving them around. The robot's brain remembers what color each object is, and whether it is hard or soft, light or heavy.
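The kind of perceptually grounded memory described above can be sketched as a simple data structure. This is an illustrative toy, not the actual Ripley code; all class, field, and function names here are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class GroundedObject:
    """One object the robot has seen and handled (hypothetical schema)."""
    name: str
    color: str            # learned by looking
    weight_g: float       # learned by lifting
    soft: bool            # learned by squeezing
    position: tuple       # (x, y) on the tabletop, in the robot's frame

# A toy "mental image" of the tabletop.
memory = [
    GroundedObject("beanbag", "blue", 120.0, True, (0.0, 0.2)),
    GroundedObject("ball", "red", 80.0, False, (0.3, 0.1)),
]

def find_by_color(objects, color):
    """Resolve a phrase like 'the blue one' against perceptual memory,
    rather than against a dictionary definition of 'blue'."""
    return [o for o in objects if o.color == color]

blue_ones = find_by_color(memory, "blue")
```

The point of the sketch is that the word "blue" here refers to a stored perceptual property of a real object, which is what lets the robot answer "Where is the blue one?" with a location.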
Roy says that makes Ripley very different from most talking robots.
"This is not a language-processing machine that's just pushing around symbols," Roy says. "I put this thing down in front of Ripley, suddenly we have something to talk about. Ripley and I are talking about that thing out there in the world that we both can see and touch and feel."
Roy says most other computers and robots know only dictionary definitions. Grounding words in the physical world is critical to even the most basic communication, he says, and even that is only a baby step toward what humans do with language.
"Ground level is the physical level," he says. More sophisticated thoughts, though, require that a person know what another person is thinking.
"Why are you saying the things you are?" Roy asks. "What are things you are likely to believe at this moment? These are all inferences, these are all leaps of faith, guesses I'm making about what's going on in your head."
Scientists call that theory of mind, or theory of other minds.
Another Point of View
Roy explains that Ripley the robot does not have a true theory of mind. But its programming includes what may be a vital first step in that direction: Ripley has the ability to see the world from another point of view. Literally.
Hsiao gives a demonstration. He's on one side of Ripley's tabletop. The robot is on the other, and there are two objects on the table. The computer screen shows an image of the tabletop from Ripley's point of view.
Hsiao tells Ripley, "Hand me the one on my left," and the robot does. Hsiao points to Ripley's screen. It has changed so that it is now showing the world as it appears to Hsiao, making it possible for the robot to correctly interpret the command "my left."
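Interpreting "my left" amounts to re-expressing object positions in the speaker's frame of reference rather than the robot's. The article doesn't describe Ripley's actual geometry code, but the idea can be sketched in a few lines; the setup and all names below are assumptions for illustration, with the speaker standing directly across the table, facing the robot:

```python
from collections import namedtuple

# Positions are (x, y) in the robot's frame; x increases to the robot's right.
Obj = namedtuple("Obj", ["name", "position"])

def to_speaker_frame(pos):
    """A speaker facing the robot from across the table sees the tabletop
    rotated 180 degrees, so both axes flip sign."""
    x, y = pos
    return (-x, -y)

def leftmost_for_speaker(objects):
    """'The one on my left': the smallest x coordinate once positions are
    re-expressed in the speaker's frame (x grows to the speaker's right)."""
    return min(objects, key=lambda o: to_speaker_frame(o.position)[0])

table = [Obj("beanbag", (-0.2, 0.1)), Obj("ball", (0.3, 0.1))]
pick = leftmost_for_speaker(table)
```

Note that the ball, which sits on the robot's right, is the one on the speaker's left; getting that flip right is exactly the perspective switch Hsiao demonstrates on screen.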
Roy says that ability amounts to a rudimentary theory of other minds. But it's only a small step toward more complex abilities, like empathy.
There's growing evidence that the human brain has evolved to be extraordinarily good at understanding the thoughts and feelings in other people's minds, says Roy. And that plays a huge role in language.
"The words are the tip, and the iceberg is this very rich mental structure that is being evoked both in your mind and in mine," he says. "To the degree to which they somehow overlap, to the degree to which what's in your head and my head is aligned, we're successfully communicating."