To humans, a penny simply feels like a penny. But how can a robot process information about touch?
On the left is the optical image of a Lincoln penny and on the right is the corresponding pressure image from the tactile sensor. Image courtesy of Science.
Bruce Springsteen used to sing about wanting just "a little of that human touch." It turns out that researchers who build robots feel the same way. That's because, of the five senses, touch is the most difficult to replicate in mechanical form. Just think about it: Your fingertips are constantly registering everything from temperature to vibrations to texture. Now, scientists are getting closer to matching the power of the fingertip with a new kind of pressure sensor.
Germany is hosting a major soccer competition in June. No, not just the World Cup. RoboCup. Some day, robots may just challenge humans to a soccer match. In the meantime, they continue to practice. A 2050 soccer match is one aim of the RoboCup, an international research and education effort dedicated to promoting artificial-intelligence and robotics research.
The 10th Annual RoboCup competition is set for June 14-18 in Bremen, Germany.
The games began in 1997 in Nagoya, Japan. Last year's competition in Osaka, Japan, brought together 330 teams and 2000 participants from 31 countries.
The soccer competitions are divided into several leagues:
Simulation league: Software-created players play on a virtual field on a computer.
Small-size robot league: Robots of no more than 18 cm. The ball is an orange golf ball. The field is bigger than a ping-pong table.
Middle-size robot league: Robots are no more than 50 cm square. The field measures 12 x 8 meters.
Four-legged robot league: Four-legged robots (see Sony's AIBO in section above). The field is 4 x 6 meters.
Ravi Saraf, at the University of Nebraska, says a very sensitive pressure sensor could help a robot learn a lot about texture.
"When you touch somebody, or when you try to feel the texture of an object, you have a pressure distribution on your finger," Saraf says. "And what the human body does, is it measures this pressure distribution. And from there it deciphers the texture."
Touch a smooth surface like glass and your fingertip feels an even distribution of pressure; touch a rough surface, like sandpaper, and you feel a lot of little pressure points.
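As a toy illustration of that idea (not the researchers' actual method), one way software could tell smooth from rough is to measure how uneven the readings in a pressure map are, for example by taking their variance:

```python
# Toy sketch, not the paper's algorithm: score a surface's roughness
# by the spread of readings in a simulated pressure map.
from statistics import pvariance

def texture_score(pressure_map):
    """Higher variance = more uneven pressure = rougher surface."""
    readings = [p for row in pressure_map for p in row]
    return pvariance(readings)

# Even pressure, as from glass: every site reads about the same.
glass = [[1.0, 1.0], [1.0, 1.0]]
# Spiky pressure, as from sandpaper: scattered high-pressure points.
sandpaper = [[0.2, 1.8], [1.9, 0.1]]

assert texture_score(glass) < texture_score(sandpaper)
```

The pressure values here are made up; a real sensor would supply a much denser grid, but the principle — an even map reads smooth, a spiky map reads rough — is the same.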
Various kinds of mechanical pressure sensors have been around for years, but none are as good as a fingertip. Saraf says they're hard to manufacture, so they usually have to be pretty small. That's a problem if you want to cover a robot the way skin, the body's biggest organ, covers us.
"I was thinking about it and I came up with a different way of doing these things," Saraf says. He realized that the tools of nanotechnology could be used to create a sensor that's more like skin.
His device is a thin film that's easy to make in sheets. The film is no thicker than a human hair. But inside, it looks like a layer cake. Each layer is made of a different kind of nanoparticle, either gold or a semiconductor. Between the layers is a thin plastic.
When something touches the film, the nanoparticles are pushed closer together. That changes a small electric current running through the film and causes some of the nanoparticles to emit light.
"The area that is pressed more, more light comes out. And the area that is pressed less, less light comes out," Saraf explains.
The light is captured by a camera. When you put something like a penny on this mechanical fingertip, the pressure created by the penny produces an amazingly detailed image. Saraf says you can even see the wrinkles in President Lincoln's clothing, and the letters TY in LIBERTY.
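In software terms, the readout is an imaging problem: the camera records a light intensity at each point, and inverting the pressure-to-light relationship recovers a pressure image. A minimal sketch, assuming (purely for illustration) that emitted light is simply proportional to local pressure, with a hypothetical `GAIN` factor:

```python
# Sketch of the readout step. The proportionality and GAIN value are
# assumptions for illustration, not the device's actual physics.
GAIN = 2.0  # hypothetical light-per-unit-pressure factor

def pressure_image(camera_frame):
    """Convert a 2D grid of light intensities back to pressures."""
    return [[intensity / GAIN for intensity in row] for row in camera_frame]

frame = [[0.0, 4.0], [2.0, 1.0]]  # made-up light readings
print(pressure_image(frame))      # → [[0.0, 2.0], [1.0, 0.5]]
```

In the real device the mapping would be calibrated rather than assumed, but the shape of the computation — bright pixels become high-pressure pixels — is what lets a penny's embossed details show up in the final image.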
"We were very surprised by this device," he says. "We knew in principle it would work. We just didn't think that we'd get so much light."
Saraf's team describes their work in the journal Science. They say there's a lot to do before a robot could wear this skin. For example, they need to replace the camera with some other kind of detector.
Allison Okamura, a robotics researcher at Johns Hopkins University in Baltimore, says the device's resolution and sensitivity are impressive. But she points out that it deals with texture in a way that's not at all like a fingertip.
"For example, if you took your finger and pressed it down on a penny, you would not feel the word liberty on that penny. In fact, humans can only feel that word if they slide their finger back and forth over the surface. So it's not operating at all like a person does."
Still, robots don't necessarily have to copy humans in every respect. Saraf's team hopes their mechanical fingertip could one day help a robot sense the world more like a human, even if it does it in a totally alien way.