Sensor packages in robotics development are becoming ever more capable. Robots are increasingly adept at touching, sensing minute forces, and seeing across a variety of spectra. I’ve written about the sense of touch in particular as a paradigm-expanding extension of the sensing toolkit.
The ability to hear has been lurking more quietly behind the scenes, in large part because the sensors required to hear are so rudimentary that they’ve been taken for granted. But most humans rely heavily on sound to orient themselves in the world, so it stands to reason that automated control systems should as well.
Now researchers at CMU’s Robotics Institute have found that sound can be used to help a robot tell one object from another. After all, objects with different physical properties produce different sounds when they’re handled, clanged against something, or used in the field.
“A lot of preliminary work in other fields indicated that sound could be useful, but it wasn’t clear how useful it would be in robotics,” explains Lerrel Pinto, who recently earned his Ph.D. in robotics at CMU and will join the faculty of New York University this fall.
Researchers at the Robotics Institute created a large dataset of video and audio recordings of everyday objects as they slid or rolled around a tray and crashed into its sides. How this dataset was populated is a mini story within the story, given its savvy deployment of automation. The researchers used a robot named Sawyer from the now-defunct developer Rethink Robotics. Sawyer held a tray, and each object (a tennis ball, for example, or a toy block) was placed on the tray. Sawyer then spent hours moving the tray in random directions while cameras and microphones recorded everything. In all, the researchers captured a dataset of about 15,000 interactions, which they’ve since released.
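The article doesn’t describe how the released recordings are actually structured, but purely as illustration, here is a minimal sketch of how one such tray interaction might be represented in code. Every field name here is an assumption, not the dataset’s real schema:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class TrayInteraction:
    """One tilt-and-record interaction: audio, video, and the object label.
    Hypothetical schema for illustration; not the CMU dataset's actual format."""
    object_label: str          # e.g. "tennis_ball" or "toy_block"
    audio: np.ndarray          # mono waveform, shape (n_samples,)
    sample_rate: int           # e.g. 44100 Hz
    video_frames: np.ndarray   # shape (n_frames, height, width, 3)
    tray_motion: np.ndarray    # commanded tray displacements, shape (n_steps, 2)
```

Pairing each audio clip with the video frames and the commanded tray motion is what makes a dataset like this useful beyond pure sound classification, since the same recordings can support motion-estimation experiments as well.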
Drawing on parallel research into the use of sound to help robots estimate the trajectories and movement of objects, the researchers reinforced the usefulness of sound for robots: models trained on the dataset correctly classified objects about three-quarters of the time from sound alone.
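The article doesn’t say what model the CMU team used, so the following is only a hedged sketch of one common approach to this kind of audio classification: convert each clip to a log-mel spectrogram and feed it to a small convolutional network. The class count, names, and architecture below are all assumptions for illustration:

```python
import numpy as np
import librosa
import torch
import torch.nn as nn

def audio_to_logmel(waveform: np.ndarray, sample_rate: int, n_mels: int = 64) -> np.ndarray:
    """Convert a mono waveform to a log-mel spectrogram of shape (n_mels, n_frames)."""
    mel = librosa.feature.melspectrogram(y=waveform, sr=sample_rate, n_mels=n_mels)
    return librosa.power_to_db(mel, ref=np.max)

class AudioObjectClassifier(nn.Module):
    """Small CNN mapping a log-mel spectrogram to object-class logits."""
    def __init__(self, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),  # pool over time and frequency
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, n_mels, n_frames)
        h = self.features(x).flatten(1)
        return self.classifier(h)

# Example run on placeholder data (one second of random "audio").
waveform = np.random.randn(44100).astype(np.float32)
spec = audio_to_logmel(waveform, sample_rate=44100)
x = torch.from_numpy(spec).unsqueeze(0).unsqueeze(0)  # (1, 1, n_mels, n_frames)
model = AudioObjectClassifier(num_classes=10)  # placeholder; the article doesn't state the object count
predicted_class = model(x).argmax(dim=1)
```

Log-mel spectrograms are a standard front end for tasks like this because they compress a raw waveform into a compact time-frequency image that convolutional networks handle well, which is one plausible route to the roughly 75 percent sound-only accuracy the article reports.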
“I think what was really exciting was that when it failed, it would fail on things you expect it to fail on,” Pinto said. For instance, a robot couldn’t use sound to tell the difference between a red block and a green block. “But if it was a different object, such as a block versus a cup, it could figure that out.”
Interestingly, the research was supported by DARPA, the Department of Defense’s research arm, along with the Office of Naval Research, both big investors in automation technologies and research.
Source: Robotics - zdnet.com