Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have made a breakthrough in giving a robot the human-like ability to determine what an object feels like just by looking at it, as well as to predict what an object looks like just by touching it, reports Engadget.
The researchers accomplished this by adding a tactile sensor called GelSight to a KUKA robot arm and feeding the sensor’s readings to the arm’s AI, so it could begin learning the relationship between tactile and visual information on its own. Twelve thousand videos of 200 fabrics and other household objects were then converted to still photographs and fed to the AI so it could further learn the relationship between touch and sight. The breakthrough could ultimately help robots become better at manipulating objects, as Yunzhu Li, CSAIL PhD student and lead author of the research paper, explained:
By looking at the scene, our model can imagine the feeling of touching a flat surface or a sharp edge. By blindly touching around, our model can predict the interaction with the environment purely from tactile feelings. Bringing these two senses together could empower the robot and reduce the data we might need for tasks involving manipulating and grasping objects.
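The cross-modal idea in the quote above can be sketched as a toy example: given paired samples, learn a mapping from visual feature vectors to tactile feature vectors (the “imagine the feeling” direction) and the reverse (the “predict from touch” direction). Everything below is an illustrative assumption, not CSAIL’s actual model: the feature dimensions, the synthetic data, and the simple least-squares fit are stand-ins for the paper’s learned networks and GelSight readings.

```python
import numpy as np

# Toy sketch of cross-modal prediction: paired "visual" and "tactile"
# feature vectors (synthetic stand-ins, for illustration only).
rng = np.random.default_rng(0)
n_samples, d_visual, d_tactile = 500, 16, 8

# Generate data with a learnable linear relation plus a little noise.
W_true = rng.normal(size=(d_visual, d_tactile))
visual = rng.normal(size=(n_samples, d_visual))
tactile = visual @ W_true + 0.01 * rng.normal(size=(n_samples, d_tactile))

# Fit vision -> touch by least squares ("imagine the feeling" direction).
W_v2t, *_ = np.linalg.lstsq(visual, tactile, rcond=None)

# Fit touch -> vision the same way ("predict from tactile" direction).
W_t2v, *_ = np.linalg.lstsq(tactile, visual, rcond=None)

# Held-out check: predictions from sight should approximate touch readings.
test_visual = rng.normal(size=(10, d_visual))
test_tactile = test_visual @ W_true
pred_tactile = test_visual @ W_v2t
err = np.abs(pred_tactile - test_tactile).mean()
print(f"mean absolute error (vision -> touch): {err:.4f}")
```

The real system learns far richer, nonlinear mappings from images and GelSight video, but the structure is the same: paired observations of the two senses let each one be predicted from the other.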
However, for now, CSAIL’s robot can only identify objects in a controlled environment. Next, the researchers plan to expand the tactile/visual dataset so the robot can perform tasks in a wider range of settings.