Watch this virtual robot learn to play hockey inside a video game

Chipmaker Nvidia became a titan in the computer graphics world, then repurposed the same processors to run the deep-learning neural networks that make today’s artificial intelligence possible. It has now combined the two technologies, plopping virtual robots into game-like environments and letting them bump around and learn the best ways to move about, pick things up, or even play games. “You have to sense the world, figure out what you’ve learned from the world, plan what you’re going to do, take action, and you go around in that loop,” said Nvidia CEO Jensen Huang at an event today in San Jose.
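That sense-learn-plan-act loop is the core of any learning robot, virtual or physical. Here is a minimal sketch of it in Python; the `sim` and `policy` objects and their methods are hypothetical stand-ins for illustration, not Nvidia's actual Isaac API.

```python
def run_episode(sim, policy, max_steps=1000):
    """One pass around the loop Huang describes: sense, learn, plan, act."""
    observation = sim.reset()                          # sense the world
    for _ in range(max_steps):
        action = policy.act(observation)               # plan what to do
        observation, reward, done = sim.step(action)   # take action
        policy.learn(observation, reward)              # figure out what you've learned
        if done:
            break
```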

The virtual world is named Isaac (after both Newton and Asimov) and runs on the Unreal video-game engine; it spares researchers the drudgery of training physical robots by hand. To make the point, Huang showed a (sped-up) video of a top Berkeley AI researcher painstakingly placing a hockey puck in front of a robot 200 times so it could learn, by trial and error, how to hold a hockey stick and hit the puck into a net. The Isaac virtual world replicates the physics of real life, as well as all of the robot’s sensors and actuators, but the trial-and-error process can happen much faster. Plus, several robots can all try to learn at the same time.
Whatever the virtual robot that did best on one attempt has learned then gets loaded into all the other robots for the next attempt, and so on. “Inside that computer is a virtual brain, and when we’re done with that, we take that virtual brain and we put it in a real robot,” said Huang. “When a robot wakes up, it has already been pre-trained for this world.”
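The article does not detail Nvidia's training algorithm, but the scheme described (many simulated robots learning in parallel, with the best performer's learned parameters copied into the rest between rounds) resembles a simple population-based loop. Below is a hedged sketch under that assumption; the `Policy`, `mutate`, and `evaluate` pieces are toy placeholders, not anything from the Isaac platform.

```python
import copy
import random

class Policy:
    """Toy stand-in for a robot's learned controller ('virtual brain')."""
    def __init__(self):
        self.params = [random.random() for _ in range(4)]

    def mutate(self):
        # Placeholder for a round of trial-and-error learning in simulation.
        self.params = [p + random.gauss(0, 0.1) for p in self.params]

def evaluate(policy):
    # Placeholder reward; in a simulator this would come from the physics,
    # e.g. how close the puck ends up to the net.
    return -sum(p * p for p in policy.params)

def train_population(num_robots=8, rounds=20):
    robots = [Policy() for _ in range(num_robots)]
    for _ in range(rounds):
        for robot in robots:
            robot.mutate()                              # each robot tries in parallel
        best = max(robots, key=evaluate)                # pick the best performer
        robots = [copy.deepcopy(best) for _ in robots]  # load its brain into all
    return robots[0]                                    # the trained brain to deploy

if __name__ == "__main__":
    trained = train_population()
    print("trained parameters:", trained.params)
```

The key idea the sketch captures is that simulated robots are cheap to replicate, so the fastest learner's progress can be shared across the whole population before the next round begins.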