This Week In Bots: When Buck Rogers Became Real


Asimo Gets An Upgrade

For robot fans, Honda's Asimo has perhaps always been the biggest treasure: He's been in development for decades, and Honda just this week revealed a refresh that makes this the most advanced Asimo yet. There's a host of improvements, from slicker and faster gaits, more sophisticated movements, and individually articulated fingers (flexible enough to cope with sign language) to improved voice recognition that can discern commands even when three voices are talking at once. The best way to understand how advanced this child-sized bot is nowadays is to watch this video:

Asimo is, arguably, just one step away from being the science fiction robots we've been seeing for years. Though he's white, not silver, in many ways he resembles the lovable Twiki bot from the 1970s TV show Buck Rogers in the 25th Century...minus Twiki's catchphrase sounds. Asimo weighs about the same as Twiki, but actually has more sophisticated movements and gaits than the sci-fi effect did, despite the fact that the cheap costume had a human being crammed inside.

The missing bit from Asimo is artificial intelligence. He's now smarter than ever, but he's still a reactive machine--responding to commands rather than intuiting what to do. We can imagine something like IBM's Watson crossed with Apple's Siri being a good solution to this, once the necessary computing power shrinks enough to fit on board. As a result, Honda has said Asimo is still years away from being commercialized--which is a shame, because even in his current state he's probably well-suited to many butler-bot and nurse-bot tasks.

A New Robot Walking Trick

Asimo isn't the only sophisticated walking robot, of course--NASA's working on some legs for its torso-only Robonaut, and Boston Dynamics has recently freaked us all out with its very Terminator-esque PETMAN project. Then there's the work of the German Aerospace Center's (DLR's) Robotics and Mechatronics institute to think of: They've been working on a very tricky aspect of robot design called compliance. It's essentially a way for a robot to accept an unexpected push, slip, or impact without losing its poise and falling over...with the added benefit that if the colliding object is a human being, the robot will smoothly absorb the energy of the crash, which should cause fewer injuries.

DLR has eschewed the Zero Moment Point approach that makes robots like Asimo walk, and has instead applied new math based on the design of robotic gripping hands. And: It works. The upshot will be robots that are less prone to damage, and which will hopefully avoid injuries and accidents if they bump into people.
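DLR's actual math is well beyond a news roundup, but the core idea of compliance is easy to sketch. Below is a minimal, hypothetical Python snippet--the compliant_torque helper and the stiffness, damping, and inertia numbers are invented for illustration, not DLR's controller--showing a single joint that behaves like a virtual spring-damper: when an unexpected push arrives, the joint gives way and soaks up the energy instead of rigidly fighting it.

# Illustrative sketch of joint-level compliance (generic impedance control),
# not the DLR team's method. The joint holds a target angle softly, like a
# spring-damper, so an unexpected shove is absorbed rather than resisted.

def compliant_torque(q, q_dot, q_target, stiffness=20.0, damping=5.0):
    """Torque for one joint acting like a virtual spring-damper around q_target."""
    return stiffness * (q_target - q) - damping * q_dot

# Tiny simulation: the joint rests at its target until a push arrives at t = 1 s.
dt, inertia = 0.001, 0.5          # time step [s], joint inertia [kg*m^2]
q, q_dot, q_target = 0.0, 0.0, 0.0

for step in range(3000):
    t = step * dt
    external = 2.0 if 1.0 <= t < 1.2 else 0.0   # unexpected push [N*m]
    torque = compliant_torque(q, q_dot, q_target)
    q_ddot = (torque + external) / inertia       # Newton's law for the joint
    q_dot += q_ddot * dt
    q += q_dot * dt
    if step % 500 == 0:
        print(f"t={t:4.1f}s  angle={q:+.3f} rad")

Run it and the joint swings away during the push, then eases back to its target: The crash energy ends up in the virtual spring-damper rather than in whoever did the pushing.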

MaskBot

One solution to making robots appealing to humans is to make them deliberately artificial, like Asimo. Another is to try a hyper-real approach, like the Geminoid series. And robotics engineers in Germany have come up with a third way that is freakishly convincing: MaskBot. By building a 3-D model of an artificial face and projecting it onto the inside of a translucent robot mask, the team has created a dynamic synthetic face that reminds us of nothing so much as the subtly human-like face of Sonny in the film I, Robot.

It can bat its eyelids, form realistic mouth shapes, and display emotional expressions. As a robot interface, this kind of advance may make humanoid robots much easier to accept when they make their way into settings like health care. The team has also been clever with the face-capture tech: The same design could put your face onto a telepresence robot roaming the corridors of your workplace while you sit at home with the flu.

Telexistence

Telepresence devices are mainly about transmitting voice and imagery between a remote user and the location of the robot--the "presence" is limited to a kind of roving robotic videophone, if you like. But work from Keio University has created a whole new approach to the effect: not just telepresence but telexistence. The group's Telesar V machine transmits 3-D stereoscopic imagery so your eyes see roughly what the remote droid sees, it has arms and hands that you can manipulate, and it delivers sensory feedback so you get a sense of what the droid is touching when it picks something up--down to the roughness of the spots on a Lego brick. There's even the facility to sense the heat of an object you've picked up.

The ultimate goal is to refine the device so that it moves with the same degrees of freedom as a human body, and delivers back yet more complex sensory data so that a remote operator actually feels like they're present at the remote location (yes, it's like Avatar). There are benefits to such sophisticated remote presence--particularly if the robot is involved in tricky construction, or perhaps medical work. Or you can just kick back and enjoy the entertainment value of such an invention.

Chat about this news with Kit Eaton on Twitter and Fast Company too.
