
This Week In Bots: Think You Better Dance Now


ASIMO Dances

Honda's child-sized android may be the world's best-known real-world robot, even though his practical white paint job isn't as snazzy as C-3PO's gold-plated goodness. Over the years, ASIMO's skills have gotten ever more sophisticated as Honda's research scientists improve his software, drive units, and sensors to give him better control over his body, and more artificial intelligence to let him maneuver under his own control and navigate around unexpected objects (the kind of task androids will need to master if they're to help us in our homes or hospitals). Now the Automaton blog has seen a demonstration showing that ASIMO is smart enough to copy your dance moves.

Actually there's more to this than watching a bot do the Macarena--we've seen even more impressive dancing before. Rather, it's about giving the robot more intuitive ways to react to his environment in real time, as well as a more human-like way of moving that could boost trust in the machine (vital in health care situations). Ultimately, ASIMO could gain genuine gesture control.
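To get a feel for what imitation like this involves, here is a minimal, purely hypothetical Python sketch (none of these joint names, limits, or numbers come from Honda): tracked human joint angles are clamped to safe ranges and smoothed over time before becoming the robot's motion targets.

# Hypothetical sketch of motion imitation: map tracked human joint angles
# onto a robot's joint targets, clamping to assumed safe limits and smoothing
# over time so noisy tracking data doesn't produce jerky motion.

JOINT_LIMITS = {                      # assumed safe ranges, in degrees
    "shoulder_pitch": (-90.0, 90.0),
    "shoulder_roll": (0.0, 120.0),
    "elbow": (0.0, 135.0),
}

def clamp(value, low, high):
    return max(low, min(high, value))

def imitate(human_angles, previous_targets, smoothing=0.3):
    """Blend newly observed human joint angles into the robot's targets."""
    targets = {}
    for joint, (low, high) in JOINT_LIMITS.items():
        observed = clamp(human_angles.get(joint, 0.0), low, high)
        prior = previous_targets.get(joint, observed)
        # Exponential smoothing: mostly keep the old target, nudge it toward
        # the newly observed pose.
        targets[joint] = (1 - smoothing) * prior + smoothing * observed
    return targets

# One frame of (made-up) tracked data, e.g. from a depth camera:
frame = {"shoulder_pitch": 45.0, "shoulder_roll": 160.0, "elbow": 30.0}
print(imitate(frame, previous_targets={}))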

Chameleon Tongue

Chameleons can shoot out their tongues with incredible speed and accuracy to snatch insects from a surprising distance. That inspired scientists Tomofumi Hatakeyama and Hiromi Mochiyama to try to build a robotic copy. The design is simple--a compressed air gun fires out a tiny magnetic projectile connected to an elastic cord, emulating the sticky tip of the real animal's tongue and the elastic recoil that snaps the tongue back into the mouth.

It works! In a limited way, the tongue really can snatch tiny magnetic targets and return them. The device is more of a proof of principle than anything else, but research like this can end up being used in surprising ways--can you imagine a genuine robot fly catcher?
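To give a rough sense of the physics, here's a toy one-dimensional model in Python; every number in it is assumed for illustration and has nothing to do with the researchers' actual device.

# Toy 1-D model of the robotic "tongue": a projectile launched by compressed
# air, then pulled back by an elastic cord modelled as a light spring.
# All parameters are made up for illustration.

mass = 0.005         # projectile mass in kg (5 g, assumed)
launch_speed = 10.0  # m/s out of the air gun (assumed)
slack = 0.5          # metres of cord before it starts stretching (assumed)
stiffness = 20.0     # N/m once the cord is taut (assumed)
dt = 0.001           # time step for simple Euler integration

position, velocity, t = 0.0, launch_speed, 0.0
while position > 0.0 or t == 0.0:
    # The cord only pulls once it is stretched beyond its slack length.
    force = -stiffness * (position - slack) if position > slack else 0.0
    velocity += (force / mass) * dt
    position += velocity * dt
    t += dt

print(f"Projectile returns to the launcher after roughly {t:.2f} s")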

PR2 Can Order A Subway

Research robot PR2 has already shown its skills in dealing with the real world, but scientists at the University of Tokyo and the Technical University of Munich have taken PR2's powers to an incredible level: It can perform semantic search, understanding enough about the context and implications of being asked to buy a sandwich that it can actually order one in a restaurant.

In some ways, this kind of linguistic technology is similar to Apple's amazing new Siri personal assistant in the iPhone 4S. It's tricky stuff, because if you asked, say, your future butler bot to get you a cup of tea, it would have to work out what you mean, identify the needed objects, and figure out where they may be located. For the experiment the scientists taught PR2 that sandwiches are food and that food can be found in certain locations. From that starting point, PR2 was smart enough to understand the request "get me a sandwich," make its way to a separate building, and order one. Did you get the chills yet?
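Here's a deliberately tiny sketch of that kind of chained lookup (the facts and place names are invented, and the real knowledge processing behind PR2 is far richer): the robot only knows that a sandwich is food and where food tends to be, and chains those facts to pick somewhere to go.

# Minimal illustration of semantic lookup: the robot knows only a few facts
# and has to chain them to turn "get me a sandwich" into a place to go.
# The facts and places below are invented for this example.

IS_A = {"sandwich": "food", "tea": "drink"}              # category facts
FOUND_AT = {"food": ["kitchen", "sandwich shop"],        # location facts
            "drink": ["kitchen", "vending machine"]}

def places_to_find(item):
    """Chain item -> category -> candidate locations."""
    category = IS_A.get(item)
    if category is None:
        return []
    return FOUND_AT.get(category, [])

print(places_to_find("sandwich"))   # ['kitchen', 'sandwich shop']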

Soldiers Throw Robots, Not Grenades

We've talked about throwable robots before--their utility in military situations like those in Iraq and Afghanistan is evident when you imagine small spy bots being tossed into unidentified buildings to recon the situation before personnel enter.

Now we know that the U.S. Army will be testing the idea in the wild, for real. They've selected three devices (iRobot's 110 First Look, MacroUSA's Armadillo V2, and QinetiQ North America's Dragon Runner) and will be deploying them in the field, under fire. The robots are all similar, with small, solid, resilient bodies, camera systems, and remote control. It's exciting stuff because unlike much of the far-future science we write about in This Week In Bots, it's a sample of the future being put to the test right now.

Robot Rodent Brain

If the PR2 antics didn't chill you, this will: As part of research into reconstructive surgery, brain interfaces, and brain-reading technology, a Tel Aviv researcher has inserted a "robotic" cerebellum into the brain of a paralyzed rat. The cerebellum, among other things, receives information from the rest of the brain and body about movement and control, and sends signals down the nervous system to move and monitor the limbs. In the experiment, the brain-injured rat could blink under its own volition only when its robotic cerebellum was turned on.
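As a drastically simplified picture of what a device like this does (this is not the Tel Aviv team's actual model, and every name and number is made up): a chip sits in the loop between a sensory input and a motor output, computing the response the damaged tissue no longer can.

# Extremely simplified picture of a closed-loop neural prosthesis: read an
# input signal, decide whether it crosses a learned threshold, and drive an
# output. Real devices model cerebellar circuitry; this is illustrative only.

THRESHOLD = 0.6   # assumed activation level that should trigger a blink

def robotic_cerebellum(sensory_input, enabled=True):
    """Return a motor command (blink strength) for a given sensory input."""
    if not enabled:
        return 0.0                  # device off: no signal reaches the muscle
    return 1.0 if sensory_input >= THRESHOLD else 0.0

# A strong sensory signal, such as a puff of air at the eye (made-up value):
print(robotic_cerebellum(0.8, enabled=True))   # 1.0 -> blink
print(robotic_cerebellum(0.8, enabled=False))  # 0.0 -> no blink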

This is far-flung future stuff, but if you've ever seen an anime like Ghost In The Shell, you'll know that this kind of cybernetic interface has long been imagined. Beyond allowing paralysis victims to regain control of their limbs, the idea is that in the future damaged regions of the brain could be replaced by robotic "copies" that emulate the same systems...and thus restore full function.

Impressive? For sure. Scary? A little bit.

Chat about this news with Kit Eaton on Twitter and Fast Company too.
