Bot Vid: Robo Cheetah
Boston Dynamics is behind many of the most astonishing robot advances at the moment, and its latest tech is splashed all over the web this week because it’s extra impressive: The robotics firm has pushed its cheetah-like quadruped robot to a mind-boggling 18 miles an hour, a record for legged bots. Like many BD efforts, it’s a DARPA-funded project, aimed at improving the mobility and speed of ground robots for tasks like mine and IED detection. Check out the slo-mo at the end.
Bot Vid: Robo Artist
Industrial robots are best known for swinging welding gear around cars in factories, or whirling impossibly heavy objects through the air at high speed on production lines. At this week’s CeBIT fair in Germany, Fraunhofer Institute researchers showed off the unexpectedly gentle dexterity of an otherwise industrial-scale robot arm: They’ve taught it to draw portrait sketches. Quite beautiful ones, too.
Bot Vid: Robo Pole Dancing
Another CeBIT eye-grabber was the pole-dancing robot art installation by U.K. artist Giles Walker. The robots have been around for a while but are still impressive eye candy…and though their dancing finesse doesn’t quite match that of some robo dancers we’ve shown you before, they do illustrate a quite interesting and ethics-challenging future: Robots as pseudo-erotic entertainment. (This video is of the robots performing at a previous show.)
Bot Vid: Microsoft’s Bot Kit
Microsoft just announced the general availability of Robotics Developer Studio 4, the latest and greatest edition of its homebrew/education software suite that brings robotics programming to the public. The new release includes improvements and tweaks and, importantly, compatibility with the RTM version of the Kinect for Windows software developer kit. That means keen coders can now integrate Kinect sensing directly into their robot projects (as Willow Garage has done with its advanced PR2 research robot). The video shows the MS Robotics Group’s own fun effort.
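To give a flavor of what "Kinect sensing in a robot project" looks like in practice, here’s a minimal sketch of a skeleton-based gesture check. The real SDK is C++/C#, so this is a language-agnostic Python illustration: the joint names and coordinate convention (camera space, y increasing upward, metres) mirror the Kinect skeleton stream, but the helper and the sample frames are invented for this example.

```python
# Hypothetical sketch: deciding whether a tracked person is raising a hand,
# the kind of simple gesture check a Kinect-equipped robot might run on
# each skeleton frame. The joint data below is made up.

def hand_raised(skeleton, margin=0.05):
    """Return True if either hand is above the head by `margin` metres."""
    head_y = skeleton["head"][1]
    return any(skeleton[joint][1] > head_y + margin
               for joint in ("hand_left", "hand_right"))

# Made-up skeleton frames: joint -> (x, y, z) in metres, camera space.
idle = {"head": (0.0, 1.6, 2.0),
        "hand_left": (-0.3, 1.0, 2.0), "hand_right": (0.3, 1.0, 2.0)}
wave = {"head": (0.0, 1.6, 2.0),
        "hand_left": (-0.3, 1.0, 2.0), "hand_right": (0.3, 1.8, 2.0)}

print(hand_raised(idle))  # False
print(hand_raised(wave))  # True
```

In a real project this predicate would run inside the SDK’s skeleton-frame callback, triggering a robot behavior when it flips to true.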
Firefighter Robots. The U.S. military and police are spending more and more money on robots, but a new addition isn’t the kind of privacy-invading, battlefield-capable device we’ve seen before; instead it’s designed to save lives aboard ships. Developed by the Naval Research Lab, the Shipboard Autonomous Firefighting Robot is designed to put out fires aboard ships or submarines. Shaped like a person and thus able to navigate the cramped spaces on a vessel, including stairs, SAFFiR can put itself at risk in situations that would otherwise endanger human firefighters–and thus extinguish fires that could kill people or sink watercraft. It’s also gesture-controlled, so it can work as part of a human team and be commanded easily even in noisy environments. It’s expected to be field-tested in about 18 months.
Giving PackBot Arms. HDT Global has just released its MK2 robot arm suite, a bolt-on assembly that can be fitted to PackBots or Talons to give them android-like twin-arm capability. This allows more finesse in fine motions and object manipulation than simpler manipulator tools, and the arms’ commonality with human arms may give the remote operator a better mental map of the robot’s task. It’s designed to make defusing explosives simpler.
Making Human-Robot Queries Comfortable. New research from Georgia Tech explores how, in a near future when our homes are filled with robots, the average consumer can program the bots without having to become a programmer. The work by Maya Cakmak and Andrea Thomaz suggests that one key to making these human-robot interactions comfortable in the mind of the human is to have the robot ask the right “feature queries,” specific questions about the details of the task you’re trying to command it to do.
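To make the idea of a feature query concrete, here’s a toy sketch: after a user demonstrates a task, the robot asks about one specific feature it’s unsure about, rather than asking the user to spell out the whole program. The task features and the question template here are invented for illustration; they’re not taken from the Georgia Tech study itself.

```python
# Toy sketch of a "feature query": pick the first demonstrated task
# feature the robot is uncertain about and phrase it as a yes/no
# question. Feature names and wording are made up for this example.

def pick_feature_query(demonstrated, uncertain_features):
    """Return a question about the first uncertain feature, or None."""
    for feature, observed_value in demonstrated.items():
        if feature in uncertain_features:
            return f"Does it matter that the {feature} was {observed_value}?"
    return None

demo = {"cup colour": "red", "placement": "left of plate"}
print(pick_feature_query(demo, {"cup colour"}))
# -> Does it matter that the cup colour was red?
print(pick_feature_query(demo, set()))
# -> None (nothing uncertain, so the robot stays quiet)
```

The point of the research is that targeted questions like this feel more natural to people than a robot that either guesses silently or demands exhaustive instructions.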
Bot Futures: Chatty bots
Getting humans and robots to interact is technically tricky because of the sophistication involved in making machines speak and understand speech. But interaction is vital for commanding the machines, understanding their requests, and handling safety-critical situations. Which makes research by some Plymouth University scientists in the U.K. really important, though perhaps not in the way you’d expect.
Rather than building a speech-centric interface, the team is experimenting with different sounds–varying pitch, rhythm and other qualities to deliver a sense of emotion. It’s a project funded by the European ALIZ-E initiative, designed to develop meaningful bonds between humans and robots in a hospital environment. And yes, it’s tapping into the very same strange sense of emotion you get when you hear George Lucas’ adorable little R2-D2 beep its way through Star Wars.
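The idea of varying pitch and rhythm to convey emotion can be sketched very simply: map each emotion to a base pitch, a pitch step, and a tone length, and emit a short beep sequence. The parameter values below are invented purely for illustration; they are not the Plymouth/ALIZ-E team’s actual mappings.

```python
# Illustrative sketch of non-verbal "emotion" beeps: an emotion maps to
# pitch and rhythm parameters, producing a list of (frequency_hz,
# duration_s) tones a speaker could play. All values are made up.

EMOTION_PARAMS = {
    # emotion: (base pitch Hz, pitch step per tone Hz, tone length s)
    "happy":  (880,  60, 0.10),   # high, rising, quick
    "sad":    (330, -30, 0.35),   # low, falling, slow
    "scared": (660, 120, 0.05),   # mid-high, jumpy, very quick
}

def emotion_beeps(emotion, n_tones=4):
    """Build a beep sequence whose pitch contour suggests an emotion."""
    base, step, dur = EMOTION_PARAMS[emotion]
    return [(base + i * step, dur) for i in range(n_tones)]

print(emotion_beeps("happy"))
# -> [(880, 0.1), (940, 0.1), (1000, 0.1), (1060, 0.1)]
```

Even this crude contour–rising and quick versus falling and slow–is the kind of cue listeners reliably read as happy or sad, which is exactly what the researchers are probing.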
The Plymouth team even used an unusual source for their human data: Kids. Perhaps because when you’re younger you have fewer filters between an emotional trigger and your emotional response, the children were able to confidently sort robot noises into categories like sad, happy, or scared.
Non-verbal communication like this is likely to be really important in situations where you need to trust a robot, such as a health care environment, because it adds an anthropomorphic edge to the robot’s interface. At this point in robotic science it’s all in the human’s mind, since we’ve not yet imbued robots with genuine intelligence and thus genuine emotional expression. But it does mean you assume the robot has emotions (in the same way we tend to feel positive about car designs that make them look like they’re smiling), and that could be vital if, say, an elderly patient has to trust a robot nurse delivering their medication. Now, does this come with a spoonful of sugar?