Precision quadrocopter aerobatics are always eye-pleasing, but the dazzling display can distract us from why anyone bothered in the first place. Now ETH Zurich has a new video of three quadrocopters cooperating to toss a ball in the air and catch it in a net–and it’s impressive. More than that, it demonstrates the incredible coordination and advanced processing each machine needs to keep its balance in the air as it experiences complex dynamic forces from the net. It’s all about improving the utility of these amazing machines, which will eventually broaden their usefulness as working drones.
PR2 from Willow Garage is already one of the most promising research robots because of its widespread use in universities–enabling scientists and engineers to refine algorithms for making robots work alongside humans. Now the Automaton blog reports that a PR2 at Georgia Tech has been programmed to carry trays around without spilling the contents. This is a surprisingly tricky task that humans take for granted, thanks to practice and an acute awareness of a tray’s balance. But it’s a hard task for robots, and one they must master if they’re to be useful as robot butlers or in medicine (where, beyond tray-carrying, the ability to balance complex objects comes in handy).
The Thermite RC robot is in the news this week because it’s being heralded as the world’s first production fire-fighting robot. It’s a lot like a rugged caterpillar robot along the lines of iRobot’s warbots, and it can be operated from a quarter of a mile away while shooting 600 gallons of water a minute onto a blaze. It has to tow a firehose behind it, of course, which is perhaps one weakness. But because it can be used in dangerous situations you wouldn’t want a human fire-fighter going near (like a chemical fire or an uncontrolled forest fire), it’s likely to be incredibly useful. It’s also not going to be the last fire bot you hear about.
An oldie, but goodie. And how could we not show this now that James Bond is back on our screen in Skyfall? Plus it’s another demonstration of how incredibly precise quadrocopter drones can be programmed to fly.
Transporter turtle bots. As well as aerial drones, the students at ETH Zurich are also working on an incredible robot called Naro-Tartaruga. It’s a seagoing robot, designed to loosely bio-mimic the swimming motion of turtles. These amazing creatures propel their bulky bodies through water with relative ease, maneuverability, and high efficiency–all qualities we guess might make a giant sea-swimming cargo robot a potentially useful alternative to giant transporter ships.
Nissan’s car robot. At the CEATEC show this week the carmaker Nissan showed off its latest robot car development, and it’s an eye-opener. The NSC-2015 is a heavily modified Nissan Leaf that can park itself, come when you call it via a smartphone, and alert you if someone tries to steal it–even delivering live 360-degree video footage of the car’s exterior so you can see the thief. Some of this tech will be in production cars in 2015 and 2016, but in a limited form, and self-parking may only be enabled in dedicated lots.
Sweden’s robot museum guide. A new robot has joined the fleet of machines helping guide visitors around museums, and in this case she’s called Bib and she’s working at the Technical Museum in Malmo, Sweden. She uses several tricks to make her more approachable, including being voiced by a Swedish comedienne in a local accent, and being able to ask for help if she falls over or gets lost.
Scientists in U.K. universities are taking biomimicry in robots to a totally weird and wonderful new level. Instead of merely learning how animals and plants move or adapt in nature, they’re planning to model an animal’s entire brain in software and use it to drive a robot, giving the machine a kind of neural-net smartness that may surpass other AI efforts in simple robotics.
Their animal of choice: A bee.
Speaking to PCPro.com, the scientists explained that a bee makes a great example because honeybees are very sophisticated insects capable of doing many tasks that much larger animals can also do. But where bigger animals have billions of neurons in their brains, honeybees have a far more manageable one million neurons. That’s a scale that’s far easier to model in a powerful computer than a larger animal (humans, for example, have around 100 billion neurons connected by around 100 to 500 trillion synapses in adulthood). Bigger animals have bigger brains, in part, to control their bigger bodies–but bees are still capable of complex autonomous tasks as well as community interaction, which makes them a very useful model. The science team thinks any resulting robot drone aircraft may be smart enough to use its bee-like senses to search for survivors in post-disaster scenarios, and to handle other simple tasks.
That’s amazingly interesting, but perhaps the part of this news that will most surprise you is the hardware the team is using. The drone is the kind of quadrocopter you can buy to control with your iPhone for a couple of hundred dollars. The bee-brain simulation happens not in a supercomputer but in a PC equipped with some very fast GPU-accelerator cards that have sprung from the gaming industry’s innovations. These are fast enough, and capable of enough parallel calculations, that they can realistically emulate how a bee’s brain would react–at a speed that’s sufficient to keep the quadrocopter stable while flying.
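To see why this kind of brain simulation suits GPU-style hardware so well, here’s a toy sketch of a leaky integrate-and-fire network in Python with NumPy. It is purely illustrative–the neuron model, parameters, and scale are our assumptions, not the U.K. team’s actual bee-brain code–but it shows the key point: every neuron’s update collapses into big array operations that parallel hardware chews through at once.

```python
import numpy as np

# Toy leaky integrate-and-fire network. All parameters are illustrative,
# not taken from the actual bee-brain project.
rng = np.random.default_rng(0)

N = 1_000                       # neurons (a real bee brain has ~1,000,000)
dt = 1.0                        # timestep, ms
tau = 20.0                      # membrane time constant, ms
v_thresh, v_reset = 1.0, 0.0    # spike threshold and reset potential

# Random all-to-all synaptic weights, scaled so input stays bounded.
weights = rng.normal(0.0, 0.05, size=(N, N)) / np.sqrt(N)
v = rng.uniform(0.0, 1.2, size=N)   # initial membrane potentials

def step(v, external_input):
    """Advance every neuron one timestep. One matrix product performs
    all N*N synaptic updates simultaneously -- exactly the kind of
    parallel arithmetic GPU-accelerator cards are built for."""
    spikes = v >= v_thresh                       # which neurons fire
    v = np.where(spikes, v_reset, v)             # reset the ones that fired
    synaptic = weights @ spikes.astype(float)    # summed synaptic input
    v = v + (dt / tau) * (-v + synaptic + external_input)  # leaky integration
    return v, spikes

for _ in range(100):
    v, spikes = step(v, external_input=rng.uniform(0.0, 0.06, size=N))
```

On a GPU, the matrix product in `step` is where the speedup comes from: at a million neurons that update is far too slow on an ordinary CPU, but it maps naturally onto thousands of GPU cores running in parallel.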
Essentially this is a biomimic robot drone that’s made with off-the-shelf parts, and is as smart as a bee. Which is pretty smart. What kind of artificial intelligence would you be able to create if you used dedicated high-end computing power?
While you ponder the implications of that, other AI-capable flying robots are in the news thanks to the work of assistant professor Chengyu Cao at the University of Connecticut. His team is using advanced processing and sensor networks to develop artificially intelligent helicopter drones, quadrocopter drones, and even undersea vehicles. By sensing their environment, and perhaps even working collaboratively as a fleet, Cao’s machines are a far cry from the limited remote-control drones we have now, or even from more sophisticated ones that can autonomously navigate to a set point only along straight-line flight paths or pre-programmed routes. Cao’s drones can sense their environment and react adaptively, choosing to fly around a tree that they find in their path, for example.
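That “fly around the tree” behavior can be caricatured with a classic robotics technique called a potential field: the goal pulls the vehicle forward while nearby obstacles push it away. The sketch below is our own minimal 2D illustration of the general idea, not Cao’s actual control law–the geometry, gains, and function names are all assumptions.

```python
import numpy as np

def potential_field_step(pos, goal, obstacles, step_size=0.2,
                         repulse_radius=1.5, repulse_gain=1.0):
    """One control step of a toy potential-field planner: a unit pull
    toward the goal, plus a push away from any obstacle closer than
    repulse_radius. Hypothetical parameters, illustrative only."""
    to_goal = goal - pos
    force = to_goal / (np.linalg.norm(to_goal) + 1e-9)  # unit attraction
    for obs in obstacles:
        away = pos - obs
        d = np.linalg.norm(away)
        if d < repulse_radius:
            # Repulsion grows sharply as the drone nears the obstacle.
            force += repulse_gain * (1.0 / d - 1.0 / repulse_radius) * away / d**2
    # Move a fixed distance along the combined force direction.
    return pos + step_size * force / (np.linalg.norm(force) + 1e-9)

# Fly from near the origin toward a goal, with a "tree" on the direct route.
pos = np.array([0.0, 0.2])
goal = np.array([10.0, 0.0])
tree = [np.array([5.0, 0.0])]
path = [pos]
for _ in range(200):
    pos = potential_field_step(pos, goal, tree)
    path.append(pos)
    if np.linalg.norm(goal - pos) < 0.3:
        break
```

The drone deflects around the tree instead of colliding with it, with no pre-programmed route–the detour emerges from the interaction of the two forces. Real adaptive drones layer far more on top (sensing, state estimation, dynamics), but the reactive principle is the same.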
Professor Cao himself foresees a near future where machines can not only navigate smartly from point to point by themselves, but also react and adapt to changing conditions without human intervention. That will lead to situations where smart robots work autonomously alongside humans in scientific research, rescue operations, or even environmental monitoring.
What these two pieces of research show is that drone robotics, in particular, is advancing at an incredible rate that may surprise you. Not only will near-future drones be more complex and reliable, but they’ll also be smarter and able to think for themselves to a limited degree. That also implies that advances in AI and autonomous behavior may trickle down to other robots sooner rather than later–and that’s a good thing. After all, if your dog can perform simple fetch tasks, why couldn’t a butler bot with a neural net brain of about the same power help you around the home?