Roombas are essentially just smart vacuum cleaners, but to some owners they start to feel like pets or friends, a phenomenon chronicled by one of my favorite accounts on Twitter, @SelfAwareROOMBA, where you can follow the philosophical, dramatic, and mundane musings of the eponymous device.
As more advanced robots enter more parts of our lives, especially the workplace, the emotional and psychological consequences for their human caretakers could grow.
Julie Carpenter, a human-robot interaction researcher who did her doctoral work at the University of Washington’s School of Education, recently documented that effect in a series of interviews with 23 military personnel who operate robots that dismantle explosives and other weapons. Their responses made clear that they had come to view the robots as extensions of themselves.
“There’s generally one person who is tasked with operating the robot regularly,” Carpenter says. “They described the robot in terms of their sense of self, not just as an avatar of themselves at a destination. They would often describe it as: ‘the robot is really my hands.’”
While the military robot operators knew, rationally, that their robots were simply tools designed to keep them safe, they gave the robots names and even painted those names on their sides. They also experienced a sense of loss, and a few even held funerals, when a robot was “hurt” or destroyed in the course of an operation, Carpenter says.
Ryan Calo, a researcher who studies the legal issues around robotic technologies at the University of Washington School of Law, sees broader repercussions of our emotional attachments as the technology advances. A robot might not need to be self-aware to be entitled to certain legal protections someday.
“Because of our conflation of what appears to be social and what is actual,” says Calo, “we may be uncomfortable with the ill-treatment of robots well short of any claims of consciousness.” Speaking recently at a conference on drones and aerial robotics at NYU, he suggested that a new philosophical, or “ontological,” category may be needed to describe the interaction.
Dismantling bombs is of course an extreme situation, but tele-operated robots are becoming more prevalent in settings ranging from the office to nursing homes. What’s more, they are starting to look and act more human.
For example, while the bomb specialists Carpenter interviewed used robots that looked like machines, the military is beginning to explore robots that resemble pets and people and that would be more agile, able to climb stairs or gallop over rough terrain. The same trend is occurring in civilian settings, at least in research labs for now, as the New York Times recently documented. As the next generation of robots arrives, making “rational” decisions about machines could get harder.
Different kinds of conversations are already starting to happen. Many surgeons, for example, who use the increasingly popular da Vinci robots to perform or assist with operations have patients watch videos about robotics so they can learn about the technology. But that presupposes the robot is something more than a tool in the hands of a skilled professional.
“People feel differently about tasks they perform with robots,” Calo says. “We would never be having this conversation about whether you should ‘meet’ the scalpel or the operating table.”