ROBOT REVOLUTION

The Rise Of The Robots: What The Future Holds For The World’s Armies

Beyond the already deployed human-controlled drone fleets, military engineers are already tinkering with lethal AI-driven autonomous battlefield bots.

[Photo: Flickr/National Museum of the U.S. Navy]

BY Steven Melendez

Judging solely by science fiction, military robots seem like a bad idea. From The Terminator to The Matrix, pop culture is full of stories of powerful machines that run amok, turning on their makers and overcoming any human forces that try to stop them. Even R.U.R., the 1920s play by Karel Čapek that introduced the term “robot,” foretold the end of the human race at the hands of the artificial beings.

Despite the cinematic warnings, land, air, and sea robots have moved from science fiction to a standard part of the modern arsenal—and now there’s even serious debate about programming killer bots and drones to think on their own and communicate with each other.

Since Sept. 11, 2001, tens of thousands of remote-controlled robots have been deployed by the U.S. military. Flying drones have become a common weapon in operations in the Middle East. Unmanned land vehicles have helped U.S. forces safely destroy roadside bombs in Iraq and Afghanistan. And underwater drones have been used to collect data for science and reconnaissance, and to disable naval mines.

Other forces, from rival superpowers Russia and China to smaller countries and insurgent groups like ISIS, are developing their own combat robots, raising the possibility of a Cold War-style race to build offensive and defensive bot technology.

Proponents contend that military robots will save more lives than they’ll endanger. Those bomb-defusing robots already have.

Militaries are beginning to grapple with when and how to integrate increasingly autonomous robots into their operations. Some of the questions they face are the same ones of safety and efficacy the civilian world deals with in automating trucks, tractors, or forklifts. But world leaders are also actively mulling when, if ever, robots will be allowed to fire guns or other weapons without express orders from a human.

So yeah, that mission-driven Terminator-like killing machine may not be so far-fetched.

Endeavor Robotics

The Drone Wars

To say that the U.S. military’s robotics program is robust would be an understatement.

It was October 7, 2001, a month after 9/11, that a robot first proved its value as a lethal weapon. One day into the invasion of Afghanistan, an MQ-1 Predator drone, tail number 3034, launched the first-ever deadly airstrike from an unmanned aircraft.

That drone is now a featured part of the National Air and Space Museum in Washington. Use of the Predator and larger Reaper drones ramped up quickly after that initial success under President George W. Bush and continued to increase in number under his successor. Council on Foreign Relations senior fellow Micah Zenko estimated in a 2016 opinion piece in The New York Times that President Barack Obama had authorized 506 drone strikes, compared to 50 under Bush.

Critics say the drones, remotely piloted by operators hundreds or thousands of miles away, are hardly foolproof. Reports of accidental bombings of wedding parties and other civilian gatherings have dogged the drone program.

Obama argued in a 2013 speech that drone attacks on suspected terrorists were still better than any alternatives.

“Conventional airpower or missiles are far less precise than drones, and are likely to cause more civilian casualties and more local outrage,” he said. “And invasions of these territories lead us to be viewed as occupying armies, unleash a torrent of unintended consequences, are difficult to contain, result in large numbers of civilian casualties, and ultimately empower those who thrive on violent conflict.”

By 2013, Sen. Lindsey Graham, R-S.C., estimated that about 4,700 people had been killed by U.S. drone strikes. “Sometimes you hit innocent people, and I hate that, but we’re at war, and we’ve taken out some very senior members of al-Qaida,” he said in a speech at a Rotary Club in his home state.

Donald Trump has continued the strikes in the first few months of his presidency, quickly granting the CIA increased authority to conduct strikes and advocating use of Predators for border surveillance during last year’s election.

The U.S. is likely to face increasing threats from enemy forces armed with drones of their own: 86 countries now have some sort of drone capability, including 19 that have or are acquiring armed drone technology, the New America Foundation estimated last year.

“And a 2011 study found that there were around 680 active drone development programs run by governments, companies, and research institutes around the world, compared with only 195 six years earlier,” the foundation reported.

Terror groups are also eyeing the aerial tech: ISIS has used commercial and custom-rigged unmanned aircraft for surveillance and, recently, to drop explosives in combat with Iraqi forces.

Reaper drone. [Photo: Official Air Force Photo]

Boots—And Bots—On The Ground

It was in Iraq that ground robots showed how they can help save lives, with the deployment of unmanned ground vehicles tasked with defusing the roadside bombs and improvised explosive devices that killed and maimed thousands of U.S. troops in Iraq and Afghanistan. Peter Singer, a strategist at the New America Foundation and the author of Wired for War: The Robotics Revolution and Conflict in the 21st Century, wrote in 2009 that the number of military ground robots deployed in Iraq went from about 150 in late 2004 to about 12,000 just four years later.

“As many as 1,000 UGVs have either been blown up or hit by blasts in Iraq and Afghanistan, saving many lives,” Army spokesman Dov Schwartz wrote in an email to Fast Company.

As early as 2002, portable remote-controlled, tank-like vehicles known as PackBots were deployed in Afghanistan to search caves for booby traps and hidden foes. At the time, the machines were made by iRobot, the Massachusetts Institute of Technology spin-off company better known for its Roomba and Braava lines of automated floor-cleaning machines. Its military division was spun off last year to form Chelmsford, Mass.-based Endeavor Robotics.

The PackBot: Explosive ordnance disposal technicians are using remote-controlled machines to help detect and defuse improvised explosive devices. [Photo: U.S. Navy photo by Mass Communication Specialist 2nd Class Jhi L. Scott/Released]
The PackBot proved popular with troops in the field. An Army colonel told the Associated Press in 2003 that replacing a robot was infinitely preferable to losing a soldier to a bomb or ambush, and iRobot rolled out new models with enhanced features for reconnaissance and bomb disposal.

“It’s a continuing learning process in conjunction with our customers and our end users–they’re always going to know more about their needs than we can ever hope to,” says Sean Bielat, Endeavor’s CEO. “There’s always this iterative process with getting equipment out to them, getting robots out to them, and getting to re-engineer them and help them with their missions.”

In 2005, a PackBot became the first robot to ring the Nasdaq exchange’s opening bell, heralding iRobot’s $72 million IPO. By 2012, iRobot announced it had sold more than 5,000 robots to defense and police agencies. Those included PackBots, a heftier Warrior line capable of carrying up to 150 pounds, and devices sold under the name FirstLook that weigh as little as five pounds.

The FirstLook machines fall into a category typically known as “throwable” robots, since they can literally be tossed into hard-to-see areas, rolled under vehicles, or even thrown inside buildings to check for potential threats.

“The military will use them to reconnaissance roadways, down culverts, in tight-to-get-into-places,” says Mack Traynor, president and CEO of Edina, Minn.-based ReconRobotics, which offers a variety of such devices to military and law enforcement. “I had a Marine Corps staff sergeant tell me [recently] that when he gets deployed, he gives it to the biggest guy in the unit and says, ‘throw it as far as you can.'”

The littoral combat ship USS Independence (LCS-2) deploys a remote multi-mission vehicle (RMMV) while testing the ship’s mine countermeasures mission package (MCM). [Photo: US Navy Photo]

Bots Beneath The Sea

Similarly, the U.S. Navy has been using robots to locate ocean mines. In 2003, torpedo-shaped unmanned subs called Remote Environmental Monitoring Units, or REMUS vehicles, were first used to hunt for mines around the Iraqi port city Umm Qasr. The devices helped sailors find dozens of suspicious objects under the surface.

As reconnaissance tools, unmanned bots have proven valuable for decades, including helping to find the wreckage of the Titanic in 1985. And they’ve been used to find shipwrecks and the “black box” transponders from crashed planes. In 1999 a U.S. Navy unmanned vehicle called Deep Drone successfully found the flight data recorder in the EgyptAir Flight 990 crash near Nantucket. Chinese forces deployed an underwater drone of their own last year to investigate sonar anomalies in the unsuccessful search for Malaysia Airlines Flight 370.

But so far, anti-mine drones have mostly failed to live up to expectations. Plans for what was called the Remote Minehunting System were officially canceled last year, not long after Senate Armed Services Committee Chairman John McCain issued a blistering report saying the system failed to deliver after $706 million and 16 years of development.

“Put simply, while estimated overall spending on RMS is only a little higher than originally planned, it is only yielding half the number of systems, at more than double the unit cost, and it is taking twice as long to field it,” the Arizona Republican wrote. “Also, it doesn’t work.”

Controversially, the Navy has long used trained dolphins and sea lions to hunt for mines, and officials have said they plan to phase out the Marine Mammal Program as soon as machines prove fully up to the task.

In November, the Defense Department’s Office of the Inspector General warned that a planned vehicle called the Knifefish, designed to be deployed in shallow waters from the Navy’s newly developed littoral combat ships, might not be ready for a planned 2017 launch. The Knifefish was part of the Navy’s announced plan to phase out the dolphin program.

The Robot Revolution

The revolution in military robotics still has a lot to get right. As the Pentagon switches gears from quick wartime acquisitions to longer-term robotics programs, it’s facing challenges that have more in common with ordinary workplace tech projects than with science-fiction horror stories, like making sure devices from different vendors can play nicely together and figuring out which new gadgets make operations safer and more efficient.

A 25-year Pentagon roadmap released in 2013 emphasized ensuring that future robots can share data and work with each other. The plan called for robots to use standardized interfaces, so robots from one manufacturer could work with add-on hardware like sensors and tools from other vendors, just as computers from different makers can talk to the same USB-enabled devices.

“Upgrading existing proprietary components may be both expensive and logistically unfeasible because whole platforms may need to be taken out of service and/or replaced,” the report warned. “Such a closed development approach has resulted in a number of unfavorable characteristics that impede applications of technical progress and the adoption of new capabilities.”
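The USB analogy maps naturally onto software. As a toy sketch only, with every class name invented for illustration rather than drawn from any actual military interoperability standard, a shared payload interface is what lets a chassis from one vendor host a sensor from another:

```python
from abc import ABC, abstractmethod


class Payload(ABC):
    """Hypothetical vendor-neutral contract: any chassis that speaks this
    interface can host any conforming add-on, much like a USB port."""

    @abstractmethod
    def name(self) -> str:
        ...

    @abstractmethod
    def read(self) -> dict:
        ...


class RadiationSensor(Payload):
    """One vendor's sensor, written against the shared interface."""

    def name(self) -> str:
        return "radiation-sensor"

    def read(self) -> dict:
        return {"microsieverts_per_hour": 0.12}  # stubbed reading


class Chassis:
    """A robot body from a different vendor; it knows only the interface."""

    def __init__(self) -> None:
        self.payloads: list[Payload] = []

    def attach(self, payload: Payload) -> None:
        self.payloads.append(payload)

    def poll(self) -> None:
        for p in self.payloads:
            print(p.name(), p.read())


if __name__ == "__main__":
    bot = Chassis()
    bot.attach(RadiationSensor())  # swap in any conforming payload
    bot.poll()
```

The point of the design is that the chassis maker never sees the sensor maker's code, only the contract, which is exactly the property the roadmap says proprietary platforms lack.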

Last year, the Army launched procurement processes for two standardized modular robot platforms, with an eye toward acquiring an initial total of approximately 4,400 of the devices, says Schwartz. The goal is to ultimately be able to buy compatible hardware—like radiation sensors and robot arms—from a variety of vendors without having to replace vehicle bodies to keep other equipment up to date.

The approach has been at least outwardly praised by some contractors, who say it will allow them to focus their energies on areas of particular expertise.

“As a mobility provider, we can concentrate on really what makes that particular platform able to go and do other things, but we don’t have to keep up with the latest technology with, say, a camera,” says Lincoln Hudson, director of land forces and protective systems at Northrop Grumman.

[Photo: Flickr user Walter]
Beyond the issues of interoperability, the Defense and Homeland Security departments have increasingly invested in technologies to track and take down enemy drones, from radio-jamming and hacking tools to shotgun shells that release nets to ensnare the aircraft.

Experts worry drones could be used for terror attacks in the U.S.

“It’s obviously the perfect weapon to deliver something like a biological agent,” says Kent Ho, a cofounder and partner at venture capital firm Spectrum 28 who has studied the issue.

Ultimately, the future of drone technology may bring a new kind of arms race, with unmanned aircraft developers designing drones that can repel or evade attacks from enemy drones, he suggests. So far, that hasn’t been a critical need of the U.S. drone program, which largely operates where there aren’t airborne threats, says Dan Gettinger, co-director of Bard College’s Center for the Study of the Drone.

“The Predators and Reapers and the [similar] Gray Eagles, they’re not designed for air-to-air combat,” he says. “They’re being flown over Iraq, Afghanistan, Syria, Somalia, and Yemen where there isn’t much threat from air-to-air combat or air defense.”

General Atomics’s Gray Eagles [Photo: courtesy of General Atomics]

Will The Robots Come Alive?

Future generations of military robots will almost certainly operate with more autonomy than comparable machines today—but will they be able to make life-or-death decisions?

“I can’t imagine a case where you’d want a robot to be autonomously making decisions about harming people,” says Endeavor’s Bielat.

But others are imagining that very thing and sounding the alarm, including the Vatican.

The devices could contribute to power disparities between rich and poor nations, said Archbishop Silvano Tomasi, the former Vatican representative to the United Nations, at a 2015 meeting on the subject.

“Another important aspect to which we need to be attentive is the fascination created by armed robots and the feeling of power that they elicit,” he said. “Their use may be linked implicitly to a desire of omnipotence, rather than the desire to make available means which are proportionate for a just defense.”

The argument for autonomy is perhaps strongest when bots are in support roles.

Ground robots could foreseeably carry supplies and navigate difficult terrain when a soldier indicates a point on a map. Ideally, they would also be reliable enough to travel with troops without requiring full-time attention from a member of the unit, says Gettinger.

“If you’re going to have a robot carrying all your gear, you don’t necessarily want to have another person controlling that robot at all times,” he says. “That just adds on extra layers of complexity.”

An unmanned rigid-hull inflatable boat operates autonomously [Photo: U.S. Navy/John F. Williams]
Autonomous ships and subs will be capable of increasingly complicated maneuvers and coordination with other robots, even when they lose contact with their human commanders. A set of autonomous surface boats recently demonstrated by the Office of Naval Research was able to collectively swarm a target and could one day be used for harbor defense.

“During the demo, unmanned boats were given a large area of open water to patrol,” according to a statement from ONR spokesman David Smalley. “As an unknown vessel entered the area, the group of swarmboats collaboratively determined which patrol boat would quickly approach the unknown vessel, classify it as harmless or suspicious, and communicate with other swarmboats to assist in tracking and trailing the unknown vessel while others continued to patrol the area.”
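ONR hasn’t published the swarm’s control logic, but the core allocation problem Smalley describes, deciding which boat breaks off to investigate while the rest keep patrolling, can be sketched with a simple shared rule. In this hypothetical Python sketch (the nearest-boat rule and all names are illustrative assumptions, not the Navy’s algorithm), every boat runs the same deterministic computation on the same shared contact data, so the group agrees on an interceptor without needing a leader:

```python
import math
from dataclasses import dataclass


@dataclass
class Boat:
    boat_id: str
    x: float
    y: float

    def distance_to(self, x: float, y: float) -> float:
        return math.hypot(self.x - x, self.y - y)


def assign_interceptor(boats: list[Boat], contact: tuple[float, float]) -> Boat:
    """Every boat evaluates the same rule on the same data, so each reaches
    the same answer: the nearest boat investigates, the rest keep patrolling."""
    return min(boats, key=lambda b: b.distance_to(*contact))


if __name__ == "__main__":
    patrol = [Boat("alpha", 0, 0), Boat("bravo", 5, 5), Boat("charlie", 9, 1)]
    contact = (8.0, 2.0)
    chosen = assign_interceptor(patrol, contact)
    print(f"{chosen.boat_id} breaks off to classify the unknown vessel")
```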

Similar vessels could patrol hazardous areas or monitor largely dull expanses of sea more cheaply than manned vessels. “If we can send an unmanned boat into a minefield instead of a manned asset, that’s a huge advantage,” says Office of Naval Research program officer Robert Brizzolara.

Naval robots, particularly those that operate underwater, require more autonomy than flying drones or land robots, since they’re more likely to lose connections with human operators, says Lt. Kara Yingling, a spokeswoman for the Navy’s Unmanned Systems Directorate.

“UUV situational awareness requires a degree of autonomy unlike even existing unmanned systems,” she wrote in an emailed statement to Fast Company. “Even Unmanned Aerial Vehicles (UAVs) are somewhat manned if they are remote-controlled by a pilot, whereas a UUV is unlikely to maintain constant human-in-the-loop communications underwater.”

Some autonomous aerial drones will likely fly in larger swarms, making group decisions and changing formations based on their surroundings faster than they could with humans controlling every turn.

“If you ask me to protect a building you might say, well, the building’s too big, I’m going to recruit my friends,” says Vijay Kumar. Kumar is the Nemirovsky Family Dean of Penn Engineering at the University of Pennsylvania and has led research in small, swarming drones, though he emphasizes his research doesn’t focus on weaponizing drones.

Prox Dynamics microdrones

Terminators?

The U.S. military and forces abroad have so far been conservative about how much autonomy military robots should have, amid public fears of “killer robots” run amok or tiny, autonomous spy drones patrolling cities.

“The technology for autonomy is ahead of the customer’s demand for it,” says Bielat, of Endeavor Robotics. “We’re working with customers closely to see what it is that would be most practical for their missions.”

Part of that is a natural reluctance to introduce new, complex, and potentially dangerous machinery into operations. “Any autonomy that’s introduced has to reduce the potential for harm,” says Bielat, and commanders will likely be more inclined to bring self-navigating robots to the battlefield once they’re more established in civilian life.

“I think as we as a society become more comfortable with autonomy through Google or Uber’s self-driving cars, or Tesla’s autonomous driver mode, I think as we become more comfortable, the military customer will also become more comfortable,” he says.

But military leaders are also wary of heading down the road toward what have been called fully autonomous weapons: robots that can select targets, aim, and fire without human intervention. Those devices are already possible too, and not just in the United States: a 2015 BBC report described machine gun turrets developed by South Korean arms makers and deployed across the Middle East. The devices can issue spoken warnings to intruders and shoot on explicit human orders. An initial version could decide to fire on its own, but so far customers have insisted that the versions they buy fire only with human permission, according to the report.

Researchers in the field often distinguish between systems with humans “in the loop,” where they’ve signed off on each target, those with humans “on the loop,” where a robot selects targets and fires but a human has the clear ability to order it to stop, and those with humans “out of the loop,” where humans lack even that authority, perhaps because the robot is out of communications range.
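In software terms, the three models differ only in where human sign-off sits relative to the firing decision. A deliberately simplified sketch (the enum and gate function are invented for illustration and drawn from no real weapons system) makes the distinction concrete:

```python
from enum import Enum


class HumanRole(Enum):
    IN_THE_LOOP = "in"        # a human must approve each engagement
    ON_THE_LOOP = "on"        # the robot acts unless a human vetoes
    OUT_OF_THE_LOOP = "out"   # no human authority, e.g. comms are down


def may_fire(role: HumanRole, approved: bool = False, vetoed: bool = False) -> bool:
    """Toy authorization gate showing how human oversight narrows."""
    if role is HumanRole.IN_THE_LOOP:
        return approved          # silence means no
    if role is HumanRole.ON_THE_LOOP:
        return not vetoed        # silence means yes
    return True                  # out of the loop: no human check at all


if __name__ == "__main__":
    assert not may_fire(HumanRole.IN_THE_LOOP)               # no sign-off: holds fire
    assert may_fire(HumanRole.ON_THE_LOOP)                   # no veto: proceeds
    assert not may_fire(HumanRole.ON_THE_LOOP, vetoed=True)  # a human stops it
```

The uncomfortable part of the taxonomy is visible in the last branch: out of the loop, the default flips from asking permission to asking nothing.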

And so far, top U.S. military officials have sought to keep people in the loop when it comes to robots firing at humans and on the loop in most other cases.

“The Army seeks to maintain human control over all autonomous systems,” officials said in a strategy document published in March. “It will achieve this goal by keeping humans ‘in-the-loop or on-the-loop’ of current and future [robotic and autonomous systems].”

That’s in line with a 2012 set of rules that Ash Carter, then the deputy secretary of defense, issued for autonomous weapons. Carter ordered that the robots be rigorously tested and resistant to hacking, and specifically required that semi-autonomous robots not be allowed to select new targets even if unable to communicate with humans.

Supervised robots with humans on the loop can select their own nonhuman targets–like incoming missiles or, presumably, other robots–to defend humans “for local defense to intercept attempted time-critical or saturation attacks,” the guidelines say. Fully autonomous bots are limited to “non-lethal, non-kinetic force, such as some forms of electronic attack, against materiel targets.”

In practice, that allows for devices like automated, airborne radar jammers, Paul Scharre, a senior fellow and director of the Future of Warfare Initiative at the Center for a New American Security, explained in a 2014 essay.

At least 19 countries and international organizations, including Human Rights Watch, have called for an international ban on autonomous, lethal robots, potentially similar to existing restrictions on undetectable mines and blinding laser weapons.

A group of nongovernmental organizations in 2012 formed what they call the Campaign to Stop Killer Robots, and the United Nations has held a series of international meetings on the subject.

Russian President Vladimir Putin recently expressed interest in advancing the country’s own military autonomous tech and Russia has been a somewhat hesitant participant in international discussions on limiting robots, though talks are set to continue this year.

The U.S. Defense Department’s current five-year policy on autonomous weapons is also scheduled for re-evaluation in 2017, making it quite possible that Presidents Trump and Putin will effectively decide just how autonomous systems will move from the lab to the battlefield.

China is still believed to be developing its own autonomous weapons systems, as well as remote-controlled robots, according to the U.S.-China Economic and Security Review Commission.

“I think China has recently been really emphasizing autonomy and artificial intelligence, and they really see these technologies as integral to developing the next generation of drones and the next generation of military systems,” says Gettinger. “This type of emphasis on autonomy is certainly not limited to the United States.”

“There’s a moral concern that humans would be delegating life and death decisions to a machine, which just seems fully unacceptable to most people,” says Mary Wareham, advocacy director of Human Rights Watch’s Arms Division and global coordinator of the Campaign to Stop Killer Robots. “It seems like they’d cross a red line about what’s acceptable in warfare.”

Human Rights Watch argued in a 2012 report that fully autonomous, lethally armed machines might not even comply with existing international law. They might not be smart enough to recognize the complex cues needed to tell civilians from combatants, the group warned.

“For example, a frightened mother may run after her two children and yell at them to stop playing with toy guns near a soldier,” according to the report. “A human soldier could identify with the mother’s fear and the children’s game and thus recognize their intentions as harmless, while a fully autonomous weapon might see only a person running toward it and two armed individuals. The former would hold fire, and the latter might launch an attack.”

They might also struggle with other factors required under international law: deciding whether the military benefits of an attack outweigh potential risks to civilians, and even whether it’s necessary to apply force at all, such as when a human target appears to be wounded, the group warns. They might also lack the natural human restraint and compassion that can keep troops from doing inhumane things.

“Emotionless robots could, therefore, serve as tools of repressive dictators seeking to crack down on their own people without fear their troops would turn on them,” the report warned.

But some experts say autonomous robots, rigorously developed and properly tested, could even help keep civilians safe in high-risk situations by avoiding the kinds of mistakes a human might make that could endanger unarmed bystanders.

“They can act far more conservatively than human beings and assume more risk,” says Ronald Arkin, director of the Mobile Robot Laboratory and associate dean for research and space planning at the Georgia Institute of Technology’s College of Computing. “They don’t exhibit fear, or frustration, or anger, or the emotions that can cloud human judgments in their operations.”

Arkin opposes a complete ban on autonomous robots, though he says the devices need to be carefully regulated. They shouldn’t be deployed into combat until they’re shown to adhere to international law as well as, if not better than, human troops, he says.

“If we create an outright ban on the technology, then we lose the potential in my mind to help noncombatants and reduce noncombatant casualties,” he says.

And their developers should steer away from programming techniques, like some forms of machine learning, that can lead to decision-making processes humans can’t easily peer into, he says.

“Regulation is the key if these systems are going to be used,” Arkin says. “It should be introduced in a very graded and controlled manner into the battlefield, and not rushed out there.”

What form that regulation will take may well be decided in the next few years. The Defense Department policy, formally Department of Defense Directive 3000.09, is slated to be revisited this year: If it’s not reissued or “certified current” by its five-year anniversary in November, it automatically expires in another five years’ time. It’s so far unclear how President Trump and Defense Secretary James Mattis will choose to proceed, and a Pentagon spokesman declined to comment on potential changes to the policy.

But in December, Steven Groves–now chief of staff to U.S. Ambassador to the United Nations Nikki Haley and then a Trump transition advisor–told Politico the U.S. would be unlikely to support a ban on the weapons, which he deemed necessary to maintain military superiority, though he said at the time that he didn’t officially speak for the administration.

“Congress should fund the research and development of autonomous technology,” he argued in a 2015 report for the conservative Heritage Foundation, where he was then a senior research fellow. “The capabilities of [lethal autonomous weapons systems] to increase U.S. national security have yet to be fully explored, and a preemptive ban or moratorium on such research is against U.S. interests.”

Prox Dynamics microdrones

He has, however, advocated for an international agreement to define a manual specifying when, where, and how autonomous robots can legally be used in combat. That could address some of the concerns about the devices endangering civilians or otherwise proving a step back from existing norms of combat, he said.

“The United States is participating in the Convention on Certain Conventional Weapons discussion on ’emerging technologies in lethal autonomous weapons systems (LAWS),'” a State Department official told Fast Company in an email. “It is premature to know where these expert discussions will lead.”

President Vladimir Putin has also spoken of the importance of Russian research into “autonomous robotic systems” that could “radically change the spectrum of weaponry for the general purpose forces,” according to the state-sponsored Sputnik news agency. Russia abstained from last year’s vote to continue international talks on the subject.

And in December, China called for “a legally binding protocol” to govern the use of fully automated weapons and warned that the devices “will lower the threshold and cost of war, thus making the outbreak of wars easier and more frequent,” though the country stopped short of calling for an out-and-out ban on any particular technology.

“Such systems cannot effectively distinguish between soldiers and civilians and can easily cause indiscriminate killing or wounding of the innocent,” a Chinese report warned. “Consequently, pending an appropriate solution, we call on states to exercise caution in their use and specially to prevent their indiscriminate use against civilians.”


ABOUT THE AUTHOR

Steven Melendez is an independent journalist living in New Orleans.
