
The U.S. is alarmingly close to an autonomous weapons arms race

Weapons systems that think for themselves remain in their infancy, but geopolitical pressures and mistrust may force them into use prematurely.


[Photo: U.S. Army photo by Specialist Carlos Cuebas Fantauzzi, 22nd Mobile Public Affairs Detachment]

BY Mark Sullivan | 7 minute read

One of the Pentagon’s primary jobs is anticipating what the wars of the future will look like so that it can allocate the resources necessary to make sure the U.S. has the edge in those battles. When people in the defense industry talk about the tools of future war, they usually mention applications of AI, autonomous weaponry, and a very different role for warm-blooded human beings during battle.

These technologies are in their early stages of maturity; defense forces don’t yet understand the best ways to deploy them in battle. Military leaders in other wealthy countries, including China and Russia, are also talking about such matters, though we don’t know where they’re placing their bets.

For a number of reasons—some old, some new—the U.S. could easily get pulled into a race to develop and use autonomous weapons before it understands how to use them predictably, effectively, and ethically.

Cold War 2.0

We may enter a period of escalation that recalls the nuclear arms race between the U.S. and the Soviet Union during the Cold War.

“There’s an AI arms race where I’m worried about your development of this technology and you’re worried about my development of this technology, and neither of us communicates that we’re aware of the limitations,” said Chris Meserole, director of research and policy for the Artificial Intelligence and Emerging Technology Initiative at the Brookings Institution. He spoke during a Defense One/Nextgov panel discussion on AI ethics and policy.

“It turns into this self-fulfilling prophecy . . . you enter this spiral where each one assumes the other has the advantage,” Meserole explained. “You can end up in a situation where you’re already fighting, when neither party originally wanted to.”


This mutual paranoia might be even worse with autonomous weapons systems. The development of such systems is moving faster than that of nuclear weapons, and it’s accelerating. Many in U.S. defense circles believe China already has an advantage. The centralized Chinese government, they say, can move faster than the U.S. government to leverage private-sector development of AI and autonomous technology and turn it to use in military applications.

Autonomous systems also differ from nuclear weapons in a more fundamental way. Where nuclear warheads are a singular type of weaponry, AI is an enabling technology that can be used in many types of weapons and support systems. Even a nuclear missile can be outfitted with an AI system that would give it the ability to search out and destroy a specific target.

Experts say AI could radically change warfighting for soldiers. Humans will be asked to quickly digest and act upon large amounts of data while controlling—or defending against—autonomous weapons systems such as drone swarms. They’ll be taking in data from sensors mounted on weapons, satellites, and soldiers’ bodies. The side with the best data and quickest means of processing it may have the edge. The fear of yielding that advantage might force a state actor to accelerate its development of autonomous systems, perhaps without addressing reliability or ethics questions.

Humans: The weakest link

Air Force Chief of Staff Charles Brown told me in an April interview that in order for autonomous weapons systems to be used predictably and ethically, a human decision-maker must always be “in the kill chain” to approve their actions. But it’s likely that as autonomous systems become more advanced, the “human in the loop” will become the factor that impedes the speed and effectiveness of the system. Humans won’t be able to keep up with the machines that are fighting on their behalf.

Retired Major General John G. Ferrari, who also participated in the Defense One/Nextgov panel, illustrated the point by drawing a parallel between autonomous weapons and autonomous vehicles. The safest of today’s self-driving cars might require a human to be at the steering wheel, but experts say that as autonomous driving systems improve and mature they’ll almost certainly prove to be safer on the road than human drivers. After all, human drivers can become tired, distracted, or otherwise impaired; neural networks and sensors can’t.

AI systems may have even more advantages in battle. The Defense Advanced Research Projects Agency (DARPA) demonstrated this when it matched an AI system against an experienced human fighter pilot in a series of simulated dogfights. The AI defeated the human fighter pilot 5 to 0 “through aggressive and precise maneuvers the human pilot couldn’t outmatch,” DARPA said. The human pilot reported that the AI pilot was using “suicidal” tactics during the fight.

“Well, no kidding . . . it has no fear, no fatigue, so many advantages,” said retired General Tony Thomas, now a partner at Lux Capital, on a recent Invest Like the Best podcast.

“And yet the fighter pilot community wants to hold on to a guy or gal in a cockpit,” Thomas said. “To me, I don’t want my kids going up against some AI-enabled fighter technology in the future. We need to leap ahead right now; we need to make the sky black with automated capabilities.”

The near-future of war

The autonomous battlefield may be coming faster than most people think. The U.S. military has been using ground-controlled drones since the 1990s, and it’s now serious about funding technology companies developing autonomous drone swarms as well as companies, such as Anduril, that develop technologies that can defend against autonomous drones.

advertisement

But the U.S. is approaching these technologies more cautiously than some of our adversaries. “Russia is a little more risk-accepting in their willingness to use these technologies,” said the Brookings Institution’s Meserole during the Defense One/Nextgov panel. It’s believed that Russia has used autonomous drones to kill targets on the ground in Syria.

In a more extreme example, Azerbaijan’s use of Israeli-supplied IAI Harop drones in the country’s war with Armenia in 2020 proved decisive. The drones, which can operate autonomously, circled over the Armenian defense line until they could detect a radar or heat signal from a missile battery or tank on the ground. Then they dove down and crashed, kamikaze-style, into targets. The Armenians had an advanced air force with expensive fighter jets, but those were of little use against the Azerbaijani drone attack. The drones were small and lightweight, difficult to detect, and very hard to shoot down with conventional weapons.

The drones shattered the morale of the Armenian forces. The soldiers on the ground knew they could be hit by a drone circling overhead at any time. The drones were so quiet that the soldiers couldn’t hear the whir of the propellers until it was too late. And even if the Armenians did manage to shoot down one of the drones, what had they really accomplished? They’d merely destroyed a piece of machinery that would be replaced. Most of the Armenian casualties and equipment losses came as a result of drone attacks. Azerbaijani and Armenian soldiers rarely even saw each other. It was a very different kind of war, and likely a preview of wars between state actors in the future.

The conflict raised some basic questions about the morality of autonomous war. Azerbaijan won its war with Armenia not because it demonstrated a greater willingness to fight or sacrifice lives, but because it understood the Armenian defense forces and had enough money (partly from supplying natural gas to Europe) to purchase expensive drones from Israel. Is winning really winning if you have no human skin in the game? Apparently so. Armenia surrendered. It was only the second surrender of any country in any war since World War II.

Haves and have-nots

“The vast majority of countries will not have access to this technology,” said Mary Wareham, advocacy director of the arms division at Human Rights Watch, during the Defense One/Nextgov panel. “They’re very concerned about the three countries that do. They do not want to abdicate the responsibility to regulate this technology to . . . China, Russia, and the United States.”

If the world’s powers get caught up in an escalating contest of developing ever-more-lethal autonomous weapons—and ways to defend against them—it won’t be just soldiers who lose lives. The Azerbaijan-Armenia conflict took place within the borders of the two countries. But it’s easy to imagine two wealthy countries waging autonomous war on the ground and in the air over a poorer country.

“The developing world is extremely concerned about autonomous warfare,” Wareham said. “They are extremely concerned about the need to protect human dignity—being killed by a machine is a very undignified way to go, and it’s a cowardly method of warfare.”

Many Americans agree. According to a 2020 Human Rights Watch/Ipsos survey of 19,000 people in 28 countries, more than half of American respondents (55%) favor a ban on autonomous weapons, as do 58% of Russian respondents and 53% of Chinese respondents. 

The U.S. is currently not involved in high-level talks with China, Russia, and other countries about establishing ethical guidelines for the use of autonomous weapons. However, such conversations are taking place among researchers in academia and in think tanks in countries that are now developing such weapons.

But experts fear the window of opportunity for talks and agreements is closing rapidly as work on the weapons advances and pressure mounts on the militaries of developed countries to keep pace.



ABOUT THE AUTHOR

Mark Sullivan is a senior writer at Fast Company, covering emerging tech, AI, and tech policy. Before coming to Fast Company in January 2016, Sullivan wrote for VentureBeat, Light Reading, CNET, Wired, and PCWorld.