Most of Axon’s ethics board has resigned over the company’s fast-tracked idea for a Taser-equipped drone, swiftly collapsing the most prominent advisory panel in the police technology industry.
Following the school shooting in Uvalde, Texas, Axon announced plans last week to mount its Taser weapons on drones to help stop school shootings. The idea sparked concerns about privacy, safety, and potential abuses, and blindsided many of the legal experts, technologists, and former police officials who sit on the Arizona-based policing giant’s 13-member ethics panel.
Nine members of the ethics board announced their resignations Sunday: founding member Barry Friedman, a law professor and director of the NYU Policing Project; Wael Abd-Almageed, Miles Brundage, Ryan Calo, Danielle Citron, Rebekah Delsol, Chris Harris, Jennifer Lynch, and Mecole McBride. (Four board members remain: former Seattle police chief Carmen Best; former LAPD chief Charlie Beck; former California Highway Patrol commissioner Warren Stanley; and Giles Herdale, an associate fellow at the Royal United Services Institute, a London-based think tank.)
“In light of feedback, we are pausing work on this project and refocusing to further engage with key constituencies to fully explore the best path forward,” Axon CEO Rick Smith said in a statement on Sunday.
Yet Smith didn’t write off the possibility of a Taser drone at some point in the future. “A remotely operated non-lethal TASER-enabled drone in schools is an idea, not a product, and it’s a long way off,” he said in the statement. “We have a lot of work and exploring to see if this technology is even viable and to understand if the public concerns can be adequately addressed before moving forward.”
The idea of arming drones with Tasers was first proposed to the ethics board last year, as part of a trial for SWAT teams, and the board voted 8-to-4 against it last month. But last week, the company informed the board that it was pursuing the idea anyway, and expanding it beyond its original scope, just two days before announcing it publicly.
“Reasonable minds can differ on the merits of police-controlled Taser-equipped drones—our own board disagreed internally, but we unanimously are concerned with the process Axon has employed regarding this idea of drones in school classrooms,” the board wrote in a statement released hours after Axon’s announcement on Thursday. “Axon’s announcement came before they even began to find workable solutions to address many of the board’s already-stated concerns about the far more limited pilot we considered, and before any opportunity to consider the impact this technology will have on communities.”
Apart from concerns about exacerbating racial injustice and expanding surveillance, the idea also sparked confusion as to how exactly such a system would work. Some reports said the drones would be operated by humans, but Axon told Fast Company that in some cases, the system could operate autonomously. In a graphic novel Smith published in 2019 (which is cited in the company’s promotional materials), he described a Taser drone that attacks an assailant on its own.
In a Reddit AMA on Friday, Smith faced blistering criticism and questions about the idea. He acknowledged legal risks and training challenges (“most cops spend very little time in training after their first year and the vast majority of time on the streets,” he wrote) and said communications could be a problem, too. “Fortunately, most police agencies use our body cameras and cloud software,” he added, “so we have a footprint to introduce direct communications capabilities.”
Axon, which sells most of the nation’s electroshock weapons and body cameras, said in February that its customers include about 17,000 out of the roughly 18,000 police agencies in the United States. Following the Taser-drone announcement, Axon’s stock price jumped 6% on Thursday.
Formed by Axon in 2018, the AI Ethics Board has provided non-binding recommendations to the company about controversial uses of its products, including Tasers, body cameras, and policing software. Its first report in June 2019, on the use of facial recognition technologies, prompted Axon to postpone the development of face-matching software for body cameras, given the privacy, accuracy, and racial justice concerns. The board’s second report, released in October 2019, called for industrywide regulation of automated license plate readers, after Axon announced it would sell the devices. The board released its 2020 end-of-year report in February 2021.
Last week Smith told Motherboard that he was “proud” of the ethics board and its dissenting stance. Outside ethics panels have proven to be complicated endeavors for tech companies as they struggle with the implications of their own artificial intelligence research. In 2019, Google disbanded its AI advisory board just one week after it was formed, amid controversy over its membership, which included the CEO of a military drone technology company.
When the Axon board was founded, Friedman hoped it could eventually provide ethical guidance for the entire police technology industry.
“I have this vague hope that something like that can turn into an industry advisory board because the industry desperately needs one,” he told Fast Company in 2018. “I really worry about these technologies just being put into use without any intentionality or thought behind it, and some of them I think are deeply problematic. I think to take a group of people of various backgrounds and try to get some input from them about that is very important. And I think the industry should be doing that—because customers are all too quick to snap up the technology.”