
Facial recognition has all sorts of technological and ethical problems. What if we tried to prevent crime by using software to spot weapons instead?

The Capitol riot is spurring new interest in gun-detection AI

Trump supporters clash with police and security forces as they storm the US Capitol in Washington, DC on January 6, 2021. [Photo: Roberto Schmidt/AFP via Getty Images]

By Mark Sullivan

A new round of debate over surveillance technology broke out after the riot at the U.S. Capitol on January 6. As some observers wisely pointed out, such events make it tempting to loosen restrictions on surveillance technologies such as facial recognition in the name of safety, but yielding to those temptations could lead to a rapid erosion of privacy and civil liberties. The Patriot Act, which Congress adopted after 9/11, and the mass surveillance programs that followed are a notable example.

Capitol security already requires lawmakers to pass through new metal detectors, which has infuriated some of Congress's more libertarian members. Congress could feasibly go further and install new cameras that identify every face in the vicinity of the Capitol, perhaps checking each against social media and public-records data, as a way of preventing future attacks.

Shifting the focus from detecting human faces to detecting the instruments of harm people carry might be a way to satisfy security needs while preserving civil liberties. Companies such as ZeroEyes and Omnilert sell computer-vision AI systems that identify guns, not people, seen through the lenses of security cameras. Both companies report a sharp uptick in interest after the assault on the Capitol, video of which clearly shows people carrying firearms.

“Unfortunately our business does well when bad things happen,” says ZeroEyes CEO Mike Lahiff. Lahiff, an ex-Navy SEAL, says he started the business after the mass shooting at Marjory Stoneman Douglas High School in Parkland, Florida on February 14, 2018. “I went into my kids’ school and I saw all these cameras around and asked what they were doing with them,” he says. A school official told him they were used mainly to identify students who’d stolen something from a locker or been in a fight. “I thought, why not use those cameras to detect guns?” Lahiff explains.

In the beginning, ZeroEyes considered selling facial recognition software. But Lahiff says that when the company proposed the product to schools, many people in the room would register their unease with the technology. “AI is already a concern for people, and with anything having to do with facial recognition, privacy becomes top of mind,” he says. So ZeroEyes decided to specialize in object-recognition AI.

There is currently no comprehensive federal law governing the use of facial recognition AI. Some states have passed laws limiting its use, and the state of Washington has passed a comprehensive law restricting use of the technology by government agencies. In 2019, San Francisco became the first city to ban the use of facial recognition by police and other city agencies. Portland, Oregon, followed.

Some big tech companies that sell the technology have tapped the brakes. Last summer, Amazon placed a one-year moratorium on selling the tech to law enforcement. Microsoft said it would stop selling facial recognition until governments pass regulatory guidelines.

False positives

The most serious criticism of gun-detection AI concerns false positives. “This technology, which tends to involve object and behavior recognition, is far from accurate,” says Jennifer Lynch of the Electronic Frontier Foundation. “For example, the technology adopted by the Lockport school district in New York misidentified broom handles as guns.”

Lynch is referring to reporting by Vice's Todd Feathers on Canadian facial recognition tech provider SN Technologies. Along with mistaking brooms for guns, its AEGIS system also showed a racial bias when identifying people at the upstate New York school district.
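In evaluation terms, Lynch's complaint is about precision: the share of alarms that correspond to real weapons. A minimal illustration with made-up counts (not Lockport's actual numbers):

```python
# Hypothetical tallies for one deployment, for illustration only.
true_alarms = 2    # actual weapons correctly flagged
false_alarms = 48  # brooms, phones, umbrellas flagged as guns

precision = true_alarms / (true_alarms + false_alarms)
print(f"Precision: {precision:.0%}")  # 4% -- most alerts would be false
```

At precision that low, the people fielding the alerts face constant alarm fatigue.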


For ZeroEyes’ AI to detect a weapon, it must be at least partially exposed. “It’s all based on pixels,” says Lahiff. According to him, with some training the technology can detect a gun on surveillance video about as well as a human being could. And, he adds, there’s a “human in the loop,” meaning that all positive gun detections are sent to a control center where a human being confirms a positive detection before sounding the alarm to the client and to law enforcement.
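Neither company publishes its detection code, but the “human in the loop” workflow Lahiff describes can be sketched in broad strokes: the model surfaces candidate frames, and nothing reaches the client or law enforcement until an operator signs off. The class label, threshold, and function names below are illustrative assumptions, not ZeroEyes' actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # class predicted by the vision model, e.g. "gun"
    confidence: float  # model score between 0.0 and 1.0
    camera_id: str     # which surveillance feed produced the frame

# Illustrative cutoff; a real deployment would tune this per camera.
REVIEW_THRESHOLD = 0.6

def route_detection(det: Detection, operator_confirms) -> bool:
    """Escalate a candidate detection only after human review.

    `operator_confirms` stands in for the control-center step Lahiff
    describes: a person looks at the frame and accepts or rejects it.
    Returns True only if an alarm should go to the client and police.
    """
    if det.label != "gun" or det.confidence < REVIEW_THRESHOLD:
        return False                # too weak to send to an analyst
    if not operator_confirms(det):  # the human in the loop
        return False                # judged a false positive; no alarm
    return True                     # confirmed sighting: sound the alarm

# Demo with an operator stub that approves everything.
candidate = Detection(label="gun", confidence=0.82, camera_id="lobby-3")
print(route_detection(candidate, operator_confirms=lambda d: True))  # True
```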

“Avoiding and managing false positives and [maintaining] privacy are the two things you have to get right in order to sell this,” says Dave Fraser, the CEO of ZeroEyes’ competitor Omnilert. Omnilert’s main business has been selling multi-channel mass alert systems to schools, manufacturers, and retailers. Because any gun-detection system involves mass alerts, it made sense for Omnilert to add on the gun-detection AI, Fraser tells me.

Fraser says his company’s computer-vision algorithm makes a series of determinations before reporting that it’s spotted a weapon. It first tries to establish that a human being is actively wielding the gun, and that it’s not holstered or just lying on a table. Only after that has been established is the sighting communicated to a human being for review. If the operator confirms that it’s a gun, the alert system goes into action and notifies law enforcement. The system can also integrate with building security systems that can lock down the building, and even lock an active shooter in a certain area of the building.
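Fraser doesn't detail the algorithm, but the staged determinations he describes map naturally onto a short filter chain: weapon present, actively wielded, then human review. A minimal sketch, assuming a hypothetical per-frame summary (these keys are illustrative, not Omnilert's real schema):

```python
def should_escalate(frame: dict) -> bool:
    """Staged checks before a sighting reaches a human reviewer."""
    if not frame.get("gun_detected"):
        return False    # stage 1: no weapon-like object at all
    if not frame.get("held_by_person"):
        return False    # stage 2: gun is just lying on a table
    if frame.get("holstered"):
        return False    # stage 2: holstered, not actively wielded
    return True         # stage 3: hand off to an operator

# A brandished gun passes; a holstered one never reaches a reviewer.
print(should_escalate({"gun_detected": True, "held_by_person": True, "holstered": False}))  # True
print(should_escalate({"gun_detected": True, "held_by_person": True, "holstered": True}))   # False
```

Only after the operator confirms the sighting does the system alert law enforcement and, where integrated, trigger lockdowns.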

Fraser says that it’s not just events like the Capitol attack that are driving up demand for gun-detection technology. The last year has seen a spike in gun sales, driven by the pandemic, social instability, and fears that gun ownership will be restricted. During the seven months between March and September 2020, Americans bought 15.1 million guns, a 91 percent increase from the same period a year earlier, according to data collected by The Trace. Because there are more guns out there, Fraser says the burden of detecting them in public spaces falls on technology.
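For scale, that statistic implies a pre-pandemic baseline of roughly eight million guns over the same months, a quick back-of-envelope check:

```python
# Sanity check on The Trace's figures: a 91% year-over-year increase
# to 15.1 million guns implies the March-September 2019 baseline below.
march_sept_2020 = 15.1e6
implied_2019 = march_sept_2020 / 1.91
print(f"Implied March-Sept 2019 sales: {implied_2019 / 1e6:.1f} million")  # ~7.9 million
```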

Ultimately, gun-detection AI probably wouldn’t have meaningfully changed what happened at the Capitol last week. It can only detect and warn; it must be followed by a forceful and proportionate response to neutralize the danger. It’s all too clear that such a response was not ready at 2 p.m. on January 6th.



ABOUT THE AUTHOR

Mark Sullivan is a senior writer at Fast Company, covering emerging tech, AI, and tech policy. Before coming to Fast Company in January 2016, Sullivan wrote for VentureBeat, Light Reading, CNET, Wired, and PCWorld.

