
A new pack of “dogs” loaded with surveillance technology could creep its way into our communities through a federal backdoor.

Who let the robot dogs out?

[Photo: MARK RALSTON/AFP via Getty Images]

By S.A. Applin

Robot dogs are getting smarter.

In early demos, robot dogs debuted as “pack mules,” shown carrying soldiers’ payloads during forest treks. As they have become commercialized, new upgrades have given robot dogs sensors, color cameras, increased mobility, and surveillance-enhancing features. These new capabilities have further piqued the interest of both federal and local law enforcement agencies, which have been strategizing about how the robots might be used in more local contexts.

Past local law enforcement trials of robot dogs have failed amid citizen outcry, and for now the focus has shifted outside of cities as the federal government plans to pilot the machines at the border through the Department of Homeland Security.

This doesn’t mean that they won’t be back. The DHS enforces borders, which include the ports in major cities, as well as “immigration” areas (which could be anywhere). Thus, the agency’s robot dogs could be wandering the streets of our cities sooner than we might expect. If that happens, how we are able to protest the tech or object to its presence will become more complicated.

From ‘pack assistants’ to mobile surveillance data platforms

Robot dogs are four-legged metal robots with articulated joints that enable them to move over smooth and uneven terrain. They don’t have heads or tails (or even look much like dogs), but they can cover ground, climb, carry loads, and now, with added sensors and cameras, collect data on the areas (and people) around them. 

With modern upgrades, Boston Dynamics’ Spot robots and those from its rival, Ghost Robotics, have been rebranded from “pack assistants” to full-featured mobile surveillance data platforms. Each robot dog manufacturer touts different features and use cases, but all share “surveillance” as a core feature. In May 2022, Hyundai-owned Boston Dynamics released new features for its Spot robot (billed as an “agile mobile robot”) with a marketing video showing Spot’s transformation into a sensor- and color-camera-laden, mobile, human-controlled remote surveillance “Dynamic Sensing Solution.”

Robot dogs collect data and feed databases that can be accessed, replicated, shared, mined, and acted upon. Spot now climbs as well as walks—all while taking photos and using its many sensors to collect data for its manufacturer’s customers.

Ghost Robotics’ “Ghost” robot dog is marketed toward military usage and objectives, and is shown maneuvering across ice, bricks, puddles, and riverbeds, and climbing steep terrain and stairs. Ghost Robotics describes the company’s goals as building “cutting edge solutions addressing defense, homeland, and enterprise customer needs and leveraging the latest in sensors and comms hardware, as well as operational and autonomy applications and AI.” That covers nearly any application possible, and its description of applying “autonomy and AI” to its surveillance (and databases) is extremely alarming.

Releasing the hounds to the fury of the public

As law enforcement continues to conduct pilots with the tech, unpopular experimental trials have pushed robot dogs from cities to the edges of society, which is where federal law enforcement is stepping in.

In 2019, Massachusetts State Police became the country’s first law enforcement agency to use Spot for a short trial as a “mobile remote observation device,” which raised concerns with citizens, the ACLU, and robot law experts.

Two years later, the New York City Police Department touted its Boston Dynamics “Digidogs” (i.e., Spot robot dogs) as a new tool that could be deployed into dangerous situations as a way to potentially “save lives,” as New York Inspector Frank Digiacomo put it. The NYPD’s pilot program placed robot dogs on New York City streets to aid officers in so-called dangerous areas—locations that included lower-income housing, a pattern that raised strong concerns about bias.

The New York City public did not warm to the idea of the robotic surveillance dogs roaming its streets, and the NYPD canceled the Digidogs’ contracts early (and abruptly) after their presence raised many issues, including those of public safety, trust, and security, as well as questions of how (or if) police tasks should or could be automated.

Boston Dynamics has claimed that its robot dogs are mostly deployed in remote areas aiding utility and construction workers, and as such, are “not designed to be used as weapons, inflict harm or intimidate people or animals.” But that message was inconsistent with the NYPD’s use of the robot dogs, which certainly intimidated people, among other reported concerns. And those complaints came before additional surveillance features were implemented.


Testing Robot Dogs at the Border

Local police departments may have temporarily struck out with robot dogs in cities, but federal law enforcement is now testing them as a means of border surveillance. A year after Boston Dynamics’ declaration of more banal uses, and the failed NYPD Digidog pilot, a new robot dog from Ghost Robotics is being trained for deployment to the U.S.-Mexico border.

In February, the U.S. Department of Homeland Security released an article describing its “need” for Ghost Robotics’ Ghost robot dogs at the border, claiming that the terrain is “inhospitable” to humans. The DHS also claimed that the robots will be used to protect officers against natural hazards as well as to “force-multiply” the Customs and Border Protection (CBP) presence against other “threats.” The article goes on to explain that these threats can “take Border Patrol Tactical Operators into towns, cities, or ports.” That would give DHS cover to deploy these dogs in areas closer to home under federal jurisdiction, creating a path for these mobile data collection machines to be used in our cities under different legal conditions. CBP’s own Agent Brett Becker said as much in that DHS article: “[T]here are plenty of risks closer to home, too,” he said. “For instance, when missions take Border Patrol Tactical Operators into towns, cities, or ports, they can encounter hazardous environmental conditions, volatile individuals, or hostile threats. These situations can all be inherently dangerous.”

And as such, we will have limited ways to object to DHS-deployed robot dogs. Because federal law supersedes local law, any DHS expansion of robot dogs into ports and cities will carry more legal authority than local police deployments, and changing federal law requires consensus from more people, who may hold a broader range of cultural and political views.

Dog Bites Man

Deploying surveillance robot dogs locally, a bad choice for the overall sociability and functionality of communities, is being promoted as a positive choice for law enforcement officers who feel their lives are constantly at risk. Increasingly, police departments seem to be seeking ways to enforce laws without having to confront or connect with people, building on a trend of relying more on data and firearms than on community and human engagement. (Some communities have even been vocal in their requests for police to take more personal risks on the job.)

Robot dogs do not seem to be a win for public—or likely officer—safety in daily operation. The skills required to navigate and control a robot differ from standard police training, and they move the role of policing further into technology operation. There is also a simple issue of attention bandwidth. Most police officers already manage a complex technological routine on the job, and robot dogs further increase an officer or border agent’s cognitive load. Having a robot dog “accompany” an officer means it must be controlled either by another officer, which ties up an additional person, or by the officer themselves, whose attention then oscillates between the robot’s controls and the public streets, on top of managing multiple communication channels. Robot dogs require battery monitoring and maintenance, too, which means that at times they will need an officer’s full focus. An officer distracted by a robot could be vulnerable to physical threat and harm as well.

One of our recurring problems in society today is a lack of clear, good communication between law enforcement and individuals in communities. Mediating interactions and actions via robot removes human agency and reduces the conversation to a fraction of the bandwidth it needs for us to understand each other. People will have no way to respond, and no agency, with a robot dog acting as a law enforcement agent. Programming could go wrong, the officer controlling the robot could be distracted, and many other unforeseen circumstances could demand the human flexibility and creativity that a robot would eliminate, with potentially dire consequences.

Additionally, the manufacturers of robot dogs are trying to find the sweet spot for their products. Pivoting to “robot dogs as a mobile surveillance platform” brings them business in a world where data collection is profitable. Boston Dynamics advertises its platform as offering customizable data for its customers, turning types of surveillance into preferences that can be installed like features on a new car. As citizens, we don’t know where that data—which may include us—is going, or how it will be used.

Surveillance data, autonomously collected, does not include context or an understanding of the people whose data is collected. Put into a database and accessed by others without being properly understood, it creates a platform for rumors and mistakes, for one-sided surveillance, and for little accountability to protect our most vulnerable people and, eventually, the rest of us.

The decision to use robot dogs to protect the lives of law enforcement seems to override any concern over how that same tech could put others’ lives at risk. The DHS robot is intended to be used (in some way) against humans trying to enter the U.S. No matter where someone stands on immigration, the deployment of robots to engage with humans at the border becomes a humanitarian issue for all of us: Are we comfortable with remote-controlled robots interacting with at-risk humans, be they criminals, refugees, low-income citizens, or others who exist (or are forced to exist) in the margins of society? And, as robot dogs roam back into cities under a “cloak of federal law,” are we comfortable with remote-controlled, autonomous, AI-driven robots interacting with, and recording, us?

The public has expressed many concerns about law enforcement robot dogs wandering in cities, but now that these dogs are also surveillance platforms, the data they collect, and the longer-term storage and uses of that data, must also be a concern.


S. A. Applin, PhD, is an anthropologist whose research explores the domains of human agency, algorithms, AI, and automation in the context of social systems and sociability. You can find more at @anthropunk and PoSR.org.
