Millions of New Yorkers have been unknowingly surveilled since 9/11. Numerous reports over the years have described mass surveillance programs carried out in the name of national security. Though these operations are now more widely known, how they worked and what they ultimately produced has remained opaque. Today, The Intercept published an exposé describing a partnership between the NYPD and IBM that grew out of the agency's video surveillance program.
Essentially, in the years following the September 11 attacks, New York City installed thousands of street cameras as part of a large-scale anti-terrorism operation, but it wasn't clear how the city was handling all that footage. It turns out IBM has been helping.
According to documents surfaced by The Intercept, as well as interviews conducted with people familiar with the operation, IBM was given access to thousands of images taken by these NYPD cameras. This stream of data helped the technology giant build out its video analytics software, which offered fine-tuned search options, including the ability to search for images of people with a certain hair color, facial hair, or skin tone. IBM did not comment to The Intercept about its use of this footage, but the NYPD did admit to sharing the data with the company.
While the revelation that the NYPD mass surveilled the city for years is (depressingly) not surprising, the fact that the agency shared its terabytes of data with IBM is a notable development. What's more, the partnership led IBM to develop software that lets users search video footage by skin color. In effect, racial profiling technology was created and fine-tuned by IBM with help from the NYPD and millions of unsuspecting New Yorkers.
The company has since been selling this software, which has been used in places like university campuses. As Fast Company reported earlier this year, IBM has been telling law enforcement agencies interested in its Intelligent Video Analytics software about the program's ability not only to count people but also to perform “lying and body detection.” The software, it turns out, could detect other things, too.
And now that this partnership has come to light, civil liberties advocates are worried about the secrecy surrounding the technology and how it may ultimately be used to profile thousands of unknowing citizens based on race.
You can read the full Intercept story here.
Update: IBM provided Fast Company with this statement:
IBM is committed to responsibly advancing and using new technologies and is recognized for policies that promote diversity and inclusion. These values are why we have numerous programs underway to better identify and address bias in technology, including making publicly available to other companies a dataset of annotations for more than a million images to help solve one of the biggest issues in facial analysis — the lack of diverse data to train AI systems. IBM would not bid on client work that would enable ethnic bias.