The FTC's New Rules On Facial Recognition Tech Warn Against Abuses

A new set of guidelines from the FTC warns companies against abusing facial recognition technology at the expense of user anonymity.

As companies develop more and more applications for facial recognition technology, the Federal Trade Commission is trying to make sure privacy remains paramount with a new set of suggested guidelines it released yesterday outlining several "best practices."

The FTC's suggestions fell into one of three pillars:

1. Companies should factor privacy into every point of the product development process, taking measures to ensure data security and to prevent third-party data scraping.
2. Companies should determine how long it's appropriate to retain consumers' biometric data and dispose of it where appropriate. For example, if a consumer uploads a photo of herself to Ray-Ban's Virtual Mirror so she can virtually "try on" some sunglasses but then later deletes her account, Ray-Ban should discard her data, per the FTC's suggestions.
3. Companies, particularly those that develop camera-equipped digital signs, should let passersby know they're interacting with facial recognition software. They should not place such signs in "sensitive" (and obvious no-no) areas such as "bathrooms, locker rooms, health care facilities, or places where children congregate."

The FTC also advises that social networks should obtain express consent from consumers before collecting or using biometric data in at least two scenarios: when they're about to use that data for a purpose different from the one stated at the time of collection, and when they use facial recognition to identify someone who would otherwise have remained anonymous.

FTC Commissioner J. Thomas Rosch was the report's lone dissenter, claiming the FTC is going "too far, too soon." He also argued that facial recognition technology, as it stands, doesn't yet pose a threat serious enough to amount to "tangible injury."

Meanwhile, the report points out that facial recognition technology is constantly improving: the false reject rate of facial recognition systems (the rate at which a system wrongly rejects a match between two images of the same face) was halved every two years from 1993 to 2010, until the error rate fell below one percent. Higher-quality digital cameras and lenses are being developed constantly.

Among the large-scale projects that implement facial recognition is the FBI's $1 billion Next Generation Identification program, which scans photos against the FBI's own database. At the new International Finance Center Mall in Seoul, shoppers who stop at one of the mall's information kiosks are scanned by nearby cameras to estimate their age and gender, data that could be used to create customized ads. And in September, months after first coming under fire from EU data officials, Facebook switched off its controversial facial recognition feature that identified people in users' Facebook photos and suggested friends to tag in those photos.

The uproar surrounding the Facebook case gets to the heart of why facial recognition technology is a particularly gnarly form of data tracking: once identified, a person can never be retroactively un-identified. "A consumer’s face is a persistent identifier that cannot be changed in the way that a consumer could get a new credit card number or delete a tracking cookie," per the FTC report. In a future where facial recognition could be used to identify strangers in public places, it becomes easier to imagine a world in which protecting one's anonymity shifts from a passive state to an active effort.

[Image: Flickr user justinpickard]
