In an announcement that raised plenty of eyebrows, Facebook said this week that it wants to help potential revenge porn victims keep intimate images of themselves off the social network by having them upload those very images via Facebook-owned Messenger. The effort, being rolled out in an initial test in Australia, extends Facebook's existing tools for identifying and removing revenge porn imagery. The new tool lets Facebook create digital fingerprints of the images, which can then be used to block matching photos posted later for revenge porn purposes.
The company promises it isn’t storing the images, and that the goal is simply to keep vengeful people from sharing intimate imagery on Facebook “in the first place,” the company’s head of global safety, Antigone Davis, wrote in a blog post.
Unfortunately, given Russia's use of Facebook to interfere in last year's American election, and the company's shifting explanations of how that happened, public trust in Facebook is low at the moment, and some doubt whether the company is able, or even willing, to truly erase the imagery.
But advocates for domestic violence victims think the new tool is a “bold move” and say that privacy concerns might be exaggerated.
“The timing is unfortunate because of scrutiny on other issues,” says Cindy Southworth, the executive vice president and founder of the Safety Net Technology Project, and someone who’s been advising Facebook on the development of the new tool for over a year. “I think it’s getting overblown because of much bigger frustration over Russia…. I don’t want victims of domestic violence to lose out because of this timing.”
For some time, Facebook, as well as other tech giants like Twitter and Google, has had systems in place that allow revenge porn victims to report offending imagery and have it taken down. The new tool goes a step further: it lets people send in images they fear might be used for revenge porn so that Facebook can proactively create a digital fingerprint and automatically reject matching images should they be posted later.
The pilot program works like this: Australians concerned about potentially becoming victims of revenge porn first complete a form on that country’s eSafety Commissioner’s Web site and then send the image or images they’re worried about to themselves via Messenger. The eSafety Commissioner’s office then notifies Facebook that the images have been sent, after which a member of Facebook’s Community Operations team reviews the image or images in order to “hash” them, which “creates a human-unreadable, numerical fingerprint,” Davis wrote.
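To make the fingerprinting idea concrete, here is a minimal sketch of hash-and-block matching. It is not Facebook's implementation: the function names and the blocklist are hypothetical, and it uses a cryptographic hash (SHA-256) purely for illustration, whereas production photo-matching systems use perceptual hashes that survive resizing and re-encoding.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-length, human-unreadable fingerprint.

    SHA-256 is used here only to illustrate the concept; it matches
    byte-identical files, not visually similar ones.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical set of fingerprints derived from reported images.
# Only the fingerprint is kept, not the image itself.
blocked_fingerprints = {fingerprint(b"reported-intimate-image")}

def allow_upload(image_bytes: bytes) -> bool:
    """Reject any upload whose fingerprint matches a reported image."""
    return fingerprint(image_bytes) not in blocked_fingerprints

print(allow_upload(b"reported-intimate-image"))  # matches -> blocked
print(allow_upload(b"unrelated-photo"))          # no match -> allowed
```

The key property this models is the one Davis describes: once the fingerprint exists, the original image no longer needs to be stored for matching to work.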