In response to controversy over its questionable experimentation on its users, Facebook has announced that it will be changing how it conducts research.
Earlier this year, Facebook researchers published a study in the Proceedings of the National Academy of Sciences revealing that the social network manipulated the News Feeds of its users to measure their emotional responses. People felt betrayed, if not violated, and academics questioned whether Facebook’s broad data-use policy really qualified as the kind of “informed consent” that is usually the standard for involving humans in research. In June, Virginia Senator Mark Warner asked the Federal Trade Commission to formally investigate the company’s research practices. That same month, the lead author of the study, Adam Kramer, published a sorry-not-sorry on his Facebook page and mentioned that Facebook would be reviewing its internal practices.
On Thursday afternoon, Facebook chief technology officer Mike Schroepfer published a more comprehensive update on the company’s blog in response to the criticism. He listed the ways in which Facebook planned to revise its research practices, including updated guidelines, an internal review panel, new kinds of training, and a website that will post all of the company’s research work. He writes:
“It is clear now that there are things we should have done differently. For example, we should have considered other non-experimental ways to do this research. The research would also have benefited from more extensive review by a wider and more senior group of people. Last, in releasing the study, we failed to communicate clearly why and how we did it.”
There’s clearly more fessing up here than Facebook was willing to do before, and embracing an ethical review board is a step in the right direction. However, as a Microsoft working paper on Facebook’s ethical dilemma noted earlier this year, internal review boards can still carve out quite a bit of wiggle room for themselves. They’re no guarantee of fair play.
At the same time, questions linger about what it was that made people so angry about the Facebook study in the first place. The emotional trigger might not have been the issue of consent at all, but the possibility that Facebook could use the research to improve its ad targeting and make more money from its users. If that’s the case, Facebook can adjust its research guidelines all it wants, but there’s no way it’s going to turn off the firehose of ad money that is determining its future.