Facebook upset the Internet this weekend when a study co-authored by one of its data scientists revealed that the company had socially engineered the news feeds of 689,003 users to make them feel feelings.
For one week in January 2012, those users became unwitting guinea pigs as their feeds were flooded with either positive or depressing status updates from their friends. The goal was to assess whether the mood of their news feeds affected their own mood and behavior, and—surprise!—it totally did. Feelings felt.
Critics cried foul. A common argument from academic researchers is that Facebook needed users' "informed consent" to conduct psychological experiments on them, with informed being the operative word. Facebook's most ardent defenders, however, pointed to the social network's Data Use Policy, which states that the company can make changes whenever it darn well pleases in the name of "internal operations," including "testing, research, and service improvement."
It is already a complex case, with compelling arguments on both sides about what Facebook can and can't do. Now, though, there is another wrinkle: Kashmir Hill at Forbes (whose reporting on the matter, I should add, has been nothing short of excellent) reports that the little line in Facebook's liberal Data Use Policy that allows it to experiment on users under the auspices of research was actually added to the Terms of Service in May 2012, four months after the emotion-manipulation study took place.
On the surface, that would seem to throw a wrench into the argument that Facebook can experiment on users as it pleases. But as Hill writes, "Let's be candid: No one actually reads these things anyway."