When Facebook performed an experiment to see if it could secretly affect users' emotions, were you really surprised?
Academics were galled, knowing they would have needed explicit permission from research subjects and strict ethical oversight to perform such an experiment. A privacy group filed a complaint with the Federal Trade Commission.
But many Facebook users responded with a shrug, having long accepted that targeted ads and extensive data collection are permanent features of life online.
In the 21st century, user participation has come to equal user consent, a social contract governed by massive terms-of-service agreements that few users fully read or understand.
However, that social contract has come under scrutiny this week as news spread of Facebook's 2012 emotion-manipulation experiment, which was carried out on nearly 700,000 users without their knowledge.
Facebook data scientist Adam D.I. Kramer, who would later say he was concerned that users might leave the network if using Facebook made them sad, carried out the weeklong experiment with input from two outside academic researchers from Cornell University.
The researchers wanted to see whether emotions were contagious on Facebook. If users saw a greater proportion of happy statuses from friends in their newsfeeds, would they feel happier? And if they saw more sad updates, would they be sadder?
According to the results of this experiment, Kramer and the Cornell researchers, Jamie E. Guillory and Jeffrey T. Hancock, found that emotions appeared to be contagious on Facebook.
Since then, Facebook and the researchers have been barraged with criticism. The editor of the Proceedings of the National Academy of Sciences, which published the study in June, noted that researchers' failure to obtain participants' informed consent was "a matter of concern."
Facebook officials and other industry insiders argue that Facebook has a right to conduct testing on its own service to improve its product, and that users implicitly agree to the testing when they accept the company's sweeping terms of service.
"This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated," Sheryl Sandberg, Facebook's chief operating officer, told the Wall Street Journal. "And for that communication we apologize. We never meant to upset you."
Similar large-scale user testing goes on all the time, internally, as a normal course of business, officials said. "They're always trying to alter people's behavior," one former Facebook Data Science team member told the Journal this week.
In fact, three Facebook researchers wrote in April that "we run over a thousand experiments each day," adding that "many online experiments are implemented by engineers who are not trained statisticians."
A competing school of thought argues that Facebook's experiment is symptomatic of a power imbalance between companies and users, who cannot negotiate their terms of service, who have little control over their data, and who have little oversight of what companies do with their private information.
“Facebook knows a lot about us, and we don’t know what it’s modeled about us, right?” said Zeynep Tufekci, a sociologist at the University of North Carolina.
Now, Tufekci said, with Facebook's emotions study, "We know that they can move this needle. ... It's power."
Kramer, the Facebook data scientist, said the study's effects were minimal and likely unnoticeable to the average user.
But critics say any harm that may or may not have resulted from the 2012 study -- or any of Facebook's other research -- remains secret.
“We don’t know if anybody in the study group killed themselves,” said James Grimmelmann, a law professor at the University of Maryland.
"It appears to not have been a disaster," Grimmelmann added of the study. "But it exposed very clearly that, one, the immense power that Facebook does have to manipulate our environment, and two, the attitude that Facebook and other companies have is that experiments are normal and routine and just something that they do and they don't even see the issue.
"That's why they give apologies like, 'I'm sorry you were offended,' not, 'I'm sorry it happened,'" Grimmelmann continued. "And three, it exposes a gap in our system of oversight. If this research had happened in a university, they absolutely would have had to have informed consent."
Facebook's internal research could be governed by Federal Trade Commission rules that forbid "a representation, omission, or practice [that] misleads or is likely to mislead the consumer" or any act that "causes or is likely to cause substantial injury to consumers."
Facebook has previously run afoul of federal rules for how it treated users. In 2011, for example, it faced FTC charges that the company had misled consumers by telling them their information would be kept private when, in fact, the company "repeatedly allow[ed] it to be shared and made public."
On Thursday, a privacy group, the Electronic Privacy Information Center, filed a complaint with the FTC over the 2012 experiment, arguing it was a deceptive trade practice that also violated Facebook's promises about user privacy.
Facebook also faces scrutiny from Britain's Information Commissioner's Office and Ireland's Data Protection Commissioner to see if any of those countries' regulations were violated, officials from both offices told The Times.
"When someone signs up for Facebook, we've always asked permission to use their information to provide and enhance the services we offer," a Facebook spokesman wrote, on condition that he not be identified. "To suggest we conducted any corporate research without permission is complete fiction."
Of course, as defenders of the service have said, users who are upset about being experimented on could simply quit Facebook.
But as the University of North Carolina's Tufekci pointed out, given how huge the service's user base is, "you don't have a Facebook alternative."
"You either accept, or you cut off your social networks," Tufekci said. "And social networks are the stuff of life."