
Facebook tinkered with users’ emotions in experiment

A man poses for photographs in front of the Facebook sign on the company's campus in Menlo Park.
(Jeff Chiu / Associated Press)

Facebook has been playing with its users’ emotions, and now a lot of people are upset.

For one week in 2012, hundreds of thousands of Facebook users were unknowingly subjected to an experiment in which their news feed was altered to see whether certain kinds of content made users happy or sad.

The research that resulted from that experiment, which was published in an academic journal this month, said emotions appeared to be contagious: If users saw happier posts from friends in their Facebook news feed, they were more likely to post their own happy updates. Sad updates appeared to have a comparable effect.

In other words, the study seems to show you are what you eat, as the saying goes -- except in that metaphor, you usually get to choose what you put in your mouth.


Now, Facebook, which uses a secret algorithm to control what it shows users on its popular news feed, faces another round of allegations that the world’s largest social-media network is being a little too creepy and manipulative.

After the study started to receive widespread scrutiny on the Web, Adam D.I. Kramer, a data scientist at Facebook and one of the study’s authors, wrote in a post Sunday: “In hindsight, the research benefits of the paper may not have justified all of this anxiety.”

Kramer added that he and the paper’s coauthors were “very sorry for the way the paper described the research and any anxiety it caused.”

The other two researchers involved were Jamie E. Guillory of UC San Francisco and Jeffrey T. Hancock of Cornell University.

The research, which was published in the June 17 issue of Proceedings of the National Academy of Sciences, has also drawn complaints that the academics involved strayed from typical standards about participant consent, especially in a study that could have had negative effects.

The research paper’s editor, Susan T. Fiske of Princeton, told the Los Angeles Times that the authors told her the research had been approved by Cornell’s Institutional Review Board on the basis that Facebook had already performed the study. However, she said, she did not confirm that approval herself. A spokesman for Cornell could not immediately confirm or deny its review board’s involvement late Sunday.


Such boards, which are common at many universities, exist to ensure that researchers don’t harm their subjects and that they obtain what’s known as “informed consent” from their subjects.

Separately, spokesmen for Cornell and Facebook told The Times on Monday that an earlier announcement from the university stating that the research was partially funded by the U.S. Army Research Office was incorrect.

The Facebook study’s researchers apparently relied on Facebook’s sweeping terms of service and data use policy -- which combine for more than 13,000 words and mention the word “research” exactly twice -- to justify including 689,003 users in the experiment.

In the study, carried out Jan. 11-18, 2012, users’ posts were classified as positive or negative based on whether they contained certain keywords. Two groups of users were then shown news feeds with a greater proportion of those positive or negative posts.
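To illustrate the mechanism the paper describes -- keyword-based labeling of posts, then suppression of one emotional category so the opposite category makes up more of the feed -- here is a minimal Python sketch. The word lists, sample posts and omission rate are invented placeholders, not Facebook’s actual classifier, word lists or feed-ranking code.

```python
import random

# Illustrative sketch only: placeholder word lists, not the ones used in the study.
POSITIVE_WORDS = {"happy", "great", "love"}
NEGATIVE_WORDS = {"sad", "angry", "terrible"}

def classify(post: str) -> str:
    """Label a post 'positive' or 'negative' if it contains any keyword."""
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress: str, omit_rate: float = 0.5):
    """Return a feed with a share of `suppress`-labeled posts omitted,
    which raises the proportion of the opposite emotion in what remains."""
    return [p for p in posts
            if classify(p) != suppress or random.random() > omit_rate]

# Hypothetical usage: suppress some negative posts for one group of users.
feed = ["Feeling happy today!", "This is terrible news.", "Lunch at noon."]
print(filter_feed(feed, suppress="negative"))
```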

One of the study’s authors acknowledged that the measurable effect was small, to the extent that affected users might post “one fewer emotional word, per thousand words, over the following week.” But the findings were still enough to prompt criticism of Facebook.

“We should care that this data is proprietary, with little access to it by the user, little knowledge of who gets to purchase, use and manipulate us with this kind of data,” Zeynep Tufekci, a sociologist at the University of North Carolina at Chapel Hill who specializes in tech issues, wrote in a post titled “Facebook and Engineering the Public.”


The research paper’s editor, Fiske, told The Times in an email that she was “concerned about this ethical issue as well” in regard to consent but said the study’s manipulations were approved on the grounds that Facebook has always been tweaking the service’s news feed.

“Facebook filters user news feeds all the time, per the user agreement,” Fiske wrote. “Thus, it fits everyday experiences for users, even if they do not often consider the nature of Facebook’s systematic interventions.”

Nathan Jurgenson, an academic and technology theorist, told The Times that even if an institutional review board did approve the Facebook research, academics citing Facebook’s terms of service to justify a study was “weird” because “Facebook’s terms of service exist to protect Facebook, and the IRB [institutional review board] exists to protect participants.”

“Facebook has probably been doing this [kind of research] forever, because Facebook doesn’t have to go to the IRB,” said Jurgenson, noting that the study’s findings weren’t as significant as the way the study was carried out.

In a statement provided to The Times, Facebook said the company does perform research to improve its service, noting that “a big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone.”

“We carefully consider what research we do and have a strong internal review process,” the statement said. But when a Times reporter asked for more information about what Facebook’s internal review process entailed, a spokesperson did not respond.


The research paper also states that “the authors declare no conflict of interest” on the Facebook project -- even though one of the paper’s authors, Kramer, works for Facebook and has stated that he was worried about people leaving the service.

Kramer, in his post Sunday, wrote that the study was carried out to “investigate the common worry that seeing friends post positive content leads to people feeling negative or left out.”

“At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook,” he added.

Which, after the study was published, is exactly what some users threatened to do.

