Facebook Researcher Defends Controversial Psychology Experiment on Users
Published Jun 30, 2014
A Facebook researcher behind a controversial psychology experiment on users has defended it, saying the research aimed to investigate a common concern: that seeing friends post positive content on the social networking site leads people to feel negative or left out.
The researchers were also concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. "We didn't clearly state our motivations in the paper," researcher Adam D.I. Kramer wrote on his Facebook page Sunday.
"The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product," Kramer wrote.
After a storm of protest on the Internet over the experiment, which doctored the content in the news feeds of close to 700,000 users to manipulate their emotions, Kramer appeared to be having second thoughts. "In hindsight, the research benefits of the paper may not have justified all of this anxiety," he wrote.
The research by Kramer and two others, published in the Proceedings of the National Academy of Sciences of the United States of America, described the use of an algorithm to manipulate content in the News Feed of 689,003 users in two parallel experiments. In one, exposure to friends' positive emotional content in the News Feed was reduced; in the other, exposure to negative emotional content was reduced.
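To make the two-condition design concrete, here is a minimal sketch of how such a feed filter could work. It is an illustration only: the function names, the toy word-list classifier, and the omission probability are assumptions for the sketch, not Facebook's actual implementation, which the article does not describe.

import random

POSITIVE, NEGATIVE, NEUTRAL = "positive", "negative", "neutral"

def classify_sentiment(post_text):
    # Stand-in for a word-list-based sentiment classifier; a real system
    # would use a curated lexicon rather than these toy marker words.
    text = post_text.lower()
    if any(word in text for word in ("great", "happy", "love")):
        return POSITIVE
    if any(word in text for word in ("sad", "awful", "hate")):
        return NEGATIVE
    return NEUTRAL

def filter_feed(posts, reduced_polarity, omit_probability):
    # Return one load of the feed with posts of the targeted emotional
    # polarity probabilistically omitted. Omission applies to this load
    # only; the posts themselves are untouched and may appear later.
    kept = []
    for post in posts:
        if (classify_sentiment(post) == reduced_polarity
                and random.random() < omit_probability):
            continue
        kept.append(post)
    return kept

# One experiment reduces positive content; the parallel one, negative content.
feed = ["Having a great day!", "Feeling sad today.", "Meeting moved to 3pm."]
print(filter_feed(feed, POSITIVE, omit_probability=0.5))
print(filter_feed(feed, NEGATIVE, omit_probability=0.5))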
The aim of the experiment was to find out whether exposure to emotions led people to change their own posting behavior, and in which direction.
The research concluded that emotional states "can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness," even in the absence of in-person interaction and nonverbal cues, and that observing others' positive experiences constitutes a positive experience for people.
The response to news of the experiment was, however, overwhelmingly negative. Even when Facebook manipulates news feeds to sell things to users, it is supposed, legally and ethically, to meet certain minimal standards, wrote James Grimmelmann, professor of law at the University of Maryland, in a blog post. Ads, for example, are labeled as such, even if at times not clearly, he said. "This study failed even that test, and for a particularly unappealing research goal: We wanted to see if we could make you feel bad without you noticing. We succeeded," Grimmelmann added.
Kramer wrote in his post that the experiment, conducted in early 2012, amounted to "minimally deprioritizing" a small percentage of content in News Feed for about 0.04 percent of users, or 1 in 2500, for the short period of one week; given Facebook's vast user base, even that fraction represents a large number of people. Nobody's posts were "hidden"; they simply did not show up on some loads of the Feed. "Those posts were always visible on friends' timelines, and could have shown up on subsequent News Feed loads," he added.
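As a quick check on the arithmetic, the two ways Kramer states the sample fraction do agree:

# "1 in 2500" expressed as a percentage matches "about 0.04 percent".
fraction = 1 / 2500
print(f"{fraction:.2%}")  # prints 0.04%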
Kramer did not respond to criticism that the social networking company had not obtained its users' permission for the experiment. However, in the paper, the authors had noted that the experiment procedure "was consistent with Facebook's Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research."
Facebook will factor what it has learned from the reaction to the paper into its internal review practices for such experiments, which it is continually working to improve, Kramer said. "The experiment in question was run in early 2012, and we have come a long way since then," he added.