Monday, June 30, 2014

Facebook Researchers Manipulated News Feeds in 2012

A Facebook Inc. (FB) researcher apologized after conducting an experiment that temporarily influenced what almost 700,000 users saw on their news feeds, reviving concerns about the company’s handling of user privacy.

The number of positive and negative posts that users saw in the stream of articles and photos on their news feeds was altered in January 2012, according to a study published June 17 in the Proceedings of the National Academy of Sciences. In the trial of randomly selected Facebook users, people shown fewer positive words went on to write more negative posts, while the reverse happened with those exposed to fewer negative terms.

Adam Kramer, a Facebook data scientist who was among the study’s authors, wrote on his Facebook page yesterday that the team was “very sorry for the way the paper described the research and any anxiety it caused.”

The data showed that online messages influence readers’ “experience of emotions,” which may affect offline behavior, the researchers said. Some Facebook users turned to Twitter to express outrage over the research as a breach of their privacy.

“Facebook knows it can push its users’ limits, invade their privacy, use their information and get away with it,” said James Grimmelmann, a professor of technology and the law at the University of Maryland. “Facebook has done so many things over the years that scared and freaked out people.”

Even so, the anger won’t have a long-lasting effect, Grimmelmann said. While some users may threaten to leave Facebook, most people “want to be where their friends are,” and no alternative to the social networking site provides more privacy, he said.
Face to Face

In the study, the researchers, from Facebook and Cornell University, wanted to see if emotions could spread among people without face-to-face contact.

The Facebook study is “really important research” that shows the value of receiving positive news and how it improves social connections, said James Pennebaker, a psychology professor at the University of Texas. Facebook might have avoided some of the resulting controversy by allowing users to opt out of taking part in any research, he said.

“It will make people a little bit nervous for a couple of days,” he said in an interview. “The fact is, Google knows everything about us, Amazon knows a huge amount about us. It’s stunning how much all of these big companies know. If one is paranoid, it creeps them out.”

Facebook said none of the data in the study was associated with a specific person’s account. Research is intended to make content relevant and engaging, and part of that is understanding how people respond to various content, the Menlo Park, California-based company said in a statement yesterday.
Internal Review

“We carefully consider what research we do and have a strong internal review process,” Facebook said. “There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”

Facebook’s Kramer conducted the study with Jeffrey Hancock, a Cornell professor in the communications and information science departments, and Jamie Guillory, also a researcher at Cornell.

“We were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook,” Kramer wrote. “We didn’t clearly state our motivations in the paper.”

“Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone,” he continued.

Susan Fiske, a psychology professor at Princeton University, edited the study for PNAS. She contacted the authors and was told the study had passed Cornell’s ethics review for human-subjects research. The data had already been collected when the Cornell researchers became involved.

“From that point of view, this is an issue about Facebook, not about research ethics,” she said in an interview. “My own decision was not to second-guess the Cornell” review board.

“People are relating to Facebook as if it has betrayed their trust,” she said. “The level of reaction is understandable. That doesn’t mean what Facebook or Cornell did is unethical.”

To contact the reporter on this story: Mary Schlangenstein in Dallas at maryc.s@bloomberg.net

To contact the editors responsible for this story: Kevin Miller at kmiller@bloomberg.net; Bruce Rule

-bloomberg
