Did Facebook deliberately manipulate the news feeds of about 700,000 of its users in January 2012? It looks like it.

The news that Facebook deliberately manipulated the news feeds of about 700,000 users from January 11-18, 2012 is making headlines around the Web. The New York Times, in a June 29th article, goes so far as to call us “lab rats” for the social media giant. Why would Facebook do this? A study by Facebook and Cornell University published in Proceedings of the National Academy of Sciences looked at “emotional contagion” between users based on posts in response to more positive or more negative news feeds.

The study offers insight into user behavior on the social media site (and invalidates the generally held notion that happy posts depress us). The hue and cry being raised is not about the conclusions, but about the ethics of the study. What constitutes informed consent?

Academic researchers are well aware of the stringent Institutional Review Board (IRB) regulations implementing the Federal Policy for the Protection of Human Subjects. While Facebook notes there was an “internal review,” the term seems deliberately hazy, and there are conflicting reports about what Cornell’s IRB was asked to review. In a Washington Post article, Princeton University psychology professor Susan Fiske is quoted as saying that Cornell’s review was for research dealing with a “pre-existing dataset.” In PNAS, however, the Author Contributions note indicates that the researchers designed the study, suggesting that the dataset was not simply a pre-existing chunk of data they asked for and received. The conflicting accounts further add to the furor. (The most complete analysis of what we know, and of the most likely version of the truth, can be found on Sebastian Deterding’s Coding Conduct page.)

The question of informed consent will have academics debating this study for a long time. The study’s conclusions will get a little more publicity than most PNAS studies. Those of us who use social media will be a bit more unsettled about our privacy rights. And that’s the real reason this caught my attention.

The terms of service that we sign (Facebook assumes we read them) for a Facebook account include this: “For example, in addition to helping people see and find things that you do and share, we may use the information we receive about you for internal operations, including troubleshooting, data analysis, testing, research and service improvement.” Facebook’s response, from study co-author Adam Kramer, is quoted in an article on TechCrunch.com. Kramer casts the study’s goal as a means to provide a better service. This disingenuous framing deflects attention from the research question and, by extension, from the fact that Facebook freely admits to using our personal information to further its business goals.

Facebook is a business. Its end goal is not the same as its users’. Users are in search of friends and community; Facebook is concerned with its bottom line. Google, Yahoo, and every other business with a website are increasingly mining the information users leave behind in the course of their Internet usage. Most of us accept this as the price of a digital world and all of its benefits. But periodically, we need to stop and assess what’s happening.

Let us know what you think in the comments below. Do the old rules meet the new standards of conduct? If we can, does it mean that we should? How do we define our right to privacy today?

Kathleen Gossman, Project Manager