But few users expect that Facebook would change their News Feed in order to manipulate their emotional state. We now know that's exactly what happened two years ago. For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into its service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post either especially positive or negative words themselves.
This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine "emotional contagion," as this one did. This study is different because, while other studies have observed Facebook user data, this one set out to manipulate it.
The article goes on to discuss the lack of informed consent and debriefing that are usually considered standard operating procedure in social psychology experiments. Individuals affected by the experiment were never explicitly notified that they were being studied or that the data might be published, nor were they given an option to opt out. There was a certain amount of deception - by omission, if nothing else - and under such circumstances it is expected that the individuals who were deceived will be fully debriefed about the nature of the experiment, its expected findings, and the significance of those findings (scientific, personal, etc.).
An IRB apparently signed off on it, so the authors have that to fall back on, I suppose. Social network activity is a tricky gray area: it is not really "public," but it is not really "private" either. Nonetheless, I think it is understandable that many Facebook users feel a bit violated right now, and with good reason. While this experiment may not quite have the "creep factor" of some field experiments from the past (see this one, for example), its publication should give us pause. Once again, as a community of social scientists, we need to ask ourselves about the limits of what is considered "fair game" for research in an era of social networking websites, and those limits need to be explicitly clarified by the appropriate umbrella organizations for our discipline (such as the APA).
In the meantime, I suppose many will be wondering how many more Facebook users have been guinea pigs in psychological experiments.