Sunday, June 29, 2014

Facebook's questionable psychological experiment

From The Atlantic:

But few users expect that Facebook would change their News Feed in order to manipulate their emotional state.

We now know that’s exactly what happened two years ago. For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into its service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post either especially positive or negative words themselves.

This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine “emotional contagion,” as this one did. This study is different because, while other studies have observed Facebook user data, this one set out to manipulate it.
The article goes on to discuss the lack of informed consent and debriefing that are usually considered standard operating procedure for social psychology experiments. Individuals affected by the experiment were never explicitly notified that they were being studied or that the resulting data might be published, nor were they given an option to opt out. There was a certain amount of deception - by omission if nothing else - and under such circumstances it is expected that individuals who have been deceived will be fully debriefed as to the nature of the experiment, the expected findings, and the significance of those findings (scientific, personal, etc.).

An IRB apparently signed off on it, so the authors have that to fall back on, I suppose. Social network activity is a rather tricky gray area. It is not really "public" but it is not really "private" either. I think it is understandable, nonetheless, that many Facebook users feel a bit violated right now, and with good reason. While this experiment may not quite have the "creep factor" of some field experiments from the past (see this one, for example), its publication should give us pause. Once more, we as social scientists need to ask ourselves about the limits of what is considered "fair game" for research in an era of social networking websites, and those limits need to be explicitly clarified by the appropriate umbrella organizations for our discipline (such as the APA).

In the meantime, I suppose many will be wondering how many more Facebook users have been guinea pigs in psychological experiments.

Sunday, June 22, 2014

Photos from this year's George Gerbner Conference

For those of you who might be interested, I thought I would share the photos of this year's George Gerbner Conference on Communication, Conflict, and Aggression, hosted at the University of Applied Sciences, Budapest (BKF). Our paper, Framing Effects on Attitudes Toward Torture, was well-received, and overall the conference was productive. Next on tap is submitting the manuscript for publication to a relevant peer-reviewed journal.

There were several interesting presentations. One that caught my attention dealt with violent video games. The impression I got from the presenter was that although the science demonstrating a causal link between violent video games and aggression is quite consistent at this point (although not without some debate), the science itself is of secondary importance as far as policy goes. Instead, it appears that proponents and opponents of regulating video games tend to latch onto whatever published research appears to support their particular views while ignoring the rest. That is a rather disheartening revelation, although not entirely unexpected, given the content of the student term papers I peruse every semester in my social psychology classes.