The researchers, led by data scientist Adam Kramer, found that emotions were contagious. "When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred," according to the paper published by the Facebook research team in the Proceedings of the National Academy of Sciences (PNAS). "These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks."
The researchers -- who may not have been thinking about the optics of a "Facebook emotionally manipulates users" study -- jauntily note that their findings undermine the common claim that seeing our friends' good lives on Facebook makes us feel depressed. "The fact that people were more emotionally positive in response to positive emotion updates from their friends stands in contrast to theories that suggest viewing positive posts by friends on Facebook may somehow affect us negatively," they write.
They also note that when they took all of the emotional posts out of a person's News Feed, that person became "less expressive," i.e., wrote fewer status updates. So prepare to have Facebook curate your feed with the most emotional of your friends' posts if it decides you're not posting often enough.
In its initial response to the controversy around the study -- a statement sent to me late Saturday night -- Facebook doesn't seem to grasp what people are upset about, focusing on privacy and data use rather than on the ethics of emotional manipulation and whether Facebook's terms of service live up to the definition of "informed consent" usually required for academic studies like this. "This research was conducted for a single week in 2012 and none of the data used was associated with a specific person's Facebook account," says a Facebook spokesperson. "We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people's data in connection with these research initiatives and all data is stored securely."
Ideally, Facebook would have a consent process for willing study participants: a box to check somewhere saying you're okay with being subjected to the occasional random psychological experiment that Facebook's data team cooks up in the name of science -- as opposed to the commonplace psychological manipulation cooked up by advertisers trying to sell you stuff.