The One Thing Facebook Got Wrong When Manipulating Users' Emotions
Conspiracy theorists probably consider the discovery a big deal with wide-ranging “big brother” implications. Others, like a commenter quoted by Forbes who said, “Dude, Facebook and advertisers manipulate us all the time. NBD,” not so much.
As for Facebook’s response when asked about the 2012 study, Forbes said it seemed a bit “tone deaf.”
The company said in part, “This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible.”
That response, of course, did not address the key issue of whether it was ethical to manipulate users emotionally in the name of “Facebook science.”
The Manipulative Study
The study, titled “Experimental evidence of massive-scale emotional contagion through social networks,” was serious science; it was published in the Proceedings of the National Academy of Sciences.
In the abstract researchers said, “We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.”
In other words, for the study researchers manipulated subjects’ News Feeds to contain either more good news or more bad news and then tried to determine whether that affected the emotions of those subjects. (Spoiler alert: It did.)
Not Exactly Informed Consent
The primary complaint about the study was what many in the academic and legal communities felt was a “lack of informed consent.” What consent there was came from a single line in Facebook’s “Data Use Policy” in which users agree to the use of their information for “research.”
In a blog post, University of Maryland law professor James Grimmelmann said, "Facebook didn't give users informed consent. The study harmed participants." Grimmelmann added, "This is bad, even for Facebook."
As might be expected, the Facebook data scientist who led the study disagreed with the law professor, posting on Facebook that the research was conducted, "because we care about the emotional impact of Facebook and the people that use our product."
At the time of this writing, Jim Probasco had no position in any mentioned securities.
© 2017 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.