The One Thing Facebook Got Wrong When Manipulating Users' Emotions

This past weekend people discovered that a Facebook (NASDAQ: FB) data scientist and a couple of university researchers studied the effect of manipulating the emotional content of Facebook News Feeds.

Conspiracy theorists probably think the discovery was a big deal with wide-ranging “big brother” implications. Others, like a poster mentioned by Forbes who said, “Dude, Facebook and advertisers manipulate us all the time. NBD,” not so much.

As for Facebook’s response when asked about the 2012 study, Forbes said it seemed a bit “tone deaf.”

The company said in part, “This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible.”

That response, of course, did not address the key issue of whether it was ethical to manipulate users emotionally in the name of “Facebook science.”

The Manipulative Study

The study, titled “Experimental evidence of massive-scale emotional contagion through social networks,” was serious science; it was published in the Proceedings of the National Academy of Sciences.

In the abstract researchers said, “We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.”

In other words, for the study researchers manipulated subjects’ News Feeds to contain either more good news or more bad news and then tried to determine whether that affected the emotions of those subjects. (Spoiler alert: It did.)

Not Exactly Informed Consent

The primary complaint about the study, voiced by many in the academic and legal communities, was a “lack of informed consent.” What consent there was came from a single line in Facebook’s “Data Use Policy” in which users agree to the use of their information for “research.”

In a blog post, University of Maryland law professor James Grimmelmann said, “Facebook didn't give users informed consent. The study harmed participants.” Grimmelmann added, “This is bad, even for Facebook.”

As might be expected, the Facebook data scientist who led the study disagreed with the law professor, posting on Facebook that the research was conducted “because we care about the emotional impact of Facebook and the people that use our product.”

At the time of this writing, Jim Probasco had no position in any mentioned securities.
