
Facebook may have experimented with controlling your emotions without telling you

By David Ferguson
Saturday, June 28, 2014 11:32 EDT
Researchers at Facebook ran an experiment on nearly 700,000 randomly chosen users to see if their overall emotional state could be manipulated by the types of posts they see in their feeds.

ThinkProgress reported on a scientific paper published on Friday in the Proceedings of the National Academy of Sciences (PNAS), entitled “Experimental evidence of massive-scale emotional contagion through social networks,” in which the researchers concluded that it is not just possible but very easy to drive emotional responses in users by controlling the content that they see.

Users of the social media platform consented to be experimented on by developers and researchers when they agreed to the Terms and Conditions required to open a Facebook account.

The experiment took place from January 11 to 18, 2012, and involved 689,003 English-speaking users. When researchers showed users more posts that an algorithm had ranked as emotionally positive, those users began to produce more positive and optimistic posts of their own.
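The article does not describe the feed mechanism in any detail. Purely as an illustration, and not a reconstruction of Facebook's actual code, a feed could be skewed toward one emotional valence by randomly withholding posts of the opposite valence. The sketch below assumes each candidate post already carries a positive/negative/neutral label from a rating algorithm (a toy version of such a scorer appears later in this article); the function and field names here are invented for the example.

```python
# Illustrative sketch only -- not Facebook's code. Assumes each post dict
# already carries a "valence" label ("positive", "negative", or "neutral")
# assigned by a rating algorithm like the toy scorer sketched further down.
import random

def skew_feed(posts, favored_valence="positive", omit_prob=0.5, seed=None):
    """Return the posts a user would see, skewed toward one emotional valence
    by randomly withholding posts of the opposite valence."""
    rng = random.Random(seed)
    opposite = "negative" if favored_valence == "positive" else "positive"
    shown = []
    for post in posts:
        if post["valence"] == opposite and rng.random() < omit_prob:
            continue  # withhold some posts of the opposite valence
        shown.append(post)
    return shown
```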

According to the study abstract, “We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.”

Furthermore, “We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues,” meaning that you don’t even need to talk to your friend to be affected by their online emotional state.

None of the users who were part of the experiment have been notified. Anyone who uses the platform consents to be part of these types of studies when they check “yes” on the Data Use Policy that is necessary to use the service.

Facebook users consent to have their private information used “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.” The company said no one’s privacy has been violated because researchers were never exposed to the content of the messages, which were rated in terms of positivity and negativity by a software algorithm programmed to read word choices and tone.
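The article does not name the rating software, so the snippet below is only a toy illustration of the idea it describes: scoring a post's emotional tone from its word choices alone, so that no human ever reads the message. The word lists and the rate_post function are invented for this example; the real system would rely on a far larger dictionary of emotion words.

```python
# Toy illustration of rating a post's emotional tone from word choices alone,
# so no human ever reads the message. The word lists here are made up for the
# example; a real system would use a much larger word-count dictionary.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited", "awesome"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "angry", "miserable"}

def rate_post(text):
    """Return ("positive" | "negative" | "neutral", score) for one post,
    where score is (positive hits - negative hits) / word count."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    if not words:
        return "neutral", 0.0
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    score = (pos - neg) / len(words)
    if score > 0:
        return "positive", score
    if score < 0:
        return "negative", score
    return "neutral", 0.0

print(rate_post("So happy and excited about the weekend!"))    # ('positive', ...)
print(rate_post("Feeling sad and miserable about this news."))  # ('negative', ...)
```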

“As such,” said researchers, “it was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.”

Privacy advocates have long warned that Facebook users give up an inordinate amount of their personal data and sensitive information to the service without realizing it because of the shifting parameters of its privacy agreements.

A Consumer Reports survey found that only 37 percent of users have even bothered to check their own privacy settings, let alone customize them to “Friends Only” or other settings.

“There’s a burden on the individual to get educated, but there’s also a burden on the companies,” said Dr. Pamela Rutledge, director of the Media Psychology Research Center, to ThinkProgress. “We’re not all lawyers, we’re not all IT guys.”

[Image: Happy woman at computer via Shutterstock]

David Ferguson
David Ferguson is an editor at Raw Story. He was previously a writer and radio producer in Athens, Georgia, hosting two shows for Georgia Public Broadcasting and blogging at Firedoglake.com and elsewhere. He is currently working on a book.