Facebook manipulated 690,000 news feeds as part of a psychology experiment
Jul 7th, 2014

Social network, advertising juggernaut, science lab
Facebook has been called many things over the years, some positive, some negative, and, fittingly, ‘positive or negative’ is exactly how the world’s largest social network may have targeted your mood in its week-long study of nearly 690,000 English-language news feeds.
Indeed, when it comes to Facebook, the latest trending topic is the news that the social network conducted an ‘emotion-manipulation study’ to see how the posts shown on your news feed could affect your mood.
In the paper, titled ‘Experimental evidence of massive-scale emotional contagion through social networks’, authors Adam D. I. Kramer, Jamie E. Guillory and Jeffrey T. Hancock set out to discover, first, whether the information we receive through our news feeds can affect our mood and, second, whether those moods can be passed on without physical interaction.
When the study took place back in January 2012, the researchers altered the news feeds of 689,003 users to show a disproportionate number of either positive or negative status updates. Using keyword matching and algorithms, they selected the statuses of friends and affiliates that contained positive or negative words and pushed them onto users’ news feeds to see how each user’s mood was reflected in their next updates.
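The published paper reportedly relied on the LIWC2007 word lists to decide whether a status update counted as positive or negative. As a rough, hypothetical illustration of that keyword approach (the word lists below are illustrative stand-ins, not the actual LIWC dictionaries), a classifier might look something like this:

```python
# Minimal sketch of keyword-based status classification, assuming a post counts
# as positive or negative if it contains at least one word from the relevant list.
# These word lists are hypothetical stand-ins for the LIWC2007 dictionaries.

POSITIVE_WORDS = {"happy", "love", "great", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "awful", "terrible", "lonely"}

def classify_status(text: str) -> str:
    """Label a status update by the emotion keywords it contains."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

# A feed algorithm could then over-represent one class of post:
statuses = ["Feeling great about today!", "This week has been awful.", "Off to work."]
positive_feed = [s for s in statuses if classify_status(s) == "positive"]
print(positive_feed)  # ['Feeling great about today!']
```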
Their findings showed a positive correlation between participants’ subsequent status updates and the emotion towards which they had been ‘pushed’.
The idea was big, bold and, in the eyes of many, unethical.

Many disgruntled users have taken to various social media platforms since the publication of the results to have their say, and it seems as though the legality of not requesting consent is the biggest talking point.
The truth, however, is that on the question of legality the answer is clear: Facebook did no wrong. All new account holders are required to accept the Data Use Policy upon joining the site, and that document states that users agree to have their data accessed ‘for internal operations, including troubleshooting, data analysis, testing, research and service improvement.’
As highlighted on other social networks such as Twitter, users questioned whether it was right not to offer the choice to opt in to, or out of, the study.
One of the authors has, however, rejected the idea that the study went beyond ethics and the legal scope of the Data Use Policy, but offered an apology for a lack of transparency, admitting that at times they got things wrong.
In a public statement about the publication of the experiment’s findings, co-author Kramer, a member of Facebook’s Core Data Science Team, responded to the critics in an apologetic Facebook post, saying: “The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product.”
“We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.
“We didn’t clearly state our motivations in the paper.”
Where do you stand on the legal/ethical questions that have been raised by the study? Let us know in the comments box below.