Facebook Manipulated Users’ Feeds for a Psychology Experiment

William Hughes, writing for The A.V. Club:

Scientists at Facebook have published a paper showing that they manipulated the content seen by more than 600,000 users in an attempt to determine whether this would affect their emotional state. The paper, “Experimental evidence of massive-scale emotional contagion through social networks,” was published in The Proceedings Of The National Academy Of Sciences. It shows how Facebook data scientists tweaked the algorithm that determines which posts appear on users’ news feeds — specifically, researchers skewed the number of positive or negative terms seen by randomly selected users. Facebook then analyzed the future postings of those users over the course of a week to see if people responded with increased positivity or negativity of their own, thus answering the question of whether emotional states can be transmitted across a social network. Result: They can! Which is great news for Facebook data scientists hoping to prove a point about modern psychology. It’s less great for the people having their emotions secretly manipulated.

This is hugely controversial, but I’m only surprised that anyone is surprised. Yes, this is creepy as hell, and indicates a complete and utter lack of respect for their users’ privacy or the integrity of their feed content. Guess what: that’s Facebook.

“Fool me once, shame on you; fool me twice, shame on me,” the saying goes. Fool me two dozen times — there’s no adage for that.

Saturday, 28 June 2014