Social network's emotional lab rats turn against their keepers

Facebook is probably glad it hasn't introduced a dislike button yet

This week, people across the world were outraged by Facebook's latest timeline test, an experiment that saw it populating users' feeds with overly negative or positive posts to see how this affected their mood.

Rather unsurprisingly, it turned out that having an algorithm throw endless status updates about bereavement, sick children and car insurance meltdowns into our faces made us sadder. Meanwhile, populating our timelines with only Good News and pretty sunset photos made us more likely to post cheery messages ourselves.

This sort of experimenting with our minds without explicit consent was instantly declared disgraceful by observers. Facebook's chief operating officer, Sheryl Sandberg, addressed the company's actions and intentions in a pretty terrible and vague half-apology, saying the tests were "poorly communicated".

That was one of the mildest ways it was put.

The only crime is getting caught

Beneath a Wall Street Journal article about the history of Facebook's controversial Data Science team, reader Matthew Ferrara highlighted the fact that the majority of Facebook's users are unlikely to read technology sites on a daily basis, so are unlikely to know - or even really care - that they're being experimented on.


Mark's been zucking with your Facebook feed

He said: "Even if a few million people were upset enough to close their accounts Facebook is hardly concerned about negative perception in the press because clearly almost nothing they can do causes a revolt amongst their user base. Lots of reasons for this, possibly such as people willing to be experimented on in exchange for something free, a more 'privacy is irrelevant' social belief, or a general malaise on these issues in general."

William Glasheen isn't surprised, though, suggesting everyone's up to this sort of thing all the time, posting: "I am absolutely appalled that people complaining about this are so naive. You think this is unique to Facebook? Really? Any company that wants to understand and serve its customers better is doing experimentation. My biggest concern isn't that Facebook is doing it, but rather that people are shocked that gambling is going on in the casino."

I am so very sort-of sorry

On the Telegraph, beneath a piece about Sheryl Sandberg's sort-of apology, reader BlokefromKent wasn't convinced she really meant it, commenting: "Well now; she hasn't apologised for the experiment at all, but for the poor communications surrounding it."

Mr Kent continued: "A full apology would have consequences and implications for those who authorised and who took part in designing the experiment. Ethically speaking, Facebook and the universities should delete the data and take no academic or commercial advantage of the information it produced, as no subject gave proper informed consent."


Sheryl Sandberg says sorrynotsorry

Paul_Basel was equally unimpressed by the amount of wriggle room that exists inside the Facebook terms and conditions, rules that have been amended countless times since half of us signed up without reading them back in 2006, saying: "...you make it all sound like a terrible mistake. But you changed the company's terms and conditions so that users had consented to this by accepting them. You make it sound like a ghastly mistake but your creepy company went to all the trouble to put the building blocks in place so they could point to the terms and conditions and say the users had consented, that shows how premeditated this was."

It's a bit like [thing it's not really like]

News site CNN went for the ethical jugular, asking whether Facebook's emotional toying went too far. This allowed the commenters to go largely berserk, although reader NCGH was sort of on the side of the blue social whale, or at least on the side of those who thought its tests were no big deal, saying: "Marketing companies do tests all the time. Show different internet ads to different people, gage response. Try tweaking the presentation, test again. I'm not sure this is really any different. (And virtually ALL of our political parties do the same thing)."

He or she also made a great point about how we actually do this sort of thing all the time in weird little ways ourselves, continuing: "People on dating sites change their profile and look at what the results are. They're doing an 'experiment' too."

A few mouse wheels down we find reader Bobby Leon, one of the full-on conspiracy theorists, who announces the, ahem, real reason behind it all with: "Sounds like a way the elite can detect if an overthrow is eminent. Gauge the pulse of the masses, and redirect them to a distractionary outrage."

So a funny cat photo going viral might distract the population from military outrages overseas.

Entirely thoughtless

On the Daily Mail, reader Ceeare believes it's the end of everything, or at least the end of this particular social network, saying: "Not the start of the thought police? That's a joke. That is exactly what this is. Facebook started out as a place where like-minded folks could socialise. Now it's a place of experiments, manipulated page feeds and forced to pay if you want to 'boost' your page. It's a joke and it's time a new place starts up."

Last word this week goes to commenter DM Hype Watch who, as you might suspect, reckons the paper has its own subtle agenda to control our opinions, posting: "Tweaking the facts to one's own advantage? You'd know all about that now wouldn't you DM?"