Are we using Facebook or is Facebook using us?

Issue: The social network is again guilty of a serious ethical lapse

In the annals of controversial psychological experiments, the most notorious, such as the Milgram experiment and the Stanford prison experiment, manipulated subjects into violent behaviour.

But now we have a new case study in ethically questionable research with the Facebook feed experiment, which caused a massive furore this week.

In a study involving academics from Cornell and the University of California, Facebook data scientist Adam Kramer authored a paper in the June issue of the Proceedings of the National Academy of Sciences detailing how, in 2012, Facebook manipulated the information posted on 689,003 users' home pages, filtering some news feeds for positive messages and others for negative posts, to observe how the different feeds affected users' moods, as reflected in their own behaviour on Facebook.

The conclusion was far-reaching: “Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks.”


Deeply unsettling

As deeply unsettling as the blunt use of the phrase “massive-scale emotional contagion” might seem, we must acknowledge that the Facebook experiment didn’t manipulate subjects into acts of violence. But at least the subjects in those earlier experiments knew they were involved in research in the first place, even if Milgram in particular misled them as to what was going on.

In Facebook’s case, the nearly 700,000 “subjects” were under the impression they were using the social network as usual, and were none the wiser that their news feeds were being manipulated in such a fashion. That’s an ethical minefield if ever there was one.

Central to Facebook’s defence is the claim that agreeing to the social network’s terms of service was equivalent to offering consent.

Quite apart from the absurd suggestion that anybody actually reads the terms of service of any piece of software or online service, Forbes reported that, at the time of the emotional response experiment, research wasn’t even included in the terms of service. The key proviso allowing for such activity, permitting the use of data “for internal operations, including troubleshooting, data analysis, testing, research and service improvement”, was added four months after the research was conducted.

Mealy-mouthed apology

Facebook also surpassed itself in the mealy-mouthed apology stakes – over the years, the firm has become adept at shrugging off various user-infuriating changes with hollow words of contrition, but this time chief operating officer Sheryl Sandberg set a new bar: “This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated. And for that communication we apologise.”

Just so we're clear, it was the poor communication that Facebook feels bad about, not the experiment itself. And since that apology, the Wall Street Journal now reports that the study was just one of hundreds of tests run by a highly active group of data scientists at the company.

Inevitably, the firm’s behaviour has attracted the critical focus of regulators, with Ireland’s Data Protection Commissioner Billy Hawkes, who has responsibility for overseeing the firm’s European operations, once again forced to chase after the serial offender.

But what is it about this experiment that has so angered users? Partly it's the discovery that the firm has a rather insidious power to control our moods.

But even more, the anger stems from the fact that this episode leaves no room for doubt: while we might nominally be Facebook's users, it is Facebook that is using us.