Facebook tinkered with users’ feeds for a massive psychology experiment


Scientists at Facebook have published a paper showing that they manipulated the content seen by more than 600,000 users in an attempt to determine whether this would affect their emotional state.

Everyone here is likely (I would surely hope) aware of the multitude of privacy issues that come with using Facebook, but this is a new low: actively attempting to manipulate users’ emotions (and censoring content in the process).

There are so many ethical issues with this study that I’m having trouble deciding where to start. It’s clear this hasn’t gone through a proper ethics review board, nor was there any effort to obtain informed consent from the unknowing participants. A lot of academics are calling for PNAS to retract this paper for gross academic misconduct. If it isn’t retracted, I expect PNAS will be the subject of a wide-ranging academic boycott. This sort of shit just isn’t on.

A few things worth noting here, BTW…

  1. Facebook justified this by saying that users agreed to research when they signed up to Facebook, because it’s there in the Data Use Policy.
  2. Research going back a decade has shown that the vast majority of consumers don’t ever read privacy policies, and that they can’t understand them even if they do read them.
  3. While Facebook’s policies are more readable than those of most social networks, they’re still above the reading level of the average adult. (See the sketch after this list for how such reading-level scores are typically computed.)
  4. While Facebook’s policies are more readable than they were in the past, a significant portion of these users likely signed up long before the current versions were put in place, so they may have agreed to a version that’s even harder to understand.
  5. With how closely Facebook tracks everyone, it should actually be possible for them to identify the specific individuals who have clicked through to the Data Use Policy. They could even make a reasonable guess as to which people actually read the whole thing as opposed to skimming it, and who is likely to have fully understood the damn thing they read.
  6. If Facebook had identified specific individuals who they knew had likely read and understood the whole thing, I would have expected them to declare this in the paper, if only to cover their own asses when it went through peer review.
  7. If Facebook did identify specific users who they knew had read it and understood it, then it’s junk science anyway. This would bias their sample badly towards some very specific demographics.
  8. Of course, given that they had 689,003 unwitting participants, it’s highly unlikely that they did restrict it to just those who read and understood the policy anyway.
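
For the curious: reading-level claims like the one in point 3 are usually based on formulas such as the Flesch-Kincaid grade level. Here’s a minimal Python sketch. It uses a crude vowel-group heuristic for syllables (real tools use pronunciation dictionaries) and a made-up policy-style sentence, so treat the output as ballpark only:

```python
import re

def count_syllables(word: str) -> int:
    """Rough syllable estimate: count runs of consecutive vowels.
    Real readability tools use pronunciation dictionaries."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    """Flesch-Kincaid grade level:
    0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59"""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / max(1, len(sentences))
            + 11.8 * syllables / max(1, len(words))
            - 15.59)

# A made-up sentence in the style of a data-use policy, NOT a quote
# from Facebook's actual policy.
policy = ("We may use the information we receive about you in connection "
          "with the services and features we provide to you and other users.")
print(round(flesch_kincaid_grade(policy), 1))  # roughly grade 13-14
```

The average US adult reads at roughly an 8th-grade level, so a policy that scores in the teens on a formula like this backs up the point above.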

EDIT: And one other thing…
James Grimmelmann, a privacy law academic from the US, has had a lot to say on his Twitter account about how this was handled by Facebook. Well worth following along. His account is @grimmelm
