700,000 people were involved in Facebook research during 2012. They knew nothing about it.
There was a furore a couple of weeks ago when Facebook revealed the results of an experiment they had conducted on Facebook users in 2012. They were undertaking research on emotional contagion in online and social settings. Paper here. We already know emotions are contagious: when you meet someone who is happy or sad, you can't help but be affected by their mood. Facebook wanted to know if the same thing happened through interaction in online social networks. To test this, they altered the number of positive and negative words in posts people saw in their News Feed and tracked how this affected those users' subsequent posts and updates.
Did it impact their mood?
Facebook found that the more positive posts people saw, the more likely they were to share positive updates of their own. Likewise, users exposed to negative posts were more likely to share less positive updates. Facebook contends these findings show that emotions in social networks are contagious too.
So what's the problem?
- 700,000 people's news feeds were involved in this research during 2012, and those people knew nothing about it. Facebook says that when you sign up as a user, you agree to its data use policy and terms of service, and that these cover such experiments. But how many users realised they were signing up to this? Can you really call this informed consent? At the very least, hiding those permissions behind tick boxes in a sign-up form that Facebook knows people do not read could be construed as dark pattern design.
- And what about privacy? Altering people's news feeds and tracking their subsequent posts surely breaks all privacy rules. Not, according to Facebook, if you don't use humans to track or read the data: during this experiment Facebook used language analysis and automated tools to count positive and negative words and terms.
- It is because of those automated tools that many think this research was flawed. The contention is that algorithms are incapable of correctly interpreting human emotions, and that emotion is more than just words: interpretation often depends on context or tone. For example, how would an algorithm interpret "I am not having a great day"? As positive, because of 'great' and 'day'? It is unclear.
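To see why simple word counting falls short, here is a minimal sketch of a bag-of-words sentiment scorer. The word lists are invented for illustration and this is not Facebook's actual tooling; it simply shows how negation slips past a plain counter.

```python
# Toy bag-of-words sentiment scorer -- illustrative only, not Facebook's real method.
POSITIVE = {"great", "happy", "good", "love", "wonderful"}
NEGATIVE = {"sad", "bad", "terrible", "awful", "hate"}

def classify(post: str) -> str:
    """Label a post by counting positive vs negative words."""
    words = post.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

# "not" is ignored, so the negated sentence still scores as positive.
print(classify("I am not having a great day"))  # -> positive
```

A counter like this sees 'great' and stops there; the 'not' that flips the meaning never enters the score, which is exactly the critics' point.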
Right or Wrong?
Facebook did later apologise for any anxiety the experiment caused. Adam Kramer, the lead researcher, posted through Facebook: “I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused,” he said. “In hindsight, the research benefits of the paper may not have justified all of this anxiety.”
Facebook say they are sorry, which is not at all the case for OkCupid…
an American dating site, which just this week revealed it has been running a series of experiments on unsuspecting users of its website. Co-founder Christian Rudder published the results in a blog post entitled ‘We Experiment On Human Beings!’
These experiments involved OkCupid altering the data in users’ profiles to see what impact it had on engagement between users.
“Love is Blind or should be”
The first experiment found that temporarily removing the photo from someone’s profile made for better and, according to Rudder, deeper conversations between users. However, when the photos were restored, many of these conversations stopped.
“Your actual words are worth… almost nothing”
OkCupid next ran an A/B test in which some profiles had their text hidden (leaving only the photo) while others kept both text and photo. The results showed that hiding the text had little impact on traffic between users. Photos are what matter.
“when we tell people they are a good match, they act as if they are. Even when they should be wrong for each other.”
In another experiment, OkCupid set up matches between pairs of users who were not, according to OkCupid’s own formulas, compatible. But it told these users they were “exceptionally good” for each other, or 90% matches. Naturally, users then sent more messages to their paired match. Interestingly, even though these matches should not have got on particularly well, they did. The company did reveal the correct scores to users afterwards.
Are we all just Lab rats?
Of course these experiments have sparked more outrage… but was this the point?
On one hand I applaud this research; the tone is loud and proud, and if the motivation is to improve a service through testing and research, good on them. On the other hand, Rudder is deliberately inflammatory in his tone and methodology (especially on the back of the Facebook research) and looks to be courting publicity, which, for me, dilutes the research effort.
Also, the acid test here is: did these experiments actually prove anything? Follow-up research on the ‘love is blind’ experiment revealed that “women had a good time more or less regardless of how good-looking their partner was”. So what? Were there more hook-ups or budding relationships because of this intervention?
“If you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site,” Christian Rudder
And finally we are back to the thorny issue of informed consent. Should online companies be allowed to do this? What impact does this have on the brand and how customers trust them? And is it even important? This is at the very least a moral question and probably very soon a legal one too.