You must have heard about Facebook’s latest confession by now. If you haven’t (somehow), here it is in a nutshell: they conducted research on unwitting users by altering the posts that appeared in their news feeds (displaying either more positive or more negative posts) and then analysed the users’ subsequent posts to see whether this had an effect.
The media, and the world in general, have gone crazy about this. Was what they did right? We have three different opinions for you…
Ian: researcher who won’t admit his age!
I am ever so slightly biased, as I think there is far too much intrusion into people’s personal lives and the current crop of global Internet monoliths think they can ride roughshod over people’s liberties and privacy. Take Google’s Street View, where they captured and posted a picture of everyone’s house online – and, even worse, secretly captured information from people’s Wi-Fi, including emails, passwords and photos.
So, Facebook’s little social experiment is just more of the same (i.e. action without thought for the consequences, or for the fact that they may be contravening some kind of law or simply… good moral behaviour).
What would Facebook have done if someone had become so depressed that they committed suicide, or carried out a violent act as a result of being depressed? Maybe it happened – we will never know, as Facebook aren’t saying who was included in this bit of social mood-manipulation. Can you imagine if Starbucks were to run a similar exercise, putting ingredients known to affect people’s moods (positively or negatively) into a muffin? I think criminal action may well have followed.
The world has changed as a result of the Internet, and legislation and laws haven’t kept up – partly because this is a global phenomenon that spans national boundaries and laws. It’s like the Wild West, and we need some kind of global oversight with the power to hit these companies hard for contravening acceptable behaviour. Google was fined €150k in Germany for its Street View antics – an absolute pittance.
What would happen if we suddenly decided to research people’s opinions and behaviours without telling them? There are strict guidelines that we have to follow when conducting research – why should it be any different for Facebook?
Pushback is coming and I accept it will probably go too far initially but, hopefully, we will find the right regulatory balance before too long.
Bethan: 25-year-old researcher
I have been to-ing and fro-ing for a while on this one.
On one hand, I completely understand Ian’s point of view – Facebook’s experiment could have had detrimental effects that nobody knows about. I would count myself as a frequent Facebook user, and I am way too familiar with that “why is the world being so miserable” sigh that can sometimes happen after scrolling through your news feed. Yes, it can put a dampener on things. (And lead to a huge friend cull – but that’s a different story altogether).
On the other hand, I do think that Facebook users need to accept some of the responsibility here. I think that we, as a society, are so blind to the small print – the familiar “terms and conditions apply” spiel – that we just switch off. How many times have you gone to submit a form with your details to a company online, only to be interrupted by the annoying “Please confirm you have read our terms and conditions” text box? How many of those have resulted in you just ticking the stupid box and pressing ‘submit’? I couldn’t even give you an estimate, it happens so often.
Having recently sat the MRS Advanced Certificate exam (I say recently, that might be me clinging to the truth slightly) – I quite often still go back to the study materials and the website to double check things, the MRS Code of Conduct being one of them. This set of rules is really important to me – I hate how little trust the general public have in the market research industry, and I like to think that I do everything I can to build that trust and treat any research participants with respect – without them I wouldn’t have a job!
Whilst the revision materials did make me shudder slightly, a few points in particular stood out to me this time:
- “Consent to participate should be informed consent”. Yes, Facebook had consent – everyone had ticked the box next to their terms and conditions, which included “internal operations, including troubleshooting, data analysis, testing, research and service improvement.” They did not have informed consent. The word “research” is NOT enough.
- “Research should be conducted honestly, objectively, and without any unwelcome intrusion or harm to respondents.” This seems to have been completely ignored. I don’t see any honesty on Facebook’s behalf (why are we surprised?) – and the possibility of emotional harm to respondents, I’d imagine, was quite high.
- “Research participants should not be deceived in any way.” This is also a little cloudy. However, if Facebook users feel like they have been deceived, I guess it’s as simple as that.
In conclusion, I think Facebook may have done enough to cover their backs legally. That doesn’t mean they are in the right. They have harmed their image and caused trust levels to dip even further. I’ll say it again – why are we surprised?
Lucy: giving the view of a 17-year-old
Personally, I don’t think that these behavioural tests are that bad. Yes, they could be construed as an intrusion into our privacy, but Facebook didn’t hack into anyone’s profiles and change them; they just delayed messages. It seems to me that this experiment just proves society’s over-reliance on social media. Our moods shouldn’t be dictated by what we read in our Facebook news feeds; they should be influenced by the outside, ‘real’ world. If Facebook had influenced people’s moods for its own gain then I could understand the controversy; however, this is simply an experiment in behavioural psychology. And, like any experiment, it works best when the subjects don’t realise there is an experiment, because that knowledge could influence the results.
What do you think? Feel free to let us know @MustardResearch!