
Facebook used you like a lab rat and you probably don't care

Companies perform A/B testing -- showing minor site variants to different users to see what they like or don't like -- all the time. Twitter does it with its experimental features, and sites like ours tweak designs for a sample of users to see which ones they prefer. In January 2012, researchers at Facebook did something like that too. When people heard about it last week, however, they were outraged. Facebook, in the course of the study, messed with users' emotions without explicitly letting them know about it. But as outraged as people are right now, it likely won't make a difference in the long run.

In the span of seven days, researchers rejiggered the News Feeds of 689,000 users to surface either more positively or more negatively worded stories. The study found that users who saw the positive stories were more likely to use positive words in their own posts, and users who saw negative ones were more likely to use negative words. According to the paper published in the Proceedings of the National Academy of Sciences, the study found that "emotional states can be transferred to others via emotional contagion" and that this can happen "without direct interaction between people."


It seems like a relatively innocuous study, right? Even Adam Kramer, the study's author, wrote that its impact was fairly minimal and that its purpose was to "provide a better service." But this experiment goes beyond the pale for several reasons. For one thing, we didn't know it was happening. The American Psychological Association (APA) states in its Code of Conduct that informed consent is required for psychological research with human beings -- it needs to be offered in a "language that is reasonably understandable to that person or persons." The part of Facebook's Data Use Policy that seems to allude to this states that the company would use your information "for internal operations, including troubleshooting, data analysis, testing, research and service improvement."


According to Forbes, however, this particular language didn't even appear in the agreement until four months after the study took place. And let's face it: Most people don't read policies and terms of service before agreeing to them, and even if they did, the terms are pretty difficult to understand. Plus, that sentence is vague enough that it doesn't convey the possibility of a psychological study. It's logical to assume that the "research" mentioned here alludes to something harmless -- like making a button red instead of blue -- rather than studies that probe into the inner workings of your mind. That's not "informed consent" as the APA defines it, even if Facebook claims that it underwent a strong "internal review" process.

It's bad enough that the study occurred without Facebook users' permission. But it didn't just observe users' actions -- it intentionally meddled with their emotions. When we go on Facebook, we generally expect to catch up on our friends' lives unencumbered by any emotional sleight of hand. Sure, the advertising on Facebook is a form of emotional manipulation too, but many of us understand what we're getting into when we see an ad -- we expect to be pandered to and cajoled. We don't expect that same manipulation in our regular News Feed.


But -- and here's the part that many people don't necessarily realize -- Facebook has been messing with your News Feed anyway. Susan Fiske, a Princeton University professor who edited the study for publication, told The Atlantic that a local institutional review board had approved the methodology "on the grounds that Facebook apparently manipulates people's News Feeds all the time." And she's right -- your News Feed is filtered based on a variety of factors so that some stories float to the top while others don't. It's all part of Facebook's unique News Feed algorithm, which aims to surface the "right content to the right people at the right time" so that you don't miss out on stories that matter to you. So, for example, you'll see a best friend's wedding photos over what a distant relative said she was having for lunch, if your behavior on Facebook points it in that direction.


In a way, the algorithm makes sense. According to Facebook, there are on average 1,500 potential stories every time you visit your News Feed, and it's easy for important and relevant posts to get lost in the mix if you have to sift through it all. And from Facebook's perspective, surfacing more pertinent stories will also get you to stick around and engage more, and maybe help the company get more ad impressions in the process. The flip side, of course, is that Facebook is deciding what to show you. Most of us probably don't really care about this because we're usually unaware of it, and because it's actually beneficial at times. But sorting posts purely by whether they're positive or negative is taking it too far. It turns us from customers into lab rats. Yet we're all so used to this sort of manipulation that many of us probably never noticed.

In response to the backlash, Kramer said in his post that the company's internal review practices would incorporate some of the lessons learned from the reaction to the study. Facebook also sent us the following statement:

"This research was conducted for a single week in 2012 and none of the data used was associated with a specific person's Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people's data in connection with these research initiatives and all data is stored securely."

Facebook's mea culpa is certainly appreciated, but it still doesn't quite resolve the biggest pain point: The experiment altered our moods without our consent. Also, let's not forget that Facebook has mishandled privacy before -- one of the more famous examples is the company's Beacon program, which broadcast your online shopping habits without your knowledge. This isn't exactly a company that can afford any further damage to its reputation. The firm has certainly made strides in recent years to show it's committed to user privacy by defaulting posts to friends only and making privacy options clearer. But it only takes a mistake like this to make everyone question their allegiance to Facebook again.


Or will it? The fact is that even with this controversial study revealed, most people will continue to use Facebook. The company keeps growing -- it went from a million users in 2004 to almost 1.2 billion in 2013 -- despite its multiple privacy faux pas over the years. The social network has commanded such a loyal and dedicated following that none of these breaches of public trust has seriously damaged it. Most people just don't seem to care that their feeds are being manipulated, with or without their consent, as long as they still get to play Candy Crush Saga and see photos of their grandkids. After all, if you really cared about controlling your privacy, you'd look into getting off the internet entirely.

[Image credit: Bloomberg via Getty Images, AFP/Getty Images]