
Furor Erupts Over Facebook Experiment on Users - WSJ


Stashed in: Facebook!, Privacy does not exist., Awesome, @ajs, You are the product., Advertising


To determine whether it could alter the emotional state of its users and prompt them to post either more positive or negative content, the site's data scientists enabled an algorithm, for one week, to automatically omit content that contained words associated with either positive or negative emotions from the central news feeds of 689,003 users.

"What many of us feared is already a reality: Facebook is using us as lab rats, and not just to figure out which ads we'll respond to but actually change our emotions," wrote Animalnewyork.com, a blog post that drew attention to the study Friday morning.The research, published in the March issue of the Proceedings of the National Academy of Sciences, sparked a different emotion—outrage—among some people who say Facebook toyed with its users emotions and uses members as guinea pigs.

Facebook has long run social experiments. Its Data Science Team is tasked with turning the reams of information created by the more than 800 million people who log on every day into usable scientific research.

I'm amazed that there are people who were surprised that Facebook does experiments on its users.

That is EXACTLY how I feel.  It's the advertising, stupid.

We're All Facebook's Lab Rats

How shocking: Facebook had the temerity to conduct an experiment on its users without telling them, and now the results have been published in the Proceedings of the U.S. National Academy of Sciences. Actually, no one should be surprised.

For a week in 2012, the social network's staff scientist Adam Kramer and two collaborators used algorithms to doctor the news feeds of 689,003 English-speaking Facebook users. They reduced the number of posts containing "positive" and "negative" words, tracked their lab rat users' own posts, and found that their mood was influenced by that of the news feed. The term, well-known to psychologists studying real-world communications, is "emotional contagion."

[...]

...on Facebook, one can opt out of having a machine decide what content you will find engaging. Twitter, by contrast, allows users to opt in by using the so-called Discover feed. I find the opt-in tactic more honest, but, predictably, it's less effective from a marketing point of view.

[...]

People who hate this have the option of not using Facebook and switching to a network that is less invasive in its attempts to leverage its user base. It's like unplugging that TV or quitting smoking: so easy, and yet so hard.

The good news for those who want their social networking fix regardless is that Facebook only has a limited ability to influence our emotions. "At the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it -- the result was that people produced an average of one fewer emotional word, per thousand words, over the following week," Kramer explained in a Facebook post on Sunday.

http://www.bloombergview.com/articles/2014-06-30/we-re-all-facebook-s-lab-rats
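The excerpts above describe the mechanics fairly concretely: a word-list classifier decides whether a post reads as positive or negative, a fraction of matching posts is withheld from the news feed, and the outcome is measured as emotional words per thousand words in what users write afterwards. Below is a minimal Python sketch of that pipeline. The word lists, the omission probability, and every function name are illustrative assumptions, not the study's actual setup; the published study reportedly relied on the much larger LIWC word categories and its own parameters.

```python
import random

# Toy stand-ins for the positive/negative word categories (the study
# reportedly used the much larger LIWC lists). The words, the omission
# probability, and the function names here are illustrative assumptions.
POSITIVE_WORDS = {"happy", "great", "love", "awesome", "wonderful"}
NEGATIVE_WORDS = {"sad", "angry", "hate", "terrible", "awful"}

def tokenize(text):
    """Lowercase the text and strip simple punctuation from each word."""
    return [w.strip(".,!?;:").lower() for w in text.split()]

def is_emotional(post, word_list):
    """True if the post contains at least one word from the given list."""
    return any(w in word_list for w in tokenize(post))

def filter_feed(posts, suppress="negative", omit_probability=0.1, rng=None):
    """Withhold, with some probability, posts carrying the targeted emotion."""
    rng = rng or random.Random()
    target = NEGATIVE_WORDS if suppress == "negative" else POSITIVE_WORDS
    return [p for p in posts
            if not (is_emotional(p, target) and rng.random() < omit_probability)]

def emotional_rate_per_thousand(posts, word_list):
    """Outcome measure: emotional words per 1,000 words written."""
    words = [w for post in posts for w in tokenize(post)]
    hits = sum(1 for w in words if w in word_list)
    return 1000.0 * hits / len(words) if words else 0.0

# Demo: suppress every negative post in a toy feed, then measure how
# emotional a user's own writing is on the per-1,000-words scale.
feed = ["I love this song!", "Terrible day at work.", "Lunch was fine."]
print(filter_feed(feed, suppress="negative", omit_probability=1.0))
print(emotional_rate_per_thousand(["Feeling great today", "Meetings all day"],
                                  POSITIVE_WORDS))
```

Kramer's "one fewer emotional word, per thousand words" figure is exactly the kind of difference a measure like the per-thousand-words rate above would report between the treated and control groups.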

Unfortunately. One day it will cross the useful-annoying line and everyone will leave en masse. 

But for now, we are the product.

[Yeah, Science! (Breaking Bad) gif]

TechCrunch calls it unethical to experiment on people without opt-in. 

http://techcrunch.com/2014/06/29/facebook-and-the-ethics-of-user-manipulation/

P.S. We get so much mileage out of that Jesse Pinkman gif!

When "you are the product" - these experiments are inevitable. Maybe it's time to go back to products people pay for. LinkedIn doesn't need to do these experiments because it makes tons of money from recruiters. At some point "free but invasive" will get to the point where people would rather pay, than be invaded.
