Roll Call
October 24, 2014

Facebook News Feed Filtering: Beyond the Emotion Study

Among the flood of writing that has emerged since news broke of Facebook’s controversial manipulation of users’ News Feeds for an emotion study involving more than 689,000 users, a number of articles point out that even the News Feeds users see outside of such an experiment don’t fully reflect all of their friends’ activity on the site.

For example, Vox’s Nilay Patel writes that “manipulating the News Feed is Facebook’s entire business” and lays out how it works with advertising.

So, what’s the response to the idea that everything’s filtered, whether it’s for a study or not? David Weinberger, a senior researcher at Harvard’s Berkman Center for Internet and Society, raises concerns in a piece for CNN about the commercial objectives of the entities that decide what people see.

Weinberger writes that “much of the outrage is driven by a false assumption: that there is a ‘real’ mix of news about our friends,” when actually Facebook “always uses algorithms to figure out what to show us and what to leave obscure,” and narrows feeds into a “tinkling stream we can drink from.” He goes on:

Are we sure that filtering social news so that it includes more of the negative is bad? And positive filtering can paint a too-rosy picture of our social network, shielding us from the full force of life as it is actually lived. I don’t know the answer, but it can’t come from a commercial entity whose overriding aim is to keep us coming back so we can buy more from its advertisers.

The recent experiment illustrates this concern, he writes: “Facebook, an important and even dominant part of our social infrastructure, makes decisions about what we’ll know about our friends based on what works for Facebook, Inc., and only secondarily based on what works for us as individuals and as a society.”
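Weinberger’s point, that a ranking function rather than a neutral chronology decides what surfaces, can be made concrete with a minimal sketch. The post attributes, weights, and cutoff below are hypothetical illustrations, not Facebook’s actual algorithm:

```python
# A minimal, hypothetical sketch of score-based feed filtering -- not
# Facebook's real ranking system. It illustrates the general idea that
# some scoring function decides what is shown and what is "left obscure."
from dataclasses import dataclass


@dataclass
class Post:
    author_affinity: float  # how often the viewer interacts with the author
    engagement: int         # likes/comments the post has received
    age_hours: float        # how old the post is


def score(post: Post) -> float:
    """Combine a few signals into one relevance score (weights are made up)."""
    recency = 1.0 / (1.0 + post.age_hours)
    return 2.0 * post.author_affinity + 0.1 * post.engagement + 5.0 * recency


def build_feed(posts: list[Post], limit: int = 10) -> list[Post]:
    """Show only the top-scoring posts; everything else never reaches the user."""
    return sorted(posts, key=score, reverse=True)[:limit]
```

Whatever the real signals and weights are, the practical effect is the same: the feed a user sees is the output of choices made by the platform, not a complete record of friends’ activity.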

Tal Yarkoni, director of the Psychoinformatics Lab and a research associate at the University of Texas at Austin, similarly points out in New Scientist that Facebook’s News Feed has always been a “completely contrived environment.”

He goes on to write that companies like Facebook run experiments all the time, aimed at increasing revenue, and if “the idea that Facebook would actively try to manipulate your behaviour bothers you, you should probably stop reading this right now and go and close your account.” Facebook and others shouldn’t have a “free pass” to do what they want, he writes, but this particular emotion study shouldn’t be used as a “lightning rod” for criticism.

On a different note, historian Nicole Hemmer, in a piece in U.S. News & World Report, raises a broader philosophical issue about the concerns that have arisen from the emotion study:

These worries stem in part from long-standing concerns about Facebook’s business practices, particularly its cavalier attitude toward users’ privacy. But they also arise from a much broader set of fears about Big Data, technology, and our growing sense that we don’t have as much control over our lives – or our minds – as we’d like.

  • pancheetah

    You certainly address what seems to be a major topic for discussion among my friends and one that should certainly be a target for ongoing national consideration as we try to clarify what’s at stake and what national policy should look like.
    I recently changed my approach to FB since it seemed to require too much effort for too little reward. But I didn’t close the account since I engage in some private discussions about a community health database and other topics. However, I did put in place a number of available restrictions… such as they are. Meanwhile I’m directing FB traffic to email and elsewhere for the time being (not that there’s much better security elsewhere).
