Facebook’s New News Feature is Bad News
How Facebook News deals in doublespeak when it comes to championing news integrity
Opinions / November 20, 2019
“Journalism plays a critical role in our democracy. When news is deeply-reported and well-sourced it gives people information they can rely on. When it’s not, we lose an essential tool for making good decisions.”
This isn’t a quote from Woodward, Bernstein, or Pulitzer, or from one of the KPU journalism instructors. It’s from a blog post published by Facebook, apparently written by Campbell Brown, Facebook’s VP of Global News Partnerships, and Mona Sarantakos, its Product Manager for News.
The post introduces Facebook News, a service, still in testing, that gathers a number of news sources in one place on the platform, emphasizing content which Facebook considers “accurate, authentic, meaningful and informative.”
These qualities are determined by Facebook’s evaluation of “a range of integrity signals in determining product eligibility,” including a publication’s propensity for spreading misinformation, its use of clickbait, and its community standards with regard to hate speech, all of which will be judged by third-party fact-checkers unique to each country.
Facebook has been trying to develop a fact-checking framework for a while now, and back in 2018, it acknowledged how truly difficult that can be.
“Even where fact-checking organizations do exist, there aren’t enough to review all potentially false claims online. It can take hours or even days to review a single claim. And most false claims aren’t limited to one article — they spread to other sites,” reads a Facebook post by Tessa Lyons, Product Manager.
It creeps me out how casually Facebook refers to news as a “product.” The word betrays the fact that its idea of reporting truth to the public is inextricably tied to capitalist gain.
“People want and benefit from personalized experiences on Facebook, but we know there is reporting that transcends individual experience. We want to support both,” wrote Brown and Sarantakos.
Facts and truth are values independent of what “people want and benefit from,” and they cannot be repackaged and sold as personalized experiences. Objective truths aren’t subject to change just because people don’t want to believe in them.
When we use phrases like “personalized experiences” with regard to news reporting, we’re talking about inherently distorting the truth. That distortion can take many forms beyond outright misinformation: how often we are shown certain kinds of stories over others in our social media bubbles, and who is given space to voice their opinions and reactions in the media. If your “personalized experience” of the news excludes everything except a few subjects, your perception of the world will be skewed. Being hyper-selective in the stories we follow gives us an incomplete picture of reality, and Facebook News could enable exactly that.
Predictably, Facebook has received a lot of criticism over this initiative, particularly for publishing political ads containing false information and for including Breitbart News, an organization widely criticized for publishing misinformation, in its list of publishers with integrity.
“I strongly believe it should be the role of the press to dissect the truth or lies found in political ads — not engineers at a tech company,” wrote Brown in a post responding to critics.
This has been Facebook’s slippery argument for years: doublespeak that passes the buck while retaining the revenue gained from spreading misinformation. Its insistence that it is just “a tech company,” despite its unprecedented reach and a business model that relies overwhelmingly on data collection and advertising revenue, is patently deceptive.
Facebook has used this line of reasoning to try to worm its way out of being held accountable for failing to ethically inform the public it claims to support.