Leaked Documents Show Facebook’s Procedures for Trending Topics

Newly leaked Facebook documents, published today in The Guardian, confirm earlier reports from Gizmodo about the all-too-human process that goes into the platform’s seemingly algorithmic Trending Topics sidebar feature. (Facebook confirmed the documents were legitimate but described them as an “older version.”)

That Facebook had a clearly defined set of practices around its Trending Topics — including, apparently, a procedure to insert stories that were not necessarily “trending” — conflicts with how the company has previously described the section: as a series of topics whose appearance and importance were determined algorithmically and via automation. In fact, these early guidelines show that Facebook, in its infinite wisdom, still relied on fewer than a dozen publications to determine whether stories were of national importance.

Facebook relies heavily on just 10 news sources to determine whether a trending news story has editorial authority. “You should mark a topic as ‘National Story’ importance if it is among the 1–3 top stories of the day,” reads the trending review guidelines for the US. “We measure this by checking if it is leading at least 5 of the following 10 news websites: BBC News, CNN, Fox News, The Guardian, NBC News, The New York Times, USA Today, The Wall Street Journal, Washington Post, Yahoo News or Yahoo.”
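In other words, “national importance” reduces to a head count. Here’s a minimal sketch of that rule in Python, assuming a mapping from each site to its current lead story; the site list is quoted from the guidelines, while the function name and data shape are invented for illustration.

```python
# A minimal sketch of the "National Story" rule quoted above. The site
# list comes from the leaked guidelines; everything else is hypothetical.

NATIONAL_STORY_SITES = [
    "BBC News", "CNN", "Fox News", "The Guardian", "NBC News",
    "The New York Times", "USA Today", "The Wall Street Journal",
    "Washington Post", "Yahoo News",  # the guidelines say "Yahoo News or Yahoo"
]

def is_national_story(lead_story_by_site: dict[str, str], topic: str) -> bool:
    """True if `topic` is the lead story on at least 5 of the 10 sites."""
    leads = sum(
        1 for site in NATIONAL_STORY_SITES
        if lead_story_by_site.get(site) == topic
    )
    return leads >= 5
```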

Perhaps the biggest contradiction to Facebook’s earlier claims is that, per The Guardian’s documents, curators were in fact allowed to “inject” (Facebook’s word) stories into the trending section that had not actually reached the computer-determined threshold for what constitutes “trending.” Earlier this week, the company said that it does not “artificially” inject stories, implying that topics still had to pass some sort of newsworthiness smell test before they entered the system. But stories were injected nevertheless.

The company itself disclosed a list of 1,000 trusted sources — the news outlets whose reporting Facebook relied on to confirm the accuracy of its Trending Topics summaries. News topics needed to be “corroborated by reporting from at least three Media 1K outlets,” making Facebook, somewhat recursively, a gatekeeper for media gatekeepers. It’s not news until Facebook says that three other places said it’s news.
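The corroboration requirement is a similar counting rule. A hedged sketch, assuming the Media 1K list is represented as a simple set (the three-outlet threshold comes from the documents; the names and data structures here are placeholders):

```python
# Illustrative only: the three-outlet corroboration check described in
# the leaked guidelines. The set below is a tiny stand-in; the real
# "Media 1K" list reportedly runs to roughly 1,000 outlets.

MEDIA_1K = {
    "BBC News", "CNN", "The New York Times",
    # ... plus roughly 1,000 more entries in the actual document
}

def is_corroborated(reporting_outlets: set[str]) -> bool:
    """A topic clears review only if at least three trusted outlets
    have reported the story."""
    return len(reporting_outlets & MEDIA_1K) >= 3
```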

Notably, the list includes conservative sources like Breitbart and the Daily Caller, the types of sites that, earlier this week, Facebook contractors said had at times been excluded from the Trending section. That those conservative sources are on the list is good news for anyone worried about Facebook leaning left in the information it provides, but maybe less reassuring to people who don’t think highly of the quality of reporting at sites like Breitbart. The presence of frequently inaccurate sites like ViralNova — in the “science” section, no less! — is similarly unreassuring.

None of this is hugely surprising to sophisticated consumers of Facebook’s product — of course the company has guidelines, of course it has a list of reliable news sources, of course its decisions are largely driven by human editorial judgment. But it’s at odds with Facebook’s self-image as an impartial, machine-driven enterprise, in which computers crunch big data to better cater to users, without the intervention of humans and their messy biases.

In response, Facebook’s VP of Global Operations, Justin Osofsky, broadly described the Trending curation process in a blog post today. It’s more or less what Gizmodo’s reporting had previously described — a current-events sandwich where algorithms are the bread and humans are the meat. An algorithm surfaces topics (“many people are talking about Prince”), curators confirm the news and write the short descriptions (“Prince died”), and then an algorithm determines which users would benefit from seeing it (“show this story to users who like Prince’s page”).
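Put as a loose, runnable sketch, the sandwich looks something like this; every function below is a hypothetical stand-in, since Facebook has not published the actual systems:

```python
# A toy version of the algorithm-human-algorithm "sandwich": algorithms
# surface and target topics, humans confirm and summarize in the middle.
# All names and thresholds here are invented for illustration.

def detect_spiking_topics(posts: list[str]) -> list[str]:
    """Algorithm (bread): surface topics many people are talking about."""
    counts: dict[str, int] = {}
    for topic in posts:
        counts[topic] = counts.get(topic, 0) + 1
    return [t for t, n in counts.items() if n >= 3]  # toy spike threshold

def curate(topics: list[str]) -> dict[str, str]:
    """Humans (meat): confirm the news and write the short description."""
    return {t: f"{t}: confirmed and summarized by a curator" for t in topics}

def rank_for_user(curated: dict[str, str], liked_pages: set[str]) -> list[str]:
    """Algorithm (bread): pick which stories a given user should see."""
    return [desc for topic, desc in curated.items() if topic in liked_pages]

posts = ["Prince", "Prince", "Prince", "local news"]
stories = curate(detect_spiking_topics(posts))
print(rank_for_user(stories, liked_pages={"Prince"}))
# -> ['Prince: confirmed and summarized by a curator']
```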

Old news though it might be, it’s still nice to hear it directly from Facebook. The service has so far stubbornly stood by its commitment to a vaguely defined “neutrality” and an appeal to the impartiality of the algorithm. Better that it acknowledge that its human intermediaries are, well, human, susceptible to biases and subjectivity, and explain how it mitigates those concerns.