For more than a year, Facebook’s trending topics — that list of short news blurbs in the upper-right corner of the Facebook home page — have kept users in the know, delivering stories about the latest objects of viral interest in a bizarre, clinical tone, as if they’d been written by someone minutes after being thawed from a decadelong cryosleep. Maybe you’ve seen the recent news item about Justin Bieber (“Singer Responds to People Who Have Criticized His Hair”) or the one about Air Jordans (“Photo Shows Sneakers Featuring Meme of Michael Jordan Crying During Speech”).
Exactly how these blurbs are created has been shrouded in mystery, but Gizmodo’s Michael Nuñez, in the latest of a string of Facebook scoops, has tracked down the people tasked with sorting through and writing them. Things appear … grim. Hired in a freelance capacity through a string of subcontractors, the news-curation team reads and summarizes dozens of stories every day. According to the report, “Managers gave curators aggressive quotas for how many summaries and headlines to write, and timed how long it took curators to write a post. The general standard was 20 posts a day.”
Facebook’s news curation, as well as its M messenger bot and God knows what else, tends to present a kind of bland algorithmic face that hides actual human beings sitting and typing at desks around the world, often under terrible working conditions. In this case, humans are tasked with ensuring that what is presented as a straightforward, algorithmically determined picture of the site’s most popular topics of discussion doesn’t embarrass or otherwise damage Facebook.
This is because some of the things that trend on Facebook might make the company look very bad, or, at the very least, stupid. Facebook doesn’t want — for example — “Obama Birth Certificate” or “Chemtrails” to show up as trending topics, because to publicly acknowledge the frequency with which those subjects are discussed would be to accept some level of responsibility for them. (Declaring something a “trending topic,” after all, only makes it trend more.)
And so, maybe unsurprisingly, the main function of the curators is to temper the wild swings of virality. Curators were allowed to blacklist trending topics, and did so on a daily basis. “A topic was often blacklisted if it didn’t have at least three traditional news sources covering it,” Nuñez writes, “but otherwise the protocol was murky[.]” At the same time, certain news sources were to be avoided, “Twitter” was to be replaced with the more generic “social media,” and only Facebook-native video was to be promoted:
The former contractors Gizmodo interviewed said they were asked to write neutral headlines, and encouraged to promote a video only if it had been uploaded to Facebook. They were also told to select articles from a list of preferred media outlets that included sites like the New York Times, Time, Variety, and other traditional outlets. They would regularly avoid sites like World Star Hip Hop, The Blaze, and Breitbart, but were never explicitly told to suppress those outlets.
As the Times’s John Herrman points out, that line about the Blaze and Breitbart — both right-wing news sources — will probably haunt Facebook for a while. Anyone can now cry censorship whenever they see a dip in Facebook referrals or the absence of a trending topic like “Benghazi.”
And that’s going to be a pain for Facebook. The success of platforms often rests on their ability to present themselves as neutral spaces. (Look at the outcry when Twitter introduced an algorithmically sorted timeline in place of its reverse-chronological feed.) Facebook wants us to think of its products as transparent and objective: perfect robotic helpmates. But of course, there’s always a human behind the scenes somewhere — not just because the bots and algorithms aren’t advanced enough, but because objectivity and neutrality often come into conflict with the overall interests of the platform.