Screen Time

Social Media Drops the ‘Digital Town Square’ Routine


I’m guessing you don’t need to be told that Threads, Meta’s uncanny competitor to the site formerly known as Twitter, is not a top-tier source for the latest science news or for advice about your personal health. But just in case! The Washington Post this week discovered some curious omissions from its search results:

When users went to Threads to search for content related to “covid” and “long covid,” they were met with a blank screen that showed no search results and a pop-up linking to the website of the Centers for Disease Control and Prevention.

Other blocked searches include “sex,” “nude,” “gore,” and “porn,” more or less in keeping with Threads’s nascent identity as a brand-friendly content suburb, but also “vaccines” and “vaccination,” which are things about which one might seek information while neither horny nor overcome with bloodlust.

Meta did set some expectations around the launch of Threads. “Politics and hard news are inevitably going to show up on Threads,” wrote platform boss Adam Mosseri, “but we’re not going to do anything to encourage those verticals.” His boss, Mark Zuckerberg, positioned Threads as a refuge from touchy topics. “We are definitely focusing on kindness and making this a friendly place,” he said. Also, despite quick early growth and more recent attempts to integrate Threads into more of its products — you might have noticed little snippets of Threads text showing up in between Instagram posts, hastening the latter’s inevitable transformation into Facebook for under-70s — Meta isn’t exactly cutting off a hugely valuable resource here, though it is setting a precedent. There isn’t really a decision about how to run Threads that could matter that much because, well, it’s Threads.

Every platform that lets users post content has to draw lines somewhere. Small platforms have it easy — people are usually there for somewhat specific reasons and can be asked or made to understand norms and expectations in a way that doesn’t really threaten their ability to share their thoughts with the world. Bigger social platforms, however, often aspire to serve the public in general — in most social-networking companies’ ideal worlds, everyone signs up for an account and maybe even posts. The last generation of big, public social-media platforms — Twitter, Facebook, and YouTube — initially took differing versions of the same approach, positioning themselves as general-purpose, population-scale venues for all sorts of conversation with as few exceptions as possible. They talked about themselves as digital “town squares” and, in some cases, as defenders or at least enablers of “free speech.”

This was good marketing in the sense that it gave the impression that these vast new digital spaces had some sort of civic virtue and that the mass onboarding of discourse onto a few big private platforms would be managed with, at the least, a vague sense of obligation to the public’s ability to speak freely, openly, and under fair and safe conditions. It was bad marketing in the sense that it was fundamentally never quite true. Whatever these platforms said about speech — and, indeed, whatever they did in service of defending it, even in court, as Twitter has — the fact remained that these were companies designed to monetize content consumption in partnership with brands, not, like, to support a healthy commons, whatever that might mean. In the meantime, they made arbitrary new rules as needed, as was their actual right as the operators of commercial spaces. Quietly, and with a modicum of shame, these “digital town squares” were all always Threads.

The commitment of these platforms to masquerading as democratic-ish spaces with rightlike privileges, lawlike rules, and courtlike processes of appeal and adjudication paid off, sort of. These platforms are huge and influential. They’re woven into society and culture in ways that even their biggest early boosters couldn’t have imagined and which are now difficult to unwind. One could argue that even a weak, cynical commitment to free-and-free-flowing speech on platforms like Facebook and Twitter has had some trace of value to users, too. If we’re going to be using these services to consume, disseminate, and fight about important things anyway, it’s probably slightly better for their leadership to feel obligated to explain why certain things are off-limits.

It also turned out to be a huge pain for the platforms. Reframing content moderation as a discussion about speech, rights, and censorship came back to bite these companies in the ass in a few different ways: in the form of reasonable and unreasonable user expectations about what they could say and to whom, turning every ban and episode of mod drama into a conflict with civilizational stakes (a particular issue when one worried user has the ability to buy your entire platform); in the form of expensive, miserable, and fundamentally hard actual moderation, in which companies tried to draw clear lines through fuzzy topics and fuzzy lines through clear topics, always alienating at least some of their users, or advertising clients, or government officials in countries where they wished to continue operating; and in the form of scrutiny from critics and public officials who took them at their word that they supported free speech; or that they were creating open, deliberative spaces; or that they were anything but overgrown advertising networks. These companies got what they said they wanted (big open spaces for mass communication, collaboration, and disputation, where users expected to be treated in certain ways) instead of what they actually wanted (cheap, low-maintenance sources of renewable eyeballs to sell). These platforms really did become important. Everyone started using them.

It’s a situation in which moderation choices can matter a great deal beyond the apps themselves. Building real-world platforms for everyone means building spaces that will one day simultaneously contain — to choose an extreme but real example — genocidaires and their victims as users. Or, for that matter, users who have COVID-19, or who ended up dying of COVID-19, or who ended up with long-lasting effects from COVID-19, and users who maybe don’t think COVID-19 is real, or who have very different thoughts about what vaccines are good for (mitigating disease; controlling politics; installing microchips). This is a classic and difficult moderation problem, in which ethical and even practical choices are hard to nail down exactly, change over time, and are alienating to enforce. Your platform is a place where people are figuring things out and maybe preventing one another from figuring things out. Intervention is messy and provokes constant criticism. With Threads, Meta is saying: Why bother?

Threads isn’t the only example of a platform whose owners have decided hosting contested conversations isn’t really worth it. Within Meta, Facebook has steered its recommendation algorithms away from news. TikTok has never really marketed itself as, nor been perceived as, a “free speech” space — like its predecessors, it ultimately censors whatever it wants, only with a bit less open hypocrisy and more sensitivity to the needs of the Chinese government. Reddit spent the last year reminding its power users who the real boss is. Even Twitter, which Elon Musk (claims to have) bought in response to its previous leadership’s moderation policies, has decided that some of the central dilemmas of platform moderation — for example, what to do when a government in a valuable foreign market makes a threatening takedown demand — are, in fact, not dilemmas at all, and not necessarily in the direction of more speech.

(See also: Walter Isaacson’s reporting about the Twitter owner’s admonitions to be “careful” about China, where the service does not even operate.)

Despite its familiar design and features, Threads emerged from a different context than its predecessors, one in which a sort of platform realpolitik is now embraced more openly by tech companies, or at least not denied so vigorously — at this point, nobody would believe them anyway. It’s a bit more honest, at least up front. Speech-wise, it could also produce a more generally restrictive state of affairs, in which platforms don’t feel quite as much need to service users’ non-commerce-related concerns about content moderation. Those concerns, rather than being treated as a series of thorny problems for trust-and-safety teams and independent oversight boards, will be handed down to teams of thousands of contract moderators following scripts as best and as fast as they can — unless, of course, a major advertiser is involved. Threads’s search function might not be Google, and people might not rely on it for much of anything now, but it’s not ideal that one of the world’s largest social-media companies is telling people to go elsewhere to talk about vaccines, in general, because a vocal minority of its users post about them in a really annoying way.

For the platform’s leadership, more freedom to simply declare more things off-limits, or at least unimportant, is probably a relief (and speaks to the recurring tech fantasy of eliminating “politics” from spaces in which people are … trying to work or communicate as a group). To be fair, hosting and attempting to responsibly moderate the world’s collective rupture over how to handle a pandemic probably felt like a thankless task. It’s not even clear what sort of nightmare Meta is averting here. Would Threads be an anti-vaxx, long-COVID-denialism sort of place? An Instagram-wellness-adjacent quack zone? A place to rehash school-closure and masking discourse? All of the above? Or, just as likely, none of the above?

It might also reflect a slightly too aspirational sense of what people got, and continue to get, out of posting online in general. The mess is still a mess. We’ve all been online for the last decade. We’ve seen the posts. We’ve probably made some of them. But without the mess, what’s left? We might have to reinstall Threads to find out.
