Letting conspiracy theories go to the dark recesses of the web to die would be for the best, says Alex Webb.
Facebook has long defended its lackadaisical approach to misinformation by saying that if it imposed stricter conditions, such content would simply proliferate elsewhere. Far better, the reasoning goes, to keep the conversation where it can be monitored.
But pushing false information and incitement to violence to the darker recesses of the web would be better.
Facebook’s line of reasoning has long seemed disingenuous, not least because of how many people turn to the social media giant. It has 2.5 billion monthly users across its platforms, which include Instagram and WhatsApp. Alongside Alphabet’s YouTube, these properties represent the greatest agglomeration of eyeballs that the world has ever known. Where else can misinformation find such a massive audience?
Facebook’s banning of Donald Trump, alongside a similar decision by Twitter, means we might now find out. The first alternative for many was Parler, a social media app that boasts of being a bastion of free speech and is backed by the billionaire Mercer family. But such free speech came at a cost: Its failure to moderate content organizing the violence at the Capitol last week prompted Amazon.com Inc. to suspend Parler’s use of its web-hosting services, while Apple Inc. and Google removed its app from their mobile stores. Trump has since said he may build his own social network as Parler scrambles to get back on its feet.
Whatever happens to Parler, forcing the most outlandish strands of political discourse off the mainstream platforms would be a good thing. To boost user engagement, social media companies tend to reward provocative content with greater exposure while also deploying algorithms that personalize user feeds. The result is an engine that incubates and accelerates radicalization, nudging moderates toward the extremes.
Facebook is now taking down all mentions of “stop the steal,” the slogan used by US election conspiracy theorists, while Twitter has banned more than 70,000 QAnon accounts.
If QAnon, electoral fraud conspiracists and flat-earthers are encouraged to move elsewhere — to Parler, for instance, or Gab, a site reportedly frequented by white supremacists — then it might reduce the number of people sucked out of the mainstream and into their conspiratorial vortexes. Web users would have to actively seek out Covid-19 or anti-vaccination misinformation, rather than happening across it organically on their Facebook or YouTube feeds.
Take extremism out of the mainstream
Such an approach would not, of course, end online radicalization, according to Dipayan Ghosh, author of “Terms of Disservice: How Silicon Valley Is Destructive by Design.” Indeed, it could even push those already inside the bubble into an even greater frenzy. But “it’s the right thing to do because it removes extremist views from the mainstream,” he said.
The main platforms needn’t worry about being overtaken. For all of Facebook Chief Executive Officer Mark Zuckerberg’s assertions that competition in social media is fierce, the advertising giant continues to grow user numbers, revenue and profit. Its lead is not easily surmountable. Parler’s peak daily active user count was just 3.4 million globally, back in November, and it had only 1.6 million daily users last week, according to app analytics firm Apptopia. That’s 0.09% of the 1.8 billion people who log into a Facebook service every day.
Rivals will only start to threaten the dominance of Facebook and YouTube if they’re able to build sustainable business models. That means securing advertising dollars, which is not straightforward given that brands are reluctant to display their ads alongside troublesome content.
It also means strictly managing costs. Google and Apple have already made that more difficult by stipulating that Parler must adhere to their content-moderation policies if the app is to be available on their stores. In other words, the app needs to hire a stack of content moderators. It’s the right thing to do, given the evidence that last week’s riots and future violence were planned on the platform. Facebook should have built those costs into its business early on, but it opted for low overheads initially in order to scale quickly.
Pushing more extreme political discourse out of the mainstream would formalize the content bubbles that essentially already exist. It would be a far cry from the 1990s cyber-utopian vision of the internet as a village green or an agora for the free and open exchange of ideas. But it would be for the best.
Views expressed are the author's own.