When the big internet platforms like YouTube and Facebook began to kick hate speech and harmful misinformation off their sites last year, a process that sped up as the pandemic flared and protests against police violence roiled the country, something comedian Sarah Silverman would often say about audiences came to mind: “People go towards love.”
The gist of her point is that if you begin to push people away from a community for believing bad things, those people, being human, will look for people who accept them, who will show them love, and who will welcome them inside. It’s a nuanced take, one that comes from a place of almost cringeworthy levels of empathy, and it feels relevant as we grapple with communities that have nurtured right-wing extremism and harmful misinformation. Their members, suddenly cut off from the online public square, have found other platforms where they feel more welcome.
Case in point: What did QAnon believers do after YouTube, Twitter and Facebook banned their content? They did not disappear. They went to Parler, briefly, and then to sites like Twitch, where, according to the New York Times, 20 large communities of QAnon and Q-adjacent subscribers have sprung up since last fall. Twitch, which is owned by Amazon, doesn’t think QAnon is a hate group. And, until the Times asked about it, it didn’t think the Proud Boys counted as a hate group either.
No shade against Twitch specifically: They face exactly the same cycle of content moderation headaches that befell Facebook and Twitter, as groups with aggressively controversial and hateful viewpoints colonized small parts of their server space.
Twitch is currently in what I call the “Whac-a-Mole arms race” phase: They’re dealing with smart ideological entrepreneurs who know how to manipulate hashtags and discourse markers (change a single letter!) to circumvent any attempts to moderate whatever pops up.
Misinformation researchers have worried about the “moving toward love” problem for a while. Censoring views does not, in the networked commons, get rid of them. Far from it, in fact. It is not illegal to believe in racist things, nor is it possible to shame people who believe these things out of seeking others who think like them.
As much as we might complain about Facebook and Twitter and their echo chambers, the reality was that people with extremist views were exposed to contrary opinions fairly frequently, especially when the platforms’ algorithms were tuned for engagement.
On the one hand, seeing the other side take umbrage at your content is a great motivator; on the other, it reminds you, on a subtle level, that you’re still part of a larger community and bear some responsibility to it. But if you suddenly find yourself in a smaller space where people believe the same things you do, your responsibility narrows.
Your beliefs become more virulent, even if you’re not able to spread them as quickly. When you do recruit someone new — say, to a Twitch stream, or to a Discord group, or to a private Telegram chat — they’re likely to be more like you, a true believer who has been kicked out of some other community.
And then there are the folks who decide to get violent. Say what you will about Facebook’s inability to take down #StoptheSteal groups before the January insurrection — and there’s a lot to say — at least Facebook had visibility, which meant it could (and did) work with law enforcement to find its users who used the platform to organize the storming of the U.S. Capitol.
Banned by Facebook, many turned to platforms where the anonymity of the content and the security of the interactions were the point. Even Zoom, the go-to app for pandemic meetups, wasn’t immune. PBS found a substantial migration of militia content to the teleconferencing platform in the wake of Facebook’s post-election crackdown on election misinformation.