Facebook Bans QAnon Groups and Pages

In its continuing efforts to crack down on the conspiracy movement QAnon, Facebook announced yesterday that it will ban any groups, pages, or Instagram accounts that “represent” the movement. This marks a sharp escalation in Facebook’s battle to curb the movement and its conspiracy theories—which have been identified as a potential domestic terror threat by the FBI.

In a 2019 memo, the FBI pointed to QAnon and Pizzagate—a conspiracy theory claiming that Hillary Clinton and top Democrats were running a child sex-trafficking ring underneath a Washington, D.C., pizza shop—as examples of movements whose messages could lead to “violent acts,” The Hill reported. In December 2016, a man fired a gun into the Comet Ping Pong pizza shop in D.C.; he claimed he was investigating the Pizzagate conspiracy. His attorney alleged he was inspired by QAnon—a set of outlandish beliefs, including that a cabal of satanic elites rules the world.

More recently, U.S. Representative Tom Malinowski (D-NJ) faced death threats after QAnon believers falsely accused him of protecting sexual predators.

The FBI document, dated 30 May, added: “The FBI assesses these conspiracy theories very likely will emerge, spread, and evolve in the modern information marketplace, occasionally driving both groups and individual extremists to carry out criminal or violent acts.”

Followers of the conspiracy theory have harassed and doxed perceived enemies (publishing their private or identifying information online), and even the movement’s purported efforts to thwart child trafficking have had the opposite effect, experts say. Fans of the conspiracy theory have clogged antitrafficking hotlines and hijacked charity fundraising campaigns with misinformation, The New York Times reported.

An internal investigation by Facebook earlier this year found thousands of groups and pages—with millions of members and followers—that support the QAnon conspiracy theory, NBC News reported in August. The social media giant has been studying the movement’s use of the platform since at least June, and a Facebook spokesperson told NBC in July that the company was investigating QAnon as part of a larger look at groups with potential ties to violence.

In the second quarter of 2020, Facebook scrubbed 22.5 million pieces of hate speech—violent or dehumanizing speech, statements of inferiority, slurs, or calls for exclusion or segregation—from its platform, more than double the 9.6 million pieces removed for hate speech policy violations in the first quarter, CyberScoop reported.

In August, Facebook announced a set of measures designed to disrupt the ability of QAnon and militarized social movements to operate and organize on the platform.

“In the first month, we removed over 1,500 Pages and Groups for QAnon containing discussions of potential violence and over 6,500 Pages and Groups tied to more than 300 Militarized Social Movements. But we believe these efforts need to be strengthened when addressing QAnon,” a Facebook statement said.

The update, which goes beyond flagging content that promotes violence to also target disinformation campaigns, was prompted by recent shifts in QAnon messaging and tactics.

“While we’ve removed QAnon content that celebrates and supports violence, we’ve seen other QAnon content tied to different forms of real world harm, including recent claims that the west coast wildfires were started by certain groups, which diverted attention of local officials from fighting the fires and protecting the public,” the Facebook statement explained. “Additionally, QAnon messaging changes very quickly, and we see networks of supporters build an audience with one message and then quickly pivot to another. We aim to combat this more effectively with this update that strengthens and expands our enforcement against the conspiracy theory movement.”

The new rules will be enforced by Facebook’s Dangerous Organizations Operations team, which also enforces the platform’s bans on terrorist and hate groups. The team will proactively detect content for removal instead of waiting for user reports.

U.S. Senator Mark Warner (D-VA) released a statement Tuesday after Facebook’s announcement, praising the platform’s decision.

“I’m pleased to see Facebook take action against this harmful and increasingly dangerous conspiracy theory and movement. Just this morning I encouraged the company to take the threat of QAnon more seriously, given increasing evidence that its growth has in large part been propelled by Facebook,” Warner said. “Ultimately the real test will be whether Facebook actually takes measures to enforce these new policies—we’ve seen in a myriad of other contexts, including with respect to right-wing militias like the Boogaloos, that Facebook has repeatedly failed to consistently enforce its existing policies.”

Disinformation and conspiracy theories have had serious consequences, from sparking civil unrest to misleading people about the origins of the COVID-19 pandemic. Learn more about the infodemic affecting COVID-19 responses in the June issue of Security Management.
