Facebook says it will ban groups for ‘representing’ QAnon

Baseless conspiracy movement linked to real-world violence

FILE - In this Aug. 2, 2018, file photo, a protester holding a Q sign waits in line with others to enter a campaign rally with President Donald Trump in Wilkes-Barre, Pa. Candidates engaging with the QAnon conspiracy theory are running for seats in state legislatures this year, breathing more oxygen into a once-obscure conspiracy movement that has grown in prominence since adherents won Republican congressional primaries this year. (AP Photo/Matt Rourke, File) (Matt Rourke, Copyright 2018 The Associated Press. All rights reserved.)

Facebook is tightening its policy against QAnon, the baseless conspiracy theory that paints President Donald Trump as a secret warrior against a supposed child-trafficking ring run by celebrities and “deep state” government officials.

The company announced Tuesday that it will ban pages and groups that represent QAnon across all of its platforms as part of its enforcement against militarized social movements. The social network said it will consider a variety of factors to decide if a group meets its criteria for a ban, including its name, the biography or “about” section of the page, and discussions within the page, group or Instagram account.

Mentions of QAnon in a group focused on a different subject won’t necessarily lead to a ban, Facebook said.

Facebook announced its new approach weeks after deciding to limit the spread of content from Facebook pages, groups and Instagram accounts that support violent acts, such as those associated with QAnon.

In mid-August, Facebook said it would stop promoting QAnon and its adherents and would remove QAnon groups if they promoted violence, although its enforcement proved spotty.

Officials said Tuesday that their efforts revealed that a more robust approach is necessary to limit violent and harmful disinformation spread by QAnon.

“Starting today, we will remove Facebook Pages, Groups and Instagram accounts for representing QAnon, even if they contain no violent content,” Facebook wrote in a press release Tuesday. “We’re starting to enforce this updated policy today and are removing content accordingly, but this work will take time and will continue in the coming days and weeks. Our Dangerous Organizations Operations team will continue to enforce this policy and proactively detect content for removal instead of relying on user reports.

“We’ve been vigilant in enforcing our policy and studying its impact on the platform but we’ve seen several issues that led to today’s update,” the press release continues. “For example, while we’ve removed QAnon content that celebrates and supports violence, we’ve seen other QAnon content tied to different forms of real world harm, including recent claims that the west coast wildfires were started by certain groups, which diverted attention of local officials from fighting the fires and protecting the public.”

The QAnon conspiracy theory is centered on a baseless belief that Trump is waging a secret campaign against enemies in the “deep state” and a child sex trafficking ring run by satanic pedophiles and cannibals. For more than two years, followers have pored over tangled clues purportedly posted online by a high-ranking government official known only as “Q.” Some extreme supporters of Trump adhere to the theory, often likened to a cult.

The QAnon phenomenon has sprawled across a patchwork of secret Facebook groups, Twitter accounts and YouTube videos in recent years. QAnon has been linked to real-world violence, including kidnapping cases, as well as to dangerous claims that the coronavirus is a hoax.

The conspiracy theory has also seeped into mainstream politics. Several Republicans running for Congress this year are QAnon-friendly.

By the time Facebook and other social media companies began enforcing policies against QAnon, however limited, critics said it was largely too late. Reddit, which began banning QAnon groups in 2018, was well ahead, and to date it has largely avoided a notable QAnon presence on its platform.

Twitter did not immediately respond to a request for comment on Tuesday.

Also on Tuesday, Citigroup Inc. reportedly fired a manager in its technology department after an investigation found that he operated a prominent website dedicated to QAnon. According to Bloomberg, Jason Gelinas had been placed on paid leave after he was identified on Sept. 10 by a fact-checking site as the operator of the website QMap.pub and its associated mobile apps.

Citi did not immediately respond to a request for comment on Tuesday.

Facebook officials said Tuesday that since Aug. 19, 1,500 QAnon pages and groups “containing discussions of potential violence” have been removed, in addition to 6,500 pages and groups tied to over 300 militarized social movements.


About the Author
Cassidy Johncox is a senior digital news editor covering stories across the spectrum, with a special focus on politics and community issues.
