Facebook Bans QAnon Conspiracy Promoters From Its Platforms

A protester holds a Q sign as he waits in line with others to enter a campaign rally with President Donald Trump in Wilkes-Barre, Pa., in 2018. (AP Photo/Matt Rourke, File)

(CN) — In a major escalation of recent efforts to crack down on misinformation, Facebook announced Tuesday it will ban all groups related to QAnon, a baseless conspiracy theory that portrays President Donald Trump as a warrior battling a shadowy cabal of liberal “deep state” pedophiles.

“Starting today, we will remove any Facebook Pages, Groups and Instagram accounts representing QAnon, even if they contain no violent content,” the company said in an updated blog post Tuesday. 

The company announced in August it would remove all Facebook pages, groups and Instagram accounts associated with the baseless conspiracy theory if they discussed violence. The Menlo Park-based tech giant also said it would impose restrictions to limit the reach of QAnon-related pages, groups and accounts.

The new Facebook policy goes further in not only removing content and accounts that promote violence, but also those that spread QAnon-related misinformation.

QAnon is an umbrella term describing a wide-ranging series of internet conspiracy theories that falsely claim the world is run by a group of devil-worshiping pedophiles who operate a global sex-trafficking ring and are actively scheming to undermine Trump.

Once dismissed as a fringe phenomenon, a torrent of QAnon-related groups and posts has inundated social media platforms in recent months, often promoting false claims about the Covid-19 pandemic, Black Lives Matter protests and the election.

The movement has leaked into the mainstream, with several Republican candidates for Congress supporting the groundless conspiracy theory, including Marjorie Taylor Greene, who is running in Georgia’s 14th Congressional District.

QAnon has also been linked to real-world acts of violence, including one case in which a QAnon supporter was accused of murdering a mafia boss in New York last year. 

Another QAnon supporter from Illinois livestreamed her trip to New York with more than a dozen illegal knives while threatening to kill former Vice President Joe Biden.

A leaked FBI bulletin from May 2019 warned that QAnon poses a domestic terror threat.

Facebook said Tuesday it is changing its strategy to adapt to the way the QAnon groups’ messages change rapidly. The social network has identified cases in which networks of QAnon supporters build an audience with one message then quickly pivot to more misleading and pernicious content.

“While we’ve removed QAnon content that celebrates and supports violence, we’ve seen other QAnon content tied to different forms of real world harm, including recent claims that the west coast wildfires were started by certain groups, which diverted attention of local officials from fighting the fires and protecting the public,” Facebook wrote in its blog post Tuesday.

Facebook said it is also working with external experts to address the way QAnon supporters use the issue of child safety to recruit and organize.

“We are taking steps to address evidence that QAnon adherents are increasingly using the issue of child safety and hashtags like #savethechildren to recruit and organize,” Facebook said in a previous update.

After identifying QAnon supporters’ use of a #savethechildren hashtag last week, Facebook said it would start directing people to “credible child safety resources” when they search for certain child safety-related hashtags.

Last week, the company said it would also ban all ads that praise, support or represent QAnon and “militarized movements.”

“We expect renewed attempts to evade our detection, both in behavior and content shared on our platform, so we will continue to study the impact of our efforts and be ready to update our policy and enforcement as necessary,” Facebook said Tuesday.

QAnon content has also proliferated on YouTube and Twitter.

Like Facebook, Twitter and YouTube have vowed to remove misleading content about elections, but neither YouTube nor Twitter has specifically pledged to ban QAnon-related false claims.

Last month, Twitter said it would beef up efforts to limit the reach of “coordinated harmful activity” without mentioning the QAnon movement or other conspiracy theories.