Facebook is pausing efforts to redirect people to fact-checks about QAnon after a "glitch" accidentally redirected users who weren't looking for the conspiracy theory, the company said Wednesday.
Facebook said its "redirect initiative" would "direct people to resources that can help inform them of the realities of QAnon and its ties to violence and real world harm."
The initiative is part of Facebook's ongoing crackdown on the QAnon conspiracy theory and other militant groups that have used the platform to spread their message and incite real-world violence.
QAnon has become increasingly mainstream among the GOP base, fueled in part by Trump's repeated refusal to denounce the theory or its adherents.
On Wednesday morning, Facebook updated its ban on QAnon content to add the new redirect for people searching for the conspiracy theory.
But less than two hours after implementing the redirect, the company had to hit pause after a "glitch" caused people to see QAnon information even when they weren't looking for it.
"When we first launched the Redirect Initiative for QAnon today there was a glitch that caused people to see information about this topic when they searched for unrelated terms. We've paused this Redirect while we fix the issue," Facebook said in a tweet Wednesday.
Facebook did not respond to questions about when and how the redirect would be re-implemented.
Under the update, Facebook planned to send people to resources from the Global Network on Extremism and Technology (GNET), the academic research arm of the Global Internet Forum to Counter Terrorism, an initiative created and funded by Facebook in partnership with other social media giants such as YouTube, Twitter, and Microsoft, which owns LinkedIn.
"As we continue to study the impact of our enforcement against QAnon, we'll partner with GNET to assess the impact of this Redirect Initiative, and we'll continue to reassess the list of terms that, when searched for on our platform, should direct people to these resources," Facebook said in its blog post on the update earlier Wednesday.
Facebook has taken various steps to crack down on QAnon, the unfounded far-right conspiracy theory — which holds that a cabal of Satan-worshiping, child-trafficking Democrats is plotting to oust President Donald Trump — following pressure from various groups including users, employees, advertisers, misinformation experts, and lawmakers.
The company has been criticized for its slowness in acting against QAnon, only announcing earlier this month that it would remove all pages, groups, and Instagram accounts that promoted QAnon. The ban, which Facebook said would be enacted gradually, comes after the platform announced over the summer that it had removed 790 QAnon Facebook groups.
BuzzFeed News also reported Tuesday that CEO Mark Zuckerberg plans to roll back many of the company's measures aimed at slowing the spread of misinformation once the upcoming elections are over.
Extremism researchers are tracking how the new ban will play out, as the movement has spread rapidly on Facebook and on Instagram, where many adherents are using "Save the Children" rhetoric in an attempt to further spread the movement's misguided focus on human-trafficking conspiracy theories.
Trump has repeatedly refused to denounce the conspiracy theory, questioning in a recent town hall whether it was a conspiracy theory at all and claiming its adherents were fighting pedophilia.