The social media giant said it would ban groups and accounts associated with the movement – as well as a variety of US-based militia and anarchist groups that support violence.
However, it has stopped short of a full ban, as users will still be allowed to post material supporting these groups provided they do not violate policies against hate speech, abuse and other provocations.
It comes on the same day that US President Donald Trump courted the support of those who put stock in the theory, saying: “I heard that these are people that love our country.”
QAnon groups have flourished on Facebook in recent months, and experts said social media has aided the rise of the fringe movement.
Twitter announced a similar crackdown recently and TikTok has banned QAnon altogether from its searches.
Google said it has removed tens of thousands of QAnon-related videos from its YouTube service and banned hundreds of channels for violating its policies, but it also does not ban QAnon outright.
The QAnon conspiracy theory is centred on the baseless belief that Donald Trump is waging a secret campaign against enemies in the “deep state” and a child sex trafficking ring run by satanic paedophiles and cannibals.
For more than two years, followers have pored over tangled clues purportedly posted online by a high-ranking government official known only as Q.
The conspiracy theory emerged in a dark corner of the internet but has recently crept into mainstream politics.
Mr Trump has retweeted QAnon-promoting accounts and its followers flock to his rallies wearing clothes and hats with QAnon symbols and slogans.
Facebook said it will only remove groups and accounts outright if they discuss potential violence, including in veiled language.
It said it is not banning QAnon outright because the group does not meet the criteria necessary for the platform to designate it a “dangerous organisation”.
But it is expanding this policy to address the movement because it has “demonstrated significant risks to public safety”.
Experts, however, say this does not go far enough.
Ethan Porter, a professor of media and public affairs at George Washington University, said: “Facebook’s actions today may ultimately come to be viewed as ‘too little, too late’.
“It will probably make a dent. But will it solve the problem? Not at all. At this point, the most fervent QAnon believers are not only entrenched on the platform, but likely heading to the halls of Congress. Yet this may give them trouble with new recruits.”
The social network said it has removed over 790 groups, 100 pages and 1,500 ads tied to QAnon on Facebook and has blocked over 300 hashtags across Facebook and Instagram.
For militia organisations and those encouraging riots, including some who may identify as antifa, the company said it has removed over 980 groups, 520 pages and 160 ads from Facebook.
“These movements and groups evolve quickly, and our teams will follow them closely and consult with outside experts so we can continue to enforce our policies against them,” Facebook said.