Facebook on Thursday said it is cracking down on private groups where hate or misinformation is shared among members.
The move comes amid a wider crackdown on malicious and false content at the social networking giant, which has led people to turn to private groups of like-minded members where they can share content hidden from the wider Facebook community.
"People turn to Facebook Groups to connect with others who share their interests, but even if they decide to make a group private, they have to play by the same rules as everyone else," Facebook vice president of engineering Tom Alison said in a blog post.
Alison said Facebook's community standards "apply to public and private groups, and our proactive detection tools work across both."
Facebook uses artificial intelligence to automatically scan posts, even in private groups, taking down pages that repeatedly break its rules or that are set up in violation of the social network's standards.
More than a million groups have been taken down in the past year for violating hate policies, according to Alison.
In the past year, Facebook has removed about 1.5 million pieces of content in groups for violating its policies on organized hate, with 91 percent of those posts found by automated software systems, according to Alison.
Over that same period, the leading social network has taken down about 12 million pieces of content in groups for violating policies on hate speech, 87 percent of which was found proactively.
Facebook last month said it had removed hundreds of groups tied to the far-right QAnon conspiracy theory and imposed restrictions on nearly 2,000 more as part of a crackdown on groups stoking violence.
The moves, which were made across both Facebook and Instagram, were against accounts tied to "offline anarchist groups that support violent acts amidst protests, US-based militia organizations and QAnon," the social media platform said in a blog post.
Under rules tightened on Thursday, administrators or moderators of groups taken down for rule-breaking will be temporarily blocked from forming new groups on Facebook.
People flagged for violating the social network's standards in groups will need moderator or administrator approval for any new posts for 30 days, and if approved posts continue to break the rules, the entire group will be removed, according to Alison.
Facebook will also start "archiving" groups that have been without administrators for a long time, meaning they still exist but don't appear in searches and members can't post anything.
And, to promote getting information from authoritative sources, Facebook will no longer show health-themed groups in recommendation results.
Facebook has been struggling with hoaxes and misinformation about the coronavirus pandemic, seeking to give users well-sourced information about the health emergency.