Facebook said Thursday it removed more than 1 million groups over the past year for breaking the platform’s rules, and took down more than 13 million posts for violating policies on organized hate and hate speech.

This is the first time Facebook has released statistics on its enforcement of community standards inside groups. Facebook’s groups have been targeted by critics in recent years for their roles as hubs for political extremism, conspiracy theories and misinformation. The data came from an internal review and was not audited by outside experts, the company said in a blog post.

It said around 9 in 10 pieces of content removed from groups were found by the company’s automated systems.

Facebook pivoted toward groups in 2017, with a goal of connecting 1 billion people in “meaningful” groups. But over the last year, it has struggled with how, and to what extent, to moderate its tens of millions of mostly private groups.

Over the summer, Facebook removed hundreds of groups themed around the “Boogaloo” anti-government extremist movement. It also removed or restricted thousands of groups and pages dedicated to armed militias and the QAnon conspiracy theory, after an internal investigation uncovered thousands of such groups and pages with millions of members and followers.

Facebook also announced a series of small changes to how it will enforce its existing rules and recommend groups, in an effort “to reduce harmful content and misinformation.” Administrators and moderators of groups taken down for breaking rules will no longer be able to create any new groups for 30 days, a move seemingly meant to stop users from evading group bans. The change doesn’t account for users who create backup groups in anticipation of bans, a tactic widely used by anti-vaccination groups, QAnon supporters and other extremist groups.

Facebook will begin archiving groups without admins, a move that will likely affect groups that haven’t been in use for some time. As part of this action, it has been proactively alerting individual users in groups without administrators, suggesting they fill the role.

The company will also stop its algorithm from recommending health groups, stating in the post that while “health groups can be a positive space for giving and receiving support during difficult life circumstances,” it is “crucial that people get their health information from authoritative sources.”

Facebook’s definition of health groups was vague. “We use machine learning to determine groups that post a significant amount of health-related content including medical advice, and remove those groups from our recommendations surfaces,” company spokesperson Leonard Lam said.

Health groups have long been criticized as hubs for dangerous misinformation about the coronavirus and vaccines, as well as harmful fake cures and scams.

This article was originally published at NBC.