Facebook removed more than 1 million groups and 13.5 million pieces of content in groups for hate speech in the past year, the company said today. The figures were released alongside new policies meant to “keep groups safe,” including a new commitment to limit the reach of health-related groups on the platform.
It’s not clear how many groups were removed for hate speech previously. This is the “very first time” Facebook has broken out “safety stats” for groups, Facebook spokesperson Leonard Lam confirmed to Adweek. Tom Alison, vp of engineering, wrote in a blog post that Facebook detected about 90% of the removed content “proactively,” before users reported it.
Facebook has come under fire for misinformation and hate speech on its platform, but its groups—even private ones—have been a particular hotbed for harmful content. In July, more than 1,100 advertisers boycotted the platform over its policies and inaction on hate speech, an effort led by civil rights groups including the NAACP and ADL.
On Thursday, Alison outlined new restrictions on Facebook groups in the blog post:
- Administrators and moderators of groups taken down for rule violations won’t be able to create any new groups for a “period of time”
- Members who have broken site rules will need admin approval for their posts in any group for the next 30 days
- Facebook will try to fill vacant admin roles in groups by suggesting the role to members, taking users’ past violations into consideration
- Facebook will archive old groups without admins in the coming weeks
- The platform will no longer recommend “health groups” to users
- For QAnon, U.S.-based militia organizations and anarchist groups, Facebook will remove them from recommendations, restrict them from search and “soon” reduce their content in the news feed
Following Twitter’s lead, Facebook cracked down on QAnon content in August but did not ban the conspiracy network outright from its platform. Under pressure from advertisers, civil rights groups and regulators this summer, Facebook also took action against the boogaloo movement, removed Trump campaign ads featuring Nazi iconography and released a civil rights audit. Its QAnon purge removed 790 groups from the platform.
Facebook has also cracked down on health misinformation, particularly since the onset of the Covid-19 pandemic, and introduced a pinned panel of reliable coronavirus information. But multiple outlets recently reported on the rise of anti-vaxxer content on the platform, including in groups.
The company said it takes a “remove, reduce, inform” approach to harmful content in its groups: removing groups that violate its rules, reducing the distribution of groups that spread misinformation, and informing people after the fact when they have encountered misinformation on the site.