
Facebook says levels of harmful content taken down remain ‘consistent’

Meta has published its latest community standards enforcement report detailing actioned content from across Facebook and Instagram.

The prevalence of harmful content on Facebook and Instagram remained “relatively consistent” during the first three months of this year, the platforms’ parent company Meta said, as it published its latest community standards enforcement report.

According to the figures, there was an increase in the amount of spam content, and of violence and incitement content, removed from Facebook, as well as a rise in the amount of drug content removed from Instagram.

But Meta said the prevalence of harmful content had decreased slightly in some areas, including bullying and harassment, because of improvements to the company’s proactive detection technology.

The report said there had also been a slight increase in the prevalence of adult nudity and sexual activity content on Facebook compared with the last three months of 2021, which it said was due to “an increase in spam actors sharing large volumes of videos containing nudity that violated our policy”.

Meta said it removed more than 1.6 billion fake accounts during the first three months of 2022, a slight decrease on the 1.7 billion it removed in the final three months of 2021.

The amount of terrorism and organised hate content actioned on Facebook also increased compared with the previous quarter, with more than 2.5 million pieces of organised hate content and 16.1 million pieces of terrorist content taken down in the first three months of this year.

“Over the years we’ve invested in building technology to improve how we can detect violating content,” Meta vice president of integrity, Guy Rosen, said.

“With this progress we’ve known that we’ll make mistakes, so it’s been equally important along the way to also invest in refining our policies, our enforcement and the tools we give to users.”

Mr Rosen also said the company was ready to refine policies as needed when new content regulations for the tech sector are introduced.

The UK’s Online Safety Bill, currently making its way through Parliament, would introduce strict new rules around online harms for platforms such as Facebook and Instagram. The EU is working on its own regulation, and a similar approach is expected in the United States.

“As new regulations continue to roll out around the globe, we are focused on the obligations they create for us,” Mr Rosen said.

“So we are adding and refining processes and oversight across many areas of our work. This will enable us to make continued progress on social issues while also meeting our regulatory obligations more effectively.”