Facebook owner Meta has opened up more about the amount of bullying and harassment on its platforms amid pressure to increase transparency.
For the first time, the tech giant has disclosed the prevalence of bullying and harassment content in its Community Standards Enforcement Report.
Between July and September, the firm said the prevalence of such content, which it described as “one of the most complex issues”, was 0.14-0.15 per cent on Facebook and 0.05-0.06 per cent on Instagram.
“This means bullying and harassment content was seen between 14 and 15 times per every 10,000 views of content on Facebook and between five and six times per 10,000 views of content on Instagram,” explained Meta’s global head of safety, Antigone Davis, and product management director Amit Bhattacharyya.
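As a quick sanity check of the conversion Meta describes, the quoted percentages translate into views per 10,000 as follows (a simple illustration of the arithmetic, not an additional figure from the report):

$$0.14\% \times 10{,}000 = 14, \qquad 0.15\% \times 10{,}000 = 15$$

The same conversion gives Instagram's 0.05-0.06 per cent as five to six views per 10,000.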
Meta Community Standards Enforcement Report, Third Quarter 2021 https://t.co/jjanAT9SjR

— Meta Newsroom (@MetaNewsroom) November 9, 2021
The company, which recently changed its business name to Meta, said tracking prevalence over time will show how well it is doing at reducing the problem across its platforms, giving it a benchmark to work against.
In the latest quarterly report, Meta said it removed 9.2 million pieces of bullying and harassment content on Facebook, though only 59.4 per cent was found and removed proactively, before a person had reported it.
On Instagram, it removed 7.8 million pieces of content, with 83.2 per cent taken down proactively.
The firm said detecting bullying and harassment is a “unique challenge” and “one of the most complex issues to address” because of context.
Its latest release comes after whistleblower and former Facebook employee Frances Haugen recently told MPs the firm is “very good at dancing with data”.