Facebook has, for the first time, released data on the prevalence of bullying and harassment on its platforms, including Instagram. The figures show that from July through September, such content was seen between 14 and 15 times per 10,000 content views on Facebook, and between five and six times per 10,000 views on Instagram.
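Prevalence here is a rate, not a count of posts: the estimated number of views of violating content per 10,000 views of content overall. A minimal sketch of that arithmetic (the function name and sample figures below are illustrative assumptions, not Meta's actual methodology):

def prevalence_per_10k(violating_views: int, total_views: int) -> float:
    """Estimated views of violating content per 10,000 content views."""
    return violating_views / total_views * 10_000

# Illustrative only: 1,450 violating views in a sample of 1,000,000 total views
# works out to 14.5 per 10,000 views, i.e. about 0.145% of all views.
print(prevalence_per_10k(1_450, 1_000_000))  # 14.5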

The disclosure by Facebook, now known as Meta, comes amid mounting concern that the company prioritizes profits over user safety. Advocacy groups, legislators, and former employees have raised alarms, most prominently former Facebook product manager turned whistleblower Frances Haugen, who leaked internal documents to The Wall Street Journal and other media outlets. Those documents detailed Facebook's struggles to address hate speech, mental health harms, and violence on its platforms.

Moderating bullying, harassment, and hate speech is difficult because online interactions are nuanced. Context is crucial: the same remark that counts as bullying between strangers may be harmless banter between close friends. During the third quarter, Facebook removed 9.2 million pieces of bullying and harassment content, and a further 7.8 million pieces were taken down from Instagram.

While Facebook has been relatively transparent about the prevalence of various forms of offensive content, former employees have questioned whether prevalence is the right measure, particularly for content that promotes extremism. Meta’s head of safety and integrity, Guy Rosen, acknowledged ongoing discussions about alternative metrics for evaluating platform safety.

To bolster transparency and accountability, Facebook has engaged business services and consulting firm EY to audit its metrics. The audit’s findings, covering the fourth quarter of the year, are slated for release in the spring of 2022. By the end of 2022, the company also intends to disclose information about content it leaves up on newsworthiness grounds.


FAQs

  1. What data did Facebook recently release for the first time?
    Facebook recently disclosed information on the prevalence of bullying and harassment on its platform, including Instagram.

  2. How frequently was bullying and harassment content observed on Facebook and Instagram during the period from July to September?
    During the specified period, instances of bullying and harassment were noted between 14 and 15 times per 10,000 views on Facebook and between five and six times per 10,000 views on Instagram.

  3. What concerns have been raised regarding Facebook’s approach to user safety?
    There are concerns that Facebook, now Meta, prioritizes profits over user safety, as highlighted by advocacy groups, legislators, and former employees.

  4. Who leaked internal documents exposing Facebook’s challenges with hate speech and violence on its platforms?
    Frances Haugen, a former Facebook product manager turned whistleblower, leaked the internal documents, which were subsequently obtained by The Wall Street Journal and other media outlets.

  5. How does understanding context play a role in moderating bullying and harassment online?
    The nuanced nature of online interactions means that context is crucial in determining whether certain remarks constitute bullying or harassment.

  6. How many pieces of bullying and harassment content did Facebook remove during the third quarter?
    Facebook removed 9.2 million pieces of bullying and harassment content during the third quarter, with an additional 7.8 million pieces taken down from Instagram.

  7. What concerns have been raised by former employees regarding Facebook’s focus on content prevalence metrics?
    Former employees have expressed concerns about the adequacy of prevalence metrics, particularly in addressing content that promotes extremism.

  8. Who is overseeing the audit of Facebook’s metrics to enhance transparency and accountability?
    Business services and consulting firm EY is conducting the audit of Facebook’s metrics, which is intended to enhance transparency and accountability.

  9. When will the findings of the audit covering the fourth quarter of the year be released?
    The audit findings, covering the fourth quarter of the year, are scheduled for release in the spring of 2022.

  10. What additional information does Facebook plan to disclose by the end of 2022 regarding content moderation?
    By the end of 2022, Facebook intends to report on content it leaves up on newsworthiness grounds, further enhancing transparency and accountability.

Summary

Facebook, now known as Meta, has for the first time quantified the prevalence of bullying and harassment on its platforms: during the third quarter, such content was seen 14 to 15 times per 10,000 views on Facebook and five to six times per 10,000 views on Instagram. The disclosure lands amid scrutiny over allegations that the company prioritizes profits over user safety, with internal documents leaked by a former employee underscoring its struggles against hate speech and violence online. In response, Facebook has commissioned EY to audit its metrics and plans to disclose information about content kept up on newsworthiness grounds, moves intended to strengthen transparency and accountability.