The Content Moderation Report: Social platforms are facing a massive content crisis — here’s why we think regulation is coming and what it will look like

Executive Summary

Content moderation has become a top priority for social media networks amid rising public and political scrutiny, and we expect regulators around the world to intensify their push to control online content moderation.

Three Key Questions This Report Will Answer:

  • How do social media companies currently moderate content on their respective platforms, and why is it so challenging to do well?
  • How do social media users themselves believe social media companies should police harmful or deceptive content on their platforms?
  • What solutions have been proposed by outside stakeholders, including regulators and advertiser groups?

What's in This Report?

In this report, Business Insider Intelligence analyzes one of the most pressing issues currently facing social platforms — content moderation — to lay out how we expect the debate to evolve and why we think regulation will come soon. This report draws on both third-party data and proprietary data collected from our 2019 Digital Trust Survey, an online survey of 1,974 respondents from Business Insider's proprietary panel, fielded from April 23 to May 16, 2019. Respondents to our Digital Trust Survey tend to be younger, male, affluent, North American, and early adopters of tech.

Audrey Schomer, Daniel McCarthy
