A core issue Facebook has grappled with is how to moderate content that violates its terms and conditions, a thorny problem intertwined with questions of free speech and Facebook's role as a public forum. Facebook bans violent, sexually explicit, and generally offensive material. Such content is manually flagged by users and reviewed by employees, who determine whether a post violates specific terms of company policy. Twitter and YouTube employ similar systems to monitor content on their social networks.
Most of that labor is outsourced to third parties, and content moderators are paid far less than salaried Facebook employees. Annual pay for content moderators ranges from just $1,404 for those working in countries such as India and Bangladesh to $28,800 for third-party workers in the U.S. Facebook employees, by comparison, make $240,000 per year on average.
Facebook has struggled to scale its content moderation. In 2009, when the network had only 120 million monthly active users, it employed just 12 people to review every post flagged by users. Now, with a worldwide reach of about 2.3 billion users, the company employs around 15,000 workers to review and remove violent, sexually explicit, and offensive content from the platform.