Dominating the global social media market with a share of roughly 68 to 75%, Facebook is the undisputed leader in online interaction. Serving nearly 3 billion users, almost 40% of the world's population, the company provides a digital platform where people can connect, communicate, and stay in touch regardless of the distances or circumstances that separate them.
Operating such a wide-reaching and versatile platform comes with a trade-off: not every user is content to post family and career updates, photos of pets, or videos of important events. A portion of Facebook users regularly post grisly and toxic content that violates community guidelines. Ultimately, somebody has to sweep the site of this digital detritus, which means personally experiencing content that can take a profound toll on their mental health.
For years, the professional services firm Accenture has supplied thousands of full-time employees around the world, working eight-hour shifts, to sort through the worst of what is shared on Facebook. The arrangement is a lucrative one, with contracts worth at least $500 million a year, according to an examination by The New York Times. But the firm's high profits come at a high cost to the people who do the work: employees have reported depression, anxiety, and paranoia as a result of their working conditions.
While Facebook has spread the work among as many as 10 different companies and increasingly leveraged artificial intelligence to remove more than 90% of objectionable material posted to the site, humans are still needed to judge the posts that the software does not catch. These individuals have had to grapple with high performance requirements and rapidly changing, sometimes conflicting content moderation standards, as well as trauma induced by repeatedly viewing disturbing content.
Both companies have come under fire for this practice. Facebook claims it is not liable for damages because the workers are employed by firms like Accenture, while Accenture claims it offers mental health support and discloses the mental health hazards to employees. Former workers dispute these claims in class-action lawsuits.
Ultimately, the two companies are too intertwined to abandon the arrangement, despite negative media coverage and the harm to workers. Facebook requires extensive moderation of its content, and that need generates significant revenue for the firms that supply the moderators, firms that ultimately cannot afford to walk away from such a well-paying, highly influential client.