Facebook moderators have PTSD-like symptoms from horrifying and violent images, fringe content

One moderator, whom The Verge identifies as Chloe, knows that section 13 of the Facebook community standards prohibits videos that depict the murder of one or more people.

"When Chloe explains this to the class, she hears her voice shaking," The Verge reports, adding that she later leaves the room and cries so hard she can barely breathe.

Facebook, which has faced criticism from all corners for its content moderation mistakes and for the massive rulebook that guides its moderators, had more than 30,000 employees working on safety and security by the end of last year. In the face of a never-ending firehose of content, moderators are expected to maintain a 95 percent accuracy rate while reviewing more than 1,000 posts per week to determine whether they violate Facebook's community standards.

The Verge's report, which is based on interviews with a dozen former and current Cognizant employees, depicts a soul-crushing, morbid environment where workers joke about self-harm, do drugs on the job, develop severe anxiety or have panic attacks because of the horrifying content they're forced to view. Most of the moderators interviewed quit after one year.

In addition, moderators told the tech news site that some colleagues have even embraced the fringe, conspiracy-laden views of the memes and posts they're forced to view each day.

Both Cognizant and Facebook, which is led by CEO Mark Zuckerberg, pushed back on some aspects of The Verge's reporting.

At a later stage of the reporting, Facebook allowed The Verge's reporter to visit the Phoenix site after telling the publication that the moderators' experiences don't reflect those of most contractors, either in Phoenix or worldwide. New positive-message posters had been put up, and several content moderators who spoke to The Verge expressed satisfaction with their jobs and how they're treated, saying that the most graphic, violent content makes up only a small fraction of what they review.

A former contract content moderator sued Facebook in September, claiming that her work for the tech giant left her with PTSD.

Original article
Author: Christopher Carbone


Christopher Carbone has recently written five articles on similar topics, including:
  1. "Facebook has permanently banned several far-right or hate figures and organizations including Nation of Islam leader Louis Farrakhan, Infowars host Alex Jones, Laura Loomer, and Milo Yiannopoulos, for being "dangerous" -- a signal that the social network is enforcing its policies against hate speech more intensely". (May 2, 2019)
  2. "Facebook, a massive and lucrative platform built on engagement, is now going to start hiding likes. But don't fret, it's only for users in Australia -- so far". (September 28, 2019)
  3. "Facebook has suspended tens of thousands of apps that have in some way mishandled user data as the company faces a range of U.S. investigations and potential regulatory actions". (September 21, 2019)
  4. "All of Facebook's family of apps are down". (April 14, 2019)
  5. "Facebook is pushing back hard against calls from several Democratic lawmakers and its co-founder Chris Hughes to break up the embattled tech giant". (May 13, 2019)