Trauma a job hazard for Facebook content moderators

Worst. Job. Ever.

So-called “content moderators” on Facebook — tasked with vetting user posts that have been flagged for pornographic material, graphic violence or hate speech — are so traumatized by their work that they’ve resorted to smoking weed, spouting conspiracy theories and even having sex on the job, according to reports.

In one instance cited by a Monday article in the Verge, a job-training session in Phoenix included a trainee named “Chloe” playing a video for a group of fellow trainees that “depicts a man being murdered.”

“Someone is stabbing him, dozens of times, while he screams and begs for his life,” according to the tech blog. “Chloe’s job is to tell the room whether this post should be removed.”

The woman ended up having a panic attack and — like many of the 15,000 other contract workers used by Facebook to moderate graphic and disturbing content — experienced trauma symptoms for months, according to the 7,500-word exposé.

Even for moderators who are skilled at what they do, the job takes a toll. Workers at the site featured in the report were found having sex in the office lactation room, as well as bathroom stalls, stairwells and the parking garage — a phenomenon one former moderator called “trauma bonding.”

“You get really close to your coworkers really quickly,” she said. “If you’re not allowed to talk to your friends or family about your job, that’s going to create some distance.”

Facebook, which has been harshly criticized for being slow to stop the spread of misinformation on its platform — including during the 2016 presidential election — has tapped contractors such as Phoenix-based Cognizant and Accenture in Austin, Texas, to get it done.

These content moderators earn $28,800 a year — a fraction of the $240,000 in total annual compensation earned by the median Facebook employee — and are expected to perform their jobs with 95 percent accuracy.

“It’s a strain. I don’t know what I’m going to see,” one contractor at Accenture told Bloomberg. “I don’t have a problem with nudity — that’s what I signed up for — but then there are random beheadings.”

Employees fired for failing to meet accuracy targets routinely threaten their former coworkers, according to the report, prompting one supervisor to begin bringing a gun to work in case an interaction turns violent.

It’s not just videos of graphic violence that have to be vetted by the army of moderators; employees also must review photos and videos of child pornography and bestiality, as well as an endless stream of conspiracy theories.

One auditor, the Verge reports, now believes that the Earth is flat, while another has become convinced that 9/11 was a government conspiracy. A third moderator “has begun to question certain aspects of the Holocaust,” according to the report.

In a blog post responding to the Verge story, Facebook VP of Global Operations Justin Osofsky wrote that the social network takes the well-being of its moderators seriously, outlining the steps Facebook is taking to hold its partners accountable.

“We are committed to working with our partners to demand a high level of support for their employees; that’s our responsibility and we take it seriously,” Osofsky said.

Facebook shares finished the day up 1.7 percent, at $164.62.

https://nypost.com/2019/02/25/trauma-a-job-hazard-for-facebook-content-moderators/
