Tech workers at computers in a large room (Photo by Alex Kotliarskyi)

Facebook Moderators Are Not Who We Think

Companies offer terrible working conditions partly because they think AI will just take over soon

Complaints about the bias of Facebook moderators abound. One US Senator offers a proposal to make it easier to sue big-tech firms over political bias. But who really makes the day-to-day decisions anyway?

A recent death from a heart attack has shed light on those decision-makers' working conditions. Keith Utley, 42, a former Coast Guard officer, died at work at the Tampa site of Cognizant, a contractor that supplies the "bodies," so to speak, for Facebook moderation. The workplace atmosphere was so strained that managers were told not to discuss his death; many site employees did not learn of it until his father came to collect his belongings: "My son died here."

A piece in The Verge yesterday, based on the accounts of employees who broke non-disclosure agreements in the wake of Utley’s death, describes unsanitary working conditions and low pay but also illuminates the very nature of the work.

Shawn Speagle, who had worked for an online education company, was recruited at a job fair for what he believed would be a career in high tech. Assigned to comment moderation, he had to learn on the job:

For the six months after he was hired, Speagle would moderate 100 to 200 posts a day. He watched people throw puppies into a raging river, and put lit fireworks in dogs’ mouths. He watched people mutilate the genitals of a live mouse, and chop off a cat’s face with a hatchet. He watched videos of people playing with human fetuses, and says he learned that they are allowed on Facebook “as long as the skin is translucent.” He found that he could no longer sleep for more than two or three hours a night. He would frequently wake up in a cold sweat, crying.

Casey Newton, “Bodies in Seats” at The Verge

The job worsened Speagle's existing anxiety disorder, so he eventually quit; he is now pursuing a teaching certificate. It especially bothered him that law enforcement authorities never seemed to follow up on video evidence of clearly illegal activities reported to them.

Ironically, as Newton explains, Facebook’s bad reputation in this matter stems partly from good intentions: “In 2017, Facebook began opening content moderation sites in American cities including Phoenix, Austin, and Tampa. The goal was to improve the accuracy of moderation decisions by entrusting them to people more familiar with American culture and slang.”

Thus, people who have no idea how much distress the job really entails sign on under whatever working conditions local law allows.

But the lack of investment in better conditions stems partly from the belief that AI will just take over one day:

If you believe moderation is a high-skilled, high-stakes job that presents unique psychological risks to your workforce, you might hire all of those workers as full-time employees. But if you believe that it is a low-skill job that will someday be done primarily by algorithms, you probably would not.

Casey Newton, “Bodies in Seats” at The Verge

He spells it out: “… at the highest levels, human content moderators are viewed as a speed bump on the way to an AI-powered future.”

And if that doesn’t—and perhaps can’t—happen, what’s the backup plan? Lawsuits?

See also: Will Facebook’s new focus on community groups prevent abuses? When you look a little closer at the proposal, you will see that the answer is no (Russ White)

and

Facebook’s old motto was “Move fast and break things.” With the current advertising scandal, it might be breaking itself (Mind Matters News)


Mind Matters News

Breaking and noteworthy news from the exciting world of natural and artificial intelligence at MindMatters.ai.
