Mark Zuckerberg, in a Facebook post (included below), said that in order to build a “safe community,” the company must make violent videos easier to report. The 3,000 new employees will be added to a current global “community operations team” of 4,500 over the next year.
If “community operations” sounds like a euphemism, that’s because it is. The content moderators who keep the vilest material off the internet are increasingly located in the Philippines, and they’re tasked with almost unbearable work.
“Companies like Facebook and Twitter rely on an army of workers employed to soak up the worst of humanity in order to protect the rest of us,” Adrian Chen reported for Wired in 2014. “And there are legions of them—a vast, invisible pool of human labor.”
Within a year, there will be 3,000 more.
This article appears in the May 3–9, 2017, issue.