By John Gruber
Casey Newton, in a fantastic piece for The Verge, “The Secret Lives of Facebook Moderators in America”:
The video depicts a man being murdered. Someone is stabbing him, dozens of times, while he screams and begs for his life. Chloe’s job is to tell the room whether this post should be removed. She knows that section 13 of the Facebook community standards prohibits videos that depict the murder of one or more people. When Chloe explains this to the class, she hears her voice shaking.
Returning to her seat, Chloe feels an overpowering urge to sob. Another trainee has gone up to review the next post, but Chloe cannot concentrate. She leaves the room, and begins to cry so hard that she has trouble breathing.
No one tries to comfort her. This is the job she was hired to do. And for the 1,000 people like Chloe moderating content for Facebook at the Phoenix site, and for 15,000 content reviewers around the world, today is just another day at the office.
Just a stunning piece of writing and reporting. Kudos to Newton and The Verge.
I particularly enjoyed this tidbit, where Facebook’s own moderators are bitten by an algorithmic (as opposed to chronological) feed:
The fourth source is perhaps the most problematic: Facebook’s own internal tools for distributing information. While official policy changes typically arrive every other Wednesday, incremental guidance about developing issues is distributed on a near-daily basis. Often, this guidance is posted to Workplace, the enterprise version of Facebook that the company introduced in 2016. Like Facebook itself, Workplace has an algorithmic News Feed that displays posts based on engagement. During a breaking news event, such as a mass shooting, managers will often post conflicting information about how to moderate individual pieces of content, which then appear out of chronological order on Workplace. Six current and former employees told me that they had made moderation mistakes based on seeing an outdated post at the top of their feed. At times, it feels as if Facebook’s own product is working against them. The irony is not lost on the moderators.
The bottom line: If this is what it takes to moderate Facebook, it’s an indictment of the basic concept of Facebook itself. In theory, it sounds like a noble idea to let everyone in the world post whatever they want and have it be connected and amplified to like-minded individuals.
In practice, it’s a disaster.
The problem isn’t the “everyone can post whatever they want” — that’s the nature of the internet, and I truly believe it has democratized communication in a good way. The disastrous part is the “be connected and amplified to like-minded individuals”. That’s the difference between Facebook (and to some degree, YouTube and Twitter) and things like plain old web forums. Facebook is full of shit about most of what they actually do, but one part of their self-description that is true is that they really do connect people. The problem is that some people shouldn’t be connected, and some messages should not be amplified.
There is something fundamentally wrong with a platform that — while operating exactly as designed — requires thousands of employees to crush their own souls.