By John Gruber
David Thiel and Renée DiResta, announcing their report for Stanford's Internet Observatory investigating child sexual abuse material on Mastodon servers:
Analysis over a two-day period found 112 matches for known child sexual abuse material (CSAM) in addition to nearly 2,000 posts that used the 20 most common hashtags which indicate the exchange of abuse materials. The researchers reported CSAM matches to the National Center for Missing and Exploited Children.
The report finds that child safety challenges pose an issue across decentralized social media networks and require a collective response. Current tools for addressing child sexual exploitation and abuse online — such as PhotoDNA and mechanisms for detecting abusive accounts or recidivism — were developed for centrally managed services and must be adapted for the unique architecture of the Fediverse and similar decentralized social media projects.
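PhotoDNA itself is proprietary, but the underlying idea (comparing each uploaded image against a list of hashes of known abuse material supplied by child-safety organizations) is easy to sketch. Here's a rough illustration using the open-source imagehash library's perceptual hashing as a stand-in for PhotoDNA's actual algorithm; the hash-list file and the match threshold are placeholders I made up for the example:

```python
# Sketch only: PhotoDNA is proprietary, so this substitutes the open-source
# imagehash library's perceptual hash. The hash-list filename and the
# distance threshold below are hypothetical.
import imagehash
from PIL import Image

# One hex-encoded perceptual hash per line, as a child-safety org might supply.
with open("known_material_hashes.txt") as f:
    known_hashes = {imagehash.hex_to_hash(line.strip()) for line in f if line.strip()}

def matches_known_material(image_path: str, max_distance: int = 5) -> bool:
    """Return True if the image is a near-duplicate of any known hash."""
    candidate = imagehash.phash(Image.open(image_path))
    # Subtracting two ImageHash objects yields their Hamming distance;
    # a small distance means the images are visually near-identical.
    return any(candidate - known <= max_distance for known in known_hashes)
```

A real deployment would hook something like this into the media-upload pipeline and report matches to NCMEC, which is exactly the kind of plumbing the report says the Fediverse currently lacks.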
Their report is interesting and nuanced, and points to aspects of the problem you might not have considered. For example, tooling:
Administrative moderation tooling is also fairly limited: for example, while Mastodon allows user reports and has moderator tools to review them, it has no built-in mechanism to report CSAM to the relevant child safety organizations. It also has no tooling to help moderators in the event of being exposed to traumatic content — for example, grayscaling and fine-grained blurring mechanisms.
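That last point is more tractable than it sounds: grayscaling and blurring are one-liners in any image library, and the hard part is wiring them into the moderation interface. Here's a rough sketch of the kind of preview generation they're describing, using Pillow; the filenames and blur radius are my own placeholders, not anything Mastodon ships:

```python
# Sketch only: generate a desaturated, blurred, shrunken preview of a
# reported attachment so a moderator isn't exposed to it at full fidelity.
# Filenames and the blur radius are placeholders.
from PIL import Image, ImageFilter, ImageOps

def moderation_preview(src_path: str, dst_path: str, radius: int = 12) -> None:
    img = Image.open(src_path)
    preview = ImageOps.grayscale(img)                           # drop color
    preview = preview.filter(ImageFilter.GaussianBlur(radius))  # heavy blur
    preview.thumbnail((400, 400))                               # shrink in place
    preview.save(dst_path)

moderation_preview("reported_attachment.jpg", "reported_attachment_preview.jpg")
```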
I cannot agree with the headlines regarding this report.
Every instance of CSAM is a heinous crime. But it's impractical to think that any large-scale social network could be utterly free of CSAM, or CSAM-adjacent material. Words like "rife", "massive", and "major" do not, to me, fairly describe the report's findings. My conclusion is that while Mastodon server admins can do a better job — and seem sorely in need of better content moderation tooling for handling CSAM — the overall frequency of such material on the top 25 instances is lower than I expected, especially given the headlines.
(I also suspect, simply through gut feeling, that much if not most CSAM in the fediverse occurs on smaller fly-by-night instances, not the big public ones which the Stanford study examined.)
★ Wednesday, 9 August 2023