Stanford researchers find Mastodon has a massive child abuse material problem
while1malloc0 @beehaw.org
While the study itself is a good read and I agree with its conclusion—Mastodon, and decentralized social media in general, need better moderation tools—it's hard not to read the Verge headline as misleading. One of the study authors gives more context here: https://hachyderm.io/@det/110769470058276368. Most of the hits came from a large Japanese instance that almost no one federates with; the author even notes that the blunt instrument most Mastodon admins reach for is to blanket-defederate from instances hosted in Japan, because Japan's laws around CSAM are laxer than those in the US. But the headline seems to imply that places like mastodon.social have a giant seedy underbelly rife with abuse material. I suppose that's a marketing problem for federated software in general.