If they had offered that as the explanation, there would have been no drama.
Well yeah I’m not like defending them or anything. I just kind of understand where they’re coming from too.
Yeah, but on the other hand it is verifiably not CSAM
The problem is that if it’s hard to tell at a glance, there’s no way to know whether actual CSAM will get uploaded there in the future. So it boils down to: is it worth the risk? That admin says no, it isn’t, so they defederate.
My Mastodon instance defederates pretty much any instance that allows sexually explicit or suggestive artwork or photos of people who look underage. It’s just not worth it.
then why even federate at all? someone else could post CSAM at any time
I can’t tell if you’re trying to be funny or not, but I’ll answer anyway.
There’s a difference between federating with instances that disallow any pornography featuring models/characters/etc. who look underage, and federating with instances that allow that type of material. Actual CSAM will be immediately obvious and handled very quickly on the former, but not necessarily on the latter.
It’s pretty much standard practice for Mastodon/*key/whatever admins to defederate instances that allow lolicon/shotacon and anything else like that. There are curated block lists out there and everything; we’ve been doing it for years while still federating, and we’re doing just fine.
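If you’ve never looked at one, those curated block lists are usually just a CSV you import through the admin interface. Roughly what an entry looks like, going from memory of Mastodon’s domain-block export format (exact column names can vary between versions, and the domains here are placeholders):

```csv
#domain,#severity,#reject_media,#reject_reports,#public_comment,#obfuscate
example.badinstance,suspend,true,false,"allows loli/shota content",false
another.example,silence,true,false,"untagged NSFW art",false
```

Importing a list like that suspends or limits those domains in bulk, which is why “defederate and move on” has scaled fine for years without giving up federation with everyone else.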