• Bob@lemmy.world · 1 year ago

    First of all, I want to make it clear that I don’t agree with this defederation; if the models are verified adults, then there is no problem.

    That said, as a Mastodon instance admin, I wanna explain something to y’all. CSAM is one of those things that you do not want to take your chances with as an admin. Beyond the obvious fact that it’s vile, even having that shit cached on your server can potentially lead to very serious legal trouble. I can see how an admin might choose to defederate: even if all the models are verified right now, something could still slip through the cracks later (pun not intended, but I’ll roll with it).

    My instance defederates a bunch of Japanese artist instances like Pawoo because of this. All it takes is one user crossing the line, one AI-generated image that looks too real.
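
    (If you’re wondering what “defederating” actually involves on the admin side, it’s basically one domain block per remote instance. Here’s a rough sketch using Mastodon’s admin API; it assumes Mastodon 4.x and a token with admin scopes, and the instance URL, token, and blocked domain are all placeholders, so don’t read it as the exact way any particular admin does it.)

    ```python
    # Rough sketch: suspend (defederate from) a remote instance via Mastodon's
    # admin API. Assumes Mastodon 4.x and an access token with admin scopes;
    # the instance URL, token, and blocked domain below are all placeholders.
    import requests

    INSTANCE = "https://example.social"    # your own instance (placeholder)
    TOKEN = "YOUR_ADMIN_ACCESS_TOKEN"      # placeholder admin token

    resp = requests.post(
        f"{INSTANCE}/api/v1/admin/domain_blocks",
        headers={"Authorization": f"Bearer {TOKEN}"},
        data={
            "domain": "blocked.example",     # placeholder domain to defederate from
            "severity": "suspend",           # full defederation, not just silencing
            "reject_media": "true",          # don't cache any of their media locally
            "private_comment": "CSAM risk",  # note visible to other mods only
        },
    )
    resp.raise_for_status()
    print(resp.json())
    ```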

    Aside from all that, there’s also a lot of pressure being put on many instance admins to outright ban users and defederate instances that post or allow loli/shota artwork as well. You’re quickly labeled a pedophile if you don’t do it. A lot of people consider fake CSAM to be just as bad, so it’s possible that the other admin felt that way.

    I’m more lenient on loli/shota as long as it’s not realistic, because I understand that it’s a cultural difference and, generally speaking, Japanese people don’t see it the way we do. I don’t ban stuff just because I think it’s gross; I just don’t look at it.

    Anyway, what I’m trying to say, I guess, is that being an admin is hard and there’s a lot of stuff y’all don’t know about, so disagree with that person if you want (I do too), but keep in mind that these decisions don’t come easy and nobody likes to defederate.

    EDIT: here’s a Mastodon thread about the CSAM problem in the fediverse if you’d like to learn more.

      • Bob@lemmy.world · 1 year ago

        Well yeah I’m not like defending them or anything. I just kind of understand where they’re coming from too.

          • Bob@lemmy.world · 1 year ago

            The problem is that if it’s hard to tell at a glance, there’s no way to know if actual CSAM gets uploaded there in the future. So what it boils down to is, is it worth the risk? That admin says no, it isn’t, so they defederate.

            My Mastodon instance defederates pretty much any instance that allows sexually explicit or suggestive artwork or photos of people who look underage. It’s just not worth it.

              • Bob@lemmy.world · 1 year ago

                I can’t tell if you’re trying to be funny or not, but I’ll answer anyway.

                There’s a difference between federating with instances that disallow any pornography featuring models/characters/etc. who look underage, and federating with instances that allow that type of material. Actual CSAM will be immediately obvious and handled very quickly on the former, but not necessarily on the latter.

                It’s pretty much standard practice for Mastodon/*key/whatever admins to defederate instances that allow lolicon/shotacon and anything else like that. There are curated block lists out there and everything; we’ve been doing it for years while still federating, and we’re doing just fine.
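
                (For anyone curious what “curated block lists” means in practice: usually it’s just a shared file of domains plus how severely to block each one, and applying it is a loop over the same admin endpoint. Very rough sketch below; it assumes Mastodon 4.x’s admin API, and the CSV columns and filename are made up for illustration, since real curated lists vary in format.)

                ```python
                # Very rough sketch: apply a shared/curated domain block list via
                # Mastodon's admin API (assumes Mastodon 4.x). The CSV layout
                # (domain,severity) and the filename are made up for illustration.
                import csv
                import requests

                INSTANCE = "https://example.social"   # your own instance (placeholder)
                TOKEN = "YOUR_ADMIN_ACCESS_TOKEN"     # placeholder admin token

                with open("blocklist.csv", newline="") as f:
                    for row in csv.DictReader(f):    # expects columns: domain, severity
                        resp = requests.post(
                            f"{INSTANCE}/api/v1/admin/domain_blocks",
                            headers={"Authorization": f"Bearer {TOKEN}"},
                            data={
                                "domain": row["domain"],
                                "severity": row.get("severity", "suspend"),
                            },
                        )
                        print(row["domain"], resp.status_code)
                ```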