• Bob@lemmy.world · 1 year ago

      Well yeah, I’m not like defending them or anything. I just kind of understand where they’re coming from too.

        • Bob@lemmy.world · 1 year ago

          The problem is that if it’s hard to tell at a glance, there’s no way to know whether actual CSAM gets uploaded there in the future. So what it boils down to is: is it worth the risk? That admin says no, it isn’t, so they defederate.

          My Mastodon instance defederates pretty much any instance that allows sexually explicit or suggestive artwork or photos of people who look underage. It’s just not worth it.

            • Bob@lemmy.world · 1 year ago

              I can’t tell if you’re trying to be funny or not, but I’ll answer anyway.

              There’s a difference between federating with instances that disallow any pornography featuring models/characters/etc. who look underage, and federating with instances that allow that type of material. Actual CSAM will be immediately obvious and handled very quickly on the former, but not necessarily on the latter.

              It’s pretty much standard practice for Mastodon/*key/whatever admins to defederate instances that allow lolicon/shotacon and anything else like that. There are curated block lists out there and everything; we’ve been doing it for years while still federating, and we’re doing just fine.