A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite limitations in the data and the difficulty of measuring deterrence, the chatbot showed promise in discouraging illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub’s UK site, with hopes that similar measures will be adopted across other platforms to create a safer internet environment.

  • YarHarSuperstar@lemmy.world · 8 months ago

    This covered a lot of my concerns and thoughts on the topic. I want these people to be able to seek help, and possibly even have a legal outlet that harms no one, not even someone who has to view that shit for a living, so maybe we get AI to do it? IDK. It’s complicated, but I believe it’s similar to having an addiction in some ways and should be treated as a health issue, assuming they haven’t hurt anyone and want help. This is coming from someone with health issues of my own, including addiction, and someone who is empathetic and sympathetic to the struggles of folks who are just trying to live better.

    • afraid_of_zombies@lemmy.world · 8 months ago

      I can’t even imagine the amount of money it would take for someone to pay me to watch and critique child porn for a living. I have literally been paid money in my life to fish a dead squirrel, which was making the whole place stink, out from underneath a trailer in July, and I would pick doing that professionally over watching that filth.