A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students – also teen girls – who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.

  • Snot Flickerman@lemmy.blahaj.zone · 11 months ago

    Maybe it's just me, but this is why I think this is a bigger issue than just Hollywood.

    The rights to famous people’s “images” are bought and sold all the time.

    I would argue that the entire concept should be made illegal. Others can only use your image with your explicit permission and your image cannot be “owned” by anyone but yourself.

    The fact that passing a law like this isn't a priority means this will only get worse, because we already have a society and laws that don't respect our right to control our own image.

    A law like this would also sidestep all the questions about youth and sex and instead make it a case of misuse of someone else's image. In this case, altering the image to make it seem real could even be considered defamation. They defamed her by making it seem like she took nude photos of herself to spread around.

    • Dark Arc@social.packetloss.gg · 11 months ago

      There are genuine reasons not to give people sole authority over their image though. “Oh that’s a picture of me genuinely doing something bad, you can’t publish that!”

      Like, we still need to be able to have a public conversation about (especially political) public figures and their actions as photographed.

      • Snot Flickerman@lemmy.blahaj.zone · 11 months ago

        Yeah, I'm not proposing a law under which you can't be held accountable for your actions. Any actions you take as an individual are things you do that shape your image, and you are in control of them. People using photographic evidence to prove you did those things is not a misuse of your image.

        Making fake images out of whole cloth is.

        The question of whether this technology will make such evidence untrustworthy is another conversation that, sadly, I don't have time for right this moment.

      • afraid_of_zombies@lemmy.world · 11 months ago

        If you have a picture of someone doing something bad, you really should be talking to law enforcement, not Faceboot. And if it isn't so bad that it's criminal, I wonder why it's your concern.

    • Zetta@mander.xyz · 11 months ago

      That sounds pretty dystopian to me. Wouldn’t that make filming in public basically illegal?

      • ParsnipWitch@feddit.de · 11 months ago

        In Germany it is illegal to take photos or videos of people who are identifiable (faces visible or close-ups) without asking for permission first, with the exception of public events, as long as you do not focus on individuals. It doesn't feel dystopian at all, to be honest. I'd rather have it that way than end up on someone's stupid vlog or whatever.

    • CleoTheWizard@lemmy.world · 11 months ago

      The tools used to make these images can largely be ignored, as can the vast majority of what AI creates of people. Fake nudes and doctored photos have been possible for a long time now. The main way we deal with them is to go after large distributors of that content.

      When it comes to younger people, the penalty for doing this should be pretty heavy. But it's the same as distributing real images of people, photos that you don't own. I don't see how this is any different, or why we should treat it any differently.

      I agree with your defamation point. People in general, and young people especially, should be able to go after bullies and these image distributors for damages.

      I think this is a giant mess that is going to upend a lot of what we think about society, but the answer isn't to ban the tools or to make it illegal to use them however you want. The solution is the same as the ones we've already created, just applied with more sensitivity.