A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students, also teenage girls, who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.

  • Dark Arc@social.packetloss.gg · 11 months ago

    There are genuine reasons not to give people sole authority over their image though. “Oh that’s a picture of me genuinely doing something bad, you can’t publish that!”

    Like, we still need to be able to have a public conversation about public figures (especially political ones) and their actions as photographed.

    • Snot Flickerman@lemmy.blahaj.zone · 11 months ago

      Yeah I’m not stipulating a law where you can’t be held accountable for actions. Any actions you take as an individual are things you do that impact your image, of which you are in control. People using photographic evidence to prove you have done them is not a misuse of your image.

      Making fake images whole cloth is.

      The question of whether this technology will make such evidence untrustworthy is another conversation that sadly I don’t have enough time for right this moment.

    • afraid_of_zombies@lemmy.world · 11 months ago

      If you have a picture of someone doing something bad you really should be talking to law enforcement, not Faceboot. If it isn't so bad that it is criminal, I wonder why it is your concern?