‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

    • TORFdot0@lemmy.world · 10 months ago

      Not all nudity is sexual, but there is no non-sexual reason to use AI to undress someone without consent.

      • Eezyville@sh.itjust.works · 10 months ago

        The question of consent is something I’m trying to figure out. Do you need consent to alter an image that is available in a public space? What if it was you who took the picture of someone in public?

    • Pyr_Pressure@lemmy.ca · 10 months ago (edited)

      Just because something shouldn’t be doesn’t mean it won’t be. This is reality, and we can’t just wish something to be true. You saying it doesn’t really help anything.

      • lolcatnip@reddthat.com · 10 months ago (edited)

        Whoooooosh.

        In societies that have a healthy relationship with the human body, nudity is not considered sexual. I’m not just making up fantasy scenarios.

            • mossy_@lemmy.world · 10 months ago

              You caught me, I’m an evil villain who preys on innocent lemmings for no reason at all