‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • Eezyville@sh.itjust.works · 11 up, 1 down · 10 months ago

    I agree with you about nudity being an issue, but I think the real problem is this app being used on children and teenagers, who aren’t used to being sexualized and aren’t supposed to be.

      • TORFdot0@lemmy.world · 13 up, 1 down · 10 months ago

        Not all nudity is, but there is no non-sexual reason to use AI to undress someone without consent.

        • Eezyville@sh.itjust.works · 3 up · 10 months ago

          The question of consent is something I’m trying to figure out. Do you need consent to alter an image that is available in a public space? What if you were the one who took the picture of someone in public?

      • Pyr_Pressure@lemmy.ca · 1 up · edited · 10 months ago

        Just because something shouldn’t be doesn’t mean it won’t be. This is reality, and we can’t just wish something to be true. You saying it doesn’t really help anything.

        • lolcatnip@reddthat.com · 2 up, 4 down · edited · 10 months ago

          Whoooooosh.

          In societies that have a healthy relationship with the human body, nudity is not considered sexual. I’m not just making up fantasy scenarios.

              • mossy_@lemmy.world · 2 up, 2 down · 10 months ago

                You caught me, I’m an evil villain who preys on innocent lemmings for no reason at all

    • deft@ttrpg.network · 6 up, 1 down · 10 months ago

      Fully agree, but I do think that’s more an issue of psychology and trauma in our world. Children being nude should not be a big deal; they’re kids, you know?

      • Eezyville@sh.itjust.works · 6 up · 10 months ago

        It shouldn’t be a big deal if they choose to be nude somewhere that is private for them and where they’re comfortable. The people who are using this app to make someone nude aren’t really asking for consent. And that brings up another issue: consent. If you have images of yourself posted publicly, then is consent needed to alter those images? I don’t know, but I don’t think so, since it’s public domain.