• JackGreenEarth@lemm.ee
    link
    fedilink
    English
    arrow-up
    35
    arrow-down
    14
    ·
    1 year ago

    You can’t stop them from being made; they’re the same kind of deepfakes people have been making all along. It’s important to note that they’re not photos of the actual people — they’re guesses made by an algorithm.

    • strider@feddit.nl
      link
      fedilink
      English
      arrow-up
      63
      arrow-down
      3
      ·
      1 year ago

      While you’re completely right, that’s hardly a consolation for those affected. The damage is done, even if it’s not actually real, because it will be convincing enough for at least some.

    • InternetTubes@lemmy.world
      link
      fedilink
      English
      arrow-up
      31
      arrow-down
      11
      ·
      edit-2
      1 year ago

      If governments can go after child porn, then they can go after the websites generating it and people distributing it.

      I’m sort of sick of services that can generate whatever bullshit people ask of them with zero oversight or control, especially when it involves deepfakes. Once deepfakes become real enough, societies will just race to distribute whichever deepfakes serve the prejudices of the times, and people will eat them up.

      It already happens in societies without deepfakes: even people who disagree with the mainstream still bend their perception of things toward the prejudices present in their society’s media — prejudices they don’t really become aware of until they try living outside of it for a while.

      Deepfakes will become like steroids for creating ideological bubbles once they can cross the uncanny valley.

      • Fal@yiffit.net
        link
        fedilink
        English
        arrow-up
        18
        arrow-down
        3
        ·
        1 year ago

        It’s almost like people are the problem and will use any excuse to do what they want. So yeah, let’s ban technology, even though as you said people find ways to be shitty anyway, because after all, won’t somebody think of the children?

        • rentar42@kbin.social
          link
          fedilink
          arrow-up
          4
          ·
          1 year ago

          Yeah, people are the root cause of almost all problems that people have to deal with.

          And we’ve been dealing with them for a long time and one way to deal with them is to develop norms and rules as a society (which at some point we decided to enshrine into laws).

          So no, it’s not that we need to “ban technology”. But a good first step is to say “hey, if you generate porn of someone in your class and distribute it to others in your school then that’s a pretty shitty thing to do”. Another good step is probably to try to get some consensus on that statement. And if enough people agree with this, then we can start thinking of putting some actual rules behind it.

          Societies have been able to handle these kinds of nuances for many different topics for a very long time. So stop pretending that it’s all just “oh, you all just want to ban the new stuff”. It might take a while to get it all worked out and some steps along the way will almost certainly be missteps, but it’s not like this always ends badly.

        • InternetTubes@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          ·
          1 year ago

          DRM and IP laws are basically bans on technology. Taken to the extreme, your logic would rule out any system of law at all. Things can be done.

          • Fal@yiffit.net
            link
            fedilink
            English
            arrow-up
            1
            ·
            1 year ago

            Of course they can be; that wasn’t the point. DRM and IP law are not examples to be held up as things to imitate, though.

      • Cethin@lemmy.zip
        link
        fedilink
        English
        arrow-up
        9
        ·
        1 year ago

        This stuff can be run locally. It’s not something that can be stopped by going after some service providing it. That may make it slightly less convenient to access, but anyone who wants it will still find it. Pandora’s box has been opened, and it can’t be closed.

        • Touching_Grass@lemmy.world
          link
          fedilink
          English
          arrow-up
          5
          arrow-down
          1
          ·
          1 year ago

          The goal isn’t to stop deepfakes of random people. It’s to limit regular people’s access to AI so it can be hoarded by select groups. Using threats against children to stir up the masses is the oldest play in history. The upper crust needs to make laws against how the rest of us use these tools.

          • Cethin@lemmy.zip
            link
            fedilink
            English
            arrow-up
            4
            ·
            1 year ago

            Sure, it’s illegal. But they can’t do anything about it unless you do something else wrong. I wish they could just magically detect where that content was, but they need a search warrant to find it. Talking about stopping this software will lead to nothing; sharing this content (real or generated) is where attention should be focused.

    • n0m4n@lemmy.world
      link
      fedilink
      English
      arrow-up
      18
      arrow-down
      1
      ·
      edit-2
      1 year ago

      The faces are not generated, and that is where the damage comes from. It targets the girls for humiliation by implying that they allowed the nudes to be taken of them. Depending on the location and circumstances, this could get the girls murdered — think of “honor killings” by fundamentalists. It makes them targets for further sexual abuse, too. Anyone distributing the photos is at fault, as are the people who made them.

      The problem goes deeper, though. We can never trust a photo as proof of anything, again. Let that sink in, what it means to society.

    • maegul (he/they)@lemmy.ml
      link
      fedilink
      English
      arrow-up
      20
      arrow-down
      6
      ·
      1 year ago

      To push back on your attempt to minimise what’s going on here …

      Yes, they’re not actually photos of the girls. But, nor is a photo of a naked person actually the same as that person standing in front of you naked.

      If being seen naked is unwanted and embarrassing etc, why should a photo of you naked be embarrassing, and, to make my point, what difference would it make if the photo is more or less realistic? An actual photo can be processed or taken under certain lighting or with a certain lens or have been taken some time in the past … all factors that lessen how close it is to the current naked appearance of the subject. How unrealistic can a photo be before it’s no longer embarrassing?

      Psychologically, I’d say it’s pretty obvious that the embarrassment of a naked image is that someone else now has a relatively concrete image in their minds of what the subject looks like naked. It is a way of being seen naked by proxy. A drawn or painted image could probably have the same effect.

      There’s probably some range of realism within which there’s an embarrassing effect, and I’d bet AI is very capable of getting in that range pretty easily these days.

      And while the technology is out there now, that doesn’t mean our behaviours with it are automatically acceptable. Society adapts to the uses and abuses of new technology, and it seems pretty obvious that we have yet to culturally curb the abuses of this one.

    • Rayspekt@kbin.social
      link
      fedilink
      arrow-up
      7
      arrow-down
      1
      ·
      1 year ago

      Exactly. The technology is out there and will not cease to exist. Maybe in the future we’ll digitally sign our photos so that deepfakes can be weeded out on that basis.
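
      To illustrate the signing idea above: a toy sketch in Python. This uses a symmetric HMAC with a hypothetical per-device secret purely for illustration; real provenance schemes (e.g. C2PA) use public-key signatures so anyone can verify without holding the secret. The key name and image bytes below are made up.

```python
import hashlib
import hmac

# Hypothetical secret held by the camera/device that "signs" each photo.
# A real scheme would use an asymmetric key pair, not a shared secret.
CAMERA_KEY = b"device-secret"

def sign(image_bytes: bytes) -> str:
    # Produce an authentication tag over the raw image bytes.
    return hmac.new(CAMERA_KEY, image_bytes, hashlib.sha256).hexdigest()

def verify(image_bytes: bytes, tag: str) -> bool:
    # Constant-time comparison of the recomputed tag against the stored one.
    return hmac.compare_digest(sign(image_bytes), tag)

original = b"\x89PNG...raw pixel data..."  # stand-in for a real image file
tag = sign(original)

print(verify(original, tag))            # untouched image verifies
print(verify(original + b"edit", tag))  # any alteration breaks the tag
```

      The point of the sketch: verification only tells you the bytes are unchanged since signing, so it can prove a photo came from a trusted device, but it can’t retroactively mark AI-generated images that were never signed.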