• wildginger@lemmy.myserv.one · 11 months ago

    Man, AI devs really are only interested in finding uses for this stuff that pisses directly in everyone's food bowls, huh? It's like they can't finish unless the project has massive fundamental ethical problems.

  • givesomefucks@lemmy.world · 1 year ago

    That’s kind of crazy, and once it’s “trained” there’s no way this could be detected.

    Maybe by randomly taking screen caps? But depending on how that works, it might not capture it. You'd have to take an actual screenshot of the game, not just catch a frame from the GPU.

    Definitely not worth it, but it’s crazy it was made.

    • Xatix@lemmy.world · 1 year ago (edited)

      If it's an overlay on the screen itself and not on the computer side, there's just no way to detect it, as the overlay isn't there when the image leaves the graphics card. That's at least what the article describes.

      The only way to prevent players from using this would be to exclude players using this specific monitor altogether, regardless of whether they have the function activated or not.
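
      In principle the game (or its anti-cheat) could identify the attached monitor by reading its EDID, which carries the manufacturer and model strings, and refuse to run on blocklisted models. A minimal sketch of that idea, assuming a Linux box where the kernel exposes EDID blobs under /sys/class/drm (the paths and the blocklist are illustrative, not anything MSI or an actual anti-cheat ships):

      ```python
      import glob

      def monitor_names():
          """Return the product-name descriptor (tag 0xFC) from each connected
          monitor's EDID, as exposed by the Linux DRM subsystem."""
          names = []
          for path in glob.glob("/sys/class/drm/*/edid"):
              with open(path, "rb") as f:
                  edid = f.read()
              if len(edid) < 128:            # empty file = connector with nothing attached
                  continue
              # The base EDID block holds four 18-byte descriptors at fixed offsets.
              for off in (54, 72, 90, 108):
                  d = edid[off:off + 18]
                  # Display descriptors start with three zero bytes; 0xFC = product name.
                  if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFC:
                      name = d[5:18].split(b"\x0a")[0].decode("ascii", "ignore").strip()
                      names.append(name)
          return names

      if __name__ == "__main__":
          # Hypothetical blocklist check: compare the reported names against models
          # known to ship the overlay (the model string below is made up).
          blocked = {"EXAMPLE-OVERLAY-MONITOR"}
          for name in monitor_names():
              print(name, "-> blocked" if name in blocked else "-> ok")
      ```

      Whether that's worth doing is another question, since it punishes owners of the monitor whether or not the feature is switched on.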

      • givesomefucks@lemmy.world · 1 year ago

        Ok, sounds like a transparent display on top of the normal one; that makes sense.

        And completely undetectable.

        • conciselyverbose@kbin.social · 1 year ago

          I don’t think it’s an extra display.

          If you have a 4K display, you can still have that display accept a 1080p signal. That works because the GPU isn't controlling the display; it's merely sending an image to the display 60 (or 120, or 144, etc.) times a second. That feed passes through a chip inside the display that turns it into the signals for each subpixel and tells them how bright to be. Monitors generally don't process much, but TVs often do additional processing to (in theory) make the image look better. That is the level at which this MSI display is going to be processing and adjusting the image. (I've glossed over a lot of details here, and am not pretending I understand all of them, but in broad strokes this is what's happening.)
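
          To make that concrete, here's a toy sketch of a 4K panel's scaler receiving a 1080p frame (nothing like a real scaler's implementation; it just shows that the display-side chip, not the GPU, decides what each physical pixel ends up showing):

          ```python
          import numpy as np

          # What the GPU actually sends: a 1920x1080 RGB frame, ~60+ times a second.
          frame_1080p = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)

          # What the display's own chip does with it: map the incoming frame onto the
          # 3840x2160 physical pixels (here a naive 2x2 replication of every pixel).
          panel_4k = np.repeat(np.repeat(frame_1080p, 2, axis=0), 2, axis=1)

          print(frame_1080p.shape)  # (1080, 1920, 3) -- the signal from the computer
          print(panel_4k.shape)     # (2160, 3840, 3) -- what the panel actually lights up
          ```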

          Screen captures by your computer don't see any of the adjustments your display makes. They just see the image you send to the display. They have no way of knowing if your display is cranking saturation through the roof, inserting gross fake frames it's calling "true motion" or whatever, blasting the shit out of brightness and blowing out highlights, etc. They don't actually know what the final output looks like. They only know what the computer sends.
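
          As a concrete illustration: a desktop capture just reads back whatever frame the computer composited, before that frame ever goes down the cable. A minimal sketch using the third-party mss library (purely illustrative; any capture API behaves the same way):

          ```python
          import mss
          import mss.tools

          # Grab the frame being sent to the primary monitor. Anything the monitor
          # itself adds or alters afterwards (OSD crosshairs, an AI overlay, saturation
          # or "true motion" processing) can never appear here, because this reads the
          # composited frame before it leaves the computer.
          with mss.mss() as sct:
              monitor = sct.monitors[1]   # index 0 is the combined virtual screen
              shot = sct.grab(monitor)
              mss.tools.to_png(shot.rgb, shot.size, output="sent_to_display.png")
          ```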