• Jaysyn@kbin.social
    8 months ago

    Surprise, that’s completely unenforceable.

    Yet more out-of-touch legislators working with things they can’t even begin to understand.

    (And I’m not shilling for fucking AI here, but let’s call a spade a spade.)

    • Max-P@lemmy.max-p.me
      8 months ago

      What baffles me is that those lawmakers think they can just legislate any problem away with a law.

      So okay, California requires it. None of the other states do. None of the rest of the Internet does. It doesn’t fix anything.

      They act like the Internet is like cable TV, and it’s all American companies that “provide” services to end users.

    • assassin_aragorn@lemmy.world
      8 months ago

      I’m not so sure. A lot of environmental laws require companies to self-report when they exceed limits, and they actually do. It was common for my contact-engineer colleagues to be called up at night to calculate release amounts because their unit had an upset.

      A law like this would force companies to at least pretend to comply. None can really say “we’re not going to because you can’t catch us”.

    • Brkdncr@lemmy.world
      8 months ago

      Hmm, technically speaking we could require images to be digitally signed, tie the certificates to a CA, and then browsers could display a “this image is not trusted” warning like we do for HTTPS issues.

      People who don’t source their images properly would get their cert revoked.

      Would be a win for photo attribution too.
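
      As a rough sketch of the signing half of that idea (the CA trust chain and browser warning are the hypothetical parts; this just shows sign-and-verify over raw image bytes using the `cryptography` package’s Ed25519 API):

      ```python
      from cryptography.hazmat.primitives.asymmetric import ed25519
      from cryptography.exceptions import InvalidSignature

      # Hypothetical publisher key; in the proposed scheme it would be
      # issued by (and revocable through) a CA, like TLS certificates.
      private_key = ed25519.Ed25519PrivateKey.generate()
      public_key = private_key.public_key()

      image_bytes = b"\x89PNG...raw image data..."
      signature = private_key.sign(image_bytes)

      def is_trusted(data: bytes, sig: bytes) -> bool:
          """True if the signature checks out; a browser would show the
          'this image is not trusted' warning when this returns False."""
          try:
              public_key.verify(sig, data)
              return True
          except InvalidSignature:
              return False

      print(is_trusted(image_bytes, signature))         # original image passes
      print(is_trusted(image_bytes + b"x", signature))  # tampered image fails
      ```

      Any edit to the image bytes invalidates the signature, which is what would make the revocation-on-misuse part enforceable.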

        • Brkdncr@lemmy.world
          8 months ago

          You also had 30 seconds but chose to insult instead of contribute. See you at the next comment section.

    • bluGill@kbin.social
      8 months ago

      It is enforceable. Not in all cases, probably not even in the majority, but it only takes a few examples being hit with large fines and everyone doing legal things will take notice. Often you can find enough evidence to get someone to confess to using AI, and that is all the courts need.

      Scammers of course will not put this in, but they are already breaking the law, so this might be - like tax evasion - a way to get scammers you can’t get for something else.