• Elias Griffin@lemmy.world

      In fact, just the other day information wanted a ham sandwich before I set it free, so it could find more people not on an empty stomach :/

    • Elias Griffin@lemmy.world

      Oh yeah, tell me about Intellectual Property, Patent, Invention, and Ideation thievery. Was it still there afterwards? IP theft has been recognized for centuries.

      Back to the basement, Mustafa Jr…

    • bitchkat@lemmy.world

      Yes, but you don’t have a right to create derivative works, which by definition is all that AI can spit out.

      • afraid_of_zombies@lemmy.world

        I am so glad humans are never derivative with culture. Just look at the movie The Fast and the Furious. If we were making derivative works we would live in some crazy world where that would be a franchise with ten movies, six video games, a fashion line, board games, toys, theme park attractions, and an animated series that ran for six seasons.

  • kibiz0r@midwest.social

    Pirating Windows for your own personal, private use, which will never directly make you a single dollar: HIGHLY ILLEGAL

    Scraping your creative works so they can make billions by selling automated processes that compete against your work: Perfectly fine and normal!

    • experbia@lemmy.world

      bunch of fuckin art pirates. crying about software piracy while they have their own bots pirating everyone’s art.

      • kibiz0r@midwest.social

        It’s not even piracy though. I never saw anyone torrent Windows_XP_Home_Cracked.iso and go “Hey guys, check out this operating system I made!”

    • yesman@lemmy.world

      Do people still pirate Windows? You can download the iso directly from Microsoft’s website and you don’t need a registration key anymore.

      • Scrollone@feddit.it

        You do need a registration key, but now it’s tied to the hardware, so it activates as soon as you connect to the network; there’s no need to actually type the key in.

        • Balder@lemmy.world

          They’re saying Windows will lock away some customization, but you don’t need a key to use it nowadays.

    • cmhe@lemmy.world

      “Copying is theft” has been the argument of corporations for ages, but when they want our data and information to integrate into their business, suddenly they have the right to it.

      If copying is not theft, then we have the right to copy their software and AI models as well, since they are available on the open web.

      They got themselves into quite a contradiction.

      • masterspace@lemmy.ca

        You realize that half of Lemmy is tying themselves in inconsistent logical knots trying to escape the reverse conundrum?

        Copying isn’t stealing and never was. Our IP system that artificially restricts information has never made sense in the digital age, and yet now everyone is on here cheering copyright on.

      • Buffalox@lemmy.world

        If copying is not theft, then we have the rights to copy their software

        No we don’t; copying copyrighted material is copyright infringement, which is illegal. That does not make it theft, though.
        Oversimplifying the issue makes for an uninformed debate.

    • GamingChairModel@lemmy.world

      Yeah, I’m not a fan of AI, but I’m generally of the view that anything posted on the internet, visible without a login, is fair game for indexing by a search engine, snapshotting for a backup (like the Internet Archive’s Wayback Machine), or running user extensions on (including ad blockers). Is training an AI model all that different?

      • Evotech@lemmy.world

        You can’t be for piracy but against LLMs, for the same reason.

        And I think most of the people on Lemmy are for piracy.

        • sugar_in_your_tea@sh.itjust.works

          I’m not in favor of piracy or LLMs. I’m also not a fan of copyright as it exists today (I think we should go back to the 1790 US definition of copyright).

          I think a lot of people here on lemmy who are “in favor of piracy” just hate our current copyright system, and that’s quite understandable and I totally agree with them. Having a work protected for your entire lifetime sucks.

          • masterspace@lemmy.ca

            The problem with copyright has nothing to do with term limits. Those exacerbate the problem, but the fundamental problem with copyright and IP law is that it is a system of artificial scarcity where there is no need for one.

            Rather than reward creators when their information is used, we ham-fistedly try to prevent others from using that information so that people sometimes have to pay them to use it.

            Capitalism is flat out the wrong system for distributing digital information, because as soon as information is digitized it is effectively infinitely abundant which sends its value to $0.

            • sugar_in_your_tea@sh.itjust.works

              Copyright is not a capitalist idea, it’s collectivist. See copyright in the Soviet Union, the initial bill of which was passed in 1925, right near the start of the USSR.

              A pure capitalist system would have no copyright, and works would instead be protected through exclusivity (i.e., paywalls) and DRM. Copyright is intended to promote sharing by providing a period of exclusivity (a temporary monopoly on a work). Whether it achieves those goals is certainly up for debate.

              Long terms go against any benefit to society that copyright might have. I think it does have a benefit, but that benefit is pretty limited and should probably only last 10-15 years. I think eliminating copyright entirely would leave most people worse off and probably mostly benefit large orgs that can afford expensive DRM schemes in much the same way that our current copyright duration disproportionately benefits large orgs.

      • petrol_sniff_king@lemmy.blahaj.zone

        None of those things replace that content, though.

        Look, I dunno if this is legally a copyrights issue, but as a society, I think a lot of people have decided they’re willing to yield to social media and search engine indexers, but not to AI training, you know? The same way I might consent to eating a mango but not a banana.

      • sugar_in_your_tea@sh.itjust.works

        Yes, it kind of is. A search engine just looks for keywords and links, and that’s all it retains after crawling a site. It’s not producing any derivative works; it’s merely looking up an index of keywords to find matches.
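
        To make that concrete, here is a toy sketch (hypothetical Python, not any real crawler): all a keyword index keeps is a mapping from words to the URLs they appear on, and the page text itself is thrown away after indexing.

        ```python
        # Toy inverted index: the only thing retained from a crawled page is
        # which keywords appear on it, not the page's text.
        from collections import defaultdict

        index = defaultdict(set)  # keyword -> set of URLs containing it

        def add_page(url: str, text: str) -> None:
            # Record which words appear on the page, then discard the text.
            for word in set(text.lower().split()):
                index[word].add(url)

        def search(query: str) -> set:
            # Return URLs that contain every keyword in the query.
            sets = [index.get(word, set()) for word in query.lower().split()]
            return set.intersection(*sets) if sets else set()

        add_page("https://example.com/post", "copyright law and the open web")
        print(search("copyright web"))  # {'https://example.com/post'}
        ```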

        An LLM can essentially reproduce a work, and the whole point is to generate derivative works. So by its very nature, it runs into copyright issues. Whether a particular generated result violates copyright depends on the license of the works it’s based on and how much of those works it uses. So it’s complicated, but there’s very much a copyright argument there.

        • TheRealKuni@lemmy.world

          An LLM can essentially reproduce a work, and the whole point is to generate derivative works. So by its very nature, it runs into copyright issues.

          Derivative works are not copyright infringement. If LLMs are spitting out exact copies, or near-enough-to-exact copies, that’s one thing. But as you said, the whole point is to generate derivative works.

          • sugar_in_your_tea@sh.itjust.works

            Derivative works are not copyright infringement

            They absolutely are, unless it’s covered by “fair use.” A “derivative work” doesn’t mean you created something that’s inspired by a work, but that you’ve modified the work and then distributed the modified version.

        • Halosheep@lemm.ee

          My brain also takes information and creates derivative works from it.

          Shit, am I also a data thief?

          • sugar_in_your_tea@sh.itjust.works

            That depends: do you copy verbatim? Or do you process and understand concepts, and then create new works based on that understanding? If you copy verbatim, that’s plagiarism and you’re a thief. If you create your own answer, it’s not.

            Current AI doesn’t actually “understand” anything, and “learning” is just ingesting input data. If you ask it a question, it isn’t understanding anything; it just matches your prompt against the parts of the training data that fit, regurgitates a mix of them, and usually omits the sources. That’s it.

            It’s a tricky line in journalism, since so much of it is borrowed, and it’s likewise tricky with AI, but the main difference IMO is attribution: good journalists cite sources; AI rarely does.

    • ZILtoid1991@lemmy.world

      Issue is power imbalance.

      There’s a clear difference between a guy in his basement, on his personal computer, sampling music the original musicians almost never saw a single penny from, and a megacorp trying to drive creative professionals out of the industry in the hope that it can then hike up the prices to use its generative AI software.

      • Snot Flickerman@lemmy.blahaj.zone

        Is it that, or is it that the laws are selectively applied to little guys and ignored once you make enough money? It certainly looks that way. Once you’ve achieved a level of “fuck you money” it doesn’t matter how unscrupulously you got there. I’m not sure letting the big guys get away with it while little guys still get fucked over is as big of a win as you think it is.


        Examples:

        The Pirate Bay: Only made enough money to run the site and keep the admins living a middle class lifestyle.

        VERDICT: Bad, wrong, and evil. Must be put in jail.

        OpenAI: Claims to be non-profit, then spins off for-profit wing. Makes a mint in a deal with Microsoft.

        VERDICT: Only the goodest of good people and we must allow them to continue doing so.


        The IP laws are stupid but letting fucking rich twats get away with it while regular people will still get fucked by the same rules is kind of a fucking stupid ass hill to die on.

        But sure, if we allow the giant companies to do it, SOMEHOW the same rules will “trickle down” to regular people. I think I’ve heard that story before… No, they only make exceptions for people who can basically print money. They’ll still fuck you and me six ways to Sunday for the same.

        I mean, the guys who ran Jetflicks, a pirate streaming site, are being hit with potentially 48-year sentences. Longer than a lot of way more serious fucking crimes. I’ve literally seen murderers get half that.

        But yeah, somehow, the same rules will end up being applied to us? My ass. They’re literally jailing people for it right now. If that wasn’t the case, maybe this argument would have legs.

        But AI companies? Totes okay, bro.

        • Womble@lemmy.world

          Yes, it is a travesty that people are being hounded for sharing information, but the solution to that isn’t to lock information up tighter by restricting access to the open web and saying that if you download something we put up to be freely accessed, and then use it in a way we don’t like, you owe us.

          The solution to bad laws being applied unevenly isn’t to apply the bad laws to everyone equally; it’s to get rid of the bad laws.

        • Grimy@lemmy.world

          The laws are currently the same for everyone when it comes to what you can use to train an AI. I, as an individual, can use whatever public-facing data I wish to build or fine-tune AI models, same as Microsoft.

          If we make copyright laws even stronger, the only ones getting locked out of the game are the little guys. Microsoft, Google, and company can afford to pay ridiculous prices for datasets. What they don’t own mainly comes from aggregators like Reddit, Getty, Instagram, and Stack.

          Boosting copyright laws would essentially kill all legal forms of open-source AI. It would force the open-source scene to go underground as a pirate network and lead to the scenario you mentioned.

  • CriticalMiss@lemmy.world

    Sure bud, pirating some Microsoft Studios video games and Windows ISOs right now. What? I found them on the open web!

  • snekerpimp@lemmy.world

    So if I see it on the “open web”, I’m free to use it however I please? Oh, I get thrown in jail and everything I own taken away.

    If companies are people per “Citizens United”, why doesn’t the same apply to them?

    • Ænima@lemm.ee

      And if a company makes a negligent decision that kills a million people over time, why is no one put on death row? They can and do have it both ways, but I can still wish for a just world where, if companies are people, they can be put to death for mass casualties caused by their decisions.

  • interdimensionalmeme@lemmy.ml

    Copyright infringement is not theft, and training models is not copyright infringement either. We need a law equivalent to when an artist says “he was inspired by someone else”: make it specifically illegal to do that without permission if you use a machine. That will force big tech to pay a pittance for it, and it will instakill all the small players.

    • bitchkat@lemmy.world

      Creating a derivative work without a license to do so would be copyright infringement.

    • Elias Griffin@lemmy.world

      Copyright infringement is a strawman argument here. When considering AI, we are not talking about legal copyright infringement in the relationship between humans and AI. Humans are mostly concerned with being obsoleted by Big Tech, so the real issue is Intellectual Property Theft.

      artificial INTELLIGENCE stole our Intellectual Property

      Do you see it now?

      • afraid_of_zombies@lemmy.world

        What I see is a system of laws that came about during the Middle Ages and have been manipulated by the powers that be to kill off any good parts of them.

        We all knew copyright was broken. It was broken before my grandparents were born. It didn’t encourage artists or promise them a proper income, and it didn’t allow creations to gradually move into the public domain. It punished all forms of innovation, from player pianos to fanfiction on Tumblr.

      • interdimensionalmeme@lemmy.ml

        It’s only theft as long as you cling to the failed “copyright” model.

        Big tech couldn’t steal anything if we didn’t respect their property rights in the first place.

        By reifying copyright under the AI paradigm, we maintain big tech’s power over us.

        The truth is ChatGPT belongs to us. ClosedAI is just the compiler of the data.

        If we finally end the failed experiment of copyright, we destroy their moat.