I know a lot of people want to interpret copyright law so that allowing a machine to learn concepts from a copyrighted work counts as infringement, but I think what people need to consider is that all that's going to do is keep AI out of the hands of regular people and place it squarely in the hands of the people and organizations wealthy and powerful enough to train it for their own use.

If this isn't actually what you want, then what's your game plan for placing copyright restrictions on AI training that will actually work? Have you considered how it's likely to play out? Are you going to be able to stop Elon Musk, Mark Zuckerberg, and the NSA from training an AI on whatever they want and using it to push propaganda on the public? As far as I can tell, all that copyright restrictions will accomplish is to concentrate the power of AI (which we're only beginning to explore) in the hands of the sorts of people who are least likely to want to do anything good with it.

I know I’m posting this in a hostile space, and I’m sure a lot of people here disagree with my opinion on how copyright should (and should not) apply to AI training, and that’s fine (the jury is literally still out on that). What I’m interested in is what your end game is. How do you expect things to actually work out if you get the laws that you want? I would personally argue that an outcome where Mark Zuckerberg gets AI and the rest of us don’t is the absolute worst possibility.

  • Ragnell@kbin.social · 1 year ago

    Waaaaait a minute. How is your thinking that AI is good because it makes your life a bit better leisure-wise any different from my thinking it's a problem because it will make my life worse work-wise? You threw that at me, saying I was worried about a small group, and here you are basing your excitement on it helping your niche hobbies?

    Are you sure you’re not projecting here? In this entire thread, have you budged an inch based on all the people arguing against your original post? Or are you just refusing to admit that it could cause trouble in the world for people’s livelihoods because you get to have fun with it?

        • IncognitoErgoSum@kbin.socialOP · edited · 1 year ago

          I’m not sure why you’re asking that. You literally just asked me if I’m refusing to admit that AI could cause trouble for people’s livelihoods. I don’t know where you even got that idea. I never asked you anything about whether you admit it could help with things, because that’s irrelevant (and also it would be a pretty silly blanket assumption to make).

          Are you sure you’re not projecting here? In this entire thread, have you budged an inch based on all the people arguing against your original post?

          Who am I supposed to be budging for? Of the three people here who are actually arguing with me, you're the only one who isn't threatening to slash my car tires, likening personal AI use to eating steak in terms of power usage (it's not even in the same ballpark), or claiming that Stable Diffusion doesn't use a neural network. I only replied to the other guy's most recent comment because I don't want to be swiftboated: people will believe someone who confidently states something they find validating, even if that person is dead wrong.

          We just seem to mostly have a difference of opinion. I don’t get the sense that you’re making up your own facts. And fundamentally, I’m not convinced of the idea that only a small group of people deserve laws protecting their jobs from automation, particularly not at the expense of the rest of us. If we want to grant people relief from having their jobs automated away, we need to be doing that for everybody, and the answer to that isn’t copyright law.

          And as far as AI being used to automate dangerous jobs goes, copyright isn't going to stop that at all. Tesla's dangerous Autopilot feature (honestly, I have no idea whether that's a neural network or just a regular computer program) uses data that Tesla gathers itself. Any pharmaceutical company that develops an AI for making medicines will train it on its own trade secrets. The same goes for AI surgeons, AI-operated heavy machinery, and so on. None of that is going to be affected by copyright, and public concerns about safety aren't going to get in the way of stockholders and their profits any more than they have in the past. If you want to talk about the dangers of overreliance on AI for dangerous work, then by all means talk about that. For those large companies, this copyright fight is a beneficial distraction.