In its submission to the Australian government’s review of the regulatory framework around AI, Google said that copyright law should be altered to allow for generative AI systems to scrape the internet.
Except that isn’t exactly how neural networks learn. They aren’t copying works; they’re learning patterns in how humans make those works in order to imitate them. The legal argument these companies are making is that the output of a generative AI system is transformative enough to qualify as a wholly new work, and it looks as if that position might end up becoming law, depending on how the lawsuits currently working their way through the courts turn out.
To be clear, an LLM doesn’t technically copy the data it trains on, nor does it store the text of the works it learns from; what it retains is a set of weights encoding statistical patterns learned from them.
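To make that concrete, here is a toy sketch of next-token training (assuming PyTorch; this is not how any production system is actually trained, and the one-line “corpus” is invented for illustration). The loop nudges a few small weight tensors toward predicting the next character, and once it finishes, those tensors are all that remain of the training text.

```python
# A minimal, illustrative sketch: a tiny character-level next-token predictor.
# The point is what survives training: floating-point weight tensors, not the source text.
import torch
import torch.nn as nn

text = "the quick brown fox jumps over the lazy dog"  # made-up stand-in corpus
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}

# Encode the corpus as integer IDs and build (context char, next char) pairs.
ids = torch.tensor([stoi[c] for c in text])
xs, ys = ids[:-1], ids[1:]

class TinyLM(nn.Module):
    def __init__(self, vocab, dim=16):
        super().__init__()
        self.emb = nn.Embedding(vocab, dim)
        self.out = nn.Linear(dim, vocab)

    def forward(self, x):
        return self.out(self.emb(x))

model = TinyLM(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Training adjusts the weights to predict the next character;
# the text itself is discarded once the loop ends.
for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(model(xs), ys)
    loss.backward()
    opt.step()

# What the "model" actually is afterwards: named tensors of floats.
for name, tensor in model.state_dict().items():
    print(name, tuple(tensor.shape))
```

Whether learning statistical patterns from a work counts as copying it is exactly the question the courts are now being asked to settle.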
Yes, they probably would, so long as the work is transformative enough. You wouldn’t be the first, or the last, author to copy LoTR in your own work.
This is why you can go on Instagram and find people selling presets that give photos the look of a famous photographer’s work. They advertise them as such. But even though they’re selling something that supposedly lets you copy someone else’s style, it’s still legal, because it’s transformative enough.
It doesn’t have to make sense, and we don’t have to agree with it, but that’s how the law works.
The problem is that if I copy a paragraph wholesale, word for word, then yes, I am engaging in plagiarism. The line is not as clear as you think. The difference is that I can’t hide what I took as well as an AI can, and I can’t do it to 10,000 people in an instant.
Just because I engage in plagiarism at scale and hide it better does not mean I did not engage in plagiarism.