Yeah, that’s what I’m saying - our current copyright laws are insufficient to deal with AI art generation.
They aren’t insufficient, they are working just fine. In the US, fair use balances the interests of copyright holders with the public’s right to access and use information. There are rights people can maintain over their work, and the rights they do not maintain have always been to the benefit of self-expression and discussion. We shouldn’t be trying to make that any worse.
Yep. Copyright should not include “viewing or analyzing the picture” rights. Artists want to start charging you or software to even look at art they literally put out for free. If you don’t want your art seen by a person or an AI, then don’t publish it.
It’s sad some people feel that way. That kind of monopoly on expression and ideas would only serve to increase disparities and divisions, manipulate discourse in subtle ways, and in the end, fundamentally alter how we interact with each other for the worse.
What they want would score a huge inadvertent home run for corporations and swing the doors open for them to hinder competition, stifle undesirable speech, and monopolize spaces like nothing we’ve seen before. There are very good reasons we have the rights we have, and there’s nothing good that can be said about anyone trying to make them worse.
Also, rest assured they’d collude with each other and only use their new powers to stamp out the little guy. It’ll be like American ISPs busting attempts at municipal internet all over again.
Copyright should absolutely include “analyzing” rights when you’re talking about AI, and for one simple reason: companies are profiting off of the work of artists without compensating them. People want the rewards of work without having to do the work. AI has the potential to be incredibly useful for artists and non-artists alike, but these kinds of people are ruining it for everybody.
What artists are asking for is ethical sourcing for AI datasets. We’re talking paying a licensing fee or using free art that’s opt-in. Right now, artists have no choice in the matter - their rights to their works are being violated by corporations. Already the music industry has made it illegal to use songs in AI without the artist’s permission. You can’t just take songs and make your own synthesizer out of them, then sell it. If you want music for something you’re making, you either pay a licensing fee of some kind (like paying for a service) or use free-use songs. That’s what artists want.
When an artist who does art for a living posts something online, it’s an ad for their skills. People want to use AI to take the artist out of the equation. And doing so will result in creativity only being possible for people wealthy enough to pay for it. Much of the art you see online, and almost all the art you see in a museum, was paid for by somebody. Van Gogh died a poor man because people didn’t want to buy his art. The Sistine Chapel was commissioned by a Pope. You take the artist out of the equation and what’s left? Just AI art made as a derivative of AI art that was made as a derivative of other art.
You should check out this article by Kit Walsh, a senior staff attorney at the EFF. The EFF is a digital rights group who recently won a historic case: border guards now need a warrant to search your phone.
MidJourney is already storing pre-rendered images made from and mimicking around 4,000 artists’ work. The derivative works infringement is already happening right out in the open.
Something being derivative doesn’t mean it’s automatically illegal or improper.
You are expressly allowed to mimic others’ works as long as you don’t substantially reproduce their work. That’s a big part of why art can exist in the first place. You should check out that article I linked.
I actually did read it, and that’s why I specifically called out MidJourney here, as they’re one I have specific problems with. MidJourney is currently caught up in a lawsuit partly because the devs were caught talking about how they launder artists’ works through a dataset to then create prompts specifically for reproducing art that appears to be made by a specific artist of your choosing. You enter an artist’s name as part of the generating parameters and you get a piece trained on their art. Essentially using an LLM to run an art-tracing scheme while skirting copyright violations.
I wanna make it clear that I’m not on the “AI evilllll!!!1!!” train. My stance is specifically about ethical sourcing for AI datasets. In short, I believe that AI specifically should have an opt-in requirement rather than an opt-out requirement or no choice at all. Essentially creative commons licensing for works used in datasets, to ensure that artists are duly compensated for their works being used. This would allow artists to license out their portfolios for a fee or make them openly available for use, however they see fit, while ensuring that they still have the ability to protect their job as an artist from stuff like what MidJourney is doing.
I’m pretty sure that’s all part of the discovery from the same case where Midjourney is named as a defendant along with Stability AI; it isn’t its own distinct case. It’s also not illegal or improper to do what they are doing. They aren’t skirting copyright law - it is a feature explicitly allowed by it, so that you can communicate without the fear of reprisals. Styles are not something protected by copyright, nor should they be.
You can’t extract compensation from someone doing their own independent analysis for the aim of making non-infringing novel works, and you don’t need licenses or permission to exercise your rights. Singling out AI in this regard doesn’t make sense because it isn’t a special system in that regard. That would be like saying Dolphin developers have to pay Nintendo every time someone downloads their emulator.