Maybe no one would need to point out your pedophilia if you stopped conveniently ignoring that it’s not possible to generate child porn “AI Art” without having child porn first…
Don’t they often train the program with adult porn, and then the AI just puts a child’s face onto bodies generated from this training data? I imagine these AI companies are scraping data from popular porn sites or just paying for the data, and these sites work hard not to have CP on them. The result is a child’s face on a body too mature for it. Remember that some actual adult actresses have body proportions that many would consider underdeveloped, and someone generating these pictures could regenerate until the AI uses those body proportions.
The point is that you don’t need CP to train an AI to make CP. I am not justifying any moral position here, just pointing out a fact about AI technology.
Uh, no. An adult porn photo with a child’s face edited in is just adult porn with a child’s face edited in; I don’t know anyone insane enough to claim it’s child porn, and pedophiles don’t like physically matured bodies, so everyone loses.
In this case the guy did have real images, but you don’t need them. AI is kind of intelligent in a hard-to-define way; it picks up on stuff.
It picked up that people like younger individuals in pornography, so it took that to the logical extreme. AI is weird because it’s intelligence without any actual thought. But it can totally generate variations on things it’s already seen, and a kid is just a variation on a young adult.
You don’t know what you’re talking about.
Yes, AI can create tons of content it’s not trained on.