The question is whether AI will satiate those people or drive them to further extremes. I’m hopeful it might reduce demand for making content “the traditional way,” since the AI route is lower risk (and “less horrific” doesn’t seem to be what motivates such people).
I say hopeful because this cat can’t be put back in the bag. The technical solutions they suggest are a desperate grasping at straws and (self-evidently) won’t work. People trading in CSAM are already taking extreme legal and social risks, so it’s hard to imagine any greater deterrent that could be applied to AI-generated versions. I think all we can do is hope this makes things better rather than worse, because if it goes the other way, the future is looking grim.
WARNING: BEFORE YOU TWIST MY WORDS, I’M AGAINST PEDOPHILIA AND BASICALLY ANYTHING RELATED TO IT
Yeah, you can’t unharm the children whose images were used to build the model, but maybe you can save thousands or more children by just… flooding the “market” with cheap content that harms no one? Like this plan with rhino horns: https://www.theguardian.com/environment/2019/nov/08/scientists-plan-to-flood-black-market-with-fake-rhino-horn-to-reduce-poaching
There are going to be some very interesting studies done about whether this is a good thing. I can easily see both sides of the argument. Ultimately this is a mental illness that needs to be dealt with, but I’m curious whether these images could be used for some sort of aversion therapy, since they can be created without harming children in the process. Again, like OP said, I’m totally against any sort of child pornography. But interesting times indeed.
I would be okay with these AI-generated images being made available to people who had self-identified to a government body, agreed to be placed on a special list, and entered psychiatric treatment. It’s absolutely disgusting, but at least with AI-generated images no children are being harmed. And if it brings these sick people forward to seek treatment and to be identified and monitored to prevent real-life abuse, then it could actually save real children from being exploited, which is obviously a noble goal.
I just don’t envy the poor bastard that has to set up the test data for the AI to generate all that art…
wow, that’s terrifying and gross 🙃 it would have been free for people not to do this
Wish I hadn’t deleted my Reddit account, cause I predicted this. I also predicted that AI would be used to generate “proof” that a person is a child abuser. And there’s absolutely nothing we can do about any of this. What a nightmare.