- cross-posted to:
- technology@beehaw.org
Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden on the dark corners of the internet, but are actively promoted to users by social media companies unable or unwilling to enforce their policies about who can buy ads on their platforms.
While parent company Meta’s Ad Library, which archives ads on its platforms, who paid for them, and where and when they were posted, shows that the company has taken down several of these ads previously, many ads that explicitly invited users to create nudes, and some of the accounts buying them, were still up until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.
Can you link some of these nudificator apps? Just for research 🤭
Go on Instagram. Like four photos of girls in bikinis. The ad tool will figure out what you want shortly.
Genuinely, I want to put photos of non-humans in there. I wonder what it would do to a photo of a brick or a giraffe.
New confused boners community, methinks.
Or the furries get a little weirder
Nah, it won’t do that because the AI is human-only. Unless it has some secret bestiality DLC I don’t know about.
No, we already have dedicated AI furry porn generators
getsomebitches.net/fuckinvirgin
https://github.com/Stability-AI/stablediffusion