- cross-posted to:
- [email protected]
Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden in the dark corners of the internet, but are actively promoted to users by social media companies unable or unwilling to enforce their policies about who can buy ads on their platforms.
While parent company Meta’s Ad Library, which archives ads on its platforms, who paid for them, and where and when they were posted, shows that the company has previously taken down several of these ads, many ads that explicitly invited users to create nudes, and some of the ad buyers, were still up until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.
How? In my mind, for this scenario, I can picture your face literally perfectly. It is, for all intents and purposes, your real face. In this case, what I imagine in my head is identical to what some AI model would churn out.
Is a picture no different than imagination? Is CGI not a picture because it used a computer?
Of course not, because REAL MEDIA is being produced. Your thoughts never leave your head; these images are being used to harass and blackmail women. Your argument is completely asinine.
Hold up, that’s a separate issue. Revenge porn is flat-out illegal, so using nudes of people, real or not, as blackmail isn’t up for debate here. Whether or not it’s obvious, and I’m sorry if it’s not, I’m 100% with you that that’s completely disgusting and shouldn’t be tolerated.
Back to the first part though: is the problem literally just that it exists outside of the person’s head? If they don’t share it with anyone, what’s really the difference from them imagining it? In both scenarios they’re effectively getting the same result, and nobody else is affected.