Social networks are becoming a marketplace for AI-generated "nude photos"

AI-created "nude" photos are being sold on various social platforms, often without any disclosure that the images depict artificially generated models rather than real people.

The Washington Post identified several models as AI-generated after closer analysis, for example because birthmarks appeared in different places from photo to photo. Tools such as Stable Diffusion make it very easy to create such images.

On Reddit, Twitter, and OnlyFans, accounts showcase AI-generated images of women in various states of undress. To see more, users must pay or become subscribers. The two students behind one such account earned $100 from their AI-generated images; they had simply wanted to test whether people could be fooled by AI-generated content and did not expect such success.

Deepfake porn is explicit content into which uninvolved people can easily be inserted using just a few photographs and artificial intelligence; this is reportedly how the fake pornographic images of Taylor Swift originated. It is problematic in two ways: for the sex industry on one hand, and on the other for uninvolved third parties, usually women, who suddenly find themselves depicted in pornographic content.

AI has developed rapidly and no longer requires a real image as source material, as deepfakes do. Image-generating AI produces complete pictures from text prompts alone. The results are becoming ever more accurate and can now render even hands almost realistically. The ease of creation makes such content profitable: images are generated in no time and easily sold on Reddit. According to media reports, AI could disrupt the multi-billion-dollar sex industry.

However, AI-generated images are also being used to create deepfakes, replacing the real source photos that were previously required. Journalists have already found forums where users discuss the easiest ways to create deepfakes using artificial intelligence and image-editing techniques.

To degrade women and push them out of public spaces, the images do not even have to look real.