A leaked database belonging to an AI image-generating startup has exposed a massive collection of nude images.


According to new research reviewed by WIRED, an artificial intelligence image-generation startup has left more than 1 million images and videos created with its systems exposed to anyone online. The researcher who discovered the leaked data set says the “vast majority” of the images contained nudity and depicted adult content, with some appearing to show children, or children’s faces swapped onto AI-generated nude adult bodies.

Security researcher Jeremiah Fowler, who discovered the security flaw in October, says several websites — including MagicEdit and DreamPal — all appear to be using the same insecure database. At the time, about 10,000 new images were being added to the database every day, Fowler says. The images show how people used the image-generation and editing tools: they included “unaltered” photos of real people who may have been digitally “undressed,” or whose faces were swapped onto other naked bodies.

“The real issue is innocent people, especially minors, whose images are being used for sexual content without their consent,” says Fowler, a prolific database hunter who published the findings on the ExpressVPN blog. Fowler says this is the third misconfigured database of AI-generated images he has found exposed online this year — all of which appear to contain nonconsensual explicit images, including images of young people and children.

Fowler’s findings come as AI image-generation tools continue to be used to maliciously create explicit images of people. A huge ecosystem of “undressing” services, used by millions of people and making millions of dollars a year, uses artificial intelligence to “strip” people — almost entirely women — in photos. Photos stolen from social media can be edited with just a few clicks, leading to harassment and abuse of women. Meanwhile, reports of criminals using artificial intelligence to create child sexual abuse material, which covers a range of indecent images of children, have doubled in the past year.

“We take these concerns very seriously,” said a spokesperson for DreamX, the startup that runs MagicEdit and DreamPal. An influencer marketing company linked to the database, called SocialBook, is “operated by a separate legal entity and is not involved in the operation of other sites,” the spokesperson said, adding: “These entities share some historical ties through legacy founders and assets, but they operate independently with separate product lines.”

A SocialBook spokesperson told WIRED: “SocialBook is not connected to this storage space, does not use it, and has never been involved in its operation or management. The referenced images are not generated, processed, or stored by SocialBook systems. SocialBook operates independently and has no role in the described infrastructure.”
