AI Data Breach of Face-Swapping App

In March 2025, a cybersecurity researcher discovered a publicly accessible database containing close to 100,000 records connected to GenNomis, an AI project by South Korea–based AI-NOMIS. The company uses artificial intelligence for face swapping and for generating adult-themed “Nudify” content. The 47.8 GB database held 93,485 images and JSON files, some of which were explicit, AI-generated images of people who appeared to be underage.

The JSON files stored in the database captured text prompts and links tied to the generated images, offering a glimpse into how the AI system operated. While the exposed information did not include names or direct identifiers, the incident shed light on the dangers of unchecked AI image creation and reinforced the importance of strong cybersecurity protocols for such sensitive technologies.
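A record of this kind might be structured along the lines of the sketch below. The field names (`prompt`, `image_url`, `created_at`) are illustrative assumptions for the sake of the example, not the actual GenNomis schema, which was not published:

```python
import json

# Hypothetical example of the kind of record the exposed database
# reportedly held: a text prompt paired with a link to the generated
# image. Field names are assumed for illustration only.
record_json = '''
{
    "prompt": "example text prompt used to generate an image",
    "image_url": "https://example.com/generated/abc123.jpg",
    "created_at": "2025-03-01T12:00:00Z"
}
'''

record = json.loads(record_json)
print(record["prompt"])     # the generation prompt
print(record["image_url"])  # link to the stored image
```

Even without names or direct identifiers, records like this reveal user intent (the prompt) and point to the generated output (the link), which is why their exposure is sensitive.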

Once the breach was identified, the researcher promptly reported it to both GenNomis and AI-NOMIS, and the companies quickly restricted public access to the database. Even so, neither company replied to the disclosure or publicly acknowledged the incident, leaving open questions about how long the data had been exposed and whether anyone else accessed it during that period.