South Korea’s recent deepfake sex crime scandal was brought into the public spotlight by the X (formerly Twitter) user nicknamed “Queen Archive.”

Graphics by Kim Ha-gyeong

Previously known as “Guilty Archive” with 120,000 subscribers, the operator had her initial account suspended and later created the Queen Archive account. She received a tip on Aug. 24 from a user about the widespread distribution of deepfake pornography on Telegram. Taking decisive action, Queen Archive publicized this information on her X account and sought further reports from victims and witnesses. The issue quickly gained momentum, revealing its extensive impact on not only students but also teachers and female soldiers. Recognizing the severity of the situation, President Yoon Suk-yeol has called for immediate measures.

In an interview with the Chosunilbo on Aug. 28, Queen Archive said, “I was receiving eight reports per minute, and notifications continued through the night,” adding that discovering the stories of the victims was “truly devastating.” Below is a Q&A with Queen Archive, who asked to keep her personal information confidential.

You played a major role in publicizing the recent deepfake sex crime. What motivated you to take action?

“A post about a deepfake-sharing Telegram group in Daegu first appeared on an online community on Aug. 24. The group chat involved creating pornographic deepfake materials using photos of real-life acquaintances. I saw it and posted about it on X. One user then tipped me off about a Telegram deepfake group involving teenage victims. Once I learned about it, I felt I had to protect the students. Given the nature of Telegram crimes, investigating them is difficult without identifying the perpetrators. To track them down, I infiltrated the Telegram groups. I discovered that deepfake crimes on Telegram were far more severe and widespread than anticipated, affecting the entire country. I decided to publicize it.”

Graphics by Kim Ha-gyeong

How did you bring the issue to public attention?

“I posted on my X account, Queen Archive, on Aug. 25 at 1 p.m., saying that deepfake sex crimes were occurring not only at universities but also in middle and high schools in Daegu. I also advised female students in Daegu to remove their selfies [from their social media accounts]. At that time, the focus was primarily on the case at Inha University. I shared a screenshot of the list of 17 schools mentioned in group chats distributing deepfake photos of acquaintances, which included 10 middle schools and seven high schools.”

What was the reaction like?

“Starting at 2:30 p.m. that day, I began receiving a flood of reports from people who had seen my post. It turns out that the deepfake group chat targeting middle and high school students in Daegu wasn’t the first; it was actually the fourth such group to be created. After that, similar groups were discovered in Seoul, Suwon, Daejeon, Uijeongbu, and Bucheon, and posts exposing them started appearing in real time. In one ‘humiliation room’ with 1,932 members, there were sub-rooms targeting cousins, mothers, acquaintances, older sisters, and younger sisters. That was also when I learned about a group targeting female soldiers.”

What kind of posts and photos are shared in these chat rooms?

“For example, in the ‘younger sister room,’ one man posted a photo of himself lifting his sleeping sister’s skirt and touching her thigh, with the caption, ‘I failed to give her a sleeping pill today. I’ll succeed tomorrow.’ Other users responded with comments like ‘You’re brave’ and ‘I’m jealous.’ In another group, they shared a middle school girl’s photo along with her name, school, home address, and phone number, with a comment saying, ‘She’s so innocent that I can easily threaten and XX her.’”

What were your thoughts after infiltrating the group chats?

“I felt utterly sickened. The chat rooms were filled with nude photos of women and close-ups of intimate body parts. I couldn’t understand why they were so obsessed with other people’s naked bodies. There were so many rooms like ‘insulting acquaintance’ and ‘family humiliation.’ When I joined the ‘link-sharing room,’ it led me to another set of links, and those links led to more, creating an endless chain. I realized it would be impossible to track down all these perpetrators alone, so I decided to post on X to seek help gathering evidence. Eleven people, including myself, came together. Once we had identified some perpetrators, we sought legal advice and posted about it on X. This led to even more reports from people. I received direct messages at a rate of eight per minute, and notifications continued to come through even in the middle of the night. I made every effort to keep updating the situation. Hearing the stories of the victims and sharing their trauma was truly devastating.”

What was the biggest challenge you faced?

“Navigating Telegram was quite difficult for me. I struggled to take screenshots without getting kicked out of rooms, as I didn’t realize that room owners were notified of such actions. I am learning and adapting as I go.”

On the first day, Queen Archive exposed 19 chat rooms to the public, which were deleted soon after. Some of these rooms had as many as 20,000 participants. As more reports continued to surface on her account, another account emerged by 9 p.m. on Aug. 25, dedicated to compiling and sharing lists of the affected regions and schools.

What case stands out the most to you?

“The most distressing incident was when a separate room was set up to humiliate a specific victim, sharing that person’s personal details and deepfake videos. Over 1,000 participants contacted the victim directly, then tormented them further by sharing the victim’s reactions.”

Are there any deepfake sexual crime chat rooms targeting male victims?

“I haven’t seen any; they all targeted women. The deepfake tools available on Telegram seem primarily designed to manipulate images of women.”

Why are there so many ‘humiliation rooms’ targeting acquaintances?

“‘Incel’ (involuntary celibate) culture plays a significant role. Some men derive pleasure from degrading women they can’t access in real life, often without recognizing the criminal nature of their actions. Deepfake crimes create scenarios where victims are falsely portrayed, leading to severe emotional distress as their images and personal information are widely circulated. This issue is far from trivial.”

Have any victims reached out to you?

“Yes, I’ve received numerous messages, all expressing gratitude. I’m not earning money from this, but the support I’ve received motivates me to continue my efforts.”

What actions are necessary to eliminate digital sexual crimes?

“Perpetrators must be made to feel that they are not beyond reach. They should receive strict punishment without leniency for ‘first offenses’ or ‘diminished capacity.’ Light sentences only encourage them to find ways to avoid being caught again. Many don’t realize they could also become victims.”

☞ Deepfake

A term combining ‘deep learning’ (a form of artificial intelligence) and ‘fake,’ used to describe AI-generated content that manipulates images or videos to create deceptive or misleading visuals. Recently, applications that allow users to easily generate explicit material by merging social media photos with pornographic content have become increasingly common and are often exploited for sexual crimes.