
Deepfake porn crisis: How it’s affecting schools in South Korea

2025/3/24

What in the World

Chapters
The episode opens by describing the reality of deepfake pornography in South Korean schools: more than 500 institutions are affected, and the perpetrators are often teenagers. It explores the scale of the problem and the reasons behind the perpetrators' actions, focusing on their lack of awareness of how serious those actions are.
  • Over 500 schools and universities in South Korea are dealing with a deepfake porn crisis.
  • 80% of those arrested for creating and distributing deepfakes are teenagers.
  • Many perpetrators view their actions as a "silly prank".

Transcript

This BBC podcast is supported by ads outside the UK. With the Amex Gold Card, turn your errands into rewards with four times membership rewards points at US supermarkets. Then grab a little pick-me-up from Dunkin' courtesy of Amex. With the $84 Dunkin' credit, earn up to $7 in monthly statement credits when you pay with the Gold Card. The Amex Gold Card rewards you on life's necessities.

Enrollment required. Terms and points cap apply. Learn more at americanexpress.com/us/explore-gold.

I'm Zing Tsjeng. And I'm Simon Jack. And together we host Good Bad Billionaire, the podcast exploring the lives of some of the world's richest people. In the new season, we're setting our sights on some big names. Yep, LeBron James and Martha Stewart, to name just a few. And as always, Simon and I are trying to decide whether we think they're good, bad or just another billionaire. That's Good Bad Billionaire from the BBC World Service. Listen now wherever you get your BBC podcasts.

Let's say you open up your phone and you see a naked photo that's been shared loads of times across social media. It's of you. But you know it's not a real photo. It's a deepfake. A stolen photo of your face mapped onto someone else's nude body.

In South Korea, students have been waking up to this nightmare. More than 500 schools and universities have been affected. The perpetrators are often students themselves. In fact, 80% of those arrested for creating and distributing deepfakes are teenagers. They think it's just a silly prank, but it can have a huge impact on people's mental health.

So today, you're going to hear about South Korea's problem with deepfakes in schools. The country is known for its cutting-edge tech, but what happens when students use it to humiliate their classmates? This is What in the World from the BBC World Service. I'm William Lee Adams. Here to tell us more is Hyojung Kim from the BBC's Korean service. Hyojung, hi. Hi. So let's just start with some numbers. How big of a problem are deepfakes in South Korea?

Yeah, actually, the deepfake issue is really significant and serious in South Korea. We have seen a surge in deepfake pornography, particularly affecting schools.

At the end of August last year, there was a report that more than 500 schools were affected. These incidents usually involve deepfake videos or photos made from pictures of students and teachers. These images are either taken from photos on social media like Twitter and Instagram, or secretly taken at schools. According to information we got from the police, there were 1,202 reports of deepfake-related sex crimes in 2024.

This represents a significant increase from the 156 cases reported in 2021. Obviously, for the victim, this can be upsetting, embarrassing, demeaning. Why are people making these deepfake images?

Yeah, that's a good question. Why do teenagers engage in these kinds of criminal activities? Many are not fully aware of the seriousness of their actions. According to a survey conducted by the Ministry of Education last December, which included 2,145 first- and second-year middle school and high school students, over 54% of the students, when asked about the reasons behind committing deepfake sexual offenses, cited "it's just for fun". That was the primary reason. Other reasons included curiosity or the perception that the punishment for such offenses is lenient.

Actually, one teacher we spoke with, she told me that some students would frequently make sexual comments and push their classmates toward female teachers, not male teachers, to make some kind of physical contact. So when she tried to address this behavior, the response from them was often like, "Oh, I was just messing around," or "It was just a prank."

Actually, I want to share my experience from my own school days. I attended middle school and high school in the early 2000s. At the time, some boys used very small mirrors hidden in their hands to look up the skirts of young female teachers. So it was often considered just a prank.

Although it made me uncomfortable, the societal attitude at that time did not view it as criminal. So in my view, these deepfake crimes are an evolution of that old perception of women as easy targets, now amplified and facilitated by technology. What I'm hearing is a lot of students find this funny, just a joke, and they don't think about the consequences. Do you think that misogyny underpins this? Is there some sort of double standard where it's okay to mock a woman but not a man?

Yeah, that's right. According to the survey, most of the victims, about 90%, are female. And how is this topic broached in schools? Do teachers and administrators make it clear this is a serious offense? Actually, I interviewed some students, and they still feel that nothing has changed at their schools. We spoke with a female student.

I remember that she told me everyone is worried that they might be the next target, but there's no clear plan at the school for what to do next. She also told me, "We thought there would be education in schools all over the country, whether or not something happened there, but nothing has happened."

The victims, if they see these photos, they'll know that's not me, that's not my body. However, the fear, of course, is that other people may not realize that. How realistic are these deepfake images? We interviewed one of the teachers who was affected by the deepfake images. So she told me when she first...

When I was a teenager, someone made an image of me by putting my head on a woman's body, and they printed it and put it all over the school. And back then, the technology was quite bad, so it very clearly wasn't me. And yet I still felt quite upset about it.

Did any of the people you spoke with talk to you about their mental health, like what the consequences were? You know, going back to school, because that's your job or that's a place where you study, you have no choice. The most memorable person we interviewed was a high school teacher in Incheon, near Seoul. She was very shocked when she was shown a photo that had been uploaded to X showing close-up images of parts of her body. She says she became frustrated because the perpetrator was her student. But she had to find the perpetrator by herself because of the lack of police action after she reported the images.

The investigation is still ongoing. She told me that she was very, very disappointed and she was very sad. Hyojung, have any people been prosecuted for making or circulating these images? Last year, the police arrested 600 people.

When you say legal punishment, what is that punishment? Are we talking jail time, a monetary fine? It depends on the age of the perpetrators, because in South Korea the law considers anyone under 14 too young to be held criminally responsible. This means they cannot be punished under criminal law, even if they break it, so they cannot be put in jail. However, they can still be given educational programs or community service.

So if they commit a serious crime, including deepfake sex crimes, they may be sent to a juvenile training center. In a juvenile training center, they receive counseling, probation, and education to prevent reoffending. The center is more like a training facility than a prison, so staying there doesn't result in a criminal record.

And as you said up top, this is a nationwide problem affecting schools all over the country. So, is the Ministry of Education addressing it? Is there a plan of action to combat this? Actually, we reached out to the Ministry of Education to check on this matter.

The Ministry of Education told us that they are doing a lot to address this issue. They said they are working hard to make it clear, through education, awareness campaigns, and other actions, that any pictures or manipulated images taken without someone's permission can be a crime. They have also created educational materials and sent various official documents to schools.

However, the victims we met mentioned the need for more proactive support and changes. For example, there isn't yet a manual issued by the Ministry of Education on how teachers should respond when they are victimized. Hyojung, thank you so much for explaining that. Thank you very much. Thank you.

Now, this isn't just an issue in South Korea. It actually affects women all over the world. This is Noelle Martin from Australia. She spoke to the BBC after discovering a deepfake that was made of her. I got ruthlessly and mercilessly attacked by the public because this was pre-MeToo movement. This was at a time when this wasn't a topic of discussion in the media like it is today. So I was completely ripped apart.

I found out about people creating doctored pornographic images and sharing them on the internet when I was 18, but they had been targeting me before I ever found out. And over time, they have escalated in terms of the amount of images they shared of me and in terms of how graphic they were. And as the years passed, they also created deepfake videos of me, pornographic deepfakes that falsely depicted me having sexual intercourse and performing oral sex. And they've just kept taunting and trying to intimidate me ever since. I think originally it began as almost like a sexual fetishization, because a lot of the images of me were on pornographic sites that were for bustier women. And then they became more mainstream. But over time, because I started fighting back publicly, and also being involved in law reform here in Australia and then speaking out all over the world, I think the motivation changed for them. And they just wanted to silence me and taunt me, because they kept escalating the abuse in terms of how advanced it was becoming.

So what can you do if you discover deepfake porn of yourself or if you face image-based harassment online? Jess Davies is a presenter and women's rights campaigner, and she made a BBC documentary all about the topic. Here's what she advises.

My first piece of advice for anyone who suspects that they have fallen victim to deepfake abuse is not to panic. I know it sounds so much easier said than done, but you're not alone in this. Unfortunately, a lot of people are falling victim to this technology at the moment, but there are avenues you can go down to get support and to seek out justice, whether that's via the legal route or by having your images removed from the site.

Now, I know it can be really triggering to view this content, especially if it's quite graphic or exploitative in nature, but it's so important that you take screenshots of the images and videos and where they've been shared. Ask a family member or a friend to help you if you don't feel that you can do this yourself, but it's really important to collect those screenshots, to save the usernames of the people sharing your content, and also to save the full URL links of the website and of the specific forum or thread where your content has been shared, because this can all be used as evidence if you do decide to go to the police and when you submit takedown requests to websites. If you know where your images have been shared, contact the website via the "contact us" button, or, if you scroll down to the footer of some websites, they usually have a specific DMCA takedown request link.

Now, in certain countries, it is a specific crime to distribute an explicit deepfake of someone without their consent. So if you do feel up for it, definitely report this to the police, because it is a crime in many countries. And even if there isn't specific legislation, they may still be able to help you.

My final piece of advice would be to seek out support helplines, because they can be really helpful in giving victims of image-based abuse the support they need and offering a listening ear. They aren't going to judge you, and they're not going to blame you, because this isn't your fault. And if you don't want to go to the helplines, please, please just turn to a friend or a family member and let them know what you're going through, because it can feel really isolating and lonely to experience digital harms. But know that you're not alone, it isn't your fault, and there is support out there.

That brings us to the end of this episode. I'm William Lee Adams. You've been listening to What in the World from the BBC World Service.
