Just a quick heads up before we get started. Today's show is about non-consensual pornography. We use several graphic descriptions and words. Please take care when listening.
When did you first realize that your image was being deepfaked? It was in early June of last year. It was just one evening. I was bored online at night. And, you know, sometimes like everybody Googles their name out of curiosity just to see what would come up.
Joanne Chu is an actress and visual artist based in LA. And the first couple listings were my website, my IMDb page. And then I scrolled down and I said, what is this? What Joanne saw were deepfaked pornographic videos and images of herself. More than 400 pieces of content.
It was Joanne Chu, petite Asian cunt stretched out or impregnated by five black men, or Joanne Chu, stupid Chinese whore, that type of thing. But none of it was actually her. She'd been deepfaked. Joanne was stunned. I think the only way that I could process it was to kind of separate myself from it and then just observe it from above. But at the same time, I was also feeling so much rage and it's just...
How can somebody do this and just post like all over the Internet? I was just I mean, I still can't believe it. Joanne still doesn't know the identity of the person who targeted her. He's had a handful of different screen names on multiple platforms, but his obsession with her seems to remain constant. I posted something on TikTok saying, you know, for the last few months, this has been happening. And I filed a report with YouTube, like YouTube, what are you doing?
And a few days after that, I noticed that whoever this person is, he posted a strange YouTube video saying, my apologies to Joanne Chu. I didn't mean to hurt you. And here is a collection of images celebrating you that are not lewd in nature or something. And it was just the craziest. Exactly. And that was almost worse than the graphic content in a way because he was making all of these random fantasy storylines. And then he started...
Joanne Chu was a little girl, Joanne Chu this, and I was just, oh God, please don't be involved with making content about underage children. After limited success with the various online platforms, Joanne fought back, hiring a digital investigator to find and remove the abusive content. But her story's not over, and she still has to exist online as an actor and an artist.
I guess if I wanted to talk to people about it, first of all, this is not something that I would wish on my worst enemy. And I would want to tell people, don't let this make you feel like you don't own your own body or own your own image anymore, because that's what I felt for a while. And this is not your fault. You're not alone. Don't be afraid to speak up.
Today on the show, how AI supercharged non-consensual pornography, giving anyone the ability to target another person from the palm of their hand. I'm Lizzie O'Leary, and you're listening to What Next TBD, a show about technology, power, and how the future will be determined. Stick around. ♪
I first heard Joanne's story from reporter Sam Cole, who I consider the preeminent American journalist covering deepfakes.
Sam's a co-founder of 404 Media, and she's been reporting on deepfake technology for years. Deepfakes first really started appearing around 2017, but they were pretty basic. They didn't look great and took a lot of work to make. You needed multiple images, a specifically trained algorithm, and a level of technical skill that most people didn't have. But now, the advent of generative AI has brought deepfake technology to the mass market.
You can just type in a prompt. You can say, I want to see Taylor Swift doing such and such. And you don't even need an image. You can just use text. There's apps that do this. You know, we see middle schoolers doing it. It's just that easy now. It's not something that you need coding skills to do, which you used to have to. That's the landscape that we're dealing in now: it's so simple to do it just through your phone.
Whereas you used to need like a computer, a PC, like, you know, a pretty good gaming PC and some skills to do it. Is there a benign use case for this technology? Because we hear about the non-consensual porn as a use case, and I don't even want to call it porn. Abuse. But is there a good use case? Is there a way that this tech is ever harnessed for other things?
So when it first came out, the famous example is Princess Leia in Star Wars, in the new Star Wars movie that came out, because they put her face onto her older body, or they swapped her into the new movie to make it look like she was acting as her younger self. So Hollywood has been using it for a while. I think it was Robert De Niro, one of his movies had it where he's de-aged, and it was a deepfake technology thing. Yeah.
Movies have been doing it that way for quite a while now, but it's
still not something that's super common because people are turned off by it in general. Like they don't like to see it. I could see it being used in like advertising and marketing. And like, if you want to change the look of an actor in an ad, but you don't want to redo the whole shoot, which is a huge time expensive thing to do in advertising. And it happens all the time where the client's like, I don't like her hair. Can we put a different
Hello.
Just to be clear for everyone seeing this, I am a version of Chris Pelkey recreated through AI that uses my picture and my voice profile. I would like to make my own impact statement. So, hello everybody. Thank you so much for being here today. It means a lot. To Gabriel Horcasitas, the man who shot me: it is a shame we encountered each other that day in those circumstances. In another life, we probably could have been friends.
I believe in forgiveness and in God who forgives. I always have, and I still do.
he was accepting the apology of his killer and saying, I forgive you for killing me from the grave. It's so, I laugh because it's absurd. It's so freaky. And the judge was moved by it. The judge was like, I love this. I'm so moved. I'm so glad you did that. And the family did it completely. Like the family, his sister did it. The family was totally on board. They said, you know, this is realistic to what he would have said, but like,
Who's to actually say what he would have said? Right. I don't know. I mean, they would know him better than anyone, but he's still his own person. So I feel like that's going to be it. I wouldn't, I don't know if I would call that benign use of the technology either. But it is a use. It's a use that's not illegal.
One of the things that is unique about Joanne's story is that it seems to come from one guy. Like when I think about deepfake harassment campaigns, I think about a lot of trolls coming after a celebrity, Taylor Swift being the example. But Joanne, despite being an actress, is not a celebrity. And this really seems like it all came from this one guy. Yeah.
Have you ever seen something like that before? Is that common? So you're right. When I talk to deepfake victims and when I see this happening online to celebrities and stuff, it's usually like a community of people, for lack of a better word. It's like a group of people making this big effort to make the best Scarlett Johansson deepfake that they can, or whatever it is. So yeah,
And usually they're just like anonymous hordes of people that do this stuff. But her case I thought was really unique because it seemed to be just this one person who was really obsessed in a really scary way with her, and then would respond back and forth to her. Because usually if you try to talk to these guys, they don't really take the bait. They don't bite at all.
Other than I managed to talk to the first guy who made deepfakes in 2017 very briefly. And what did he say? He said what they all say, which is that if he didn't make this technology, someone would eventually. And technology can be used for good and bad. So who's to say that what he's doing is necessarily bad or not inevitable? They all say the same thing.
Reclaiming your image after being deepfaked is exhaustingly difficult.
It usually involves finding the video on a platform, say X or YouTube or a porn site, and sending in requests for it to be taken down. But platforms are often slow to respond and sometimes don't follow through, not to mention the emotional toll that the process takes.
Joanne eventually hired digital investigator Charles DeBarber to help her out. The process when you have an investigator is they can kind of go a step further and like really start doing that for you and start sending those letters for you, sending the DMCA requests for you.
And it takes a huge burden off of you, but a lot of the time you're the one still ending up having to find all those links because, you know, you are impacted by it the most. So you're motivated to do it. Investigators definitely have more tools in their toolkit to do this. And they were also trying to find him by sending him links that he would click, and it would reveal his IP address. So that was part of talking to him directly: can we get him to show where he is, who he is? You know, can we kind of keep this going? Which is a huge burden on victims...
You know, acting as, again, it's kind of like live bait for their abuser. And this is something that happens, again, it's like this is common across lots of different abuse scenarios, but engaging directly with this guy was important to try to find who he is. Joanne never went to the police. Why?
She had had a bad experience with the police in the past where she had gone in and tried to report something and they basically told her to go away. This is really common in abuse victims in general is the hesitancy to go to the police because a lot of the time the police will say, well, you don't have enough evidence. They'll say, what did you do to deserve it? Which is horrible. You know, they, I mean, in the case of a lot of the time, it's like if you're
If you're a sex worker going to the police, which Joanne is not, she's supportive of sex work, but she's just like, it's not my choice in life, so I don't want to be thrust into this scenario. But sex workers a lot of times are abused by police themselves. You know, you're risking going from one harasser to another, essentially. And it's a systemic issue that happens all across the country. It's just hard to get help when...
the police don't believe you a lot of the time. When we come back, Congress just passed legislation around deepfakes. Sam is skeptical.
We're talking about Joanne's story not long after Congress passed the Take It Down Act, which...
would enact strict penalties for distribution of non-consensual imagery. And it passed with overwhelming bipartisan support. What do you make of the legislation? It's tough because I like to see this sort of thing being taken more seriously. I like to see online abuse, non-consensual imagery in general, taken seriously, of course, anytime that's the case. But this is just...
a setup for more censorship, in my opinion. The Take It Down Act includes no safeguards against abuse of takedown requests, and it sets up platforms to comply quickly with abusive requests. So when I say abusive requests, I'm saying, you know, people abuse copyright all the time by saying, I don't like this video, so I don't want it out there, when it's really not
legally any of their domain to be allowed to take that down. So it's a huge problem. And it's a problem for censorship. It's a problem for journalists. It's a free speech issue, of course. But the Take It Down Act requires platforms to remove any offending content within 48 hours. Wow. Which is so short. It's so fast. It's so fast.
Especially considering a lot of these big tech platforms like Facebook and X and Instagram and all these others that have a lot of non-consensual imagery on them, to be honest. They are all gutting their moderation teams. So they don't think that moderation is a priority anymore in a lot of these cases, because they're just like, well, everyone should be allowed to say whatever they want. And especially under Trump, it's like, oh, well, you know, we're getting rid of the woke rules that said no hate speech, or whatever it was. I'm being really extreme in that description, but, you know, Zuckerberg said that he was not interested in moderating speech in that way anymore. And now you have fewer and fewer actual human moderators doing the job that is already really grueling and difficult
and time-intensive, and now you have this federal law that says if you don't comply in 48 hours, we're going to come down on you with, you know, the FTC is going to come after the platform. So what's going to happen is they're going to just push everything through and then figure it out later. That approach, Sam says, might mean veering too far in the other direction. So images like a breastfeeding mother get treated the same way as nonconsensual porn.
On top of that, the fast-tracked process might give internet trolls and abusers themselves the ability to target content they don't like.
Once you have trolls or whoever abusing these laws, the platforms are going to be like, oh, shit, I need to comply with this federal law as fast as possible. And what's going to happen is just like mass censorship. The platforms are going to get rid of adult content altogether, which would be very extreme. But, you know, it's like I could see it happening for sure. It's just saying we can't deal with this. This is too much. The burden is too high.
it's too much money to hire more moderators. It's too much expense. So no more adult content at all. No more not-safe-for-work stuff, because it's creating a risk that we can't handle. And that I think is going to happen with smaller platforms especially, because they definitely don't have the capabilities to comply with this. And Take It Down also requires platforms to not just take down that one instance that's being reported. It requires them to take down all other instances, copies essentially, of that content on their platform, which would require fingerprinting essentially every piece of content, which is technologically expensive and time consuming. And it's not something that small platforms will be able to do, let alone what big platforms want to pay for and what they want to make the effort to do. So I don't know. I just think it's going to be really bad for adult content. I think it's going to be really bad for
everyone in general, because what's bad for adult content is bad for everybody eventually. I think probably we're going to see a lot of AI moderators jumping in, just scanning an image and saying, OK, that's lawful and that's not. It's hard to argue that removing non-consensual pornography is a bad thing.
But Sam says this is a free speech issue, and that the way the law is constructed could mean that any content deemed objectionable could be removed without a fair review. I do think, especially under this administration, and considering this is a law that would fall to the FTC, a big roadmap for a lot of
people in the administration now and supporters of Trump is to get pornography taken from the Internet entirely. Like they don't want to see any adult content on the Internet at all. And that's legal, lucrative content that's very popular, and it's all over the Internet. And they see it as
this ethical and moral evil that's beset our country, and they want it gone. So I see Take It Down, honestly, as a step in that plan. It's making the law do the ethical and moral work for a lot of these politicians that just hate porn. They hate queer people. They hate sexual expression, and they don't want to see any of it on the internet or in public
in general. And the internet is now our public space. Well, it's sort of reminiscent of FOSTA and SESTA. I'm going to spell out the acronyms. They're so incredibly long. FOSTA is the Allow States and Victims to Fight Online Sex Trafficking Act, and SESTA is the Stop Enabling Sex Traffickers Act, in that...
Those sound like they were born of good intentions, but the ramifications were pretty great. Like, could you explain what happened with those laws? Like you said, it's these are laws that had good intentions, quote unquote, if you read them by if you believe. If you read the title. Yeah. It's like, yeah, who's against it?
taking down sex trafficking, or getting rid of trafficking on the internet. That seems great. Everyone wants this. You know, porn performers want this. People in the adult industry want this. Everyone wants this. But
the result of this was everything construed or perceived as, you know, quote unquote trafficking, which was a lot of stuff, you know, if we're talking about meeting up in person, things like that. It's just a lot of consensual adult activity there that
could be construed as trafficking. And what happened was the platforms said, you know what, we're not going to deal with this, because it's too much to have the FTC come after us, have these huge fines and fees and things like that. So they preemptively in a lot of cases just took down
anything that could be wildly imagined as trafficking. Craigslist took down a bunch of their adult sections. Smaller dating sites took down their entire services because they were just like, we can't deal with this law anymore. And it affected platforms in ways that
I think people, you know, like sex workers expected it and it was coming, but the average person was just like, oh my God, like I can't access a lot of the stuff that I used to because of this law. And it hurt a lot of people's livelihoods as a result. I don't know. It just, it affects our ability to like talk to each other online, which I think sucks. That's like...
A huge part of existence now is being able to connect with each other. You have been reporting on this issue for so long. I think of you as the preeminent reporter on it. And I wonder, seeing how the tech has morphed over time, where is this going? So I used to be a lot more optimistic than I am now.
Like within the last few months, I used to be way more optimistic about, you know, adult, the adult industry has always survived and thrived no matter what bullshit gets thrown at it. Very adaptable industry, very resilient people. So I think just talking about that specific industry, which again is, it's not a niche thing. It's really important and huge. It's a huge segment of industry.
life online. I think what's different and new now is the people at the wheel are now controlling how these laws get passed, if they get passed. And also the laws are just suggestions now. It's like the laws don't really even seem to matter that much in a lot of cases. I don't know. It's like, obviously I want the legal system to keep functioning as it is, but like I would like...
I would like to see people give a shit about, you know, like the basic constitutional rights. And I don't know if that's so much the case anymore in a lot of leadership. So I think with the Trump administration, we're seeing it totally stacked with people who just hate anything that's not heteronormative, Christian sex and life in general. And that's really dark and scary to me. And I think that...
Unless there's a huge pushback against some of the stuff that they're trying to pass, and some of the stuff that they're just saying out loud, you know, it's just, they're going to keep bulldozing through our rights. Sam Cole, thank you so much for your reporting and for talking with me. Yeah, thank you so much. Samantha Cole is a co-founder of 404 Media. Special thanks to Joanne Chu.
And that is it for our show today. What Next TBD is produced by Evan Campbell, Patrick Fort, and Shana Roth. Slate is run by Hillary Frey. And TBD is part of the larger What Next family.
And if you like what you heard, the number one way to support our independent journalism is to join Slate Plus. You get all your Slate podcasts, like this one, ad-free, as well as access to exclusive bonus content. Just head over to slate.com slash whatnextplus to sign up. All right, we'll be back next week with more episodes. I'm Lizzie O'Leary. Thanks so much for listening.