
The ‘Take It Down Act,’ Explained

2025/6/11

KQED's Forum

People
Aisha Wahab
Jasmine Mitani
Paresh Dave
Topics
Jasmine Mitani: The Take It Down Act aims to combat the distribution of non-consensual intimate images, including non-consensual selfies and deepfakes. It requires internet service providers to offer a user-friendly way to request removal of these images and to complete removal within 48 hours. The law also covers threats to release images, in an effort to deter blackmail. Its criminal provisions are already in effect, while the takedown provisions take another year to implement and will be enforced by the Federal Trade Commission; violators face fines and jail time. The request-and-removal portion of the law is modeled on copyright law's Digital Millennium Copyright Act, but it lacks sufficient safeguards, for example for journalistic content or material relevant to the public interest. The 48-hour deadline means companies will take down anything that is reported, whether or not it is non-consensual, which could affect people of color, queer people, models, or sex workers, because their online sexuality is most heavily policed and penalized by platforms. Paresh Dave: Senators heard stories from teenagers affected by nudify apps or by non-consensually shared intimate images, and those stories motivated them to push this bill. It is far too easy to create and share this kind of content; these apps are sometimes even advertised on Facebook, and after you download one and upload a photo, within minutes you may have a synthetic image showing someone in a state of undress. Intimate photos spread after a relationship with a partner goes bad is also a common problem, and those photos can instantly spread across the entire internet, which is why the 48-hour takedown provision is so important. Tech companies such as Meta and Microsoft also support the bill; they want some regulation to fall back on that in some way reduces their liability. Many sex-positive or sexual-rights advocacy groups worry that the Take It Down Act could become a wedge toward banning all kinds of online sexual content at the federal level. Aisha Wahab: California law also aims to address digital sexual exploitation. SB 981 lets a person reporting a depicted image have the platform freeze it for 48 hours, after which a timeline kicks in to either determine that it is inappropriate content and remove it, or not. SB 926 addresses the unauthorized creation and distribution of non-consensual sexually explicit images. The federal Take It Down Act allows a 48-hour timeline to remove a reported image and any known identical copies, whereas our SB 981 had to thread the needle, especially in California, where many tech companies argue this is free speech and that platforms should not be required to do any kind of interception, which we also disagree with. The opposition is what concerns me, especially given that AI is very useful to many people but also very dangerous to many others.


Transcript


Support for KQED Podcasts comes from Save the Bay, protecting and restoring San Francisco Bay for over 60 years. Community and corporate volunteer opportunities available along the shoreline to help drive positive change. More at savesfbay.org.

Switch to Comcast Business Mobile and save hundreds a year on your wireless bill. Comcast Business. Powering possibilities. Restrictions apply. Comcast Business Internet required. Compares two unlimited intro lines and lowest-price 5G plans of top three carriers. Taxes and fees extra. Reduced speeds after 30 gigabytes of usage. Data thresholds may vary.

From KQED in San Francisco, I'm Mina Kim. Coming up on Forum, we explain the federal Take It Down Act aimed at curbing revenge porn and deep fakes. When it was signed into law last month by President Trump, it was called a milestone among supporters in the fight to stop the spread of non-consensual, intimate depictions, especially of women and girls. This hour, we look at how the law is being implemented and how you can protect yourself if you've been victimized online.

We'll also learn why some First Amendment watchers say the new law is deeply flawed. Join us. Welcome to Forum. I'm Mina Kim. A rare moment of bipartisanship in the federal government this year was when lawmakers came together to overwhelmingly pass, in a vote of 409 to 2, the Take It Down Act.

Aimed at cracking down on the sharing of revenge porn, the law responds to what's been called an epidemic of online degradation: the proliferation of sexually explicit images and hyper-realistic deepfake videos of everyone from female celebrities to middle school girls.

President Trump moved quickly to sign the act into law last month, but the new statute is not without its critics. We learn more this hour with my guests. Paresh Dave is a senior writer at Wired whose piece is called Trump Signs Controversial Law Targeting Non-Consensual Sexual Content. Paresh, thanks so much for being with us. Great to be here.

Also with us is Jasmine Mitani, data and technology reporter for The 19th, a nonprofit newsroom reporting on gender, politics, policy, and power. And her piece is called Here's How You Can Use the Take It Down Act. Jasmine, really glad to have you too. Thanks for having me. So let me start with you. What does the Take It Down Act do?

So the Take It Down Act has two parts. First is that it criminalizes the publication and distribution of non-consensual intimate imagery. And this means anything from, you know, a selfie that someone took that was meant to be shared privately and then was posted online without their permission to these non-consensual deep fakes such as those nudify apps or something like that where a person is sort of undressed.

And then the second part of the law is creating a request and removal provision where internet service providers are required to have sort of a user-friendly way for people whose non-consensual images, intimate images are posted online to request for them to be removed. And they must be removed within 48 hours. And I also understand that

The Take It Down Act also covers threats of releasing images. Correct. It's not just when the harm, after the harm has happened. It's also trying to take into account if someone's trying to blackmail someone, say, you know, you must do this or else I'll release these images, and trying to criminalize that behavior as well. So it's not exclusively after the harm has occurred.

And what's the timeline for the implementation? Has it gone into effect completely? So the criminal provisions of the law went into effect as soon as President Trump signed the bill. The request and removal takedown part of the law, internet service providers have one year to implement that process. So that would go into effect and the Federal Trade Commission could start enforcing that in May 2026.

And so the Federal Trade Commission is responsible for enforcement. What are the penalties if you are, say, the creator or sharer of that kind of content, or if you're a company that doesn't take it down in 48 hours? There are fines, and for people who post it themselves, there is jail time as well as part of the law. And also for companies there are fines. So is it initially a misdemeanor then?

These, I believe, are felonies. Felonies from the start.

Paresh, can you remind us why some kind of effort like this was needed? How bad the problem of non-consensual imagery has become? Yeah, I mean, I spoke to the offices of two of the senators who really led the charge on this, Amy Klobuchar and Ted Cruz. And I think the inspiration for them was hearing from some of the teenagers even who were affected by

by nudify apps in their high schools or sharing of non-consensual intimate imagery that they had sent maybe consensually to someone and then it spread far and wide from there. And, you know, these are people who have families themselves. It is very concerning to them, these stories that they heard and, you know,

There's no reason that anyone would want to sort of support this kind of behavior. And so it was an easy victory for both parties. And how fast and easy is it to create something like this and share it?

Far too easy. These apps are even advertised sometimes on Facebook and places where people are congregating all the time. You download it, you upload a picture, and in a few minutes you might have a synthetic image showing someone in a state of undress. And then, of course, images that people take maybe to send to an intimate partner or

things go bad with that intimate partner, they start spreading that online. That is also a common issue. And of course, you know, that can happen instantly where you post it in one place and it spreads across the internet really quickly, which is why the takedown provision of 48 hours is so essential. Yeah. You also wrote about how hard it is to get these removed. You profiled Breeze Liu and her story. Can you talk about how hard it's been?

Yeah. So she was, you know, a victim. She learned, while she was in college, about a video of her that was online. Eventually, she worked with some organizations to identify that there were hundreds of links of different images or screenshots of her in an intimate state.

She tried contacting various organizations and websites to get the image down, to get those videos down.

Eventually, she landed on around 100 images that Microsoft just would not remove. She and a colleague had to corner a Microsoft executive at a conference, and it was only after that and about eight, nine months, almost 10 months, that the image was taken down. And this was years after the original situation.

It can be grueling. And not everyone has the wherewithal to pursue that because this is a very emotional, draining thing to deal with. You can't always have the wherewithal and the courage to stick it out through a grueling process like that. Jasmine, what have you heard from advocates in terms of the emotional toll this can have?

Yeah, so a lot of advocates have been calling this distribution of nonconsensual images digital sexual violence. Research has shown that the impact on survivors is similar to people who have been assaulted sexually, physically, in real life. And that's why this is sort of being treated like such a serious crime, because of the long-lasting impacts as well,

you know, and that it can be very difficult to get offline, like in the case that Paresh just talked about. It's something that can plague victims for years. Yeah. Which may explain what you alluded to earlier, Paresh, just the widespread support for the bill from both parties, for example. Tech companies were behind it too, though, right?

Absolutely. Meta, Microsoft, who I just mentioned, got behind it, which is kind of funny because Microsoft took so long in that case that I just explained. And here they're advocating for 48 hours. But I think tech companies like to have some sort of regulation to fall back on, something that's sort of...

takes away the liability from them in a way. And in this case, that is what this law does. If they do, you know, comply with the takedown provision, it sort of absolves them of liability down the line. There were some other groups that were interested in this bill, for example, religious groups, Paresh, who are strong supporters. What was their reason for that?

So that is an interesting element to this, where there's a lot of Christian groups, conservative groups that for years have been advocating for the removal of sexual content from the web or more guardrails. We've seen in a number of states in the country where age verification laws have been passed, where to access pornography websites, you have to upload an ID or prove your age in some other form or fashion to show that you're an adult.

Those laws, again, very much, you know, pushed and advocated for by these Christian groups or these religious groups. And you're seeing the same here. And it's kind of like there's a concern among a lot of sort of

sex positive groups or sexual advocacy groups, that this law, the Take It Down Act, is kind of a wedge at the federal level to move towards banning all kinds of sexual content across the web. Yeah, you've heard about that too, right, Jasmine? Concerns from sex positive groups and others. Yeah, I think particularly the

request and removal part of the bill. It's sort of modeled on the same thing that we use for copyright law, the Digital Millennium Copyright Act, which I think many people are familiar with, you know, suddenly music disappearing from videos on YouTube. There's many ways that that sort of impacts your life online.

But it doesn't have as many guardrails, you know, for things such as journalistic content or things that could be relevant to the public interest. Also, when it comes to things like this, there's potential for it to be abused. That's an argument that has been made by many journalists and

groups concerned with the bill infringing on free speech: that a 48-hour timeline means that companies will just take down anything that's reported, whether it's non-consensual or not. And if people are reporting any content that could be sexualized, my guess is that this would impact people of color, it would impact queer people, it would impact people who are models

or sex workers, because those are people whose online sexualities are most heavily policed and penalized by platforms. Yeah.

We're talking with Jasmine Mitani, data and technology reporter for The 19th, and Paresh Dave, senior writer at Wired, about a new federal law signed last month that's been called a milestone in protecting victims of revenge porn and sexually explicit deep fakes, but has raised concerns from some First Amendment watchers and privacy advocates. It's called the Take It Down Act. And listeners, what questions do you have about this new law?

Do you believe that it is an effective way to address the issue of sexually explicit non-consensual images and hyper-realistic videos that are being distributed online, which predominantly, unfortunately, affect women and girls? Or do you believe it goes too far? Are you skeptical that the government can effectively police online abuses?

If you've ever tried to have content taken down on an online platform before, how did that go? You can tell us by emailing forum at kqed.org, finding us on our social channels, Blue Sky, Facebook, Instagram, or threads at KQED Forum. And you could call us at 866-733-6786.

Again, that's 866-733-6786. Jasmine Mitani's piece for The 19th on this topic is called Here's How You Can Use the Take It Down Act. And Paresh Dave's piece in Wired is called Trump Signs Controversial Law Targeting Non-Consensual Sexual Content. Stay with us. More Forum after the break.

Support for Forum comes from the University of San Francisco School of Management. Celebrating 100 years of partnership with the Bay Area business community, the USF School of Management connects students to the city's vibrant culture, hands-on internships, and a wealth of career opportunities. Where AI and sustainability are integrated into every facet of business education.

and where students bring innovation, ethics, and entrepreneurial leadership to a planet in need. The University of San Francisco School of Management. Change the world from here. Support for KQED Podcasts comes from Earthjustice. As a national legal nonprofit, Earthjustice has more than 200 full-time lawyers who fight for a healthy environment.

They wield the power of the law to protect people's health, preserve magnificent places and wildlife, and advance clean energy to combat climate change. Earthjustice fights in court because the Earth needs a good lawyer. Learn more about how you can get involved and become a supporter at earthjustice.org.

You're listening to Forum. I'm Mina Kim. We're talking about the Take It Down Act this hour, a new federal law signed last month aimed at protecting victims of revenge porn and sexually explicit deep fakes. We're talking about it with Paresh Dave of Wired and Jasmine Mitani of The 19th, a data and technology reporter there. And we're talking about it with you, our listeners. 866-733-6786 is the number to call. You can email your comments and questions about the Take It Down Act

to forum at kqed.org, or you can find us on Blue Sky, Facebook, Instagram, or threads at kqedforum. So Jasmine, many states already have laws, including California, that criminalize non-consensual sexual images. So why was a federal law needed in this case?

So the first thing is that, you know, at this point now, as of May, every state has a law that addresses non-consensual intimate images, authentic images. But there isn't a universal standard about those laws, about, you know, what defines harm. Some of them say, you know, it will only be a crime if it is...

if the person who shared the images was intending to do harm. But we know a lot of people do this just because they think it's funny, right? They weren't intending to hurt the victim, and that can be a loophole that can prevent justice. Also, not all of the state laws cover this category of explicit deep fakes. The federal law covers both explicit deep fakes and authentic images in one law. Hmm.

Well, joining me now is the Hayward state senator who introduced California's laws on this that took effect in January. Senator Aisha Wahab is with us. Welcome to Forum. Thank you. Thank you so much. So tell us about SB 926 and SB 981, which Governor Newsom signed into law last year. What do they do?

Definitely. So, you know, we need to address the problems of digital sexual exploitation from multiple angles. One, the person creating and posting the images and two, the venue where the images are posted. So SB 926 handles the former and SB 981 handles the latter.

SB 981 basically says that when a person reports the depicted images, the platform will freeze them for 48 hours. There's a timeline that kicks in where either it's determined that this is inappropriate content and it's removed, or not. Whereas

SB 926 addresses the unauthorized creation and distribution of non-consensual sexually explicit images. It closes a loophole in revenge porn by prohibiting unauthorized distribution of artificially created sexually explicit images that cause emotional distress to the person depicted in the image. So in California, if somebody...

says that there is an image of them that is non-consensual and that's sexually explicit, you're saying that it must immediately be frozen or temporarily blocked while the platform is investigating whether or not it is in fact a non-consensual image. Yes. So that's different from the federal law? What do you think of the federal Take It Down Act? Yeah.

Yeah, which I think is a step in the right direction, right? So the federal law, the Take It Down Act, allows for a 48-hour timeline to remove the reported image and any known identical copies. Whereas ours, SB 981, we had to thread the needle, especially in California, with so many tech companies saying that it's, you know, freedom of speech and the platforms are not required to do any type of intercepting, which we also disagree with. Yeah.

Overall, SB 981, the digital identity theft bill, allows a 48-hour timeline for the platform to confirm receipt of the report and freeze it, seven days to respond to the reporting user with an update on their request, and 30 days to make a final determination,

for a total of 60 days for any unforeseeable circumstances for the platform. This does temporarily block the reported instances, right? And immediate removal once determination has been made. And it's enforced by the AG, the district attorney, county council, city attorney, city prosecutor, whereas the Take It Down Act is enforced by the FTC. So then how will the California law interact with the federal law? Do you have any concerns about conflicts?

No, I actually I was happy to see the federal law get signed as well. Overall, I think that opposition is what's concerning to me. And then the enforcement piece, especially as AI has been, you know, very useful to so many people. It's also very dangerous to so many others.

And so covered material is vague. There is also federal preemption under Section 230, which has always kind of been our difficulty when we are trying to take control of these images on these platforms. So it's threading the needle. We're not where we want to be, but these are all steps in the right direction.

I see. So have you heard about any issues related to its implementation in California? I know it was just implemented in January or took effect in January.

No, but I will also say that within the first 16 days of this year, after it was implemented, we were able to see Sacramento PD arrest an individual, a New York Times cartoonist, who had over 130 images on their computer, many of them of children and many of them digitized images, not necessarily of real children.

So that individual was arrested, and it was largely attributed to the bill that we had in place. So did you encounter any obstacles getting the votes you needed, Senator?

Yes, we always have to hear from the tech companies that this is, you know, blocking freedom of speech, this is too much work, you know, we can't referee what's happening on our platforms. And, yes.

It's ridiculous. At a certain point, we also have to say that we need to do the job and protect people. Also, technology moves so quickly that policy doesn't keep up with technology. Half of my colleagues are 50 and up, some of them pushing 70. They don't necessarily use the platforms the way that young people use them right now. And we also know that it's young girls and women and children that tend to be

the most exploited when we see these things. So I think that we still have a lot of work to do in this space, and people need to understand how technology is being used. It's not just to Google something anymore. It's literally to support people's fetishes, which is a problem. Senator Aisha Wahab, who represents California's 10th senatorial district, which includes Hayward, Fremont, Sunnyvale, and Santa Clara, thanks so much for talking with us. Thank you. Take care.

We're talking about the Take It Down Act, a federal law signed last month that has been called a milestone in protecting victims of revenge porn, but has also raised

concerns. We're also talking about California's law that you just heard about there. I do want to get into some of the concerns about it. Steve on Discord writes, I 100% agree with the intent of the original drafters of the law. And also I agree with those expressing concern. I suspect bad actors may have had input in creating some of these problems, but it is the law now. And I think we need to correct the problems called out ASAP. So

So, Paresh, go over for us why First Amendment advocates are worried about the Take It Down Act. And I'm assuming, or maybe I shouldn't be assuming, but did the California law raise the same concerns in as strong a manner as the federal Take It Down Act? I think there was less attention on the state law, so it did not attract as much attention, but the concerns are still very much the same.

And it's sort of as Jasmine said earlier:

The concern is that tech companies already rely a lot on automation or AI or machine learning to take down content, right? And to remove content within 48 hours, they're going to automate these processes. That would be a lot of labor otherwise, given the volume of requests that potentially could be made here, you know, hundreds of thousands potentially annually, right?

So what we're going to see is content taken down, and there's not necessarily an appeals process. People might not appeal. And that's what First Amendment advocates are concerned about: that there will be content that is not non-consensual imagery that is needlessly removed. And we've already seen that, as Jasmine mentioned, with the copyright protections, where

we see one company try to use the copyright law to take down the content of

another company, or one YouTuber try to take down the content of another YouTuber, even though there are no copyright grounds. They're just using this law to try to get tech companies to take the content down, because they know that the social platforms just do this kind of automatically and they're worried about the liability. And so they take content down quickly, and it becomes a headache to try to get it back online.

I see. So in their haste, they don't necessarily verify that the person asking to have the content taken down actually has a claim that follows the letter of the law. The other difference, it sounds like you're describing, is that

this process actually has an appeals process, but the Take It Down Act does not? Correct. So that's one of the major concerns there. There's also some ambiguities with Take It Down. One of the concerns is that while the criminal provisions clearly lay out a definition for that synthetic material, for that deepfake,

does the deepfake definition actually apply to the takedown part of it, where you have to remove it within 48 hours? It's a little vague, because it's not as well defined in that takedown part of the law.

So there's also the question of how the FTC could potentially weaponize the takedown provision, because the FTC can be a political body. There's concern that the FTC will use this to try to take down content that it does not like for political reasons as well. Yeah.

A listener, Nurkay, writes: does this law apply to people from other countries who commit these crimes, Paresh? I think that's a great question. And I mean, generally, yes. One of the big issues in the Breeze Liu case was that it seemed like some of the websites that were hosting this content were overseas. We've also seen plenty of cases where there are overseas perpetrators, people distributing this content from overseas.

And it becomes hard to bring them to justice in the U.S. But having this at the federal level, federal authorities generally have more recourse to try to bring justice to people overseas than state and local authorities. That's part of why the criminal provisions in the federal law are important. Yeah. Jasmine, I want to get into more of what Paresh was just saying with regard to the FTC. First, what has the president said about the law?

Yeah, so President Trump has been very vocal in his support for the Take It Down Act. Particularly, First Lady Melania Trump was a huge advocate and convened, you know, a forum at the White House, basically rallying support for this bill. And in his joint address

to Congress in March, President Trump voiced his support for it and, you know, said that he'd want to use it for himself because, you know, no one is treated worse online than him. And I know that specifically raised the hackles of many people who are concerned about it infringing on free speech. And it sort of opened the idea that he could use it to take down

any sort of critical speech. One thing I do want to point out is that the law is pretty specific about what intimate visual depiction means. As Paresh said, it's not necessarily as explicit, but the takedown provision does say intimate visual depiction. And that is, I will say, very explicitly defined in the law, in ways that perhaps we don't want to go into on the radio.

With the FTC being the enforcement body, concerns have been raised that the FTC has been politicized. Talk about why, Jasmine.

Yeah. So the FTC has five commissioners. The party in power, which would be Republicans in this case, appoints three, and then there are two Democratic commissioners. And President Trump removed two of them. He fired the two Democratic commissioners.

And that's, you know, against decades of Supreme Court precedent, to intervene without cause in an independent federal agency. And both of those commissioners have been fighting for their jobs. But one of them did recently, I believe this week, have to resign, because he said, you know, I have a family to take care of and I can't keep doing this without getting paid for it.

And also there have been layoffs at the FTC. The FTC would have to prioritize this. And there have been public statements about the importance of child online safety. I'd say that a lot of

advocates of the Take It Down Act, particularly in the Republican Party, have really emphasized that this is an online child safety law, even though it affects survivors and victims of all ages. But that's part of the problem: will the FTC actually make this a priority in enforcement? And also, do they actually have the staffing and the resources to do that?

And I will just add there that I tried to get an answer this week from the FTC about how they are progressing on the implementation and have yet to receive an answer. And Jasmine makes a great point. The funding is also important on the criminal provision. Does the FBI, do other criminal authorities have the power and the training to go out and investigate these crimes? Yeah.

Yeah, we're talking with Paresh Dave, senior writer at Wired, and Jasmine Mitani, data and technology reporter for The 19th, about the Take It Down Act. Listeners, are you skeptical that the government can effectively enforce the law or police abuses of it? Do you believe that the law goes too far, or not far enough? What questions or reactions do you have to the Take It Down Act? Were you about to say something there, Jasmine?

Yeah. And just to add to what Paresh said, you know, that is part of the reason why it's important to have both a federal law and state law, because that gives, you know, survivors and victims options of what they want to pursue. Right. You know, state court can move faster. But as Paresh was saying, you know, if it's overseas, you know, federal authorities have greater sway overseas.

And also, you know, there are other laws that have been introduced. The Defiance Act has been introduced for the second time by AOC into the House this year, and that would create, you know, a civil right of action. So that's for people who don't necessarily want to pursue criminal

charges, but they can sue for damages. And, you know, that is one of the things that can make a really big difference for survivors: getting compensation that recognizes the emotional distress and the harm that they have gone through. Yeah. So that's sort of where this could be going next with regard to creating this possibility. Yeah.

Is there a chance, Paresh, that the Take It Down Act would not be fully implemented, since, you know, the tech companies do have some time, for example, to come up with their system for taking down images within 48 hours? Are you hearing from First Amendment groups, for example, that they might try to challenge some of its provisions?

They're definitely concerned. Whether they're going to challenge it, I think, is a little unclear at this point. Part of what they may be waiting for is seeing what the FTC rules look like, how the FTC actually moves towards implementing it, how tech companies move towards implementing it. Maybe all the tech companies say that we will manually review every takedown request, we won't rely on AI, and we will make sure that it's actually an intimate visual depiction. And maybe then there's not as much of a concern here.

That seems unlikely, but I think that's part of what the groups may be waiting for. We've seen this administration ignore the law before, and agencies miss deadlines all the time, even under previous administrations. So it's unclear whether they'll actually meet the deadline in the law a year from now.

You can join the conversation by emailing forum at kqed.org, finding us on our social channels on Blue Sky, Facebook, Instagram, or threads at KQED Forum, or by calling 866-733-6786, 866-733-6786. More after the break. I'm Mina Kim. Thank you.

Support for Forum comes from the University of San Francisco School of Management. Celebrating 100 years of partnership with the Bay Area business community, the USF School of Management connects students to the city's vibrant culture, hands-on internships, and a wealth of career opportunities.

where AI and sustainability are integrated into every facet of business education, and where students bring innovation, ethics, and entrepreneurial leadership to a planet in need.

The University of San Francisco School of Management. Change the world from here. Greetings, Boomtown. The Xfinity Wi-Fi is booming! Xfinity combines the power of internet and mobile. So we've all got lightning-fast speeds at home and on the go. That's where our producers got the idea to mash our radio shows together.

Through June 23rd, new customers can get 400 megabit Xfinity Internet and get one unlimited mobile line included, all for $40 a month for one year. Visit Xfinity.com to learn more. With paperless billing and autopay with stored bank account. Restrictions apply. Xfinity Internet required. Taxes and fees extra. After one year, rate increases to $110 a month. After two years, regular rates apply. Actual speeds vary.

This is Forum. I'm Mina Kim. We're talking about the Take It Down Act this hour, a new federal law signed last month aimed at protecting victims of revenge porn and sexually explicit deep fakes. But First Amendment watchers and some others have concerns about it. We're talking about that with Jasmine Mitani, data and technology reporter for The 19th,

a nonprofit newsroom reporting on gender politics, policy, and power, and Paresh Dave, senior writer at Wired. And we're talking about it with you, our listeners, at 866-733-6786, at the email address forum at kqed.org. You can find us on our social channels at KQED Forum.

And Bumi writes: platforms have the capacity and control to prevent listing such images and to vet images before posting. Why is there not a proactive approach to ensure these types of images are vetted and authorized before making them live online? Take It Down is a reactive solution and certainly is needed, but we need to get in front of the problems.

What additional responsibility, if any, are platforms taking to prevent this? Jasmine, I think you're probably familiar with this concern, about this being just one of those things that's, again, reactive, and puts a lot of onus on the victims to have to locate and report and so on. What's your response to Bumi? Yeah.

There's two parts. The first thing I want to address is why aren't we being more proactive about this? And I want to say that is one of the examples where we get into over-policing of any type of content, especially when automated tools are involved. You know, there's so much content online that, as

Paresh said, I don't think there's a way that anyone could individually look at all of these and approve them. And we know that these automated image recognition systems often flag women as sexual even when they are not. They also are biased toward anyone who has lighter skin tones. And so that can be part of the problem of scanning every piece of content before it is posted online.

And then the other part of that: what can other companies do? It's true, there are many things. And I think that the Biden White House worked a lot with

private companies to get these sort of voluntary agreements on how to confront this issue of deepfake abuse. And, you know, one of the things in particular is, you know, delisting these apps. They shouldn't be accessible, right? Making it harder for them to advertise on platforms such as Google, delisting them. So if you try to search for something online, they don't appear. Also, a lot of these apps

that create these explicit non-consensual deep fakes, you know, require payment. And there's a role for payment processors here to just say, you know, we're not going to allow you to make any charges to this website or make any charges to these kinds of apps. And those are some of the more proactive things. And those are, again, those were voluntary commitments that were under, you know, the Biden administration. So it's a little unclear, you know,

going forward, what that would be like. There isn't necessarily a federal enforcement mechanism there. And also, I'd say the current administration has been pretty hostile to any AI policy that was passed by the previous administration. But there are definitely many opportunities in the entire digital ecosystem to stop this abuse before it even happens. Yeah. Will this effectively also...

and get at parts of the internet that tend to be anonymous, the dark web, hidden, private? Jasmine? Yeah, so that is one of the concerns

people have, that there is a large burden on victims in order to report this. They have to know what websites it is on. They have to track it down. Even with Take It Down, where there's the requirement now that it must be removed within 48 hours, they have to know that it is actually on the website. And for those things, the dark web, it's harder to know. Also, one thing that it doesn't cover, and this is an issue that

I think particularly groups who are concerned about encryption and privacy have brought up. There is some debate on whether it covers, for instance, private messaging. Lawmakers have said while this was in committee that they don't believe that this covers things like encrypted storage or encrypted messaging. But I think we're in a time where we're seeing the letter and intent of the law doesn't necessarily matter in its enforcement. Right.

A listener on Discord writes, the big beautiful bill they're pushing wants to impose a moratorium on regulating generative models and systems, making regulating AI virtually impossible and opening the floodgates on the software that will actually generate this stuff. Take it down simply forbids publishing the output on the Internet. It leaves a huge hole almost to the point that it doesn't matter anymore. Paresh, the big beautiful bill imposes a moratorium on regulating this?

On generative AI, yes. It calls for a 10-year pause on the state laws and says those state laws can't be enforced. It's a big concern for a lot of groups that are advocating for more regulation and more safety from these models that do everything from write poetry to generate funny images to generate sexualized deepfakes. And

That is a, you know, a concern. But the federal lawmakers are saying, you know, we want innovation to thrive. We don't want a patchwork of state laws. And that's why we're seeing what we are in the big, beautiful bill.

Well, this listener on our Discord says: why not push at least a bit of the responsibility onto the AI model providers, which are more than capable of detecting if they are being used to generate porn and such? It makes common sense, and I see no reason to exclude it. Jasmine, your thoughts?

Yeah, I think that is an area where we are seeing some changes. And I will say this is a place where journalism and accountability has had the most impact. You know, publications like the independent 404 Media have reported on the companies that, you know, have these models available. And that's been the most effective way to get them taken down. It's just a problem of finding the companies hosting

the particular models. And there have been issues too. There was a really prominent study by Stanford that found that some image generators had actually been trained on images of child sexual abuse material. And so they were able to then take those models down

and remove that training data. But it had been in the world for so long. And you know, the way things are shared on the internet, once something is out in the world, it's really hard to eradicate it. Let me go to Sylvia in Berkeley. Hi, Sylvia, you're on.

Hi, Mina. First of all, thank you so much for your program. It's magnificent. And I do think it's very wonderful that there's protection on the federal and the state level. Because, you know, I watched actually when Melania Trump and this group of, I guess, politicians were actually having a meeting about this

federal law. And, you know, there was a congressman there whose son, who had not come out as gay, actually committed suicide because some of these young kids had posted some images of him. And that is very tragic. And for me, it's important because I do have grandkids and young people around and work with a lot of kids.

Yeah, thank you so much. Sylvia, thanks. So your concern is actually the opposite: not about over-policing the kinds of images that maybe the right would consider deviant, but about

the images themselves, the concern we heard earlier about somebody's images being distributed around. Still, Cessna writes: why does no one appreciate that this is the Internet Censorship Act? This is a continuation of our puritanical background in this country. This will only suppress our freedom of expression. It will give the right a tool to dictate our online content and attack their opponents.

So, Jasmine, for people who are interested in using the law as intended, where should they start and what steps should they take so that we don't have the types of tragic outcomes that we've talked about on the show and listeners have shared if they are a victim of online abuse that fits the Take It Down Act provisions?

Yeah. So I do want to really quickly address the comment, because I have seen so much about this being an internet censorship bill. And I understand that

we don't necessarily know how it'll be enforced, but I also want to emphasize that this is really big for survivors. You know, the number one thing for someone who's been victimized in this way, with private images of them being distributed to their coworkers, or for teenage girls in high school who have deepfakes of them spread to thousands of classmates: they want the images to stop. They want the abuse to stop.

And that is what, you know, Take It Down is trying to do. There are, I think, very legitimate concerns about this law, but I think it's also important to talk about the impact for survivors. So how do we use the Take It Down Act? As we said earlier, there's a year before companies

need to implement this takedown provision of the law. In the meantime, there are many companies, including search engines and

social media sites, that have ways you can request images to be taken down. You know, sometimes it's an email. Sometimes it's a form you fill out. Also, I want to say, you know, pornography websites also have that available. The important thing to do, the first thing to do, is to document the images. You want to, you know, create a strong, secure record that includes the date,

time, user, where something was posted. You only wanna do that for yourself or if someone explicitly authorizes you to do that for them, but you basically want a record of the abuse and then you can report it to those sites. If it is an image that you took of yourself, for instance, like you took a selfie and then it was shared without your permission, one avenue is actually using the copyright law

in order to take it down for a copyright violation because it's an image that you created that was then distributed without your permission. And I also wanna say, there are different,

resources dedicated specifically to help people who are victimized, particularly youth. The National Center for Missing and Exploited Children operates the CyberTipline, and anybody can report to that. If you see something that looks like, you know, a child is being abused online, you can report it to the CyberTipline. They also operate something called Take It Down,

which is no relation to the law, but that is tailored specifically to help youth, anyone under 18, whose images of sexual abuse are being shared. And they have tailored support for that kind of content sharing.

So, given there are lots of ways this could get really hairy really quickly, do you think people should pursue finding a lawyer, Jasmine, in these situations? Yeah, you know, the advice from advocates that I've spoken to is that this is really hard. This is really difficult to go through. And I think

The Cyber Civil Rights Initiative, which is a nonprofit, which is, you know, fighting technology facilitated abuse, has on their website a list of lawyers. They also have a list of laws so you can be aware of, you know, what your rights are in the state, what sort of actions you can pursue. They have a list of lawyers who have experience, you know, in different states working.

with these kinds of laws. And also, you know, if you have the ability to, it's useful to reach out to a lawyer, to have someone guide you through a process that's really traumatic. You know, it's hard to navigate legal stuff, and it's also useful to have somebody who has experience in these areas. And again, that's the Cyber Civil Rights Initiative.

Let me remind listeners, you are listening to Forum. I'm Mina Kim. Well, we're getting lots of comments from listeners. Alex on Discord writes: Section 230 needs to be repealed. With it, tech companies avoid liability for hosting illegal content. Repealing it would force companies to actually moderate their content and not just with automated systems. Well, that would be a whole other discussion, Paresh, but do you want to talk about the connection this listener is making to Section 230? Absolutely.

It's been a discussion point for Congress and for tech companies, for internet platforms for several years now. There hasn't been a lot of progress. It doesn't feel like there's going to be much progress. Even if Republicans do move on Section 230 reform, there'll be

lawsuits from the industry. It's going to get tied up for years. We're not anywhere close, in my view, to removing that liability provision.

Douglas on Discord writes: so we're delisting apps for public consumption, but allowing huge companies to build and use them at will? Because I seriously question whether or not we should be giving corporations more rein than individuals, given the history of the last 20 years of the internet. Restricting it from consumer use, but allowing businesses to own and operate them, seems like another massive step in the wrong direction. Paresh, let me go back to you on that. Yeah, I just want to make a distinction here.

There's porn, and we used the term earlier in the show, revenge porn, which has gone out of fashion, because what we're talking about is, as Jasmine mentioned, tech-facilitated abuse. This is image-based sexual abuse. That is a different category of content from porn, which is either consensual or doesn't depict a real person.

And so to use these tools to generate porn where it's consensual or, again, not a real person depicted, that's one thing. These laws try to tackle the narrow scope of the tech-facilitated abuse. And I think it's important to draw that distinction. Yes, there is this sort of religious movement to get rid of all porn and the tech-facilitated abuse, but we need to draw that line between the two sides.

You know, earlier, Senator Aisha Wahab was mentioning that our regulations of technology tend to be behind, especially because lawmakers are not necessarily the most tech savvy. What do you think about the longevity of this particular law? Do you think it can absorb, you know, shifts in technology for a decent amount of time?

It does feel like that. And I think it goes to part of what you discussed earlier, the lack of proactive requirements. You know, one of the proactive ideas that's been discussed is can you identify when something is image-based sexual abuse versus legitimate pornography?

And right now, tech companies would tell you that it's very difficult to tell the difference, because there are signs of coercion that you can sometimes see in content that suggest it was a non-consensually

created or distributed image, but other times there are not. And to build technology to do that is very difficult. And I, you know, had a whole investigation last year of some of the proactive ideas that Google executives refused to adopt because of these concerns. And because, you know, the law doesn't necessarily prescribe that

the tech companies use specific technologies to address this issue, I think for that reason there is some longevity here. Yeah, though, this listener writes: what can we do with AI being able to generate explicit imagery from seemingly normal photos like a selfie, that additional layer? Is that policed by the Take It Down Act, Paresh?

It is not, right now, other than that, you know, as the perpetrator trying to use one of these nudify apps to do that, you'd be violating the law if you were doing that with an intent to harm people.

But again, it's that line with free speech. There may be cases where there's political commentary about the president, where you're putting him in a state of undress for the purposes of political commentary and want to be able to use a tool to do that; should that be allowed? Well, I'm sure you'll be watching where this law targeting non-consensual sexual content goes, how it plays out. Paresh, thank you so much for talking with us.

Of course.

And thanks as well to Jasmine Mitani, whose piece explains how you can use the Take It Down Act. She's a data and technology reporter for The 19th. Thank you, as always, listeners. And thank you, Mark Nieto, for producing this segment. This is Forum. I'm Mina Kim.


Hey Forum listeners, it's Alexis. Did you hear that Forum is launching a video podcast? It is true! Each week we'll drop a video recording of a recent Forum episode on the KQED News YouTube channel. We can't wait to bring you into the studio for our conversations on Bay Area culture, California news, and beyond.

Our first few episodes are out now. Just visit youtube.com slash kqednews to see it all. That's youtube.com slash kqednews.