
The Take It Down Act Is a Free Speech Killer

2025/5/23

Power User with Taylor Lorenz

People

Becca Branum

Taylor Lorenz
Through deep dives into internet culture and politics, Taylor Lorenz offers listeners incisive analysis of the online world.
Topics
Taylor Lorenz: I believe online free speech is under unprecedented attack. The Take It Down Act, in the name of combating non-consensual intimate images, actually allows anyone to demand the removal of any online content they dislike, for any reason at all, and platforms must comply within 48 hours. The mainstream media has turned a blind eye to this and has even cheered it on. I believe defending online free speech is one of the most important issues of our time, because the internet gives everyone a voice, and precisely because of that, we are seeing an unprecedented assault on online speech rights. We need to be wary of this legislative trend of restricting speech in the name of regulation. Becca Branum: I believe the Take It Down Act was drafted in an ambiguous way and could have major implications for online speech and privacy. The law's main flaw is that once a complaint is received, the platform is not required to verify that the content is actually a non-consensual intimate image, or even to consider whether it contains nude imagery at all. This means the law can easily be abused and turned into a political weapon. Although it aims to address non-consensual intimate images, in practice it may lead to the censorship of a great deal of unrelated speech. More worryingly, the Trump administration has already shown a willingness to use the law against its opponents, which complicates the situation further. We therefore need to be alert to the law's potential harms and work to improve its implementation in order to protect users' free expression.

Transcript


Hi, welcome to my new YouTube series, Free Speech Friday. Right now, free speech is under attack. Across the country, state and federal lawmakers are passing sweeping draconian censorship laws, all under the guise of reining in big tech. These laws are not about protecting the public, nor do they even rein in big tech. They're about controlling and restricting speech online. The internet is powerful because of how it democratizes speech. It gives everyone a voice and allows anyone to build a platform, especially to challenge power. As I wrote in my book, Extremely Online, it's because of

this shift that we're seeing a record assault on online speech rights. And the mainstream media is asleep at the wheel. They cheer on this bad legislation and push moral panic narratives and manufactured outrage campaigns that are then used to justify stripping our civil liberties online. I believe that this aggressive assault on free speech is one of the most important issues of our time. So every week on Free Speech Friday, I'll be talking to the people on the front lines of this fight, constitutional lawyers, digital rights activists, tech policy experts, and more.

I'll be breaking down bad legislation, exposing the power players behind these attacks, and giving you ways to fight for the right to speak freely online. I know tech policy can be kind of a snooze, but these laws are so incredibly important, and I really hope that I can convince you guys to care about them. Because if we don't stand up and fight for the right to speak freely online now, we may not have that ability at all in the future.

So without further ado, I want to get into this week's conversation. I spoke to Becca Branum, who is deputy director of the Center for Democracy and Technology's Free Expression Project. Becca has been helping to lead the fight against a very dangerous law that just passed Congress called the Take It Down Act. The stated goal of the Take It Down Act is to combat the sharing of non-consensual intimate images, NCII, what you might know as revenge porn. This includes things like AI-generated nude deepfakes and other explicit imagery shared without a person's consent.

So cracking down on something like this seems great, right? Sadly, that is not what this law does, nor is it what this law is actually about. What the Take It Down Act actually does is make it so that anyone at any time can get any content posted about them online that they don't like taken off the internet within 48 hours, no questions asked.

Which is crazy, right? Becca is going to break down exactly how the law will function and the terrifying ways that the Trump administration is already threatening to weaponize it. OK, so, Becca, in your own words, can you tell me exactly how this law will work? Sure. So the Take It Down Act is a bill that was led by Senators Ted Cruz and Amy Klobuchar that intends to address something called nonconsensual intimate imagery or NCII.

So imagine that you've taken an intimate picture of yourself to share with a partner, and then the partner shares it without your permission elsewhere. That's real NCII, and that's been a problem for a long time. But given the new and ubiquitous access that people have to generative artificial intelligence, sadly, what we're seeing is the proliferation of AI-generated NCII. People can now, with that really powerful technology, take clothed images of people and create nude images of them, to which they never consented, either to create or to distribute.

And so the Take It Down Act intends to address this problem in two ways. First, it creates a federal criminal law that prohibits the nonconsensual distribution of intimate imagery, whether it's real or generated with AI. And then it also creates a novel notice and take down system

for non-consensual intimate imagery that allows people to submit a complaint to a platform, and then within 48 hours that platform will be required to take down whatever has been sent through the notice and takedown system. In theory, if it operates exactly as intended, it would be a great tool for survivors and victims to have this imagery taken down.

Unfortunately, it was drafted in a pretty ambiguous and imprecise way. And for that reason, we think it's going to have some pretty significant implications for speech and privacy online. OK, so in an effort to stop what's colloquially known as revenge porn, explicit images kind of like what we saw go viral with Taylor Swift last year on X, where there were these AI-generated deepfake nude images of her doing pornographic acts, this law would make it so that anybody can request to have content taken down, and that content has to be taken down within 48 hours.

And do the platforms do anything to make sure and verify that this is actually NCII content? No. That's one of the major flaws of the law: really, it's whatever is submitted through the notice and takedown system that needs to come down. Once a complaint is submitted, there's nothing that requires the platform to consider and examine what the content is. Is it something that should come down or not? They don't even have to consider whether it actually includes nude imagery at all. So if I find an unflattering picture of myself on the internet and want it taken down,

so long as I fib a little and say that, in good faith, I think it's NCII and submit it to a platform, this law will require that that picture, whatever it happens to be, come down. Well, I guess you'd probably be doing it in kind of bad faith, too. Like, I feel like this would be very easy for bad actors to weaponize, right? Because they constantly want content taken down. And if they can just flag it in a system where it's

mandated to be removed within 48 hours, I feel like that would create a lot of room for abuse. Absolutely. And ironically enough, the president himself previewed this for us. He was giving an address to Congress where he was advocating for passage of the Take It Down Act and noted that no one is treated as poorly as him on the internet. And so he'd like to use

this mechanism for himself. Now, he might have been talking about NCII, but there's an awful lot of criticism about him on the internet. And I do wonder if he and his supporters might have something else in mind once this bill and its obligations come online. This law sounds like a free speech disaster. Can you talk through some of the implications and risks involved

to artists, activists, and journalists if this law passes? There are really risks across the speech spectrum, right? The most obvious risks are for the sharing of consensual intimate imagery. Nude imagery and pornography are protected by the First Amendment in the United States. And there are people who support this bill who would prefer that nude imagery altogether be censored from the internet. And so that's the most immediate effect.

But also, there's no requirement that platforms actually verify what is submitted and confirm that it should be taken down. And candidly, I don't know that they even have the authority to resist unlawful or inappropriate requests.

Really, anything that's submitted through these systems could be subject to takedown. All it really requires is somebody who's willing to make a complaint in bad faith. And if you're going after people who are criticizing you, I imagine that that's pretty easy and there might be quite a few complaints like that. And we have examples of that happening with other systems, like the Digital Millennium Copyright Act, where people often use that takedown mechanism to censor criticism they don't like, having nothing to do with copyright at all.

Yeah, I think anybody that operates a YouTube account is familiar with the concept of copy striking. And this is basically, as you mentioned, when they file this copyright claim to try to get content taken down. In that instance, though, it does have to go through some sort of review process, right? There is a review process as well as other guardrails that are implemented.

And also, importantly, there's no time limit really within the DMCA. It has to be taken down within a reasonable amount of time, but it's not 48 hours. And so platforms do, although not as much as I'd like, they do do some due diligence to make sure that it's information that should be taken down. And certainly wrongful takedowns happen under the DMCA.

But the Take It Down Act really doubles down on some of the worst aspects of the DMCA and eliminates even the minimal guardrails that exist there. And so I think we can expect the same effects that we see under the DMCA of wrongful takedowns, but amplified. I read that this law could also be used to dismantle things like Wikipedia or roll back end-to-end encryption. What are some of the broader implications that this might have for

the way the internet is structured currently? Sure. So I'll start with encryption and then talk about the broader internet ecosystem. The bill is intended to apply to platforms that primarily consist of user-generated content. Beyond that, it doesn't get into much detail. It excludes email, which is great, right? That's private messaging person to person. But it doesn't really exclude much beyond that, except for some technical internet infrastructure.

And so it's really unclear the extent to which it could apply to things like private storage, private messaging, and things that are encrypted. And the end-to-end encryption piece is particularly important for a few reasons. End-to-end encryption is used with the expectation that only the user who's sending information and the person receiving it have access to it. And that's how platforms market themselves. With things like Signal and others, the whole point is that it remains private.

So if Signal, for example, which would be covered by this act, gets a request for imagery to be taken down, they don't really have much that they can do to respond. Really, they have two impractical options. One is just to ignore the request, and if you do enough of that, the FTC under this bill is very likely going to come after you.

The other option would be to break encryption, which we know would be extraordinarily damaging to products like Signal and others that are really based on end-to-end encryption, but also has significant security implications for the broader internet ecosystem. Given that Wikipedia, for example, and lots of things on the internet are primarily user-generated, really this bill applies to everyone. Although it's been framed as a way to hold big tech accountable,

It applies to anything and anyone that hosts user-generated content. And so for a platform like Wikipedia, which would be swept in here, they will still have to set up a system wherein they would have to take requests for takedown of nonconsensual intimate imagery. And that imagery would have to come down within 48 hours.


And again, there's no proof that it's that imagery. It could be anything, right? Like, I mean, I hate the photo of me on Wikipedia. I would love to get it taken down. Like, theoretically, you could just file this request and the platforms would be required to honor it within 48 hours. That's right. I mean, some platforms might read the law differently and approach it differently. And I hope that platforms do their due diligence to determine whether or not things that are being complained about really do need to come down. But given a choice between FTC and

enforcement and just taking something down, it's often a lot easier and cheaper to just take it down. Take it down. Right. Why is the FTC in charge of all this? So that's just the way that the bill was structured. Basically, it defines a failure to operate this notice and takedown system as an unfair or deceptive trade practice under the FTC's existing authority.

And that's a really unique structure in the US. And it's actually a pretty problematic one given the current political reality we live in. The FTC is not what it used to be, right? The president has purported to fire the two Democratic commissioners, and so it's only run by Republicans right now. And this system is being set up within an FTC that the president himself is trying to control, right?

He's actively trying to undermine the independence of this agency, which means I think it's pretty fair to wonder to what extent platforms like X or Truth Social or others that have close ties to the administration are actually going to be held accountable under this law. That means it could be weaponized for all sorts of purposes, but it could also be weaponized through under-enforcement, meaning that actual victims won't have the recourse they need and are supposed to have under the law as applied to actual NCII.

This law is so bad and so dangerous to speech. Why do you think it passed? It's a good question. I think there's a charitable answer and then a less charitable answer. At CDT, we take NCII very seriously and it is a really profound harm. And I want to emphasize it's a speech harm itself.

Right. NCII is often weaponized against women and LGBTQ people who are prominent in public life and who dare to speak out. It's often weaponized against activists and others with the intention of humiliating and silencing them. And so empowering people to be able to take down these images and not feel dissuaded from participating in public discourse is really important and why we do want to figure out ways and good constitutional and effective ways to actually get this imagery taken down.

And so I think Congress was eager to be responsive to what is a really important issue. I think it's important, though, to put this in the broader political context: Congress has been trying really hard to regulate big tech for a while and hasn't had much luck. The difference with this bill is, candidly, most of the tech industry threw its weight behind it. And that's a really interesting phenomenon. I don't know why; I can't speak for them.

But what I do know is that this bill, as compared to some others that have been proposed, is quite easy to comply with, right? It doesn't cost a lot of money for platforms to just censor people's speech. They can just take down content and they will be in the good graces of the FTC, where other types of regulations that have been proposed actually will require them to fundamentally restructure their products in the ways that they make money.

I think it's so important to also distinguish what is actually cracking down on big tech and what's not. Because I feel like you hear this rhetoric so much, especially from the lawmakers involved with this bill, where they're like, we're cracking down on big tech. And what they're ultimately doing, I mean, you see this with the Kids Online Safety Act and other bills too, what they're really doing is censoring speech. They're censoring users' speech. And as you said, providing this easy way for the tech companies to just sort of mass censor people without really having to invest in resources or do actual thoughtful moderation. And I think that's a really important distinction.

And it doesn't really harm these companies that much. I mean, it seems like time and time again, they're targeting speech and users' free expression rather than tackling things like data privacy or really fundamentally going after these companies' businesses. Absolutely. I fully agree. And it's something that we see, unfortunately, throughout the tech regulation conversation, including related to things like Section 230.

So early this year, Senator Durbin and others said that they would be introducing a bill to just repeal Section 230 outright. And their theory of the case, as I understand it, is that they want to introduce this as a way to get big tech to the bargaining table with legislators.

But that leaves a really important party out of the conversation, which is actual users who are the ones who do really benefit from Section 230. These enormous platforms operate around the world without the Section 230 protections. And that doesn't mean that the internet is perfect in those places. It's actually a lot less free and has a lot less information accessible to people. And so I get really distressed about conversations about regulating big tech that leave actual people and people's free expression rights out of the conversation.

Well, because they're not meaningfully regulating it. They try to pass these bills and they want to seem like they're cracking down. And like you said, you'll get these headlines in the media saying, you know, Congress passes this bill to protect users from NCII. And the framing, I think, to the public is like, wow, Congress is doing great work. And

I guess I'm wondering, like, what role do you think the media has played in this, too? Because it seems like there hasn't been a ton of accountability on this. I mean, even the Washington Post coverage, where I used to work, and I love everyone there on the tech policy team, but the headlines sort of take this framing as a given. They're not centering the speech concerns or centering the harm. Yeah, it's certainly a frustration for free speech advocates like me and partners that have worked on this bill and who have been

trying to offer solutions to Congress to do two things at once: one, create an effective tool for victims to get this imagery taken down, and also protect people's privacy and free expression rights. I can't really explain why that framing wasn't picked up by the media, but it's disappointing because it does a disservice to people in at least two ways. One, they're not really getting the full picture of what this law does.

But also, what really saddens me about this bill is that it's making a false promise to victims of NCII by saying this is something that's going to work for you, that the FTC is going to work on your behalf to make sure that platforms actually do this. And I'm not confident that that will work. It's, in our view, a bill that's plainly unconstitutional, and it creates a false promise to victims that they'll have a tool at their disposal when it's going to ultimately harm a lot of people in the end.

Speaking of its constitutionality, I assume that as soon as it passes and Trump signs it, it will be challenged. What is the likelihood, do you think, that this bill will ultimately go into effect as law? So there are the criminal provisions of the bill, and then there are the notice and takedown provisions. I can't speak to who within the ecosystem would challenge the bill immediately. Again, a lot of tech companies did support it.

That being said, once the FTC gets this up and running in about a year, I would expect there eventually to be an enforcement action against a platform that either wasn't aware that this was a requirement or just wants to challenge the bill itself. And unfortunately, as someone who would like there to be a constitutional and privacy protective way for victims to have help,

I think the notice and takedown provisions are really going to struggle because it is so poorly defined and is going to result in censorship of so much speech that's not actually NCII. I think the bill will struggle to actually be fully implemented into law or to stay as enacted law if it does go into effect.

It just seems like such a messy situation. I mean, we're creating all of these laws. I've written about a bunch of these state laws too, like the one passed in Utah, the Social Media Regulation Act, which has now been, I think, rolled back. But now they've passed this child influencer law that has really egregious speech concerns. But as you mentioned, like, I mean, when I talked to a bunch of groups that

advocate for free speech, they said there are so many assaults, and they're dealing with the potential overturn of Section 230, that laws are passing at a rate where there just aren't enough resources to challenge them. Why do you think that's all happening so much at once? It's a great question. I think part of it would be the long-term struggle that people have had to regulate

big tech platforms and the like. And I think people are getting increasingly frustrated with experiences online. Unfortunately, what legislators are proposing as solutions aren't going to actually meet the moment for what people's real concerns are. I think generally speaking, though, we are in a crisis of free expression and it's not just online.

The government is taking unprecedented actions to silence its critics, silence people on the basis of their protected expression, to defund things that they disagree with, to go after their perceived enemies. And so we are unfortunately online and off in a real crisis of free expression, which makes me wonder and it makes me a little disappointed that

people in Congress, who I know share our free expression values, still went along with this bill. Even in this moment of crisis, it's pretty disappointing. Why do you think they went along with it? Is it just this effort to seem like they're cracking down on big tech? It's hard to say. I think part of it is just how worthy a cause this is. Unfortunately, that doesn't relieve them of their obligation of getting it right, right? Congress can't pat themselves on the back, or shouldn't rather, for passing unconstitutional bills that

are well-intentioned but aren't going to function as they should. And so it's a disappointing state of affairs. Even if I don't begrudge anyone for trying to address NCII, I really wish they would have taken more care in doing so. I feel like this is a constant story of Congress, though, where they say, like, there's this really important issue that's affecting all of these people online. And rather than target the root cause, they're just going to pass this slapdash, kind of terrifying law, especially in the

context of the current administration. I was listening to Nilay Patel at The Verge, who's really great, talking about this. And he was saying, like, look, this kind of law, if it was passed under previous administrations, might work its way through; it might ultimately not be enforced in certain ways. But you mentioned how Trump himself has already spoken about wanting to use it to take content about himself down. I guess, can you talk a little bit about the ways that the Trump administration specifically might try to weaponize this law?

The Trump administration has demonstrated a willingness to go after law firms and others that they see as their enemies. And so when this comes online, to the extent that companies are not in line with the Trump administration, or are perceived as enemies, an FTC that has lost its independence could certainly pursue them for investigation. But there's also the weaponization of under-enforcement, right? There are a lot of tech

platforms that are really close with the administration. And if they can rest easy at night, knowing that the FTC won't go after them because they've cozied up to the administration, then people are left without the actual takedown mechanism that they're supposed to have for actual NCII. Right. So it doesn't do anything for anyone, really, except, I guess, the people getting the headlines, the lawmakers that can sort of delude people into thinking that they're safer as their speech is, you know, dismantled. It's genuinely

disappointing. I'll say, I think from here, to the extent that the bill isn't taken down, civil society should really commit itself to making this bill better in its implementation and working with companies to urge them to implement it in an appropriate

way. The bill in its current state is really disappointing, but we can't afford to sort of cede to this bill being weaponized in all the ways that it could be. And I'm hoping that in working together with civil society and putting pressure on companies, they might actually implement this in a competent way. Oh, my gosh, that's very hopeful. I don't share that optimism at all, only because I don't think they have a huge incentive to. I think they'll do whatever is cheapest.

Well, hope springs eternal. I couldn't do this work if I wasn't an eternal optimist. Yeah. I guess, like, what do you think average people can do about these types of laws? I feel like, I mean, as I mentioned, most people just read the headlines, and the headline that they see is Amy Klobuchar, you know, passes this landmark

bill getting NCII taken down and they're not aware. So aside from educating themselves kind of on what a lot of this legislation actually does, how can they make a change in the system to prevent laws like this from being passed? It's hard to sort of get past the headlines, but paying attention to those in civil society who are trying to speak out about these things is really important.

And I'll say, as we move forward on the Take It Down Act, it's really important for people to document their experiences with the bill, right? To the extent that they themselves are submitting takedown requests for non-consensual intimate imagery and it's not getting taken down, it's really important to know, because it means that platforms are flouting their responsibilities. But also, if people find themselves getting censored under this provision, that's really important to document as well. We can't fight back against something like this

without that sort of evidence of its negative implications. And so that's a really helpful way that people can fight back against this as it gets implemented.

Do you think that there's going to be any sort of chilling effect that this law could have in terms of people censoring themselves almost before putting something up? That's always possible with speech regulations online. And particularly as people find themselves getting censored for things that shouldn't be censored, I think it's going to be a lot harder to post things that are adjacent to nudity or sexuality at all online.

Particularly as we see this bill get implemented, we know that there's going to be censorship that comes along with it and along with it also self-censorship. And we can look to things like SESTA-FOSTA, which was the last intermediary liability bill that Congress passed related to sex trafficking and the like.

As those bills were implemented, an enormous amount of speech entirely unrelated to sex trafficking got censored. And it really made spaces dedicated to sexual content or sexual identity that much harder to operate. And so I think we will inevitably see content both censored by platforms, but also people succumbing, understandably in some cases, to self-censorship as well.

Yeah, it seems like there's such a massive attack on NSFW content across the internet. We're seeing these age verification bills. We're seeing, yeah, just the rollback of free expression on all platforms in terms of that stuff. And also we know that SESTA-FOSTA was a disaster in terms of sex worker safety, right? Like, it just sort of pushed a lot of this stuff that was previously well-run and transparent publicly online to these underground systems. And actually, I think it ended up exacerbating the human trafficking problem that they

purported to try to address. Right. It's a real tragedy. And it's sad to see Congress, with that knowledge, continuing to pursue these intermediary liability bills under the guise of wanting to hold companies accountable. But really, it's user speech that ends up suffering in the end. Well, Becca, thank you so much for joining me today. Where can people continue to follow your work?

We are at cdt.org. Awesome. Thank you so much. All right. That's it for this week's Free Speech Friday. If you like this video, don't forget to subscribe to my tech and online culture newsletter, usermag.co. That's usermag.co. And listen to my tech and online culture podcast, Power User, wherever you get your podcasts. My bestselling book, Extremely Online, is also now out in paperback with a brand new cover that I'm obsessed with. You can pick it up wherever books are sold. See you next time.