Thank you.
It uses an AI-powered identity solution that lets you target accurately and optimize when and how you message. So for the marketers and decision makers out there, try Attentive for messaging that performs and results that transform. Visit attentive.com/decoder to learn more.
Support comes from ServiceNow. We're for people doing the creative work they actually want to do. That's why this ad was written and read by a real person, and not AI. You know what people don't want to do? Boring, busy work. Now with AI agents built into the ServiceNow platform, you can automate millions of repetitive tasks in every corner of your business, IT, HR, and more, so your people can focus on the work that they want to do. That's putting AI agents to work for people.
It's your turn. Visit servicenow.com. Support for Decoder comes from Arm. Have you ever wondered what's powering your smartphone and other devices we interact with daily? Or what lies at the heart of life-saving drug discoveries and robotic surgeries? The answer is Arm. Arm technology is moving the world forward, enabling AI to create a more meaningful, more connected life for everyone, everywhere.
Arm believes the future isn't about technology. It's about people and the possibilities technology can offer us all. The future is built on Arm. You can discover more at arm.com/discover. Hello and welcome to Decoder. I'm Nilay Patel, editor-in-chief of The Verge, and Decoder is my show about big ideas and other problems.
Today I'm talking to Verge Policy Editor Adi Robertson about a bill called the Take It Down Act, which is one of a long line of bills that would make it illegal to distribute non-consensual intimate imagery, or NCII. That's a broad term that encompasses what people used to call revenge porn, but which now includes things like deepfake nudes.
The bill was sponsored by Democrat Amy Klobuchar and Republican Ted Cruz, and it just passed the Senate. It would create criminal penalties for people who share NCII, including AI-generated imagery, and also force platforms to take that imagery down within 48 hours of a report or face financial penalties. NCII is a real and devastating problem on the internet. It ruins a lot of people's lives, and AI is just making it worse. There are a lot of good reasons you'd want to pass a bill like this.
But Adi just wrote a long piece arguing against it, saying that giving the Trump administration new powers over speech in this way would be a mistake. Specifically, she wrote that passing the Take It Down Act would be handing Trump a weapon with which to attack speech and speech platforms he doesn't like.
At a high level, Adi's argument is that Trump is much more likely to wield a law like this against his enemies, which means pretty much anyone he doesn't personally like or agree with, and much more likely to shield from its consequences the people and companies he considers friends. And we know who his friends are. It's Elon Musk, who now works as part of the Trump administration, while at the same time running X, the social network, which is full of NCII.
Now, Adi and I have been covering online speech and talking about how it works and how it's regulated for about as long as The Verge has existed. And she and I have gone back and forth about where the lines should be drawn and who should draw them about as many times as two people can over the years. But our conversation and our coverage has always presupposed a stable, rational system of policymaking that's based on the equal application of law.
Here in 2025, Trump has made it clear that he can and will selectively enforce the law. And that changes everything. Once you break the equal application of law, you break a lot of things. And there is just no evidence that Trump is interested in the equal application of law. You'll hear us really wrestle with that here. The problem doesn't go away just because the solutions are getting worse, or because the people entrusted with enforcing the law are getting more chaotic.
So in this episode, Adi and I really get into the details of the Take It Down Act, how it might be weaponized, and why ultimately we can't trust anything the Trump administration says about protecting the victims of abuse. Okay, the Take It Down Act and its collision course with our constitutional crisis. Here we go. Adi Robertson, welcome to Decoder. Hi. Hi.
Let's talk about the Take It Down Act. This is a bill that would solve the problem of AI-generated deepfakes, of what people had been calling revenge porn. Now we call it non-consensual intimate imagery, which I think is a much better name. What is the Take It Down Act?
The Take It Down Act is one of several bills that have been, as you said, meant to address AI replicas and non-consensual intimate imagery. It sort of has two parts. The first part is that it criminalizes NCII, including digital forgeries. And this is a part that is important and has gotten somewhat less discussion because it is not the most controversial part. The controversial part is the taking it down part, which is that if it's passed, then the
web platforms that focus on user-generated content are going to have to create a system by which people can report intimate visual depictions, in the bill's language. And those depictions have to be taken down within 48 hours at the risk of the FTC stepping in and imposing penalties.
So this is a bill that's kind of landed with the Federal Trade Commission? Yes. Well, the criminal provisions of it just get enforced through sort of criminal enforcement channels, but the FTC is the one that's responsible for enforcing the part against tech platforms. Is that new? That feels like a new power for the Federal Trade Commission. You and I have covered a lot of tech platform laws and policy ideas. The notion that it's the FTC that's going to show up and
fine Meta if they don't comply with some content moderation rule. That seems new. I can't say whether it is completely new, but it does seem like kind of a novel interpretation of what is basically unfair competition law.
And so they're going to define that, I guess, to include NCII, which as, yeah, as you mentioned, it's a law that can stretch pretty far, but it is a little unusual, I think. I want to come back to that because the question of who gets to enforce this law and who might gain leverage over the platforms is very important to your overall thesis, which is this law in this administration is just a cudgel. It might solve the problem or it might help us
in a policymaking framework, think about ways to solve the problem. But the reality of it is that you're going to give a pretty gangster-like Trump administration something to beat platforms over the head with so they comply with other speech ideas. So I want to come back to the FTC of it because that seems important to me, but I just want to stay in the sort of practical
part of the problem. Who gets to decide if any of this imagery is inappropriate or intimate? Is there a definition we get to use? Is it just, here's a picture of Taylor Swift, she obviously didn't consent to it, she gets to say take it down? The interesting thing that the Electronic Frontier Foundation and some other people have pointed out is that there are, it seems like, different standards for what happens if something is counted as criminal versus what has to be taken down through these systems.
Everything is based on the idea that there's a definition in the law of an intimate visual depiction, which is sort of what you might expect. It's someone who is engaged in sexual activity. There's nudity. There's sort of a constellation of things there. The part of it that criminalizes it talks about these other sets of conditions that have to be met. So, for instance, there's a carve-out if something is of public interest. Right.
And you have to meet these conditions that go beyond the idea that it's just a sexual image of someone. The take it down part of it, at least in the EFF and some other folks' interpretation, doesn't actually have those limits, that it really is just, is something an intimate or sexualized depiction of someone? Okay, you have to take it down. Which, funnily enough, there is a Trump-related example of recently, which is that...
Someone in the federal government was protesting DOGE by creating an AI-generated video of Trump licking Elon Musk's feet.
And there was debate on Bluesky in particular about whether this could constitute NCII that should be taken down. Bluesky took it down, but then reversed course, because it's in this place where, yes, it's a sexualized image. It's also a sexualized image that is not only of a public figure in a way that is specifically related to some news, but that is framed in the context of it's not just that this image exists, it's that its existence
is specifically part of a news story about government employees doing something that is incredibly noteworthy. So you could probably come down on either side of whether that is inappropriate or not, but it's clearly a situation that's unique and that goes just far beyond the idea that this is a sexualized image of someone. And it doesn't seem like the Take It Down Act really accounts for that. The Bluesky example is particularly interesting because the video that Bluesky took down
was not the video itself. It was a video of the monitors at the Department of Housing and Urban Development, where employees had hacked all the displays and started playing this video. So the video itself was newsworthy. And so that is just a layer of complexity and complication and nuance that I think
is not in the law as we see it. It's hard for the platforms to make determinations. And then you have our current set of platforms, which do not seem well-suited to making nuanced moderation decisions in the current administration, which is just sort of constitutionally allergic to nuance. How do you think that all plays together? Is it just...
We know it when we see it, which is like the classic line people use about sexualized imagery. Is it we just get to decide? Is it famous people are going to get protected and regular people are going to get washed away in the fray? We've just been spending decades trying to work this out as a legal framework and a moderation framework even before AI. There were issues where, say, Facebook decides, all right, there's no nudity on Facebook, but all right, there's this...
the very famous napalm girl photograph from the Vietnam War. So does that get taken down? There's just this incredibly complicated dance and all these incredibly complicated questions about, yes, should public figures get more protection, less protection?
It's something I don't think I have a clear answer for, and I don't think anyone does. It kind of boils down to you know it when you see it. There are many situations where it is just clearly unambiguous. There are a bunch of websites and there are a bunch of services whose deal is allowing people to make
non-consensual intimate images or post them specifically because they are sexualized images of women that we hate. And they are women that we personally know and we want to humiliate. There's not any nuance about whether there's value to that. It's bad. And so there definitely are situations where I think you could enforce a law that says this is bad and we don't have to worry about that. I think that the problem with the Take It Down Act is that the takedown provisions include really none of even that focus, that it really is this very broad net that, even if we weren't under the Trump administration, would be causing all of these problems and questions about how you're just building this system that's very open to abuse. And we especially have an example of that working already, which is the DMCA. So copyright, you
probably if you're listening to this, have heard of copy striking. It's very obvious that when you create something that is, while not legally mandated, really required to get safe harbor protection under copyright law, then you're making this big, very blunt instrument and you have to weigh the potential good that it can do against the harm that clearly is just
undeniably happening with something like the DMCA. You know, what's interesting about the copyright example is that it is such a powerful weapon on the platforms that in the creator economy, there exists an entire parallel set of norms that that culture has developed about how nuclear it is to issue a copy strike. You see it play out in all these ways that I don't think the framers of the DMCA would ever have contemplated. I don't think you can do that
with non-consensual intimate imagery. I don't think you get to have a big normative argument with a person who feels wronged because there's a sexualized AI depiction of them. This seems even worse in that way. Yeah, I think there are a couple of unintended consequences, which is that part of the reason why it's such a big deal in copyright is that the law just gets used for things that it was never meant to be used for. It's not just that someone says, this is copyright infringement. There is
an entire extortion industry that is just based around the idea that you'll be fraudulently accused of copyright infringement. And if you don't pay up, then they're going to use this blunt instrument against you. So it, first of all, erodes the idea that the law itself is worthwhile and is addressing that thing. And I think that while this should not stop people from trying to stop NCII, it also creates the scenario where if this law and this blunt instrument gets
used in a way that is not meant to actually address the problem, it sort of devalues the problem. You suddenly get to this point where, I think, if it's like copyright, people stop taking the idea of NCII accusations seriously, because you look at it as, oh, well, it's just clearly this person trying to cause drama in the community or take something down
for reasons that have nothing to do with NCII. And so the actual conversation about people who are being hurt here can get lost if you create this system that doesn't really target it well. I can guarantee you there's someone who's listening to this right now who is saying, this is so hard, why even try?
And I get that. There's a nihilism, I think, to the current moment in policymaking. There's a nihilism in the reaction to the Trump administration. There's a kind of nihilism embedded in the Trump approach to policy that says this is too hard. Why even try? People will just sort it out, get tough. But it's not actually the case that it's too hard, right? Across the states, we have seen different approaches to this kind of material, to the responsibilities of platforms, right?
to whether or not it should be left up or taken down. What does this look like across the states right now? At this point, I think 48 states have some kind of NCII law. Mostly the laws tend to focus on the people who are creating it. I don't think there are that many state
laws that go after the larger tech platforms, which I think is just the point at which it goes from here's a person committing a crime to here is this absolutely massive system that you have to navigate in a way that creates these huge risks. And recently, like you said, we've sort of been moving toward deepfakes. I think around 14 states currently have
mostly just laws that add digital replicas to this kind of existing NCII framework.
A lot of the problem with deepfakes so far, though, is that there are all these other issues that get wrapped up into it. So there's NCII, but then there are also attempts to make laws that will fight, say, AI-generated imagery in election misinformation, which is obviously an issue, but it is a somewhat different issue that raises a whole bunch of different constitutional questions and harm questions.
There's issues that are basically the equivalent of copyright infringement. There's the Elvis Act where the goal isn't really NCII. It's we have to stop artists from getting their livelihoods appropriated, which again, serious problem, completely different like threat matrix. So I think that the whole AI discussion is still really confused. We spent last year talking about the Kids Online Safety Act that had a lot of ideas in it. It went nowhere. It appears to be stalled out completely now.
But Melania Trump is basically advocating for, hey, we should do a Take It Down Act. Like I'm famous. There are nudes of me on the internet. I don't want there to be AI-generated nudes of me. Let's pass this bill. And then you have bipartisan support for here's the most obvious problem we can see, which is AI-generated NCII. Here's a bill. Here's just a solution. Let's have it. And that feels like it's very narrow, but also just ill-considered.
There have been several bills that try to address this, and some of them have been a lot more limited and a lot less controversial as a result. So the Defiance Act, which passed, I believe, out of the Senate last year but didn't end up ultimately passing, is something that adds AI-generated imagery essentially to existing civil penalties for NCII. So back in 2022, the Violence Against Women Act was amended to include civil action, which again means you can sue someone for
NCII, and the Defiance Act kind of bolts AI-generated imagery into that, as many places have done. It doesn't include the kind of take-it-down provisions that have proven really controversial. There is also the Shield Act, which has been reintroduced, which creates criminal penalties. I think that there are a bunch of efforts to individually criminalize or create civil penalties against the creators of this thing.
And I think that there are then these huge problems when you try to expand that to we have to make anyone on the Internet who is unknowingly hosting it remove it.
And that's the shift to the platform, right? That's saying, okay, Facebook and YouTube and TikTok are now going to be responsible for what's on their platforms. One more distinction I want to make about the various state approaches to this in the pre-AI era is that they were often rooted in copyright law, right? Like there would be some non-consensual intimate imagery or people had taken photos and then one partner would have them and distribute them vengefully, right?
and there's a copyright interest, right? You'd made the photo together and that provided the basis for some of this imagery to come down. I'm not sure where that comes from with the AI generated stuff. So are we just in a totally new realm of where the authority to take things down comes from? Copyright even for non-simulated NCII was a nightmare.
So the problem with copyright is that you have to have created the image. And so it applied to selfies. If you took a picture of yourself and you sent it to someone else and it spread, okay, you own that photo. The problem is a bunch of NCII isn't that. A bunch of it, even if it is something that was consensually taken, it wasn't taken by you. So you don't own the photograph.
It's something that a partner took. And so copyright, either it means it doesn't really apply to those things, or it means you're creating this really weird copyright exception that causes all of these other problems. Like, say there have been, this is not related to NCII, but lawsuits around whether a paparazzi photo can be then claimed by the person who was in the photo, which just causes all these other problems. Yeah.
Yeah. The reason I asked that question is the idea that the government can look at a picture and declare that it's illegal or should be taken down is very complicated. It requires some framework. It requires some rigor. It requires some due process that people can understand and argue against. And then...
Making that bigger so that the responsibility also lies with the platforms like YouTube or TikTok or Instagram seems even more complicated. And I think that's where you get to the Take It Down Act because that's the big step in the Take It Down Act, right? Saying, okay, the Federal Trade Commission is going to be able to fine Instagram if this imagery appears on Instagram and Instagram doesn't take it down immediately. And that seems like a lot of leverage for our government to get over these platforms.
The 48 hours, the take-it-down-immediately requirement, is also a problem there, because if, say, you sue someone and you go through an entire case about whether something is NCII, at the end of that, a court has pretty clearly considered whether it counts. And the 48 hours issue is just...
We need to take a quick break. We'll be right back.
Support for the show comes from AlixPartners. Disruption is the new economic driver. The days of predictable business cycles are over. For over 40 years, AlixPartners has helped companies develop winning business strategies amidst uncertainty. One of today's greatest challenges: the rise of AI. As AI reshapes the tech landscape, AlixPartners is committed to helping your company thrive.
In their sixth annual AlixPartners Disruption Index, a global survey of 3,200 senior executives, 65% of executives believe AI and machine learning provide positive opportunities for their companies. And 62% of CEOs expect significant business model changes in the next year. In the face of disruption, businesses trust AlixPartners to get straight to the point and deliver results when it really matters.
Read more on the latest trends and C-suite insights at disruption.alixpartners.com. That's disruption dot A-L-I-X partners dot com. Support for this show comes from Liquid IV. It's the middle of winter. The air is dry. Your radiator is blasting. Your humidifier ran out. And you wake up parched.
Sure, you can keep a glass of water by your bed, but sometimes you're so bone dry, you feel like you're about to crumble into dust like a cartoon skeleton. When you need extraordinary hydration quickly, there's Liquid IV. They say that just one stick and 16 ounces of water can hydrate better than water alone. Liquid IV is powered by something they call LIV Hydroscience, an optimized ratio of electrolytes, essential vitamins, and clinically tested nutrients that turn ordinary water into extraordinary hydration.
Plus, they're easy to take on the go, so you can feel hydrated after a long flight, before a workout, or when you just feel dried out. You can enjoy one of the delicious flavors like white peach or acai berry and feel hydrated and healthy quickly. Treat yourself to extraordinary hydration from Liquid IV. Get 20% off your first order of Liquid IV when you go to liquidiv.com and use code DECODER at checkout. That's 20% off your first order.
with code DECODER at liquidiv.com.
Sometimes a single performance can define an artist's legacy. Think about Hendrix's fiery Woodstock National Anthem or Beyoncé's Homecoming at Coachella. Coming up on Switched on Pop, we're exploring artists who've had recent transformative live shows. First is Missy Elliott, who recently put on her first world tour where she taught everybody to get their freak on. And then there's her collaborator Timbaland, who recently evolved from beatmaker to orchestra conductor at the Songwriters Hall of Fame.
And then Lady Gaga, whose Chromatica Ball featured a theatrical museum of brutality, revealing the darker side of Gaga's mayhem. Listen to these live moments on Switched on Pop wherever you get podcasts. Brought to you by Defender.
We're back with Verge Policy Editor Adi Robertson. Before the break, we were discussing the Take It Down Act and what it's trying to accomplish. At the federal level, there has simply never been a good solution for regulating non-consensual intimate imagery that isn't either too broad, which creates potential civil liberties violations, or too narrow in that it covers too little of the problem while the problem is still evolving. That's what we're seeing today with AI making NCII much more complicated.
The Take It Down Act seems to be firmly in the too broad category, which raises all kinds of problems. But we're not evaluating all this in a vacuum. The states have had a patchwork of laws trying to cover the abuses of NCII for years now, to mixed results. And as you've heard Adi and I talk about, copyright law has been one of the only effective ways the government has been able to curb some of this within the confines of the First Amendment. But that is a deeply imperfect solution that has resulted in widespread misuse.
So I wanted to know, where does this all leave us? And what about the current Trump administration? Was Adi concerned that this new bill might be weaponized in ways that severely undermine its goals? So in a normal environment, maybe this law passes. Maybe there's a bunch of chaos. There's a bunch of lawsuits. A bunch of platforms might issue some policy documents.
And we would slowly and somewhat chaotically stumble towards a revised policy, right? Maybe the law gets amended. Maybe there's an enforcement regime that builds up around the law. Something. Frankly, the most likely outcome is that someone takes this law to court and a lot of this is declared unconstitutional. Sure. In a functioning system. And then maybe part of the law stands and maybe hopefully it's a good part that isn't open to abuse. But good chance it would just get overturned.
And even in that process, I think Congress would look at that and say, OK, this is a problem. We're going to have some solutions for the back end of this win or lose. Right. Like you can see how the normal policymaking legal judicial process might otherwise play out. We have a lot of history with that. Your piece is titled The Take It Down Act isn't a law, it's a weapon. And your thesis is that we do not live in a normal world anymore.
And the Trump administration in particular is so sclerotic and so addicted to selective enforcement that what they're really going to do is pass this law and then use it as a cudgel to beat platforms into submission. Explain what you mean.
All right. So the normal process we've been talking about this whole time just assumes there's a functioning government. There's a hard problem. Everybody in the government fights about this problem. Civil society does. People play their part. But everyone's kind of acting in good faith. Everyone does actually care about stopping NCII. They do recognize that there are problems with overbroad restrictions on speech and
everyone's trying to work toward a solution because they believe that laws are things that should be applied evenly and that laws should be applied in ways that fundamentally work with the Constitution. The Trump administration just doesn't believe in the rule of law. It doesn't think that laws are things that you should apply to everyone in the way that they are meant to be applied by Congress. What it believes is that laws are things that you apply to the people that you hate in any way that can hurt them. And you don't apply them to the people that you like.
The way that you apply them is not actually in a way that stops the problem they're meant to address. It's a way that gets you the thing you want, which probably has nothing to do with that. We've seen this play out with, say, the TikTok ban, which might be the most absolutely egregious example. While I don't agree with the ban, it was something that was passed with a bunch of bipartisan support. It was passed after years and years of working with TikTok. It was then sent up to the Supreme Court and the Supreme Court upheld it.
it is hard to find a law that was more rigorously vetted.
And then Trump takes office a day after it takes effect, and he says, well, specifically, I like TikTok because TikTok got me elected and also TikTok has been saying I'm really great. So what I'm going to do is I'm going to sign an executive order. The executive order doesn't make an argument for why I have the power to extend this deadline. It doesn't make any kind of argument for why this is compatible with the law. What it says is don't enforce the law.
And then it goes to all of these platforms that are trying to follow the law and it tells them don't follow the law.
And there is absolutely no reason to do this that is compatible with the thing that Congress and the Biden administration and the Supreme Court did, because he doesn't care about the law. What he cares about is getting the law to do what he wants. And the Trump administration is kind of staffed with folks who believe this, who act this way. We talked about Brendan Carr a lot at the FCC, who
uses his enforcement power or his merger review power to push broadcasters into doing whatever speech he wants or punish them for news coverage he doesn't like. There's Elon, who seems like an important character in all this because he runs a platform. There's Mark Zuckerberg, who seems more amenable to making deals with the Trump administration on moderation. So, okay, we have this bill that says if you don't take down this imagery in 48 hours, the FTC can fine you.
Is that just another way for Trump to say, I could destroy your company unless you do what I want, or I can tell the FCC to hold off?
Yeah, there are two sides to this. And one of them is the side that we talk about often, which is what if this gets weaponized against people that the government doesn't like? And then there's the other side that I think less often is raised before Trump, which is even if you take this law seriously, you're not going to get it applied against the people that are actually hurting NCII victims. Because, again, the administration doesn't even care about applying the law to people that it should be used against. Yeah.
Elon is maybe the clearest example of that, which is just let's take the extreme view that it is worth doing anything to get NCII off the internet.
A place this would come into play is X, formerly Twitter, which has had probably the biggest NCII scandal of the last several years, which is that a bunch of sexually graphic Taylor Swift images were posted there and spread there. And it did very little to stop them. It eventually kind of blocked searches for Taylor Swift. If you're looking at major platforms, it's the first one you think of.
You cannot enforce this law against X. It is almost literally inconceivable because Elon Musk runs the department that governs whether the FTC has money and people who work there. The week before I wrote this, we broke a story that said that
someone, very likely DOGE, had cut about a dozen people from the FTC. I'm trying to imagine a scenario where X completely ignores the law and says, well, screw you, Taylor Swift, I don't like you. In what world does the FTC do anything? I can't think of a way where it would act in any way in the interests of NCII victims. Right. You can just make the comparison to the TikTok ban. Yeah.
Congress passes a law, it goes to the Supreme Court as I think the Take It Down Act would immediately go to the Supreme Court. Some version of the law remains or is thrown out, who knows. And then you have a law where the president can say, I'm telling my FTC not to enforce this law as it relates to X.
But at the same time, he might say, go push Mark Zuckerberg. I want to make sure I'm the most popular person on Facebook today. And so if that doesn't happen, we know for a fact that this imagery exists on these platforms because platforms at scale always have this imagery.
and we're going to find some way for the FTC to punish them, Elon, go get it done. And you can just see that play out pretty simply. I don't think you need to be very imaginative to get to that scenario. Is there any provision in this law that would stop it? So Ted Cruz and Amy Klobuchar are the primary sponsors of the bill. And I asked them about Elon. I asked them, do you think that X has any way that they could be dinged for this? They haven't gotten back to me.
I don't really know how you would build that because the point of laws is that Congress writes them and it says, here's what's supposed to happen. And the executive branch makes it happen. Like the original sin here is that Congress has now allowed the executive branch to just decide that it doesn't pass laws anymore. Like Congress isn't real.
And there's nothing that Congress can do inside one individual bill to solve the fact that it has ceded all its authority. The thing it has to do is get that back and say, you have to do what we want. So we have to be able to write laws again. So that problem is playing out, I think, across the entire government. That's the constitutional crisis that everyone is always talking about, that we are always writing about. But I just want to stay focused on the sort of easy-to-grasp notion of selective enforcement.
In a world where Donald Trump says there's illegal imagery on YouTube and I'm shutting down Google, I'm imposing fines so high on Google that it effectively can't run and we're not doing that for X.
That's a lawsuit, right? Like Google shows up and goes to court and says, this is selective enforcement. There are some interests on the other side of that that might reconcile that. But that all feels like the elephants are dancing and the regular people who are actually the victims of this imagery have no ability to stop the bad thing from happening anymore.
Does it feel like regular people who are actually the victims in the situation have any recourse at all? First of all, there's the whole part where you can try to directly go after the people who are posting this stuff and making it. But in terms of the larger platform stuff, you can probably file a lawsuit that says this law is not getting followed. And then that's good for you. You do not have the power of somebody like Google. You don't have the legal resources. There are nonprofits that will probably back you, and it's worth
a try, but it is not something that regular people should have to do, nor that regular people are probably the best equipped to do. So that's just the sort of graspable issue here, right? You have the selective enforcement, you have massive disparities in legal ability and resources and financing between the platforms and regular people. You have a constitutional crisis. Everyone can see that. I really don't think it takes a lot of imagination to see all of that play out in the context of this law and this administration.
Then there's one turn down the road where I think you do have to see some further consequences. You wrote in your piece, there are concerns that this law, the Take It Down Act, could be used to undermine end-to-end encryption or to somehow go after Wikipedia. How would that work? End-to-end encryption is another kind of thing that would be a problem even outside Trump, which is that it's just not necessarily clear that having a service where you can't see what's on it
doesn't still mean you're in breach of the law because you don't know whether there's something that you're supposed to take down. So say you're running Signal or iMessage and somebody says, well, there's this person forwarding this image.
And you don't, as the company, by design, have access to that image or have the ability to stop what people send through your service. So are you then liable under the FTC? This is just a problem that comes up with all kinds of rules about takedowns. It's a huge issue.
And then we get a little more to the selective enforcement where it's always, again, a problem, but we have never had such a clear indication that a presidency is going to abuse it. Trump has publicly said to Congress, well, I think I'm going to use this law, too, because nobody gets treated as badly on the Internet as me. And like everything, he kind of frames it as maybe a joke, but there is no reason to believe that he's joking anymore.
He has extorted millions of dollars from platforms that banned him because he filed these specious lawsuits and he's very powerful. So you could really see a world where he does decide that's not a joke. I'm going to go after any platform that I think treats me badly.
And we also then have, like you've mentioned, the Elon of it all. Elon has made his stance really clear publicly: he hates Wikipedia, which is a platform full of user-generated content that, while it is carefully moderated, could potentially have a problem where bad actors, egged on by Trump or a functionary or one of the many public outlets that supports him, try to get it punished by the FTC. Say somebody's spamming NCII on it, and it's trying to create a takedown process, but that doesn't stop the FTC from claiming that it's violating this process, and then they try to just drain its resources with a lawsuit that, say, it can fight, but that is just plausible enough that courts have to go in and try to work through it. And that's assuming you get a judge who is acting in good faith, and there is pretty good evidence that there are some Texas judges Elon Musk has worked with that are not acting in good faith, that have allowed things like his lawsuit against Media Matters, which is just absolutely ridiculous, to proceed in a way that has caused it to lay off staff and that has just drained it, even if it doesn't ultimately lose. We need to take another quick break. We'll be right back.
This episode is brought to you by Polestar. Electric performance is at the core of every choice that went into the all-electric Polestar 3. Like merging a spacious interior with the torque and handling of a sports car, or the ability to go from 0 to 60 in as little as 4.8 seconds, and get an EPA-estimated range of up to 315 miles per charge.
Choices like this all lead to making your decision to choose Polestar 3 obvious. Book your test drive today at Polestar.com. You may get a little excited when you shop at Burlington.
I'm saving so much! Burlington saves you up to 60% off other retailers' prices every day. Will it be the low prices or the great brands? You'll love the deals. You'll love Burlington. I told you so.
Workdays can be unpredictable, but your workwear shouldn't be. With the Cintas Apparel Plus program, you'll have freshly laundered garments for everyone on your team delivered every week. Cintas has workwear for just about any job imaginable, with high-performance fabrics and an uncompromising fit that stretches and moves with you. Don't leave looking and feeling good to chance. Visit Cintas.com and get ready for the workday.
We're back with Verge Policy Editor Adi Robertson. Before the break, we were diving into Adi's thesis around the Take It Down Act and how it might be weaponized by the Trump administration. What makes it worse is the fact that this Congress has ceded so much of its authority to the executive branch, in a way that puts us in a very precarious position when it comes to preventing presidential overreach. So what happens next? And more importantly, is there a way to actually tackle the problem of NCII in a meaningful way at the federal level?
Or are the victims here just caught in a political power struggle as the problem keeps getting worse? I think as we are all seeing in the early going of the second Trump administration, litigation might be able to solve some of these problems, but it is costly and slow going and by no means certain. And that feels like a thing that the Trump administration has now realized, right? That
flooding the zone with all of these actions, with all these executive orders. Maybe they'll even lose the majority of them. But the concept of action actually brings people into line. Do you think this is part of that trend? You mentioned Amy Klobuchar is one of the sponsors here. Is this truly a bipartisan effort, or is this a bunch of people who want something to happen, and this is the thing that seems most likely to happen?
I think this is bipartisan in the sense that these laws have been coming up for years now. The Take It Down Act is part of a long line of internet safety bills. Those bills are bipartisan because this is an issue that a lot of people care about and genuinely do want to stop. But I think that Democrats in Congress have just done an
almost incredibly bad job of responding to the threat of the Trump administration. And it feels almost like this is just inertia: this is a thing that maybe you could have done under another administration, and maybe you could have had these fights that we talked about to try to make it better. And we're not in that world, and they don't recognize it. You and I have covered
attempts to regulate content on the internet, attempts to regulate internet providers for a decade now together, maybe more, which is a little scary. And in that time, I feel like my personal pendulum has swung back and forth, right? To, well, maybe we should do some rules for platforms because the market is not providing any incentive for platforms to do this stuff, right? Like in a sane world, right?
The platforms themselves would have gotten way out ahead of we should not allow sexualized AI-generated images of people, and they would have stopped it. But instead, they're going the other way, right? They're moderating less and less for some reason, maybe to please Trump, maybe because it's cheaper. Who knows? And so it feels like, okay, the government should set some rules. And we see that happen in other countries. But right now, I'm at government-set regulation is bad because it will just be weaponized by a corrupt administration. And I don't know...
where that pendulum will ever land. I don't know if it will ever stop swinging for me. I'm curious where you are, because again, you and I have been doing this together for so long.
Yeah, I think there are a few questions for me. The first question is how much laws could address big platforms at all. There's the theorem that Mike Masnick came up with, which is just that content moderation at scale that's good is impossible. So, for instance, Meta's platforms, they do have rules against NCII. They have systems they've created that are meant to take it down.
They're just a gigantic platform. And for them to moderate at the level that they probably would need to in order to actually comply with, say, just promptly taking things down, the effort would have to be just massive. So there might just be something there
about bigness that makes it inherently impossible. So that's the first problem. Then the second problem right now is, yeah, as you mentioned, for a while it did actually seem like it was the government versus big tech. So at the very least, if you didn't like what these companies were doing, the government was at least targeting them and was trying to do something.
And we're just, I think, at the other side of the techlash now, because at this point, tech companies have gotten a friendly administration. And so the battle lines just aren't even drawn in the same way, which means that, I think, you can't trust Congress and you can't trust the administration to the same extent. And so pragmatically, even if you think these laws are good in theory, they're just less likely to make sense and work.
You also now have this at least partial movement to create alternative platforms that I think is more successful than it's been in the past. It's mostly come up through microblogging; say, Bluesky and Mastodon are serious attempts at contending with these big platforms. And those places are clearly more vulnerable. So the kind of threat that I think sometimes seemed really hypothetical in previous years, which is, well, these big platforms are going to be fine, but the little guys are going to
be hurt, which made less sense when, say, you didn't know where the little guys were and the big platforms seemed like they were going to get hurt. We're just in that hypothetical situation now. Like the stuff that sounded to me maybe kind of like, okay, I feel like I'm being paranoid here, it's actually just life. Yeah, we can just read about it every day. And I think one of the interesting things about your piece was that even some of the folks that we've
covered, that we've written about, that we've interacted with, who have made different trade-offs, who've said, actually, this problem is so bad, the speech trade-offs might be worth it, are agreeing with you that this bill is
a weapon that the Trump administration could use. Mary Anne Franks, who is someone who takes, I think, a different stance on the First Amendment in general than me and a variety of people that are similar to me, has still said, yeah, I wish it weren't true that the Trump administration is probably going to weaponize this in a way that doesn't necessarily help NCII victims. But it is.
This is a crisis that a lot of people think is unfortunately just going to skew the battlefield. And I think that comes back to you can have a lot of smart people with different views on where the line should be. That's civil society. That's what you're talking about. That's that system that is built up, right? Here are the think tanks. Here are the policymakers. Here are the academics who are going to argue about how to make policy and what the tradeoffs are and whether these ideas worked.
And usually that leads you to some rational refinement over time. But in this case, I think that whole ecosystem, that whole set of people is looking at bills like this and looking at the Trump administration saying, maybe we shouldn't give them more power because there isn't this check on it. There isn't this refinement process that will occur. And I'm just not sure how we get back to it. Right. That seems like the thing that has stopped everybody in their tracks. Right. We all know this is bad.
The problem is even happening to Trump himself. It's happening to his wife. It's happening to J.D. Vance. It doesn't seem like they're motivated to stop it, right? Or that if they're given the tools to stop it, they will use the tools to actually stop it. What do you think is going on there? It has just never been clear that this is a group of people who care about anything but what happens to them. They don't care about what happens to anyone else. And they also have spent
an extraordinary amount of time and energy signaling that they do not care about women, that, in fact, they support men who are accused of assaulting women, who are accused of sex trafficking women. And women make up the vast majority of NCII victims.
And that this is part of their attempt to establish an anti-woke culture, that this is a way to, as J.D. Vance puts it just sort of more broadly, that we need to protect masculinity, that we need to let men be men again. And I think that it is rare to see someone so blatantly tell you that he does not care what happens to women as long as it's the women he doesn't like.
And I think that you should absolutely not trust anything that anyone in the Trump administration says about protecting women, because it is only a way to get to the people that he thinks shouldn't be allowed to abuse women because he doesn't like them. His cabinet is full of men who have been fairly credibly accused of violence,
abuse and assault and harassment. He has recently allegedly stepped in to free someone who a Republican attorney general has called an admitted sex trafficker. I think that we can't trust him. Women should not trust him. No one who cares about this issue should trust him.
I mean, that is as clear a statement about the Trump administration as there's ever been. It's obvious now in a way that it was maybe subsumed in the first Trump administration, but now it's right there on the surface. And I think in the context of a law like this, which is ostensibly meant to protect people but can actually be used as a weapon against companies and people the administration doesn't like, it's worth saying out loud.
What happens next? Is this law going to pass? Is it going to get signed by the president? Congress seems like it's mired in dysfunction. What do the next steps here look like?
After KOSA, which came within like one vote of passing and then failed after everyone on earth in Washington said they were going to support it, I don't really know what happens. It seems like anything could fail now. This has advanced pretty far. And obviously it has the backing of the president and first lady. So I think it's definitely a real threat. I think that at this point,
Maybe congressional dysfunction could still save us. I think that maybe the best hope is that Congress does manage to pass something that is like the Defiance Act that has broad support and that really does create an actionable way to help this problem that is less clearly weaponizable. And I'm just hoping for that.
I'd like to thank Adi for joining me on the show, and thank you for listening. I hope you enjoyed it. If you have thoughts about this episode or anything else, you can email us at decoder@theverge.com. We really do read all the emails, and I will tell you, in the last week, we got one email saying an interview was the most boring ever, and another email saying the interview was the best we'd ever done. So keep them coming. You can also hit me up directly on Threads or Bluesky, and we have a TikTok and an Instagram. They're both @decoderpod. They're a lot of fun.
If you like Decoder, please share it with your friends and subscribe wherever you get your podcasts. If you really like the show, hit us with that five-star review. Decoder is a production of The Verge and part of the Vox Media Podcast Network. Our producers are Kate Cox and Nick Statt. Our editor is Ursa Wright. The Decoder music is by Breakmaster Cylinder. We'll see you next time.