Renee DiResta's research detailed the mechanics of the 'big lie' about the 2020 election, which angered the people involved in spreading that misinformation. Her work has since been spun into a conspiracy theory that she was part of a government scheme to censor right-wing speech, leading to harassment, subpoenas, and lawsuits.
This scrutiny can have a chilling effect on researchers and institutions, deterring them from continuing their work on content moderation and online misinformation. That could harm the public, especially children, by allowing harmful content to spread unchecked.
The SIO turned over all relevant documents and materials, which showed no evidence of any government-directed censorship. Despite this, the accusers offered no apologies or corrections; instead, they moved the goalposts to new accusations.
Social media platforms are responding to public pressure and the political landscape. They face costs and risks associated with moderation, but they also recognize the public's desire for a safe and moderated environment. The platforms are trying to balance these considerations, often making decisions that align with the expectations and values of their dominant user base.
While foreign actors were present and attempting to influence the election, they were not as impactful as domestic factors. The platforms had improved their detection and response mechanisms, which likely mitigated the impact of foreign interference and AI-generated content.
Niche platforms allow users to find communities that align with their values and reduce harassment, but they also risk creating echo chambers where users are less exposed to diverse perspectives. This could exacerbate societal divisions and reduce opportunities for constructive dialogue.
The Fediverse is a network of independent servers using the same protocol (ActivityPub), allowing users to join and moderate their own communities. It offers more granular control over content and moderation, and allows interaction between different servers, creating a more community-focused and user-controlled social media experience.
Renee believes it is crucial to reach audiences that might not normally encounter her work or the facts she presents. Engaging in these spaces can help counter misinformation and provide a more nuanced understanding of complex issues, even if it invites harassment.
From the Vox Media Podcast Network, this is Channels, a podcast about media and tech. I'm Peter Kafka, and when I'm not chatting here, I'm also the chief correspondent at Business Insider. Thank you to everyone who had nice things to say about our last two episodes, and thank you to everyone who told other people about them. You guys rock. Also, thanks to Hello Donuts, the fine Philadelphia donut shop that closed its doors last weekend.
I dropped by and I got a half dozen and a cool t-shirt. If you listen to Channels and you own a donut shop in the Acela corridor, please let me know. I would like to buy a donut from you in the future.
Today's chat is about content moderation. Wait, don't go anywhere. It really is a conversation about content moderation, but this is not a fuzzy theoretical discussion. It's a real-world discussion about the way big platforms like Facebook and Twitter do and don't police what's on their sites, and how that might be changing. It's a chat with Renee DiResta, who studies this stuff and who is facing very real-world repercussions for that work herself.
She's been sued. She's been harassed by Congress. And as we discuss, I think those repercussions are likely to get more severe under the next Trump administration. And yes, we also talk about Joe Rogan, because this is a podcast in the late fall of 2024. But this is a different Joe Rogan discussion than the ones you're used to, so I saved it for the end. And now, here's me and Renee DiResta.
Renee DiResta is an academic and researcher who specializes in online abuse. She now works at Georgetown School of Public Policy. She is also the target of powerful people who want to punish her for her work, and those people just got a lot more powerful.
Welcome, Renee. Thanks for having me. Did I describe the work you do correctly? Is that the best way of describing it, online abuse research? Yeah, I study adversarial abuse, so people who are trying to use technology to manipulate the public in some way. If you are interested in this stuff, you have probably heard Renee talk about it at some point or seen her quoted. She's done a lot of good work, and she's been helpful to journalists like me who want to figure this stuff out.
It's also raised your profile among people who are really angry at you. As you've described it, your work has already generated everything from harassment and threats to congressional investigations to lawsuits that name you specifically. One of those lawsuits is spearheaded by Stephen Miller, who is about to reenter Washington as an important policy advisor for Donald Trump.
On top of that, you've got Trump allies like, sorry for the long intro, Trump allies like Jim Jordan, Marc Andreessen, and Elon Musk all making noises about punishing the, quote, censorship industrial complex. And my understanding is that part of what they're talking about is punishing you. So we might get into the weeds on this, but big picture, can you explain to our listeners what these people are talking about, why they're angry at you and your work? Well, to be clear, I don't think they're actually angry about me or my work. They're angry about...
Well, there was a propaganda campaign that spun our work into something that was advantageous for people who wanted to maximize their chances in the 2024 election. We studied election integrity in 2020, and we detailed the mechanics of the big lie quite extensively in a 200-page report.
And that report pissed off a lot of people who were involved in spreading the big lie. So, you know, there's quotes about how, you know, if you have good enemies, it means you've stood up for something in your life and you're doing meaningful work. And that's how I choose to think about this.
One of the dynamics of the propaganda campaign about our work is that it turned us into what I think of as characters in a cinematic universe, right? They needed some villains to position themselves against to rally their base. And so the people who made claims about our work, pretending to be exposing wrongdoing or whatever, used very classic storytelling tropes in which I was some kind of villain and they were some kind of hero, and they were exposing us, and, you know, it's all Substack subs, and that's basically what we're confronting now.
And when you say us, you were previously a research director at Stanford Internet Observatory. Is that the us you're talking about? Yes. Okay. And we can get into this too, but Stanford Internet Observatory was created post-2016 to study online interference from folks like Russia. You did a lot of work there. Alex Stamos did a lot of work there. That institution is mostly shut down now. Stanford sort of disputes that, but basically you didn't receive a renewal. Alex Stamos left. It seems like they don't have any more funding. Yeah.
That's correct. Okay. So just for context there. So I'm just wondering if you can talk about how your attitude or thinking may or may not be changing as a result of the election. Like I mentioned, you'd already been harassed. You'd already been hauled before or asked to speak before Congress. And you can talk about sort of what it's like to be the subject of a Jim Jordan inquiry. But a lot of the folks that you were tracking and concerned about throughout your career are
either going to be in the White House or very close to the White House. RFK Jr. is someone you spent a lot of time researching earlier in your career, and he now looks to be getting a big role in the US health apparatus. Have you thought about what this is going to mean for your work, and then for you personally? Well, I think me personally is...
Not the problem here, right? It's the American public that's going to suffer. It's children who are going to get sick because RFK Jr. continues to promote discredited lies about vaccines causing autism, right? It's people who are going to be subject to the same kind of kangaroo court investigations that I was, in an effort to silence them and silence their work. And again, just to be clear, we experienced this, you know, we were perhaps the canary in the coal mine
in the context of the Jim Jordan investigations, but climate scientists went through a very similar rigmarole back in 2012, before the internet was what it is today, so it got a little bit less attention. But the intent was to discredit their work, to set it back. And it did, it did set it back. And, you know, climate change is real. And the impacts of setting back that kind of research are...
you know, while they're perhaps most immediately felt by the people who are targeted, the problem is societal, the impact is societal. And so we're entering into, I think, a very unfortunate time where we really need people to stand up and push back to, we need institutions, particularly at the state and local level now, to stand up and push back. We need governors, we need
you know, civil society, this is the time to speak up and speak out. And my concern is that people are going to see what happened to us and they're not going to want to. That's the impact it's going to have. It's going to be seen as a chilling effect. So I'm not particularly worried about myself. I'm worried about the broader societal implications of what's about to happen. So I share those broader concerns and maybe even some specific ones. I'm a journalist and the coming administration looks like they're very angry with journalists as well. But
But just to push a little bit, I am curious. I mean, when you got appointed at Georgetown, did you guys discuss the fact that, you know, you have a long list of enemies coming after you? And does Georgetown want to get itself involved in
hiring someone who may be the subject of more lawsuits, you know, more congressional investigations. Is that the kind of thing that you have to deal with professionally? Well, it is, but I want to, again, harken back to history here, right? This is what happened under McCarthyism. This is what happened with the House Un-American Activities Committee. There were efforts to turn certain people into scapegoats, into cautionary tales, so that institutions would back down. So Georgetown, I think, really has a strong sense
of that history and being in DC, I think helps. I think other institutions need to see the writing on the wall and recognize that we were again, canaries in the coal mine, but if somebody wants to come after you, I think it's very important to understand that no amount of documents that you turn over are going to exonerate you.
Nothing that you actually have done is really the subject of the inquiry. It's a propaganda campaign to frame you, to smear you, to make you untouchable in an effort to deter other people from continuing to do their work as well. And I think it's very important...
you know, well, my own personal feeling on this: in March it will be two years, right, that I've been under these investigations and lawsuits. And so I think this is a normal for me now. What I really want to get across, though, is I spend a lot of time reading history these days, reading transcripts from the House Un-American Activities Committee, watching the dynamics of what happened, and the historical parallels are right there. Now, about the work
that we did at Stanford Internet Observatory: the conspiracy theory that was spun up about it, just to clarify for your listeners who may be unfamiliar, is that we were created as some sort of government cutout by the deep state to deliberately censor right-wing speech in an effort to rig the election. That is the conspiracy theory that was spun about our work and that led to the investigations.
The only justification, the impetus for the investigation, was two reporters from the Twitter Files sitting and misleading the committee with written testimony, under oath, that said things like we had censored 22 million tweets, an absolutely false claim, complete, utter, and total nonsense. And when all of the documents and things were turned over, that was again borne out.
There was absolutely no evidence of this anywhere. The insinuation and innuendo alone, the accusation, was enough to spark the investigation. And even after the investigation, all of the documents that were turned over, all of the many, many hours of testimony, even after that showed absolutely no evidence of those core claims, they didn't say, whoops, sorry, guys, we got it wrong. They moved the goalposts and made a different accusation.
And this is why when I make the connection back to McCarthyism and to that history, I think it's really important that the public understand and that those who are going to be targeted understand that you don't get to exonerate yourself because this is not actually an investigation.
And this stuff is, right now, it looks like it's only picking up steam, frankly. I mean, you've got Marc Andreessen going on Joe Rogan, I think it was last week, and explaining how NGOs like yours are not really NGOs, but are really doing the work on behalf of the government, and this is just a clever way to get around it.
So a lot of people who hadn't even heard of this stuff before are getting to hear about it now. Just one last question on this. Do you do your work differently now? Do you not put things in email that you used to put in email, or do you use encrypted stuff where you weren't using encrypted stuff? Or does it frankly make you reconsider some work you're doing, saying, I don't want to touch that one. That's going to trigger the Jim Jordans of the world. Maybe I'll study something else instead. Well,
Well, I think, again, it's important to note that what you actually do or don't do doesn't matter. So, you know, I have never been shy about my work. I've never been unwilling to turn over, you know, documentation.
about the research that we did because we stand by it fully. And even as you read the reports that Jim Jordan's committee put out, you can see how absolutely flimsy they are. There is no smoking gun email in there that I have written because the emails were so mundane. As for Marc Andreessen, one of the things that's remarkable
about that is that Marc's been a board member of Meta for a very, very long time. So even as he's making comments about, you know, Meta mass censoring, sorry, he says social media companies, from which perhaps he mentally exempts Meta, running these various cutouts and censorship cabals and whatever else he kind of rattles on about, he was a board member during the election. He knows exactly what work Meta did.
And he's putting out a quote, this is on Twitter, which he's blocked me on, but I can see it, that says: every participant in the orchestrated government-university-nonprofit-company censorship machine of the last decade can be charged criminally.
And then he lists some federal laws. So presumably that'd be him as a board member. I mean, you know, ultimately the board members are also responsible is my understanding of the law. But, you know, I guess we'll all wait and see. It seems like, you know, kind of performative saber rattling to gin up support and, you know, maximize clout on X. But what do I know? I guess we'll see. Yeah. And I guess that is part of the question, right, is how much of this is real and how much of it is performative?
In some cases, I guess it doesn't matter, right? If you're arguing, look, the whole point is to have a chilling effect. It doesn't matter if it's real or not. The threat of it is important. For you specifically, it matters a lot whether it's real, whether you have to defend yourself against a criminal investigation. I mean, I guess this is the part where we see how willing DOJ and others are to spin up just front investigations.
I think I'm less concerned about me, again. It's been two years now, you know; the material that we worked on has been turned over and made public. The public can see it. Jim Jordan released all of the tickets, you know, the sort of tracking tickets we used to study online rumors. They're out there. The public can see them. And what you see in those tickets
actually refutes the conspiracy theory. They're tickets created by undergraduate students, by researchers, by graduate students, by even people like me. We were not sent secret messages by DHS or CIA or whatever deep-state entity, demanding censorship. The material is out there and the public is welcome to look. We'll be right back with Renee DiResta, but first a word from a sponsor.
Support for this podcast comes from Stripe. Payment management software isn't something your customers think about that often. They see your product, they want to buy it, and then they buy it. That's about as complex as it gets. But under the hood of that process, there are a lot of really complicated things happening that have to go right in order for that sale to go through.
Stripe handles the complexity of financial infrastructure, offering a seamless experience for business owners and their customers. For example, Stripe can make sure that your customers see their currency and preferred payment method when they shop. So checking out never feels like a chore. Stripe is a payment and billing platform supporting millions of businesses around the world, including companies like Uber, BMW, and DoorDash. Stripe has helped countless startups and established companies alike reach their growth targets, make progress on their missions, and reach more customers globally.
The platform offers a suite of specialized features and tools to power businesses of all sizes, like Stripe Billing, which makes it easy to handle subscription-based charges, invoicing, and all recurring revenue management needs. Learn how Stripe helps companies of all sizes make progress at stripe.com. That's stripe.com to learn more. Stripe. Make progress.
And of course, podcasts.
Yes, the thing you're listening to right now. Well, it's increasingly being produced directly by companies like venture capital firms, investment funds, and a new crop of creators who one day want to be investors themselves. And what's the point?
And what is actually going on with these acquisitions this year, especially in the AI space? Why are so many big players in tech deciding not to acquire and instead license tech and hire away co-founders? The answer, it turns out, is a lot more complicated than it seems. You'll hear all that and more this month on Decoder with Nilay Patel, presented by Stripe. You can listen to Decoder wherever you get your podcasts. Thumbtack presents the ins and outs of caring for your home.
Out. Indecision. Overthinking. Second-guessing every choice you make. In. Plans and guides that make it easy to get home projects done. Out. Beige on beige on beige. In. Knowing what to do, when to do it, and who to hire. Start caring for your home with confidence. Download Thumbtack today.
And we're back. Let's pull back the scope a little bit. As we're having this discussion, there's a parallel discussion going on with all the big social media platforms. And it started well before the election. I think it started when Elon Musk bought Twitter.
Sort of a rethinking of the way those various platforms had engaged in moderation post-2016, through COVID, through the election denial campaigns, and basically sort of a pendulum starting to swing back, saying, you know, maybe we overdid it a bit, which is what Nick Clegg, chief comms guy for Meta, said to the press yesterday. Mark Zuckerberg said a version of that in a letter to Jim Jordan this summer. And it's important because a lot of people are trying to figure out: all right, is this what the platforms mean, or is this something they're doing to make Republicans slash Trump happy? How much of it do you think is pure politics, and how much of it is
actual discomfort with the moderation the platforms were doing over the last eight years or so? So possibly a combination of the two, right? One of the things I think it's important to understand is that for a long time, platforms shifted policy in response to public pressure, right? There's a phrase we use for that: working the refs. And sometimes that was refs on the right, right? You may recall Facebook once had a trending topics feature.
It would kind of, you know, periodically go haywire and recommend Macedonian content-farm nonsense. Like, you know, I remember in 2016 it was Megyn Kelly fired by Fox News, Pope endorses Donald Trump, all sorts of crazy stories. There were witch blogs that would show up in the science section every now and then. And so they had human curators who would just try to keep the, you know, the weird sort of viral spam out of the kind of curated trending feed. That was recast as anti-conservative bias. And, you know, Meta had to kind of host a whole bunch of, you know, sort of right-wing media folks. Glenn Beck, I believe, went down, and a couple of others. And this led to them really shifting, trying to assuage concerns by simply eliminating human curation on that feature altogether, which then, of course, resulted in it going absolutely, completely
haywire, and then ultimately killing the feature altogether. Right. So this dynamic of, you know, working the refs, of course, you also see on the left, right? The left is also, you know, lobbying for its particular preferred moderation and curation standards. It's been kind of a back-and-forth for a while. And they do respond to, you know, to the concerns of the dominant party in power, because they know that otherwise they're going to be hauled in for hearings and subpoenas and, you know, all sorts of other, uh,
types of things. So there is a component of jawboning in that dynamic. I think one thing, though, is that there were, I think, legitimate missteps as they tried to moderate, and Nick Clegg said this in the commentary that you're referring to, which I read this morning.
They did do things like throttle the lab leak hypothesis. Right. And there were certain kind of high profile cases where, you know, Hunter Biden's laptop is another one that comes up. That issue will never die. Well, yeah, because it's pretty extraordinary. Right, it was. Twitter suppressing links to a New York Post story that turned out to be true. Yes. I mean, lots of... Well...
Lots of people had good reason to think it wasn't true, but they actually said, you can't read the story, essentially, via Twitter. Right. Which Jack Dorsey then apologized for; he said that was overreach. But it was sort of the worst-case scenario you would hear a conservative dream up
of online censorship, and they made it a reality. Exactly. And so in those particular cases, you know, that apology was issued. They unthrottled it, I think, somewhere between 24 and 48 hours later, and they tried to kind of set things right. They had congressional hearings in which employees of Twitter at the time went and testified about that decision. In the Twitter Files, you can read them making that decision, you know, how they decided to make these determinations. So,
Moderation is challenging. It really is difficult. Somebody somewhere is going to be upset about every decision. As the platforms have tried to cope with issues of scale, they oftentimes use AI moderators. I've seen a lot of people on Threads complaining that suddenly, you know, their posts are deleted, they're locked out of their account. Oftentimes these are journalists who are just sharing stories, and it's sort of unclear why the moderator did the thing. And so there is a lot of dissatisfaction. And I think that one of the issues here is that there's just not a lot of transparency
around content moderation. When we did our work, we did try to understand moderation of the narratives that we paid attention to. We focused explicitly; we didn't work on the lab leak hypothesis, we didn't work on Hunter Biden's laptop, that was out of scope for us. We were looking very specifically at narratives related to voting.
And when we went and looked, we noticed that even in posts that appeared to very clearly violate their policies, moderation was really not particularly uniform. Sometimes one account would post something and get moderated, oftentimes just via a label. Overwhelmingly, things were labeled way more than they were taken down.
But the same post would receive a label when one person said it, but not when another person said it. And that creates feelings of inequity. It creates feelings of unfairness. It undermines the legitimacy of the very enterprise of moderation. And then, of course, there are the real mistakes
that then are turned into propaganda campaigns that attempt to delegitimize the idea of moderation entirely, to reframe moderation as censorship, even if it is just something like a label. And that narrative was very effective on the right and became kind of a centerpiece in the campaign.
I mean, you spent a lot of your time working with people who worked at Trust and Safety or what they usually describe that way at various platforms. Twitter famously has gotten rid of almost all those people. And in general, there has sort of been a de-emphasis on that work across the platforms. They've rolled back restrictions they previously put in. The people you work with, I assume, are very committed to the idea that moderation is important. But I'm wondering...
If you have a sense of how their bosses, the CEOs of these companies and shareholders for that matter, think about moderation, do they think it's important or would they rather say, look, we'd rather just not be in this business altogether if we could. And so whatever we're doing is as grudging as possible. Oh, I love that question. So first, when we worked with Facebook and Twitter trust and safety teams, that was
almost exclusively on two issues. One, child safety. So the presence of illegal child exploitation content. Unfortunately, periodically, they would not catch it. And we would. And we would engage on that. The second area was state actor takedowns.
So when we saw evidence of influence operations, we would engage with them around, hey, there are these accounts, they're pretending to be Libyans, they're posting Russian content.
you might want to take a look at this, like that kind of thing. And then we would engage around the data sets. And that was not limited to the United States at all. So that was really very much significant international work. Occasionally we engaged with them when, for example, the US Pentagon was running influence operations. So those were the two areas that we engaged with the platforms on. We did not engage, again, in routine content moderation decisions with the exception of periodically tagging them in something
around election or vaccine-specific narratives that appeared to violate their policies. So that was our means of engaging. Now, a lot of that work is much more insular now. Those collaborations are largely done, at least with Stanford, obviously. SIO had its own
rather dramatic collapse, but I don't think that they're doing very much with other outside voices. And this, I think, is a problem, because it does mean that those decisions are being made fully within the company, which is sort of a form of unaccountable private power, right? And I think you do want transparency. You do want data sets being released. You do want data sets evaluated by independent researchers. We want those channels of communication to be open to make sure that the public can
have an independent outside assessment of what is happening in those areas. The other thing that had a chilling effect was that as these sort of, you know, weird theories of cabals and deliberate censorship made the rounds, lawsuits were filed as well. There was one called Missouri v. Biden, then Murthy v. Missouri when it got up to SCOTUS, that actually actively sought injunctions to prevent the government from speaking to the platforms.
And again, even though some of those injunctions had carve outs for matters of national security, which one would assume would include state actor campaigns.
The government, out of an abundance of caution, really just ceased communicating entirely during that period, leading Meta to say in November of 2023, the government doesn't talk to us anymore. And they were saying that in the context of releasing new information about Russian, Chinese and Iranian influence operations.
where normally they might have engaged with government on those topics as well. Now, all of a sudden they weren't. So this is the environment that we find ourselves in, right? You want to have transparency. I think it's reasonable if government is asking for a takedown or a moderation action to make that transparent, right? To kind of put that out there as Google and some other companies do. But again,
You don't want companies to simply say, we're not going to do this anymore. We're not going to engage with researchers or the government anymore. Everybody is now back in their own silos because that's where we were prior to 2016. And we didn't like how that worked out either. And that's all useful. I guess I just try to...
I'm assuming you don't spend a lot of time hanging out with Mark Zuckerberg or Sundar Pichai or Neil Mohan or any of those folks. But do you have a sense of whether they are kind of relieved that the tide has turned back on this and there's less demand for it or less of a certain kind of demand for it? Or do you think they're frustrated as well and they would actually like to have better control of the platforms? Well, if there are...
Things like high profile state actor influence operations that happen as a result of inaction, you're going to see the public outcry and the pendulum swing back. So they're in a, you know, they're kind of between a rock and a hard place here. And I think that, of course, you know, moderation is a cost. Fielding employees to do the work is a cost. Building classifiers is a cost. You know, focusing developer attention is a cost.
This is not a thing that they are happy to do. But the one thing I'll say is that the public wants moderated platforms. The public does not want to encounter certain types of what we sometimes refer to as lawful but awful content. The public does not want to be harassed. You see this come out in polling over and over and over again. If you want to be on a platform that does not moderate at all, 8chan is right there for you. Telegram is right there for you, right?
People are not choosing to spend time on those platforms, because they do want to have the experience of feeling that they're on a platform where, when they post, they're not going to be harassed. They're not going to be pushed out of conversations. They're not going to be engaging with, you know, spam bots, fake accounts, scammers, and Russian trolls. Right. So what we see is the tension that the platforms face.
It is a business decision to moderate. This has been well established since, you know, I think there's a great paper by Kate Klonick, written in 2017, called The New Governors, that looks at the ethics, or the ethos maybe I should say, that informs platform moderation and speech policies.
One of the biggest criteria is actually that platforms are trying to create environments that align with the expectations of their users. And so you do see moderation shifting in one direction or another in accordance with the dominant voice on the platform. But the one thing I'll say, sorry for monologuing here, is that you are seeing an exodus of people on the left
from Twitter, particularly after the election, because there are now other platforms for them to go to where they feel like the culture is going to be more aligned with their values. And that's going to be a very, very significant thing going forward. That's actually where a lot of my work is focused, is on that dynamic of when users have choice
What are they optimized for and where do they go? And I think that Blue Sky and Mastodon and Threads and some of these others are really interesting ways of seeing how users migrate in response to company policy decisions.
Yeah, let me come back to the other platforms at the end of this conversation. I want to talk about the election. People in your world and mine spent so much time over the last few years talking about election interference, whether that happened in the last election, preparing for a world where Trump and his allies refused to accept election results. This is what your last book is about, in a lot of ways. It's called Invisible Rulers; you should all buy it.
Donald Trump on Election Day was saying, yeah, there's all kinds of interference happening in Pennsylvania. Obviously, all of that went away as soon as the election was over. But what happens to all of that energy and effort that people were putting into either getting people alarmed about election interference or preparing for a world where election interference is a reality?
What happens to all that energy? I guess I'm mostly thinking of sort of on the conservative side where they were using that as a tool to whip people up. But I'm thinking in general, lots of people spent a lot of time thinking about this stuff. It seems to have vanished overnight. What do you think happens to that infrastructure and energy?
Well, that infrastructure and energy will be pointed at something else, right? Because if you need to kind of rile up your base- It just has to go somewhere. It's got to go somewhere. Yes. And it's very powerful. I think that rage and that feeling that we are doing the investigation and finding out the truth, right? That's why QAnon was so powerful. Yeah.
it was that sense of agency that members had. We're kind of co-discovering the facts of the world together, right? We're exposing the wrongdoers. And you're just going to see that pointed at something else. It's not totally clear what it's going to be, right? There'll be things that will, you know, sort of emerge from the online ether in response to whatever kind of current events happen in the world. But that energy will be galvanized and will be put towards something. I think it is really, really, really important, though, to highlight that,
that there's no accountability for any of the people who ginned up that outrage, who made those claims about the election being stolen. As you say, it all simply evaporated. And that's one of the things that I think really needs to be internalized and understood. Because I think one of the ways that people, even in research, even in media, characterize this is
um, they describe it as misinformation, and it's just not. That word is so inadequate for what that dynamic actually is. Misinformation implies that, like, when the facts of the matter are revealed and the person learns that they were wrong, they say, like, whoops, sorry, you know, they correct their priors, and, you know, then they
maybe reckon with the mistake. That's not what happens here. It's a propaganda campaign to galvanize a political movement. And that political movement needs an outlet and needs to be redirected to something else. And it will be, because it was not actually about the facts. The facts were not the problem, and none of the people participating in it, you know, Elon Musk going on about Dominion voting machines on stage at rallies...
There was never any reckoning or apology or anything along those lines. It simply stopped happening. We'll be right back with Renee DiResta, but first, a word from a sponsor. Support for this podcast comes from Stripe. Payment management software isn't something your customers think about that often. They see your product, they want to buy it, and then they buy it. That's about as complex as it gets. But under the hood of that process, there are a lot of really complicated things happening that have to go right in order for that sale to go through.
Stripe handles the complexity of financial infrastructure, offering a seamless experience for business owners and their customers. For example, Stripe can make sure that your customers see their currency and preferred payment method when they shop. So checking out never feels like a chore. Stripe is a payment and billing platform supporting millions of businesses around the world, including companies like Uber, BMW, and DoorDash. Stripe has helped countless startups and established companies alike reach their growth targets, make progress on their missions, and reach more customers globally.
The platform offers a suite of specialized features and tools to power businesses of all sizes, like Stripe Billing, which makes it easy to handle subscription-based charges, invoicing, and all recurring revenue management needs. Learn how Stripe helps companies of all sizes make progress at stripe.com. That's stripe.com to learn more. Stripe. Make progress.
Your home with Blinds.com. Fa-la-la-la-la-la-la-la-la. Fa-la-la-la-la-la-la-la-la-la. D-I-Y or let us install. Fa-la-la-la-la-la-la-la-la. Fa-la-la-la-la-la-la-la-la-la. Free design consultation. Free, free, free, free, free. Plus free samples and free shipping. Free, free, free, free, free, free. Head to Blinds.com now for up to 40% off site-wide plus a free professional measure. Fa-la-la-la-la-la-la-la-la-la-la. Rules and restrictions may apply.
This episode is brought to you by LifeLock. The holidays mean more travel, more shopping, more time online, and more personal info in places that could expose you to identity theft. That's why LifeLock monitors millions of data points every second. If your identity is stolen, their U.S.-based restoration specialist will fix it, guaranteed, or your money back. Get more holiday fun and less holiday worry with LifeLock. Save up to 40% your first year. Visit LifeLock.com slash podcast. Terms apply.
And we're back. Another theme that people were talking about a lot during the election was we're going to see a rerun of 2016 where you've got Russia, Iran or China or all of them trying to interfere with the election. And then this time it was going to be supercharged by A.I.
deep fakes, all that stuff. Maybe we're missing something, but it doesn't seem like that was a major factor in the election. Again, Meta just put out a report saying, you know, AI didn't seem to be a big deal. And yes, there were attempts by sovereign nations to interfere with the election, but not that big a deal. Does that mean we got the threat wrong, or that the platforms got really good at handling the threat, or somewhere in between?
So I think a lot of us... Well, let me try to caveat that. At Stanford, anyway, at Stanford Internet Observatory, our feeling was always that...
State actors are present. They're amplifiers. They're in the space. They are not as impactful as what's going to happen in domestic land. I lay this out in the book, too, in the context of the understanding of 2020, right? The Iranians were doing things in 2020. We had fake Chinese networks, fake Russian networks, fake Iranian networks. They're there. They get some lift. Sometimes their accounts manage to seed a conversation. We did see things like
The DOJ releasing that indictment against Tenet Media, showing that Russia had actually moved into explicitly hiring influencers, right? So we see the evolving strategies. They're there. There's no cost to them to being there, right? Like, why wouldn't you try? This is the information environment. This is how we shape public opinion today. We do it on the Internet. You can do it essentially for free.
Right. With the Tenet stuff, I think they spent like $10 million, and everyone was talking about how much money that was for the podcasters. But of course, it's literally nothing. It's nothing for the state, right? So again, we went into it thinking, will they be there? Possibly. Will they do something novel? Possibly. I don't know that we ever heard
a resolution or attribution on those bomb threats, right, that happened down in Georgia and places, that did temporarily halt voting as, you know, people were evacuated and things like that. That was attributed to Russia, I believe, by Secretary Raffensperger. I don't know that there was ever any corroboration of that from
ODNI or the FBI or CISA, the entities that normally would have released a statement. So again, that was kind of an alarming potential escalation, right, actually actively interfering. But it quite clearly also did not sway the election.
So there are these, you know, you can kind of hold two ideas in your head at once, right? It's bad for foreign governments to try to do this kind of interference. We need to be vigilant. We need to respond. Also, it's not the be-all end-all, and we shouldn't overfocus on it. Do you think that we collectively overfocused on it after 2016? I do. Yeah, I do. The Cambridge Analytica scandal, which I think a lot of people now have sort of reconciled as having been
you know, made way too much of. Well, this was always sort of, it became, again, a propaganda campaign.
And what I mean by that is, you know, I did one of the outside investigations for the Senate Select Committee on Intelligence on the data sets turned over by Facebook, Twitter, and Google, right? And the Russians, when they were undisturbed in the period of about, you know, two years before the platforms found them and took them down, they were actually effective at inserting themselves into targeted communities
growing large followings, you know, half a million followers on some of the Instagram accounts. They were effective on Twitter at getting influential people to retweet them. They were participants in the conversation. But again, as I was always, you know, very careful to articulate, that did not indicate that they
swayed people or that they made them vote or not vote in any particular way. So again, we have to hold the two things in our head: that this is bad and we should be taking it down, we should be acting against it, and also that this is not the be-all end-all in a very, very complicated information environment. So that was my personal feeling on it as a person who did some of that research firsthand. I generally hate this kind of equivalence, but it seems to me there is something there, you know.
People on the left after 2016 were looking for a reason to explain why Hillary Clinton had lost, and Facebook was sort of the easiest punching bag available to them. And you saw a version of that in 2020: how could the Republicans have lost? Well, it had to be election interference. And again, I'm very wary of a false equivalence, but it seems like there is something at play there.
Well, it's distrust, right? It's looking for a way to... It's looking for a scapegoat. It's looking for an easy answer. It's looking for a convenient answer. That's what conspiracy theorizing attempts to do. And conspiracy theorizing also layers on this component of that entity over there behind the curtain did the thing, right? And this is why it was so remarkable to me to have my own work...
studying the big lie in 2020 reframed as this is how the election was rigged, right? Those people in Stanford and other places, these cabals of academics did the rigging. I mean, that's an absolutely surreal take, baseless, completely baseless. But it appeals to people's sense of something unfair happened to me and I'm looking for a villain to point at. And so that's where I think that
We look at technology as the cause. It's a tool, it's a component, but I think there is an ascribing of, you know, more to foreign actors, when there are other ways to look at technology, looking at the dynamics of, for example, content curation and follower networks and other things that shape who we engage with and how. And I try to get at that in the book, right? This idea that
we've really devolved down into niches, and there's very little trust between them. And that is something that social media has exacerbated. But it's also, you know, the influencers and the participants, what I call the crowd, who
are actively involved. It is not something where, if you eliminated technology tomorrow or eliminated social media tomorrow, everything would go back to being quote-unquote normal, whatever that means to people. Yeah, we're not getting rid of social media or technology. The idea that you brought up earlier, that people are leaving Twitter or other platforms and going to the Mast... I don't think anyone... I mean, I know people are technically going to Mastodon. But the
point is, they're going to other platforms. And there's a whole constellation on the right that don't seem to have a lot of pickup, but they exist. And now you've got Threads and, like you said, Blue Sky.
Is that the future of social media, that everyone sort of goes to the platform that represents their ideology? Because one of the counters to that is, actually, there's no fun in doing that. You want to be around, you want to reach, the most people. Like, maybe you should be engaged with people who don't think the way you do; you want to persuade them. Or are we just going to head back to silos where you're unlikely to face pushback?
I don't think it's so much silos that you're unlikely to face pushback. I don't think that's what people are necessarily going for. I think that it's what's happening is really interesting. So first, the market solutions to dissatisfaction with content moderation and platform policies began on the right, right?
There was an entire ecosystem, Parler, Gettr, Truth Social, Gab, these platforms that spun up explicitly to offer right-wing audiences the content moderation styles and quote-unquote free speech that they were looking for. And they weren't sticky. I cannot stress enough, they were not sticky. Yeah, Rumble is the most successful, and folks in that world tell me it's not actually that successful, that you don't actually get a lot of engagement there.
And then Truth Social is successful as a financial vehicle for Donald Trump, but doesn't seem to have many actual users. It does not have much activity. And the...
The thing that's interesting about that, when we try to get at why: one of the things we used to kind of joke around about, but I think there's actually a lot of truth to it, is that reactionary groups need something to react to. And sitting there saying, like, you can't own the libs if there are no libs. So they would constantly migrate back to places where the libs were. And actually, you do see now, you know, right-wing groups kind of deliberately, you know, raiding Blue Sky, if you will, to try to trigger the libs, right? This is an old dynamic, you know: trolls do what trolls do. There's just more places for them to do it. But what's happening, the thing that I am interested in with Mastodon and Blue Sky and Threads, is not
their potential as, quote, left-wing social networks. Like you said, I actually enjoy pushback. I want to follow people who are ideologically diverse and interesting; I want to have arguments.
Twitter kind of crossed a Rubicon for me personally. It became just a platform for just massive harassment. I had, you know, nutcases screaming at me all day long about supposed cabals that I ran and how I secretly worked for the CIA. And I was like, this is just not fun anymore. So people, again, people want to be on social media to enjoy themselves. They want to enjoy the argument, not feel like they're engaging with people who are going to become threatening in some way, right? There's a difference there. There's, um,
One is a debate. The other is harassment. And people want the debate and not the harassment. What's interesting to me, though, about Blue Sky and Mastodon is actually the technological underpinnings and what they make possible. So I think that the Fediverse is going to be very interesting. I hear you laughing, but I'm actually serious, honestly. No, no, no.
Well, here's the thing: you want it to be seamless, you want it to not feel like a thing that people have to learn. And that's one of the areas where Mastodon, there was kind of a small exodus there after Elon Musk bought Twitter and things began to change. But ultimately, it wasn't particularly sticky.
It's grown slowly. There's still things happening there. But it didn't have the sort of algorithmic curation and dynamics that people were used to, that had become kind of the norm for them on social platforms. And
Threads' creation is interesting because it brings together some of that, the sort of feed dynamics of the Instagram powerhouse team, this is what you want to see, let us show you things, while also having that capacity to integrate into the Fediverse. So there's some interesting dynamics there. Can you give us a 30-second or 15-second explanation of what the Fediverse is? Yeah, so you can... It's a
kind of networked environment where the distinct instances called servers are connected because they all use the same protocol. In this particular case, that's called ActivityPub. So you can join a server that is run by either maybe one of your friends or somebody you feel ideologically aligned with, and the server administrator can kind of set the rules of moderation.
and things like that. So you can have a much more granular selection of, this is the kind of environment I want to be in. - But it also allows interaction with other platforms, right? - Exactly. You can also see the content from other servers. And so Threads engages with the broader Mastodon community. Right now, your posts are pushed to them. You can see them reply.
You can't reply back at the moment, so it's a little bit of a weird experience. But that dynamic of, you can actually go and join a place where you can get that preferred moderation environment you might want, and you can also kind of pick up your account and move it to another server if something goes wrong.
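To make the federation mechanics she just described a bit more concrete: Fediverse servers find each other using a couple of open standards, WebFinger to look up an account and ActivityPub to exchange posts. Below is a minimal sketch, in Python, of how one server would discover another user's ActivityPub "actor" document; the handle and server are hypothetical, and real servers layer authentication and error handling on top of this.

```python
# Minimal sketch: resolving a Fediverse handle to its ActivityPub actor document.
# The handle and domain below are hypothetical; any Mastodon-compatible server
# exposes the same two endpoints used here.
import requests

def resolve_actor(handle: str) -> dict:
    user, domain = handle.lstrip("@").split("@")
    # Step 1: WebFinger lookup on the account's home server.
    webfinger = requests.get(
        f"https://{domain}/.well-known/webfinger",
        params={"resource": f"acct:{user}@{domain}"},
        timeout=10,
    ).json()
    # Step 2: pick the ActivityPub actor URL out of the WebFinger links.
    actor_url = next(
        link["href"]
        for link in webfinger["links"]
        if link.get("rel") == "self"
        and link.get("type") == "application/activity+json"
    )
    # Step 3: fetch the actor document; its inbox/outbox URLs are what other
    # servers use to deliver and read posts, which is what lets Threads,
    # Mastodon, and others interoperate over one protocol.
    return requests.get(
        actor_url,
        headers={"Accept": "application/activity+json"},
        timeout=10,
    ).json()

if __name__ == "__main__":
    actor = resolve_actor("@alice@example.social")  # hypothetical account
    print(actor.get("inbox"), actor.get("outbox"))
```

The point of the sketch is the division of labor she describes: each server administrator runs and moderates their own instance, but because they all speak the same protocol, accounts on different servers can still see and reply to each other.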
But let me mention one thing, because the interesting dynamic on Blue Sky, which Threads is beginning to copy, is that it also creates an opportunity for something that we kind of refer to as middleware, where users or companies, even a news organization, maybe Vox, decide, hey, we're going to create a particular type of feed, and anybody can subscribe to our feed. So maybe Vox goes and creates a feed of Vox writers.
So maybe Vox makes a sports feed. Who knows? There's a whole bunch of different ways you can do it. And then users can subscribe to the feed and that's what they see. They can switch between feeds very, very easily when they open their app. So I subscribe to a gardening feed. I'm a pretty lousy gardener.
And when I want help, I can engage with people who are there specifically for that feed. So it creates a kind of smaller-niche, much more community-focused feel, where I have agency over what I'm seeing in that moment. I am deciding what my curation looks like, but I also don't have to be fully responsible for it the way I kind of am on Mastodon. In this particular case, I don't have to do the work of going and figuring out all the interesting people in gardening land, because someone else has already done that for me, and I'm just subscribing.
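The "middleware" idea she is describing is that the feed algorithm becomes a separate, swappable service the user subscribes to, rather than something the platform hard-codes. Here is a rough sketch of what that looks like against Bluesky-style custom feeds; the endpoint name follows the AT Protocol's public AppView conventions and the feed URIs are made up, so treat the specifics as assumptions rather than a tested client.

```python
# Rough sketch of subscribing to "middleware" feeds: curation lives in a feed
# generator identified by a URI, and switching curators is just switching URIs.
# The endpoint below follows Bluesky's public AppView (an assumption here), and
# both feed URIs are hypothetical, for illustration only.
import requests

APPVIEW_GETFEED = "https://public.api.bsky.app/xrpc/app.bsky.feed.getFeed"  # assumed endpoint

def read_feed(feed_uri: str, limit: int = 5) -> list[str]:
    # Ask the AppView for the posts this particular curator's feed generator
    # has selected; the platform hosts the posts, the curator only decides
    # the selection and ordering.
    response = requests.get(
        APPVIEW_GETFEED,
        params={"feed": feed_uri, "limit": limit},
        timeout=10,
    )
    response.raise_for_status()
    items = response.json().get("feed", [])
    return [item["post"]["record"].get("text", "") for item in items]

# Hypothetical curators: a gardening feed and a sports feed.
GARDENING_FEED = "at://did:plc:examplegardener/app.bsky.feed.generator/gardening"
SPORTS_FEED = "at://did:plc:examplesports/app.bsky.feed.generator/sports"

for uri in (GARDENING_FEED, SPORTS_FEED):
    print(uri, read_feed(uri)[:2])  # swapping curation is a one-line change
```

The design choice worth noticing is that the user's subscription, not a single platform-wide ranking algorithm, decides what shows up, which is exactly the agency-without-doing-the-curation-yourself tradeoff she just described.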
I know that there are some people who want that experience, because I know them personally, or they talk to me, or they ask me to turn on Fediverse sharing for my Threads posts, so I do that. But my hunch is that's a very small minority of people who want to be that involved in
any version of their technology, and definitely their social media. They just want their social media to sort of make them happy. Yeah, they just want it to work. They don't want to spend time curating the feed. But you think that this idea is important for the future? No, no, no. My point is actually that you don't want them to have
to do the work of curating the feed because the ecosystem of providers who can come and make those feeds is out there, right? That what you're essentially doing is saying, I want to subscribe to a different type of curation. Like you see people saying, I really hate Twitter now because I go and the For You feed is a disaster. It's just showing me posts from Elon. It's just showing me posts from whoever. I want sports Twitter back.
- Right, exactly. - Where is it? - And then you have to go and make your list, or maybe, you know, it becomes a much more, it's a feeling of it being an arduous process. So people are looking for that ease, and I think what Blue Sky is doing is trying to build that in from the ground up so it creates a positive experience. It shows people what's possible. We didn't know, 20 years ago, you know, those norms and that question of, how should we use social media, what is social media for?
That evolved over time as features became available and our friends were there and network effects happened. And what I think is really interesting, particularly about Blue Sky, is that you're starting to see that happen. You're actually starting to see Threads
copy Blue Sky features, right? And that's a pretty remarkable thing for a Meta property. Well, let's be honest, Meta does it all the time. They rip off ideas constantly. But they're doing it in response to that engagement that's happening over on Blue Sky. And so I think that the time is...
Last question. We're no longer asking whether
the left needs its own Joe Rogan, because we are tired of that discussion. That said, you have gone on The Joe Rogan Show as a guest. You went on in 2019; I think he was still in L.A. then. What did you learn from going on Joe Rogan? And what did that experience teach you that other people may not understand about Rogan? Well, I really enjoyed it. And
He actually reached out to me a couple of times, asking me to go on. So this was, again, around the time I had done that work for the Senate Intelligence Committee on Russia. And because it was so politicized, he wanted, well, he'd heard me on Sam Harris discussing that report, and he wanted someone to communicate those facts, right? That, holding all the collusion and, you know, whatever Mueller stuff aside, that, you know, the sort of
collection of theories about intersection between the Trump campaign and all that that I knew nothing about and didn't work on. But I did have this very clear evidence-based assessment of what had happened and he wanted his audience to hear it. And I appreciated the opportunity. We both knew that half his audience would hate me. And I think I was, I think I remember seeing
on a Reddit post that happened to mention my name, so it hit my radar, that I was like the number two most hated guest that year, or something along those lines. Yeah, you can get a good sense when you go to the YouTube page and see the comments. Yeah, yeah, yeah, exactly. And this, again, the word Russiagate means nothing. It's just a shibboleth at this point. Russiagate hoax, and I was like, what the hell even is Russiagate? Here's a demonstrable data set. It happened. It's real. I'm not making any big, broad, sweeping claims about it. I'm just stating facts.
That conversation was very nuanced, I think, and I appreciated it. I did also get a lot of shit because I got a cough on the plane there. He gave me a cough drop. This is so stupid. He gave me a cough drop.
I was not like an old hand at podcasting and I was eating a cough drop close to the microphone. And I cannot tell you how many emails I got from randos screaming at me for like sucking on a cough drop. Well, you are. One, don't do that. But also you're a woman on the Internet. Right. Exactly. Exactly. People were like, this was absurd.
I didn't... I was like, I really just had a cough, I'm so sorry, I didn't know. Lozenge aside, is Joe Rogan a good... You deal with complex stuff that involves a lot of nuance and a lot of, well, actually, you need to know this before you know that. Is that a good place for that discussion? Is Joe Rogan a good format for that? It's a very open-ended format, right? And, um,
And I mean, I prepped a lot. Right. And I think that it is important to go in with like a very strong sense of like, you know, you're gonna be talking for two to three hours. It's going to be very freewheeling. A lot of complicated things might come up, you know, unexpected questions because it's just a conversation. And I think that it's less of an interview and more of a this is an interesting person and we're going to have a chat. Mm hmm.
I think that, you know, I wrote about some of Rogan's evolution during COVID in the book in that there was a trend towards bringing on contrarians in part because of the sense that they had been censored, like that they'd been taken down or something along those lines or that they were...
really the sort of brave truth-tellers, and it kind of continued in that direction. And the one thing that I do think podcasters and influencers don't reckon with enough is how influential they actually are, right? How trusted they actually are, how they are in fact the media now, right? And that comes with maybe a little bit more, I don't know why I'm hedging here, it comes, candidly, with a lot more, right? A lot more responsibility, because
One of the things that's very frustrating is that
there's very little opportunity for correction in that format, unless you go and edit your show notes and say things like, this person was wrong about this, or that person was wrong about that. And then that starts to feel newsy and not conversational, and so it really doesn't happen. And that dynamic of, how do you reckon with the limitations of the format, I think, and the...
Would you advise other people who do the kind of work you do, or just do complicated work, or you, frankly, would you advise 2025 Renee DiResta to go back on Joe Rogan? I mean, I think that for me, particularly as my work became a weird conspiracy theory, I spent a lot of time going on podcasts where...
I want the challenge, because I feel like it's really important for those audiences to hear the response. I can write op-eds in the New York Times, and people who read the New York Times will read them, right? I can give quotes to the Washington Post. I can talk to mainstream media. And I do, and I can put my own stuff out. I at least have enough of a platform that
people can find me and get the facts directly from me. But I thought it was really important to go on some of the more right-wing-coded podcasts, again, with good-faith hosts. You don't want to set yourself up to be a punching bag, where you're just going to have to be sitting there constantly going, no, but this is the reality; no, but you got that wrong; no, but here are the facts. I think that there's ways to do it where you're going to have a conversation about a particular issue, like, how should content moderation work on the internet?
How should we think about jawboning? And so I went on FIRE's podcast to talk about that with them. And I really appreciated that opportunity,
because I also felt like they were on the wrong side of the issue in some of their amicus briefs and I wanted an opportunity to fight that fight. And when I did that podcast, I got some pushback from people on the left, like why would you give them a platform? Why would you go on that platform? And I think that it's just the wrong question. It assumes that people on the left get to legitimize or delegitimize media that speaks to other audiences and they just don't.
You have to be there. You have to be engaging in the space in some way. I understand that it can be stressful. I understand that it can be hard, that it invites harassment, like I said, you know, a hundred people emailing me about my cough drop. But I think that people who can, people who maybe enjoy the debate or who feel like
it's a challenge that they welcome, actually should be doing it a lot more. I think that we need to have those kinds of cross-pollinated conversations. Renee DiResta, thank you for coming on my podcast. If you don't have a podcast and you want to learn more about Renee, you can find her online. You can hang out with her in Georgetown. I don't know if you can actually hang out with her in Georgetown, but you can read her book. It's called Invisible Rulers. You should go buy it today. Thanks, Renee. Thank you. Thanks again to Renee DiResta. I did not properly push her book.
It's called Invisible Rulers. You should go buy it. Thanks again to Jelani Carter, our producer and editor. And thanks to our sponsors. And not least, you guys. Thank you. Thank you for listening. See you next week. Support for the show comes from AT&T.
What does it feel like to get the new iPhone 16 Pro with AT&T NextUp anytime? It's like when you first light up the grill and think of all the mouth-watering possibilities. Learn how to get the new iPhone 16 Pro with Apple Intelligence on AT&T and the latest iPhone every year with AT&T NextUp anytime.
See att.com slash iPhone for details.