Hey, everyone. It's Shaina Roth, a producer and fill-in host here on What Next TBD. Lizzie has a cold and lost her voice today, so you're going to hear a blend of the two of us, Lizzie doing an interview and me explaining parts of the story. Okay, here's the show. On January 20th, the day of President Trump's inauguration, a post spread across Facebook. It read,
ICE is allegedly offering $750 per illegal immigrant that you turn in through their tip form. Cash in, folks. This post was one of many that I saw across Facebook and across Instagram where people were saying that, yeah, ICE was now offering a cash bounty if you turned in undocumented people. That's reporter Craig Silverman. He covers technology for ProPublica.
And the most important thing to know about this is that it's not true. So ICE does have a tip line, does encourage people to send reports in, but they did actually post themselves on Instagram and told several fact checkers that we do not offer cash bonuses. But that still, you know, was a claim that spread fairly widely across some of these social networks.
Who was posting that stuff? So in some cases, it's just, you know, average users on Facebook, regular people. I saw it in some cases from pages. One of the pages that I saw it from came from a page called No Filter Seeking Truth, which often had a lot of stuff I saw on it that was, you know, fairly extreme, fairly partisan, and in many cases, not true.
The thing is, it could get worse. This spring, Meta is ending its work with fact-checking organizations. And at the same time, it's expanding its content monetization program, which pays creators for high-traffic posts.
So the more viral and popular content is, the more money you can potentially make from your posts on Facebook and Instagram. And you can just sort of imagine how these two things might combine to create perhaps some problematic incentives.
Today on the show, how Meta could end up paying people to spread misinformation. I'm Shaina Roth, in for Lizzie O'Leary. You're listening to What Next TBD, a show about technology, power, and how the future will be determined. Stick around.
If you think back 10 years ago, before social media companies got serious about content moderation, Facebook was kind of a mess.
I spent a lot of 2014 actually doing a research project looking at viral hoaxes, viral fakes, viral rumors, how they were spreading across social media, how they sometimes ended up in, you know, online news articles, but also the dynamics of what was happening on Meta. Craig was one of the first people to write about online content farms, which just made stuff up.
There were people, some in the U.S., some based overseas, who were making really good money just writing completely fabricated, totally made-up articles, things that at that time I described as fake news articles.
You know, this was before Trump had adopted the term, where it was written like a news article, looked like one on the website, but completely made up. So claiming that, you know, a small Texas town had been quarantined because Ebola was spreading. Claiming that a woman was arrested because she made a coat out of, you know, the fur of cats owned by her neighbors. Just outrageous stuff, partisan stuff, scary stuff.
And it was going gangbusters. It would get hundreds of thousands of shares and likes and comments on Facebook. And that would cause people to visit the website where the link had come from, earning potentially, you know, five figures a month for the people running these. Yeah, it was a really good business. So the platform's approach was that it just didn't deal with that stuff. That all changed after the 2016 election.
And, you know, there's some real concerns about how Meta's system was rewarding false stuff and accurate information about the election was not performing as well. And Meta decided that they wanted to take steps to provide a framework to at very least kind of flag for people like, hey, this post that you just made or that you're encountering now on Facebook, it turns out that some people actually, you know, some fact checkers have found this to be false.
And so Meta wasn't really removing false content, but it was labeling it. It was adding context to it. They were very careful about this. They said, "We're not the arbiter of truth."
It's a very famous line. It's a core, core line, along with Mark Zuckerberg saying that it was crazy that people would think, you know, fake news had an effect on the election, which, you know, in retrospect, wasn't a totally crazy thing for him to say. I mean, I don't think Trump won because there were a bunch of false headlines on Facebook going viral, but there was a real deterioration of the quality of information that people could get.
And Meta acknowledged that. They invested heavily in content moderation, not just keeping sexual abuse or hate speech off its platforms, but working to try to root out false information.
And so we've had almost a decade of them investing in partnerships with fact checkers, not just in the US but around the world, them labeling content, them down ranking, meaning stopping the spread or slowing the spread of content that had been fact checked so fewer people would see it, them building automated systems to detect
you know, copies of previously checked things, and just kind of growing this huge infrastructure. But then, you know, after this most recent election, it really seems like they're sort of rolling back the clock. Which is why, for me, it was so much nostalgia. I'm like, oh, they're basically going to take us back to 2015 or, you know, early 2016, when the viral hoax was a great and rewarding thing on Facebook. Which brings us to January.
Hey everyone, I want to talk about something important today because it's time to get back to our roots around free expression on Facebook and Instagram. CEO Mark Zuckerberg said Meta would be getting out of the fact-checking business, replacing fact-checkers with community notes. After Trump first got elected in 2016, the legacy media wrote non-stop about how misinformation was a threat to democracy.
We tried in good faith to address those concerns without becoming the arbiters of truth. But the fact-checkers have just been too politically biased and have destroyed more trust than they've created, especially in the US.
And, you know, that, I mean, fact checkers are really upset about that. I've spoken to a bunch. And one of the things that they're really upset about is that they say, you know, as fact checkers would, it's a claim without evidence, meaning that, you know, Mark Zuckerberg didn't cite any studies or any hard data to show that they are politically biased in what they are checking in the United States. And in fact, if you think about
who is perfectly positioned to actually do a study to prove that, it would be Meta, because they have almost 10 years of data, of fact checks, that they could have taken to some of their very talented engineers and data scientists and said, hey, let's see if there is political bias. But as far as we know, Meta has never done that. And so, you know, he came out and really, to some people, it just sounded like a real MAGA coming out for Zuckerberg, of saying, we are about free expression. And Meta denies that it's about catering to Trump. They say this is about getting back to our roots. How big of a universe of fact-checkers are we talking about?
So there are a few hundred fact-checking organizations around the world. And the interesting thing is the Duke Reporters Lab has been tracking the number of fact-checkers for a very long time. And what you do see is post-2016, a massive expansion in the number of fact-checking organizations around the world. And
One of, not the only, but one of the drivers for that is that Meta started paying fact checkers. It's never been a super lucrative thing. And even as Meta was partnering and paying, it wasn't super lucrative. But all of a sudden, you could partner with one of the biggest and most valuable companies in the world and you could have steady revenue.
And so Meta actually kind of created a scenario for the growth of fact-checking around the world. And it is, as of now, still working with fact-checkers outside of the U.S. The plan to sunset in the U.S. is the only one it's announced, but I think a lot of fact-checkers see the writing on the wall and believe that they've really benefited from the Meta revenue, and now they are about to see that go away. Let's talk about the Facebook content monetization program, because that's the other change here.
How will it work? So this program in some ways is already in place. Meta has had programs in place where if you're posting reels like short form video on its platforms, you can earn some money, for example, from the ads that were placed in that.
And then Meta rolled out a program back in 2021, basically called like a performance bonus program. It doesn't matter what ads are running in your content or what have you, you are going to get paid based on an undisclosed formula that combines views with engagement. So pretty much, hey, you know, if your content does well, we're going to pay you money. And this was a program that people had to sort of apply to or be selected to be included in.
And so it has been paying people out since 2021. And what happened last year is Facebook basically said all of our monetization programs for creators, we're combining them into the content monetization program. It's going to be the performance bonus. It's going to be any ads things.
And not only are we combining them, but our plan in 2025, this year, is to expand this program beyond invite only. So there's apparently going to be a lot more pages and accounts that are going to be able to earn money by figuring out what gets people to click, to share, to view. Meta has rules for this program. No violent graphic content, nothing sexually explicit. But at the same time, if...
They are getting rid of the fact checkers. How can you ensure that those rules are being followed? This is the interesting needle that they are trying to thread right now. And what's interesting is that, you know, the fact checking rollback obviously relates very much to false or misleading content. But that was combined with Zuckerberg saying, you know, we're sort of turning down or getting rid of some of our automated systems that were meant to detect fraud.
and either like slow down the spread of or in some cases, you know, flag and send off to human moderators. Stuff that might be not just sort of false, but anything that might be sort of a controversial health claim. Also things that might be sexually suggestive or, you know, starting to border into maybe being a little excessively violent. And so they have publicly said that they are turning down these systems.
And so, you know, if you're a content creator trying to figure out what might earn you the most money, knowing that the fact checkers are going away and knowing that these systems have already been turned down or turned off, you're absolutely going to try and push the envelope and figure out what you can get away with. And in fact, there was recently an issue where extremely violent content was being shown on Reels, to the extent that Meta actually had to publicly apologize for it. They blamed it on an error, but people were seeing extremely violent content suddenly showing up in their streams. Deaths. Yeah. And so it just sort of reminds you, this stuff is on the platform. People are posting this stuff, and it has been these systems keeping it out of your feed. So what is it going to look like in the coming months?
When we come back, Meta gets some inspiration from Elon Musk.
This all sounds not particularly dissimilar to the changes that Elon Musk instituted when he took over Twitter and rebranded it to X. Is that what Facebook and Instagram are going to look like?
There is definitely some inspiration being taken from what Musk did when he took over Twitter, turned it into X. So one thing that Mark Zuckerberg explicitly said is that he wants to basically adopt the Community Notes model that has become even more prominent on X under Musk. Now, so this is basically the kind of crowdsourced fact-checking idea. People who are on X can sign up to be part of the Community Notes program
And if they see a tweet that they think has false information or is lacking context, they can sort of flag that in this system and they could append a link to say a fact checking website or to Wikipedia and say, hey, this is misleading or this is incorrect because, and here, check the source.
And the way Community Notes works is that, you know, obviously they need to have lots of people participating so that they can have lots of submissions, and then people actually vote on submissions. But it's not simply the notes with the most votes of, yes, this is a good, accurate community note, that get shown on X. In fact, most of them never get shown.
What they're looking for is kind of an ideological consensus. So they have users from different points of view and they want users from different points of view to agree. And so you'll have lots of valid community notes that are in the system that never get shown because they haven't reached the threshold of, oh, hey, people on the right and the left or what have you have agreed that this is a valid one. And so, I mean, personally, I think crowdsource systems like that are really interesting and have great potential.
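The bridging requirement Craig describes means a note is published only when raters from different viewpoint clusters agree it is helpful, not when it simply collects the most votes. A minimal sketch of that idea, where the cluster labels, vote threshold, and two-cluster requirement are illustrative assumptions rather than X's actual scoring formula:

```python
# Illustrative sketch of a "bridging" consensus rule, as described in the
# episode: a note is shown only if raters from different viewpoint clusters
# agree it is helpful. The clusters, threshold, and scoring below are
# simplified assumptions, not X's actual algorithm.

def note_is_shown(ratings, threshold=0.7, min_raters_per_cluster=2):
    """ratings: list of (cluster, helpful) pairs, e.g. ("left", True)."""
    by_cluster = {}
    for cluster, helpful in ratings:
        by_cluster.setdefault(cluster, []).append(helpful)
    # A cluster "endorses" the note if enough of its raters found it helpful.
    endorsing = [
        votes for votes in by_cluster.values()
        if len(votes) >= min_raters_per_cluster
        and sum(votes) / len(votes) >= threshold
    ]
    # Require endorsement from at least two distinct clusters.
    return len(endorsing) >= 2

# A note ten raters in one cluster love, but the other cluster rejects,
# never qualifies, no matter how many total votes it gets:
partisan = [("left", True)] * 10 + [("right", False)] * 3
# A note both clusters rate helpful does qualify:
bridging = [("left", True), ("left", True), ("right", True), ("right", True)]
```

In this toy version, the partisan note sits unshown despite its vote count, which mirrors the dynamic Craig describes of valid notes lingering in the system without ever reaching the ideological-consensus threshold.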
But to do it only with that and to sort of remove the fact checkers before you have that in place and scaled up, you know, when I speak to experts and other people, those are the things that they're pretty concerned about. Meta's platforms also operate on just a vaster scale than X.
Yeah, that's a really valid point. When you think about the size of X, which has always kind of punched above its weight because it had the so-called elites on it. It had celebrities and it had, you know, performers and politicians and those people who were news drivers and folks that people wanted to see. But, you know, you're talking about a few hundred million users typically or less, you know, in a particular country, far less.
I mean, Meta is the biggest social media company in the world with like billions of people. A decent portion of the global online population is on one of Meta's products, be it Instagram or Facebook and that kind of thing. And so it is a much bigger challenge to be able to kind of reach that scale, even just talking about the US, where because Facebook has been around for so long and Instagram around for so long, they have
very high user numbers, and people use the heck out of those platforms. They post a lot. Why are these changes happening now? I mean, it is not lost on me that Mark Zuckerberg spent years going to and from Capitol Hill, sitting in front of this congressional committee or that one, getting yelled at by various members of Congress. And I feel like...
The political environment is just simply so different now. What's the calculus there? It is interesting because you're right that, you know, there has been for quite some time a tradition on Capitol Hill of a bipartisan yelling fest against Mark Zuckerberg.
Here's Senator Josh Hawley grilling Zuckerberg about child safety on the platform just a year ago. 37% of teenage girls between 13 and 15 were exposed to unwanted nudity in a week on Instagram. You knew about it. Who did you fire? Senator, this is why we're building all these rituals. Who did you fire?
Senator, I don't think that's... Who did you fire? So you didn't take any action. You didn't fire anybody. You haven't compensated a single victim. Let me ask you this. There are families of victims here today. Have you apologized to the victims?
They like bringing him in. They like putting him under the lights, and they like trying to make him squirm. But, you know, as you say, the power dynamics have changed, and they arguably started to change around 2022, when the Republicans got the House back, and Jim Jordan had a committee that was looking at the weaponization of speech and government, really looking at what he viewed as censorship.
And so in that scenario, since then, the Republicans have been very aggressive of saying, you know, we think that content moderation is just another way of saying censorship. We think disinformation research and fact checking are biased and also really about censorship. And so that is taking place at the same time that there have been antitrust inquiries and lawsuits filed by states and the federal government against companies like Meta. So there is a lot of pressure on Meta from different corners.
And I think if you're Meta, and if you take Mark Zuckerberg at his word and say that he personally became uncomfortable with where their systems and the fact-checking partnership were at, and he also looks at it and says, like, the Republicans control the House, the Senate, and the White House, and all of them are steadfast in saying this is censorship and you need to stop doing this. For him to make these changes is nothing but a political win. And there's this other thing where there's, like, a whole
universe of kind of right-leaning, right-adjacent posters or influencers who have the president's ear. And many of them right now are being rewarded with either access to the president or positions in the administration itself. Dan Bongino, now the deputy director of the FBI. I wonder if you think there are dots to connect between the online posts of someone like a Dan Bongino and the path to the White House, via Meta, via X, via his posts online? I think, you know, you can start with Trump on this. A lot of people feel like, you know, he posted his way into the White House in 2016, because these platforms often reward the most extreme content and information. That's a great way to stand out. Trump stood out,
primarily initially on Twitter, because he posted stuff that nobody could believe a presidential candidate was saying. And he brought a whole cadre of influencers and MAGA people and built up an online ecosystem and infrastructure around him. And more than the Democrats, far more than the Democrats, they saw that as part of their power base.
more important than mainstream media and those kinds of traditional paths. And so I think, you know, the fact that it has been so critical to Trump's success, it makes it not surprising that influencers and, you know, posters like Dan Bongino and others are finding themselves in the administration or being hugged very closely by them because they see that as part of their own media universe.
And they can't have that universe and reach the normies, the regular people, unless they are there and succeeding on the platforms where regular people are. And I think they also subscribe to the idea that, you know, being out there, being extreme, saying the things that other people won't say, they have realized that that is a core part of their strategy. And if you can get the platforms to back off on moderation, then you can push it as far as you want.
I think the thing you said about where the normies are is really worth digging into a little bit because you can go on X and see that X is very much a creature of Elon Musk's construction right now. And maybe putting a finger on the scale with the algorithm. I mean, there's a lot we don't know because it's now a private company. But if you think about the scale of Facebook...
And the scale of Instagram, and the number of people who are on there, and the regular people who are on there, what do these changes potentially augur for the ability of either those same right-leaning influencers to reach normies, or new ones to grow up in that ecosystem, if content that is more and more outrageous is going to be monetized?
I think on a simple level, it feels to me like it just puts so many people and puts their timelines up for grabs. So if you roll back this stuff and you're removing the guardrails, the things that would sort of, you know, prevent certain things from ending up in people's Facebook feed or what have you, that stuff is being stripped away. And so if you're somebody who is, you know, actively every day thinking about how do I bring people over to my side? How do I influence them? How do I get content in front of them?
Suddenly some of the restrictions and things you had to be careful about, oh, if I get fact-checked too many times, I might have an overall kind of down ranking on my account, which was something that could happen. Now it's like, oh, it's game on again. The normies are up for grabs. We can push the limits to as far as we can get them.
And if you're of the mentality of doing that, you're probably gonna beat the people who are just posting the same stuff they were posting before. So I do wonder if we see some of the folks, MAGA World folks sort of pushing and seeing what can I get away with now and being willing to sort of risk
whatever kind of human review or something Meta might have now in place as they remove these things, because you're not as likely to get caught as quickly. You can probably get away with more, and you're probably going to get rewarded with engagement for that, if not with an actual deposit in your account at the end of the month. Are there any brakes? Are there any fail-safes?
Meta has said that they continue to have systems in place detecting a wide variety of violations. They say that their community standards overall are in place, but they have changed some of their wording of policies about how, for example, you can talk about trans people. They've really opened that up much more to dehumanizing language and other things like that. And so there absolutely are still rules in place, but
the systems are set in a different way, the dials are set in a different way, and there are some things you can say now that you couldn't say before. So the game has changed. And I do think it means there are potentially tens of millions of normies kind of up for grabs on Meta, in terms of what is going to end up in their newsfeed. And for people trying to persuade others, it's always about the unpersuaded and the potentially to be persuaded. You can certainly play to your own audience and your own tribe,
but if you feel like you can actually solidify your base and grow your influence even more in a moment like this, you're going to really try and do that in a very strategic way. And I'm interested to see who really tries to jump into this breach and take advantage of it. You know, reading your work and talking to you, you have made me think about a journalism professor I once had who described the idea of getting to the best available version of the truth.
Talking to this person and that person, and this account and that account, and trying to arrive, through a real process of information gathering and elimination, at something that resembled a shared reality. And I keep wondering if there will be a shared reality on social media in a year, or two years, or five.
Well, I think you may find that there is, but it may be a shared reality that is perhaps not as rooted in facts.
as one might want it to be. I think in general, it is hard to sort of pin down a platform as big as Facebook, even in one country, and say, here's what's there and here's what it is about now. But there are always winners and losers on platforms. These systems are built with basically recipes that say, find the stuff that's gonna work for the most people and show it to them so they spend more time on it.
And at the end of the day, if some of these guardrails and rules go away, it's just about what captures attention the most. And Facebook is not saying, we care that the stuff that spreads the most is the most accurate. Those are not the values built into the platform. And so I think if you're someone who cares about, I would like accurate information to be rising to the top, I mean, I don't think that's the priority that Meta is communicating with these changes.
Craig Silverman, thank you for your reporting and for talking with me. Thanks for having me. Craig Silverman is a reporter for ProPublica, where he covers tech platforms, scams, fraud, and online manipulation. And that is it for our show today. What Next TBD is produced by Evan Campbell and Patrick Fort. Our show is edited by Rob Gunther. TBD is part of the larger What Next family. And if you like what you heard, the best way to support us is by joining Slate Plus. You should...
really check it out. You get all your Slate podcasts ad-free, including this one, plus some other nice bonuses too, like no hitting the paywall on the Slate site and extra bonus content from a lot of different shows. All right, we'll be back next week with more episodes. I'm Shaina Roth, in for Lizzie O'Leary. Thanks for listening.