The Facebook Election

2022/8/10

Land of the Giants

People
Alex Heath
Andrew Bosworth
Brad Parscale
Crystal Patterson
Julia Owono
Kate Klonick
Katie Harbath
Nick Clegg
Mark Zuckerberg
Topics
Alex Heath: This episode explores how Facebook's relationship with politics evolved, and how one politician repeatedly forced Zuckerberg to confront the company's role in the democratic process. From Facebook's early claims that its tools helped politicians win elections to the complicated political position it faces today, the platform's stance has shifted dramatically.
Katie Harbath: Facebook's initial goal was to get more influential figures, including politicians, using the platform. The success of Obama's 2012 campaign left Facebook optimistic about the 2016 election. Trump's rise, and his strategy on Facebook, changed that.
Crystal Patterson: Trump's anti-Muslim post in December 2015 became a turning point for Facebook. Internally, the company debated whether it should be removed. Zuckerberg ultimately decided not to, which sparked a larger debate about Facebook's role in the democratic process and laid the groundwork for its later hands-off policy.
Nick Clegg: Facebook faced public concern about its power and a bind over content moderation. Clegg was brought in to improve Facebook's public image and to meet these challenges by creating an independent Oversight Board. Even so, the board's power remains limited, and Facebook retains the final say.
Julia Owono: In handling the Trump case, the Oversight Board focused on Facebook's rules and international human rights law rather than on the election itself. The board found Facebook's indefinite ban of Trump unjustified and recommended setting a time limit.
Mark Zuckerberg: Zuckerberg has said that giving more people a voice empowers the powerless and pushes society to get better. He has also tried to position Facebook as something bigger than a company, a new pillar of society. Yet in the face of Trump's posts and Russian interference, his positions and decisions drew heavy criticism and exposed Facebook's shortcomings in content moderation and political influence.
Brad Parscale: Trump's campaign made full use of Facebook's advertising system, to striking effect.
Andrew Bosworth: Bosworth believes Trump ran the most effective Facebook campaign in history.
Kate Klonick: Klonick is concerned about the Oversight Board's independence, arguing that its powers are limited and that its decisions may be influenced by Facebook.

Chapters
Facebook's role in politics has become increasingly complex, with Mark Zuckerberg grappling with the platform's influence on democracy. Zuckerberg's speech at Georgetown University highlighted his belief in giving more people a voice, but the reality of Facebook's impact on elections and political discourse is more nuanced.

Transcript

When you're running a small business, sometimes there are no words. But if you're looking to protect your small business, then there are definitely words that can help. Like a good neighbor, State Farm is there. And just like that, a State Farm agent can be there to help you choose the coverage that fits your needs. Whether your small business is growing or brand new, your State Farm agent is there to help. On the phone or in person. Like a good neighbor, State Farm is there.

Silicon Valley Bank is still the SVB you know and trust. The SVB that delivers human-focused, specialized lending and financial solutions to their clients. The SVB that can help take you from startup to scale-up. The SVB that can help your runways lead to liftoff. The only difference? Silicon Valley Bank is now backed by the strength and stability of First Citizens Bank. Yes, SVB. Learn more at svb.com.

A student at Harvard builds a website for ranking classmates on campus by hotness. Then he builds a college directory. Facebook's origin story is well known. That's why it was curious when, in October of 2019, Mark Zuckerberg tried to recast it around a much loftier idea. Back when I was in college, our country had just gone to war in Iraq.

And I remember feeling that if more people had a voice to share their experiences, then maybe it could have gone differently. This is Zuckerberg framed by a giant American flag speaking to a packed room at Georgetown University. And those early years shaped my belief that giving more people a voice gives power to the powerless.

and it pushes society to get better over time. The timing of this speech was pretty revealing. Next on his D.C. itinerary, in just a few days, Zuckerberg would testify before Congress. It would be his second experience getting grilled on Capitol Hill since he appeared the year before to explain how Russia weaponized Facebook ahead of the U.S. election.

There was growing concern that Facebook, with its more than 2 billion users, had become too powerful. Concerns politicians had campaigned on. Yes, Mark Zuckerberg, I'm looking at you. And here was Zuckerberg giving a speech in the nation's capital, trying to position his company as something that was much bigger than a company. People having the power to express themselves at scale is a new kind of force in the world.

It is a fifth estate alongside the other power structures in our society. A new pillar of society. I understand the concerns that people have about how tech platforms have centralized power. But I actually believe that the much bigger story is how much these platforms have decentralized power by putting it directly into people's hands. With pressure mounting on Facebook to take a harder line on what speech it allows, here is Zuckerberg doing something else entirely.

distancing himself from that responsibility, and then branding it as good for the world. But as it turns out, reality is much messier than high-minded philosophy. Because one politician would again and again force Zuckerberg to confront his own power to curtail political speech and even sway elections. This is Land of the Giants. I'm Alex Heath. Today, the story of how Facebook comes to reckon with its own role in democracy.

Back in the early 2000s, Katie Harbath was working for the Republican Party on Senate election campaigns in Washington, D.C. Then Facebook called. The company actually approached me following the 2010 election cycle because the goal of the company was still to try to get more influencers, celebrities and politicians using the platform more. Believe it or not, at the time, Facebook's role in politics felt full of potential.

The Arab Spring kicked off in December 2010, becoming a symbol for how social media could fuel democratic revolutions. Harbath joined Facebook as one of its first political strategists on the policy team. They knew that they were going to have more Republicans running for president in 2012 and President Obama would be running for re-election. Obama's 2012 campaign ended up being a wild success story for the company. His team used Facebook as a tool to find potential voters and tailor messages to key demographics.

Facebook and the Obama administration were pretty cozy, basically right up to the 2016 election. Here is President Obama in June of 2016 hosting a panel at Stanford with none other than Mark Zuckerberg. There's a good-looking group. And I could not wear a t-shirt like Mark for at least another six months. But I will take off my jacket so that I don't look too formal.

Soon. Soon. It's going to happen soon. Given how well Facebook had worked for Obama's campaign, Harbath says the company was looking ahead to the 2016 election with optimism, hoping history would mark 2016 as the Facebook election. We really wanted Facebook to be seen as this place where major conversations of things that were happening in the world

were happening on the platform and that it was seen as a crucial tool that candidates needed to be using to engage with voters and something that if they weren't on, it would be much harder for them to win. And if 2016 was going to be the Facebook election, then Donald Trump would be the Facebook candidate. We need somebody

that literally will take this country and make it great again. While Trump was known for his incessant tweeting, Facebook was actually the platform he relied on the most for his campaign. Here's Brad Parscale, the architect of Trump's digital campaign, talking to 60 Minutes in 2018. Twitter is how he talked to the people. Facebook was going to be how he won.

The campaign's secret: taking full advantage of Facebook's advertising system. Parscale says the campaign would test thousands of variations of ads every day, with each ad hyper-targeting specific messages. Andrew Bosworth, who ran Facebook's ad platform at the time and is now the CTO of Meta, has said Trump ran the most effective Facebook campaign in history.

But of course, it wasn't just ads. Trump also used Facebook to talk directly to his voters. We really didn't start to see some of these really difficult conversations around candidate and politician speech until December of 2015. That was when then-candidate Trump posted something that would push Facebook into an uncomfortable and new kind of political position. Donald J. Trump is calling for a total and complete shutdown.

of Muslims entering the United States. I do think that was a turning point, just in that, one, we hadn't had somebody post something like that that was in that kind of position before that we would have had to take punitive action. That's Crystal Patterson, a former Democratic lobbyist for Facebook who joined in 2014.

We'd have members of Congress who would block people from their pages. And it was like, is that legal? Those are the kinds of questions we were grappling with. Not what do you do when the leading candidate for president posts an attack on an entire, the biggest religion in the world? Like, you know, what do you do with that? Exactly. Exactly.

This Trump post was a big deal inside Facebook. Many employees had never grappled with this possibility, that their platform could be used to stoke division like this by a presidential candidate. And Trump's video targeting a religious group seemed to directly violate Facebook's rules. Internally, Facebook employees pushed Zuckerberg to take the post down. This was something that would be removed if it came from a regular user. Shouldn't the rules be the same for an especially powerful person?

But Zuckerberg refused. Instead, he wrote on his public Facebook page, "If you're a Muslim in this community, as the leader of Facebook, I want you to know that you are always welcome here." He was no fan of Trump, but pulling down speech by a major political candidate was not something to be taken lightly. The episode brought up a bigger question about what Facebook's role was in the democratic process.

Here's Katie Harbath. There were a lot of questions about what role should a tech company have overall in moderating any of this sort of political debate. And so any decision around taking anything down would have been incredibly precedent setting. The decision not to touch Trump's post, that was also precedent setting. Crystal Patterson. I'll just speak for myself. Like, it was the first time I felt like we were...

being, what's the word, pliable with how we were going to apply the community standards. I used to be able to tell you chapter and verse what would be okay and what wouldn't on the platform. And now all of a sudden, this felt like the target was moving a little bit. And the rub was that we were doing it for someone who had real impact. This Trump post would have a real impact on Facebook's policies too. The company started carving out an exception to its rules called newsworthiness.

The gist: If a piece of content broke its rules but was deemed historically significant or of public interest, Facebook would leave it up. So funny, I sound probably like more of an apologist than I am, but I think there's a real effort to try to not be a factor in these debates. Like, they'd rather keep hands off as much as possible. But that hands-off approach, it wouldn't be easy for Zuckerberg to maintain.

Because of course, what happened next was Donald Trump became more than just a presidential candidate. We sat here 12 hours ago and we looked at an electoral map that seemed impossible. CNN can report that Hillary Clinton has called Donald Trump to say that she will not be president. The shockwaves are already being felt around the globe.

On November 10th, 2016, Zuckerberg sat down for an interview at the Techonomy conference in California. It was on this stage that Zuckerberg would say something that would haunt him for years. Personally, I think the idea that fake news on Facebook, of which it's a very small amount of the content, influenced the election in any way, I think is a pretty crazy idea.

Even before the full scope of Russia's meddling on Facebook was known, questions were being raised about the company's role in the victory of Donald Trump. One part of this that I think is important is we really believe in people, right? And that they can, like, you don't generally go wrong when you trust that people understand what they care about and what's important to them. And you build systems that reflect that. Things had in fact gone very wrong.

How much revenue did Facebook earn from the user engagement that resulted from foreign propaganda? So wrong that two years later, Zuckerberg would be hauled before Congress for the first time. One of my greatest regrets in running the company is that we were slow in identifying the Russian information operations in 2016. Russian-backed operatives had used Facebook to the fullest extent to stoke division in the United States leading up to the 2016 election.

They made Facebook pages with names like "Army of Jesus" and paid for ads comparing Hillary Clinton to Satan. They even set up real-life dueling protests using Facebook events. Facebook admitted it sold at least $100,000 worth of Russian ads. And the spread of regular content was much bigger. In total, Russian-backed Facebook posts reached 126 million people ahead of the election. Any goodwill left for Facebook in D.C. was gone.

And Democrats, who had cozied up to Facebook in the Obama years, were especially furious. Safe to say, the Facebook election hadn't gone the way the company thought it would.

I just felt culturally the organization was in quite a shell-shocked state, quite a defensive sort of crouch. This is Nick Clegg. He joined in late 2018 to lead Facebook's policy and communications teams. He's now the company's president of global affairs, reporting directly to Zuckerberg. Clegg isn't your typical Silicon Valley executive. He was the deputy prime minister of the UK from 2010 to 2015. Then he lost his seat in parliament entirely in 2017.

And Facebook came calling. My first conversation with Sheryl Sandberg, I remember I was sort of halfway up a mountain in the Alps on a hiking holiday and, um,

I mean, it was so sort of left field from my point of view that I just, I said, no, there's no way I'm going to work at Facebook and move to California. But Sheryl is persuasive and persistent. Sheryl Sandberg was once the second most powerful person at Facebook. In 2008, she became the chief operating officer. And earlier in 2022, she announced she was leaving. While it wasn't clear to the outside world at the time, her hiring of Clegg was a key step in setting up her departure.

Before Clegg, Sandberg had been the top political figure at the company. But with Facebook's reputation in shambles, it was time for someone else to step in and reset things. I came to this company partly because I'd said to Mark, you know, in all our endless conversations before I joined, your fundamental problem is the people's perception of your power. The perception that Zuckerberg has too much of it, that is. Not an unreasonable take.

Let's talk about why Mark Zuckerberg is uniquely powerful for a second. It's because he essentially can't be fired. Zuckerberg owns a special class of company stock that effectively gives him majority voting power over all company decisions, including acquisitions or who sits on the board of directors. It's a strategy he borrowed from the co-founders and former leaders of Google, Larry Page and Sergey Brin. Zuckerberg has shown no desire to give away his voting control or his CEO title.

But around the time that Clegg joined, he was looking for a way to offload something else. I understand that people are concerned about how much control we have over how people communicate on our services. And frankly, I don't think that we should be making so many important decisions about speech on our own either. So that's why we're establishing an independent oversight board.

for people to appeal our content decisions. Enter the Oversight Board. That's how Zuckerberg described it at his Georgetown speech. The idea was, the Oversight Board would function like a Supreme Court for Facebook. It would be staffed by people who weren't Facebook employees and be able to make binding judgments on certain kinds of content decisions. If Zuckerberg was Facebook's executive branch, this would be his version of checks and balances.

Facebook would fund the board, but would do it through a trust so that once it was set up, the company couldn't meddle. Building this institution is important to me personally because I'm not always going to be here. And I want to ensure that these values of voice and free expression are enshrined deeply into how this company is governed. For someone who gave himself outsized control of the governance of his company, this was a bold statement. And one met with skepticism.

I cannot tell you how many people told me that this was just this incredibly stupid PR stunt exercise and that it was going to be a waste of time.

Kate Klonick is a professor of law at St. John's University. She spent over a year embedded in the making of the Oversight Board and wrote about it for The New Yorker. It's no good if it's never going to have independence or any buy-in if it just comes kind of, as someone told me, like fully sprung from the head of Mark Zuckerberg, like Athena from Zeus.

According to Klonick, there were early questions about just how powerful Facebook would let the oversight board be. There was some fear that they could change or kill News Feed. Yeah, that was something that's particularly, I think, that product managers were concerned about. As we explained in our first episode, the News Feed has been at the center of Facebook's controversies since the very beginning.

But, at least to start, that wouldn't be the Oversight Board's focus. After over a year of talking to human rights and policy experts around the world, the Oversight Board's initial remit was narrowed to this.

It would offer the final word on cases where Facebook users appealed the company's decision to remove their content. Facebook itself could also refer content decisions to the board for advice, though the board's recommendations wouldn't be binding. Not exactly a supreme court, but still, more power than any other tech company had handed over to an outside body.

Which also gave insiders pause. There is something about not opening the door that means that you can control the message and that you can control the image of your product. And the second you start a dialogue, you open yourself up to more liability. And you open yourself to disappointing people and people having expectations of you and you not meeting them. So why open yourself up to this kind of scrutiny? Nick Clegg has some thoughts. Keep in mind, this is his boss he's talking about.

He is immensely reluctant to deploy the very considerable power he has to curtail, marshal, limit, curate human speech. I think he's almost intuitively, and it's very smart, he understands that the very act of doing what he's often pressed to do will only exacerbate people's concerns about unaccountable power.

Zuckerberg was no longer just a tech CEO. Being the leader of his so-called Fifth Estate, it made him a political lightning rod.

And all of this, according to Clegg, was distracting. I think we'd found through the Trump years that it had a somewhat paralyzing effect on the whole of the kind of senior leadership of the company every time these blowups happened. And look, you know, Mark has to run a company. His great passion and expertise is on the sort of product side of things. The oversight board would help make some of the most controversial content decisions at Facebook. But the board would take another two years to set up.

two more years of Trump in the highest office. After the break, Zuckerberg is drawn back into the responsibility he wanted out of.

Support for Land of the Giants comes from Quince. The summer is not quite over yet, but shifting your wardrobe to the colder months could start now, little by little. You can update your closet without breaking the bank with Quince. They offer a variety of timeless, high-quality items. Quince has cashmere sweaters from $50, pants for every occasion, and washable silk tops.

And it's not just clothes. They have premium luggage options and high-quality bedding, too. Quince's luxury essentials are all priced 50% to 80% less than similar brands. I've checked out Quince for myself, picking up a hand-woven Italian leather clutch for my mom. As soon as she saw it, she commented on how soft and pretty the leather was and told me it was the perfect size for everything she needs to carry out on a quick shopping trip.

Make switching seasons a breeze with Quince's high-quality closet essentials. Go to quince.com slash giants for free shipping on your order and 365-day returns. That's q-u-i-n-c-e dot com slash giants to get free shipping and 365-day returns. quince.com slash giants

On September 28th, the Global Citizen Festival will gather thousands of people who took action to end extreme poverty. Watch Post Malone, Doja Cat, Lisa, Jelly Roll, and Raul Alejandro as they take the stage with world leaders and activists to defeat poverty, defend the planet, and demand equity. Download the Global Citizen app to watch live. Learn more at globalcitizen.org.

In May 2020, George Floyd was murdered in Minneapolis by police. Protesters gathered around the country to express their grief and outrage. President Trump, of course, weighed in online. Speaking about protesters in Minneapolis, Trump posted a statement on Facebook and Twitter that included a phrase with a racist, violent history. Quote, When the looting starts, the shooting starts. Another moment that was just...

The post itself was shocking and also inside the company was just a turning point, I think, in terms of a lot of simmering frustrations coming to the surface on a lot of fronts. Crystal Patterson again, then still working on Facebook's policy team. And also at that point, we hadn't hit January 6th yet, but we knew...

It felt to many inside Facebook that Trump's statement put the lives of Americans, particularly Black Americans, in danger.

Once again, Facebook had to decide what to do about Trump. But this time, he was the president. Nick Clegg was in the room for the debate. Here's how he described the arguments for and against taking the post down. The argument for was that it was a post that could be interpreted as sort of crossing a line in terms of inciting violence.

The argument against was heads of state have a right to say they are thinking of deploying the force of the state. And that would equally be a very big thing, a very big thing for a private company to say no. Again, Zuckerberg decided to leave the post up. I think Mark, in the end, felt as the ultimate decision maker that the argument that, you know, you're crossing a Rubicon if you're going to start

saying that heads of government can't threaten to deploy a force to restore order, that that sets a really quite a worrisome precedent. The blowback was intense.

Hundreds of Facebook employees staged a virtual walkout to protest the decision, posting messages about it on Twitter. To this day, it's the most public display of dissent from within the company. Later that week, Zuckerberg doubled down on his decision in a tense town hall meeting with employees. They were livid and pressed Zuckerberg.

He said he didn't read Trump's comments as a dog whistle for vigilante violence, despite the history of the phrase. It had been used by segregationist cops while violently cracking down on civil rights protesters in the 1960s. Crystal Patterson found the whole episode disillusioning. She left Facebook the next year, in 2021. When you're hands off when someone's hurting another person, you know, you're part of the problem.

You'll never take back our country with weakness. You have to show strength and you have to be strong. We have a breach of the Capitol. Breach of the Capitol. They broke the glass in the United States Capitol and now they are climbing through the window. This happened a moment ago. I know you heard we had an election that was stolen from us.

The next time Trump forced Facebook into a corner, Zuckerberg had become less involved in content policy. This time, what to do about Trump and the January 6th insurrection would fall to Nick Clegg.

So by that stage, Mark had anyway sort of decided that he was just keen to let me kind of take those decisions more fully on his behalf. Oddly enough, it wasn't a hugely difficult decision. It was pretty clear that what had been shared on Facebook and the run-up to the insurrection

was clearly aimed at seeking to interrupt the peaceful transfer of power, was clearly contributing to the violent kind of mood of the time. And so it wasn't actually a very complicated question about that this was just simply not something that we want on our platform.

Trump crossed a line the company believed he had not crossed before, past the point of newsworthy and into dangerous. And in that sense was a classic example of exactly the limits to the kind of ethos that Mark set out in the Georgetown speech. Of course there are limits to speech. In retrospect, Clegg makes the decision to ban Trump sound simple and definitive. But it was actually a series of escalating decisions.

First, during the insurrection on Wednesday, Facebook removed the video Trump posted in which he told rioters to go home but also said, "We had an election that was stolen from us." Less than an hour after that, Facebook took down another post where Trump wrote, "Remember this day forever." Then, based on those two violations, Facebook went a step further by putting a 24-hour block on Trump's ability to post.

It was the next day, on January 7th, when Zuckerberg announced that the company was banning Trump indefinitely. But Clegg wasn't the only one who'd be on the hook for this decision. Of course, we wanted to see whether our approach made sense. So we referred it to the oversight board. Which had just started taking cases in October, a few months before. Yes, there was a fear that...

We would be, I would be used as a PR object. That's Julia Owono, a member of the Oversight Board. We left off with the board when it was just an idea. Now it was real. Owono is the director of a nonprofit called Internet Without Borders. Her colleagues on the board include the former prime minister of Denmark and a Nobel Peace Prize laureate. And they were all wading into the most controversial, high-profile content decision in Facebook's history.

It's former President Trump, you know? There was an election and a significant part of the American electorate voted for him. So who are we to come into this conversation? But what I appreciate is that

The conversation was not about the election, was not about our role. We focused on the rules. What are the rules on Facebook and also beyond Facebook and international human rights law that could justify or not what has happened? The board ultimately agreed that Trump had violated Facebook's rules, but it took issue with Facebook's ban being indefinite. It just didn't make sense. Like a judge sentencing someone to jail for an undetermined amount of time.

Oversight Board member Julia Owono. We told them that the indefinite aspect of the sanction, we didn't find that anywhere in the text, in the community standards, in the terms of service of the company. And so we said, this is totally arbitrary. You cannot come up with indefinitely. So give us a deadline. The board kicked the decision back to Facebook, pushing it to clarify whether Trump would be permanently banned or not.

I mean, they quite sensibly sort of slightly ducked the issue, but they equally made valid criticisms that the way in which we had announced and explained our decision-making wasn't entirely sort of precise enough and in conformity enough with published rules. We arrived at this decision to suspend him for two years and he will come back onto the platform in January. January 2023. Depending on one thing,

They will do an assessment, an impact assessment on human rights and potential violation of the community standards that this return could represent. According to Facebook, this report will measure whether the risk to public safety has receded. If Facebook determines that Trump's rhetoric continues to be a threat, it could re-extend its ban.

Two things are true here. Yes, the Oversight Board pushed Facebook to more clearly define how it handled Trump. But this Trump case also showed just how far the Oversight Board has to go before it's anything like a Supreme Court. Because something else came up when the board got involved. When we were working on the Trump case,

We did ask the company whether or not, you know, that there was exemptions for this particular profile. The board's question to Facebook was, did the company have some internal rule about how it treated high profile people differently than everyone else? A list, maybe, of users evaluated with extra care? Facebook said yes. They just told us, you know, it's just a few, just a few people. It's not just a few people, though. No, it's not.

The program was called CrossCheck. In a published report, the board said that Facebook first told them this program had only applied to a "small number of decisions." But then, a former Facebook product manager named Frances Haugen leaked documents revealing that in 2020, the program actually covered 5.8 million accounts, most of which were celebrities and politicians. If you were on the list, you weren't subject to the same rules as everyone else.

After Haugen revealed the full scope of cross-check, the board said Facebook acknowledged that the phrase small number was misleading. This all hits home a truth about the oversight board. For now, Facebook is still ultimately the one with the power. Here's Nick Clegg again. So I...

Look, it is not a Supreme Court. In a sense, it's unfortunate that people keep using that analogy. It is the Facebook Oversight Board. It is what it says on the tin. I think what it has done within the parameters that were set at the outset has way exceeded my expectations. The thing is, even though the board is limited in what it can do now, both Facebook and the board want its power to expand over time. The company just gave the board another $150 million.

I asked Julia Owono what specific power the board might push for next. So we haven't in the past shied away from talking about things that we're not supposed to talk about initially. That includes algorithms. But wanting that power and getting it from Facebook, those are different things. In the meantime, this topic of algorithms, how Facebook and Instagram decide what we see in our feeds, that's what we want to turn to next.

Because right now, Zuckerberg is making a fundamental pivot to a new kind of social media. A feed that places even more power in his company to determine what we see. In our fifth episode, Facebook defined an era of social media built on our connections, our social lives. We're watching that era come to a close. So what is next?

News clips of Elizabeth Warren from CBS 8 San Diego. Trump election clips from Morning Joe, CNN, and CBS This Morning. George Floyd protest clip from the Associated Press. January 6th news clips from CNN and CBS News. Congressional clips from C-SPAN.

Land of the Giants, the Facebook meta disruption is a production of Recode, The Verge and the Vox Media Podcast Network. Megan Cunane is our senior producer. Oluwakemi Oladesui is our producer. Production support for this episode from Cynthia Betubizo. Jolie Myers is our editor. Richard Seema is our fact checker. Brandon McFarland composed the show's theme and engineered this episode. Samantha Altman is Recode's editor-in-chief. Jake Kastrenakes is deputy editor of The Verge.

Art Chung is our showrunner. Nishat Kurwa is our executive producer. I'm Alex Heath. If you like this episode, please share it and follow the show by clicking the plus sign in your podcast app.