
Zuck your feelings

2025/1/16

Today, Explained

People
  • Andy Roddick
  • Ben Wofford
  • John Herrman
  • Sean Illing
  • Sean Rameswaram
Topics
John Herrman: I've observed a striking transformation in Zuckerberg's image, from plain dress to a far more conspicuously masculine presentation. It echoes Meta's recent policy shifts, including ending the fact-checking program and loosening content moderation standards. I don't think these changes are accidental; they reflect a shift in Zuckerberg's political stance and a redefinition of "free expression." Ending fact-checking and loosening moderation will let more misinformation and hate speech flood the platforms, which runs counter to their earlier goals. Meta's new community notes feature is not an effective replacement for fact-checking, and the overall information environment will grow more chaotic, much like the transformation of Twitter under Elon Musk. These changes could return the platform to its pre-2016 state: a messier, less ordered information environment, with negative effects on user experience and platform functionality. Although Meta frames the move as promoting free expression, its commercial nature is unchanged, and the platforms remain subject to all kinds of restrictions. In short, these policy adjustments will profoundly shape Meta's future, and its information environment will become more complex and unstable. Ben Wofford: I think Joel Kaplan is the key to understanding Meta's recent policy shifts. Kaplan is the behind-the-scenes operator at Meta, and his political stance and network have deeply shaped the company's policy direction. His career spans several worlds, from progressive Harvard student to Republican to Facebook policymaker, reflecting his own political transformation. He has managed to reconcile MAGA conservatism with Facebook's corporate values, which has allowed Meta to survive and grow in a complicated political environment. Kaplan played a central role in managing conflicts between Facebook and conservative politics; by convening conservative media figures, he defused the "Gizmodo affair" of 2016. Afterward, he became the central figure for Facebook's content and speech policies, and he drove the shelving of the "Common Ground" project, reflecting his sensitivity to political risk. In the coming years, Kaplan will continue to play a key role at Meta, coordinating its relationship with the Trump administration and avoiding conflict wherever possible. Sean Rameswaram: Meta's policy shifts, especially ending the fact-checking program and loosening content moderation, have raised fears that the platform's information environment will deteriorate. The parallel to Twitter's transformation under Elon Musk is striking: more misinformation and hate speech, and a worse user experience. Joel Kaplan's role as Meta's policymaker deserves attention; his political stance and connections have a profound influence on where Meta's policies go.

Deep Dive

Chapters
This chapter explores Mark Zuckerberg's recent transformation, encompassing his new image, policy changes at Meta, and his apparent shift towards a more conservative stance. It analyzes his video addressing these changes and the implications for Meta's future.
  • Zuckerberg's shift in image and public persona.
  • Meta's changes to content moderation and fact-checking policies.
  • Zuckerberg's video explaining the shift towards 'free expression'.

Transcript


Mark Zuckerberg's in his cool era. He's letting his hair grow out. He's wearing black t-shirts with a gold chain. He covered Get Low with T-Pain. That's him singing.

Mark Zuckerberg is also in his MAGA era. He's throwing a party at Trump's inauguration next week. He went on Joe Rogan to say companies need more masculine energy. He's ending Meta's DEI initiatives. He's taking tampons out of the men's bathrooms at his offices. He's getting rid of the non-binary and transgender themes on Meta's Messenger app. But perhaps most important of all, he's changing Meta's

content moderation and fact-checking policies. We are going to poke around the new Zuck Your Feelings metaverse on Today, Explained.

This week on The Gray Area, how are digital devices changing us? We've become more machine-like, and I think the exhibit A for that is how young people, for example, talk about their sex lives in machine-like terms, performative terms, in ways that actually have shaped their understanding of what an intimate sexual relationship even should be, what it should look like, what it should feel like.

Listen to The Gray Area with me, Sean Illing. New episodes every Monday, available everywhere. Hey, it's Andy Roddick, and I'm not just a former tennis player. I am a tennis fan, a tennis nerd. I just can't stop watching it. I can't stop analyzing it. I can't stop talking about it to anyone that will listen, which is why I started my podcast Served with Andy Roddick, now a part of the Vox Media Podcast Network. On the show, we talk about everything from new up-and-coming players to the champions dominating the narrative, to whatever's on my mind. This January is the Australian Open, and you know I've got some thoughts. So tune in for our Australian Open coverage. Find Served wherever you get your podcasts, or on our YouTube channel.

Content moderation and fact-checking on Facebook and Instagram is kind of like oxygen. You can't see it, but it's out there, and it's essential to your user experience. It's getting rid of all the illegal material, the hateful material, and the spam that's

John Herrman has been writing about the changes Meta's making to content moderation for New York Magazine. And if you're like, content moderation is boring, a reminder that without it, we have seen real-world political violence. Exactly. And the fact-checking piece was intended to sort of close a little bit of a loophole that

existed with news content where if false or inflammatory stories about, say, an ethnic minority in a country going through political strife were going viral again and again and again, they could feed into real political violence and have.

The persecution of the Muslim minority continued for years. The picture changed drastically once Facebook entered the fray in 2012. Anti-Muslim and anti-Rohingya memes and propaganda have spread through Facebook, eroding support for the Rohingya's plight. And, you know, in 2016, there was a lot of domestic pressure on Facebook to address the

similar issues. During the last three months of the presidential campaign, fake or false news headlines actually generated more engagement on Facebook than true ones. People actually believed a conspiracy theory that Hillary Clinton and her former campaign manager John Podesta ran a child sex ring at a pizzeria in DC.

This is a lie. To borrow Facebook's language, it was creating a less authentic environment, which is an incredible euphemism for a place that, you know, was just full of garbage. And so for a while, the critics of Meta and Facebook, and Facebook and Meta, were sort of aligned. That is no longer true. That is very pointedly not true. Now Zuckerberg is killing this program forever.

Zuckerberg posted a video explaining his reasoning. What did you make of the video? He looked so good with his hair and his t-shirt. Hey, everyone.

I want to talk about something important today because it's time to get back to our roots around free expression. You know, you see this video and truly, if you haven't been watching this closely, it is crazy. It's like, okay, Mark Zuckerberg, you know, hoodie guy, plain shirt guy, Caesar haircut guy. He's got curly hair. He's got a gold chain. He's big now. He's got a little bit of a tan. Like, okay, he's, something's going on here.

But a lot has happened over the last several years. And, you know, it's funny, but it's also a signal. It's sort of slightly right-wing coded. It's more of an obvious performance of masculinity, circa 2025. Before these platform changes, I think there was a tendency to treat this as just like a personal rebrand, maybe like an early midlife crisis type thing. But now, in hindsight, you know, we can sort of understand this as perhaps part of a, you know, more personal and authentic political transformation, or at least a sense of personal freedom, catharsis of, you know, getting ready to sort of tell everyone to just, you

deal with it because we're going to do what we want now. So we're going to get back to our roots and focus on reducing mistakes, simplifying our policies, and restoring free expression on our platforms. And then he starts using terminology that, again, is slightly right-wing coded. He's sort of complaining about the quote-unquote legacy media. After Trump first got elected in 2016, the legacy media wrote nonstop about how misinformation was a threat to democracy.

He's talking about how, you know, he was being sort of pushed around and bullied by the Biden administration. And that's why it's been so difficult over the past four years when even the U.S. government has pushed for censorship. In this announcement and then elsewhere on posts on threads and on the Joe Rogan experience, talking about how, you know, maybe if people are going to leave over these changes, they're just virtue signaling. Society has become very, like,

I don't know. I don't even know the right word for it, but it's like, it kind of like,

neutered or like emasculated. And so, you know, probably the most striking thing about this video is how, on one hand, it's really familiar. This is Mark Zuckerberg after an election sort of laying things out and saying, you know, we are listening. We are working on this. We're trying to fix things. His audience is just different now. It's a different group of people. It's not a critical press organization.

or potential regulators that he thought were important in 2016. He is now sort of, you know, looking in the imminent future and saying, all right, like, how can we work with you? I'm looking forward to this next chapter. Stay good out there, and more to come soon. And tell us exactly what the new policy is. Is it just we're going to let you guys hash it out in the comments? Kind of.

So the two lanes for this are, one is that the fact-checking program is being discontinued. This got sort of like top billing from Zuckerberg, but the bigger changes are to Facebook's basic and much broader moderation systems. So there are a few new carve-outs. You are allowed to use more dehumanizing speech about transgender people, immigrants,

and you are allowed to more broadly use harsher language in your interactions on the platform. We do allow content arguing for gender-based limitations of military, law enforcement, and teaching jobs. We also allow the same content based on sexual orientation. Mark Zuckerberg says that they will be rolling out a community notes-style program. If you've been on X

through Elon Musk's sort of takeover and remaking of the site, you'll know that they have a system that sort of allows users to weigh in on posts and say, you know, this is true, this is misrepresentative, this is not true. The posts then carry this tag, things like that. It's an interesting and frankly kind of useful feature on X, but it is not

nearly up to the task of, you know, broad platform moderation. It tends to be slow. And I think the circumstances on Facebook, for example, are much less conducive to a good community notes program. We'll see what they build. But it is, I think, a partial replacement at best for the fact-checking program that existed before, which was already not doing a whole lot. You brought up the transition Twitter made when Elon took over. My experience as a user of that regrettable platform is my feed started getting more confusing, quite frankly. There was more spam coming into my DMs. There were more verified users who were just, you know, random people who wanted to amplify their voices. It got harder to tell

misinformation or even disinformation from reliable information. I started seeing porn in my feed more often. Like the whole thing just got messier. Is that what people should expect from their experience on Facebook or Instagram right now? In some ways, I think yes. And what's funny is I've had the same or similar feelings about the transformation of X. And one thing that kept coming to mind is that this platform, which was certainly always flawed and full of all kinds of stuff that you didn't necessarily want to see or whatever, it's always been a complicated product.

It felt kind of familiar. It kind of felt like pre-2016 Facebook, where you're just scrolling around, things are kind of out of order, like literally not chronological. You don't know where things are coming from, why you're seeing them. It's just sort of like an unstable, but in some ways very engaging environment. So in rolling those back, there's a return potentially to this version of Facebook that the company left behind nearly 10 years ago.

And yeah, the most useful current comparison is certainly X, which in some ways is probably doing very well in the eyes of its owner, but is used by far fewer people, is now a sort of fairly hostile political environment for a lot of its previous users.

It is far less useful in, for example, a disaster like the fires in LA County or recent hurricanes. It is just full of untrustworthy information from untrustworthy people.

who are often there with malign ends, to misinform, to make money, to spam. It's a different kind of place. Euphemistically, it's rougher around the edges. It's rowdier. Functionally, it just doesn't work as well. And if you take seriously Elon Musk's commitment to free speech, if you take seriously Mark Zuckerberg's

sudden commitment to free expression. Maybe you can conceptualize this as just a trade-off. But the reality of these platforms is much more complicated than that. They are not built with enabling free speech in mind. These are commercial advertising and subscription platforms with tons of restrictions on what you can do and what you can say. And that fundamental fact hasn't changed. The sort of flavor of censorship is what's changing.

John Herrman is a tech columnist at Intelligencer from New York Magazine. You can read and subscribe at nymag.com. There's one guy over at Meta who's in charge of getting the flavor of censorship just right. And his name's not Mark, it's Joel. We need to talk about Joel next on Today Explained. Support for Today Explained comes from Noom. Many a weight loss plan takes a one-size-fits-all approach.


without restricting what you eat. They've even published more than 30 peer-reviewed scientific articles describing their methods and effectiveness. Our colleague Phoebe Rios here at Vox got to try out Noom and let us know how it went.

I feel like the plan Noom created was catered to my individual needs. It was very thorough. I felt like the questions they asked, I hadn't even asked myself, like what time I get out of bed in the morning and if I eat with my phone in my hand. It was very helpful and very, very educating of how I spend my day. You can stay focused on what's important to you with Noom's psychology and biology-based approach. You can sign up for your trial today at Noom.com.

Support for the show today comes from Vanta. Trust isn't just earned, Vanta says. It's demanded. Have you demanded someone's trust lately? Whether you're a startup founder navigating your first audit or a seasoned security professional scaling your governance, risk, and compliance program, proving your commitment to security is critical and complex. And that's where Vanta comes in. You know the deal. Vanta says they can help businesses establish trust

by automating compliance needs across 35 frameworks like SOC 2 and ISO 27001. They say they can also centralize security workflows, complete questionnaires up to five times faster, and proactively manage vendor risks. You can join over 9,000 global companies like Atlassian, like Quora, and Factory who use Vanta,

to manage risk and prove security in real time. For a limited time, our audience can get $1,000 off Vanta at Vanta.com slash explained. That's V-A-N-T-A dot com slash explained for $1,000 off. Support for the show today comes from Indeed. It says that we're halfway through January, and that means it's way too late for anyone to be telling you Happy New Year. Rude.

I'd like to say Happy New Year into May. But you know what else is too late? Hiring the right person for that open position from 2024. Luckily, there's Indeed. With Indeed, you can stop struggling to get your job posting noticed. Indeed says their sponsored jobs help you stand out and hire fast. With sponsored jobs, your post jumps to the top of the page, which can help reach the people you want faster. There's no need to wait any longer. You can speed up your hiring right now with Indeed. And listeners of this show will get a $75 sponsored job credit to get your jobs more visibility at Indeed.com slash Today Explained. You can go to Indeed.com slash Today Explained right now and support our show by saying you heard about Indeed on this show. Indeed.com slash Today Explained. Terms and conditions apply. Hiring? Indeed is all you need.

You're listening to Today Explained. Sean Rameswaram here with Ben Wofford, who considers himself a Kaplanologist, which is to say he's written a lot about a guy named Joel Kaplan for places like Wired and Business Insider.

Hany Farid, who's a professor at UC Berkeley, calls Joel Kaplan the most influential person at Facebook that most people haven't heard of. There's no question that the things that happen at Meta are coming from Mark. But there's also no question that there has been a change over the last... So Kaplan, for the last 15 years or so, has had this extremely important role at Facebook. And formally, his role has been to forecast and manage policy risk.

Functionally, his role in the last 10 years has grown to be as sprawling, basically, as Facebook's reach itself. And it involves overseeing a prolific lobby in Washington, D.C., which is managing relations with the federal government and state capitals. And Kaplan leads a team of about 1,000 policy staff worldwide at Facebook, shaping and massaging and sometimes thwarting

the international laws and regulatory bodies and policies that graze any part of Facebook's enormous business. But it's this third role that has made Kaplan so controversial, and that is helping design and arbitrate

Facebook's policies on political speech, which have changed so much and so dramatically over the last 10 years. Ben says Joel Kaplan is a Forrest Gump-type figure. He went to Harvard. He was a good progressive college student. But then the Gulf War starts,

And he finds himself feeling more conservative. He graduates, enlists, goes to law school and comes out a proper Republican. Clerks for Antonin Scalia at the Supreme Court. Becomes best buds with Brett Kavanaugh. And then he joins up with George W. Bush. Serves all eight years in the Bush administration. And then he gets out and he's like, what's next? And that's

Just when his old pal from Harvard, Sheryl Sandberg, calls him up and offers him a job. Kaplan's role for the first three years, he's one of a number of elder statesman types surrounding a younger Zuckerberg who has increasingly realized that the reach of his company is going to be entangled in policy matters in Washington. Senator, we run ads.

I see. It's during this period, you call it sort of from 2011 to 2016, that Kaplan, if not a mentor, is sort of described by colleagues as sort of an older brother figure to a younger Zuckerberg. He's accompanying Zuckerberg to tech summits in the Obama Oval Office. My name is Barack Obama, and I'm the guy who got Mark to wear a jacket and tie.

By the time Kaplan comes out of those eight years in the Bush White House, he's got a reputation in Washington as a real, you know, bipartisan impresario. So Kaplan is a certain, you know, breed of Bush conservative that is open-handed and warm and interested in bipartisan compromise. And it's part of why he's so prolific and such a valuable asset to any lobbying operation or company, but especially to Facebook. There are lots of these moments where

Facebook is growing. It stumbles on some kind of tripwire of conservative politics it didn't know was there. And the company sort of frantically looks around and says, who do we have who's like a singular Republican operative who can help us with this problem? And over and over and over again, the answer is just Joel Kaplan, Joel Kaplan, Joel Kaplan.

But the real hinge moment comes in 2016. I mean, this is the first real crisis that Kaplan solves. And it's sort of a foreshadow of events. But it's a famous episode in May 2016, still known inside Facebook as sort of the Gizmodo affair. Gizmodo publishes an article alleging that Facebook's trending topics widget is

biased against conservative media publishers. The CPAC conference, for instance, you know, as that was going on, that was not allowed to trend in Facebook's trending news feed. And conservatives are outraged. Forget leaning in. Does Facebook lean left? Republican Senator John Thune... In comes Joel Kaplan to the rescue. Kaplan calls an old friend who's working on the Trump campaign, and he designs this summit at Menlo Park,

where he's going to bring in these conservative media heavyweights, you know, more than a dozen of these big-name guests. They include Tucker Carlson and Glenn Beck and Dana Perino. And they get sort of this VIP treatment. Zuckerberg gives a seminar where he explains to them the problem, what they're doing, how they're going to solve this and sort of finesse and massage and charms them. And Kaplan, of course, is preparing the summit, briefing Zuckerberg, walking him through the talking points.

And it works. When the conservatives kind of come back from the summit, the consensus is that Kaplan sort of put out this four-alarm fire. The trending topics widget controversy showed three things. One, that there were these political landmines that Zuckerberg and Facebook might not realize exist. Two, that Kaplan was the person that could navigate Zuckerberg and the company around them. And three...

Just as often as not, those types of landmines were about content and speech and the speech product. And so if you thought of this as a unified problem, right, you would want one person to be in charge of a unified solution. And that point person more or less becomes Kaplan. How unusual is it for a tech company to have a, you know, individual go from,

essentially top lobbyist to top policy advisor, top policy programmer for the platform. Smart people and scholars who think about the architecture of the internet and social media really encourage people to step back and look at Facebook and think about how unusual it is, and how not obvious or self-explanatory it is, that the person who is in charge of your political lobbying and policy operation is also largely in charge of crafting and designing the policies around content and speech. I think the one inside story that really summarizes Kaplan's role and influence happens in 2017, and that's with a really radical proposal called Common Ground.

After 2016, there's this shock about the election and how ugly it was. And Common Ground has these big ambitious goals all about reducing polarization with a cocktail of what they call, quote, aggressive interventions. They're going to downrank ugly incivility.

and optimize for quote "good conversations" and upregulate that kind of discussion. And it's all about the algorithm. So the new algorithm was going to recommend users join more politically diverse groups, for example. It was going to reduce the viral reach of hyperactive, hyperpartisan users.

And the Common Ground team is really juiced. They're excited. They've hung posters around the office in Menlo Park that have their motto on it and say things like, reduce polarization or reduce hate. And then Common Ground runs into Joel Kaplan.

And Kaplan's policy team grills these programmers and project managers with questions. Questions not just about how it's going to be perceived by users, but how the changes will be experienced and perceived by political stakeholders. And with Trump in office...

Facebook is much more sensitive to how any changes, even neutral nonpartisan changes like Common Ground, might be perceived by politicians or media persona who have a big megaphone and can generate a political crisis and headache for Facebook. So in the end, a few of the tweaks of Common Ground got through. But in the end, almost all of Common Ground was scrapped and put on the shelf and never saw the light of day.

Okay, Ben, you've helped us get to know this shadowy figure at Facebook, at Meta. He's been lurking around our government and our platforms for decades. But what does all of this mean for the next four years of Meta, Mark, and Donald? So to me, Kaplan's professional life and his corporate values at Facebook suggest to me that...

there's almost no limit to the necessities and prerogatives of survival that Kaplan can't find a way to accommodate. You know, I guess a different way of putting this would be, Zuckerberg's donating a million dollars to the inaugural committee or going to Mar-a-Lago or bringing, you know, an MMA executive onto the corporate board. Those are

really obvious, jarring ways that we can see Zuckerberg more than almost any other, not only tech company, but really any other major corporation in the United States. Facebook has managed to stand out in subjecting itself to the coming Trump wave. And Kaplan's appointment to lead global policy is

to me actually the ur-example of all of those things. Kaplan's singular achievement, I think, of the last eight years is finding a way to accommodate the brash ugliness of MAGA Washington and MAGA conservatism with the elite burnish and professionalized corporate values of Facebook and the corporate world. You know, the next four years of Trump-led Washington

is going to be what Kaplan does best, which is just an era of serious and profound accommodation of Facebook or by Facebook of Trump. You know, if you can think of all the unsavory ways that an empowered Trump might want to use Facebook for illegitimate ends, Kaplan is going to be the person in charge of figuring out a way to accommodate Trump

and MAGA conservatism as far as it can go, and pushing the breaking point further and further before it becomes untenable for Facebook.

Ben Wofford, he writes for whomever he pleases. Most recently, it was Business Insider. The piece was titled MAGA's Man Inside Meta. Businessinsider.com. Amanda Llewellyn produces for Today, Explained. Amina Al-Sadi edits. Laura Bullard is our senior researcher. Andrea Kristinsdóttir and Rob Byers mix it up. Goodbye for now.

I think I read it wrong. All right, hang on. What?

We allow targeted cursing, defined as terms or phrases calling for engagement in sexual activity or contact with genitalia, anus, feces, or urine, including but not limited to suck my dick, kiss my ass, eat shit. I think I got it. We allow targeted cursing, defined as terms or phrases calling for engagement in sexual activity. I can't do it.

We allow targeted cursing, defined as terms or phrases calling for engagement in sexual activity or contact with genitalia. I don't think I can do it.