Meta's shift in moderation policy reflects a move toward more scalable solutions for handling speech on large platforms. The company is adopting approaches like community notes, which rely on user-driven consensus rather than centralized fact-checking. The change responds to the failures of previous moderation systems, particularly during the COVID-19 pandemic, when suppressing dissenting voices produced unintended consequences such as increased vaccine skepticism.
Community notes are considered the better approach because they scale: a note is published only when users with differing viewpoints agree on it, which makes the system more representative and self-policing. That design also mitigates brigading and sidesteps the difficulty of defining truth for a global audience with diverse value systems.
Pandemic-era moderation suppressed dissenting voices on questions like vaccine efficacy, school closures, and COVID origins. That suppression contributed to broader skepticism of vaccines generally, including those for diseases like measles. The policies failed to achieve their intended goals and instead deepened societal mistrust of public health measures.
Centralized fact-checking failed because it imposed a particular point of view, usually aligned with the traditional media consensus, that did not resonate with diverse global audiences. The result was suppressed dissent and a growing disconnect between the moderation apparatus and the users it was meant to serve, compounded by the sheer difficulty of adjudicating truth at global scale.
The past decade of moderation challenges underscores the case for scalable, user-driven mechanisms like community notes. Centralized systems that suppress dissent rarely achieve their goals and can produce broader harms, such as the erosion of trust in public health. The evidence suggests that tolerating free speech, even when it includes some harmful content, is essential for maintaining trust and addressing systemic harms.
On this episode: Meta’s new approach to moderation, the context for an apparent shift to the right among tech leadership, and lessons from the last several years of moderation challenges and mistakes. At the end: Mark Zuckerberg offers his assessment of Apple in the modern era.
To email the show: [email protected]
@SharpTechPodcast Channel — YouTube
@Stratechery Channel — YouTube
Meta Changes Moderation Policies, Zuckerberg’s Journey — and Mine, The Audacity of Copying Well — Stratechery Update
Mark Zuckerberg on Moderation Changes — Instagram: @zuck
The Audacity of Copying Well — Stratechery
Mark Zuckerberg on Apple — X: @tsarnick
Get all episodes of Sharp Tech, Sharp China, Stratechery Updates and Interviews, Greatest of All Talk, and the Dithering Podcast as part of Stratechery Plus for $15/month or $150/year.