“What Is The Alignment Problem?” by johnswentworth
LessWrong (Curated & Popular) · 2025/1/17 · 46:26
Chapters
Why is Specifying Problems So Hard?
Toy Problem 1: Old MacDonald's New Hen
Toy Problem 2: Sorting Bleggs and Rubes
Generalizing to the Alignment Problem
What If the Patterns Don't Hold?
Alignment of What Exactly?
Alignment of a Goal or Purpose
Alignment of Basic Agents
Alignment of General Intelligence
How Does This Relate to Today's AI?
What Are Human Values?
Exploring Corrigibility
Exercise: Do What I Mean (DWIM)
Putting It All Together and Key Takeaways