“UK AISI’s Alignment Team: Research Agenda” by Benjamin Hilton, Jacob Pfau, Marie_DB, Geoffrey Irving
Duration: 22:23
Published: 2025-05-07
Source: LessWrong (30+ Karma)
Chapters
Why is safety case-oriented alignment research important?
What is the initial focus of the AISI Alignment Team?
Example: Debate safety case sketch
What future work is planned by the team?
More details on the empirical approach to alignment
Moving beyond honesty: exploring automated alignment
What open problems do they want to see solved?
Theoretical challenges in alignment research
How can you collaborate with the AISI Alignment Team?