
"GPTs are Predictors, not Imitators" by Eliezer Yudkowsky

2023/4/12

LessWrong (Curated & Popular)

Shownotes / Transcript

(Related text posted to Twitter; this version is edited and has a more advanced final section.)

Imagine yourself in a box, trying to predict the next word - assign as much probability mass to the next token as possible - for all the text on the Internet.
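To make "assign as much probability mass to the next token as possible" concrete, here is a minimal sketch of the scoring rule such a predictor faces. The context, candidate tokens, and probabilities are hypothetical, and this is an illustration of log-loss scoring rather than the actual GPT training code.

```python
# Minimal sketch (assumed setup, not actual GPT code) of the next-token
# prediction objective: given a context, the predictor spreads probability
# mass over candidate next tokens and is scored by how much mass it put on
# the token that actually came next.

import math
from typing import Dict

def next_token_loss(predicted_probs: Dict[str, float], actual_next_token: str) -> float:
    """Cross-entropy (log loss) for a single prediction.

    `predicted_probs` maps each candidate token to the probability the
    predictor assigned it; lower loss means more mass on the true token.
    """
    p = predicted_probs.get(actual_next_token, 1e-12)  # tiny floor to avoid log(0)
    return -math.log(p)

# Hypothetical example: the predictor sees "the cat sat on the" and must
# distribute probability mass over possible continuations.
probs = {"mat": 0.6, "floor": 0.2, "roof": 0.1, "moon": 0.1}
print(next_token_loss(probs, "mat"))   # ~0.51 — most mass on the right token
print(next_token_loss(probs, "moon"))  # ~2.30 — little mass on the right token
```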

Koan:  Is this a task whose difficulty caps out as human intelligence, or at the intelligence level of the smartest human who wrote any Internet text?  What factors make that task easier, or harder?  (If you don't have an answer, maybe take a minute to generate one, or alternatively, try to predict what I'll say next; if you do have an answer, take a moment to review it inside your mind, or maybe say the words out loud.)

https://www.lesswrong.com/posts/nH4c3Q9t9F3nJ7y8W/gpts-are-predictors-not-imitators