
“Deep Learning is cheap Solomonoff induction?” by Lucius Bushnaq, Kaarel, Dmitry Vaintrob

2024/12/9

LessWrong (30+ Karma)


TL;DR: Lucius recently gave a presentation on the nature of deep learning and why it generalises to new data. Kaarel, Dmitry, and Lucius then discussed the slides for that presentation in a group chat, and the conversation quickly became a broader discussion of the nature of intelligence and how much we do or don't know about it.

**Background**

Lucius: I recently gave a small talk presenting an idea for how and why deep learning generalises. It tried to reduce concepts from Singular Learning Theory back to basic algorithmic information theory, sketching a unified picture that starts with Solomonoff induction and, with a lot of hand-waving, derives that under some assumptions, just fitting a big function to your data with a local optimisation method like gradient descent maybe, sorta, kind of amounts to a cheap bargain-bin approximation of running Solomonoff induction on that data. Lucius [...]
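For reference, the Solomonoff prior that the argument bottoms out in can be written in its standard textbook form (this is general algorithmic information theory, not a formula quoted from the slides):

$$M(x) \;=\; \sum_{p\,:\,U(p)\text{ outputs a string beginning with }x} 2^{-\ell(p)}$$

where $U$ is a universal (monotone) machine, $p$ ranges over programs, and $\ell(p)$ is the length of $p$ in bits. One common way to gesture at the correspondence described above, though not necessarily the exact argument in the slides, is that the volume of parameter space implementing a given function plays a role loosely analogous to the $2^{-\ell(p)}$ weight on a program: simpler functions occupy more volume and are therefore more likely to be reached by random initialisation plus local search.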


Outline:

(00:28) Background

(02:57) Slides

(03:20) Discussion

(26:32) Bridging NN SGD and Solomonoff induction (from Oct 2024)

(30:59) Acknowledgements

The original text contained 1 footnote which was omitted from this narration.

The original text contained 1 image which was described by AI.


First published: December 7th, 2024

Source: https://www.lesswrong.com/posts/LxCeyxH3fBSmd4oWB/deep-learning-is-cheap-solomonoff-induction-1

---

Narrated by TYPE III AUDIO.


Apple Podcasts and Spotify do not show images in the episode description. Try Pocket Casts, or another podcast app.