AI progress is driven by improved algorithms and additional compute for training runs. Understanding what is going on with these trends and how they are currently driving progress is helpful for understanding the future of AI. In this post, I'll share a wide range of general takes on this topic as well as open questions. Be warned that I'm quite uncertain about a bunch of this!
This post assumes some familiarity with what is driving AI progress; specifically, it assumes you understand the following concepts: pre-training, RL, scaling laws, and effective compute.
**Training compute trends**
Epoch reports a trend of frontier training compute increasing by 4.5x per year. My best guess is that the future trend will be slower, maybe more like 3.5x per year (or possibly much lower) for a few reasons:
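The difference between these annual multipliers compounds quickly. A minimal sketch of the arithmetic, using the 4.5x and 3.5x figures above (the specific horizons chosen are just for illustration):

```python
# Compare cumulative frontier training-compute growth under two assumed
# annual multipliers: 4.5x (Epoch's reported historical trend) and 3.5x
# (the slower future trend guessed at above).
def cumulative_growth(rate_per_year: float, years: int) -> float:
    """Total multiplier on training compute after `years` years."""
    return rate_per_year ** years

for years in (1, 3, 5):
    historical = cumulative_growth(4.5, years)
    slower = cumulative_growth(3.5, years)
    print(f"{years} yr: 4.5x/yr -> {historical:,.0f}x, 3.5x/yr -> {slower:,.0f}x")
```

After five years, the historical trend implies roughly 1,800x more training compute, versus roughly 500x under the slower 3.5x-per-year guess, so the choice of multiplier matters a lot for medium-term forecasts.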
Outline:
- Training compute trends
- Algorithmic progress
- Data
The original text contained 3 footnotes which were omitted from this narration.
First published: May 2nd, 2025
---
Narrated by TYPE III AUDIO.