


This is a linkpost to a new Substack article from MIT FutureTech explaining our recent paper On the Origins of Algorithmic Progress in AI.
We demonstrate that some algorithmic innovations yield efficiency gains that grow larger as pre-training compute increases. These scale-dependent innovations account for the majority of pre-training efficiency gains over the last decade, which may imply that what looks like algorithmic progress is largely driven by compute scaling rather than by many incremental innovations.
From the paper, our core contributions are:
---
Linkpost URL:
https://open.substack.com/pub/mitfuturetech/p/on-the-origins-of-algorithmic-progress
---
Narrated by TYPE III AUDIO.
By LessWrong
