
Nathan Delisle, University of Chicago
tl;dr: Many critiques of *Situational Awareness* have been purely qualitative; one year later we can finally check the numbers. I did my best to verify Aschenbrenner's claims using public data through June 2025, and found that his estimates mostly check out.
This is inherently noisy work, so nothing herein is certain. I would encourage red-teaming, particularly in the algo-efficiencies/unhobbling/hardware sections.
Many thanks to Kai Williams, Egg Syntax, and Aaron Scher for their critical feedback.
Abstract
Leopold Aschenbrenner's 2024 essay *Situational Awareness* forecasts AI progress from 2024 to 2027 in two groups: "drivers" (raw compute, algorithmic efficiency, and post-training capability enhancements known as "un-hobbling") and "indicators" (largest training cluster size, global AI investment, chip production, AI revenue, and electricity consumption).[1] Drivers and the largest cluster size are expected to grow by about half an order of magnitude (≈3.2×) annually, with infrastructure indicators roughly doubling annually (2× per [...]
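For intuition, here is a minimal sketch (plain arithmetic, not figures from the post itself) of the cumulative growth those per-year rates imply over the 2024–2027 window:

```python
# Cumulative multipliers implied by the forecast's growth rates over
# three years (2024 -> 2027). Illustrative only; the post reports the
# per-year rates, not these cumulative figures.

half_oom_per_year = 10 ** 0.5   # "half an order of magnitude" ≈ 3.16x per year
doubling_per_year = 2.0         # infrastructure indicators: ~2x per year
years = 3                       # 2024 -> 2027

drivers_growth = half_oom_per_year ** years   # ≈ 31.6x, i.e. 1.5 orders of magnitude
infra_growth = doubling_per_year ** years     # = 8x

print(f"Drivers / largest cluster: ~{drivers_growth:.1f}x over {years} years")
print(f"Infrastructure indicators: ~{infra_growth:.0f}x over {years} years")
```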
The original text contained 78 footnotes, which were omitted from this narration.
---
Narrated by TYPE III AUDIO.