We’ve significantly upgraded our timelines and takeoff models! The new model predicts when AIs will reach key capability milestones: for example, Automated Coder / AC (full automation of coding) and superintelligence / ASI (much better than the best humans at virtually all cognitive tasks). This post will briefly explain how the model works, present our timelines and takeoff forecasts, and compare it to our previous (AI 2027) models (spoiler: the AI Futures Model predicts about 3 years longer timelines to full coding automation than our previous model, mostly due to being less bullish on pre-full-automation AI R&D speedups).
If you’re interested in playing with the model yourself, the best way to do so is via this interactive website: aifuturesmodel.com
If you’d like to skip the motivation for our model and go straight to an explanation of how it works, go here. The website has a more in-depth explanation of the model (starts here; use the diagram on the right as a table of contents), as well as our forecasts.
Why do timelines and takeoff modeling?
The future is very hard to predict. We don't think this model, or any other model, should be trusted completely. The model takes into account what we think are [...]
---
Outline:
Why do timelines and takeoff modeling?
Why our approach to modeling? Comparing to other approaches
AGI timelines forecasting methods
Trust the experts
Intuition informed by arguments
Revenue extrapolation
Compute extrapolation anchored by the brain
Capability benchmark trend extrapolation
Post-AGI takeoff forecasts
How our model works
Stage 1: Automating coding
Stage 2: Automating research taste
Stage 3: The intelligence explosion
Timelines and takeoff forecasts
Eli
Daniel
Comparison to our previous (AI 2027) timelines and takeoff models
Timelines to Superhuman Coder (SC)
Takeoff from Superhuman Coder onward