
Epoch AI is the premier organization tracking the trajectory of AI - how much compute is used, the role of algorithmic improvements, the growth in data used, and when these trends might come to an end. In this episode, I speak with Jaime Sevilla, director of Epoch AI, about how compute, data, and algorithmic improvements are shaping AI progress, and whether continuing to scale can get us to AGI.
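For a rough sense of what the growth rate discussed in the episode implies, here is a minimal sketch in Python. The 4-5x/year figure comes from the Epoch AI analysis linked under "Research we discuss" below; the 2024 baseline compute value is an assumed round number for illustration, not an Epoch AI estimate.

```python
# Illustrative extrapolation of frontier training compute, assuming the
# 4-5x/year growth rate from Epoch AI's analysis (linked below).
# The 2024 baseline of 1e26 FLOP is an assumed round number, not an
# Epoch AI estimate.

BASELINE_YEAR = 2024
BASELINE_FLOP = 1e26  # assumed illustrative frontier training run

def projected_flop(year: int, growth_per_year: float) -> float:
    """Frontier training compute under constant exponential growth."""
    return BASELINE_FLOP * growth_per_year ** (year - BASELINE_YEAR)

for year in range(2024, 2031):
    low, high = projected_flop(year, 4.0), projected_flop(year, 5.0)
    print(f"{year}: {low:.1e} - {high:.1e} FLOP")
```

At 4-5x per year, training compute gains roughly three orders of magnitude every four to five years, which is why the episode spends time on when chips, power, and data might cap the trend.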
Patreon: https://www.patreon.com/axrpodcast
Ko-fi: https://ko-fi.com/axrpodcast
The transcript: https://axrp.net/episode/2024/10/04/episode-37-jaime-sevilla-forecasting-ai.html
Topics we discuss, and timestamps:
0:00:38 - The pace of AI progress
0:07:49 - How Epoch AI tracks AI compute
0:11:44 - Why does AI compute grow so smoothly?
0:21:46 - When will we run out of computers?
0:38:56 - Algorithmic improvement
0:44:21 - Algorithmic improvement and scaling laws
0:56:56 - Training data
1:04:56 - Can scaling produce AGI?
1:16:55 - When will AGI arrive?
1:21:20 - Epoch AI
1:27:06 - Open questions in AI forecasting
1:35:21 - Epoch AI and x-risk
1:41:34 - Following Epoch AI's research
Links for Jaime and Epoch AI:
Epoch AI: https://epochai.org/
Machine Learning Trends dashboard: https://epochai.org/trends
Epoch AI on X / Twitter: https://x.com/EpochAIResearch
Jaime on X / Twitter: https://x.com/Jsevillamol
Research we discuss:
Training Compute of Frontier AI Models Grows by 4-5x per Year: https://epochai.org/blog/training-compute-of-frontier-ai-models-grows-by-4-5x-per-year
Optimally Allocating Compute Between Inference and Training: https://epochai.org/blog/optimally-allocating-compute-between-inference-and-training
Algorithmic Progress in Language Models [blog post]: https://epochai.org/blog/algorithmic-progress-in-language-models
Algorithmic progress in language models [paper]: https://arxiv.org/abs/2403.05812
Training Compute-Optimal Large Language Models [aka the Chinchilla scaling law paper]: https://arxiv.org/abs/2203.15556 (see the sketch after this list)
Will We Run Out of Data? Limits of LLM Scaling Based on Human-Generated Data [blog post]: https://epochai.org/blog/will-we-run-out-of-data-limits-of-llm-scaling-based-on-human-generated-data
Will we run out of data? Limits of LLM scaling based on human-generated data [paper]: https://arxiv.org/abs/2211.04325
The Direct Approach: https://epochai.org/blog/the-direct-approach
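As a quick illustration of the Chinchilla scaling-law paper listed above: it found that compute-optimal training scales parameters and training tokens in roughly equal proportion, which works out to about 20 tokens per parameter. A minimal sketch using the standard C ≈ 6·N·D approximation for training FLOPs (the compute budget below is an example value, chosen to match Chinchilla's own training run):

```python
# Compute-optimal model/data split per the Chinchilla rule of thumb:
# roughly 20 training tokens per parameter, with training compute
# approximated as C ≈ 6 * N * D FLOPs (N = params, D = tokens).

TOKENS_PER_PARAM = 20  # Chinchilla's approximate compute-optimal ratio

def chinchilla_optimal(compute_budget_flop: float) -> tuple[float, float]:
    """Return (params, tokens) that roughly exhaust the budget."""
    # C = 6 * N * D with D = 20 * N  =>  N = sqrt(C / 120)
    n_params = (compute_budget_flop / (6 * TOKENS_PER_PARAM)) ** 0.5
    return n_params, TOKENS_PER_PARAM * n_params

params, tokens = chinchilla_optimal(5.8e23)  # roughly Chinchilla's own budget
print(f"~{params:.2e} params, ~{tokens:.2e} tokens")  # ~7e10, ~1.4e12
```

Plugging in roughly Chinchilla's own budget recovers its reported ~70B parameters and ~1.4T training tokens.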
Episode art by Hamish Doodles: hamishdoodles.com
By Daniel Filan