
My credence: 33% confidence in the claim that the growth in the number of GPUs used for training SOTA AI will slow down significantly directly after GPT-5. It is not higher because (1) decentralized training is possible, (2) GPT-5 may increase hardware efficiency significantly, (3) GPT-5 may be smaller than assumed in this post, and (4) race dynamics.
TLDR: Because of a bottleneck in energy access for data centers and the need to build data centers that are an order of magnitude (OOM) larger.
The reasoning behind the claim:
---
Outline:
(00:42) The reasoning behind the claim:
(03:18) Unrelated to the claim:
(04:23) How big is that effect going to be?
(06:52) Impact of GPT-5
---
First published:
Source:
Narrated by TYPE III AUDIO.