
People thinking about the future of AI sometimes talk about a single project ‘getting there first’ — achieving AGI, and leveraging this into a decisive strategic advantage over the rest of the world. That would represent a massive concentration of power.
This post is a one-sided exploration of concerns about such a concentration of power. These concerns are conveyed in part via an extended analogy with The Lord of the Rings, in the hope that this makes it easier to keep track of them at a gut level — even if people are ultimately persuaded by arguments for the inevitability or desirability of a single project.
The original text contained 1 image which was described by AI.
---
First published:
Source:
Narrated by TYPE III AUDIO.
---
Images from the article:
Apple Podcasts and Spotify do not show images in the episode description. Try Pocket Casts, or another podcast app.
