


I recently wrote 80k's new problem profile on extreme power concentration (with a lot of help from others - see the acknowledgements at the bottom).
It's meant to be a systematic introduction to the risk of AI-enabled power concentration, where AI enables a small group of humans to amass huge amounts of unchecked power over everyone else. It's primarily aimed at people who are new to the topic, but I think it's also one of the only write-ups there are on this overall risk,[1] so it might be interesting to others, too.
Briefly, the piece argues that:
