


Permanent disempowerment, but without restrictions on the quality of life achievable with relatively meager resources (and without extinction), seems a likely outcome for the future of humanity if the current trajectory of frontier AI development continues and leads to AGI[1] soon. This might happen if AIs at least slightly endorse humanity's welfare, in a context where the costs that matter to AIs are matter and compute rather than technological advancement or quality of infrastructure.
The remaining risks (initial catastrophic harm, or total extinction) and opportunities (capturing more than a tiny sliver of the cosmic endowment for humanity's future) concern the transitional period in which AIs don't yet have an overwhelming advantage, a period that might last longer than is usually expected.
Animal Extinction and Suffering
In recent times, with preservation becoming a salient concern, species facing pressure [...]
---
Outline:
(00:56) Animal Extinction and Suffering
(02:50) AGI-Driven Pause on Superintelligence Development
(04:59) Tradeoffs in a Superintelligent World
(07:06) Giving Away the Cosmic Endowment
The original text contained 5 footnotes which were omitted from this narration.
---
---
Narrated by TYPE III AUDIO.
By LessWrong
