


[I'm posting this as a very informal community request in lieu of a more detailed writeup, because if I wait to do this in a much more careful fashion then it probably won't happen at all. If someone else wants to do a more careful version that would be great!]
By crux I mean some uncertainty you have such that your estimate of the likelihood of existential risk from AI - your "p(doom)", if you like that term - might shift significantly if that uncertainty were resolved.
More precisely, let's define a crux as a proposition such that:
---
First published:
Source:
Narrated by TYPE III AUDIO.
By LessWrong
