


Eliezer Yudkowsky periodically complains about people coming up with questionable plans with questionable assumptions to deal with AI. Sometimes the questionable plan is "an alignment scheme, which Eliezer thinks avoids the hard part of the problem." Sometimes it's a sketchy, reckless plan that's probably going to blow up and make things worse.
Some people complain about Eliezer being a doomy Negative Nancy who's overly pessimistic.
I had an interesting experience a few months ago when I ran some beta tests of my Planmaking and Surprise Anticipation workshop, which I think is illustrative.
i. Slipping into a more Convenient World
I have an exercise where I give people [...]
---
Outline:
(00:59) i. Slipping into a more Convenient World
(04:26) ii. Finding traction in the wrong direction
(06:47) Takeaways
---
Narrated by TYPE III AUDIO.
By LessWrong
