https://astralcodexten.substack.com/p/mr-tries-the-safe-uncertainty-fallacy
The Safe Uncertainty Fallacy goes:
1. The situation is completely uncertain. We can't predict anything about it. We have literally no idea how it could go.
2. Therefore, it'll be fine.
You're not missing anything. It's not supposed to make sense; that's why it's a fallacy.
For years, people used the Safe Uncertainty Fallacy on AI timelines:
Since 2017, AI has moved faster than most people expected; GPT-4 sort of qualifies as an AGI, the kind of AI most people were saying was decades away. When you have ABSOLUTELY NO IDEA when something will happen, sometimes the answer turns out to be "soon".
By Jeremiah
4.8 • 129 ratings