


https://astralcodexten.substack.com/p/mr-tries-the-safe-uncertainty-fallacy
The Safe Uncertainty Fallacy goes:
The situation is completely uncertain. We can't predict anything about it. We have literally no idea how it could go.
Therefore, it'll be fine.
You're not missing anything. It's not supposed to make sense; that's why it's a fallacy.
For years, people used the Safe Uncertainty Fallacy on AI timelines:
Since 2017, AI has moved faster than most people expected; GPT-4 sort of qualifies as an AGI, the kind of AI most people were saying was decades away. When you have ABSOLUTELY NO IDEA when something will happen, sometimes the answer turns out to be "soon".
By Jeremiah
