
Machine Alignment Monday, 7/24/23
Intelligence explosion arguments don’t require Platonism. They just require intelligence to exist in the normal fuzzy way that all concepts exist.
First, I’ll describe the normal way concepts exist. I’ll have succeeded if I convince you that claims using the word “intelligence” are coherent and potentially true.
Second, I’ll argue, based on humans and animals, that these coherent-and-potentially-true things are actually true.
Third, I’ll argue that so far this has been the most fruitful way to think about AI, and people who try to think about it differently make worse AIs.
Finally, I’ll argue this is sufficient for ideas of “intelligence explosion” to be coherent.
https://astralcodexten.substack.com/p/were-not-platonists-weve-just-learned