


Posted on Twitter:
Here is the prompt code for the Turing machine: https://github.com/SpellcraftAI/turing
This is the fully general counterpoint to @VictorTaelin's A::B challenge (he put his money where his mouth is and got praise for that from Yudkowsky).
"Attention is Turing Complete" was already a published claim in 2021:
Theorem 6 The class of Transformer networks with positional encodings is Turing complete. Moreover, Turing completeness holds even in the restricted setting in which the only non-constant values in the positional embedding pos(n) of n, for n ∈ N, are n, 1/n, and 1/n², and Transformer networks have a single [...]
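For readers unfamiliar with the formalism: a Turing machine is just a transition table driving a read/write head over a tape, which is the behavior the linked prompt asks the model to emulate step by step. Below is a minimal sketch of such a transition loop; the machine shown (a unary incrementer), its states, and its alphabet are illustrative assumptions, not taken from the SpellcraftAI/turing repo.

```python
from typing import Dict, Tuple

State, Symbol = str, str

# (state, read_symbol) -> (write_symbol, head_move, next_state)
# head_move is +1 (right), -1 (left), or 0 (stay).
# Illustrative rules for a unary incrementer: append one "1" to a run of 1s.
RULES: Dict[Tuple[State, Symbol], Tuple[Symbol, int, State]] = {
    ("scan", "1"): ("1", +1, "scan"),  # skip over existing 1s
    ("scan", "_"): ("1", 0, "halt"),   # write one more 1 on the blank, halt
}

def run(tape: str, state: State = "scan", blank: Symbol = "_") -> str:
    cells = list(tape)
    head = 0
    while state != "halt":
        read = cells[head] if head < len(cells) else blank
        write, move, state = RULES[(state, read)]
        if head < len(cells):
            cells[head] = write
        else:
            cells.append(write)  # extend the tape on demand
        head += move
    return "".join(cells)

print(run("111"))  # -> 1111
```

The point of the challenge is that an LLM prompted with such a table must apply one rule per step without drifting, which is exactly where earlier models failed.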
---
Linkpost URL: https://twitter.com/ctjlewis/status/1779740038852690393
Narrated by TYPE III AUDIO.
By LessWrong
