This is a collection of every unidentified GPT-2 glitch token listed in the third glitch token archaeology post. I was able to find the source of every single one, except for "?????-" and "?????-?????-"[1]. Please tell me if I missed one, or if you've discovered one and don't understand where it came from. This isn't meant to be a well-written analysis, just a quick repository of my glitch-hunting observations.
I plan to write up and categorize all of these in greater detail in future posts. The first of these is here.
I used OpenWebText, a recreation of GPT-2's training data, for all experiments in this post. I tokenized every .gz file in the archive and built a boolean NumPy array marking each token that was present at least once. This allowed me to quickly identify infrequent tokens in the dataset and pull up their textual context with regular expressions. If [...]
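A minimal sketch of this scan, assuming a directory of OpenWebText `.gz` shards and any GPT-2 tokenizer passed in as an `encode` function (the names and paths here are illustrative, not the author's actual code):

```python
# Sketch: build a boolean presence mask over the GPT-2 vocabulary, then
# use regexes to pull up context around a rare token string.
import gzip
import re
import numpy as np
from pathlib import Path

VOCAB_SIZE = 50257  # GPT-2 vocabulary size


def build_presence_mask(archive_dir, encode):
    """Mark which token IDs appear at least once across all .gz files.

    `encode` is any function mapping text -> list of token IDs
    (e.g. a GPT-2 BPE tokenizer); passing it in keeps this sketch
    tokenizer-agnostic.
    """
    present = np.zeros(VOCAB_SIZE, dtype=bool)
    for path in Path(archive_dir).glob("*.gz"):
        with gzip.open(path, "rt", encoding="utf-8", errors="ignore") as f:
            ids = encode(f.read())
            # Boolean fancy-indexing: flip every seen ID to True at once.
            present[np.unique(ids)] = True
    return present


def find_contexts(text, token_str, window=40):
    """Pull up to `window` characters of context on each side of a token."""
    pattern = re.compile(
        ".{0,%d}%s.{0,%d}" % (window, re.escape(token_str), window)
    )
    return pattern.findall(text)
```

Infrequent tokens are then just `np.flatnonzero(~present)` (never seen) or IDs whose contexts turn up only a handful of times under `find_contexts`.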
---
Outline:
(02:23) Glitch Tokens and Where They Came From
(02:28) 48193 @#and 25
(03:34) 35496 ÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂÃÂ 26
(06:09) 31727 cffff 156
(07:11) (7131 \]\[ 20471) and (3693 .\[ 20604) and (42669 ).\[ 19013) and (42924 .\[ 19219)
(08:00) 31708 ーン 635
(08:48) 48396 ÛÛ 3
(09:10) 24440 ュ 1338
(09:28) 39165 catentry 4
(09:44) 39253 UCHIJ 5
(10:08) 47182 :},{ 21 // 23785 \]= 32 // 32047 $:/ 3
(10:24) // 47182 :},{ 21
(10:34) 21807 \\\\\\\\\\\\\\\\ 45
(11:45) 17629 practition 13
(13:30) 41441 \\\\- 645
(45:09) 49781 EngineDebug 3
(45:34) 42470 TextColor 97
(46:13) 43177 EStreamFrame 0 | 39906 EStream 0
(46:37) 41383 assetsadobe 0
(47:15) Non-English Languages
(48:21) Hypotheses
(48:24) Why does this happen?
(53:22) Glitch token classifications
(54:38) Future Research Plans
---