
"GPT-3 was trained on is so large that the model contains a certain fraction of the actual complexity of the world. But how much is actually inside these models, implicitly embedded within these neural networks?
I decided to test this and see if I could examine the GPT-3 model of the world through the use of counterfactuals. Specifically, I wanted to see if GPT-3 could productively unspool histories of the world if things were slightly different, such as if the outcome of a war were different or a historical figure hadn’t been born. I wanted to see how well it could write alternate histories." - Samuel Arbesman
From the Cabinet of Wonders newsletter by Samuel Arbesman
A great tweet thread by Arbesman summarizes the post
"Securities" podcast is produced and edited by Chris Gates
