


"GPT-3 was trained on is so large that the model contains a certain fraction of the actual complexity of the world. But how much is actually inside these models, implicitly embedded within these neural networks?
I decided to test this and see if I could examine the GPT-3 model of the world through the use of counterfactuals. Specifically, I wanted to see if GPT-3 could productively unspool histories of the world if things were slightly different, such as if the outcome of a war were different or a historical figure hadn’t been born. I wanted to see how well it could write alternate histories." - Samuel Arbesman
From the Cabinet of Wonders newsletter by Samuel Arbesman
Great tweet thread summarizing his post
"Securities" podcast is produced and edited by Chris Gates
