


We welcomed the one, the only, Eliezer Yudkowsky to the Bayesian Conspiracy for a quick chat.
His latest project is Arbital, an ambitious effort to improve online discussion, currently focused on online explanation. The goal is to do for difficult explanations – and someday, for complicated arguments in general – what Wikipedia did for centralizing humanity’s recounting of agreed-on facts.
The initial demo page is an explanation of Bayes’s Rule.
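For readers who want a quick refresher before visiting the demo, here is a minimal worked example of Bayes’s Rule. The numbers (disease prevalence, test sensitivity, false-positive rate) are illustrative, not from the episode:

```python
# Bayes' Rule: P(H | E) = P(E | H) * P(H) / P(E)
# Illustrative scenario: a disease with 1% prevalence, a test with
# 90% sensitivity and a 5% false-positive rate.

p_h = 0.01              # prior: P(disease)
p_e_given_h = 0.90      # sensitivity: P(positive | disease)
p_e_given_not_h = 0.05  # false-positive rate: P(positive | no disease)

# Total probability of a positive test (law of total probability)
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Posterior: probability of disease given a positive test
p_h_given_e = p_e_given_h * p_h / p_e
print(round(p_h_given_e, 3))  # about 0.154
```

Even with a fairly accurate test, a positive result here only raises the probability of disease to about 15%, because the prior is so low – the kind of unintuitive result the Arbital guide walks through in detail.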
About Eliezer in his (heavily truncated by me) own words:
One Eliezer Yudkowsky writes about the fine art of human rationality. That Eliezer Yudkowsky’s work can be found in the Rationality section.
The other Eliezer Yudkowsky concerns himself with Artificial Intelligence. Very shortly – on a historical scale, that is – we can expect technology to break the upper bound on intelligence that has held for the last few tens of thousands of years. For more on this see the Singularity tab.
My short fiction, miscellaneous essays, and various other things can be found under Other.
I am a full-time Research Fellow at the Machine Intelligence Research Institute, a small 501(c)(3) public charity supported primarily by individual donations.
A majority of all my written material is currently stored at the community blog Less Wrong.
We asked him questions ranging from his opinions on AI to rationality to his popular Harry Potter fan fiction.
Eliezer brought up a few concepts, and you will definitely want these links:
We will be reposting that last link, I’m sure. As always, you can email us at [email protected] with your valued feedback, or comment at the subreddit.
Other Relevant Links
Surely You’re Joking, Mr. Feynman!
By The Bayesian Conspiracy · 4.7 (4545 ratings)
