
The OpenAI Alignment Research Blog launched today at 11 am PT! With 1 introductory post, and 2 technical posts.
Blog: https://alignment.openai.com/
Thread on X: https://x.com/j_asminewang/status/1995569301714325935?t=O5FvxDVP3OqicF-Y4sCtxw&s=19
Speaking purely personally: when I joined the Alignment team at OpenAI in January, I saw there was more safety research than I'd expected, as well as interesting thinking on the future of alignment. But that research and thinking didn't really have a place to go: it's often too short or informal for the main OpenAI blog, and most OpenAI researchers aren't on LessWrong. I'm hoping the blog is a more informal, lower-friction home than the main blog, and that this new avenue of publishing encourages sharing and transparency.
---
First published:
Source:
---
Narrated by TYPE III AUDIO.
By LessWrong
