
Many thanks to Spencer Greenberg, Lucius Caviola, Josh Lewis, John Bargh, Ben Pace, Diogo de Lucena, and Philip Gubbins for their valuable ideas and feedback at each stage of this project—as well as the ~375 EAs + alignment researchers who provided the data that made this project possible.
Background
Last month, AE Studio launched two surveys: one for alignment researchers, and another for the broader EA community.
We got some surprisingly interesting results, and we're excited to share them here.
We set out to better explore and compare various population-level dynamics within and across both groups. We examined everything from demographics and personality traits to community views on specific EA/alignment-related topics. We took on this project because it seemed to be largely unexplored and rife with potentially-very-high-value insights. In this post, we’ll present what we think are the most important findings from this project.
Meanwhile, we’re also sharing [...]
---
Outline:
(00:28) Background
(03:56) Seven key results and implications
(11:41) Survey contents and motivation
(13:47) Who took these surveys?
(17:14) Community views on specific topics (ground truth vs. predictions)
(18:16) Cause area prioritization (ground truth vs. predictions)
(23:51) Other interesting field-level distributions (ground truth vs. predictions)
(28:25) Background on the Big Five
(30:44) Personality similarities and differences
(35:08) EAs and alignment researchers have significantly different moral foundations
(39:33) Free responses from alignment survey
(42:38) Concluding thoughts
(44:52) Appendix: other interesting miscellaneous findings (in no particular order)
(44:59) Using temperament to predict alignment positions
(46:29) Gender differences in alignment
(47:11) EAs and alignment researchers exhibit very low future discounting rates
(48:28) EAs and alignment researchers aren't huge risk-takers
(49:05) EAs are almost-perfectly-normally-distributed on some key EA questions
(49:52) Alignment researchers support a pause
(50:14) Alignment org leaders are highly optimistic by temperament
The original text contained 7 footnotes which were omitted from this narration.
---
Narrated by TYPE III AUDIO.