TYPE III AUDIO (All episodes)

"My Model Of EA Burnout" by Logan Strohl


---
client: lesswrong
project_id: curated
narrator: pw
qa: mds
narrator_time: 0h45m
qa_time: 0h15m
---

I think that EA [editor note: "Effective Altruism"] burnout usually results from prolonged dedication to satisfying the values you think you should have, while neglecting the values you actually have.
Setting aside for the moment what “values” are and what it means to “actually” have one, suppose that I actually value these things (among others):
True Values:

• Abundance
• Power
• Novelty
• Social Harmony
• Beauty
• Growth
• Comfort
• The Wellbeing Of Others
• Excitement
• Personal Longevity
• Accuracy

One day I learn about “global catastrophic risk”: Perhaps we’ll all die in a nuclear war, or an AI apocalypse, or a bioengineered global pandemic, and perhaps one of these things will happen quite soon.

I recognize that GCR is a direct threat to The Wellbeing Of Others and to Personal Longevity, and as I do, I get scared. I get scared in a way I have never been scared before, because I’ve never before taken seriously the possibility that everyone might die, leaving nobody to continue the species or even to remember that we ever existed—and because this new perspective on the future of humanity has caused my own personal mortality to hit me harder than the lingering perspective of my Christian upbringing ever allowed. For the first time in my life, I’m really aware that I, and everyone I will ever care about, may die.

Original article:
https://www.lesswrong.com/posts/pDzdb4smpzT3Lwbym/my-model-of-ea-burnout

Narrated for LessWrong by TYPE III AUDIO.

Share feedback on this narration.
