The Nonlinear Library

EA - Astronomical Cake by Richard Y Chappell



Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Astronomical Cake, published by Richard Y Chappell on June 6, 2024 on The Effective Altruism Forum.
There's one respect in which philosophical training seems to make (many) philosophers worse at practical ethics. Too many are tempted to treat tidy thought experiments as a model for messy real-world ethical quandaries.
We're used to thinking about scenarios where all the details and consequences are stipulated, so that we can better uncover our theoretical commitments about what matters in principle. I've previously flagged that this can be misleading: our intuitions about real-world situations may draw upon implicit knowledge of what those situations are like, and this implicit knowledge (when contrary to the explicit stipulations of the scenario) may distort our theoretical verdicts.
But it's even worse when the error goes the other way, and verdicts that only make sense given theoretical stipulations get exported into real-life situations where the stipulations do not hold. This can badly distort our understanding of how people should actually behave.
Our undergraduate students often protest the silly stipulations we build into our scenarios: "Why can't we rescue everyone from the tracks without killing anyone?" It's a good instinct! Alas, to properly engage with thought experiments, we have to abide by the stipulations. We learn (and train our students) to take moral trade-offs at face value, ignore likely downstream effects, and not question the apparent pay-offs for acting in dastardly ways.
This self-imposed simple-mindedness is a crucial skill for ethical theorizing. But it can be absolutely devastating to our practical judgment, if we fail to carefully distinguish ethical theory and practice.
Moral distortion from high stakes
A striking example of such philosophy-induced distortion comes from our theoretical understanding that sufficiently high stakes can justify overriding other values. This is a central implication of "moderate deontology": it's wrong to kill one as a means to save five, but obviously you should kill one innocent person if that's a necessary means to saving the entire world.
Now, crucially, in real life that is not actually a choice situation in which you could ever find yourself. The thought experiment comes with stipulated certainty; real life doesn't. So, much practical moral know-how comes down to having good judgment, including about how to manage your own biases so that you don't mistakenly take yourself to have fantastically strong reasons to do something that's actually disastrously counterproductive.
This is why utilitarians talk a lot about respecting generally-reliable rules rather than naively taking expected value (EV) calculations at face value. Taking our fallibility seriously is crucial for actually doing good in the world.
Higher stakes make it all the more important to choose the consequentially better option. But they don't inherently make it more likely that a disreputable-seeming action is consequentially better.
If "stealing to give" is a negative-EV strategy for ordinary charities, my default assumption is that it's negative-EV for longtermist causes too.[1] There are conceivable scenarios where that isn't so; but some positive argument is needed for thinking that any given real-life situation (like SBF's) takes this inverted form. Raising the stakes doesn't automatically flip the valence.
Many philosophers don't seem to understand this. Seth Lazar, for example, gave clear voice to (what we might call) academic philosophy's high stakes distortion when he was interviewed on Hi-Phi Nation last year.[2] Lazar claimed that it's "intellectually inconsistent" to simultaneously hold that (i) there are astronomical stakes to longtermism and x-risk reduction, and yet (ii) it's also really important that you act with integrity....

The Nonlinear Library, by The Nonlinear Fund
