
The road to hell is paved with optimal intentions. 📉 We investigate Effective Altruism (EA), the philosophy that began with malaria bed nets and ended in an $8 billion fraud. We expose the "Despotism Trap," where the obsession with maximizing future good is used to justify unethical power grabs today.
1. The Tale of Two Sams: We break down the collapse of Sam Bankman-Fried (SBF) and the boardroom coup against Sam Altman. Both men were driven by EA principles, yet those principles pushed them to opposite extremes: SBF used "earning to give" to justify massive financial fraud, while the OpenAI board risked destroying an $80 billion company to prevent an abstract future AI risk. We analyze how the same philosophy fueled both disasters.
2. The "Indecent Proposal": Would you be a slave to a benevolent billionaire? We discuss the core philosophical flaw of EA using the "Benevolent Bob" thought experiment. If a billionaire offered you a perfect life in exchange for total control, would you take it? Critics argue EA creates a moral framework that empowers "epistemically privileged" tech leaders to act without democratic constraints.
3. The Luxury Charity: We expose the hypocrisy of Wytham Abbey, a £17 million manor house bought by EA leaders as a "conference center." Critics calculated that the money could have saved roughly 5,600 lives if spent on malaria nets and argue that the movement has shifted from self-sacrifice to institutional empire-building.
By Morgrain