LessWrong posts by zvi

The Big Nonprofits Post 2025

There remain lots of great charitable giving opportunities out there.

I have now had three opportunities to be a recommender for the Survival and Flourishing Fund (SFF). I wrote in detail about my first experience back in 2021, when I struggled to find worthy applications.

The second time around, in 2024, there was an abundance of worthy causes. In 2025 there were even more high-quality applications, many of which were growing beyond our ability to support them.

Thus, this is the second edition of The Big Nonprofits Post. It is primarily aimed at sharing my findings on organizations I believe are doing good work, helping you find places to consider donating in the cause areas and intervention methods you think are most effective, and offering my general perspective on how I think about choosing where to give.

This post combines my findings from the 2024 and 2025 rounds of SFF, and also covers some organizations that did not apply to either round, so inclusion does not mean an organization necessarily applied.

This post is already very long, so the bar is higher for inclusion this year than it was [...]

---

Outline:

(01:40) A Word of Warning

(02:50) A Note To Charities

(03:53) Use Your Personal Theory of Impact

(05:40) Use Your Local Knowledge

(06:41) Unconditional Grants to Worthy Individuals Are Great

(09:00) Do Not Think Only On the Margin, and Also Use Decision Theory

(10:03) Compare Notes With Those Individuals You Trust

(10:35) Beware Becoming a Fundraising Target

(11:02) And the Nominees Are

(14:34) Organizations that Are Literally Me

(14:49) Balsa Research

(17:31) Don't Worry About the Vase

(19:04) Organizations Focusing On AI Non-Technical Research and Education

(19:35) Lightcone Infrastructure

(22:09) The AI Futures Project

(23:50) Effective Institutions Project (EIP) (For Their Flagship Initiatives)

(25:29) Artificial Intelligence Policy Institute (AIPI)

(27:08) AI Lab Watch

(28:09) Palisade Research

(29:20) CivAI

(30:15) AI Safety Info (Robert Miles)

(31:00) Intelligence Rising

(31:47) Convergence Analysis

(32:43) IASEAI (International Association for Safe and Ethical Artificial Intelligence)

(33:28) The AI Whistleblower Initiative

(34:10) Organizations Related To Potentially Pausing AI Or Otherwise Having A Strong International AI Treaty

(34:18) Pause AI and Pause AI Global

(35:45) MIRI

(37:00) Existential Risk Observatory

(37:59) Organizations Focusing Primarily On AI Policy and Diplomacy

(38:37) Center for AI Safety and the CAIS Action Fund

(40:17) Foundation for American Innovation (FAI)

(43:07) Encode AI (Formerly Encode Justice)

(44:12) The Future Society

(45:08) Safer AI

(45:47) Institute for AI Policy and Strategy (IAPS)

(46:55) AI Standards Lab (Holtman Research)

(48:01) Safe AI Forum

(48:40) Center For Long Term Resilience

(50:20) Simon Institute for Longterm Governance

(51:16) Legal Advocacy for Safe Science and Technology

(52:25) Institute for Law and AI

(53:07) Macrostrategy Research Institute

(53:41) Secure AI Project

(54:20) Organizations Doing ML Alignment Research

(55:36) Model Evaluation and Threat Research (METR)

(57:01) Alignment Research Center (ARC)

(57:40) Apollo Research

(58:36) Cybersecurity Lab at University of Louisville

(59:17) Timaeus

(01:00:19) Simplex

(01:00:52) Far AI

(01:01:32) Alignment in Complex Systems Research Group

(01:02:15) Apart Research

(01:03:20) Transluce

(01:04:26) Organizations Doing Other Technical Work

(01:04:31) AI Analysts @ RAND

(01:05:23) Organizations Doing Math, Decision Theory and Agent Foundations

(01:06:44) Orthogonal

(01:07:38) Topos Institute

(01:08:34) Eisenstat Research

(01:09:16) AFFINE Algorithm Design

(01:09:45) CORAL (Computational Rational Agents Laboratory)

(01:10:35) Mathematical Metaphysics Institute

(01:11:40) Focal at CMU

(01:12:57) Organizations Doing Cool Other Stuff Including Tech

(01:13:08) ALLFED

(01:14:46) Good Ancestor Foundation

(01:16:09) Charter Cities Institute

(01:16:59) Carbon Copies for Independent Minds

(01:17:40) Organizations Focused Primarily on Bio Risk

(01:17:46) Secure DNA

(01:18:43) Blueprint Biosecurity

(01:19:31) Pour Domain

(01:20:19) ALTER Israel

(01:20:56) Organizations That Can Advise You Further

(01:21:33) Effective Institutions Project (EIP) (As A Donation Advisor)

(01:22:37) Longview Philanthropy

(01:24:08) Organizations That Then Regrant to Fund Other Organizations

(01:25:19) SFF Itself (!)

(01:26:52) Manifund

(01:28:51) AI Risk Mitigation Fund

(01:29:39) Long Term Future Fund

(01:31:41) Foresight

(01:32:31) Centre for Enabling Effective Altruism Learning & Research (CEELAR)

(01:33:28) Organizations That Are Essentially Talent Funnels

(01:35:24) AI Safety Camp

(01:36:07) Center for Law and AI Risk

(01:37:16) Speculative Technologies

(01:38:10) Talos Network

(01:38:58) MATS Research

(01:39:45) Epistea

(01:40:51) Emergent Ventures

(01:42:34) AI Safety Cape Town

(01:43:10) ILINA Program

(01:43:38) Impact Academy Limited

(01:44:15) Atlas Computing

(01:44:59) Principles of Intelligence (Formerly PIBBSS)

(01:45:52) Tarbell Center

(01:47:08) Catalyze Impact

(01:48:11) CeSIA within EffiSciences

(01:49:04) Stanford Existential Risk Initiative (SERI)

(01:49:52) Non-Trivial

(01:50:27) CFAR

(01:51:35) The Bramble Center

(01:52:29) Final Reminders

---

First published: November 27th, 2025

Source: https://www.lesswrong.com/posts/8MJQFHBWJgJ82FALJ/the-big-nonprofits-post-2025-1

---

Narrated by TYPE III AUDIO.
