

[Crossposted from: https://www.catalyze-impact.org/blog]
We are excited to introduce the eleven organizations that participated in the London-based Winter 2024/25 Catalyze AI Safety Incubation Program.
Through the support of members of our Seed Funding Network, several of these young organizations have already received initial funding. This program was open to both for-profit and non-profit founders from all over the world, allowing them to choose the structure that best serves their mission and approach. We extend our heartfelt gratitude to our mentors, funders, advisors, the staff at the LISA offices, and all participants who helped make this pilot incubation program a success.
This post provides an overview of these organizations and their missions to improve AI safety. If you're interested in supporting these organizations with additional funding or would like to get involved with them in other ways, you'll find details in the sections below.
To stay up to date on our [...]
---
Outline:
(01:29) The AI Safety Organizations
(03:10) 1. Wiser Human
(06:31) 2. [Stealth]
(09:14) 3. TamperSec
(12:14) 4. Netholabs
(14:46) 5. More Light
(17:38) 6. Lyra Research
(19:59) 7. Luthien
(21:51) 8. Live Theory
(23:39) 9. Anchor Research
(26:23) 10. Aintelope
(29:09) 11. AI Leadership Collective
---
Narrated by TYPE III AUDIO.
