“Introducing 11 New AI Safety Organizations - Catalyze’s Winter 24/25 London Incubation Program Cohort” by Alexandra Bos, Catalyze Impact
EA Forum Podcast (All audio) - A podcast by EA Forum Team - Tuesdays

[Crossposted from: https://www.catalyze-impact.org/blog]

We are excited to introduce the eleven organizations that participated in the London-based Winter 2024/25 Catalyze AI Safety Incubation Program. Through the support of members of our Seed Funding Network, a number of these young organizations have already received initial funding. This program was open to both for-profit and non-profit founders from all over the world, allowing them to choose the structure that best serves their mission and approach. We extend our heartfelt gratitude to our mentors, funders, advisors, the LISA offices staff, and all participants who helped make this pilot incubation program successful. This post provides an overview of these organizations and their missions to improve AI safety. If you're interested in supporting these organizations with additional funding or would like to get involved with them in other ways, you'll find details in the sections below. To stay up to date on our [...]

---

Outline:
(01:31) The AI Safety Organizations
(03:11) 1. Wiser Human
(06:32) 2. [Stealth]
(09:15) 3. TamperSec
(12:15) 4. Netholabs
(14:48) 5. More Light
(17:40) 6. Lyra Research
(20:00) 7. Luthien
(21:52) 8. Live Theory
(23:40) 9. Anchor Research
(26:25) 10. Aintelope
(29:10) 11. AI Leadership Collective

The original text contained 1 image which was described by AI.

---

First published: March 10th, 2025

Source: https://forum.effectivealtruism.org/posts/GqhDM6FmcC3jnEocG/introducing-11-new-ai-safety-organizations-catalyze-s-winter

---

Narrated by TYPE III AUDIO.