What does it actually take to build a successful AI safety organization? I'm joined by Dr. Ryan Kidd, who has co-led MATS from a small pilot program into one of the field's premier talent pipelines. In this episode, he reveals the low-hanging fruit in AI safety field-building that most people are missing: the amplifier archetype. I pushed Ryan on some hard questions, from balancing funder priorities with research independence to building a robust selection process for both mentors and participants. Whether you're considering a career pivot into AI safety or already working in the field, this conversation offers practical advice on how to actually make an impact.

Chapters
• (00:00) - Intro
• (08:16) - Building MATS Post-FTX & Summer of Love
• (13:09) - Balancing Funder Priorities and Research Independence
• (19:44) - The MATS Selection Process
• (33:15) - Talent Archetypes in AI Safety
• (50:22) - Comparative Advantage and Career Capital in AI Safety
• (01:04:35) - Building the AI Safety Ecosystem
• (01:15:28) - What Makes a Great AI Safety Amplifier
• (01:21:44) - Lightning Round Questions
• (01:30:30) - Final Thoughts & Outro

Links
• MATS (https://matsprogram.org/apply?utm_sou...)

Ryan's Writing
• LessWrong post (https://www.lesswrong.com/posts/QzQQv...) - Talent needs of technical AI safety teams
• LessWrong post (https://www.lesswrong.com/posts/yw9B5...) - AI safety undervalues founders
• LessWrong comment (https://www.lesswrong.com/posts/tPjAg...) - Comment permalink with 2025 MATS program details
• LessWrong post (https://www.lesswrong.com/posts/WGNYA...) - Talk: AI Safety Fieldbuilding at MATS
• LessWrong post (https://www.lesswrong.com/posts/LvswJ...) - MATS Mentor Selection
• LessWrong post (https://www.lesswrong.com/posts/Yjiw5...) - Why I funded PIBBSS
• EA Forum post (https://forum.effectivealtruism.org/p...) - How MATS addresses mass movement building concerns

FTX Funding of AI Safety
• LessWrong post (https://www.lesswrong.com/posts/WGpFF...) - An Overview of the AI Safety Funding Situation
• Fortune article (https://fortune.com/2022/11/15/sam-ba...) - Why Sam Bankman-Fried’s FTX debacle is roiling A.I. research
• NY Times article (https://www.nytimes.com/2022/12/01/te...) - FTX probes $6.5M in payments to AI safety group amid clawback crusade
• Cointelegraph article (https://cointelegraph.com/news/crypto...) - FTX probes $6.5M in payments to AI safety group amid clawback crusade
• FTX Future Fund article (https://archive.is/JYJJP) - Future Fund June 2022 Update (archive)
• Tracxn page (https://tracxn.com/d/companies/anthro...) - Anthropic Funding and Investors

Training & Support Programs
• Catalyze Impact (https://catalyze-impact.org/)
• Seldon Lab (https://seldonlab.com/)
• SPAR (https://sparai.org/)
• BlueDot Impact (https://bluedot.org/)
• Y Combinator (https://www.ycombinator.com/)
• Pivotal (https://www.pivotal-research.org/)
• Athena (https://researchathena.org/)
• Astra Fellowship (https://www.constellation.org/program...)
• Horizon Fellowship (https://horizonpublicservice.org/prog...)
• BASE Fellowship (https://www.baseresearch.org/base-fel...)
• LASR Labs (https://www.lasrlabs.org/)
• Entrepreneur First (https://www.joinef.com/)

Funding Organizations
• Coefficient Giving (https://coefficientgiving.org/) (formerly Open Philanthropy)
• LTFF (https://funds.effectivealtruism.org/f...)
• Longview Philanthropy (https://www.longview.org/)
• Renaissance Philanthropy (https://www.renaissancephilanthropy.org/)

Coworking Spaces
• LISA (https://www.safeai.org.uk/)
• Mox (https://moxsf.com/)
• Lighthaven (https://lighthaven.space/)
• FAR Labs (https://www.far.ai/programs/far-labs)
• Constellation (https://www.constellation.org/)
• Collider (https://collider.nyc/)
• NET Office (https://emergingthreat.net/)
• BAISH (https://www.baish.com.ar/en)

Research Organizations & Startups ...