
Foresight Institute Radio  

Author: Foresight Institute

Foresight Institute Radio features the most cutting-edge talks and seminars from our workshops: fresh insights on advanced AI, nanotech, longevity biotech, and beyond. See the slides and demos on YouTube, and follow @ForesightInst on X for real-time updates. For polished, in-studio interviews, check out our sister feed, The Existential Hope Podcast. Foresight Institute is an independent nonprofit devoted to steering emerging technologies toward beneficial futures. Hosted on Acast. See acast.com/privacy for more information.

Language: en

Genres: Science, Technology



Eliezer Yudkowsky vs Mark Miller | ASI Risks: Similar premises, opposite conclusions
Wednesday, 24 September, 2025

What are the best strategies for addressing extreme risks from artificial superintelligence? In this 4-hour conversation, decision theorist Eliezer Yudkowsky and computer scientist Mark Miller discuss their cruxes for disagreement. They examine the future of AI, existential risk, and whether alignment is even possible. Topics include AI risk scenarios, coalition dynamics, secure systems like seL4, hardware exploits like Rowhammer, molecular engineering with AlphaFold, and historical analogies like nuclear arms control. They explore superintelligence governance, multipolar vs singleton futures, and the philosophical challenges of trust, verification, and control in a post-AGI world.

Moderated by Christine Peterson, the discussion seeks the least risky strategy for reaching a preferred state amid superintelligent AI risks. Yudkowsky warns of catastrophic outcomes if AGI is not controlled, while Miller advocates decentralizing power and preserving human institutions as AI evolves.

The conversation spans AI collaboration, secure operating frameworks, cryptographic separation, and lessons from nuclear non-proliferation. Despite their differences, both aim for a future where AI benefits humanity without posing existential threats.

 

We also recommend:


HKPUG Podcast
HKPUG

warzone
Tolga and Alex

AvalonCast

Sugarenia and Stelabouras make a podcast (#ssmap)
Sugarenia, Stelabouras



Spacemusic (Season 1)
spacemusic.nl

UX Week 2008
Adaptive Path

LEXION.tv
CLAAS of America Inc.

Chapter by Chapter Radio Program
Pastor Sandy Adams

Ushcast
Ushahidi

Flipping Tables
Sunrise Robot