
Doom Debates!

Podcast by Liron Shapira

English

True crime


About Doom Debates!

It's time to talk about the end of the world. With your host, Liron Shapira. lironshapira.substack.com

All episodes

151 episodes


Did Eliezer Yudkowsky Really Call for VIOLENCE? — Debate with John Alioto

My guest, John Alioto, is an independent AI engineer with a computer science degree from UC Berkeley and 25 years building real-world AI systems at companies like Microsoft and Google. In the wake of an attack on Sam Altman’s property, John came on the show to argue that Yudkowsky’s words are violent rhetoric that helped create this moment. Since I completely disagree with that characterization, we had plenty of fuel for a passionate debate.

For the record, here’s my position on why AI doomers are NOT “calling for violence”: Are we acting like we actually think there’s an urgent extinction risk? Yes. Are we calling for lawless violence? Absolutely not — at least not me, the leaders of the movement, or anyone I’ve ever personally interacted with. Are we calling for violence as a last resort if a government policy has been established and then egregiously violated? Yes… but that’s just standard for any governance proposal! A proposal for a strictly enforced treaty isn’t a call for violence — it’s a call for doing everything we can to make sure no one breaks the treaty, with zero violence, unless rogue actors decide to break the treaty and bring the consequences on themselves.

Thanks to John for an extremely respectful and good-faith debate on this heated subject.

Timestamps
00:00:00 — Cold Open
00:00:37 — Introducing John Alioto
00:03:02 — Setting the Stage: Recent Acts of Violence & AI Discourse
00:05:53 — Eliezer Yudkowsky's 2023 TIME Article
00:11:16 — John's Two-Part Argument
00:14:37 — Conditional on High P(Doom), Is Eliezer's Policy Bad?
00:17:46 — Be Like Carl Sagan — Win in the Arena of Ideas
00:21:12 — No Carve-Outs for Non-Signatories
00:26:15 — Hypothetical: What If the UN Voted for a Treaty?
00:30:46 — What's the Correct Interpretation of Eliezer's TIME Article?
00:32:42 — Liron's Interpretation: Same Structure as Any International Law Proposal
00:42:23 — What Should Eliezer Have Written? "Airstrikes" vs "Strong Deterrent"
00:49:57 — How John Would Rewrite the TIME Piece
00:50:54 — Carve-Outs: Allies, Civilians, Consequences
00:52:52 — Debate Wrap-Up
00:56:27 — Last Q: Does High P(Doom) Imply Violence?
00:59:40 — Closing Thoughts

Links
John P. Alioto on X (Twitter) — https://x.com/jpalioto
Eliezer Yudkowsky, “Pausing AI Developments Isn’t Enough. We Need to Shut It All Down” (Time, March 2023) — https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-not-enough/

Doom Debates’ Mission is to raise mainstream awareness of imminent extinction from AGI and build the social infrastructure for high-quality debate. Support the mission by subscribing to my Substack at DoomDebates.com and to youtube.com/@DoomDebates, or to really take things to the next level, donate at https://doomdebates.com/donate 🙏 Get full access to Doom Debates at lironshapira.substack.com/subscribe

Yesterday - 1 h 1 min

Are AI Doomers “Calling for Violence”? Debate with Steven Balik

Are AI safety advocates like Eliezer Yudkowsky at fault for the recent attacks on Sam Altman because they are “calling for violence”? I invited Steven Balik to join me on this emergency episode to hash it out. Steven is an activist short seller and data engineering professional whose Substack is popular among Silicon Valley VCs and hedge funds.

Timestamps
00:00:00 — Cold Open
00:00:52 — Introducing Steven Balik
00:01:24 — Setting the Stage: Molotov Cocktail Incident
00:03:31 — Steven’s Opening Position
00:06:10 — Is Eliezer Yudkowsky “Calling for Violence”?
00:07:25 — Steven on AI, Yudkowsky, the Zizians & Escalating Rhetoric
00:12:16 — Focusing on the Time Article
00:18:51 — Who’s Responsible for the Violence?
00:25:33 — Debating the Key Quote in Yudkowsky’s Time Article
00:31:07 — Liron Passes the Ideological Turing Test
00:45:42 — Liron & Steven Find Common Ground
00:46:57 — Why Does Steven Call Eliezer Yudkowsky an “Esoterrorist”?
00:48:51 — Wrapping Up: Deescalating the Situation

Links
Steven Balik on X (Twitter) — https://x.com/laurenbalik
Steven Balik, “The Talmudic Stock Bubble, AI Psychosis, & Esoterrorism” (Substack, October 2025) — https://laurenbaliksalmanacandrevue.substack.com/p/the-talmudic-stock-bubble-ai-psychosis
Eliezer Yudkowsky, “Pausing AI Developments Isn’t Enough. We Need to Shut It All Down” (Time, March 2023) — https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-not-enough/

16 Apr 2026 - 51 min

Tristan Harris and Ted Tremper are WAKING UP Humanity to AI Extinction!

The AI Doc: Or How I Became an Apocaloptimist could become the most important movie of our generation. It’s a new film from the producers of the Oscar-winning Everything Everywhere All at Once. Tristan Harris, a subject in the film, is well known for his role in Netflix’s The Social Dilemma and is the co-founder of the Center for Humane Technology. Ted Tremper, a producer on the film, is the interim Executive Director of the Creators’ Coalition on AI.

Timestamps
00:00:00 — Cold Open
00:01:20 — Introducing Tristan Harris and Ted Tremper
00:04:31 — The Genesis of The AI Doc
00:12:48 — Tristan’s Journey From Social Media to AI
00:14:30 — Updating From AI Skeptic to AI Risk Aware
00:20:31 — How They Convinced the AI CEOs to Agree to Be Interviewed
00:28:58 — Ted’s Journalism Advice, Working on Borat Subsequent Moviefilm
00:30:37 — Tristan, What’s Your P(Doom)™?
00:34:30 — The Resource Curse: What AI Revenue Does to a Society
00:44:10 — Ted, What’s Your P(Doom)™?
00:46:34 — Reacting to Demis Hassabis’ Statement That AGI Development Is Inevitable
00:49:52 — Liron Sharpens the Criticism Towards AGI Builders
00:55:35 — AGI Developers Claim to Want International Cooperation, But Have They Really Tried?
01:04:30 — What Should Be the Single Takeaway for Concerned Viewers?
01:11:40 — Building a Coalition Against Superintelligence Development: From Bernie to Bannon
01:19:52 — Take Action at TheAIDocGetInvolved.com
01:24:40 — Tristan’s Closing Message: We’ve Done This Before

Links
Watch The AI Doc — https://www.focusfeatures.com/the-ai-doc-or-how-i-became-an-apocaloptimist
Get Involved with The AI Doc Community — https://theaidocgetinvolved.com/
Tristan Harris, Wikipedia — https://en.wikipedia.org/wiki/Tristan_Harris
Ted Tremper, IMDb — https://www.imdb.com/name/nm3998229/
Center for Humane Technology — https://www.humanetech.com/
“The Intelligence Curse” by Luke Drago and Rudolf Laine — https://intelligence-curse.ai/

14 Apr 2026 - 1 h 27 min

I Challenged DON’T LOOK UP’s Screenwriter to Look Up At AGI

David Sirota helped create “Don’t Look Up,” and sometimes it feels like we’re living inside his movie. Does he share my belief that the looming planetary threat is rogue AI? Sirota is an award-winning investigative journalist, bestselling author, and former speechwriter for Bernie Sanders. He was nominated for an Oscar for co-writing the story of Don’t Look Up. Find out more about David’s work at The Lever: https://www.levernews.com/

Timestamps
00:00:00 — Cold Open
00:01:20 — Introducing David Sirota
00:04:34 — Why David Fights Against Power and the Concentration of Power
00:13:46 — From NAFTA to AI: The Warnings We Ignored
00:22:05 — How Big Will the AI “Jobpocalypse” Be?
00:25:28 — Superintelligence & the Parallel to Don’t Look Up
00:28:37 — What’s Your P(Doom)™?
00:31:44 — The Speed of the AI Threat
00:36:26 — Society Is Losing a Collective Capacity to Focus
00:38:34 — Is Climate Change David’s Biggest Existential Concern?
00:45:01 — David Reacts to Bernie Sanders’ Data Center Moratorium Proposal
00:49:11 — Can We Build The “Off Button”?
00:52:08 — “Don’t Look Up” x AGI Mashup
00:54:35 — Why There’s Still Hope
00:58:14 — Living in “Don’t Look Up”
00:59:46 — Wrap-Up: Where to Follow Major AI News

Links
Watch Don’t Look Up — https://www.netflix.com/title/81252357
The Lever, investigative news outlet — https://www.levernews.com/
David Sirota on X — https://x.com/davidsirota
David Sirota, Wikipedia — https://en.wikipedia.org/wiki/David_Sirota
Master Plan podcast — https://the.levernews.com/master-plan/
David Sirota, “Hostile Takeover” on Amazon — https://www.amazon.com/Hostile-Takeover-Corruption-Conquered-Government/dp/0307237354
The Three-Body Problem (novel), Wikipedia — https://en.wikipedia.org/wiki/The_Three-Body_Problem_(novel)
WarGames (1983 film), Wikipedia — https://en.wikipedia.org/wiki/WarGames
Adam McKay, Wikipedia — https://en.wikipedia.org/wiki/Adam_McKay
AI 2027 scenario — https://ai-2027.com/

7 Apr 2026 - 1 h 1 min

I Went On Jubilee's Middle Ground To Warn About AI Extinction!

The popular debate show Middle Ground by Jubilee invited me on to take the "anti-AI" side back in April 2024. This highlight reel shows my experience of the discussion. I'm a lifelong techno-optimist, and it's unnatural for me to represent an anti-tech position. It's just that our AI labs are admitting they're on the path to a superintelligence they don't know how to control, which implies we're all about to die and the universe will be robbed of all value forever before our kids grow up. Other than that one little consideration, I'm normally pro-tech!

Timestamps
00:00:00 — Why Liron Is Worried About AI
00:02:13 — The Nuclear Analogy
00:02:43 — Human Evolution and Neuralink
00:03:14 — The AI Labs’ Own Warnings

Links
Full episode on YouTube — https://www.youtube.com/watch?v=47fGrqzoFr8

2 Apr 2026 - 4 min
