
Plutopia News Network

Podcast by Plutopia News Network

English

Personal stories & conversations

Limited offer

1 month for only DKK 9

Then DKK 99/month. Cancel anytime.

  • 20 audiobook hours per month
  • Podcasts only on Podimo
  • Free podcasts

Get started

Read more about Plutopia News Network

We talk to interesting people via podcast and weekly livestream.

All episodes

328 episodes

Bruce Schneier: Rewiring Democracy

On this episode of the Plutopia News Network podcast, security technologist and author Bruce Schneier [https://www.schneier.com/] joins the hosts to discuss his new book Rewiring Democracy: How AI Will Transform Our Politics, Government, and Citizenship [https://bookshop.org/a/52607/9780262049948]. Schneier frames democracy as an information-processing system that aggregates citizens’ preferences into policy, and defines AI broadly as computer systems doing tasks once done by humans. He argues that AI is fundamentally a power-amplifying tool: in the hands of small-d democrats it can strengthen participation, transparency, and decision-making, but in the hands of authoritarians or monopolistic tech corporations it can just as easily supercharge surveillance, manipulation, and control. Throughout the conversation, he emphasizes that many fears attributed to “AI itself” are really fears about capitalism, corporate power, and concentrated ownership of technology, and he offers real-world examples where AI is already helping journalism, courts, voters, and legislatures. Rather than utopian hype or doom, Schneier advocates a clear-eyed, politics-first view: AI’s impact on democracy will depend less on the technology and more on who controls it and how we choose to govern its use.

Bruce Schneier:

> But what we want to say is AI is a tool. It’s a power-enhancing tool. In the hands of someone who wants better democracy, it’s a tool for better democracy. In the hands of an authoritarian, it’s a tool for better authoritarianism. And that’s what it’s going to do. And a lot of times, people confuse the evils of AI with the evils of the corporations controlling the AI. And I think that is the most important thing that I say to the people who say nothing’s good here. And you’re right, in a lot of ways, nothing’s good there. It’s not the technology’s fault. It’s the fault of the monopolists. It’s the fault of Silicon Valley. It’s the fault of the white male tech billionaires. That’s where you want to address …

LINKS:

* TED talk: Dustin Ballard [https://www.ted.com/talks/dustin_ballard_is_ai_ruining_music]
* Story in The Verge on Swiss model Apertus [https://www.theverge.com/ai-artificial-intelligence/770646/switzerland-ai-model-llm-open-apertus]
* Text of “Franchise” [https://www.astro.sunysb.edu/fwalter/HON301/franchise.pdf] by Isaac Asimov
* DonorAtlas [https://www.donoratlas.com/]
* Albanian procurement minister Diella [https://en.wikipedia.org/wiki/Diella_(AI_system)]

The post Bruce Schneier: Rewiring Democracy [https://plutopia.io/bruce-schneier-rewiring-democracy/] first appeared on Plutopia News Network [https://plutopia.io].

Yesterday - 59 min

Próspera: Governance as a Service (GaaS)

On this episode of the Plutopia News Network podcast, hosts Jon, Wendy, and Scoop talk with Próspera [https://en.wikipedia.org/wiki/Pr%C3%B3spera] VP of Growth Lonis Hamaili [https://fee.org/articles/why-i-left-silicon-valley-for-a-honduran-startup-city/] and community development consultant David Armistead [https://www.thrivingbusiness.solutions/team] about Próspera, a privately managed “governance as a service” special economic zone in Honduras. Lonis explains that Próspera operates with its own autonomous legal, regulatory, and tax framework, somewhat analogous to Hong Kong or Dubai’s DIFC. It aims to attract global business, generate high-wage local jobs, and experiment with streamlined, market-driven regulation using mechanisms like insurance-based oversight and private arbitration courts. The conversation covers Próspera’s rapid growth since breaking ground in 2020, with hundreds of companies incorporated, thousands of jobs created, and a mixed public–private council that shares lawmaking power, along with revenue-sharing agreements that send a portion of tax revenue to the Honduran government. The guests also address criticisms and concerns, such as fears of deregulation, exploitation, power infrastructure, local opposition, and comparisons to micro-nations or freeports, arguing that Próspera is apolitical in practice, relies on voluntary participation, bans eminent domain, and is legally protected despite the repeal of Honduras’s ZEDE law. They close by touching on frontier ideas like longevity medicine, potential AI legal status, and possible expansion to other countries, framing Próspera as a real-world testbed for new governance and economic models rather than a fully formed utopia.

Lonis Hamaili:

> With Próspera we’re inventing a new industry we call governance as a service. Essentially we partner with host countries, in this case Honduras, and create these zones that have very special and autonomous laws, regulations and practices. You can think of it as a little bit like Hong Kong in China, at least as it used to be, where Hong Kong is part of China, but it has its own completely independent political and economic systems. Similarly, with Próspera here in Honduras, we have our own independent system. However, it’s privately managed, right? So we help with the governance operations, which include providing the laws, the security, the justice system. Our business model is access.

The post Próspera: Governance as a Service (GaaS) [https://plutopia.io/prospera-governance-as-a-service-gaas/] first appeared on Plutopia News Network [https://plutopia.io].

Nov 24, 2025 - 59 min

Jennifer Granick: Surveillance and Cybersecurity

On this episode of the Plutopia News Network podcast, Jon, Scoop and Wendy talk with Jennifer Granick, Surveillance and Cybersecurity Counsel at the ACLU [https://www.aclu.org/], about the expanding machinery of government and corporate surveillance and its threat to civil liberties and democracy. Jennifer explains how long-standing rules limiting government use and combination of personal data have eroded, enabling massive dossiers on citizens and immigrants built from government records, data brokers, apps, and new technologies like ubiquitous location tracking, spyware, and facial recognition. She highlights how border zones and immigration enforcement operate as Fourth Amendment “gray areas,” how ICE and other agencies exploit data broker loopholes, and how surveillance harms vulnerable people, from abortion seekers to benefit recipients wrongly flagged for fraud. The conversation also covers the politics and dangers of spyware, the importance and limits of tools like Signal [https://signal.org/], the role of hackers and security researchers in exposing abuses, and the way popular media normalizes surveillance as necessary for safety. Jennifer closes by stressing practical self-defense steps, the need to understand one’s “threat model,” and the importance of legal and political resistance, reminding listeners that although the situation is alarming, organized pushback can still win real protections.

Jennifer Granick:

> I think one of the biggest new things is that the rules that we had have kind of been thrown away. There were just these expectations that data I gave to the government in order to get Medicare or in order to get food stamps or something of that nature was going to stay used for those purposes. And there are rules about how the government is permitted to combine databases of information and when it’s allowed to do that. And what we’ve seen is a complete ignoring of those rules and this amalgamation of different databases of information into a dossier of people in the country, not just people who are immigrants, but also people who have been born here and are citizens as well. And you put together all these disparate pieces of information and it tells you a lot, maybe almost everything, about somebody.

The post Jennifer Granick: Surveillance and Cybersecurity [https://plutopia.io/jennifer-granick-surveillance-and-cybersecurity/] first appeared on Plutopia News Network [https://plutopia.io].

Nov 18, 2025 - 1 h 1 min

Pete Cochrane: Pursuing Truth

Technologist and former British Telecom [https://www.bt.com/] chief scientist Peter Cochrane [https://cochrane.org.uk/bio/] joins the Plutopia News Network podcast to talk about his lifelong pursuit of truth and his work on a “truth engine” that used AI to grade the reliability of news sources and authors. Cochrane argues that real truth is hard, costly, and collaborative — unlike social media, which feeds users comforting falsehoods that match their worldview — and warns that losing a shared grip on truth threatens civilization. Drawing on his career in communications, AI, and cybersecurity, Peter explains how he boosted lie detection rates by tracking sources over time, factoring in bias, and adding linguistic and psychological analysis, pushing accuracy toward 95%. The conversation widens into science as an ongoing search rather than final certainty, the distortions of corporate media, the risks and inevitability of AI-driven systems like driverless cars, and his own experiment living with AI-assisted hearing. Throughout, Cochrane stays optimistic but insistent on building human and machine ethics, noting that technology should be judged by whether it improves on fallible humans and helps us keep truth at the center of society.

Peter Cochrane:

> Truth is very expensive. It costs you a lot of time, energy, concentration. You have to have these inner arguments. You have discussions with other people and you gradually zero down to an opinion based on the facts. Whereas on Facebook it is easy. You just believe it. And it’s so outrageous — that it fits your world model. That’s the worst aspect. The whole of social media is tuned to your social or world model, and they just feed you the stuff that reinforces your belief system. I think that can be said of most religions, they do the same thing. They feed you the story from being a child continually till it becomes perfect.

The post Pete Cochrane: Pursuing Truth [https://plutopia.io/pete-cochrane-pursuing-truth/] first appeared on Plutopia News Network [https://plutopia.io].

Nov 10, 2025 - 1 h 2 min

Sophie Nightingale: Our Minds on Digital Technology

The Plutopia podcast hosts Dr. Sophie Nightingale, a psychologist at Lancaster University [https://nightingalelab.co.uk/], to discuss how digital technology — especially social media, generative AI, and the constant flow of online information — shapes human memory, judgment, and vulnerability to deception. She explains that people struggle to critically evaluate the sheer volume of information they encounter, so they’re more likely to accept content that aligns with their preexisting beliefs, and this helps misinformation spread. Nightingale traces her research from early work on how taking photos can impair memory to current studies showing that most people can spot fake or AI-generated images only slightly better than chance, and even training improves performance only modestly. She and the hosts dig into the limits of AI “guardrails,” the uneven global landscape of AI regulation, the rise of misogynistic online spaces, and the troubling growth of AI-enabled nonconsensual intimate imagery, arguing that legal reform, platform accountability, and public education are all needed to reduce harm.

Sophie Nightingale:

> One of the things that tends to make people quite susceptible is just information overload, purely that we live in an age where we are accessing so much information all the time we can’t possibly interpret, or critically think about, everything. So we might well just accept things that we wouldn’t otherwise. There’s quite a lot of evidence showing that’s especially the case if that information coincides with your pre-existing beliefs. So for example, if I happen to be a huge fan of Donald Trump, let’s say, and I saw some misinformation around Donald Trump that was positive about him, then I would probably be more likely to believe that than somebody who was not a fan of Donald Trump already, if you see what I mean. So those biases definitely exist. There’s a lot of evidence showing that. And then I think, you know, it kind of comes back as well to — if you want to believe something, you will.

The post Sophie Nightingale: Our Minds on Digital Technology [https://plutopia.io/sophie-nightingale-our-minds-on-digital-technology/] first appeared on Plutopia News Network [https://plutopia.io].

Nov 4, 2025 - 1 h 2 min
