Truth Collapse

Truth Collapse is the erosion of shared and personal truth caused by mediating our learning, communication, and decision-making through algorithms whose goals aren't aligned with human well-being. We need to work together to fix it. Read more at truthcollapse.com

Listen on:

  • Apple Podcasts
  • Podbean App
  • Spotify

Episodes

Thursday Oct 30, 2025

How did the digital tools we built with optimism become weapons of disinformation? Dan Brown, host of the Unchecked podcast and principal at Curious Squid, joins Noz Urbina to explore the uncomfortable question of designer complicity in our epistemological crisis. They discuss how systems built for connection now fragment truth, why community moderation fails, and what designers can do differently when building information spaces that resist manipulation.
"Information spaces are not well designed to encourage lateral reading... we need to design an information space that not only demands features but that are grounded in practices like lateral reading." – Dan Brown
Key Findings
Designer complicity in disinformation systems - The uncomfortable question of whether UX/tech professionals unknowingly helped build systems that enable misinformation, though most lacked visibility into the bigger picture 10-15 years ago
Information architecture's role in truth collapse - How the systems that mediate between content creators and consumers have become vulnerable to manipulation and need a "structural revolution" similar to accessibility and responsive design
The failure of professional fact-checking - Why traditional fact-checking has become ineffective due to the "truth inversion" where qualified sources are automatically distrusted as part of the conspiracy
Community moderation's fundamental flaws - How user-flagging puts truth to a vote and creates filter bubbles where communities reinforce their existing beliefs rather than correcting misinformation
Lateral reading as a design principle - The need to build information spaces that encourage cross-referencing multiple sources rather than consuming single feeds
Ground News as a positive example - How this app restores context by showing source ownership, bias spectrum, and multiple perspectives on the same story
Hashtag and tagging weaponisation - Examples like #NotAllMen, #SaveTheChildren, and Amazon's vaccine category showing how information systems get co-opted for disinformation
The destruction of context online - How the modern web strips away temporal and source context, presenting 20-year-old content alongside yesterday's news without distinction
Personal responsibility and speaking up - Noz's regret about waiting to address authoritarianism concerns and the importance of recognising warning signs from other countries' experiences
The internet as humanity's mirror - The optimistic view that unprecedented visibility and self-awareness created by digital systems will ultimately help society correct course, despite current challenges
What you'll learn 
Why fact-checking has become a polarised and ineffective solution
How hashtags and tagging systems get weaponised for disinformation
The difference between designing for engagement versus designing for truth
Why community moderation creates filter bubbles instead of stopping misinformation
How "lateral reading" could transform information architecture
What designers didn't know 10 years ago about their role in democracy's decline
Practical approaches to building more resilient information spaces
Why secrecy is harder in the spotlight-addicted digital age
 
🔗 CONNECT WITH NOZ:
LinkedIn: https://www.linkedin.com/in/nozurbina/
Consultancy: https://urbinaconsulting.com/
Truth Collapse: https://truthcollapse.com/
 
🎙️ ABOUT TRUTH COLLAPSE:
The Truth Collapse Podcast explores how we come to believe what we believe in an age of algorithmic manipulation and epistemological crisis. Hosted by Noz Urbina, we examine the intersection of technology, information design, and democracy.
Website: https://truthcollapse.com/
 
📧 GET IN TOUCH: Have questions or want to be a guest?
Email: contact@truthcollapse.com
 

Thursday Aug 14, 2025

What happens when the technology designed to connect us becomes the very force that divides us? In this provocative episode of the Truth Collapse podcast, host Noz Urbina and guest Lasse Rindom tear apart our comfortable assumptions about truth, identity, and connection in the digital age.
Starting with a seemingly simple observation that "demystifying doesn't mean simplifying," the conversation spirals into a philosophical exploration of how we've traded truth for attention, nuance for outrage, and genuine diversity for performative categorisation. Through personal stories of being misidentified, misunderstood, and caught between cultures, they reveal how the internet's promise of bringing us together has instead created rigid categories that we defend with religious fervour.
But this isn't just another doom-and-gloom tech critique. Drawing on philosophy from Jürgen Habermas to John Rawls, the duo explores surprising solutions: why tolerance (not love) might save democracy, how Europe's post-WWII peace came from an uncomfortable truth about homogeneity, and why we need to stop seeking "things without the thing"—from decaf coffee to digital relationships. It's a conversation that will make you rethink every label you use, every online argument you've had, and whether there was ever any truth to collapse in the first place.
 
"Demystifying does not mean simplifying. I don't think anything needs to be simple. If things are complex, they're complex, then you just have to explain them over and over and over until people somehow understand some of it." — Lasse Rindom
"I'd love if we were discussing truth, information, data—we're not, unfortunately. We are all discussing attention. That's the only currency in the world right now." — Lasse Rindom

Friday Jun 20, 2025

In this episode, Noz Urbina and Ilya Venger, Data and AI Product Leader at Microsoft, explore the profound societal implications of AI. Moving beyond workplace applications, they dive deep into how AI systems are fundamentally altering human identity, relationships, and society itself.
Ilya brings his unique perspective from building AI systems at Microsoft to examine uncomfortable questions about where these technologies are taking us as humans. The conversation begins with a seemingly simple scenario - AI avatars attending meetings on our behalf - but quickly expands into fundamental questions about identity ownership, cultural bias, and human agency.
Both Noz and Ilya explore how we're moving from AI systems that exploit our dopamine responses (social media algorithms) to what Ilya calls "oxytocin hijacking" - AI that simulates emotional understanding and relationships. They examine how corporations might extract and own digital versions of employees, how different cultures encode conflicting values into AI systems, and why the exponential pace of change risks leaving most of humanity behind.
 
"What happens when we've got a lot of different avatars joining in the meeting, and then you've got just one human, or you have no humans at all?" – Ilya Venger
"The oxytocin hijacking is the next level, where we are seeing the models of the chatbots trying to make us like them." – Ilya Venger

Wednesday Jun 04, 2025


In this episode, Noz Urbina explores with content engineering consultant Rafaela Ellensburg how AI systems are eroding trust and degrading how information is perceived in society.
Drawing parallels between financial systems and information systems, Rafaela shares her perspective on misinformation, systemic issues created by engagement-driven algorithms, and the urgent need for structured approaches to AI governance.
She uses the "gold standard" metaphor, where removing standards from both money supply and information creation leads to value collapse. The discussion emphasises practical solutions, including semantic frameworks, knowledge graphs, and circular content thinking to ensure organisations contribute value rather than noise to the information ecosystem.
Key themes include moving from volume-based to value-based content creation, implementing human-centred AI governance, and creating sustainable content practices that serve both business goals and human well-being.
"We as humans need to become owners of our reality. We need to own that and use that to govern those artificial things that we are able to create through the systems. The way in which we could do that is taking our reality and translating the concepts that are meaningful to be machine-readable."
"I think if people start to transition from a more linear way of thinking into a more circular way of thinking about content creation, then you will also see the importance of being able to trace back whatever that content tangibly offered or created value... Content is a living thing, and we are communicating it to living people." – Rafaela Ellensburg

Wednesday May 21, 2025

What will your career look like when AI can do many knowledge work tasks better than humans? In this thought-provoking episode, Noz Urbina speaks with Amir Feizpour, entrepreneur and former quantum physicist, about how AI will transform not just what we do but how we define ourselves. They explore the concept of "identity diversification" as a critical strategy for navigating our rapidly changing future, examining both the potential benefits of technology as an equaliser and the inevitable societal disruption that accompanies dramatic technological change.
 
"I don't think loss of job is the right way to talk about what's happening. Loss of job description is what is happening."
 
"What are the business models that we need to find to make protecting our emotional and societal integrity profitable?"
"Where is this going? It's going to take a lot of emotional toll on people. More than all of the doomsday conversations about AI taking over, I'm worried about emotional backlash of people because they're not ready to go through the space of change." – Amir Feizpour
 
What you'll learn
- How AI is changing knowledge work roles faster than ever before
- Why “identity diversification” is crucial for emotional resilience
- How physical spaces and architecture have been reorganised by technology
- Ways technology can act as an equaliser in entrepreneurship
- The concept of multi-agent human-machine collaboration
- How societal reorganisation may occur during technological transitions
- Strategies for finding meaning beyond your occupation
 
00:00 - Introduction
00:20 - Introducing Amir Feizpour
01:42 - Amir's background as a quantum physicist and entrepreneur
02:28 - Discussion of the future of knowledge work
03:00 - Definition of knowledge work and its evolution
06:23 - Examples of AI-human collaboration in entrepreneurship
09:19 - The electricity analogy for how AI removes constraints
11:50 - Impact on identity and reorganisation of physical spaces
17:58 - The French cultural approach to work identity
19:37 - Identity diversification as a coping strategy
22:10 - Technology as an equaliser for opportunity
25:06 - Entrepreneurship journey and identity evolution
27:36 - Concerns about wealth concentration and inequality
31:29 - Power law distribution and technology's equalising effect
33:46 - Managing transition pain during technological disruption
34:46 - Historical parallel to the Reformation and printing press
37:17 - Hope for human-centred technology solutions
39:29 - Finding business models to protect emotional and social stability
40:35 - Conclusion

Monday Apr 14, 2025

What happens when our entire information environment is designed to prioritise engagement over truth?
Noz Urbina, founder of Urbina Consulting and expert in content design and information systems, launches a new podcast exploring the "Truth Collapse" - the AI-driven meta-crisis undermining our ability to have productive conversations about everything from climate change to politics.
In the first episode, he examines how algorithms have been reshaping our information landscape for decades and why they're creating increasingly anxious, polarised societies.
What you'll learn: 
How truth is established in societies and why algorithms are reversing this process
Why engagement-driven media promotes emotionally charged content over factual information
How algorithms have been shaping our information landscape for over 20 years
The three levels where truth collapse affects us: personal, interpersonal, and societal
Practical approaches to managing information overload and algorithm influence
How the "anxious generation" is being affected by constant digital content exposure
Why institutions like media and government are struggling in the engagement economy

Copyright 2025 All rights reserved.
