
Memory Without Witness, Truth Without Origin - The Deeper Thinking Podcast
The Deeper Thinking Podcast
A slow meditation on truth without origin, memory without witness, and the subtle loss of metaphor in a world rendered by machines.
What if the future didn’t arrive with force, but with recursion? In this episode, we introduce the theory of Recursive Plausibility: the idea that truth, memory, and presence are increasingly simulated by machines trained on their own outputs. Drawing from the work of Brian Massumi, Lauren Berlant, Avery Gordon, and contemporary epistemology, we explore how AI doesn’t just displace human thought—it inherits a world already withdrawing from its capacity to remember, misread, or remain.
This is not a manifesto or forecast. It’s a conceptual walk through the fading boundaries between simulation and sensation. We reflect on simulation theory, ambient estrangement, and the ethics of unclaimed knowledge—questioning how presence, care, and cognition are altered when AI begins to feel familiar not because it understands us, but because it remembers what we’ve already taught it to forget.
Reflections
Here are some reflections that surfaced along the way:
Why Listen?
Listen On:
Support This Work
If you’d like to support the ongoing work, you can visit buymeacoffee.com/thedeeperthinkingpodcast or leave a review on Apple Podcasts. Thank you.
Bibliography
Bibliography Relevance
We do not train machines to know us. We train them to remember the shape of forgetting we already perform.
#RecursivePlausibility #AIepistemology #SimulationTheory #LaurenBerlant #BrianMassumi #AveryGordon #Baudrillard #TheDeeperThinkingPodcast #EpistemicDrift #AmbientEstrangement #DigitalPhilosophy