“Machines scale knowledge; humans preserve wisdom. Intelligence, therefore, is not automation but awareness: the capacity to reflect, adapt, and ethically co-create new systems of understanding.” - Story Systems and Cultural Research (Routledge)
A lot of the conversation around AI still feels too narrow to me. We keep treating disruption as if it were mainly about tools getting better or jobs getting automated, when the deeper shift is really about institutional legitimacy. Consulting, higher education, and other knowledge systems were built on the idea that expertise was scarce, gated, and easy to certify. That story is starting to come apart. Much of what Marie Lena Tupot and I explore in Story Systems and Cultural Research is how shifts like this are never just technical. They are narrative, cultural, and structural at the same time.
What interests me more is how different futures open up depending on which parts of the past institutions choose to defend, recover, or let go of. Some will protect prestige long after its purpose has weakened. Others may find more useful paths by reclaiming sidelined values like apprenticeship, stewardship, trust, and collective intelligence. No single one of those narratives is the future. All of them are. The real question is not just what AI changes, but who gets to define what counts as intelligence, authority, and human value in the world that follows.