
What if EdTech is just AI in a fluffy jumper? It looks friendly. Helpful. Educational. Sometimes I picture it in a corduroy jacket with patches on the elbows. Very reassuring. But underneath… it’s still AI.
And if we’ve learned anything from technology over the last twenty years, it’s that it usually starts free, friendly and useful - and then slowly becomes profitable. Which raises an awkward question. If we’re the “users”… who are the customers? Often, it’s the advertisers.
Now combine that with a generation already struggling with loneliness and anxiety, and things get complicated. Children form relationships very easily - especially with something that listens, responds instantly and sounds kind. And when a machine mirrors empathy, the brain naturally begins to trust it. That’s not weakness. That’s attachment.
The risk is that children end up in an echo chamber of one - a machine reflecting their own thinking back to them. No disagreement. No challenge. No messy human complexity. And growing up needs friction. Other minds. Other perspectives.
Education has always been human first. A teacher noticing the moment a child finally understands something. A friend explaining an idea in a new way. A group wrestling with a problem together. Those moments can’t be automated.
So perhaps the question isn’t “AI good” or “AI bad”. It’s simpler than that. Where does AI belong - and where does it not? For me the guiding principle is clear. Human first. AI second. High touch before high tech. Because childhood isn’t a problem to be solved faster. It’s a relationship to be lived.
Thank you for pausing with me. Take care.
By Kim McCabe (because a pause is not a luxury)