Three stories reached our screen this week. Each one stopped me cold.
A politician stood before her nation’s parliament. She held up an image. Not just any image, but a deepfake nude of herself. Generated by AI. Created without her consent. She showed the world what violation looks like in the digital age.
Halfway around the globe, the Pope spoke about children. About dignity. About the sacred space between human hearts that no algorithm should ever touch. His words carried the weight of millennia, yet they addressed a threat barely a couple of years old.
And in a laboratory, scientists celebrated. They had taught machines to see rain before it falls. To map storms that haven’t even formed yet. To peer into the sky’s intentions with startling accuracy. Amazing.
Three moments. Three choices about who we become when we build tools that think.
The courage to show your wounds
New Zealand MP Laura McClure did something remarkable. She took her violation and made it visible. Not hidden. Not buried. Not dismissed with bureaucratic language about “emerging technologies” and “policy frameworks.” None of that.
She said: Look. This is what happens to us now. This is what we are up against.
There’s a particular kind of bravery in turning pain into purpose. In saying yes, this happened to me, and I will not let it happen to you in darkness. The deepfake was meant to shame her into silence. Instead, she used it to break silence open.
We live in times when our faces can be stolen. Our voices copied. Our bodies simulated without permission. The technology exists. The harm is real. The question becomes: Do we pretend this isn’t happening, or do we face it with clear eyes?
If you are interested in learning more about this, we’re teaching a Masterclass on how to avoid these modern harms of AI.
A sacred boundary
The Pope’s words matter because they draw a line. Not against progress. Not against innovation. But around something that matters more than efficiency or capability.
Human dignity.
Children need space to grow without algorithms watching. And that space is becoming harder to carve out. Youth deserve to discover themselves without machines predicting their futures. There are territories of the heart that should remain unmapped by artificial minds.
This message wasn’t trading in fear. It was offering wisdom: not everything that can be built should be built. Not everything that can be automated should be automated. Some spaces belong to us alone.
Touching the sky with AI
Then there’s the rain. Nature’s tears, washing the world clean. But ever more dangerous, as gentle showers have turned into torrents that are hard to predict.
Scientists can now peer into the atmosphere and see storms before they form. They can map rainfall with precision that would have seemed magical just years ago. They’re teaching machines to read the sky’s moods like an ancient farmer reading cloud formations.
This power fills me with wonder. The ability to understand weather at a global scale. To predict floods. To anticipate droughts. To maybe, someday, guide rain where it’s needed most. Fascinating.
But power always asks questions. Who decides where the rain falls? Who controls the algorithms that read the sky? How do we use tools this profound without losing our sense of wonder at the storm itself?
Forward we go
Here’s what these three stories taught me:
Change happens whether we’re ready or not. AI will continue to grow. Deepfakes will continue to threaten. Machines will continue to learn new capabilities. Weather prediction will become weather influence.
But we still choose how to respond.
We can choose courage over hiding. We can speak truth about harm instead of pretending it doesn’t exist. We can draw boundaries around what matters most. We can use powerful tools while protecting sacred spaces.
We can build the future instead of just letting it happen to us.
We notice when technology serves us and when it doesn’t. We pay attention to which innovations protect human dignity and which ones threaten it. We stay awake to the choices being made in boardrooms and laboratories that will shape our children’s world. We have to.
This is engaged awareness. The practice of staying human in an increasingly artificial world.
The machines are learning. But so are we. We must.
We’re learning to be brave like that politician. To be wise like the Pope. To be thoughtful about power like those scientists.
The future isn’t something that happens to us. It’s something we create, choice by choice, story by story, boundary by boundary.
Rain falls where it will. But we decide what we do when we get wet.
Research Freedom makes you a smarter researcher with AI.
This is a public episode. If you'd like to discuss this with other subscribers or get access to bonus episodes, visit lennartnacke.substack.com/subscribe