The Michael Fanone Show

The Next Arms Race Isn’t Nuclear. It’s AI.



Picture the next war and forget the old imagery—tanks, pilots, generals hunched over maps.

Now picture the sky filled with drones. Hundreds of them. No pilots. No radio chatter. No pause while a human decides what’s real and what isn’t. Just machines spotting targets, confirming them, and reacting—at a speed human beings can’t match.

That’s not science fiction. It’s where military planning is already headed, and the race to get there first is happening right now.

The New York Times recently laid out what U.S. officials are increasingly worried about: China is building advanced autonomous systems, Russia is producing and deploying drones at scale, and the U.S. is sprinting to keep up—because nobody wants to be the country that loses the “speed” war. And the more this becomes about speed, the more human judgment becomes the bottleneck everyone tries to remove.


That’s the part that should make you uneasy.

When I was a cop, force always had to tie back to a person. A decision. A name. Someone who could answer for it. That’s not just ethics—that’s accountability. And accountability is the thin line between legitimate authority and chaos.

So what happens when an algorithm makes the call?

When a swarm hits the wrong building because its model misread a signal? When an “AI-assisted” system recommends escalation faster than a commander can even process the information? When two autonomous systems start reacting to each other and the humans are basically watching a chain reaction they can’t stop?

That isn’t just a new weapon. That’s a new kind of risk: war moving faster than people can control.

And once one major power starts leaning into autonomy—really leaning into it—everyone else feels pressured to follow. That’s how arms races work. Nobody wants to be the only country still waiting for a human to approve a strike while the other side is operating at machine speed.

People will tell you “we’ll keep a human in the loop.” Maybe—for now. But the entire logic of these systems is to outrun humans. The pressure is always to click faster, approve faster, launch faster. Until “human oversight” becomes a formality.

That’s why the danger isn’t only who’s ahead—China, Russia, the U.S. The danger is what happens when the world collectively decides that the fastest machine wins, and accountability becomes optional.

Because once life-and-death decisions get handed to software, it gets harder to find the person responsible when it goes wrong. And if nobody is responsible, nothing gets corrected. The failures just scale.

This isn’t “the future.” It’s already being tested in real conflicts, and the incentives are pushing every major military in the same direction.

So the question isn’t “who wins the AI race?”

The question is: are we going to build rules before the machines start making decisions we can’t take back?

If you care about accountability, this is the fight.
