
By Michael Fanone

Here’s one that slipped under the radar — and it shouldn’t have.
While everyone’s been stuck in the usual Trump chaos cycle, a major AI company just filed two federal lawsuits that could turn into a real constitutional problem for this administration. Not a cable-news “legal peril” segment. A first-principles problem: did the government try to punish a company for protected speech?
The company is Anthropic — one of the biggest players in the AI race, the maker of Claude. And their claim isn’t subtle: they say the Pentagon wanted their tech used in ways they consider unethical, they refused, and the administration responded by moving to blacklist them as a “supply-chain risk.”
That label isn’t some harmless bit of jargon. “Supply-chain risk” is the kind of designation you’d expect to hear in the context of hostile foreign actors — something that can cut you off from government work and poison your reputation across the industry. If Anthropic is right that this was used as a punishment tool instead of a genuine security finding, that’s the sort of thing judges don’t treat like normal politics.
Why this is dangerous for Trump
Because in court, motive matters.
Anthropic’s lawyers are arguing this wasn’t about national security — it was about retaliation. They’re pointing to public statements and pressure campaigns as evidence that the administration’s real issue wasn’t “risk,” it was disobedience and “woke” branding. If a court buys that, you’re suddenly in First Amendment retaliation territory — the government using its massive contracting and regulatory power to punish a viewpoint.
And that’s only layer one.
They’re also stacking due process and administrative-law claims — basically arguing the government tried to wreck their business relationships and cancel contracts without following the rules that exist precisely to stop agencies from freelancing punishment.
Two lawsuits. Multiple legal theories. Aggressive posture. That’s not a “contract dispute.” That’s a company saying: you don’t get to weaponize the government because we told you no.
Why you should care even if you don’t care about AI
Because this is the playbook: label → isolate → starve → make an example.
Today it’s an AI firm. Tomorrow it’s a university. A newsroom. A nonprofit. A contractor. Anyone who won’t bend the knee.
The core question the courts will have to answer is simple: Can the federal government blacklist a private company because it doesn’t like the company’s position on how its own technology should be used? If the answer is no — and a judge finds retaliatory intent — it doesn’t just hit Trump. It sets a precedent that limits the next administration’s ability to do the same thing to someone else.
That’s why this matters. Power is testing whether guardrails exist.
Your support keeps this show growing, keeps us on the road, and keeps these stories from getting buried.
🟧 Paid subscribers get 15% off your next merch order
🟧 Founding Members get 20% off for life
You’ll get the link in your welcome email.
GET DISCOUNTS BELOW! ENJOY!