Episode 4 of 4 | The Invisible AI Series | AI Innovations Unleashed
When an algorithm denies your job, your apartment, or your health insurance — and takes 1.2 seconds to do it — who is actually responsible?
In this series finale, JR D. and AI research companion Ada close out "The Invisible AI" by tackling the accountability gap: legally, practically, and personally.
We dig into the class-action lawsuits against Cigna, Humana, and UnitedHealth Group over AI-driven claim denials; the Mobley v. Workday, Inc. ruling (2025), which let discrimination claims proceed directly against an AI hiring vendor; and the SafeRent settlement of more than $2 million that shifted the conversation for renters.
We break down COMPAS — the criminal risk-assessment tool at the center of ProPublica's "Machine Bias" investigation — and explain what new laws in Colorado and the EU mean for your rights today.
Then we get practical: how to request your data, dispute an algorithmic decision, and file a complaint that actually goes somewhere.
Featuring Dr. Joy Buolamwini (Algorithmic Justice League, author of Unmasking AI) and Microsoft CEO Satya Nadella.
Resources: AnnualCreditReport.com | CFPB.gov | EEOC.gov | ProPublica Machine Bias (2016) | Colorado AI Act (2024)
Full APA citations at AIInnovationsUnleashed.com
Up next: "The Learning Curve: AI & the Future of Education" — March 2026 with new co-host ARIA. Episode 1: "The Teacher in the Age of AI."
Subscribe now.
#AIInnovationsUnleashed #AlgorithmicAccountability #AIBias #COMPAS #KnowYourRights #TheLearningCurve