Growth Strategy with Alyssa Evans

The Amazon AI That Discriminated Against Women

Starting in 2014, Amazon spent years and millions of dollars building a hiring AI to streamline recruitment. By 2017, the company had killed the entire project. Why? It systematically discriminated against women.

This is the story of how one of the world's most sophisticated tech companies, with near-unlimited resources, world-class machine learning engineers, and a genuine commitment to fairness, still built biased AI because it lacked a systematic framework for AI risk management.

In this episode, you'll learn:

  • What happened with Amazon's hiring AI and why it failed
  • How AI learns bias from historical data (even when teams have good intentions)
  • Why 85% of companies ship AI without systematic testing (Stanford HAI AI Index Report 2024)
  • The gap between academic AI ethics frameworks and what practitioners actually need
  • Introduction to the NIST AI Risk Management Framework (NIST AI RMF)
  • Why healthcare has ethics boards, academia has IRBs, but tech has... nothing systematic
  • What it takes to build AI governance for small teams (not just Fortune 500s)

Read the blog: drivegrowthpartners.co
Download free templates: growyourstrategy.co

#AIEthics #AIGovernance #ResponsibleAI #AIRiskManagement #NISTAIFramework #AlgorithmicFairness #MachineLearningBias #AIPolicy #TechEthics #AICompliance #ArtificialIntelligenceEthics #AITesting #FairnessTesting #BiasInAI #EthicalAI #AIRegulation #TechPolicy #SaaSFounders #ProductManagement #AIForBusiness #AIStrategy #TechLeadership #WomenInTech #AIAccountability #DigitalEthics #USTechForce #AIGovernanceFramework #MLEthics #DataEthics #AIBias

