Data & Society

Predictive Policing: Bias In, Bias Out


Kristian Lum will elaborate on the concept of “bias in, bias out” in machine learning with a simple, non-technical example. She will then demonstrate how applying machine learning to police records can result in the over-policing of historically over-policed communities. Using a case study from Oakland, CA, she will show how predictive policing not only perpetuates the biases already encoded in police data but, under some circumstances, actually amplifies them.
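The amplification dynamic she describes can be sketched with a toy simulation (the two-district setup, the equal true offense rates, the numbers, and the patrol-allocation rule below are all assumptions made for illustration, not the talk's actual model or the Oakland data): if two districts offend at the same true rate but one starts with more recorded incidents, and patrols are always sent where the records are highest, the extra patrols generate more records, which attract more patrols, so the initial disparity compounds.

import random

# Toy sketch of the feedback loop described above (illustrative assumptions only;
# this is not the model or the Oakland data from the talk).
random.seed(0)

TRUE_RATE = 0.3               # chance one patrol records an incident (identical in A and B)
PATROLS_PER_DAY = 100
records = {"A": 60, "B": 40}  # historically biased records despite equal true rates

for day in range(50):
    # "Predictive" allocation: send every patrol to the district with the most
    # recorded incidents so far -- a stand-in for targeting top-predicted hotspots.
    target = max(records, key=records.get)
    new_records = sum(random.random() < TRUE_RATE for _ in range(PATROLS_PER_DAY))
    records[target] += new_records

total = records["A"] + records["B"]
print(f"District A's share of recorded incidents: 0.60 at the start, {records['A'] / total:.2f} now")

In this sketch, district A ends up with nearly all of the recorded incidents even though the underlying rates are identical, which is the sense in which the bias is amplified rather than merely reproduced.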


By Data & Society

4.8 (22 ratings)


More shows like Data & Society

This Machine Kills

206 Listeners