Kristian Lum will elaborate on the concept of “bias in, bias out” in machine learning with a simple, non-technical example. She will then demonstrate how applying machine learning to police records can result in the over-policing of historically over-policed communities. Using a case study from Oakland, CA, she will show one specific case of how predictive policing not only perpetuates the biases that were previously encoded in the police data, but – under some circumstances – actually amplifies those biases.
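As a rough illustration of the feedback loop the talk describes, here is a minimal toy simulation. All numbers and both deployment rules are illustrative assumptions, not Lum's actual analysis or any real predictive-policing system: two neighborhoods have identical true crime rates, one merely starts with more recorded incidents, and patrols are sent where the records say crime is.

```python
import random

def simulate(allocate, days=365, seed=0):
    """Run a toy predict-deploy-record loop and return A's final share.

    Both neighborhoods have the SAME true crime rate; A merely starts
    with more *recorded* incidents (historical over-policing). These
    numbers are illustrative assumptions, not data from the talk.
    """
    random.seed(seed)
    TRUE_RATE = 0.1      # chance that one patrol records an incident
    PATROLS = 20         # patrols available per day
    recorded = {"A": 120, "B": 100}   # biased historical records
    for _ in range(days):
        for hood, n_patrols in allocate(recorded, PATROLS).items():
            # Incidents are only recorded where police are looking, so
            # today's deployment becomes tomorrow's training data.
            recorded[hood] += sum(random.random() < TRUE_RATE
                                  for _ in range(n_patrols))
    return recorded["A"] / sum(recorded.values())

def proportional(recorded, patrols):
    # Deploy in proportion to recorded crime: the initial bias is
    # PERPETUATED (A keeps drawing more scrutiny than B indefinitely,
    # despite identical true rates).
    total = sum(recorded.values())
    return {h: round(patrols * c / total) for h, c in recorded.items()}

def greedy(recorded, patrols):
    # Deploy everything to the "hottest" neighborhood, as hotspot
    # systems effectively do: the bias is AMPLIFIED into a runaway
    # loop, since only the patrolled neighborhood generates records.
    hot = max(recorded, key=recorded.get)
    return {h: (patrols if h == hot else 0) for h in recorded}

print(f"A's initial share of records: {120 / 220:.0%}")
print(f"after a year, proportional deployment: {simulate(proportional):.0%}")
print(f"after a year, greedy deployment:       {simulate(greedy):.0%}")
```

Under the proportional rule A's share hovers near its starting value (the bias persists), while the greedy rule drives A's share of recorded crime toward 100% even though the true rates never differed, which is the sense in which the feedback loop can amplify, not just preserve, the bias in the training data.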