
Research has shown that facial recognition software is less accurate at identifying people of color. It’s also known that police departments around the country use facial recognition tools to identify suspects and make arrests. Now we know of what may be the first confirmed wrongful arrest resulting from a mistaken identification by software: The New York Times reported last week that Robert Williams, a Black man, was wrongfully arrested in Detroit in January. Molly Wood speaks with Joy Buolamwini, who has researched this topic for years as a computer scientist at the MIT Media Lab and head of the nonprofit Algorithmic Justice League. She said that, like racism, algorithmic bias is systemic.