
It’s been proven that facial recognition software isn’t good at accurately identifying people of color. It’s also known that police departments around the country use facial recognition tools to identify suspects and make arrests. And now we know about what is possibly the first confirmed wrongful arrest made as a result of mistaken identification by software. The New York Times reported last week that Robert Williams, a Black man, was wrongfully arrested in Detroit in January. Molly Wood speaks with Joy Buolamwini, who has been researching this topic for years as a computer scientist based at the MIT Media Lab and head of the nonprofit Algorithmic Justice League. She said that, like racism, algorithmic bias is systemic.