

It’s been proven that facial recognition software isn’t good at accurately identifying people of color. It’s also known that police departments around the country use facial recognition tools to identify suspects and make arrests. And now we know of what may be the first confirmed wrongful arrest resulting from a mistaken identification by the software. The New York Times reported last week that Robert Williams, a Black man, was wrongfully arrested in Detroit in January. Molly Wood speaks with Joy Buolamwini, who has been researching this topic for years as a computer scientist at the MIT Media Lab and head of the nonprofit Algorithmic Justice League. She said that, like racism, algorithmic bias is systemic.
By Marketplace
