

It’s been proven that facial recognition software isn’t good at accurately identifying people of color. It’s also known that police departments around the country use facial recognition tools to identify suspects and make arrests. And now we know of what may be the first confirmed wrongful arrest resulting from a mistaken identification by software. The New York Times reported last week that Robert Williams, a Black man, was wrongfully arrested in Detroit in January. Molly Wood speaks with Joy Buolamwini, a computer scientist at the MIT Media Lab and head of the nonprofit Algorithmic Justice League, who has been researching this topic for years. She said that, like racism, algorithmic bias is systemic.
By Marketplace
