
It’s been proven that facial recognition software isn’t good at accurately identifying people of color. It’s also known that police departments around the country use facial recognition tools to identify suspects and make arrests. And now we know about what is possibly the first confirmed wrongful arrest made as a result of mistaken identification by software. The New York Times reported last week that Robert Williams, a Black man, was wrongfully arrested in Detroit in January. Molly Wood speaks with Joy Buolamwini, who has been researching this topic for years as a computer scientist based at the MIT Media Lab and head of the nonprofit Algorithmic Justice League. She said that, like racism, algorithmic bias is systemic.
By Marketplace · 4.5 (1,256 ratings)