In facial recognition and AI development, computers are trained on massive datasets: millions of pictures gathered from all over the web. Only a few such datasets are publicly available, and a lot of organizations use them. And they are problematic. Molly speaks with Vinay Prabhu, chief scientist at UnifyID. He and Abeba Birhane at University College Dublin recently studied these academic datasets. Most of the pictures are gathered without consent, people can be identified in them, and the datasets contain racist and pornographic images and text. Ultimately, the researchers said, maybe it's not the data that's the problem. Maybe it's the whole field.
By Marketplace