Using publicly available information, Clearview AI built a massive database of faceprints for commercial use and police use. Did they have the right to do so? Not in Illinois at least. What does this important case mean for those outside of Illinois? And how will other states be looking to provide similar protections?
More on Catherine Crump.
SPEAKERS
Catherine Crump, Wayne Stacy
Wayne Stacy 00:00
Welcome, everyone to the Berkeley Center for Law and Technology's Experts series podcast. Today we are talking about biometric privacy and the intersection of First Amendment and privacy law. To guide us through this discussion, we have the director of the Samuelson Law, Technology, and Public Policy Clinic. Catherine Crump. Catherine has been at the clinic since 2014, and is one of the nation's leading experts on this topic. So with that in mind, Catherine, can you start us down this discussion about Clearview AI?
Catherine Crump 00:33
We're going to talk today about a really interesting new ruling from an Illinois state court, which issued a decision in a case in which the ACLU and other organizations sued a company called Clearview AI for violating an Illinois law famous to those of us in the privacy world, the Illinois Biometric Information Privacy Act, and specifically for building a massive database of faceprints it scraped from the internet and then selling access to these images to police departments who want to use them for facial recognition. So this really is a decision that pits privacy and free speech against each other.
Wayne Stacy 01:14
I just want to make clear to the listeners that Clearview is collecting all this information from public sources.
Catherine Crump 01:22
Yes, yes, that's right. So the Clearview AI product really shook the privacy community. It was actually the subject of a really well-publicized New York Times article by a reporter named Kashmir Hill; it was in the magazine. So it's a private company, and what it does is it scrapes photos off the public internet. And yes, they're publicly available, right? So, you know, when you get a picture taken of you at a law firm event, or maybe someone posts something on a public page of Facebook, all of those photos are available for viewing. And so what Clearview AI did is it harnessed the power of the internet to gather all of these images and to create a truly massive database of publicly available images. I think it said it had something like 3 billion such images. And then it created faceprints of those, right, which are a particular way to calculate facial geometry to create a unique imprint of people. What was really remarkable about this was that it's just vast in scope compared to any biometric database that had been created before. It's not like no one had used this technique in the past; you know, law enforcement has been doing facial recognition for probably 20 years, at least. But it's always done so on carefully controlled databases, for example, DMV photos. And so this was different because it involved a private company creating a truly massive database of photos online. It was just bigger in scope than anything anyone had seen before.
Wayne Stacy 02:55
I think back to the classic Fourth Amendment cases: you put your garbage out on the street, and the police can rummage through it. This resembles the garbage case, because I'm making these pictures publicly available. Why can't people just use them?
Catherine Crump