


Grant Reeher: Welcome to the Campbell Conversations, I'm Grant Reeher. When it comes to generating data that can be used against us by the government, are we our own worst enemies? My guest today has a lot to say about that. Andrew Guthrie Ferguson is a professor at George Washington University Law School, and he's the author of a new book. It's titled, “Your Data Will Be Used Against You: Policing in the Age of Self-Surveillance”. Professor Ferguson, welcome to the program.
Andrew Ferguson: Thank you so much for having me.
GR: Well, we really appreciate you making the time. So, I'm going to spend, I think, most of our time just unpacking a very provocative sentence that you write near the beginning of your book. You write, “you are, at best, a warrant away from having your most intimate personal details revealed to a government agent looking to incarcerate, embarrass, or intimidate you.” There's a lot to unpack there. So, that's going to be our roadmap. But first, I thought it would help to set some of the context for this by breaking down some of the building blocks for your analysis in your argument. So let me start with this one, you write of certain kind of data sources, and it's right in the title of your book, you call them ‘self-surveillance’. Just give us a clear idea of what that means, the kinds of data mines that you're talking about there.
AF: Sure. So, if you live in a world where you have a smartphone in your pocket or a smartwatch on your wrist, or drive any modern car that is essentially a smart surveillance car, or have a ring doorbell camera on your front door, an Alexa in your home, use the internet, ask Google questions, you are creating data about yourself. And the thing that I think we haven't fully recognized is that all of our smart devices are surveillance devices. In fact, that's sort of what you're purchasing. You're purchasing some insight about yourself, about your life, about your patterns that you think will add value to your life. The downside risk is that all that data is available to law enforcement, to prosecutors. And it is, as you said, at best, a warrant away from being revealed. And what the book does is try to explain how we have sort of built this world of self-surveillance. Not to judge us, we are all digital citizens in one way or another, but to recognize that these purchases and these choices have costs and that we haven't really wrestled with the costs of that self-surveillance.
GR: So, in your book, you take the first several chapters breaking out that different kind of data, and you just gave us an idea of that when you just listed the different kinds of devices that, you know, a lot of us have, certainly almost everybody has the cell phone, the smartphone. And you talk about data that's in and about our homes, and you have one about our things, our bodies, our cities, our papers and our likes. Now, some of those I think we understand immediately. And when you just described that list that you gave, you know, we all can relate to that. I was curious to hear you spell out a little bit about what our cities and our papers mean in those chapters. That's less clear on the surface.
AF: Sure. So, we as a society have invested a lot of money in building surveillance devices in our cities. Whether they're automated license plate readers that catch the license plates of cars that drive down your streets, cameras that are connected to real time crime centers where police can surveil the city streets from the comfort of their command center. Whether it is other forms, you know, drones and other new technologies that are watching us. That is a form of self-surveillance. It’s democratically mediated, it's our city council saying that we think that we can buy a world of safety through surveillance. But these are choices. Choices that we make with our tax dollars, we make with our elected choices of representatives. And it is another form of this world where we are building surveillance devices for when we go out in public, but also, of course, as we are in our home. And the papers part, again, the chapters that begin the book talk about how our homes have been transformed into smart homes, our bodies have been transformed using biological markers, be it your Fitbit or your smartwatch or your smart pacemaker or whatever it would be. But papers, papers also date back to the founding fathers. We have persons, houses, papers, and effects as our Fourth Amendment protections. And most of what we do now is mediated through a digital form that takes the place of papers. You know, my parents used to get a bank statement in the mail, literally every month, that they saved in the attic, so we would have boxes of papers and that was your financial records. Now it's all digital on your phone. We used to write letters. Again, they actually had love letters that they wrote back when they were courting, and now people have, like, texts and maybe an email or two to do that. But all of that digital information is recorded somewhere, usually held by some third party, and is available if there is a criminal prosecution, if there is a warrant, if there is a desire.
And what I don't think we realize is that that exposure, that digital exposure leaves us all very vulnerable to political winds that change, to certain kinds of targeted prosecutions. And it's something that we should just have a conversation about, and the book is trying to start that conversation.
GR: Yeah. Just a couple of quick observations about our cities. I just want to, you'd be interested in hearing this, but that question and that issue has been one that has come up in Syracuse, because Syracuse has been instituting these. It has been framed as something that will make the city run more efficiently and provide better services. But at the same time, there has been conversation about this idea that we are, you know, increasing our sense of monitoring. And I remember I worked several years ago over a number of different summers in London, and it was often said, I remember it being said that because of the heavy use of CCTV, that pretty much anywhere in London, you were on television somewhere. And I remember taking some comfort in that as a pedestrian walking around. But I see the other side of this now that you bring it up.
AF: And that's part of it, it's that duality, right? It's not that smart cities are bad, it's not that security cameras are bad, but in an unregulated world, in a world where that technology can be weaponized for other things that we didn't agree to or don't think is wise, we run into problems. And that's the hard part. We are not wrong in thinking that maybe certain forms of surveillance will help law enforcement. It will. And the book is actually filled with stories of people getting caught because of their data and like, bad people getting caught because of their data. And that's not a bad thing. The danger is that who is bad or what is criminal can change politically, can change because of who's getting targeted and without rules and regulations to limit it. It really does expose all of us to essentially the whims of who's in charge and what they want to prosecute.
GR: Yeah, anyone who's watched British crime television knows, the first thing that detectives all go to is the CCTV. But you're right, the double edge-ness. I'm Grant Reeher, you're listening to the Campbell Conversations on WRVO Public Media, and my guest is George Washington University law professor Andrew Guthrie Ferguson and we're discussing his new book. It's titled, “Your Data Will Be Used Against You: Policing in the Age of Self-Surveillance”. So, give us some more examples, maybe some of the ones that you think are most telling or the most intriguing or the most disturbing of how the information can be used against us. I mean, we understand that there are some good things that can be done with it, but as you say, it depends on whoever is in power and using this. Give us some examples of things we might want to be worried about.
AF: There’s a story that I think captures both the promise and dangers and just difficulty of these issues. It involves a gentleman who had a smart pacemaker in his heart. So, I think we can all agree that the medical inventions and interventions that allow someone to live better because their heart is beating a certain way, recording that data for the individuals themselves so they can take care of themselves, go to the doctor, and forwarding that data to the doctor's office, is a good. That's actually the kind of thing that we should encourage, we would like. But the case in the book is when detectives go to the doctor's office and get this guy's heartbeat and use it against him in a criminal case, because, and this is the tension, he was involved in some insurance fraud. He had basically burned down his house, claimed it was, you know, an accident when really it was arson. And the detectives wanted to disprove his story by showing that his heartbeat didn't match what he said had happened. And this is, you know, the detectives aren't necessarily doing anything wrong in the sense of they have a crime, they're trying to investigate it. But the idea that our own heartbeats could be used against us in a court of law, I think, raises some really difficult questions of, are we okay with that? Because the truth of the matter is, there is nothing too private. The data from your smart bed, your smart toothbrush, your period app that tracks your menstrual cycle, your digital diary, whatever it is, there is nothing too private that cannot be obtained with a warrant. And those stories are only going to multiply and grow as we become more dependent on technologies. When we have sensors in our home listening to our conversations, called Alexa or Echo, or we have video doorbells on our doors watching what we're doing. And there are reasons for that.
There are reasons why someone might want to buy those technologies, but the danger is you are surveilling yourself far more than you're surveilling anyone else. And the question is, is that trade off worth the while?
GR: And to that point, let me put the question to you that a district attorney might put to you. And you probably will anticipate this, but the information at least is reliable in and of itself. I mean, what we conclude from it, we can debate. But, you know, in the case that you just described, the gentleman's heartbeat was what it was. And so, if we compare that, say, to eyewitness testimony, which is notoriously unreliable, or say a false confession that is, you know, pulled out of somebody by overly aggressive police tactics. The question is, if I'm innocent, why should I be worried about this? I mean, couldn't I probably just use this to prove my innocence more often than it might be used to indicate a false guilt?
AF: I think that there are, you know, strong arguments for why there's law enforcement use. But I also think that, you know, if you're pregnant in a state that just criminalized abortion, like, you being innocent might mean something different, right? Your smartwatch is revealing, you know, certain things about your heart rate and your, you know, body heat and everything else. And your period app is revealing, you know, certain very private things. And, like, what is criminal can change. If you walked out of the house with a ‘No Kings’ protest sign, guess what? Your doorbell camera probably caught that. And that might be evidence to use against you in a way where suddenly dissent in certain forms is criminalized. And so that's sort of the tension there that we have a reality of, like, what is criminal, and can prosecutions be weaponized? At the same time, there is no question that this data is helpful for law enforcement. But what's weird or what's strange or what's different is that we have now just opened our lives up in ways where that data just wasn't available. You know, for decades, you just couldn't have found what was going on in someone's house. But now that they put, like, a cat-cam to watch their cat while they’re away at work, suddenly we have a perfect record of what's going on inside their house. And the question is, should the normal rules, where we really couldn't have gotten that beforehand, now apply to this new world where we've created so much data? And we all do it, right? Google, the things you Google, probably has a better sense of what's going on in your head and your immediate concerns than lots and lots of other people. Maybe more than your spouse, right? You might Google, like, what is this strange rash? before telling your spouse about the strange rash, or whatever it would be.
And like, the fact that those sorts of thoughts, now put out to a third party, are open for government investigation is problematic because ‘what is that rash?’ might be, like, ‘how do I obtain an abortion?’, or, ‘how do I get services for my trans kid?’, or, ‘how do I protest a government I don't think is acting in a way that is lawful or constitutional?’. And the fact that all those digital trails and thoughts are now potential evidence against you is a problem. There are good uses, of course, and there are cases in the book that talk about, like, people Googling, how do I dispose of a dead body, or how do I get rid of the smell? And like, those are also quite revealing about someone's culpability in a crime. But right now, we don't have rules that distinguish between those two, and we haven't, as a society, agreed about whether we trust the government to have all of our intimate data to use as they wish.
GR: And I want to come back and explore exactly what the situation is regarding government rules on that when we come back from the break. You're listening to the Campbell Conversations on WRVO Public Media. I'm Grant Reeher and I'm talking with Andrew Guthrie Ferguson. He's a professor at George Washington University Law School and the author of a new book titled, “Your Data Will Be Used Against You: Policing in the Age of Self-Surveillance” and we've been discussing the book and the difficult and thorny issues that it raises. I wonder if part of the reason why we're in the situation we're in is just the rapidity of the change in this arena. I mean, it's, you know, I've been around for a while, I'm relatively old and I've seen a lot of different changes, and this one just seems to have come pretty quickly, even within the internet age. Do you think that's, like, we're just not catching up to the reality of this? Is that part of it?
AF: I think that there is a disconnect between the speed that the technology has evolved and just become so consumer friendly, and the laws and rules and regulations around it. I also think it's hard. I think that sometimes the reason why you don't see legislation covering some of this information is that it's very difficult. Like in the book, I actually try to take a piece of it. I'm trying to actually focus on not data privacy writ large, but again, it would be wonderful if we had, you know, a federal regulation on that or even, data protection in a certain sort of European framework. I'm really talking about, could we set up rules about how this data can be used in criminal cases and criminal prosecutions? In part because that's my background, I teach criminal procedure and criminal law. I was a former public defender before teaching, so I see the world in sort of the criminal process. But also I think it's just an easier balance. Like, we might be able to come up with agreements about this is okay, this is not okay, oh, this is really private, maybe we want to have a higher standard and it's not going to be perfect, but we might be able to move forward.
GR: So, tell us what the current rules are, generally, regarding the criminal justice use of this data. Is it sort of like the general warrant? You know, I have a reason to go to a judge and I say, here's why we could use this data and the judge decides one way or another?
AF: So, a lot of the data that we create and we give off to third parties doesn't even need a warrant to be obtained. You've either given it up as sort of a condition of using the technology, you probably signed some form you didn't read and agreed that law enforcement gets access with sometimes just a request, but sometimes a subpoena. Sometimes you literally paid for it, right? You paid your taxes, paid for the camera that's capturing where you're going and what you're doing out there. Sometimes, if it gets to a little bit more private data, there is a warrant requirement. The Fourth Amendment of the United States Constitution requires a probable cause warrant. Again, a pretty low standard. It’s definitely lower than 51%, that's like a preponderance of evidence, so it's lower than that. It can be wrong. And really, with a predicate, like an allegation that there's a crime, it's pretty easy to get almost anything that has been created. So, all of your work in your computer, on your phone, with a warrant, it's all available. We have not seen federal law that would regulate this kind of technology in any real way. Like, you know, a lot of the federal laws that exist were written, like, decades ago. And while we've seen some states move to sort of address consumer privacy, sort of how the company and the individual might interact and your control of your data, most of those state privacy laws have a law enforcement carve out. So, law enforcement gets access, but we'll limit how the companies get access. And so, we have not had a national debate about whether or not this data should be used against us. There are, you know, percolating cases. The United States Supreme Court is hearing a case on geofence warrants, which is the idea of using a warrant to get your Google data, your location data. So, in that case, a gentleman was accused of robbing a bank, and his cell phone basically gave him away because he was there.
But in order to get the cell phone information about this particular defendant, the police had to essentially search your cell phone data and my cell phone data. In fact, everyone, like 500 million Google users, were in this Sensorvault where they sort of collected everyone's location data. So, to find the one bad guy, they actually surveilled us too, probably, if we had Google services enabled on our phones, which most of us do. So that's, you know, part of the tension. So, the courts are slowly moving forward. The legislators, I'm not sure we're seeing a whole lot of action in Congress right now, but maybe we could be. But the point of the book is to kind of try to provoke citizens and communities and families to sort of debate this, discuss this, in the hopes that we could push this conversation forward.
GR: You mentioned some states doing some things. Is there any state in particular that you would regard in the forefront of thinking about this from a criminal justice perspective?
AF: One of the interesting states, Illinois, had this thing for biometric evidence. And so, there's this law called the Biometric Information Privacy Act, BIPA, that was the first sort of biometrics law. So, think about facial recognition, think about the ability to take your sort of biological marker data. And because Illinois had a law, it actually somewhat thwarted companies trying to move nationally, because every time they had data from Illinois citizens, they could, like, run afoul of this. So, Facebook, which, of course, has everyone's data, also had Illinois citizens. And when they were running their own facial recognition matches, it could create a problem. And so, in certain states, for certain kinds of data, there are some protections. But no state, even California, which tries to be ahead of the curve on a lot of these sorts of privacy issues and definitely has some consumer privacy laws that are good, has really figured out what we can do and what the balance is. Essentially, I think the fair takeaway is if you build it, if you create the data, the police can come for it and use it against you.
GR: Interesting. If you've just joined us, you're listening to the Campbell Conversations on WRVO Public Media. I'm Grant Reeher and my guest is the law professor Andrew Guthrie Ferguson. You know, when you're talking about this and I think about the politics that would surround it, it would seem to me that, although it does tap into some areas where there is deep partisan division, I mean, criminal justice obviously is one where Democrats and Republicans have really fought pretty hard against each other in recent years. It also seems to be the case that there's a lot of room for political overlap in terms of concerns about privacy. On the one hand, suspicions about how government might overstep or misuse something. I'm just curious to get your sense of your reflections on the political environment in which these conversations are going to happen.
AF: I think it's bipartisan, because both sides should realize that their data can be used against them. You know, just a few years ago, people who were strong sort of Second Amendment supporters were really concerned that the federal government would come up with, like, a federal database of gun owners and be able to, like, bang down their doors and steal their guns. Well, with automated license plate readers that read, you know, who's driving their cars, if you put an automated license plate reader outside a gun show, or a gun range, or a place you buy ammunition, like, you don't need that federal database, you can find out who they are. All of our data reveals the things we do, and whether we want to, you know, privately pray in a certain community, whether we want to protest in a certain way, it all gets revealed and it can be used against us. Like, ironically enough, like, Donald Trump's data was used against him. Like, he was actually upset that his texts and stuff were being used in a criminal prosecution and felt the same sort of violation. This feels like it's wrong. And we're seeing now the Trump administration using this data to go after, like, you know, Jim Comey, the former FBI director, like, literally a career prosecutor, about as straight an arrow as someone might think, not necessarily the person you would think would be targeted to have his data revealed, but he too is now a target, and it shows that everyone is vulnerable. Even if you are powerful, even if you're affluent and privileged, like, the aperture of surveillance, of who is being targeted, has expanded far beyond where we used to target, which tended to be poorer communities, black and brown communities, places that did not have the political voice to respond.
But now everyone who uses a digital device, who uses digital means to communicate, has essentially revealed themselves and is still simply sitting there vulnerable to the political winds shifting to target them.
GR: Setting aside the government for a minute, just thinking about private companies, publicly owned corporations that have this data, they pretty much have carte blanche, right, to use it? Or at least they often set that up so that if I'm going to join some kind of system, I've checked the box somewhere. Or I'm thinking of employers. You know, you and I work at universities, I'm assuming that the university can, at any time that it wants, go and look through our emails. Am I right about that?
AF: You are. And you've agreed to that. And sometimes you get that, you know, pop up screen when you log back on. That's true. And what's even, I guess, more difficult for the company is, like, if you're a U.S. company and a law enforcement officer with a duly authorized warrant shows up to get the data, there’s really, there's not much you can do. That’s the law, you have to turn it over. So, if you have it, and some companies have chosen not to collect it, but if you're a data collection company and that's your business plan, like, collecting it is what you're doing, so if you have it and they come for it and they have a warrant, it's very hard to push back. It's very hard to say no. And that means that the warrant protection that we kind of have in the back of our heads, oh, there's some judge, you know, who will make sure this isn't abused, isn't that protective. It's not nothing, it's something, it's better than, you know, not having it at all, but it shouldn't give you confidence that your data will remain secure. Because it's very easy to get a warrant to get it.
GR: We've only got about a couple minutes left, and there are two big questions, and I'm sorry that I'm going to be unfair to you this way, but two big questions I want to try to squeeze in at the end. One is, give us an idea of at least the kinds of changes you might like to see. I know you said the main point of the book is to stimulate our conversations about this, but, are there any big changes that if you could wave a magic wand, you'd like to see?
AF: Sure. You know, toward the end of the book, I have separate chapters. I have a chapter essentially written for judges to say, this is what you could do using the Fourth Amendment to sort of strengthen our protections of data privacy using existing law, and my various, you know, academic theories about this. I have a chapter for legislators, like, if you are in the legislature and you're concerned about the issues raised in this book, here's how you can build in certain kinds of heightened protections or build on parallels that we already have in the law; it's actually not that high a step to do it. We can do it pretty easily, and I try to lay out what those answers would be. And then one for communities, what communities can do and how communities can support this sort of collective action. I purposely don't focus on individuals, because that's sort of a false framing. It's always, like, I can't negotiate with Amazon or the FBI as an individual, like, I can't change those terms of service. But as a community, we can say maybe we don't want this kind of technology, maybe we don't want to share our data in certain ways. And legislators can regulate any of this or all of it if they wish to. If they had the political will, they could easily make those choices.
GR: Well, let me focus on the individual with about 30 seconds left. I get what you're saying about we don't have the same leverage that governments do at different levels, or judges. Well, if you could just, with a couple seconds left, what advice would you give us as individuals to practice better self-surveillance hygiene? I suppose is the way I would put it.
AF: Sure. I mean, you should educate yourself. I think you should make informed choices about whether you really do need the cat-cam to watch your cat when you're working. Whether the connections, right, you can have a Ring doorbell and not connect it to a larger system, you can make those choices. You don't have to just click ‘I agree’ and give away all your data for all reasons because you're too lazy to figure out, you know, where it might go. And more importantly, you can support the advocacy groups, the community groups that are pushing back against these technologies in more cohesive and forceful ways. And so, like, there's a whole chapter on supporting the journalists who are exposing the technology and the legislators who are doing some real hard work on this.
GR: Well, we'll have to leave it there. That was Andrew Guthrie Ferguson, and again, his new book is titled, “Your Data Will Be Used Against You: Policing in the Age of Self-Surveillance”. If you're already paranoid, my advice is not to read this book, but otherwise it is a fascinating read as well as eye-opening. Professor Ferguson, thanks again for talking with me, really appreciate it.
AF: Thank you.
GR: You've been listening to the Campbell Conversations on WRVO Public Media, conversations in the public interest.
By Grant Reeher3.8
88 ratings
Grant Reeher: Welcome to the Campbell Conversations, I'm Grant Reeher. When it comes to generating data that can be used against us by the government, are we are own worst enemies? My guest today has a lot to say about that. Andrew Guthrie Ferguson is a professor at George Washington University Law School, and he's the author of a new book. It's titled, “Your Data Will Be Used Against You: Policing in the Age of Self-Surveillance”. Professor Ferguson, welcome to the program.
Andrew Ferguson: Thank you so much for having me.
GR: Well, we really appreciate you making the time. So, I'm going to spend, I think, most of our time just unpacking a very provocative sentence that you write near the beginning of your book. You write, “you are, at best, a warrant away from having your most intimate personal details revealed to a government agent looking to incarcerate, embarrass, or intimidate you.” There's a lot to unpack there. So, that's going to be our roadmap. But first, I thought it would help to set some of the context for this by breaking down some of the building blocks for your analysis in your argument. So let me start with this one, you write of certain kind of data sources, and it's right in the title of your book, you call them ‘self-surveillance’. Just give us a clear idea of what that means, the kinds of data mines that you're talking about there.
AF: Sure. So, if you live in a world where you have a smartphone in your pocket or a smartwatch on your wrist, or drive any modern car that is essentially a smart surveillance car, or have a ring doorbell camera on your front door, an Alexa in your home, use the internet, ask Google questions, you are creating data about yourself. And the thing that I think we haven't fully recognized is that all of our smart devices are surveillance devices. In fact, that's sort of what you're purchasing. You're purchasing some insight about yourself, about your life, about your patterns that you think will add value to your life. The downside risk is that all that data is available to law enforcement, to prosecutors. And it is, as you said, at best, a warrant away from being revealed. And what the book does is try to explain how we have sort of built this world of self-surveillance. Not to judge us, we are all digital citizens in one way or another, but to recognize that these purchases and these choices have costs and that we haven't really wrestled with the costs of that self-surveillance.
GR: So, in your book, you take the first several chapters breaking out that different kind of data, and you just gave us an idea of that when you just listed the different kinds of devices that, you know, a lot of us have, certainly almost everybody has the cell phone, the smartphone. And you talk about data that's in and about our homes, and you have one about our things, our bodies, our cities, our papers and our likes. Now, some of those I think we understand immediately. And when you just described that list that you gave, you know, we all can relate to that. I was curious to hear you spell out a little bit about what our cities and our papers mean in those chapters. That's less clear on the surface.
AF: Sure. So, we as a society have invested a lot of money in building surveillance devices in our cities. Whether they're automated license plate readers that catch the license plates that drive down your streets, cameras that are connected to real time crime centers where police can surveil the city streets from the comfort of their command center. Whether it is other forms, you know, drones and other new technologies that are watching us. That is a form of self-surveillance. It’s democratically mediated, it's our city council saying that we think that we can buy a world of safety through surveillance. But these are choices. Choices that we make with our tax dollars, we make with our elected choices of representatives. And it is another form of this world where we are building surveillance devices for when we go out in public, but also, of course, as we are in our home. And the papers part, again, the chapters that begin the book talk about how our homes have been transformed into smart homes, our bodies have been transformed using biological markers, be it your Fitbit or your smartwatch or your smart pacemaker or whatever it would be. But papers, papers also date back to the founding fathers. We have papers, effects, persons in homes as our fourth amendment protections. And most of what we do now is mediated through a digital form that takes the place of papers. You know, my parents used to get a bank statement in the mail, literally every month, that they saved in the attic so we would have boxes of papers and that was your financial records. Now it's all digital on your phone. We used to write letters. Again, they actually had love letters that they wrote back when they were courting in the air and now people have, like, texts and maybe an email or two to do that. But all of that digital information is recorded somewhere, usually held by some third party, and is available if there is a criminal prosecution, if there is a warrant, if there is a desire. 
And what I don't think we realize is that that exposure, that digital exposure leaves us all very vulnerable to political winds that change, to certain kinds of targeted prosecutions. And it's something that we should just have a conversation about, and the book is trying to start that conversation.
GR: Yeah. Just a couple of quick observations about our cities. You'd be interested in hearing this, but that question and that issue has come up in Syracuse, because Syracuse has been instituting these technologies. It has been framed as something that will make the city run more efficiently and provide better services. But at the same time, there has been conversation about this idea that we are, you know, increasing our sense of monitoring. And I remember I worked several years ago over a number of different summers in London, and it was often said, I remember it being said, that because of the heavy use of CCTV, pretty much anywhere in London you were on television somewhere. And I remember taking some comfort in that as a pedestrian walking around. But I see the other side of this now that you bring it up.
AF: And that's part of it, it's that duality, right? It's not that smart cities are bad, it's not that security cameras are bad, but in an unregulated world, in a world where that technology can be weaponized for other things that we didn't agree to or don't think is wise, we run into problems. And that's the hard part. We are not wrong in thinking that maybe certain forms of surveillance will help law enforcement. It will. And the book is actually filled with stories of people getting caught because of their data and like, bad people getting caught because of their data. And that's not a bad thing. The danger is that who is bad or what is criminal can change politically, can change because of who's getting targeted and without rules and regulations to limit it. It really does expose all of us to essentially the whims of who's in charge and what they want to prosecute.
GR: Yeah, anyone who's watched British crime television knows, the first thing the detectives all go to is the CCTV. But you're right about the double-edgedness. I'm Grant Reeher, you're listening to the Campbell Conversations on WRVO Public Media, and my guest is George Washington University law professor Andrew Guthrie Ferguson and we're discussing his new book. It's titled, “Your Data Will Be Used Against You: Policing in the Age of Self-Surveillance”. So, give us some more examples, maybe some of the ones that you think are most telling or the most intriguing or the most disturbing of how the information can be used against us. I mean, we understand that there are some good things that can be done with it, but as you say, it depends on whoever is in power and using this. Give us some examples of things we might want to be worried about.
AF: There’s a story that I think captures both the promise and the dangers and just the difficulty of these issues. It involves a gentleman who had a smart pacemaker in his heart. So, I think we can all agree that the medical inventions and interventions that allow someone to live better because their heart is beating in a certain way, and that record that data for individuals so they can take care of themselves and go to the doctor, with that data also being forwarded to the doctor's office, are a good. That's actually the kind of thing that we should encourage, that we would like. But the case in the book is when detectives go to the doctor's office and get this guy's heartbeat and use it against him in a criminal case, because, and this is the tension, he was involved in some insurance fraud. He basically burned down his house and claimed it was, you know, an accident when really it was arson. And the detectives wanted to disprove his story by showing that his heartbeat didn't match what he said had happened. And, you know, the detectives aren't necessarily doing anything wrong in the sense that they have a crime and they're trying to investigate it. But the idea that our own heartbeats could be used against us in a court of law, I think, raises some really difficult questions of, are we okay with that? Because the truth of the matter is, there is nothing too private. The data from your smart bed, your smart toothbrush, your period app that tracks your menstrual cycle, your digital diary, whatever it is, there is nothing too private that cannot be obtained with a warrant. And those stories are only going to multiply and grow as we become more dependent on technologies, when we have sensors in our home listening to our conversations called Alexa or Echo, or we have video doorbells on our doors watching what we're doing. And there are reasons for that.
There are reasons why someone might want to buy those technologies, but the danger is you are surveilling yourself far more than you're surveilling anyone else. And the question is, is that trade-off worthwhile?
GR: And to that point, let me put the question to you that a district attorney might put to you. And you probably will anticipate this, but the information at least is reliable in and of itself. I mean, what we conclude from it we can debate. But, you know, in the case you just described, the gentleman's heartbeat was what it was. And so, if we compare that, say, to eyewitness testimony, which is notoriously unreliable, or say a false confession that is, you know, pulled out of somebody by overly aggressive police tactics, the question is, if I'm innocent, why should I be worried about this? I mean, couldn't I probably just use this to prove my innocence more often than it might be used to indicate a false guilt?
AF: I think that there are, you know, strong arguments for why there's law enforcement use. But I also think that, you know, if you're pregnant in a state that just criminalized abortion, like, you being innocent might mean something different, right? Your smartwatch is revealing, you know, certain pieces about your heart rate and, you know, your body heat and everything else. And your period app is revealing, you know, certain very private things. And, like, what is criminal can change. If you walked out of the house with a ‘No Kings’ protest sign, guess what? Your doorbell camera probably caught that. And that might be evidence to use against you in a way where suddenly dissent in certain forms is criminalized. And so that's sort of the tension there, that we have a reality of, like, what is criminal, and can prosecutions be weaponized? At the same time, there is no question that this data is helpful for law enforcement. But what's weird or what's strange or what's different is that we have now just opened our lives up in ways where that data just wasn't available. You know, for decades, you just couldn't have found out what was going on in someone's house. But now that they put, like, a cat-cam in to watch their cat while they’re away at work, suddenly we have a perfect record of what's going on inside their house. And the question is, should the normal rules, where we really couldn't have gotten that beforehand, now apply to this new world where we've created so much data? And we all do it, right? Google, the things you Google, probably has a better sense of what's going on in your head and your immediate concerns than lots and lots of other people. Maybe more than your spouse, right? You might Google, like, ‘what is this strange rash?’ before telling your spouse about the strange rash, or whatever it would be.
And, like, the fact that those sorts of thoughts, now put out to a third party, are open for government investigation is problematic, because ‘what is that rash?’ might be, like, ‘how do I obtain an abortion?’, or, ‘how do I get services for my trans kid?’, or, ‘how do I protest a government I don't think is acting in a way I think is lawful or constitutional?’. And the fact that all those digital trails and thoughts are now potential evidence against you is a problem. There are good uses too, and there are cases in the book that talk about, like, people searching ‘how do I dispose of a dead body?’ or ‘how do I get rid of the smell?’ And, like, those are also quite revealing about someone's culpability in a crime. But right now, we don't have rules that distinguish between those two, and we haven't, as a society, agreed about whether we trust the government to have all of our intimate data to use as they wish.
GR: And I want to come back and explore exactly what the situation is regarding government rules on that when we come back from the break. You're listening to the Campbell Conversations on WRVO Public Media. I'm Grant Reeher and I'm talking with Andrew Guthrie Ferguson. He's a professor at George Washington University Law School and the author of a new book titled, “Your Data Will Be Used Against You: Policing in the Age of Self-Surveillance” and we've been discussing the book and the difficult and thorny issues that it raises. I wonder if part of the reason why we're in the situation we're in is just the rapidity of the change in this arena. I mean, you know, I've been around for a while, I'm relatively old and I've seen a lot of different changes, and this one just seems to have come pretty quickly, even within the internet age. Do you think that, like, we're just not catching up to the reality of this? Is that part of it?
AF: I think that there is a disconnect between the speed at which the technology has evolved and just become so consumer friendly, and the laws and rules and regulations around it. I also think it's hard. I think that sometimes the reason why you don't see legislation covering some of this information is that it's very difficult. Like, in the book, I actually try to take a piece of it. I'm trying to focus not on data privacy writ large, though again, it would be wonderful if we had, you know, a federal regulation on that, or even data protection in a certain sort of European framework. I'm really talking about, could we set up rules about how this data can be used in criminal cases and criminal prosecutions? In part because that's my background, I teach criminal procedure and criminal law. I was a former public defender before teaching, so I see the world in sort of the criminal process. But also I think it's just an easier balance. Like, we might be able to come up with agreements about: this is okay, this is not okay, oh, this is really private, maybe we want to have a higher standard. And it's not going to be perfect, but we might be able to move forward.
GR: So, tell us what the current rules are, generally, regarding the criminal justice use of this data. Is it sort of like the general warrant? You know, I have a reason to go to a judge and I say, here's why we could use this data and the judge decides one way or another?
AF: So, a lot of the data that we create and we give off to third parties doesn't even need a warrant to be obtained. You've either given it up as sort of a condition of using the technology, when you probably signed some form you didn't read, and agreed that law enforcement can get access with sometimes just a request, but sometimes a subpoena. Sometimes you literally paid for it, right? You paid your taxes, paid for the camera that's capturing where you're going and what you're doing out there. Sometimes, if it gets at a little bit more private data, there is a warrant requirement. The Fourth Amendment of the United States Constitution requires a probable cause warrant. Again, a pretty low standard. It’s definitely lower than 51%, that's like a preponderance of evidence, so it's lower than that. It can be wrong. And really, with a predicate, like an allegation that there's a crime, it's pretty easy to get almost anything that has been created. So, all of your work on your computer, on your phone, with a warrant, it's all available. We have not seen federal law that would regulate this kind of technology in any real way. Like, you know, a lot of the federal laws that exist were written, like, decades ago. And while we've seen some states move to sort of address consumer privacy, sort of how the company and the individual might interact and your control of your data, most of those state privacy laws have a law enforcement carve-out. So, law enforcement get access, but we'll limit how the companies get access. And so, we have not had a national debate about whether or not this data should be used against us. There are, you know, percolating cases. The United States Supreme Court is hearing a case on geofence warrants, which is the idea of using a warrant to get your Google data, your location data. So, in that case, a gentleman was accused of robbing a bank, and his cell phone basically gave him away because he was there.
But in order to get the cell phone information about this particular defendant, the police had to essentially search your cell phone data and my cell phone data. In fact, everyone, like 500 million Google users, were in this ‘Sensorvault’ where they sort of collected everyone's location data. So, to find the one bad guy, they actually surveilled all of us, probably, if we had Google services enabled on our phones, which most of us do. So that's, you know, part of the tension. So, the courts are slowly moving forward. The legislators, I'm not sure we're seeing a whole lot of action in Congress right now, but maybe we could be. But the point of the book is to kind of try to provoke citizens and communities and families to sort of debate this, discuss this, in the hopes that we could push this conversation forward.
GR: You mentioned some states doing some things. Is there any state in particular that you would regard in the forefront of thinking about this from a criminal justice perspective?
AF: One of the interesting states is Illinois, which had this thing for biometric data. There's this law called the Biometric Information Privacy Act, BIPA, that was the first of its sort for biometrics. So, think about facial recognition, think about the ability to take your sort of biological marker data. And because Illinois had a law, it actually somewhat thwarted companies trying to move nationally, because every time they had data from Illinois citizens, they could, like, run afoul of this. So, Facebook, which, of course, has everyone's data, also had Illinois citizens' data. And when they were running their own facial recognition matches, it could create a problem. And so, in certain states, for certain kinds of data, there are some protections. But no state, even California, which tries to be ahead of the curve on a lot of these sorts of privacy issues and definitely has some consumer privacy laws that are good, has really figured out what we can do and what the balance is. Essentially, I think the fair takeaway is: if you build it, if you create the data, the police can come for it and use it against you.
GR: Interesting. If you've just joined us, you're listening to the Campbell Conversations on WRVO Public Media. I'm Grant Reeher and my guest is the law professor Andrew Guthrie Ferguson. You know, when you're talking about this and I think about the politics that would surround it, it would seem to me that, although it does tap into some areas where there is deep partisan division, I mean, criminal justice obviously is one where Democrats and Republicans have really fought pretty hard against each other in recent years, it also seems to be the case that there's a lot of room for political overlap in terms of concerns about privacy, suspicions about how government might overstep or misuse something. I'm just curious to get your reflections on the political environment in which these conversations are going to happen.
AF: I think it's bipartisan, because both sides should realize that their data can be used against them. You know, just a few years ago, people who were strong sort of Second Amendment supporters were really concerned that the federal government would come up with, like, a federal database of gun owners and be able to, like, bang down their doors and steal their guns. Well, with automated license plate readers that read, you know, who's driving their cars, if you put an automated license plate reader outside a gun show, or a gun range, or a place you buy ammunition, like, you don't need that federal database, you can find out who they are. All of our data reveals the things we do, and whether we want to, you know, privately pray in a certain community, whether we want to protest in a certain way, it all gets revealed and it can be used against us. Like, ironically enough, like, Donald Trump's data was used against him. Like, he was actually upset that his texts and stuff were being used in a criminal prosecution and felt the same sort of violation, this feeling that it's wrong. And we're seeing now, as the Trump administration is using this data to go after, like, you know, Jim Comey, the former FBI director, like, literally a career prosecutor, about as straight an arrow as someone might think, despite his, like, political interference, not necessarily the person you would think would be targeted to have his data revealed, but he too is now a target, and it shows that everyone is vulnerable. Even if you are powerful, even if you're affluent and privileged, like, the aperture of surveillance, of who is being targeted, has expanded far beyond where we used to target, which tended to be poorer communities, black and brown communities, places that did not have the political voice to respond.
But now everyone who uses a digital device, who uses digital means to communicate, has essentially revealed themselves and is simply sitting there, vulnerable to the political winds shifting to target them.
GR: Setting aside the government for a minute, just thinking about private companies, publicly owned corporations that have this data, they pretty much have carte blanche, right, to use it? Or at least they often set that up, if I'm going to join some kind of system, that I've checked the box somewhere. Or I'm thinking of employers. You know, you and I work at universities, I'm assuming that the university can, at any time that it wants, go and look through our emails. Am I right about that?
AF: You are. And you've agreed to that. And sometimes you get that, you know, pop-up screen when you log back on. That's true. And what's even, I guess, more difficult for the company is, like, if you're a U.S. company and a law enforcement officer with a duly authorized warrant shows up to get the data, there’s really, there's not much you can do. That’s the law, you have to turn it over. So, if you have it, and some companies have chosen not to collect it, but if you're a data collection company and that's your business plan, like, collecting it is what you're doing, so if you have it and they come for it and they have a warrant, it's very hard to push back. It's very hard to say no. And that means that the warrant protection that we kind of have in the back of our heads, oh, there's some judge, you know, who will make sure this isn't abused, isn't that protective. It's not nothing, it's something, it's better than, you know, not having it at all, but it shouldn't give you confidence that your data will remain secure. Because it's very easy to get a warrant to get it.
GR: We've only got about a couple minutes left, and there are two big questions, and I'm sorry that I'm going to be unfair to you this way, but two big questions I want to try to squeeze in at the end. One is, give us an idea of at least the kinds of changes you might like to see. I know you said the main point of the book is to stimulate our conversations about this, but, are there any big changes that if you could wave a magic wand, you'd like to see?
AF: Sure. You know, toward the end of the book, I have separate chapters. I have a chapter essentially written for judges to say, this is what you could do using the Fourth Amendment to sort of strengthen our protections of data privacy using existing law, and my various, you know, academic theories about this. I have a chapter for legislators: if you are in the legislature and you're concerned about the issues raised in this book, here's how you can build in certain kinds of heightened protections or build on parallels that we already have in the law. It's actually not that high a step to do it. We can do it pretty easily, and I try to lay out what those answers would be. And then there's a chapter for communities, and what communities can do and how communities can support this sort of collective action. I purposely don't focus on individuals because that's sort of a false framing. Like, I can't negotiate with Amazon or the FBI as an individual, I can't change those terms of service. But as a community, we can say maybe we don't want this kind of technology, maybe we don't want to share our data in certain ways. And legislators can regulate any of this or all of it if they wish to. If they had the political will, they could easily make those choices.
GR: Well, let me focus on the individual with about 30 seconds left. I get what you're saying about how we don't have the same leverage that governments at different levels, or judges, do. Well, if you could just, with a couple seconds left, what advice would you give us as individuals to practice better self-surveillance hygiene, I suppose is the way I would put it?
AF: Sure. I mean, you should educate yourself. I think you should make informed choices about whether you really do need the cat-cam to watch your cat when you're working. Whether the connections, right, you can have a ring doorbell and not connect it to a larger system, you can make those choices. You don't have to necessarily click ‘I agree’ and give away all your data for all reasons just because you're too lazy to figure out, you know, where it might go. And more importantly, you can support the advocacy groups, the community groups that are pushing back against these technologies in more cohesive and forceful ways. And so, like, there's a whole chapter on supporting the journalists who are exposing the technology and the legislators who are doing some real hard work on this.
GR: Well, we'll have to leave it there. That was Andrew Guthrie Ferguson, and again, his new book is titled, “Your Data Will Be Used Against You: Policing in the Age of Self-Surveillance”. If you're already paranoid, my advice is not to read this book, but otherwise it is a fascinating read as well as eye-opening. Professor Ferguson, thanks again for talking with me, really appreciate it.
AF: Thank you.
GR: You've been listening to the Campbell Conversations on WRVO Public Media, conversations in the public interest.
