I recently had the chance to speak with former Ontario Information and Privacy Commissioner Dr. Ann Cavoukian about big data and privacy. Dr. Cavoukian is currently Executive Director of Ryerson University’s Privacy and Big Data Institute and is best known for her leadership in the development of Privacy by Design (PbD).
What’s more, she came up with PbD language that made its way into the GDPR, which will go into effect in 2018. First developed in the 1990s, PbD addresses the growing privacy concerns brought on by big data and IoT devices.
Many worry that PbD interferes with innovation and business, but Dr. Cavoukian argues that's not the case.
When working with government agencies and organizations, Dr. Cavoukian’s singular approach is that big data and privacy can operate together seamlessly. At the core, her message is this: you can simultaneously collect data and protect customer privacy.
Transcript
With Privacy by Design principles codified in the new General Data Protection Regulation, which will go into effect in 2018, it might help to understand its intent and origins. That's why I called former Ontario Information and Privacy Commissioner Dr. Ann Cavoukian. She is currently Executive Director of Ryerson University's Privacy and Big Data Institute and is best known for her leadership in the development of Privacy by Design. When working with government agencies and organizations, Dr. Cavoukian's singular approach is that big data and privacy can operate together seamlessly. At the core, her message is this: you can simultaneously collect data and protect customer privacy.
Thank you, Dr. Cavoukian, for joining us today. I was wondering, as Information and Privacy Commissioner of Ontario, what did you find was effective when convincing organizations and government agencies to treat people's private data carefully?
The approach I took...I always think that the carrot is better than the stick, and I did have order-making power as Commissioner. So I had the authority to order government organizations, for example, who were in breach of the Privacy Act to do something, to change what they were doing and tell them what to do. But the problem...whenever you have to order someone to do something, they will do it because they are required to by law, but they're not gonna be happy about it, and it is unlikely to change their behavior after that particular change that you've ordered. So, I always led with the carrot in terms of meeting with them, trying to explain why it was in both their best interest, in citizens' best interest, in customers' best interest, when I'm talking to businesses. Why it's very, very important to make it...I always talk about positive sum, not zero sum, make it a win-win proposition. It's gotta be a win for both the organization who's doing the data collection and the data use and the customers or citizens that they're serving. It's gotta be a win for both parties, and when you can present it that way, it gives you a seat at the table every time. And let me explain what I mean by that. Many years ago I was asked to join the board of the European Biometrics Forum, and I was honored, of course, but I was surprised because in Europe they have more privacy commissioners than anywhere else in the world. Hundreds of them, they're brilliant. They're wonderful, and I said, "Why are you coming to me as opposed to one of your own?" And they said, "It's simple." They said, "You don't say 'no' to biometrics. You say 'yes' to biometrics, and 'Here are the privacy protective measures that I insist you put on them.'" They said, "We may not like how much you want us to do, but we can try to accommodate that. But what we can't accommodate is if someone says, 'We don't like your industry.'" You know, basically to say "no" to the entire industry is untenable. So, when you go in with an "and" instead of a "versus," it's not me versus your interests. It's my interests in privacy and your interests in the business or the government, whatever you're doing. So, zero sum paradigms are one interest versus another. You can only have security at the expense of privacy, for example. In my world, that doesn't cut it.
Dr. Cavoukian, can you tell us a little bit more about Privacy by Design?
I really crystallized Privacy by Design after 9/11, because that's when it became crystal clear that everybody was talking about the vital need for public safety and security, of course. But it was always construed as coming at the expense of privacy: if you have to give up your privacy, so be it. Public safety's more important. Well, of course public safety is extremely important, and we did a position piece at that point for our national newspaper, "The Globe and Mail," and the position I took was that public safety is paramount, with privacy embedded into the process. You have to have both. There's no point in just having public safety without privacy. Privacy forms the basis of our freedoms. If you wanna live in a free and democratic society, you have to be able to have moments of reserve and reflection and intimacy and solitude. You have to be able to do that.
Data minimization is important, but what do you think about companies that collect everything in the hope that they might use it in the future?
See, what they're asking for, they're asking for trouble, because I can bet you dollars to doughnuts that's gonna come back to bite you. Especially with data that you're not clear about what you're gonna do with, you've got data sitting there. What data in identifiable form does is attract hackers. It attracts rogue employees on the inside who will make inappropriate use of the data, sell the data, do something with the data. It just...you're asking for trouble, because keeping data in identifiable form, once the uses have been addressed, just begs trouble. I always tell people, if you wanna keep the data, keep the data, but de-identify it. Strip the personal identifiers, make sure you have the data aggregated, de-identified, encrypted, something that protects it from this kind of rogue activity. And you've been reading lately all about the hackers who are in, I think they were in the IRS for God's sakes, and they're getting in everywhere here in my country. They're getting into so many databases, and it's not only appalling in terms of the data loss, it's embarrassing for the government departments who are supposed to be protecting this data. And it fuels even additional distrust on the part of the public, so I would say to companies, "Do yourself a huge favor. If you don't need the data, don't keep it in identifiable form. You can keep it in aggregate form. You can encrypt it. You can do lots of things. Do not keep it in identifiable form where it can be accessed in an unauthorized manner, especially if it's sensitive data." Oh my god, health data...Rogue employees, we have a rash of it here, and it's just curiosity, it's ridiculous. The damage is huge for patients, and I can tell you, I've been a patient in hospitals many times. The thought that anyone else is accessing my data...it's so personal and so sensitive. So when I speak this way to boards of directors and senior executives, they get it. They don't want the trouble, and I haven't even talked costs. Once these data breaches happen these days, it's not just lawsuits, it's class action lawsuits that are initiated. It's huge, and then the damage to your reputation, the damage to your brand, can be irreparable.
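To make that advice concrete, here is a minimal sketch of what stripping identifiers, generalizing, and pseudonymizing a retained record might look like. The field names, the salt handling, and the decade bucketing are illustrative assumptions rather than a prescribed protocol, and note that salted hashing is pseudonymization, not full anonymization.

```python
# A minimal sketch of the de-identification Dr. Cavoukian describes:
# drop direct identifiers, generalize quasi-identifiers, and keep only
# aggregate or pseudonymous fields. All field names are hypothetical.

import hashlib

DIRECT_IDENTIFIERS = {"name", "email", "ssn", "phone"}

def deidentify(record: dict, salt: str) -> dict:
    """Return a copy of `record` with direct identifiers removed."""
    clean = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

    # Generalize a quasi-identifier: exact birth year -> decade bucket.
    if "birth_year" in clean:
        clean["birth_decade"] = (clean.pop("birth_year") // 10) * 10

    # Replace the stable key with a salted hash so records can still be
    # linked for analytics without exposing the original customer ID.
    if "customer_id" in clean:
        token = f"{salt}:{clean.pop('customer_id')}"
        clean["pseudonym"] = hashlib.sha256(token.encode()).hexdigest()[:16]
    return clean

record = {"name": "Pat", "email": "pat@example.com",
          "customer_id": "C-1042", "birth_year": 1968, "plan": "basic"}
print(deidentify(record, salt="rotate-this-salt-per-release"))
```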
Right. Yeah, I remember Meg Whitman said something about how it takes years and years to build your brand and reputation, and seconds to ruin them.
Yeah, yes. That is so true. There's a great book called "The Reputation Economy" by Michael Fertik. He's the CEO of reputation.com. It's fabulous. You'd love it. It's all about exactly how long it takes to build your reputation, how dear it is and how you should cherish it and go to great lengths to protect it.
Can you speak about data ownership?
You may have custody and control over a lot of data, your customers' data, but you don't own that data. And with that custody and control comes an enormous duty of care. You gotta protect that data, restrict your use of the data to what you've identified to the customer, and then if you wanna use it for additional purposes, you've gotta go back to the customer and get their consent for secondary uses of the data. Now, that rarely happens, I know that. In Privacy by Design, one of the principles talks about privacy as the default setting. The reason you want privacy to be the default setting...what that means is that if a company has privacy as the default setting, they can say to their customers, "We can give you privacy assurance from the get-go. We're collecting your information for this purpose," so they identify the purpose of the data collection. "We're only gonna use it for that purpose, and unless you give us specific consent to use it for additional purposes, the default is we won't be able to use it for anything else." It's a model of positive consent. It gives privacy assurance, and it gives enormous, enormous trust and consumer confidence in terms of companies that do this. I would say to companies, "Do this, because it'll give you a competitive advantage over the other guys."
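As a rough illustration of privacy as the default setting, the sketch below models consent where the disclosed primary purpose is permitted and every secondary use is denied unless the customer explicitly opts in. The purpose names and the ConsentStore class are hypothetical, not an API from any particular product.

```python
# A minimal sketch of "privacy as the default setting": secondary uses
# of data are off unless the customer gives positive consent.

from dataclasses import dataclass, field

PRIMARY_PURPOSE = "order_fulfillment"   # disclosed at collection time
SECONDARY_PURPOSES = {"marketing", "analytics", "third_party_sharing"}

@dataclass
class ConsentStore:
    # Default state: no secondary purposes are permitted.
    opted_in: set = field(default_factory=set)

    def opt_in(self, purpose: str) -> None:
        if purpose not in SECONDARY_PURPOSES:
            raise ValueError(f"unknown purpose: {purpose}")
        self.opted_in.add(purpose)

    def allowed(self, purpose: str) -> bool:
        # The disclosed primary purpose is always permitted;
        # everything else requires explicit, positive consent.
        return purpose == PRIMARY_PURPOSE or purpose in self.opted_in

consent = ConsentStore()
assert consent.allowed("order_fulfillment")   # primary use: allowed
assert not consent.allowed("marketing")       # default: denied
consent.opt_in("marketing")
assert consent.allowed("marketing")           # only after explicit opt-in
```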
As you know, because you sent it to me, the Pew Research Center's latest study on Americans' attitudes shows how high the numbers are, in the 90-percent range. People have had it. They want control. This is not a single study. There have been multiple surveys that have come out in the last few months like this. Ninety percent of the public don't trust the government or businesses or anyone. They feel they don't have control. They want privacy. They don't have it. So ever since Edward Snowden, actually, you have the highest level of distrust on the part of the public and the lowest levels of consumer confidence. So, how do we change that? When I talk to businesses, I say, "You change that by telling your customers you are giving them privacy. They don't even have to ask for it. You are embedding it as the default setting, which means it comes part and parcel with the system." They're getting it. I do what I call my neighbors test. I explain these terms to my neighbors, who are very bright people, but they're not in the privacy field. So, when I was explaining this to my neighbor across the street, Pat, she said, "You mean, if privacy's the default, I get privacy for free? I don't have to figure out how to ask for it?" And I said, "Yes." She said, "That's what I want. Sign me up!"
See, people want to be given privacy assurance without having to go to the lengths they have to go to now: find the privacy policy, search through the terms of service, find the right checkbox. I mean, it's so full of legalese. It's impossible for people to do this. They wanna be given privacy assurance as the default. That's your biggest bet if you're a private-sector company. You will gain such a competitive advantage. You will build the trust of your customers, you will have enormous loyalty, and you will attract new opportunity.
What are your Privacy by Design recommendations for wearables and IoT innovators and developers?
The internet of things, wearable devices, new app developers and startups...they are clueless about privacy, and I'm not trying to be disrespectful. They're working hard, say an app developer, they're working hard to build their app. They're focused on the app. That's all they're thinking about, how to deliver what the app's supposed to deliver on. And then you say, "What about privacy?" And they say, "Oh, don't worry about it. We've got it taken care of. You know, the third-party security vendor's gonna do it. We got that covered." They don't have it covered, and what they don't realize is they don't know they don't have it covered. "Give it to the security guys and they're gonna take care of it," and that's the problem. When I speak to app developers...I was at Tim O'Reilly's Web 2.0 last year or the year before, and there were 800 people in the room. I was talking about Privacy by Design, and I said, "Look, do yourself a favor. Build in privacy. Right now, when you're just starting your app development, build it in right now at the front end, and then you're gonna be golden. This is the time to do it, and it's easy if you do it up front." I had dozens of people come up to me afterwards because they didn't even know they were supposed to. It had never appeared on their radar. It's not resistance to it. They hadn't thought of it. So our biggest job is educating, especially the young people, the app developers, the brilliant minds. In my experience, it's not that they resist the messaging, it's that they haven't been exposed to the messaging. Oh, I should just tell you, we started a Privacy by Design certification. We've partnered with Deloitte, and I'll send you the link. Ryerson University, where I am housed, is offering this certification for Privacy by Design, but my assessment arm, my audit arm, my partner, is Deloitte, and we've had a real deluge of interest.
So, do you think that's also why companies are hiring Chief Privacy Officers?
What are some qualities that are required in a Chief Privacy Officer? Is it just a law background?
No, in fact, I'm gonna say the opposite, and this is gonna sound like heresy to most people. I love lawyers. Some of my best friends are lawyers. But don't restrict your hiring of Chief Privacy Officers to lawyers. The problem with hiring a lawyer is that they're understandably going to bring a legal, regulatory-compliance approach to it, which, of course, you want covered. I'm not saying otherwise...You have to be in compliance with whatever legislation is in your jurisdiction. But if that's all you do, it's not enough. I want you to go farther. When I ask you to do Privacy by Design, it's all about raising the bar: taking technical measures such as embedding privacy into the design of what you're offering, into the data architecture, embedding privacy as a default setting. That's not a legalistic term. It's a policy term. It's computer science. You need a much broader skill set than law alone. So, for example, I'm not a lawyer, and I managed to be Commissioner for three terms. And I certainly valued my legal department, but I didn't rely on it exclusively. I always went farther, and if you're a lawyer, the tendency is just to stick to the law. I want you to do more than that. You have to have an understanding of computer science, technology, encryption. De-identification protocols are critical, combined with a framework for assessing the risk of re-identification. When you look at the big data world, the internet of things, they're going to do amazing things with data. Let's make sure it's strongly de-identified and resists re-identification attacks.
There have been reports that people can re-identify individuals in de-identified data.
That's right, but if you examine those reports carefully, Cindy, a lot of them are based on studies where the initial de-identification was very weak. They didn't use strong de-identification protocols. So, like anything, if you start with bad encryption, you're gonna have easy decryption. It's all about doing it properly at the outset using proper standards. There are now four risk-based standards of de-identification that have come out, and they're excellent.
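In that spirit, here is a minimal, illustrative risk-based check: compute k-anonymity over a set of quasi-identifiers and flag the dataset when the smallest group falls below a chosen threshold. The field names and the threshold are assumptions for illustration; the actual standards she mentions are far more comprehensive than this single metric.

```python
# A toy risk-based de-identification check: k-anonymity over
# quasi-identifiers. A dataset is k-anonymous if every combination of
# quasi-identifier values is shared by at least k records.

from collections import Counter

QUASI_IDENTIFIERS = ("zip3", "birth_decade", "gender")  # hypothetical fields
K_THRESHOLD = 5  # each combination must be shared by at least 5 records

def k_anonymity(records: list) -> int:
    """Return the smallest equivalence-class size over the quasi-identifiers."""
    groups = Counter(tuple(r[q] for q in QUASI_IDENTIFIERS) for r in records)
    return min(groups.values())

records = [
    {"zip3": "100", "birth_decade": 1960, "gender": "F"},
    {"zip3": "100", "birth_decade": 1960, "gender": "F"},
    {"zip3": "902", "birth_decade": 1980, "gender": "M"},  # unique: risky
]
k = k_anonymity(records)
if k < K_THRESHOLD:
    print(f"re-identification risk too high: k={k} < {K_THRESHOLD}")
```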
Are you a fan of possibly replacing privacy policies with something simpler, like a nutrition label?
It's a very clever idea. They have tried to do that in the past. It's hard to do, and I think your simplest route to a nutrition-style label would be to embed privacy as the default setting. Because then you could have a nutrition label that said, "Privacy built in." You know how, I think, Intel had something years ago where you had "security built in" or something. You could say, "Privacy embedded in the system."