We continue our conversation with cyber and tech attorney Camille Stewart on discerning one's appetite for risk. In other words, how much information are you willing to share online in exchange for something free?
It's a loaded question and Camille takes us through the lines of questioning one would take when taking a fun quiz or survey online. As always, there are no easy answers or shortcuts to achieving the state of privacy savvy nirvana.
What's also risky is connecting laws made for the physical world directly to cyberspace. Camille warns: if we start making comparisons just because, at face value, the connection appears similar but in reality isn't, we may set ourselves up to truly stifle innovation.
And if anybody remembers Henrietta Lacks, her data was used to create all of these things that are very wonderful, but she never got any compensation for it. Not knowing how your information is used takes away all of your control, right? And in a world where your data is commoditized and has a value, you should be in control of the value of your data. Whether it's as simple as giving away our right to choose how and when we disburse our information, or privacy that leads to security implications, those things are important.
For example, you don't care that there's information about you pooled and aggregated from a number of different places, because you've posted it freely or because you traded it for a service that's very convenient. Until the moment when you realize that, because you took the quiz and let this information out, or because you didn't care that your address was posted on a Spokeo-type site or somewhere else, all of the answers to your banking security questions are now easily searched on the internet and probably being aggregated by some random organization. So somebody could easily say, "Oh, what's your mother's maiden name? Okay. And what city do you live in? Okay. And what high school did you go to? Okay."
And those are three pieces of information that maybe you didn't post in the same place, but you posted them and didn't care because you traded them for something, or you posted them and didn't think it through. Now they can be aggregated, and because you use those same answers for everything, someone has access to your bank account, they've got access to your email, they've got access to all of these things that are really important to you, and your privacy has now translated into your security.
So, just like organizations have to assess their appetite for risk, we as individuals have to do the same. And so if you are willing to take the risk because you think either, "They won't look for me," or, "I'm willing to take the hit because my bank will reimburse me," or whatever decision you are making, I want you to be informed.
I'm not telling you what your risk calculus is, but I wanna encourage people to understand how information can be used, understand what they're putting out there, and make decisions accordingly. So your answer to that might be, "Look, I don't wanna give up Facebook quizzes or sharing information in a community that I trust on some social site, but what I will do is keep a set of wrong answers to those normal password-reset questions, answers that I don't share with anyone, where only I know the fake answers I'm using."
So instead of your actual mother's maiden name, you're using something else, and you've decided that that's one of the ways you will protect yourself because you really wanna keep using these other tools. So I challenge people not to give up the things that they love, but I would assess whether or not certain things are worth the risk, right?
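The strategy Camille describes, treating security questions as a second password rather than as facts about yourself, can be sketched in a few lines. This is a hypothetical illustration, not anything from the episode: it generates a random fake "answer" per question, which you would then store in a password manager.

```python
import secrets
import string


def fake_answer(length: int = 16) -> str:
    """Generate a random string to use in place of a real security answer."""
    alphabet = string.ascii_lowercase + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))


# One distinct fake answer per question; none of them reveal real facts.
questions = ["mother's maiden name", "city of birth", "high school"]
answers = {q: fake_answer() for q in questions}
```

The point is that even if someone aggregates your real maiden name, city, and high school from public sources, none of those facts unlock your accounts, because the answers on file are random strings only you have recorded.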
Like a quiz on Facebook that makes you provide data to an external third party, when you're not really sure how they're using it? Not likely worth it. But the quizzes you can just kinda take, those might be worth it. I mean, the answers you provide still are revealing about you, but maybe not in a way that's super impactful. Maybe in a way that's likely just for marketing, and if you're okay with that, then take it, or you go the other way.
Versus if there is human input, we would decide that that is something they can then own the production of, right, because they contributed to the making of whatever the end product is. It's hard to speculate, but there will have to be a line drawn, and it's likely somewhere in there, right? In the sense that there is enough human intervention, whether that is in the input to whatever creative process is happening by the machine, or in the creation of the process or program or software that is being used and then spits out some creation at the end. There will have to be a law, or at least case law, that dictates where that line is drawn.
But those will be the fun things, right? For Tiffany and other lawyers like myself, I think what we enjoy most about this space is that that stuff is unclear. And as these things roll out, you get to make connections between the monkey case and AI, and between other things that have already happened and new processes, new tech, new innovations, and try to help draw those lines.
And so it's dangerous to make those comparisons without some level of assessment. I would tell people to challenge those assessments when you hear them and try to poke holes in them, because bad facts make for bad law. And if we take the easy route and just start making comparisons because on their face they seem similar, we may set ourselves up to truly stifle innovation, which is exactly what we're trying to prevent.
The physical world is not the same as cyberspace, and to liken the two in the way that you apply rules is not smart, right? Your first inclination is to wanna stop data flow at the edge of a country, at the edge of some imaginary border, but that is not realistic, because the internet by its very nature is global and interconnected and traverses the world freely, and you can't really stop things at that line. Which is why things like GDPR are important for organizations across the world: as a company that has a global reach because you're on the internet, you will be affected by how laws are created in different localities.
So that's a very big example, but it happens in subtler ways too when it comes to technology, cyberspace, and the laws that operate in physical space. So I would challenge people: when you hear someone make a one-for-one connection very easily, without some level of assessment, question it, to make sure it really is the best way to adapt something to the given situation.
Take, for example, Tiffany's likening of AI to this monkey case. It's an easy connection to make because in your head you think, "Well, the monkey is not human, they made a thing, and if they can't own the thing, then when a machine makes a thing online, it can't own the thing." But it very well may not be the same analysis that needs to be made in that setting, right? The lines may be drawn very differently, because none of us could create a monkey. So if I can't create a monkey, then it's harder to control the output of that monkey. But I could very well create a machine that could then create an output, and shouldn't I be the owner of that output if I created the machine that created it?
Want to join us live? Save a seat here: https://www.varonis.com/state-of-cybercrime
More from Varonis ⬇️
Visit our website: https://www.varonis.com
LinkedIn: https://www.linkedin.com/company/varonis
X/Twitter: https://twitter.com/varonis
Instagram: https://www.instagram.com/varonislife/