By Državljan D
We sat down with award-winning investigative journalist Mackenzie Funk, whose book, “The Hank Show: How a House-Painting, Drug-Running DEA Informant Built the Machine That Rules Our Lives”, tells the origin story of the surveillance capitalism we know and hate today.
We talk about the book and the man behind the story, but we also touch upon his legacy, the surveillance capitalism that stems from the data economy and big-data intermediaries, and the ways we have to address this on a local but also a systemic level.
Also included in the conversation are critiques of predictive policing, the issue of the almost unchecked business of data analytics, and the way forward.
00:00:06 Domen Savič / Citizen D
Welcome everybody. It’s the 23rd of October 2023, but you’re listening to this episode of Citizen D podcast on the 15th of November of the same year.
With us today is Mackenzie Funk, an award-winning investigative journalist whose second book, and it’s a mouthful, “The Hank Show: How a House-Painting, Drug-Running DEA Informant Built the Machine That Rules Our Lives”, tells a story about, well, I’m just going to say it, it basically describes the birth of surveillance capitalism.
Hello, Mack. Welcome to the show. Would you say the description is accurate?
00:00:44 Mackenzie Funk
Yes, you’re right. That is a mouthful. I never had an easy way to explain what the book was about, except that it was… Yes, it was about this person none of us had ever heard of, who started so much of this world we now live in.
00:01:01 Domen Savič / Citizen D
And before we start with the book, the first question I’d like to ask is: you’re an award-winning investigative journalist, and you’ve focused mostly on environmental stories, global warming and the like. What made you pick the Hank Asher story?
00:01:21 Mackenzie Funk
I think, as crazy as it sounds, that surveillance and climate change are very similar in some ways, especially to an audience that doesn’t know very much about either one.
They can seem very dense and very boring at first, and so you need to find a way to tell a story that is not just about the facts of it. So, in the case of climate change, it was just about here’s the science. And so, for many years before, I was working on my climate change book.
You would have these arguments in the United States between the people who believed in the science and the people who did not, and it was not a very good way to win the argument, just to be right, just to have the scientists have the right facts.
And so, I was trying to find a way with the climate change book to tell a story that would maybe show the stakes, show why people should care.
When it came to surveillance, that was my same technique, as I focused on a person, because I figured even if you’re not interested in privacy or surveillance, you might be interested in this person. But the second thing is that they’re both these big systems, and the reason some people find them boring or hard to understand is that they’re so complex.
But there are these big systems that seem to fall the heaviest on the people who are poorest among us and the most vulnerable. It’s becoming more obvious with both climate change and privacy that the poorest people in the world, the poorest people in each of our countries, those are the people who are bearing the brunt of this and that there can even be winners in these new economies from climate change and from surveillance capitalism.
And those winners are not the poorest; they’re the usual winners in our societies. And I find both of these, climate change and our lack of privacy, to be accelerating some of the worst inequities in society.
00:03:25 Domen Savič / Citizen D
Before we jump into that, I saved this part for the last part of the show. But first let’s talk about the book. So how did you find this guy, Hank Asher? I’ll let you tell the story or give a brief recap, but this was…
Reading your book was the first time I heard about it, and I have been working in the field of, you know, digital privacy and digital activism for the last fifteen to twenty years.
00:03:56 Mackenzie Funk
Yeah, I would not say that I was by any means a privacy expert when I started this, but I certainly paid attention. From reporting abroad, especially in places like China and Russia, I was very careful about… I tried to understand where my information was going and who would see it, and after the Snowden revelations in the United States, I became a little careful about even what my own government was looking at.
And I had never heard of Hank Asher. His name first came up when my… it’s a complex story, but a magazine approached me. An editor I knew, and he said I have a story for you.
And the story was about this group of people who were trying to stop child predators as they called them, and they were using this software built by a person named Hank Asher, who they described on their website as the father of data fusion. So, I was looking at this group, and I saw that note on their website. And I said, who, what?
And that is what began my journey. I then googled the name and saw that he had been a cocaine smuggler, that he made a fortune multiple times and then lost it in Florida in the 80s, and I saw that he had this crazy story, that he was involved in the security build-up after 9/11 in the United States.
That his technology undergirded some of the biggest surveillance companies in the world and certainly in this country, and that it was just this character nobody had ever heard of. And I found that amazing.
But what really did it for me was when I stumbled upon his obituary page, because he had died by the time I heard his name. The things that people wrote about this man, just on the online obituary website his company set up for him, and then another one the funeral home put up, are just not the kinds of things you see written about most dead people, you know.
You read obituaries, you read the comments people make, and they say nice things about people. But the ones about Asher were like: he changed my life. He paid for my kids’ college. He fixed my broken teeth. Or: he yelled at me, he swore at me, he was the craziest person I ever met. He changed this country forever. He changed my life forever. He just seemed to have this outsized impact, not only on privacy, but also on the people around him.
00:06:39 Domen Savič / Citizen D
The book is full of these anecdotes or happenings in his life, and, I mean, the title tells a pretty good story. So, you have a guy who was drug running and who was an informant for the DEA, but at the same time he was saving lives, helping people, you know, chase down child molesters and such.
But what was, in your view, the thing that surprised you the most when you were doing the research for the book?
00:07:16 Mackenzie Funk
Of course, everything about it. I got perhaps a little too obsessed with understanding what exactly he was doing in the 1980s, when the center of all drug smuggling in the United States was Florida, because of its proximity to Latin America, to Colombia, to Jamaica, to the Caribbean, and I became obsessed with that. But that was interesting, not surprising.
The biggest surprise was how these open records laws came to be exploited. In this country, they were a good thing: we wanted transparency in the states, especially after some scandals, Watergate and others. They decided to make sure that the public could know what the government was doing in their name, and so many states opened up their records, opened up their books, so that citizens could see what the government was doing. And people like Hank Asher, in states like Florida, which had very open public records, exploited this.
Using this state law, they were able to go in and get all of the driver’s license records, all of the vehicle registration records, marriage records, birth records, divorce records, all the housing records. Everything, everything you can imagine that a local government would use as a citizen moves through their life.
This was all public record, and because the governments wanted it to be, they wanted citizens to know what they were doing, they opened it up to a new species of person: these data aggregators.
And the technology had changed: by the time Asher came along, in the late 80s and 1990s, you were suddenly able to scoop all this up and make sense of it in a way that I don’t think you could have when these laws were written.
So that was the big surprise, that this very progressive policy became something very different.
Domen Savič / Citizen D
Say if we compare Hank Asher’s period, where he was buying up or gathering all of these data points and putting them into a searchable database, would you say there are similarities if we draw a comparison with Mark Zuckerberg or Elon Musk or any of the other bosses of the big digital intermediaries?
00:09:54 Mackenzie Funk
Yeah, there are similarities and an important difference. The big trick in Hank Asher’s era, we’re talking the 1990s and early 2000s, is that they are going out to all these different databases and trying to make sense of the little tidbits of information they get from each one; they’re trying to connect all of them to a single person.
And to do that, they ended up assigning a tracking number to each American citizen and each American resident, and also many people across the world, so that each of us has our own bar code or something like this.
And then they had very tricky rules, rules that became algorithms and have now been perfected through machine learning, that assign each new data point that comes in to an individual with some degree of faith that it is going to the right person. I am not the only Mackenzie Funk in the United States.
It turns out there are many of us and they tend to know when I’ve moved or when I’ve bought something. And they do so because they’re able to see, of course, my address, but also the people I’ve lived with and my age, my gender, these kinds of details. So, they can with some faith put this new data point with me.
For someone with a name like John Smith, two very common names in this country, it’s much more difficult, and yet they would have ways to make sense of this.
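The record-matching logic Funk describes, attaching a new data point to the right person among several who share a name, can be sketched roughly as follows. Everything here (field names, weights, thresholds, data) is hypothetical and only illustrates the general idea, not anything from Asher’s actual products.

```python
# Minimal sketch of record linkage ("data fusion"): match an incoming data
# point to one of several candidate people by scoring overlap on attributes
# beyond the name. All fields, weights and records below are made up.

def match_score(incoming: dict, candidate: dict) -> float:
    """Score how likely the incoming record refers to this candidate person."""
    score = 0.0
    if incoming.get("address") in candidate.get("address_history", []):
        score += 0.5                       # shared address: strong evidence
    if incoming.get("birth_year") == candidate.get("birth_year"):
        score += 0.3
    shared = set(incoming.get("cohabitants", [])) & set(candidate.get("cohabitants", []))
    score += 0.1 * min(len(shared), 2)     # people they have lived with
    return score

def link(incoming: dict, candidates: list[dict], threshold: float = 0.6):
    """Return the tracking ID of the best candidate above the threshold, else None."""
    best = max(candidates, key=lambda c: match_score(incoming, c), default=None)
    if best is not None and match_score(incoming, best) >= threshold:
        return best["tracking_id"]
    return None

# Two records for the same common name: only the other attributes disambiguate.
candidates = [
    {"tracking_id": "A-001", "birth_year": 1970,
     "address_history": ["12 Palm Ave"], "cohabitants": ["Mary Smith"]},
    {"tracking_id": "A-002", "birth_year": 1988,
     "address_history": ["4 Oak St"], "cohabitants": ["Ann Lee"]},
]
incoming = {"address": "12 Palm Ave", "birth_year": 1970, "cohabitants": ["Mary Smith"]}
print(link(incoming, candidates))  # → A-001
```

A real system would use many more signals and probabilistic weights learned from data, but the shape of the problem, score candidates and keep the match only above some confidence threshold, is the same one the transcript describes.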
Contrast that with what Zuckerberg built, which is: you have to use your real name on Facebook, and the moment you do, you become your own aggregator. Zuckerberg did not need a fancy algorithm or a fancy set of rules to attach each new data point to an individual, because we did that for him; we did it by logging in under our real names.
We did it by putting Facebook on our computers and on our phones, and so it almost turns the problem on its head from the perspective of the aggregator. I think that’s the biggest difference between today and Asher’s day: back then, the problem was that information was coming in from all sides and we needed to find a way to really attach it to a person.
In Zuckerberg’s era, it was: how do we make people build their own databases and then give us access to them? And I think right now, in a maybe post-Facebook era, we’re a little bit in a hybrid space between the two. The same problem Asher solved is now being solved again from the data broker perspective; they’re trying to do it again.
00:12:51 Domen Savič / Citizen D
And just to continue your train of thought… What are some of the lessons we can learn from his story regarding the data economy, regarding these data points, data aggregation, the use of data in policing and surveillance?
00:13:19 Mackenzie Funk
Yeah. Well, one thing is that Asher, I think, had high-minded goals with what he was doing and why he was doing it. Yes, he wanted to get rich, but like many people in Silicon Valley, say 15 years ago, he kind of believed he was the good guy, that he was aggregating all this information to do good things, and that included, especially in his case, using this information to go after child predators, to go after child molesters, and to help the police do that.
Then after the attacks of 9/11, he used it to go after supposed terrorists, he thought it would help the police solve crimes. He thought it would eventually, you know, help companies determine lower rates for people who had good driving records, for instance, or otherwise look good. And it was hard for Hank Asher to see the downside of what he built.
But I think he was an early stand-in for people like Zuckerberg, who may have claimed high-minded principles at the beginning, you know, making the world more connected or whatever the slogan was in the early days of Facebook. And yet all this stuff has such a dark side that if you build this weapon, eventually you’re not going to be the one in control of where it’s aimed, or it will take on a life of its own.
And with Asher, you could see the dark side of this pretty quickly. His technology was used, not by him, but by the people who took control of his first company, to change the results of one of the most important elections in this country by kicking black Americans off the voter rolls in Florida. They did this in the name of keeping fraud out of elections, but the end result, it appears, was that George Bush beat out Al Gore in the 2000 election in Florida and therefore won the entire country.
You can draw a line from what Asher built to that moment, not something that he would have wanted. He actually was in favor of the Democrats in favor of Al Gore, but that’s how his technology was used, I don’t think that he would like how it was used much later in Florida, again disenfranchising and taking away the vote of black Americans.
I don’t know that he would have liked how it misidentified various Muslims in this country as somehow being terrorists. But it did. I don’t think he would like how people were wrongly accused because a police officer who was badly trained in using this information would pick the name of somebody from his database and say “Oh, this has got to be our suspect”, but that’s happened.
So, I think that’s the parable you can draw from what he built: this stuff has such power, and eventually its creator won’t be in control of where this power goes.
00:16:44 Domen Savič / Citizen D
Do you think… what you just said… that this lends itself to a theory that, you know, technology is neutral and that it matters who’s running the show if I can borrow a phrase from the title of the book, or would you say that technology has some inherent biases and inherent problems built in, and the driver behind the steering wheel isn’t the only, let’s say good or bad switch?
00:17:20 Mackenzie Funk
That’s a good question. I think the reality is that it is both things, I think the driver matters quite a bit, but also the technology can have something built in that makes it inherently worse or better, and I don’t think that it has to be one or the other.
And that again is the lesson I would see here, if you think about what Hank Asher built and why he built it, in some sense, it’s a suspicion machine. Keep in mind that this is a former cocaine smuggler who suddenly is wanting to work with the police, he knows that he has something hidden in his own background, he thinks like a criminal.
He knows how the Drug Enforcement Agency, the DEA, has gone after other drug smugglers, and he’s seen how they put information together and draw connections. And so he kind of built a machine that would show hidden connections between people, or between people and what they were trying to hide, for instance their assets.
And a lot of the machine was built based on his own psychology. Because of that, it surfaces things that most Americans would think would be hidden, like if you’ve lived with the same person three times in a row; they know that because his machines have your entire address history and they can see “Oh, these people were roommates.”
What if, in a time when this country was much more anti-gay, that was your husband? If you’re a man and that’s your husband, but you were keeping your relationship a secret… Well, that’s the kind of hidden information that his machines would surface.
And they basically remember every little thing that you might want to hide, or think would be hidden. And that’s based in part on who the maker was and how he started to think about what would be useful to investigators and what criminals or other people might be hiding… So yeah, there’s some inherent danger to the technology, but how it gets used very much depends on who’s in control of it.
00:19:44 Domen Savič / Citizen D
And if I can just backtrack a little bit: you’ve mentioned that after Watergate, the government wanted to be transparent and open. Nowadays you have digital activists, you have human rights activists, arguing for more transparency in how the government works, how the police work, in these supposedly closed systems.
But on the other side, transparency on its own… I sometimes get the feeling it’s not doing much good, in the sense that, OK, sure, you open up all the databases and people are drowning in these data points. And every once in a while you have this Hank Asher type of person who makes sense of it all, and a little bit of money on the side.
So how would you argue, what would be the correct way to argue, for transparency, but at the same time not argue for just opening up the floodgates and letting it all out, and, you know, we’ll figure it out as we go…
00:21:08 Mackenzie Funk
I am no expert in the data privacy laws in the United States versus Europe, but I do know that you have a much better understanding that data collected should be used for the purposes it was collected for, and not for other things. One of the big things that went wrong in the United States, and I think continues to go wrong around the world, is that this idea of transparency also means that companies can purchase or otherwise acquire data for uses it was never meant for, say, advertising.
When it’s something that a local government has collected, but there are no controls on how it is used, then two or three purchasers down the line it becomes transformed into something very different.
So I think the answer to much of the transparency and privacy problem, which is real, is some sort of control on whether this is used for what the person who gave up their information to their local government, for instance, intended. If I give my information to the Department of Motor Vehicles in my state here, I don’t expect that they are going to then sell that information on to LexisNexis, the data broker.
I don’t expect that LexisNexis will then send that information on to the immigration authorities, and that the immigration authorities could then come to my home if I am, say, not a citizen and don’t have the right papers to be here. Those are not things I expect when I go to get a driver’s license, and I think something like that, built into the laws, can go a long way to helping. I tend to agree that transparency isn’t enough; I tend to think that governments are collecting too much information and keeping it far longer than they need to, because if there are transparency laws and the data happens to still be in the database, well then, they might have to give it to whoever asks for it. If they don’t collect it in the first place, or if they don’t retain it for a long time, that’s often safer.
But the tension is always going to be there, and I think a lot of it comes down to the gut check: is this what I thought was going to happen to my information?
00:23:47 Domen Savič / Citizen D
And do you see things changing now? Maybe after the pandemic, or even during the pandemic, it seems that the data industry, all of these big digital intermediaries, finally lost their, shall I say, user-friendly halo, right?
And people are sort of pushing, in the United States and also in Europe and in other places across the globe, for a different approach to these issues of privacy, of companies knowing almost everything or everything about a person.
So do you see that happening as we move away from, let’s say, the COVID pandemic and go forward?
00:24:36 Mackenzie Funk
Yes, I do see it happening, and again, I know the situation in the United States better than the one in Europe, but in the United States, the big change came after the first election of Donald Trump.
After the 2016 election and Facebook’s role in that election, the Cambridge Analytica scandal, those things seem to have very much changed the public perception of social media in particular.
And then you see more controls on our phones; both Android and Apple have done a better job of not letting other companies track us all over the web all the time. Not that they’re perfect, but we have a lot more control than we did eight years ago.
And I think that is a good thing. That said, I think we’re coming back, especially in the United States, from 20 years of almost no controls on privacy, and so even if there are some incremental gains in the digital space…
A lot of this information is already out there, and a lot of the information that companies like Hank Asher’s built, and those are big parts of LexisNexis and Thomson Reuters and TransUnion, those databases are still there. They’ve got 20 going on 25 years of information about each of us, and they never got rid of it. They didn’t purge it; they kept it. So even if governments are collecting less, even if our phones are giving out a little less information about us, the length of the history that some of these data brokers have on us, and that still includes Facebook, really matters, because they can get a sense of where your life is going and where you’ve been. That’s pretty important information when they’re trying to help other companies make decisions about your life.
I looked at healthcare in the United States: doctors began to look not just at your clinical conditions, but at the conditions of your life in general. Where do you live? Do you have access to a vehicle to get to your doctor’s appointments, or a bus or a train?
Do you drive a certain kind of car? Because you can judge someone based on their car; you can kind of guess what their general health is. Where do you get your information? Because if you’re getting all your news from Facebook versus a newspaper, you might be lower-information, and you might not follow the directions of your healthcare provider as much.
That kind of information, or even things like where you lived when you were growing up and whether that area was worse in terms of air pollution… that kind of information going into healthcare decisions now is very… You’re not going to easily get away from that just because your phone isn’t telling Apple or Google exactly where you are right now.
00:27:59 Domen Savič / Citizen D
And just to follow up: do you think that all of these data points about a person, the information packets you just described, are actually used by the people, companies and industries collecting them? Or do you feel that this data economy is basically, I’m not going to say a mirage, but something that looks nice, fancy, I don’t know, important, while you just get these data dumps that nobody uses, and then, when somebody hacks a big pharmaceutical or health-related database, it becomes a problem, right?
00:28:47 Mackenzie Funk
Yeah, I think they are using it. In fact, I know they are using it. Most of the world’s biggest banks and most of the biggest companies use Hank Asher’s products; most of the law enforcement agencies in this country use these products. And I’ve seen… I mean, imagine, it’s not just the little bits of data, right? It’s all the information you might want to know about a person, tied to their identity.
For figuring out how much to charge them for insurance or like I said, healthcare decisions, or if you’re a police officer trying to track someone down, where do they live? Where have they lived? Who are their friends and who are their relatives? If you’re trying to find them, where would they go?
That kind of information is in there if you are an immigration officer trying to track people down, as could happen and has happened in this country. It’s the same kinds of questions, and also: when did they arrive? When does their name first appear in this database? Does that show that they arrived in this country recently?
And I found evidence that it’s being used all the time. I don’t know how banks work in Europe, but when I log in from a new location, say I have a new computer or a new phone and I want to access my bank information, sometimes it will send me a bunch of questions: which of these streets have you lived on in your past? Which of these people have you lived with? Little details about my biography that only I would know.
You would see things that stretch back 15-20 years into my history, and that’s a Hank Asher product built right in, used for something that I think is pretty benign: making sure that you are really you. But it’s a show of the reach these products have, everywhere in the background of our lives in this country. And nobody really has any idea…
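The bank quiz Funk describes is a form of knowledge-based authentication: a multiple-choice question generated from the broker’s records, with decoys, so only the real person should answer correctly. A minimal sketch of that idea follows; all street names, data, and function names are hypothetical, not anything from an actual product.

```python
# Sketch of a knowledge-based authentication challenge: build a
# "which of these streets have you lived on?" question from a person's
# address history plus decoy streets. All data here is invented.
import random

def build_question(true_streets: list[str], decoy_streets: list[str], n_choices: int = 4):
    """Return (prompt, choices, correct_answer) for a street-history challenge."""
    answer = random.choice(true_streets)
    # Decoys must not accidentally include a street the person really lived on,
    # otherwise the question would have more than one correct answer.
    decoys = [s for s in decoy_streets if s not in true_streets]
    choices = random.sample(decoys, n_choices - 1) + [answer]
    random.shuffle(choices)
    return "Which of these streets have you lived on?", choices, answer

history = ["Palm Ave", "Oak St"]  # from the data broker's records
decoys = ["Elm Rd", "Maple Dr", "Birch Ln", "Cedar Ct", "Oak St"]
prompt, choices, answer = build_question(history, decoys)
assert answer in choices and answer in history
```

The interesting design point, which the transcript hints at, is that the quiz only works because the verifier already holds decades of accurate history about you; the authentication is benign, but the database behind it is the reach Funk is describing.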
00:31:10 Domen Savič / Citizen D
So, the case you just described with the banks, I think, perfectly describes the tension between privacy and security, right? You have these data products that are a boon for security, but at the same time they are literally, or almost literally, killing our privacy. Right?
So, do you see this conflict resolving in the future, in any shape or form? Or do you think we’re just going to go back and forth between “this is too much security, we want more privacy” and “this is too much privacy, we want more security”?
00:31:54 Mackenzie Funk
Yeah, I don’t see it resolving, nor do I see the sister of that one resolving, which is the privacy-and-convenience question. We see that one all the more with the new AI products that, say, Microsoft or Apple are coming out with. How much of your information do you want to give to these companies, especially if they’re uploading it to the cloud, to make your life more convenient, to complete your emails, or to calculate when you have to leave so you can get to your next meeting, or to reschedule a meeting?
These AI products can already do this if you let them have access to your calendar or all of your emails, and I find that scary and very interesting, because I think a lot of people will choose convenience over privacy here, and the privacy controls will always come late. As for security and privacy, it goes back to your earlier question about whether they are actually using these products. Do they actually work?
I think we’re finding, at least in this country, that as much as they were used for, say, counterterrorism or predictive policing, all these things where the idea was that if you get enough information about everyone in a country or in a city, and if you really build these algorithms to help us predict who’s who among the general population, you can stop crime before it occurs or stop a terrorist attack before it occurs…
So far, that hasn’t worked out as well as it was supposed to. Yes, police officers are using Hank Asher’s products to, say, figure out someone’s address history or who their relatives are, but they’re not using it in an AI-powered, let’s-just-give-everybody-a-score way. I mean, they have used it that way, but when they have, it hasn’t worked very well.
So a lot of them have dropped it; a lot of the predictive policing projects have been dropped in major U.S. cities because they didn’t work as well as they thought they would. It just turned out to be racism at scale.
It just turned out to wrongly tar huge swaths of a city with the idea that they might have a higher risk of being criminals, which didn’t always match up with reality and was certainly unfair to the people who were caught up in the algorithm.
So yeah, I sometimes think that is showing, especially with law enforcement and counterterrorism, that it’s not as useful as they thought.
00:34:49 Domen Savič / Citizen D
I was asking this because this podcast is part of an NGO with the same name, and we focus on privacy, security and, amongst other things, surveillance capitalism. We’ve been doing this investigation into the CCTV cameras that are set up by municipalities across the country, and we realized that it’s basically… it’s a promise of security.
They never deliver, because of the whole pipeline that follows detection. We have somewhere around 500 CCTV cameras across Ljubljana, and at the same time they’re not doing much in terms of crime prevention, but they’re doing a lot in terms of, you know, privacy invasion, right? And if you look across the world, you see that some technologies work in particular cases, but they sure don’t work in other cases.
So the debate around security and privacy becomes very targeted, I should say; you have to look at individual cases, individual technologies and so forth. Would you say data policing, policing with the use of data, is something similar, or would you say the success rate is higher?
00:36:18 Mackenzie Funk
Hmm… we have a similar debate about traffic cameras and surveillance cameras here. I would say that the use of data in policing has been useful, and what Asher brought, what his companies bring to local police departments, has accelerated the kinds of investigations they do.
For instance, I was talking to detectives who worked in Florida in the 1980s and 1990s about how they would go about their casework. They would track down each individual piece of information from different police departments across the state, and they would not necessarily have access to, say, the stuff you could get from credit cards, which always have a good address history; or they could get it, but they would have to get a court order, and the court order would take weeks. They could get someone’s phone records to know who they’d been calling, but that again would take weeks or months. So just to get enough information to track down a suspect in a crime would take them a very long time.
And all of them found the products Hank Asher built to be transformative, helping them do the work they were doing much faster. And as much as I was skeptical of where the technology has gone,
I wanted to be fair and recognize that it really did help some of these police do their jobs, and, I think, in a way that didn’t necessarily hurt privacy much more than they already could; it was just happening faster. But things changed
When Asher’s company and others began to make this more algorithmic and to start assigning scores to someone’s criminality or their likelihood of being a terrorist, that is where a lot of the failure has been: in taking humans out of the loop, and in saying that we need all of this so that we can solve these crimes. I don’t think you need all of this. And you certainly don’t need the scoring technology to help you surface people you think might be more likely to be criminals than others.
It’s very different to try to predict crime than it is to try to track down suspects in a crime that already occurred. And I think that once they started to try to predict the future using this data, that’s where many of the biggest harms have happened, and then when it goes away from these narrow uses. I can see that for someone investigating a murder, for instance, it might be more fair to give them this information than for someone investigating a minor crime.
One thing we have in the United States, beyond the surveillance cameras around cities, are the license plate readers on all of the police cars and on many city cars all over the country. And many of those were installed by a private company, and that private company has taken all these scanned license plates and is able to give predictions about where someone will be.
For instance, if you always see someone’s car parked in front of a certain address at a certain time of day, you can predict that they’ll be there pretty soon, and some of that just seems like too much. Like, who actually needs that?
You know who needs that? Real estate companies or banks trying to use data for things very different from policing. These same companies were selling to the police, and they were also selling to business.
And I think the selling-to-business part doesn’t need to happen, and the retention of all these scans over the course of many years does not need to happen.
Other people using it were immigration authorities trying to track down people in this country who might not have come legally. And my opinion on that is that in the United States, there are so many people who came to the country without the right papers, and there are only so many they’re going to want to arrest.
And it seems like, if you have to prioritize people, it’s the worst idea to go after people who have been here for so long that they might have children who are citizens of this country, who have real lives and who are living like any citizen.
And yet these systems will surface the people who are living the most openly, who are not trying to hide, right? The ones who engage in banking, the ones who properly registered their car and got a driver’s license and got insurance, bought a home.
All that information goes into these systems. And so, it turns out that the people who are not trying to hide, who are living non-criminal, very average lives, are the ones who are easiest to find in these systems. And so, if they’re used for purposes like immigration enforcement, for instance, that just seems like a total perversion of what they’re for.
00:42:08 Domen Savič / Citizen D
I have two more questions before we wrap up. So, the first one is basically a continuation of this debate.
So why do you think these technologies, these data silos that are basically telling us where a person is going to be at a particular moment, why do you think their usage exploded in literally every corner of our society? If you take Hank Asher, for example, right at the beginning he was basically helping the police catch pretty basic criminals, right? And then the usage extended to anti-terrorism, and then it extended to catching child molesters.
So why do you think people are so excited about copy-pasting the same basic solution to places or fields that may not have very much in common with the previous, original field that this technology was used in?
00:43:20 Mackenzie Funk
That’s a good question. I don’t know the full answer, but I can say that before this was used for policing, the original product that Hank Asher had, and that many of these things grew out of, was not policing but insurance: private insurance companies trying to get a sense of who am I selling this insurance policy to, and how can I reduce my risk?
And the idea of reducing risk is basically to look at the past to try to understand the future. Try to do a calculation about what someone has done before in order to predict what they’ll do in the future, and whoever does that best will make the most money: if you’re an insurance company, you’ll charge more to those clients who are riskier and much less to clients who are not really going to have any problems.
So basically, they’re getting money for nothing, and that’s the whole idea. It seems like with Hank Asher, they took this idea of risk management from insurance and applied it across all of society.
This idea that you can just look at someone’s past to predict the future, or more broadly look at people like this person and what they’ve done in the past to predict the future, it’s a very seductive idea, because if you can predict the future, you can make money off it, or you can protect against whatever danger is coming.
We all love to know what’s coming around the corner. Human literature is full of stories about time travel, or the person who could see the future, the seer, and whoever knows what’s coming is almost the most powerful person in the world. And that’s what these systems promise to police, to counterterrorism investigators and to everyone else: a chance to predict the future.
And maybe they get it right for the most part, but the problem is, this isn’t advertising. A lot of these systems are not used just to send somebody an advertisement for a tennis shoe or a handbag they might want. These are used to make life-and-death decisions that really affect people’s lives: who gets insurance, who gets a job and who gets arrested. And getting it right most of the time, I don’t think, should be good enough.
I think it’s very different if you send the wrong ad to somebody; no big deal, you’ve wasted a little bit of money. But if you arrest somebody because you think they’re going to do something wrong, that’s a very big deal. And so, we can’t have the logic of insurance, or the logic of advertising, which is similar to that of insurance with its risk pools: if you get it right 70% of the time, that’s great; if you get 50%, that’s great.
We can’t have that logic be the same logic for the rest of our lives.
00:46:36 Domen Savič / Citizen D
And since you mentioned healthcare… OK, two more questions. Since you mentioned healthcare, why do you think this data surveillance, or data economy, isn’t regulated as well as the health field, right?
So, you’ve just mentioned 70% isn’t good enough… imagine if you had those odds going to a doctor, right. And the doctor would go, well, you know, I can heal you, but I can also kill you, so let’s see what happens next.
Why do you think that the regulatory frameworks, and this is the second part of the question, still heavily rely on the end user? You’ve mentioned Apple and Android building all of these protections into our devices and our online services, so why hasn’t there been a bigger, let’s say, systemic push for, I’m not going to say corporate protections of our privacy, but let’s say state protections of our privacy?
So why are we at the end of the day David versus the Goliath?
00:47:54 Mackenzie Funk
Good questions. The first one… healthcare. It’s very easy to understand that these are life and death decisions. It’s very easy to understand that what happens with the doctor or at the hospital is going to change everything for a person.
I think it was harder for people to understand and imagine that what happened with privacy would have such impact, and this goes back to your earlier question about how you go from climate change to privacy.
Early on in climate change, everyone was talking about polar bears, right? They knew, they cared, but they couldn’t imagine that climate change would really affect their lives.
They couldn’t imagine the wildfire smoke or the glaciers disappearing or anything really changing for them. So sure, they cared about it, but not in a way that was visceral and close to them, not like a healthcare decision.
And I think with privacy, a lot of people are still stuck in the imagination that it’s about advertising, that it’s about these ads following you around as you surf the Internet, and that, yes, it’s a little bit creepy, but what does it really have to do with my life?
How does it really change my life? The public has been slow to wake up to the reality that it has everything to do with your life, and that more and more the information about you is out there being used to make decisions about your life and what opportunities you’ll be given.
And once that understanding is there, and it’s becoming more widespread, then maybe data will be regulated a little bit more like healthcare, because we’ll see that it’s the same level of decision.
As to the second part of the question, why is it always the end user, why is it not systemic? I think it’s related a little bit. Especially in the United States, people think, well, it’s your choice to use Facebook, or it’s your choice to give up this information online, and if you really care, you can do something about it. The companies that still dominate the privacy landscape, which are American companies, are built with this American individualist ethos.
That we are a country where we pretend that the individual has lots of power, that we are supposedly self-reliant, that the government shouldn’t step in and regulate, and that if you really care, you’ll take action on your own. And of course, that’s nonsense. That’s nonsense with climate change and it’s nonsense with privacy. The idea that people would have any way to meaningfully counteract the most powerful forces in our lives without banding together, without asking government to do what government is supposed to do, which is step in when it’s a collective problem, is nonsense, but it still permeates the debate.
I think that’s the biggest issue: that we’re stuck with this individualist sort of American ethos.
And another point on that: the idea that there is any meaningful choice anymore, that you can somehow avoid the digital economy, that you don’t have to give up your information if you don’t want to, that you don’t have to use these tools the tech companies built… I mean, for young people, for anybody who wants to participate in society as it is now, you have to be using these tools, and therefore you have to be giving up information for the most part.
And that’s not really a meaningful choice, when a gun is at your head.
00:51:54 Domen Savič / Citizen D
We’re trying, though we’re not always successful, to wrap up on a positive note, so please… I’m not going to say an uplifting message for the end, and I’m not even going to ask you how the current US election is going and what you think will happen after November in this field… but is there a thing that you notice regarding the data economy and these protections that is, let’s say, improving in the last couple of years? Is there a light at the end of the tunnel?
00:52:34 Mackenzie Funk
Yes, I do think, as we’ve talked about, that the public is much more skeptical of Silicon Valley and of these technologies, in the United States in particular, than the public was five or eight years ago.
And I think we are slowly waking up to the fact that this has life-and-death consequences for people, that privacy is not just an ad that follows you around the Internet, and that knowledge, even if it’s too late, even if it’s a little slow, will eventually translate to change, I think.
I think it must. We’re not stuck just doing whatever these companies have us do, because there is still voting in all of our countries and there are still elections, and if voters care more and more about these issues, I think there can be change. And there are little things.
From Hank Asher’s technology, for instance: the utility companies, we’re talking Internet companies and electricity companies, water companies, for many years they were all getting together, and they created a database of all these different customers, what their addresses were, what their names were. They did it in the name of making sure customers were creditworthy, so you couldn’t just switch from one electricity provider to another.
And if you were someone who didn’t pay your bills, they would know about that. But they started taking this information and selling it to the data brokers, which you would never imagine: that you move into a house and decide to get electricity, and because of that some corporation knows where you are, or that they sell that to police.
And that was one of those things that, once it was uncovered in the United States a few years ago, there was outrage, and they stopped selling it to the data brokers; that went away. It’s little things like that. There is change.
In this country, and I know also in the EU, there are lots of small fights against overreach by the surveillance companies. Not all of them are victorious, but there are more and more, and I think that is a sign of some hope.
00:55:10 Domen Savič / Citizen D
Excellent, that’s a perfect way to end this debate about the really horrifying situation everywhere. The book is called The Hank Show. Thank you, Mackenzie, for dropping by and for sharing your thoughts; it’s really nice to see these ideas of privacy and the ineffectiveness of surveillance capitalism echoing around the world.
Thank you so much, dear listener, this has been the Citizen D podcast, we publish an episode every month, so feel free to subscribe and we’ll talk to each other next month. Thanks again, Mackenzie.
With us today is Meredith Whittaker, president of the Signal Foundation who serves on its board of directors. She was formerly the Minderoo Research Professor at New York University (NYU), and the co-founder and faculty director of the AI Now Institute.
She also served as a senior advisor on AI to Chair Lina Khan at the Federal Trade Commission. Whittaker was employed at Google for 13 years, where she founded Google’s Open Research group and co-founded M-Lab. In 2018, she was a core organizer of the Google Walkout and resigned from the company in July 2019. She now runs Signal, the leading global privacy-oriented NGO.
00:00:55 Meredith Whittaker / Signal
But I wouldn’t actually say that the walkout was the very beginning for me, the walkout was a culmination of a lot of work, a lot of thinking, a lot of conversations that I’d had over more than a decade. And the walkout also wasn’t just me. It was thousands and thousands of people. It was apparently the biggest labor action that has happened in tech, with 20,000 people leaving work in protest, you know, against the unethical business conduct at Google and against a culture that persistently valued some people more than others and developed products that often caused serious risk for those who were devalued, so to speak, due to that culture and those design decisions.
I think the walkout was one way in which, throughout my career, in many, many ways, I have endeavored to be accountable to my analysis; I have endeavored to do what I can to change things when I saw them going in a bad direction. I had worked for many years, in many different ways, from the inside, trying to influence, trying to shape policy, and many of these things I still do… So again, I think the walkout wasn’t the beginning. It was one manifestation of a theory of change that looked to collective action from below to remedy some of the dangers and harms of the concentrated tech business model.
00:03:52 Domen Savič / Citizen D
00:04:42 Meredith Whittaker / Signal
Conservatives, liberals, leftists… recognize labor power as simply a structural check on toxic capitalism: the workers having some say in what they work on and how. I don’t know that this is individual so much as going back to some of the basics and recognizing that we have an industry that is making decisions and putting revenue and growth above the common good in ways that could be really, really dangerous, given the power and information possessed by this industry.
00:06:11 Domen Savič / Citizen D
00:07:15 Meredith Whittaker / Signal
We are asking people to validate our claims, and we are making, insofar as possible, everything available for them to do that. I think that is why Signal is so trusted: because we are going above and beyond to be trustworthy in a way that most actors in the ecosystem can’t, or are unwilling to, for a number of reasons.
00:08:36 Domen Savič / Citizen D
00:08:55 Meredith Whittaker / Signal
And so, it is difficult to do the opposite. We actually end up having to rewrite parts of the stack, so to speak, in order to enable privacy, in order to reject data collection as a norm. That is difficult because we are swimming upstream against a massive current in a trillion-dollar industry where privacy has not been something that was prioritized, and trust around privacy has certainly not been part of the business model. Related to that, it’s also difficult because there isn’t a business model for privacy at this point in the tech industry, and this is one of the huge harms that we are grappling with.
The profit motive is oppositional to privacy; data collection is oppositional to privacy. So it’s difficult from that perspective in that we have to really think about our structure and protect ourselves from the imperatives of profit and growth, not necessarily because they’re bad in and of themselves, but because following those imperatives would at this point lead us down a path toward surveillance, toward data collection.
So, this is why Signal is structured as a non-profit. This is why we really go out of our way to take the incentives for surveillance off the table when it comes to Signal. We’re structured for success in the long term, so we can stay laser-focused on our mission.
00:11:29 Domen Savič / Citizen D
So, is it hard for you, for Signal to argue for privacy when faced with a fake dilemma of choice between privacy and security?
00:12:17 Meredith Whittaker / Signal
And the motive is that there are some among governments and law enforcement who feel that the fundamental human right to private communication should not be available to people online, that there should be no communications network that is not tappable, that law enforcement or governments aren’t able to surveil.
And I think that is… it’s just simply incredibly dangerous, and it flies in the face of the long-standing expert consensus that there is no way to create a backdoor, a way in that only “the good guys” can access; that anytime you create a flaw in these infrastructures, anyone with the tools and expertise to exploit that flaw will. And so you are corroding the very same cybersecurity measures, the very same private communications networks, that your government also relies on, that your law enforcement also relies on, and you are making those vulnerable to hackers, to hostile nations and to whoever else might want to infiltrate them.
So, it is a very pernicious line of argument, and I don’t think it’s always in good faith. I don’t think that we’re ever going to win this battle simply by being correct, simply by force of argument. We’ve been correct for multiple, multiple decades. The facts have not changed, but the will to create some magical formula that lets the government spy on everything does not seem to die.
00:14:35 Domen Savič / Citizen D
00:15:09 Meredith Whittaker / Signal
And I think that provoked a kind of counter-reaction. You saw a number of the platform companies, you had iOS and Android adding encryption to their operating systems, you had a turn to privacy from the industry that wanted, in effect, to save its reputation, if we’re going to be cynical about it, and to distance itself from government spying by adding privacy features. And immediately after that, in 2015, you see a showdown between the FBI and Apple in the US over the encryption on the iPhone.
And you begin to see an escalating campaign, as it were, to undermine the privacy guarantees that had been put in place post-Snowden, the most profound among these being Signal and the Signal protocol.
It’s often people who are perhaps a bit parochial or confused who want to undermine and walk back these changes. Now, there are many dynamics that I think have helped or hindered this, but I see this as one more salvo in an ongoing battle, and no sign that we are losing the war… In fact, in the last couple of years we, those of us in the privacy world who are pushing for these fundamental human rights, have had a number of wins, have pushed back on a number of pieces of very bad legislation, in the face of often incredibly emotional and compelling narratives that are difficult to fight against, particularly when someone is bringing a heartfelt story to the table and then you’re on the other side debating the nuances of cryptographic mathematics or something, right?
00:18:50 Domen Savič / Citizen D
00:19:37 Meredith Whittaker / Signal
There are great services for private e-mail like Proton, but Proton is also interoperable with other major e-mail services like Gmail or Outlook which means that if I e-mail somebody on Gmail from my Proton account, Google has all of that information shared on their servers, right?
Similarly, if I were to pay for privacy, I’m paying the subscription fee so that you don’t collect my data, say, but if I e-mail somebody or communicate with them in some way and they don’t pay for that, I would also not be kept private. So communication networks are really something that shows us just how interdependent we are when it comes to privacy.
00:21:25 Domen Savič / Citizen D
00:22:19 Meredith Whittaker / Signal
The questions we need to ask are: what type of technical infrastructure, not controlled by monopoly actors, would we need to efficiently fulfill the role of governments in these core services? Would we need to be able to govern technology in a way that is more democratic? What type of governance structures do we need to put in place that allow participation at the level of design and features, of how a technology behaves, how it respects or fails to respect fundamental rights?
And those are questions that are ultimately very exciting, and I do think we are in a time when there’s no longer any debate over whether this business model is good or bad. You even have institutions like Y Combinator, which has done arguably as much as anyone to cement and promote the toxic Silicon Valley business model, now coming out and saying, “Hey, we’re actually not very into big tech. We’re looking at little tech now. We want to promote the small players!” And whether that’s good faith or not, I think it really shows us that there is a sea change in terms of sentiment, and that there is an opportunity to think things through.
How would we dismantle and disarm the centralized power that is held by these platform companies? What would independent cloud infrastructure look like? What would independent communication networks look like? What would interoperable protocols that enable more flexibility and independence at the application layer look like, and how do we find the capital to fund these things and maintain them over time?
And how do we put in place governance structures that don’t behave like the boardrooms of big tech with their focus on profit, revenue and growth over everything else, but have more civic minded duties and processes that are working to aerate the tech ecosystem and make it more amenable to building technologies that actually serve beneficial futures.
00:25:46 Domen Savič / Citizen D
00:26:33 Meredith Whittaker / Signal
So, I think in a sense that when I say we need to get down to a more local level, that’s not fetishizing the small and the local. That’s really recognizing the function of these platforms, the role they play vis-à-vis government services, vis-à-vis commerce and communications really does vary across contexts and that the people in those contexts are almost certainly best positioned to answer some of these questions.
We don’t want to repeat the mistakes of one-size-fits-all billion-user platforms, but to do those kinds of interventions right, because there is no one-size-fits-all.
00:28:09 Domen Savič / Citizen D
00:29:11 Meredith Whittaker / Signal
In the US, which I know is different from many other places, it’s also generally funded through philanthropy, so you don’t have long-term sustainable funding; in most cases you are at the whim of whatever a foundation or your donor might think is important at that moment. And of course that is also susceptible to trends, to whim and to hype, which makes it very difficult to pursue a long-term strategy, particularly when you are rowing upstream against vested interests that, frankly, may have a lot more access to some of the leaders in philanthropy than some of the activists who are on the ground doing the real work but not being seen and appreciated.
So again, I think the political economy of NGO work and civil society needs a lot more scrutiny. And I think we need to be a bit bolder in frankly demanding the kind of support and capital that we need to do this work. And there’s a lot of really good ideas out there, really good architectures, incredibly brilliant thinking around how we could build tech differently, how we could build more respectful tech, but an idea is not the solution.
An idea is a possible template, and what is not generally understood, or let’s say respected, is just how much work and how much money it costs to build reliable tech. It’s never just built once. This isn’t two guys in a garage who come up with something genius and the world changes. No, it’s two guys in a garage, a really good idea, and then billions of dollars of capital and hundreds of thousands of hours of labor that make that idea real, that maintain that idea in a volatile and dynamic environment, and that do that forever, or until that idea dies or that tech ceases to exist.
I think we also need to reframe our understanding of tech and recognize that we can’t have Sam Altman be the only one who’s talking big money, right? If we’re serious about this change, we need to be at the table, and we need to be demanding a cut of that.
00:32:19 Domen Savič / Citizen D
00:32:57 Meredith Whittaker / Signal
We can look at things like race science, which couched structural inequality as neutral biological destiny: they were just observing differences and then determining what those differences meant, in ways that were pernicious and harmful for the world. I think we need to question narratives of neutrality to begin with, and then particularly in tech, there has been a conflation of computational technology with scientific progress, which has been promoted by the tech industry.
The story goes that the reason we’re all suddenly using Google, or we’re all hosting on Amazon, is not because those companies were successful during the primitive accumulation stage of tech, but simply because what they discovered is a significant scientific advance and they are introducing it to the world; as such, they bear no responsibility for it, what they are doing is neutral and inevitable, it cannot be changed, and if you were to question it, or to, say, desire to regulate it in a way that wasn’t beneficial for those companies, you are anti-progress or anti-science, you’re putting your finger on the scales of human advancement. I think that narrative has done as much as anything to really chill our ability to grapple with and meaningfully regulate these technologies over the past number of decades.
00:35:49 Domen Savič / Citizen D
00:36:50 Meredith Whittaker / Signal
You know, these are people who imagine themselves as always in a position of power, and thus don’t generally question the type of levers of power that we are creating and the ways those could be misused if somebody with more pernicious intentions were occupying their seat. So, I think this is an age-old pathology, and it’s why we need to hold anyone who has a position of power to incredibly stringent standards and recognize that it’s really not personal, but that if you are going to take that kind of responsibility, you need to be held accountable, and the people who are worthy of that responsibility should be embracing that.
00:38:05 Domen Savič / Citizen D
00:38:10 Meredith Whittaker / Signal
00:38:34 Domen Savič / Citizen D
00:39:18 Meredith Whittaker / Signal
So, there are many, many things; I would say 99% of the things that we would like to do are not things that we choose to do, because we really value focus. We think long and hard about new features: we think about whether we can build those features in a way that meets our very strict privacy bar, and we think about whether those features are useful to people.
For example, Signal introduced Stories a couple of years ago, similar to Instagram, but our Stories are actually private, right? And while they’re not a hugely common feature in the US, they’re massive in South Asia and in Brazil, and we were hearing from people who pick up Signal that it feels broken because it doesn’t have this feature, which has been a core way of communicating among the people who use Signal there.
So, it’s a lot of conversations, a lot of collaboration with our Chief Product Officer Clancy Childs, who is very brilliant and very experienced and has been working on messaging now for over a decade, who really has a lot of instincts there and we try to do some market research.
We don’t collect user data, we don’t collect telemetry and analytics the way almost every other communication service does, so we often don’t have, or almost never have, the kinds of signals that our competition does. But we do have other ways of gathering information and doing user research in the field that give us a sense of how people are using Signal and what they might enjoy, and we go from there.
00:42:08 Domen Savič / Citizen D
Now, the end result for a normal user is gif search, right? It looks exactly the same as if we had just shoved it in there and said, “Hey, we’re giving all the data to Meta,” but in fact we spent orders of magnitude more time, creativity and rigor doing that than the competition.
00:44:14 Domen Savič / Citizen D
00:45:04 Meredith Whittaker / Signal
00:46:02 Domen Savič / Citizen D
00:47:20 Meredith Whittaker / Signal
How do we put the disinformation to rest, and how do we push back against it? Well, we really do go above and beyond. We are open source: you can validate our claims if you have the skills, you can go into our repos. There are people who watch every commit we make and analyze it on Reddit, looking for new features, discussing what we’re doing, submitting bug reports. So we are one of the most scrutinized apps, and one of the most scrutinized and audited cryptographic protocols, in the world.
How does that translate to a popular message? Well, we are out there talking about Signal security as much as we can. When there are campaigns of disinformation saying that Signal is a CIA asset or whatever the nonsense is, we spend a lot of time pushing back, even though we know it is fully fabricated and, I would say, almost lazy. We also recognize that the stakes are really high, that there are people who don't have the expertise to validate these claims themselves and can get really worried by them.
So, we also ask the overzealous community of people who may be amplifying those claims, or making them in service of getting attention or going viral, to please find something else to do, because digital security is life and death for a number of people who use Signal in authoritarian contexts, and these kinds of rumors can have real harmful impacts on people even if they are completely baseless.
00:49:54 Domen Savič / Citizen D
00:50:32 Meredith Whittaker / Signal
Podcast Citizen D gives you a reason for being a productive citizen. Citizen D features talks by experts in different fields focusing on the pressing topics in the field of information society and media. We can do it. Full steam ahead!
Let's step into autumn with video surveillance of public spaces and the trends in this field.
With the Deputy Information Commissioner, Andrej Tomšič, MSc, we discuss the legal safeguards for privacy, the false dilemma of choosing between security and privacy, and reflect on the political dimension of video surveillance: why, in this field, we still accept the unproven security benefits of video surveillance while automatically, without a second thought, giving up our own privacy.
The conversation also covers domestic inspection practices, the rules on transparent disclosure of surveillance camera locations, international trends in video surveillance and increasingly invasive technologies, and the role of civil society and the individual, who has great power in shaping the public discourse on this topic.
Podcast Državljan D is a podcast for spending your time productively, playing the role of a citizen. In conversations with experts in a given field, Državljan D informs and activates. It can be done. Onward!
Funded by the European Union. The views and opinions expressed are solely those of the author(s) and do not necessarily reflect those of the European Union or the European Education and Culture Executive Agency (EACEA). Neither the European Union nor EACEA can be held responsible for them.
Sacha Altay is a post-doctoral fellow working on misinformation, trust, and social media in the Digital Democracy Lab at the University of Zurich. We sat down with him to discuss the perception of disinformation, the failed attempts of self- and co-regulatory frameworks that try to limit its spread, and the way we should be addressing this problem.
00:00:06 Domen Savič / Citizen D
Welcome everybody. It's the 25th of June 2024, but you're listening to this episode of the Citizen D podcast on the 15th of July same year.
With us today is Sacha Altay, postdoctoral fellow at the University of Oxford within the Reuters Institute, working on misinformation, trust and social media. So of course we're going to talk about football. That's a little opening joke. So welcome, Sacha. Thank you for being with us.
00:00:38 Sacha Altay
Thank you. Thank you for having me. I’m not with the Reuters Institute anymore, now I am at the University of Zurich. I was, you know, in Oxford last year.
00:00:45 Domen Savič / Citizen D
OK, excellent, things change so fast. And speaking of you working in the field of psychology and disinformation and trust in social media and so forth.
It seems that our world nowadays runs on disinformation in various areas: you have disinformation in politics, the economy, the environment, public health. There are numerous attempts in the EU, in the US, all around the world to sort of level the playing field for the media consumer: regulatory attempts, self-regulatory protocols, increased efforts in education.
My opening question to you would be how did we get here? Was it always like this or did something change in the recent past so that disinformation became so prevalent and so influential in so many areas of our lives?
00:01:49 Sacha Altay
So, I’m going to answer this question first by talking about how people talk about it rather than whether there is more disinformation or misinformation today than before. Just how do people talk about it and whether people talk about it more today than before?
And I think it's pretty clear, when you look at the scientific literature or the number of news articles published or Google searches, that people are more interested in mis- and disinformation or conspiracy theories now than before. And by now, I mean broadly after the 2016 U.S. presidential election and Brexit in the UK. After these two major events, interest in misinformation, disinformation and conspiracy theories really spiked in news headlines, in intellectual circles, as well as in scientific research.
And then clearly the COVID-19 pandemic, where the director of the World Health Organization said that, you know, there was an infodemic, a lot of misinformation about COVID-19 and so on. So, I think interest also spiked during that time, and very recently with the release, or at least the democratization, of ChatGPT and worries about the power of generative AI.
There have been new fears again about the impact that generative AI may have on elections, for instance during the 2024 elections that are being held almost everywhere around the world. So, I think, yeah, clearly people are more worried about it. Recently at the World Economic Forum, for instance, mis- and disinformation was considered the number one risk for democracies in the next two years, ahead of climate change, ahead of war, ahead of any other risk. So clearly people, scientists and leaders are very worried about it, and I think it's unprecedented.
But of course, we don't have very good data on whether, 100 years ago, we were also extremely worried about this stuff. I doubt it, but it's possible; we just don't have very good data on it. But let's say at least that, yeah, we are now very worried about it, more than in the past, as far as we can document it.
00:04:13 Domen Savič / Citizen D
Would you say that these threats, or the perception of threats of dis- and misinformation, are credible? So, is it really as big a problem as the media and the politicians and, I guess, everybody are saying?
00:04:28 Sacha Altay
Yeah, because I've talked about perceptions. Now let's look at the evidence. Let's say that before 2016 there was some work on it, but it was quite limited compared to today, and I think that since 2016 there has been a lot of great empirical work looking at the prevalence and impact of mis- or disinformation.
And most of this work, at least in Western democracies like the US or Western Europe, has shown that the prevalence of mis- and disinformation is very, very small.
It's consumed by a very small number of people who have pre-existing attitudes that basically predispose them to consume and accept the messages in the mis- and disinformation. So, in the US, for instance, we know that it's mostly Trump supporters who consume pro-Trump misinformation, and the same holds the other way around. And we know that the average news consumer doesn't consume, or even stumble upon, much misinformation. So that has been well established.
And regarding the impact, it's a bit trickier, but all the attempts we've made suggest that the effect is small, smaller than most other things, even than just following the news. Of course, when you follow the news, you get more informed about what happens in the world, but you may also develop some biased perceptions of the world, because the news doesn't cover everything perfectly and isn't completely neutral, etcetera.
So, let's say that the impact of mis- and disinformation is very small compared to just media bias effects. And of course, a lot of people don't consume the news at all, are just not very interested in the news or politics, and so are broadly uninformed about many of these things, and this has a much stronger impact than misinformation ever could.
00:06:24 Domen Savič / Citizen D
So, it sounds like we don’t have a problem there.
00:06:30 Sacha Altay
I mean, the way I see it is that we have problems. For instance, I don't know, we have people who deny that climate change is happening and that it's human-caused. And I think it's a problem when people disagree with a scientific fact that has been established.
And these people will also say that they believe in misinformation, etc. And so, I think people often jump to the conclusion that people who vote for populist leaders also believe in fake news, so it must be because of false news that they vote for populist leaders, and the same for Brexit, for Trump, etc. All these people say they believe in falsehoods. And so, I think we tend to attribute this bad stuff to false information, when I think false information is most often a symptom of other problems.
For instance, we know that at the country level, belief in conspiracy theories is much higher in countries with more corruption, say countries in the Middle East, than in less corrupt countries in northern Europe, like Denmark.
And that's because, in corrupt countries, it makes sense to believe that elites are corrupt or conspiring against the people. So, there is some rationality to it. And we also know that people who distrust institutions for various reasons, some good, some less good, are more likely to believe in misinformation and conspiracy theories, literally because they are looking for information that goes against the establishment, against institutions, and sometimes people are warranted to do so.
But let’s say that in Western democracies, where elites are often right, it leads to bad, bad outcomes.
00:08:15 Domen Savič / Citizen D
So, you would say that one of the main or important reasons for the prevalence of, let's say, belief in disinformation is actually a decrease in trust towards public institutions, governments, mass media outlets and so forth.
00:08:37 Sacha Altay
I think it depends. For instance, in times of war, if you look at Russian propaganda, some of the best predictors of believing in it are identifying strongly as Russian and, I don't know, believing in the Great Russia narrative: you want a great Russia, so you're going to buy the Russian propaganda.
But if you're Ukrainian and have a Ukrainian identity, you're not going to believe any of the Russian propaganda. So, in that case, it's mostly about identity, and I think most of the time identity plays a very important role. People believe stuff that aligns with their identity, and people have various identities.
You can have political identities, national identities, many kinds of identities. But yeah, as you mentioned, I think at least conspiracy theories are often constructed in opposition to events that are covered in mainstream news.
And conspiracy theorists very rarely come up with their own stuff. Most often they just look at what's happening in the news and say that it's false, that it's actually something else that's happening. What they mostly do is wait for mainstream news to do something and then say, oh, it's actually the opposite. And so, I think in that case, for conspiracy theories, trust in institutions is, yeah, a very strong predictor.
But if you look at other stuff, like naturopathy or alternative medicine, then it's mostly distrust of health institutions, not political institutions, for instance. But yeah, trust is key to understanding belief in misinformation.
00:10:11 Domen Savič / Citizen D
Do you have any thoughts on the general distrust phenomenon? If you look at all the areas where disinformation is rampant, it's basically, you know, we don't trust literally anything, right? We don't trust, as we said, the governments, we don't trust the media, we don't trust the researchers in the fields of the environment and other issues. So why is this mistrust so prevalent?
00:10:46 Sacha Altay
I don't know. I think it's complex. It depends on the context. I think one potential explanation is that it's hard for us to evaluate how trustworthy many institutions are, or even how competent some people are.
For instance, when I tell people that I do behavioral science, a lot of people don't even know that it exists, that you can basically study human behavior in a scientific way, because for them science is mostly about, I don't know, biology, geology, physics. And so, I think the division of cognitive labor, like how expert people can be, is extremely high in today's society.
So, I think we have trouble understanding, you know, how expert some people are on vaccines, on GMOs, on nuclear energy or stuff like that, and it really goes beyond our own experience. Basically, with our own eyes, our own brain, our own experience, we cannot come up with the right conclusions about this, and we need to trust other people.
But then we need signs that we can trust them, and often we don't see these people; they don't talk to us, they are far away, they may be anonymous, they may have weird names. And so, I think it's hard to trust people that far away, and that may be one reason, but honestly, there are many different reasons.
For instance, during the COVID-19 pandemic, the authorities in many countries did not communicate well about guidelines and scientific evidence, and so it fostered distrust. They also had some measures that were very restrictive and then not restrictive, and a lot of people didn't like that, and it affects trust. So, there are many ways to lose trust, and one thing is that it's easier to lose trust than to gain it. That's also why institutions etcetera need to be careful, because they can very easily lose people's trust.
00:12:45 Domen Savič / Citizen D
So, what does that tell us about, let's say, countering the disinformation pandemic or infodemic? Are these attempts at regulation, co-regulation, fact-checkers, everything that's been going on since, let's say, the election of Trump... are these countermeasures focusing on the right problem? Are they addressing the issue or are they mostly greenwashing?
Greenwashing in the simplest of terms, saying “Ohh, you know, people are dumb because they believe disinformation, and now we just have to give them knowledge or science and everything will be OK.”
00:13:33 Sacha Altay
I mean, I think these people mean well most of the time, and in my opinion many of these measures, like fact-checking labels etc., mostly target the symptoms, though they can help a little bit. I mean, I'd rather have fact checks than no fact checks.
I think it's a good thing that there are fact-checkers; it's just that we need to not lose track of the bigger picture and remember that fact checks are effective at correcting misperceptions, but they are not effective at changing people's minds about whom to vote for or how they feel about some politicians. And that's a problem: you can fact-check as many of Trump's statements as you'd like, people are still going to vote for Trump if they have some pre-existing values, attitudes and reasons to vote for Trump. And we need to understand why some people are attracted to populist leaders or to, I don't know, people who promote anti-vaccine stuff.
We need to understand the reasons why that's the case, and so in that way, I'm just worried sometimes when leaders think that if we address the disinformation problem, the deeper problems are going to go away, like populism is going to go away with fact-checking.
I mean, nobody is really defending that outright. But I think we need to keep in mind that there are some deeper factors that affect this, and we need to target them. Some of them may be, yeah, lack of trust or polarization, just how people feel about political opponents, etc.
And then of course there are some even deeper factors that I mentioned: corruption, inequality, poverty. There is stuff like that that does affect belief in misinformation. And I've been advocating to target this, because if we reduce inequality or poverty, it will have many more benefits than just reducing misinformation.
00:15:27 Domen Savič / Citizen D
But that’s hard, right?
00:15:30 Sacha Altay
Yes, of course. But I think something we need to be clear about is that there is no easy solution. There is no technological fix that will solve these problems, no magic solution or whatever. It's a tough problem, because social and political problems are often very complex, and populism, anti-science attitudes et cetera are not going to go away with fact-checking or media literacy.
00:15:54 Domen Savič / Citizen D
So, we've just had the EU election, right? And did you feel... I'm coming from Slovenia, you're now based in Switzerland. Did you feel the debates, the discussions before the election highlighted these issues that are underlying the disinformation pandemic?
Did you get the feeling that political representatives really know what's going on in regard to, as we said before, the crisis, the lack of trust in several different areas, so that they might change the way we're addressing this issue as we move into a new European Commission mandate?
00:16:42 Sacha Altay
I'm not sure. I'm not sure, to be honest. I've not followed it very closely, but I've seen a lot of technological solutions being proposed. I mean, not only around the elections, but stuff like adding labels on social media saying that it will hurt your mental health, or holding tech companies accountable, and it's good to hold them accountable.
But I think a lot of solutions are a bit too focused on technological fixes and not enough on the functioning of democracies, institutions and deeper factors. But often, as a politician, it's easier to blame Mark Zuckerberg or big tech and say that you're going to go against big tech than to say that the problem is politicians lying, pushing lies or instrumentalizing facts for political gain.
00:17:35 Sacha Altay
I think it's easier for them to blame big tech than themselves and the system, so no, I'm not sure there has really been a wake-up call.
00:17:50 Domen Savič / Citizen D
And what are your thoughts on exactly this issue of techno-deterministic solutionism, that tech is the problem, but at the same time it's also the grand solution to every problem that we have? What is fueling this idea, or this way of proposing solutions, in this area?
00:18:17 Sacha Altay
I mean, I'm not totally sure. I'm more and more interested in this area, and the more I read about it, the more I realize that we as humans have always done that, blamed new technologies for so many problems. Like, we've blamed writing for losing our memory.
Like, if we write, we don't need our memory anymore. We blamed books for disconnecting us from reality. We've blamed every new technology for many reasons that now we would consider very silly.
But at the time, at least some people took it seriously and pushed it. And I'm fascinated by why we do so. In France, for instance, at last year's election, the extreme right party did well, and a lot of commentators were saying that it's because of TikTok.
It's these stupid kids on TikTok, basically, being influenced by a young leader, and of course it's a very simplistic explanation that contradicts most of what we know about the effects of mass communication, how people use new technologies and how people decide whom to vote for.
But for some reason these explanations are very, very attractive, and we tend to always blame the same people: young people who have this weird new information technology, TikTok or whatever, and that may explain why they are different from us and do stuff we don't understand. So, I'm not sure why there is this focus on technology... there are many theories. Some say that it's intentional, like politicians pushing these narratives so that we don't blame them, but I think these explanations also tend to be quite intuitive. A lot of people don't really know what TikTok is; they think it's just young people dancing, and if suddenly there's politics on it, then people may be influenced by it.
Because in psychology there's this well-established finding that we tend to overestimate how much other people are influenced by bad media effects like advertisement, propaganda, etcetera. We tend to say, no, I'm not so much influenced by it, but other people, and especially political opponents, are extremely influenced by it. So, I think we have this tendency, and these explanations are very, very intuitive, and most people of course are not super interested in the truth. So, for them the job is done.
You know, why did Trump get elected? Well, because his voters watch Fox News and they are a bit stupid and not very well educated, and that is the end of the explanation. So, the problem is Fox News, and of course Fox News may be a bit problematic, but removing Fox News will not remove populism. I think it's also a bit of laziness to some extent.
00:20:55 Domen Savič / Citizen D
Hmm. So, addressing this issue moving forward, we can also talk not just about co-regulatory or regulatory practices, self-regulation, whatever, but also about media literacy, about training, about changing the way people interact with media outlets, social media and other information sources.
So, what needs to be done, or what would you do, to sort of turn the tide of this stalemate, where you always have, you know, fact-checkers playing whack-a-mole with the disinformation spreaders?
00:21:44 Sacha Altay
I'm not sure. It's also why a lot of people don't like my work: I don't have solutions to offer, because the problem is very complex, and I'm just advocating for the realization that the problem is more complex than the solutions a lot of people are offering, which are very, very limited and are not going to save the world.
But at least when it comes to media literacy, I think a lot of media literacy programs assume that people are gullible and too trusting, whereas we know that a lot of people are either just completely uninterested in the news and avoid it altogether, or do not trust the news and basically start from cynicism instead of gullibility.
And so, I think it would perhaps be more interesting to try to foster interest and trust in reliable information, not necessarily the news, but also some news influencers, Wikipedia or high-quality sources of information in general, rather than to alert people about misinformation like they did during the COVID-19 pandemic. Instead of saying there is misinformation everywhere, be careful, say: look, there is reliable information, you can find it here, and you can trust us for these reasons, and we are being transparent and accountable, etc.
So yeah, I've been advocating for this shift in focus. But again, I don't think it's necessarily going to have huge effects; it's just a small change to interventions that already have small effects.
00:23:21 Domen Savič / Citizen D
So, is there a cause and effect behind these underlying reasons that are birthing the era of disinformation? If you look back at historical developments, moving past Trump and going further down the line, what caused this, or what are some of the happenings that pushed us into the situation we have today?
00:23:55 Sacha Altay
I mean, I'm not sure, to be honest. It's often very complex, and I'm not an expert in the rise of populism in the US or the rise of anti-EU sentiment in the UK. Some people are experts in this domain, and I don't think they have clear answers.
So no, to be honest, I don't know. It's just complex. I just know that information and information practices are often downstream of attitudes, values, political identities and stuff like that, and so they rarely play an important causal role.
They are most often there to, like, rationalize people's attitudes and behaviors. But no, I'm not sure why populist leaders are rising in the world, why Trump got elected, why Brexit happened. It's complex, I'm not sure.
00:24:53 Domen Savič / Citizen D
Do you think it has something to do with role separation? This is partially my theory, or something I subscribe to, looking at this field as, let's say, a disinformation researcher, but also as someone who focuses on media literacy and interacts with people on this issue.
Coming from Slovenia, let's say from the Balkan region, Central Europe, whatever it is, it almost seems like, on one side, nobody wants to talk about political engagement or the role of a citizen, because this has some bad connotation going back to Yugoslavia or the East versus West.
But at the same time, everybody's emphasizing this individual role of a consumer: you are in charge of everything, you vote with your wallet basically, and you're the one who's making things happen, right? And then, every time something bad happens, everybody's escaping their part of the responsibility for certain issues, be it surveillance, be it disinformation, be it anything or everything else that's happening around us.
00:26:20 Sacha Altay
I mean, I'm not sure, to be honest. You're saying that there's been a shift in seeing people as information consumers... yeah, I'm not so sure.
0:26:40 Domen Savič / Citizen D
It's just, thinking about it, seeing how people on one side recognize that the problem of disinformation, or issues related to disinformation, is very broad and composed of many different outlets, and at the same time they want, as you said, this techno-deterministic solution: just a push of a button and everything will go away, right?
And then, maybe if we move on to, let's say, the relationship between politics, between party politics or governments, and disinformation: would you say there is a link between the two? Should we talk about disinformation in connection with political agendas and other things related to politics, or are these two issues completely separate?
00:27:50 Sacha Altay
No, no, of course they are very much related in the case of disinformation, that is, false or misleading information spread with the intent to cause harm. Very often it's spread by governments, or foreign governments for instance, doing information operations, and so they are clearly political.
Like, I don't know, increasing polarization within a society, reducing support for some states or increasing support for other states. So, there are some clear political actors behind disinformation campaigns, that's for sure. In the case of misinformation, not all misinformation is related to politics.
But let's say that scholars have focused more on political misinformation than on other types of misinformation. For some reason; it's not totally clear why. Some people think it's more impactful, even though others disagree.
I have argued that health misinformation is also probably problematic and that we should focus on it a bit more. But yeah, a lot of misinformation, at least the misinformation that matters, that people see and that has the potential to be impactful, is spread by politicians, because they have wide coverage, and sometimes they even do so through the media.
When Trump was in power in the US, he clearly spread a lot of falsehoods through mainstream media, because he was the president and the news has to cover him, and sometimes he told lies, and that was a difficult situation. But of course, in general, politicians instrumentalize facts or spread lies to gain supporters, for political gain in general.
And of course, not all politicians do that, but many people agree that when politicians do it, it's more problematic than when a random user with 300 followers does it, because the impact is not the same, and because politics can have an impact even if people don't really believe it.
For instance, if Republicans in the US say that they are against masks, then Republicans in the US can use the mask as an identity factor. They can say, oh, I'm not wearing a mask so that I identify as a Republican, and then it becomes this marker, this party line, and people wear or don't wear masks just to identify like that, even if they have nothing against masks. So that's problematic.
00:30:30 Domen Savič / Citizen D
And focusing on, you've mentioned it a few times, the role of the mass media, right? On one hand you have the theory of the media as public watchdogs, the fourth estate. On the other hand, you've just mentioned that they act as megaphones amplifying, let's say, dis- or misinformation and conspiracy theories. How do you see the role of the mass media in all of this? Are they friends or enemies of the people?
00:31:02 Sacha Altay
I mean, first I'd like to say that the news media still plays a very important role, despite the rise of social media, despite the advent of the Web 2.0 and everyone being able to be a content creator, etc. We know that when people consume news, they still mostly turn to mainstream media, so it still plays a very important role. And then the question is, are they friends or enemies?
I think it depends on the context and the country. I'd say that in Western democracies where there is a strong public media ecosystem that's free and has the money to do good work, in the UK with the BBC, for instance, they have a good media ecosystem and the news is mostly a friend, even though in the UK, for instance, there are also a lot of tabloids that probably don't help people be that informed, if not create somewhat biased perceptions.
But overall, on average, let's say that in countries like the UK the news is a friend, in the sense that it helps people be more informed about what's happening in the world and perhaps make more informed decisions and be better citizens. But then there are other countries, obviously China, for instance.
Many news outlets there are controlled more or less by the state and are not totally free to do what they want, or in Russia, or to some extent in India. In these countries it's not totally clear whether the news is a friend or an enemy.
Because if you're a friend of the government and the government is a dictatorship, then of course you're not serving the people, you're just serving the regime.
And in countries like the US, for instance, which don't have a strong public service media, on average, let's say, it's all right: following the news in the US is better than not following it at all, but it's also less good than in other countries like the UK or France, I think.
So it depends. And of course there are some good news outlets and some bad ones, and it’s hard for any news outlet to give a completely perfect representation of reality, cover topics fairly and satisfy all audiences. So I think they have a lot of work to do, and they should do it better, but I think a lot of them are trying.
Trust in the news has been declining at least a little bit in the last 10 years, but the largest drop has been in interest in the news, where people everywhere are just less and less interested in the news, and in participation with the news, so talking about the news, sharing news online, etc.
I have done some work on it, and it’s declining in many countries, so people are just turning away from the news. But hopefully we can create a new form of news that attracts people and interests them, and I think there are a lot of benefits, or at least potential gains, in news influencers on TikTok, for example.
In France, for instance, we have HugoDécrypte, a news influencer who does a very good job at summarizing what’s going on. They also have deeper explainers, and I think it better fits how young people consume news and information online, so the news also needs to adapt to its audience and be better to gain trust.
00:34:33 Domen Savič / Citizen D
Just to follow up… What do you think are some of the reasons people stop engaging with mass media, with news outlets? Is there something we can point a finger at and say, OK, it’s this or that, or is it just a complex situation that you can’t really pin down?
00:34:55 Sacha Altay
There’s a lot of work on that and there are some clear reasons that come up all the time. One of them is that the news is too negative and it brings down people’s mood.
It happened, for instance, during COVID-19: a lot of people consumed more news, but at the same time, the news was mostly negative. And so, lots of people turned away from the news because it was too negative. And in general, that’s something people who avoid the news, or start avoiding it, mention. That’s one of the main things.
A lot of people also feel just overwhelmed by the amount of news there is and think there is too much, that they cannot follow everything. That’s also a reason… I forget the other reasons; I think they are slightly less important.
I think negativity in the news is one, but at the same time, we know that negative news is also more interesting. There are some positive news outlets, but they are not very successful, and I think that shows people are actually not very interested in positive news. They say they want it, but a lot of people say they want stuff, and then when you give it to them, they don’t like it.
And of course, we are interested in the trains that are late and not the trains that are on time, so it’s difficult. There’s no easy solution, I think, to this problem.
But I do think that maybe the news sometimes covers negative events too much and doesn’t give the broad picture enough, even about things like poverty and hunger around the world. Most people think that poverty is increasing, that hunger is increasing, and that’s totally false: extreme poverty has been falling a lot in the last 50 years, same for hunger, and many of the world’s problems are actually getting better.
Many people don’t know that, and I think maybe the news just needs to also give the big picture a little bit more. But again, I don’t know, it’s complex.
00:37:00 Domen Savič / Citizen D
And wrapping up: are there some, let’s say, low-hanging fruits in this fight against disinformation, against distorted media perception and reality perception, that you are paying attention to when you look at the field?
Are there some things that should be happening, or will happen, that will make you say: OK, we did something right, this was worth pursuing, or is worth pursuing, if we want to wake up one day, to put it very stereotypically, in a brave new world, in a positive sense?
00:37:49 Sacha Altay
I’m not sure, but to be honest, some examples I see are really like HugoDécrypte in France; this French influencer, I think, is doing a lot of good by getting people interested in the news who are usually not interested, and I see a lot of value in that, probably more than in fact checking.
I’d say that just the work of HugoDécrypte reduces misperceptions in France more than all the fact checking in France, simply because a lot of people who would not have consumed news now consume news because of him, are aware of a lot of things and get some context on events, etcetera.
And so, I think this is going in the right direction and it makes me optimistic. I also very much trust the European Union to do some work to hold big tech accountable. Of course, I don’t think big tech is the cause of all our problems, but I do think we need to hold them accountable, and they are so powerful that at the level of a single country in Europe, we cannot do much against them.
But united as Europe, I think we can, for instance, increase data sharing; there is a lot we can push these big companies to do that would help us better understand the ecosystem.
Because one thing I haven’t said is that we don’t know much, descriptively, about the disinformation ecosystem in general in most countries. We know a lot about the US and some Western democracies, but because we don’t have good access to data, we don’t know much elsewhere, and a lot of people are jumping to conclusions, proposing solutions, et cetera. My take has mostly been: guys, we don’t know much yet, and in most countries that matter, even countries like India, we don’t know much about the news ecosystem.
We don’t know much about what’s going on in WhatsApp groups in these countries, and yet we are proposing solutions based on, I think, a biased understanding of information ecosystems. So, I believe the European Union could help us have a better overview of information ecosystems, and yeah, that’s mostly what makes me hopeful.
Also, when I’ve looked at discourse and news coverage during the European elections, I’ve not seen much misinformation, and I’ve not seen much AI-generated misinformation. So, I think that’s pretty good news, and it was the same in many other countries, even India and countries like that.
There was of course some false information, as usual, but there was no massive information operation that worked and influenced people. So, that also makes me pretty hopeful that, at least for now, things are going fine. Of course, we need to make it better, but it’s not as bad as some feared.
00:40:45 Domen Savič / Citizen D
Excellent. We’ll end on a positive note. Thank you, Sacha, for dropping by and for sharing your thoughts on the issue. We are off next month, so see you all again in September. And yes, thank you again, Sacha, for dropping by.
00:41:02 Sacha Altay
Thank you very much for having me.
Podcast Citizen D gives you a reason for being a productive citizen. Citizen D features talks by experts in different fields focusing on the pressing topics in the field of information society and media. We can do it. Full steam ahead!
In the hundredth episode, we talk with philosopher and programmer Filip Dobranić about the conversation around artificial intelligence.
How mass media and politicians frame this field, why that can be a problem, and how we should first address it at the level of language, which shapes the field of politics and, in some cases, even the field of technological development.
From artificial intelligence saving the world to it destroying it, we discuss the meaning of the term artificial intelligence and other details of this field that frame the debate around it.
Podcast Državljan D is a podcast for spending your time productively while playing the role of a citizen. In conversations with experts in a given field, Državljan D informs and activates. It can be done. Onward!
We sat down with Nina Jankowicz, an American researcher and writer, currently working as the Vice President at the UK-based Centre for Information Resilience, to talk about the fight against disinformation and online harassment, the role of different actors in this area, and the differences between the EU and the USA in this field.
Nina is the author of How to Lose the Information War, on the Russian use of disinformation as a geopolitical strategy, and How to Be a Woman Online, a handbook for fighting the online harassment of women.
She briefly served as executive director of the newly created United States Department of Homeland Security (DHS)’s Disinformation Governance Board, resigning from the position amid the dissolution of the board by DHS in May 2022.
00:00:06 Domen Savič / Citizen D
OK, welcome everybody. It’s the 12th of April 2024, but you’re listening to this episode of Citizen D Podcast on the 15th of May 2024. With us today is Nina Jankowicz, an American researcher and writer currently working as the Vice President at the UK-based Centre for Information Resilience.
She’s the author of How to Lose the Information War, on the Russian use of disinformation as a geopolitical strategy, and How to Be a Woman Online, a handbook for fighting against the online harassment of women. She briefly served as executive director of the newly created United States Department of Homeland Security’s Disinformation Governance Board, resigning from the position amid the dissolution of the board in May 2022.
Welcome, Nina. First of all, thank you for joining us. It’s really a pleasure.
00:00:56 Nina Jankowicz
Thanks for having me on Citizen D, yeah, I’m excited to be here.
00:01:00 Domen Savič / Citizen D
Excellent. First, let’s start at the beginning, right: the Disinformation Governance Board, your brief tenure, the critique you wrote for Foreign Policy where you highlighted the issue of, let’s say, political squabbles that are limiting the fight against disinformation.
It seems that no matter where you go, in the US, in the UK, it’s always the same, right? Every time somebody tries to do something in terms of regulating disinformation, fake news, propaganda and other issues, the backlash is always, you know: they’re going to take our freedoms, we have to defend democracy, free speech and everything else.
Would you say that this is something that just needs to be taken into account, or is there a correct approach to these types of regulatory bodies, one where people actually feel that the body will be of benefit to society as a whole, not just to political parties or sides or whatever?
00:02:08 Nina Jankowicz
Sure. So let me start with a little description of what the board actually was and what people say it was, for the listeners who may not be familiar. But first, I’ll say a little bit about myself too.
Prior to joining DHS, I had been a disinformation expert and analyst, focusing particularly on Russia and Eastern Europe, but increasingly, especially during the pandemic and in the lead-up to the January 6th insurrection, looking at some of the domestic disinformation we had seen. So that’s my background.
I had testified before Congress for both Republicans and Democrats. I had worked with members of Congress across the aisle as a fellow at the Wilson Center, and I had done a lot of work supporting policymakers all around the world, trying to put forward solutions to disinformation that upheld democracy, that made sure our democratic freedoms were protected.
I came to the job optimistic, because I didn’t really expect to get a political appointment from the Biden administration. So when they came and said, you know, will you serve your country? I said, of course, this is my area of expertise. It seems like there’s actually room to do something here, so let’s get something done. And I’m bringing with me everything I’ve learned in my research, including a lot of policy analysis of the things that had been tried in Central and Eastern Europe, things that may or may not have worked, which is what I write about in my first book.
The Disinformation Governance Board was meant to be an internal coordination body bringing together all of the different components, as we call them, of the Department of Homeland Security, which have very disparate missions. It’s a huge organization, and in some ways a Frankenstein, right?
It includes everything from our emergency and disaster management agency, FEMA, to Customs and Border Protection, to the TSA, the guys who make you take your shoes off at the airport, to our Cybersecurity and Infrastructure Security Agency, CISA, which actually dealt with election security and beefing up local and state election partners ahead of elections and potential incursions by the Russians or the Chinese. So, a lot of different missions, and the idea was to bring everybody together, make sure that we had a shared definition of disinformation, and make sure that the work each agency or component was doing upheld civil rights, civil liberties and the right to privacy, which Americans of course hold dear.
That was the brief that I was given: I was going to bring these people together, assess what was going on in the agency, and make recommendations for the board to adopt or not adopt, and we would go forward and make policy as a policy-making body. I sat in the policy part of DHS, right, the policy shop, as we say.
When the board was announced, I had been working at DHS for about 8 weeks. At the very beginning of my tenure, I said to my bosses, I think it’s important that we announce this and that we announce it transparently. There had been incidents where similar agencies were announced and there were, I wouldn’t even call it backlash, there were rumors, there were lies about them.
I’m thinking in particular of the Czech Centre Against Terrorism and Hybrid Threats. When it was announced, there was a lot of excitement among disinformation researchers and, you know, people in the national security space. And then people who were skeptical of the government in the Czech Republic really didn’t like the idea of this centre against terrorism and hybrid threats, which in part was dealing with disinformation. And that was because they communicated poorly about it.
They weren’t going to be doing any fact checking; they were very narrowly focused on the mission of the Ministry of Interior and, you know, threats related to terrorism. But they didn’t communicate that very well, and I write about that in my book. So I said to my bosses, we need to make sure we communicate well. That did not happen.
You know, I wasn’t actually a very high-ranking person within DHS, I wasn’t confirmed by the Senate. There were, you know, a lot of people who outranked me there, and my job was to give them advice. And I gave lots of advice about different approaches to communication that I thought we should take. They were all rejected in favor of an approach that announced the board but didn’t give a lot of detail about it, and the problem with that was that we left a vacuum for the adversaries, political adversaries especially.
But, I will also say, Russia wrote about this too, to fill in the blank, and what they filled in the blank with was lies. They said that the Disinformation Governance Board was going to govern what was true and false on the Internet, which was not true. They said that I was going to have the power to send men with guns to the homes of Americans with whom I disagreed, also not true. I was not a law enforcement official. I had no budget. I had no operational authority.
That operational authority lay with the components, but even they were not going to send men with guns to the homes of Americans with whom they disagreed. That was just preposterous. And in that vacuum, because there was so little information about the board, it was ironic, of course, having written about and studied online abuse, that I would then be subjected to a very widespread hate and harassment campaign.
I mean, my family was doxxed, our personal information was released on the Internet. At the time I was pregnant, and people were saying horrible things about my reproductive status, my baby, my husband, whatever.
Other members of my family were targeted, there was wide speculation about my personal life, and I got death threats and all sorts of other nasty, unsavory, violent threats. And also, it’s not the same, the violent stuff is worse, of course, but people just lied about me. They lied about all sorts of things; they said that I had called the Hunter Biden laptop a Russian disinformation operation.
I actually never said anything like that. I urged people to be skeptical about the laptop, given when it was released and who was shopping it around: Rudy Giuliani. He does not exactly have a very good track record of telling the truth, right? And I said, listen, we don’t know what this is, it can’t be independently verified right now, treat it as a Trump campaign product. And somehow that has been construed to mean I believe it’s Russian disinformation, and that’s been repeated over and over. They said that I believed in the Russiagate conspiracy theories; if you read my book, I actually say we don’t really have the evidence that Russia was colluding with Trump, but we do have evidence that Russia tried to interfere in our election, right? That’s open-source evidence, we know about that.
So, my opinions, my thoughts were totally misconstrued. I was painted as a young woke liberal woman who was coming into government to take your rights away. And so, to this day, people believe that I committed treason against the United States, because I was supposedly attempting to violate the First Amendment and take it away from my fellow citizens.
And I’m sure my grandfather is rolling over in his grave. My grandfather was 10 years old when he was deported to a gulag by the Soviet Union; he made his way through Central Asia, ended up in the UK in the early 50s and immigrated to the United States, looking for the American dream.
So that is very much an experience that I carry with me, that is in my DNA, and for somebody to say that I wanted to take away Americans’ fundamental rights and freedoms is just anathema to me.
And of course, what happened in the end: I was pregnant at the time, and I decided to resign because the department really didn’t have my back. They were not pushing back vociferously on these lies. They thought that strategic silence was the right thing to do, that this would blow over.
I kept being told that it would blow over the next weekend, or the weekend after, and as we waited and time went on, they, you know, issued a fact sheet that wasn’t very good. They had people give various statements to the media, but I wasn’t allowed to speak for myself, and it was my record and my family that were being disparaged and endangered. So I decided to leave. They had asked me to stay on as a policy advisor, and I just decided it wasn’t worth it. Then they disbanded the board after I left, later that summer.
And you know, I was at home with my baby trying not to think about anything.
This was a really frustrating and sad moment for me, and I think it’s one that’s important for your audience to understand as well: as much as there has been political backlash against some of the counter-disinformation attempts in Europe, there’s never been anything like what we’re seeing in the States right now.
And it’s not just about me. It has now expanded beyond the Disinformation Governance Board and me in 2022, where the Democrats kind of handed the board to the Republicans as a fait accompli and said, OK, you win, we’re not going to fight you at all. And that’s kind of an admission of guilt, right?
It looks like one, even though there was no wrongdoing. And then, having had that proof of concept, the Republicans went further and said, look at all the rest of this censorship happening. They basically redefined the term censorship: any person studying the information environment is a censor.
Any cooperation or communication between an academic or somebody at a think tank and a social media platform or the government, that’s also censorship, right? That’s the claim right now. And I think that is a completely false definition of censorship.
Censorship has a clear definition of what it looks like in the American context. And this redefinition is doing the opposite of what it claims: it’s not defending people’s fundamental rights and freedoms, it is having a chilling effect on the researchers who are doing this work and just trying to stand up for democracy. So that’s where we are right now in the States. It’s not a pretty picture, and to me, it represents the most fundamental threat to academic integrity and freedom of expression since the McCarthy era.
00:12:01 Domen Savič / Citizen D
Looking back on your involvement with the DHS, the backlash, the response… it sounds like the arguments you wrote about in your op-eds and everywhere else describe a little black book on how to discredit people, right? Anybody can borrow it from a library, read the paragraphs, and then use it in different contexts: in the US, in the EU, all around the world.
So why do you think this is such a transferable procedure? Why do you think the same things people use to, you know, character-assassinate a person or gather up reasoning for personal attacks work literally anywhere in the world?
00:13:05 Nina Jankowicz
A good question. I mean, I think ultimately, whether it’s online or in a newspaper, or just gossiping with coworkers around the water cooler, right, people are interested in personal stories, and the more salacious, the better.
And so, you know, claiming that I’m this crazy woke, you know, feminazi working in the DHS, that’s a good narrative, and it makes money for people.
I know, because it’s been repeated over and over and over again. Whenever anybody has a chance to mention me, that’s how they describe me, right? So it clearly is resonating with people. And I think there’s also been a normalization, at least in the United States and our politics, of these sorts of personal attacks since Donald Trump’s presidency.
We have seen that sort of personalization not just within him and his office, but also being normalized in the Republican Party to some extent. I wouldn’t say all Republicans do this, but there are certainly people within the party who have enormous influence who act this way.
And it’s interesting, because they do it bombastically in public, but when you get them behind closed doors, they don’t act that way. So you know that it is something that is done for show, that it is an act, and it’s really depressing to me.
And I have actually said directly to members of Congress who have attacked me in this extremely personalized way and put me and my family in danger that I thought they were engaging in a dereliction of duty as elected officials. They have an important role to play in setting the tone of discourse in our country, and by acting the way they do, they are telegraphing to their constituents that it is OK, and some of the constituents, not all, but some, will go farther.
They will say worse things, they will, you know, send violent threats. They might show up at people’s houses. They might engage in violence; that’s not beyond the pale anymore. We’ve seen it already in the United States over the past four years. And so I really think we need to call that out and make sure that people recognize how strange that is.
I remember there was a conversation with a governor and I forget which governor; it was a state governor here in the States and he was asked about the gendered and misogynistic attacks on Nikki Haley, who was running for president for the Republican Party nomination at the time.
He said, that’s just politics. But actually, if you look at other countries, other developed democracies, it’s not as accepted elsewhere. It still happens, there are still misogynistic comments, but they are not the norm, and I think we really need to get back to that. And of course, getting back to the global nature of all of this, the Internet really supercharges that ability to be mean and personalized, and downright violent, right? There are no consequences for any of this stuff.
00:16:25 Domen Savič / Citizen D
So would you say, looking at potential, I’m not going to say remedies, because that would be too optimistic, but the right way to address these issues… Spanning from, let’s say, the first election of Donald Trump, the golden age of disinformation writing and fact checking and active citizenship, we’ve had nothing but failures up to today, right?
Nothing that we advised on in 2015 worked. In fact, if you look at the first notes about, you know, the media is lying to you, you have to check everything, everything is disinformation and propaganda… To me, it seemed that the biggest blowback came during the pandemic crisis, right? I usually joke in a very sarcastic way and say that the best fact checkers, the people who really took fact checking to heart, were the anti-vax community during the pandemic, because they said: nothing is true, everything is a lie, we have to figure out everything on our own, right?
So moving forward, are there some lessons, not to be learned, but some lessons we can take as a basis to reframe the discussion, the framework in which we are addressing the fight against disinformation and propaganda?
00:18:03 Nina Jankowicz
Yeah, I’ve always been, I don’t want to say skeptical of fact checking, I think it has a time and a place and serves useful purposes, but I think we need to be very clear about its limits. A reactive fact check of a piece of disinformation or misinformation is not necessarily going to reach the people who already believe the mistruth.
In fact, it often causes them to double down on that mistruth, and so I am much more in favor of investing in information literacy, and people react poorly to that, especially in the States.
It’s beginning to be another kind of dividing, lightning-rod issue here, because people think it means they’re going to be told: OK, this media outlet is trustworthy, and this one is biased, this one just publishes false news.
And that’s actually not what media literacy is. It’s giving people the tools they need to navigate today’s information environment. And while I don’t want to put all of the emphasis on people, right, I think there need to be other parts of the solution, I do think it helps if people understand how social media platforms work and why they’re being shown certain content. If I’m looking for a new pair of shoes and then suddenly I’m getting Instagram ads for different pairs of shoes with the characteristics of the ones I’ve been looking at, there’s a reason: it might not be the best shoe for me, it’s just that they’ve got a very good online advertising team. And similarly, you know, knowing that the platforms are targeting you based on your interests and the things that you’ve engaged with.
I wrote a piece in 2020 about how easy that slide is. In our context, we had a big movement to reopen businesses in about, let’s say, April or May of 2020. And even if you just had that as a purely economic argument, even if you didn’t believe in any COVID misinformation, didn’t think vaccines had 5G in them (we didn’t have vaccines back then, but you know all these crazy conspiracy theories), even if you didn’t believe any of that and just wanted to reopen your business, it was only a couple of steps from joining a reopening Facebook group to going into crazy conspiracy theories about heavy metals, or the vaccine being a conspiracy to turn people into sheep.
And it was just a few steps removed, and that’s the sort of thing I think people need to understand.
So I’ve always advocated that governments, in addition to taking the national security precautions they all should take, in addition to making sure they’ve got one belly button of policy coordination within their government, so that it can all go out to the different ministries or agencies, need to be investing in information literacy as well.
And it wouldn’t be prescriptive. It wouldn’t say Fox News bad, New York Times good; it would say, you know, today’s information environment is very polluted, we all need to recognize that navigating it is difficult, and here are some tools to do that.
So that’s part of it. I think the other thing we have seen from the countries that have actually been successful at this is that they have publicly prioritized disinformation as a problem; they’ve publicly declared that.
So if you look at Sweden and Finland and the psychological defense work they do with their populations, or Ukraine: obviously Ukraine is at war, so it’s an extraordinary situation, but, you know, people there recognize that disinformation is being used to undermine them and their efforts as they try to resist the enemy.
And so that’s part of it. And then also, and this is something governments struggle a little bit more with: creating that policy center somewhere at the heart of government, with full political buy-in from the very top of an administration, something that stays from administration to administration and can’t quickly be undone by political upheavals, is incredibly important. And no president since 2016 has done it: Trump didn’t prioritize it, and Biden didn’t figure out a way to do it right.
I think what I was supposed to do at DHS was a smaller version of this within the department itself, and it failed miserably.
And now, as a result, a lot of the counter-disinformation work the US government was doing has been rolled back. So, it’s difficult, and you have to telegraph it the right way: you have to be very transparent about what you’re doing and say, we’re not doing any fact checking.
You know, we’re going to be putting good information out there; we’re looking at what’s happening online, and we’re going to provide Americans with information about how to get to the polls, how to vote, where to find their polling place. We see a lot of misinformation and disinformation about that sort of stuff. That might be all it means, but you need to be transparent about it, and for some reason the US government couldn’t figure out how to do that.
I will say one final thought, which is that when the Salisbury poisoning happened in the UK in 2018, that was, in my opinion, a very good example of how to be transparent and how to declassify information quickly,
how to work across very different parts of government, from the police to the Foreign and Commonwealth Office to the people who dealt with chemical weapons, to get information out to the public quickly. I think we could all learn a lot from that.
I know there are probably still people who believe the Salisbury operation was a conspiracy or a plant by MI5 or whatever, but I think the majority of the world, and certainly the majority of Brits, believe the truth, which is that Russia did this on purpose as retribution against Skripal.
00:23:47 Domen Savič / Citizen D
So, my next question would be about the difference between, let’s say, sources of disinformation, right. So, you have these, let’s call them homegrown disinformation outlets that are operating within a certain interior politics or party politics, within a specific geographical area like a country or a region, and then you have these, you already mentioned them, foreign ops, influencing or trying to lower the attention of the enemy, of the foreign countries.
Would you say that dealing with these different types, so to say, of disinformation could be the same, or is there a huge difference between addressing these two issues? Because the pandemic was sort of a highlight of everything mashed together, from local disinformation outlets that were basically just monetizing the pandemic’s informational black hole, to the Russian foreign operations that were trying to, yeah, inject incoherence and stuff.
00:25:03 Nina Jankowicz
Yeah, it’s difficult to draw the line right. I think everyone wants there to be a very clear line between what our foreign adversaries are doing and what’s going on at home, and this is something that’s been incredibly misunderstood by almost everyone from the press to policymakers since the very beginning of the kind of current era of interest in disinformation and I talk about in my book how the most successful disinformation is the disinformation that manipulates pre-existing grievances in society.
So, if you have, again to use a US example, if you’ve got people who believe Texas should secede from the rest of the United States, that’s a really easy grievance to identify and manipulate if you are Russia. You might identify some local actors who believe that, like, you know, local civil society groups or a local media outlet that takes that kind of position as an editorial position, and you can work with them. And that’s when you get information laundering.
Right, so we see narratives or pieces of, quote-unquote, intelligence introduced to local actors, and then they are able to spread it without the Russians or the Chinese or the Iranians showing their hands that, aha, this is a foreign actor, and so it gives them a layer of insulation, plausible deniability.
And it also makes it harder, at least in the American context, because of our free speech laws, which I agree with, just to put that out there so no one can clip this and say that I didn’t say that. It makes it much harder to respond, because if you’ve got somebody who fervently believes something, they have a right to say it, right?
So how do you respond to it then? And that’s where you get into kind of the idea of counter speech. You know, somebody putting out that information and hoping that the story that they tell is more compelling than the lie.
So, it’s really difficult, and I guess the other thing that I would say is that the nations that have been good at this, and I think the UK is another good example, and also probably the Nordics and the Balts.
They have an actor-agnostic approach, right? It doesn’t matter if it’s coming from Russia or coming from inside the house. It means that, you know, they’re going to respond to disinformation that affects public safety, public health or democracy in the same way, and I think that’s the right approach. If it’s a fake Russian account, by all means take it off of Twitter or Facebook or TikTok or whatever, but we need to be sure of that.
In fact, if you go back to my very first New York Times op-ed, from September 2017, I say that playing whack-a-troll is not the right approach to countering disinformation; we need to do better strategic communications.
We need to tell better stories. We need to, you know, prioritize this as a policy area and talk about it transparently, not play whack-a-troll, not remove those accounts. And unfortunately, you know, I think as you and I have talked about, that’s where we’ve ended up: people still think that the response to disinformation is censorship, that the response to disinformation is removing stuff online. And that is just not true.
00:28:21 Domen Savič / Citizen D
And how would you compare the current, let’s say, regulatory uptick in the EU? I mean, they’re on the way out, you know, elections are coming in two months, but they’re still trying to push out a lot of regulatory frameworks that address, you know, financing of the media, transparency of operations, the role of digital intermediaries…
They’ve had some success in terms of addressing or highlighting these issues; California and other states in the US have started, you know, writing their own versions of, let’s say, the same regulatory frameworks. Do you think this is part of the solution, or is it just a PR stunt for the politicians to show off and say, yes, we’re doing something?
00:29:17 Nina Jankowicz
The DSA and related pieces of legislation are good. I don’t know if they’re gonna work. I think we have to wait and see, to, you know, see if the implementation is as good as the idea. But I think we need to start somewhere, and particularly with the transparency and oversight question; that one, I think, is the most critical.
So, when people ask me what I would do to regulate social media, I tell them that we can’t even have a conversation until we’re all using the same set of facts and right now, that’s true, right?
And researchers, I don’t have to tell your audience, I’m sure, but researchers’ access to data on social media platforms has been almost entirely cut off or monetized to the point where they can’t get access to it.
So, what we’re dealing with now is basically hearsay. It’s like when Elon Musk decides he wants to release snippets of conversations, without their full context, to pre-selected journalists; that’s part of what we get, or whistleblowers, or stuff that’s been obtained through, you know, these congressional investigations that are going on here, again released in a retaliatory or untransparent manner. And what I think the DSA and other regulatory regimes can solve, including the eSafety Commissioner in Australia, who has the power to essentially ask questions of social media platforms based on safety-related issues, is that it allows us a shared set of facts.
It means that we will know how the platforms are dealing with content moderation questions. We will know how they’re dealing with foreign interference questions. And in cases where people feel aggrieved that their content was removed or demoted, or their account was shut down, we will know why that happened.
And until this point we’ve not had anything that has allowed us to do that. We could make inferences even when we had access to more data, but we would only have what the data showed, the inferences that we could make, and then what the platforms said.
The truth is probably somewhere in the middle, right? So, I’m excited about that part of it. I think there’s still a lot of work to be done thinking about the way that our information environment today affects public safety, and the Online Safety Bill in the UK makes some attempts to square that circle, but it also has a couple of concerning elements to it, including the requirement to break encryption, which apps like Signal have been really pushing back on, and I think rightly. But on the other hand, it also, you know, I think makes deepfake porn and other types of harassment based on protected characteristics illegal.
So, it’s a difficult dance to do, but I think that is going to be the longer-term harm that we have to kind of iterate on and see, as things are developing, what’s going on with the transparency issue. And if we solve that, and we make sure that there is a non-partisan intermediary body overseeing that data and distributing it to researchers and journalists, frankly, then we’ll have the basis on which we can have that conversation and make it more productive.
00:32:32 Domen Savič / Citizen D
And what would be the result of this conversation? You often hear about transparency as the cornerstone of the fight against disinformation, but then people never mention the next steps, in terms of: OK, we have transparency, we know what’s going on behind the curtain. What’s next?
00:32:59 Nina Jankowicz
Yeah. I mean, I think then we have some sort of intermediary regulator who can respond to exigent threats on the platforms and either give guidance or set regulations for what they must do if a certain thing happens. So, I like to compare it to the airline industry. I like to actualize what goes on on the Internet, because I think a lot of people still believe that there is a firm line between the online and the offline worlds, and that just isn’t true anymore.
So, Boeing obviously has been having a lot of problems over the past couple of years; recently, here in the States, a door blew off an airplane, and immediately we had an investigation by the FAA into Boeing and what was going on there.
I think we need similar regulation for technology platforms that have such an immense influence on our lives and have so much of our personal data as well, right? We need to make sure that they’re safeguarding that appropriately, and that if they are not, if they are not, you know, investing enough in dealing with child sexual abuse material or terrorist content, or, if we ever get there, other online harms that affect adults and everyday people…
Deepfake pornography is one example, right? If they’re not doing their due diligence and expending a certain level of effort, then they get fined or face some sort of penalty. That’s what happens in every other industry, from finance to cars to airplanes to food. And to think that we are giving over so much of our personal information to these companies, for access to something that we use in our daily lives to connect with people, to do business, to stay in touch with families, and there is nobody, nobody, who’s watching what they’re doing.
I mean, it’s shocking. It’s shocking. So, I think that would be the logical next step. But again, we need that shared set of facts to start from a common ground there, because otherwise we’re just going to get hoodwinked by political actors and private companies who are trying to make a buck.
00:35:00 Domen Savič / Citizen D
It seems to me that this subject of regulating, let’s say, intermediaries or online platforms is like the perfect storm situation where anybody can add their own two cents. The political actors are trying to weaken encryption, as you’ve mentioned, and trying to convince us that, you know, terrorism will be gone if they just have access to everything.
Then you have the private sector that is just “No, no, no. Just leave us alone. We will handle this perfectly on our own because we are the only ones who know what to do.”
So, which of all these actors do you think should, let’s say, start the conversation about transparency and about next steps? You’ve mentioned political actors, we’ve talked about the market forces… who should be the instigator of this conversation?
00:36:05 Nina Jankowicz
I think it’s got to be researchers. Nobody among the political actors and the private companies is unbiased enough to really drive the conversation, and it’s the researchers who have worked with the data. It’s the researchers who have uncovered some of the harms online, and it’s the researchers who want to be able to continue to do so without this exorbitant cost.
And I think they’re starting to do that. The problem is that, with due respect to my beloved colleagues in academia, academia moves very slowly. It’s really difficult to get consensus on things, because everybody has very strongly held opinions based in their own research and their own experiences, and they don’t communicate very well.
I think most academics would agree with that. So, you know, I think there’s a lot of work to be done to make those points to the public as well, and I’m doing some work with some academics to start to do that in the future.
But it is difficult. It’s a really difficult battle, because everyone turns off their brains as soon as you start talking about something technical, but I think again, that’s where the analogy to the airline or the food industry or whatever comes in very handy, because I think everyone hopefully agrees now that what happens online does not stay online, and that means that it needs to be regulated.
00:37:27 Domen Savič / Citizen D
Yeah. And speaking about regulation, just one more topic before we wrap up… your latest book talks about online harassment; it describes your experience, and it talks about other women’s experiences and other types of harassment online.
What’s your take on the current, let’s say, regulatory situation in that area? Is this similar to disinformation or the fight against disinformation, or is it something completely different? I’m guessing harassment is as old as disinformation operations, if not older, so…
How do we address this issue moving forward, especially now that we are, you know, walking, maybe even sleepwalking, into the era of generative AI, deepfakes, and everything else that just, you know, turns the volume up to 11 in terms of harassment.
00:38:23 Nina Jankowicz
I mean, I could have spent an entire hour on that question alone, but I’ll try to be quick. I think the state of regulation of online harassment, online harms and online safety, especially in the United States, but really in most places around the world, is primitive at best, and in most cases, I think, it really discounts the harm that women and minorities in particular face when they encounter online harassment, right?
The intent of this harassment, whether we’re talking about just, you know, trolling someone or deepfake pornography or violent threats, is to silence them. So, it is a speech issue.
And yet, when people turn 18, especially when women turn 18, right, we have a lot of conversation in the online harm space about girls, and I had a student at Oxford ask me once, why is it that when girls turn 18, we no longer care about them, and I think that’s a really, really good and trenchant question.
They’re not minors anymore, sure, but are we asking, just by raising our voices, to receive rape threats, to be targeted in this way, to have to fear for our lives when we’re walking around, you know, outside? I mean, I have a cyber stalker who I had to get a protective order against, and I had a bad experience with law enforcement when I did that.
The detective who handled my case basically said, well, he lives in New York, why do you think he’s gonna show up here at your home in Virginia? And I’m like, because he has my address and he has a habit of showing up at people’s homes and places of employment and events that they’re at. And I have a baby, and I don’t want him to, you know, see my baby or be in proximity to him.
There are many law enforcement agencies that react similarly. So, I think we need much more help there. The fact that deepfake pornography has not been criminalized, or even a civil penalty instituted, at the federal level in the United States is shocking to me, when the harm is so, so, so clear and it has become so easy to create stuff like this. And you asked if this is a similar question to disinformation or not.
I mean, I think the response that I hear from skeptics is that this is just, you know, something you should have to deal with. You should just buck up and deal with it. Often, it’s white men who say that; they just don’t get the same sorts of harassment that their female counterparts do. There’s a lot of data on that. And so, what do I think needs to happen? I mean, it’s tricky, right? Because it runs into speech questions. But I think in regulating the platforms we might be able to find a happy medium there.
Obviously, these platforms have terms of service that they are meant to enforce, which already say that you’re not allowed to harass somebody based on protected characteristics like gender, age, religion, ethnicity. The problem is they don’t enforce them, so perhaps there’s an enforcement mechanism that says, if you don’t enforce your own terms of service, you are not upholding your duty of care to your users. And that’s a market problem, right? That’s kind of a consumer protection problem.
There are a lot of different ways to do it, but I think until people recognize the harm that online harms do cause, that it’s not just sticks and stones, but that there is a real physical threat that comes from a lot of this stuff, we’re going to be going around in circles.
But I am hopeful. Danielle Citron, who’s a privacy scholar and a former MacArthur fellow, has two great books. One of them is Hate Crimes in Cyberspace, and the newer one is called The Fight for Privacy, and she mentions how in the 70s, of course, it was perfectly legal and normal to get sexually harassed at work.
And women came together and banned it, and now it is very much not legal, right? You can have harassment claims against your bosses, your coworkers, for creating an unsafe work environment. We just need to bring America, bring the world, up to speed that that’s not OK online either, because it is a speech issue, like I said, coming back around to it. It means that people, women, marginalized communities, are silencing themselves because they are afraid of these horrible things that might happen to them online if they do speak out. So that’s where we are. I’m hopeful that we can change it, but right now the situation is not very good.
00:42:51 Domen Savič / Citizen D
And just one more question before we wrap up. There is a ton of political movement in the US and the European Union; the US election is coming up, the EU election is coming up. Do you see these topics well addressed during the election cycle? I sometimes think it’s a generational issue; the older people, let’s say above 50, just don’t have, as you said, the same experience, or it almost looks like they don’t live in the same world.
So, is this a generational issue? Working in this field, working with younger people, do you see a change of narrative and perceptions of these issues among people who are younger and more involved in these types of situations?
00:43:59 Nina Jankowicz
Yeah, I think so. I think younger people understand the harms that the Internet can cause more acutely than people a couple of generations ahead of them. But I’ve also had some really enlightening and encouraging conversations with people who are many years my senior, so I wouldn’t entirely write them off, though I do think we’ll see more and more legislation coming from younger MPs, younger members of Congress, as we start to address these issues.
00:44:33 Domen Savič / Citizen D
00:44:48 Nina Jankowicz
Yes, I agree. I’ll be watching the elections in the EU and I wish you all the same as well. Bye bye.
Podcast Citizen D gives you a reason for being a productive citizen. Citizen D features talks by experts in different fields focusing on the pressing topics in the field of information society and media. We can do it. Full steam ahead!
We sat down with Mark Dempsey, a Senior EU Advocacy Officer for global free speech organization ARTICLE 19.
Prior to ARTICLE 19, Mark consulted for the European Commission on a project focused on data protection laws in non-EU countries. ARTICLE 19’s work in Brussels is driven by the goal of ensuring that the European information environment is free, fair, accessible, inclusive and decentralized.
During our conversation we touched on the current regulatory frameworks within the EU, digital activism fatigue, the role of the end-user and the omni-present role of digital technologies and services that need to be put in check by regulation.
Welcome everybody. It’s the 5th of April 2024, but you’re listening to this Citizen D podcast episode on the 15th of April same year. With us today is Mark Dempsey, Senior EU Advocacy Officer for global free speech organization ARTICLE 19.
Prior to ARTICLE 19, Mark consulted for the European Commission on a project focused on data protection laws in non-EU countries, and even before that, Mark, you worked in the financial sector. Is that right?
00:00:39 Mark Dempsey / Article 19
Yeah, correct. Correct. Yeah.
00:00:41 Domen Savič / Citizen D
Before we start our conversation, I just want to know what made you switch from the financial sector to digital rights. Which one is worse?
00:00:53 Mark Dempsey / Article 19
Which one is worse is a funny way of putting it. So, I worked for a long time in development finance and then regulatory finance, and I was an EU policy advisor for the Financial Conduct Authority in London, where I would go to Brussels and sit in Council working groups discussing nascent legislative proposals from the Commission. And I think what really made me make the move was an awareness of the growing encroachment of big tech on our lives and the controlling of the narratives.
Not just from the public space perspective, but also commercially, with so many businesses reliant on the likes of Amazon, Google, etcetera, which control the various data flows and the commercial relationships.
So it was the encroachment of big tech in our lives. But it was also the realization that there was an extent of regulatory capture in finance which was never going to change.
And I think coming from Ireland, seeing the financial crisis and the way taxpayers basically footed the whole bill, no one in the US went to jail, there is still this aura of invincibility around finance, and I just became, I think, quite disillusioned really. It was very hard to see the tangible effects of regulation on people on the street, and I felt that digital rights was an area where I could make more of a difference, so I took the time out to go to Hertie and do a masters.
And I was lucky to work closely with Joanna Bryson, who has this unique view as a technologist, where she looks at society and the human rights impact and the digital rights that come attached to it. So, I think she was a bit of an inspiration.
I’m sure she’d be heartened to hear that. And then, of course, there was such a huge avalanche, as you say, of regulation in the EU, and it felt like a very good moment to be involved, even if I did come in a little bit late. The DMA/DSA negotiations were being finalized as I joined ARTICLE 19, but the GDPR work with the Commission was a good learning experience and set me up nicely for the ARTICLE 19 role.
And there, of course, I came straight into the European Media Freedom Act, which was in the middle of negotiations, and which in itself is a fascinating piece of legislation which we can talk about later.
00:03:32 Domen Savič / Citizen D
Sure. Because my next question would be exactly that, right. So, the EU Commission mandate is wrapping up, they’ve done a ton of work in terms of regulatory proposals or regulatory laws in terms of digital rights.
You’ve mentioned the DSA, the DMA, the European Media Freedom Act, and previously the GDPR. Are we heading in the right direction with these proposals, with these laws that are, as you’ve mentioned, focusing on big tech, on intermediaries, on everything that’s happening in the digital world right now?
00:04:12 Mark Dempsey / Article 19
I mean, I don’t know whether I’m the right person to ask; the question is in itself questionable. I think the EU can be commended for making the first move, and I know that they sometimes get criticized for being mainly regulatory, so to speak, and for seeming to specialize more in regulation. But that’s because they do have this immense capacity of civil servants within the Commission.
Whether we’re going in the right direction, it’s too early to tell. I mean, what is disappointing is that the process of agreeing proposals, ultimately at the end of the trilogue, is still very non-transparent, and I think with the EMFA, with the DSA, probably less so the DMA, there were agreements made, or at least positions pushed by the Council at the end, which undermined the initial legislative proposals. So I think that process simply shouldn’t be tenable anymore, but I think it probably will be, because I don’t see any changes happening.
So we are very dependent on a strong Parliament, and you’re very dependent on having your champions in Parliament to push the civil rights aspects of any legislation. But if you don’t have those people, if the rapporteur, for example, the person who oversees the file, goes into final negotiations and they’re not particularly, how do you say, favorable towards civil rights, then our position is severely weakened.
So I think the proof will be in the pudding in terms of enforcement. I mean, and I know some have heard this ad nauseam, but this all lies in the enforcement, particularly with these new legislative pieces that have come in.
There are mechanisms which give the EU a strong regulatory position, so they are going to be regulators for the DSA/DMA and the AI Act, and we just have to see how that plays out.
I mean, the GDPR has been a lesson for the EU, and I think they realized that. So, let’s see if they put in the right resources and work closely with civil society actors; look, that will all happen in the next couple of years. And then it’s a case of understanding how all these pieces of legislation interact with each other.
I mean, there could be unintended consequences. The different units, like DG Connect, DG Comp, the new AI office, will have to work very closely together and coordinate in a non-siloed fashion, and I think the EU is not known for coordinating well amongst different units. Things tend to be quite siloed, so it’s really up in the air as to where all this goes.
But I do go back to commending the EU for at least taking the step because, I mean, the US doesn’t even have a federal privacy law. Certain states, as we know, like California, have learned from the GDPR and created their own strong privacy laws, but the EU is really being the first mover here. They tend to love saying that and praising themselves, though; we’re beyond that now, now it’s about whether they can actually enforce.
00:07:45 Domen Savič / Citizen D
OK, I’m going to flip the question and ask you a little bit differently. So, is there a phenomenon, or let’s say a mistake in the field, that should be corrected by these EU legislative regulatory frameworks?
And if they don’t correct it, would that signal that the regulatory framework has, I guess I could say, failed in its attempt?
00:08:20 Mark Dempsey / Article 19
Well, something that Article 19 is looking at is the concentration of market power, so if the DMA doesn’t do what it’s supposed to do then I think we will have an issue. You could say that that particular piece of legislation will have failed.
I mean the whole point of it is to create fair and contestable markets and that means ultimately as well opening up these platforms, and I think there are avenues within the legislation to do that.
But the Commission hasn’t mandated that so far. For example, if they were to open up recommender systems and allow third-party recommender systems into your social media feeds, I think that would be something that would certainly disrupt the business models and provide the choice which the DMA proposes to provide.
So, I think that there are ways for them to decide that something is not working and therefore we will have to make some fundamental changes to the business models to make sure that these markets are fairer, more open and more contestable.
But again, this all comes down to a certain level of political will and a certain level of courageous leadership. Because if you look at the parallels in finance, those steps weren’t taken; there wasn’t the political courage.
You have these large institutions which are deemed too big to fail, and post financial crisis there should have been ways of breaking them up so they don’t have a dominant position in the market. And if you look at your Goldman Sachs and the large American banks, they’re still making remarkable profits, their share prices have held and gone up, and the level of pay and compensation that executives get is just absurd when you compare the ratios going back over the decades.
00:10:22 Domen Savič / Citizen D
Would you say that’s something that is similar between the financial and the big tech industries, in terms of big players whose positions do not change no matter what, because politicians are afraid of them?
I used to say that politicians are in love with intermediaries, with big tech giants, and they don’t want to hurt their lovers in a way, so they dance around them and don’t face enough pressure from the public. They usually do something small and say, OK, we’ve done our job and now the rest is on the end users.
00:11:11 Mark Dempsey / Article 19
Yeah, I think there’s absolutely an element of that. I mean, unfortunately it seems to be sort of human nature that there is this admiration for the ability to make money, so when an executive from a big tech firm walks in, there’s fawning over that person, and that is definitely a parallel that exists with the banking sector.
I mean, we need legislators and regulators to think about how these companies are improving society, and I think it’s quite clear they’ve been doing the opposite for some time on so many different levels, not least when it comes to undermining democracy.
But still, you’re not getting that forceful language from regulators to change this. The regulatory proposals like the DMA are a step, they say, but really that’s all they are, and I think the DSA is probably more limited in what it can do than the DMA. But there is a huge cultural shift that needs to happen, and it’s admittedly difficult because these are entrenched monopolies at this point and it’s hard to change that.
To change that, you have to disrupt business models, and that will take a certain amount of strong leadership and willingness to push against the grain of the usual argument that, you know, innovation is being stymied by regulation. I mean, none of this is proven, and there are so many incidents that have happened as a result of big tech. And then of course you’ve got AI, and AI adds a whole new component, because they’ve already started reinforcing their dominant positions by forging partnerships which are basically acquisitions, taking over all the staff of the company that they’re going into partnership with, and I know there are investigations into this.
But yeah, does it bode well for the future? I think it’s hard to say that it does at the moment, given how quickly generative AI seems to have become something that’s again completely been taken over by big tech.
00:13:23 Domen Savič / Citizen D
Speaking about big tech, and you’ve mentioned GDPR at the beginning, another interesting party, so to say, in this fight is the end user, right? The GDPR put complete control over our own data into our own hands, and, you know, that proved a bit of a challenge for people not used to being in the driving seat. On the other side, you have these big tech giants, the firms you already mentioned that are basically a monopoly, if we’re lucky… So how do you see the role of the individual user moving forward?
Should we as activists try to move away from this logic that we have to empower the people and everything else will follow, and focus more on democratic institutions and on the systemic players?
00:14:29 Mark Dempsey / Article 19
Yeah, I think we absolutely want to focus more on the systemic players, because, has GDPR ever worked? Have more people understood the value of their personal data? I mean, that’s really the question, right? And I’m not sure that they have. There’s the whole cookie fatigue, there’s the determined services, there’s the ability for big tech to hoover up data in a transaction in which no money is transferred.
They’ve managed to yet again control that sort of transaction, and I think the question is always, how do you get people to value their personal data, what needs to be changed to have a simpler choice?
I mean, cookie fatigue; we all know that a very small proportion of people actually go through each of the cookie options and then decide to reject them all. As digital activists, we’re still very much a minority when it comes to valuing personal data, so yeah, there need to be systemic changes made beyond the idiosyncrasies of GDPR.
I mean I could flip the question back to you, right? I know you are interviewing me for this podcast, but I’d be kind of curious to hear what you think on that as well.
00:15:59 Domen Savič / Citizen D
I mean… So my first campaign was back in, what was it, 2012? I think we organized the first anti-ACTA protest in the country, and back then the field was very, you know, diluted; you had different lawyers and, let’s say, tech engineers working on this subject, and they came together saying, you know, well, this ACTA thing is really troublesome and we have to pull our weight together.
But then afterwards they sort of drifted off into their own, you know, spheres, and this was, to me, a signal that, OK, we need more focused pressure on these issues, because putting together a coalition just to have it fall apart after the fact is not very effective, right?
You can’t do that every time. And then from 2015 to 2018, yeah, exactly up to the GDPR, there was this, I’m going to call it the golden era for activism, where you had small individual fights popping up and you could, you know, hammer them down individually on your own by motivating the crowd every time something came along.
And moving forward, you said it with the AI and everything, I don’t see NGOs or activists being the right way to fight, because there’s just too much of everything, it’s happening constantly.
It’s on all the time, and at the same time, you know, you don’t have the time, the energy, the funds to brief the people in real time about these issues and encourage them to go up against big tech or the bad players.
And this is something that was very memorable to me when we met in Brussels recently and you spoke about digital fatigue, right, and I love the term, because I am, you know, in the process of asking that very same question, whether this is just untenable moving forward.
Like, are we just, you know, destroying our health and whatever, because there’s just too much of everything, and at the same time there’s a very asynchronous feeling to the field, as you said before with the trilogue and everything.
You know, it all goes someplace you don’t want it to go right and you have no influence over the…
00:19:04 Mark Dempsey / Article 19
Yeah… And I mean, you’re absolutely right. Digital fatigue is… things have changed so quickly, and the results, in terms of the wins civil society is looking for, are relatively minimal, so of course there’s a level of cynicism, skepticism that creeps into the conversation, which brings up the question of how you mobilize civil society to be as effective as possible. Are we choosing our battles wrong? Are we choosing too many battles?
I think we must fight on. There have been successes, and there’s definitely a growing awareness of what civil society can bring to the table, at least within this EU bubble, but it is a challenge to bring about a culture where fundamental rights and everything they encompass become a serious part of the conversation. And if you look at the DSA, we have been pushing the Commission to create a more formal structure or a forum for CSOs to have a regular dialogue with the Commission, and we’ve had one or two workshops, but nothing has been formalized yet.
I think they realize that, one, they don’t really have all the capacity that they need and that the CSO community, along with the academic community and all the various researchers attached to it, bring a huge amount of expertise, and I think with the DSA, a lot of this hinges on the provision which allows for data access.
If that process happens properly, which is obviously not at all guaranteed, because there will be ways in which Facebook and Google use this process as a means of further controlling access to data. Yes, they give access to data, but it will be a long, laborious process until you get to the point where you can access it. And I think this is why the Commission really has to come in quite strongly and make sure that this access happens relatively seamlessly and to a wide number of people, so the right research can be done.
So I think there are elements of hope there which should make our job easier, but ultimately, as long as they’re considering their bottom line, the platforms will be doing their best to make sure that that isn’t the case. But there’s the twin goal of making sure that people beyond our community understand what their fundamental rights are when it comes to the platforms.
Then there’s bringing in a culture where digital services reflect fundamental rights, how those rights need to be respected and how the platforms can do that as quasi-regulatory gatekeepers.
But I mean, I think the digital fatigue also comes purely from the fact that you have all this change happening extremely quickly on the private sector side, along with this huge number of legislative proposals which are very dense, and no one still knows how they work with each other.
So that’s where you’ve got to take a step back, look at it holistically and understand the consequences of each piece of legislation. But then of course there’s all the case law that’s going to naturally follow, and that case law will be very important.
But again, that’s a slow process as well.
00:22:42 Domen Savič / Citizen D
True, yeah… I mean, I wasn’t trying to sound defeatist, I just realized very early on that in this area, in digital rights, you always have to question yourself from time to time, just to open up to new ideas or new venues or new approaches.
00:23:04 Mark Dempsey / Article 19
I keep referring to the EU public, and the EU is really a bubble; there are not nearly enough voices from the global South, for example, being brought into the conversations, and that’s a whole different discussion.
But as we both know, there is a huge amount of data exploitation happening when it comes to the building of data centers in Kenya, for example, and a lack of frameworks to properly control the power the large platforms wield in local economies such as Kenya’s or those of other African countries; there’s basically no governance framework there whatsoever, and that doesn’t get discussed enough.
And then of course there are the far-reaching consequences of the EU’s digital proposals, the so-called Brussels effect; there are negative externalities, which the Commission is presumably becoming more aware of as more players from the global South get brought into the conversation. And I think with Article 19, the European work is a very, very small piece, because ultimately it’s a global organization which is working on protecting journalists in Mexico, Brazil, where arguably its work is more impactful, because there’s a much greater need to do something now because of the actions of governments, which tend to impose themselves on people in negative ways, which largely isn’t the case in the EU, bar a few countries.
00:24:53 Domen Savič / Citizen D
Do you think, and this is going to sound weird, that there’s a good aspect to areas, to countries that do not have formal regulation in place? Do you think activists can do more in the jungle than in the urban centers where everything is written down and stamped and approved, and where you have all these institutions that are supposedly there to help you, but in practice…
00:25:29 Mark Dempsey / Article 19
Yeah, that’s an interesting question, what, I suppose, they can learn from the failures of… I mean, do they have more opportunities to change things quicker? Possibly. Possibly. I mean, I’d have to think of an example.
00:25:56 Domen Savič / Citizen D
Yeah, I mean, just looking at everything that’s happened in the last ten years, I remember a couple of instances of working inside the EU, or working to change EU proposals, where you often realized how long the process is: you have a very narrow window for impacting the legal frameworks or the regulatory process, and then, you know, two years go by, three years go by, this law comes into force, and then everybody’s wondering, yo, what’s this?
And then you go well, remember, like, three years back when you said it was nothing and we said it was a big thing and you should act on it then. Well, this is it, right?
So I’m thinking the whole systemic, legal length of the process sort of, you know, helps the bad guys have the upper hand, while activists are battling day to day and then realizing that, OK, now we have two years of nothing before people realize what’s going on in the first place, right?
00:27:26 Mark Dempsey / Article 19
Yeah. I mean, are you speaking to the sort of speed at which things change? And then by the time it becomes law, it’s more or less, I wouldn’t say outdated, but less relevant than it might be?
00:27:36 Domen Savič / Citizen D
So there’s that, and there’s also the fact that the whole legal or regulatory framework process takes so much time that people lose attention in between, or stop following the issue, and then realize after everything’s done that, whoops, this is something we should have paid attention to two years ago.
00:28:00 Mark Dempsey / Article 19
I mean, I think it’s very hard to have a fast regulatory or a fast legal process, right? Because whatever one says about democratic deficits, I do think the way the Parliament is structured, with the various committees and the rapporteurs and the champions, is actually quite a good process.
I mean, I think the level of detail that one is forced to go into to get a provision the way you want it to be is actually not bad. I think it’s just the trilogue process that can become very deflating, and again it sort of comes down to strong personalities who care about what we care about being in the position to push back against the Member States. And then of course there are the voting modalities; in Council it’s generally unanimity, and then there’s this horse trading that’s done. For example, when Poland was leaning towards autocracy, you had it supporting Hungary at different junctures, and of course when that starts happening it undermines the initial piece of legislation.
So I think there are real changes that need to be made in voting modalities and how the trilogues operate. But as I said before, it’s hard to envisage that happening.
00:29:30 Domen Savič / Citizen D
Before we wrap up, you’ve mentioned Poland and Hungary, and this perfectly segues into the last piece of the conversation. You’ve mentioned it before, the European Media Freedom Act.
How’s that going? In terms of, when will we, or can we already, see some movement in this area regarding the effectiveness, the usefulness of this proposal? Or is this something that is again ruined, so to say, by the trilogue, by exemptions, by doing everything and nothing at the same time?
00:30:14 Mark Dempsey / Article 19
I think the eventual outcome certainly wasn’t what we as civil society activists would have liked. I think for some it’s a bit late. It will be interesting to see how the Commission reacts to what’s happening in Slovakia, because arguably it’s already acting against the spirit of the regulation, which is on the statute book, so effectively they need to be acting with the letter of the law, or at least the spirit of the regulation, and they’re not.
So I think the Commission has to come down quite hard and fast to stop Slovakia going down the route of Hungary. I think it’s a piece of legislation, it was always going to be very difficult because you’re bringing together a huge number of actors and you’re regulating media in ways that you didn’t do before from a European level.
I think the jury’s out. I mean, again… you have this new European Board for Media Services, you’ve got a lot of responsibility on the national regulators, the Commission has to make sure that there’s a proper, effective, well-resourced secretariat in place to service the board, and my understanding is that none of this is in place yet at the European level. There’s still a lot of institution building to be done to make sure all of this is properly set up.
Honestly, it’s difficult for me to say how this is going to work out, but on a basic level it’s good that it’s there, that the law is now in place, and there are elements of the law, like the merger assessment test, to retain plurality and make sure that media freedom stays in place and that mergers don’t just happen without other economic avenues being sought, so that an independent media outlet can continue to exist independently.
I think it’s great that they’re there, but it’s going to be interesting to see how such a complicated regulation can be properly enforced, and again, the Commission has put itself in a position where it’s sort of in the eye of the storm.
If these regulations, which are largely there to protect democracy, particularly the Media Freedom Act, don’t work as intended, I think they’ll really end up undermining themselves in a big way. So again, with the impending elections, at least these legislative pieces are there, but if you’ve still got a right-leaning Parliament and no strong leadership in the Commission, they won’t be particularly effective.
00:33:08 Domen Savič / Citizen D
Yeah, this was actually going to be my wrap-up question to this debate, and it’s all very interesting. We’re coming into this pre-election cycle for the European Parliament, and I was wondering to hear your thoughts on how we should, let’s say as NGOs, but also maybe as journalists or as media, as public opinion shapers, frame the issue of digital rights.
What is the difference in talking about digital rights now compared to, let’s say, the last pre-election cycle in 2018, 2019?
So what should we change to sort of bring more impact to these topics and not just brush them away as “these are kids on TikTok topics”?
00:34:07 Mark Dempsey / Article 19
I feel quite strongly that over the last five years, digital has become an even stronger part of our daily lives. It’s everywhere. I mean, it’s remarkable how it’s intertwined in everything we do, and arguably it already was in 2019, but even more so now.
So for me, digital rights cannot be something that is apolitical; digital rights has to become explicitly political. And what do I mean by that?
I mean that if we look at what is happening in Gaza, if we look at the +972 media outlet’s article the other day about the use of AI when it comes to targeting civilians in Gaza, much of that AI was designed by platforms like Google.
There is a complete lack of accountability when it comes to the interaction of geopolitics and the platforms’ role in it. So I think that as digital rights activists, we have to start embracing the political implications of this and calling out the platforms for involving themselves in conflict, for example.
Because in conflict you’re taking sides, so these are not neutral actors and I think as digital rights activists, this is something we need to become much more involved in.
And yes, it’s of course more sensitive and of course it has to be more nuanced, but I think there’s a leadership role there to be taken by digital rights bodies and communities, at least in having that conversation with geopolitical actors.
I think that, for me, that’s the major gap that exists, and it was very much highlighted by what’s happening in Gaza.
00:36:07 Domen Savič / Citizen D
Yeah, that sounds perfect. Thank you, Mark, it was a pleasure, this was really interesting, lovely to talk to you. Best of luck going forward, and I hope to see you again sometime in person.
00:36:23 Mark Dempsey / Article 19
Thank you very much, it was a pleasure. I really enjoyed this. And yeah, there’s lots we could talk about even further, of course. But yeah, see you next in Brussels.
Podcast Citizen D gives you a reason for being a productive citizen. Citizen D features talks by experts in different fields focusing on the pressing topics in the field of information society and media. We can do it. Full steam ahead!
Alja Isakovič, a developer of software and user processes, has lately been focusing on the challenges of the ethical development of information technology and technology’s environmental footprint, emphasizing individual activation and organizing into groups to achieve socially relevant goals.
In the conversation we touch on the inclusion of women in the development of the information society and the problem of technological revolutions that eat their own children, and we also raise the problems of the building blocks of artificial intelligence, the lack of will and power of the individual, local activation, and the reasons for optimism in this field.
We also reflect on the future, the European elections and the political component of technological topics, which should land on the agenda of the new decision-makers.
The Državljan D podcast gives you a reason for spending your time productively in the role of a citizen. In conversations with experts in a given field, Državljan D informs and activates. We can do it. Full steam ahead!
Are human rights the last tool of the poor against financially wealthier elites, or are they a value system accepted by all participants in the social environment?
With Katarina Bervar Sternad, director of the Legal Center for the Protection of Human Rights and the Environment and one of the initiators of the Legal Network for the Protection of Democracy, we talked about the logic of human rights, the role of non-governmental organizations, the problems of working in favor of human rights, and the path to a bright future.
What is the human rights situation in Slovenia and how does it compare with other countries, how are non-governmental organizations positioned in Slovenia, and what is the political problem of human rights?
To start the year, we talk with Urška Henigman, a journalist on the educational, children’s and youth, and documentary-feature desk of the first channel of the national radio, about the media representation of sexual practices in Slovenia, the entrapment of sexuality between politics, media and society, and the editorial considerations involved in creating sexuality-related media content.
The conversation covers the history of the media representation of sexuality in Slovenia, the problem of foreign-funded activists turning sexuality into a topic of party politics, and content considerations and audience needs in this area.
What all counts as sexuality, how is sexuality-related media content made, how important is a sensitized journalist, and why is there still too little such content?