DIGETHIX 9: Pegasus NSO Spyware and the Business of Surveillance

Seth and Bernd Dürrwächter review the Pegasus NSO Spyware scandal. Bernd is principal at AnalyticDimensions.com, a consultancy for big data analytics & data science projects. Bernd has been advising the podcast from the beginning, using insights from his decades of industry experience.

The first half of the conversation goes over the details of the Pegasus scandal and other incidents that NSO has been involved with in the past. The second half turns toward the broader questions of cybersecurity and consent. The key questions for this episode are: if someone gets access to our phone or laptop, just how much reach does that person have into our life? What kinds of surveillance have been enabled by our technologies? What can we do about it?

Potentially Helpful Articles:

1. https://www.theverge.com/22589942/nso-group-pegasus-project-amnesty-investigation-journalists-activists-targeted
2. https://www.calcalistech.com/ctech/articles/0,7340,L-3912882,00.html
3. https://www.amnesty.org/en/latest/research/2021/07/forensic-methodology-report-how-to-catch-nso-groups-pegasus/
4. https://info.lookout.com/rs/051-ESQ-475/images/pegasus-exploits-technical-details.pdf
5. https://www.amnesty.org/en/latest/research/2019/10/morocco-human-rights-defenders-targeted-with-nso-groups-spyware/
6. https://citizenlab.ca/2016/08/million-dollar-dissident-iphone-zero-day-nso-group-uae/

 

Other Articles on Tracking:

1. https://www.nytimes.com/interactive/2019/12/19/opinion/location-tracking-cell-phone.html
2. https://www.theverge.com/2021/7/20/22586161/cell-phone-location-data-grindr-catholic-church-report-data-brokers

Contact us:

digethix.org
facebook.com/digethix
twitter.com/digethix
instagram.com/digethixfuture
EMAIL: digethix@mindandculture.org

 

Music: “Dreams” from Bensound.com

Seth Villegas 0:00
Welcome to the DIGETHIX podcast. My name is Seth Villegas. I’m a PhD candidate at Boston University working on the philosophical ethics of emerging and experimental technologies. Here on the podcast, we break down issues of ethics and technology from both technical and socio-political perspectives. Our goal is to help the next generation of technologists to build better systems and to avoid potential ethical pitfalls. On today’s episode, I will be reviewing the Pegasus NSO spyware scandal with my collaborator Bernd Dürrwächter. Bernd is a principal at AnalyticDimensions.com, a consultancy for big data analytics and data science projects.


Bernd has been advising the podcast from its inception, using insights from his decades of industry experience. We’ve compiled a list of articles and other resources that we referenced in this conversation. Please look them over and do your own research. A big thank you to Mitchell Clark and to Mitchell’s reporting at The Verge. In the first half of our conversation, we’ll talk specifically about the Pegasus NSO spyware scandal, about how it works and what other incidents have happened in the past. The second half of our conversation will turn towards broader issues of cybersecurity in general, and towards the problem of consent. The key questions for this episode are: if someone gets access to our phone or laptop, just how much reach does that person have into our life? What kinds of surveillance have been enabled by our technologies? What can we do about these things? Before we get started, I want to give a big thank you to Nicole Smith, who designed the logo art for this episode, and to Louise Salinas, who helps to coordinate DIGETHIX. The intro and outro track, “Dreams,” was composed by Benjamin Tissot through Bensound.com. If you like this episode and would like us to continue to do episodes like this one, please consider leaving a review or giving us feedback directly. You can always email us at digethix@mindandculture.org. You can also reach us on Facebook and Twitter @digethix, and on Instagram @digethixfuture. Now for a conversation on Pegasus, NSO spyware, and the business of surveillance.


So today, we’re going to be talking about Pegasus, particularly about this latest scandal, in which journalists were able to uncover a list of about 50,000 phone numbers, though I think it still needs to be verified how many of those were actually compromised phones. An Israeli organization called NSO Group was supplying spyware to governments, or at least that’s what they say. Those clients are able to use this really powerful spyware to carry out what’s called a zero-click attack. Basically, this means the spyware is able to go active without anyone actually clicking a link; all the target has to do is receive a message, which, if you think about it, is a really sophisticated kind of attack. As I’ve been looking through articles, the Israeli government actually considers this type of spyware to be a weapon, meaning this is military-grade spyware that’s been deployed mainly against journalists, activists, and other high-interest political figures. So for instance, in France, Macron was one of the people whose phone may have been compromised. In fact, most of the targets I was able to see, at least from this latest string of events, were focused mainly on people in Europe. However, it’s really important to note that NSO actually has a long history of these kinds of incidents and of exploiting vulnerabilities within phones, particularly iPhones, of basically trying to jailbreak the phone, for lack of a better term, or at least that’s how the Lookout report puts it. Once you jailbreak an iPhone, that basically means you can run all kinds of software for whatever purpose you want. And if the person doesn’t know, then basically all their personal information on their phone is no longer secure.
We’re going to be leaving a lot of links to articles, including about past incidents, so that people can of course do their own research; I think it’s really important to look into. There’s a Lookout report from 2016 that actually goes over the code, the technical aspects of how that attack happened. A similar kind of audit will probably need to come out for this latest array of attacks, because one of the actual consequences of the Lookout report was that NSO completely changed the ways in which they attacked, their vector of attack, because of course they don’t want people to know that they’re spying on them. That’s the whole point. So anyway, that’s just a little bit of background information, and we’ll take it from here. But Bernd, I’d just love to hear your initial reaction when you hear that something like this has happened.


Bernd Dürrwächter 5:03
The first thing that comes to mind: the organization spying on people is called NSO, and we have this thing in America called the NSA, as if somebody designed that. It’s like a Tom Clancy novel that tries to tell a real story with some of the actors abstracted, an O instead of an A. Well, to be fair to them, it’s just the initials of the founders, right? It goes along with their names. But it does remind me of 2001: A Space Odyssey, how the computer was called HAL, one letter offset in the alphabet from IBM, the big computer company. The other parallel is the NSA hack, the American NSA spying on Americans, which Edward Snowden uncovered. And this Israeli spyware was in the end also used against Israeli journalists, right? Something that was supposed to be foreign intelligence to defend the country of Israel was then turned against political enemies inside the country. The other thing I want to add, from a technical perspective, is this ability to send somebody a secret message. Because the thing was, not only can’t you defend yourself against such a message, they were not even visible; they didn’t show up in your inbox. Literally, the phone can receive a message without you ever seeing the message. And it’s debated whether those are backdoors by Google and Apple, right, that they’re not necessarily vulnerabilities. They use mechanisms like that to update the device, or to disable it remotely when your phone gets lost. So there are some superpowers, admin powers, that these companies have because they make the phones or run the service. In a way, they always have this power to control your phone. That’s why some people jailbreak it themselves, you know; they root their phones so they have total control over them. And that was an existing capability that just got abused.
So we have to be clear about that. It’s not just some evil hacker; the technical capability was there all along. Somebody else deemed it necessary to be able to control your phone, and then some bad actor used that for a different purpose. I think that’s important to point out. And what I found interesting, which I hadn’t heard before, is what you said about the Israeli intelligence service considering it a weapon, which is not unusual for cyber warfare. But there’s irony in the same place that invented the stuff acknowledging it’s a weapon. You’d figure they’d defend themselves: no, it’s not a weapon, or at least we cannot legally admit to that.


Seth Villegas 7:34
Yeah, I mean, there are definitely a lot of curious things about this particular event. So for instance, the CEO actually came out and gave an interview about this. One of the things he said to sort of defend himself was: look, we only give this technology to governments, right? Which actually doesn’t say anything about whether it’s been used fairly or not, first of all. Then he said: we can only give it to governments approved by the Israeli government, which, again, is a matter of political relationships, of allies and whatnot. And then he said: well, once we give them the software, we’re not responsible for what they do. If we knew that they were targeting activists and journalists and whatnot, we obviously wouldn’t let them use it, because there are human rights violations implicated in using the technology in this way. It was just really interesting to hear this kind of: oh, well, you know, we gave them a gun, but I didn’t realize they were going to use it on whoever they wanted. It’s just kind of a bizarre situation, and I think people are reacting really viscerally to it. But one of the reasons why I started with the history of this particular company is because if they have a method of hacking into people’s phones that gets patched out or fixed, say there’s a certain backdoor that they’re able to exploit, then they find another way in, right? That’s their job. They develop technology to do this kind of thing. And something you’ve actually mentioned in the past, Bernd, is the kind of arms race that can happen when people are deliberately trying to do these things. Again, this is a military-grade organization that’s trying to tamper with consumer products, and that doesn’t even seem like a fair fight.


Bernd Dürrwächter 9:25
It’s interesting, what you said about the company that created the weapon: they acknowledge the human rights violations, and yet the mentality is that what the end user does with it is not their responsibility. It’s the classical diffusion of responsibility, right? Where I, as the manufacturer of a system that can be used for dual purposes, say it’s not my responsibility to keep it from bad actors. Which is actually bogus, because we have international agreements on arms trade, right? Very tight control over nuclear weapons, weapons of mass destruction, and even regular weapons. I mean, those agreements are being undermined by criminals and organized crime and some rogue governments, but we already have a precedent: humans in general agree that weapons trade should be restricted and accounted for. So to say, oh, it’s not our responsibility, while at the same time acknowledging that it’s a weapon... I refer to that as externalizing the cost. I get all the benefit from selling the technology, but whatever the societal cost to freedom or political stability, that’s a cost society carries; I don’t have a stake in that. That’s the unfairness of the whole system. The other thing I want to add, which is interesting: during the Cold War, the Americans and the Russians built these sophisticated spy agencies, right? Once the Soviet economy fell apart and wasn’t considered that much of an enemy anymore, what ended up happening in the 90s? The Americans used all these spy resources to spy on Western European countries, even though they were allies, right? The United States engaged in industrial espionage, just stealing trade secrets from French and German companies. So I find it curious that France was one of the targets here, given the pitch: we help governments to spy on their enemies.
And in a way, I wasn’t really aware that Israel and France are at war with each other. How does that always go: oh, we help you, your government, without the other government knowing, to attack another government?


Seth Villegas 11:22
Actually, one of the things I want to bring up, and you might know more about this than I do: let me see... Amnesty International found about 212 infected DNS servers, with Germany having by far the most, which was really interesting, to see how focused on Europe most of those infected servers were. And again, it just speaks to a kind of suspicion. I don’t know what kind of politics is going on in the background, but I would think these countries are at peace in some sense, right? There’s not an active conflict going on. We’re not talking about, say, Israel deploying spyware against its explicit enemies, though they have, right? For instance, a country like Morocco comes to mind; they were specifically targeting activists in Morocco as well. That was one of the incidents that happened a few years ago. And so the motivations for this are... well, from what the CEO said, I’m not really sure. But one of the things I do want to point out: we’ve talked in the past about how there’s not a lot we can do about, say, bad actors, people who just want to do whatever they want, who want to try and hack into things. But I think it’s different when you have institutional, national-level support for spyware of this kind that’s immensely powerful and immensely targeted. For instance, NSO has executed attacks that cost hundreds of thousands of dollars, like $500,000, upwards of $800,000, to hack into one specific person’s phone without their knowledge. And that’s kind of scary. I don’t really know what to think about that. But I think it speaks to the scale of these things, as opposed to, you know, the person in their basement who’s trying to hack into another person’s phone. This is a completely different level of attack.
And I just want to really make that clear: that’s why it’s such a big issue and why people are making a big deal about it.


Bernd Dürrwächter 13:37
This also sets it in context: here we are sitting in the United States, critically examining and pointing fingers at other countries’ practices, when the US is kind of a leader in those things. What comes to mind is the Stuxnet virus a while back, which the NSA, or one of the American intelligence agencies, was experimenting with. Remember that technology? It was infecting Windows computers through some vulnerability, embedding itself by the time the computer had booted up, so there was no way for antivirus software to recognize it. And it was extremely contagious, like computer viruses that replicate themselves the way biological viruses do. It was built as a hacking mechanism. So the CIA or the NSA had this lab where they tried out all this malware, and it got out of the lab, right? The classical sci-fi movie where the biological virus gets out, except it was a computer virus. And in Iran, they have centrifuges with which they enrich uranium, which can eventually be used to make nuclear weapons, and quite conveniently, that Stuxnet virus was used to destroy the centrifuges, right? We think it was deployed by Israel against Iran, but you could trace the computer virus back to the Americans. So they already set the precedent, handing what they knew was a very lethal cyber warfare weapon to someone else. It ended up in Israel’s hands, which is actually not surprising, because the US has been supplying weapons to Israel; they’re allies, right? So in a way, instead of the US attacking Iran directly, they used a secondary state. And with Russia having, you know, clearly lost the Cold War, I keep thinking: whenever some other actor harms the US or the Western allies...
...somewhere at the end of it, it’s either Russia or China, these systemically different ideologies, and it’s a proxy war. Russia themselves wouldn’t do it; they would get trade embargoes, and it would be a big political mess. But if they go through other states acting on their behalf, harming their enemies... this is always something to keep in mind. Is it really Israel versus France? Or, somewhere in the back, is it the old communist system versus the Western European system? It’s never really what it seems to be on the surface. There’s a longer history and more strategic adversaries at play, and some of the players are just pawns. When you have this country fighting that country in Europe, who’s really behind that, possibly?


Seth Villegas 16:00
Yeah, and I think one of the things to point out here is that if we’re talking about government clients, one of the strongest national ties that Israel has is to the US. So I don’t want to imagine that wouldn’t be the case here. And also, unfortunately, because of the way media is concentrated, there are lots of incentives, let’s say, not to reveal that full chain, the military secrets of the US. Those are just things to consider. But I’m glad this story is out there so that people understand what’s possible, because we’ve talked about this offline in the past: there’s really only so much you can do if someone wants to tap into what you’re doing and see it. And it’s really scary, right? Even thinking back to when the FBI demanded backdoors to iPhones for cases they were looking at, and from what you’re saying, it’s not that Apple couldn’t give them that power, or that they don’t have that power; there’s a kind of brand protection going on there as well. But someone can take advantage of that. Again, the Lookout report in 2016 talked about how, if there’s a certain process you can interrupt in some way and then get it to run arbitrary code of your choice, you can run this whole process of getting more and more malware onto a system. Sophisticated attacks like that are going to be really hard to deal with, for almost everyone. And if you’re in a politically sensitive position, you have that much more to worry about, especially because we depend on our technology so much to connect with each other.


Bernd Dürrwächter 17:48
This is a really difficult space to navigate. Because on the one hand, you’re fighting the naivety of the average consumer, who’s not aware of just how much is going on in the background. And on the other hand, you risk being stamped a conspiracy theorist, because there are plenty of very unreasonable conspiracy theories out there. But this whole complex does exist; it’s well documented. The average person doesn’t understand it because it’s too complex, but it’s definitely not just conspiracy theory; you have things like Google’s early ties to the CIA, and that’s well established, right? And this all came after the September 11 attacks; at that point, the threshold was very low for sacrificing privacy for national security. Or this whole fight between the FBI and Apple: Apple may very well have backdoors, but they muddled the public conversation so much, like you said, so that Apple doesn’t take brand damage and the government doesn’t look like it’s overreaching. So they diffuse the whole thing with this public fight: we want the backdoor; no, we’re not going to give it to you. Now the public believes there’s no backdoor, when really, technologists have established mechanisms like what I just described, the invisible message. Those are mechanisms that companies build in and that get used by the government. The public debate isn’t really truthful. That’s why investigative journalism is so important and valuable, and this is why they silence the media and good journalists, why they go after them: because they’re uncovering a lot of this. And my personal belief is that the more people know the truth in a democratic society, the more this will be taken care of. With transparency, people can vote, they can choose not to buy certain products, they can go into the streets and protest. So a lot of effort is being put into diffusing the issue or putting up a smokescreen, so that people can’t really form an informed political opinion.


Seth Villegas 19:28
Exactly. And the fact that zero-click attacks like this one are possible, that those attacks can send you a message and then make sure you don’t see it, so you don’t even know that it’s happened... it’s sophisticated on a level that I think is hard for most people to fathom. And it’s funny, because there are actually lots of guides going around now, like: okay, here’s how to check to make sure that your phone doesn’t have Pegasus. We’re talking about 50,000 targets, which is obviously a lot, but in comparison to everyone who has a smartphone, it’s not that big of a percentage. So getting people worried about this specific sort of thing isn’t what we’re trying to do, but rather to raise awareness that these sorts of things can happen, and that there are both exploits in the ways that things run and also these admin superpowers you were talking about, which the company always has, regardless of what you think about it. And I think that raises issues with, say, rising political tensions and efforts to censor or track other people, people who are on this side or that side of the government, depending on who’s in charge. That’s where we would start to run into issues where people are unduly targeted by these systems. And honestly, we have no idea how much this is already happening. But it’s something we have to keep in mind: the more sophisticated these technologies become, and the more we depend on them, the more we’re also making ourselves vulnerable to those attacks, or adding extra vectors for those things to happen. So it’s not just as simple as embracing technology in all its glory; there’s this other thing that comes with it.
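To make the "check your phone for Pegasus" idea concrete: the public tooling for this (for example, Amnesty International's Mobile Verification Toolkit, which accompanies their forensic report linked above) essentially compares artifacts extracted from a device backup against a published list of indicators of compromise (IOCs). Here is a minimal sketch of that matching step; the domain names are invented placeholders, not real indicators.

```python
# Sketch of indicator-of-compromise (IOC) matching, the core idea behind
# tools like Amnesty International's Mobile Verification Toolkit: compare
# domains found in a phone backup's artifacts against a published IOC list.
# All domain names below are invented placeholders, not real Pegasus
# indicators.

published_iocs = {
    "example-cdn-update.net",   # placeholder indicator
    "push-delivery-sync.info",  # placeholder indicator
}

def find_ioc_matches(observed_domains):
    """Return the observed domains that appear in the published IOC set."""
    return sorted(set(observed_domains) & published_iocs)

# Domains pulled from a hypothetical backup's network cache:
observed = ["icloud.com", "push-delivery-sync.info", "example.org"]
matches = find_ioc_matches(observed)
print(matches)  # a non-empty list means the backup warrants closer forensics
```

A match is not proof of infection on its own, which is why the real tooling cross-checks many artifact types (processes, SMS links, Safari history) and why the 50,000-number list still needed per-device verification.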


Bernd Dürrwächter 21:34
What I find remarkable is that we had exactly the same scenario when the US NSA was spying. I remember in Germany, the head of state, Angela Merkel, she’s the equivalent of a US president, though for Germany it’s not called president, and her phone was being spied on by the NSA. That was around 2010, ten years ago, and what a big shock it was, because the US was supposed to be an ally, right? They never really considered that. I mean, these phones of heads of state, they’re encrypted, they have all kinds of security measures, right? Germany has its own secret services for national security and so on. So I’m kind of surprised: how was it possible for Macron’s phone to be hacked like that? Didn’t they learn anything from ten years prior? Or is it really only hardened against the adversaries they assumed would hack it? It’s usually supposed to be a total security package. Does France not have spy agencies who know how to secure the phones of their own heads of state? There’s a certain level of incompetence there. These things are known; cyber warfare is the main thing now, because we’ve agreed we can’t nuke each other anymore, we can’t just torture and kill soldiers; it’s the new domain. The conflicts just move to a different level: they become economic, they become information warfare. In a way, as Steven Pinker documented, it’s actually a safer version of how conflict works; not so many people die anymore. As shocking as it is for freedom of thought and autonomy of decision-making, in the bigger picture the wars are not fought on the battlefield anymore; soldiers don’t die, we don’t blow up the whole planet, but the conflict is carried out in the economic and information sphere, which doesn’t make it any better.
But I’m just surprised at governments like France or Germany, how they haven’t recognized that information warfare... or even the US in 2016, that they didn’t really harden against it when they knew Russia was interfering with information warfare. Yes, you have the bad actors, but at the same time, it’s almost reckless to know there are bad actors and not do anything about it. It’s almost like reckless endangerment: even if your intention wasn’t to do any harm, you knew there was a bad actor and didn’t defend against it.


Seth Villegas 23:40
I mean, I feel like it’s maybe going a bit far to say they haven’t tried to prepare for these kinds of attacks. I’m sure that they have, right? And partially it is a matter of money and time in a lot of cases, where if someone tries hard enough to get into one specific person’s phone, maybe there’s a vulnerability. So for instance, again going back to 2016: Lookout put out this report that basically went step by step through how the spyware worked, and I would advise anyone who’s interested to go and read it; we’ll have a link in the description. Basically, it describes a very convoluted process of exploiting the way the iPhone marked temporary memory for trash, the way it allocates and reallocates resources, and then injecting code line by line. There were just certain processes that were vulnerable. And I think one of the reasons Apple has been under criticism is the question of how much more they could have done. One of the things Apple actually said was: well, it’s our job to protect everyone; I don’t know how much we can protect one specific person from this kind of targeted attack. And that’s why I’ve been emphasizing this point: it really is about trying to get into one person’s phone, one person’s data. That’s why it’s really espionage, right? It’s something that’s very aggressive, very direct.
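The memory-reuse trick described here, sketched abstractly rather than taken from the actual Lookout analysis, boils down to this: storage that has been "freed" gets handed back out, and a stale reference kept from before the free ends up reading whatever the next allocation wrote there. A toy allocator can illustrate the class of bug (use-after-free); the class names and values below are invented for illustration.

```python
# Toy simulation of why use-after-free matters. This is an illustrative
# model, not the actual iPhone exploit: a tiny "allocator" hands out fixed
# slots, and a stale reference kept after free() ends up reading whatever
# the next allocation wrote into the reused slot.

class ToyAllocator:
    def __init__(self, slots=4):
        self.memory = [None] * slots       # backing storage
        self.free_slots = list(range(slots))

    def alloc(self, value):
        slot = self.free_slots.pop(0)      # hand out the first free slot
        self.memory[slot] = value
        return slot                        # "pointer" = slot index

    def free(self, slot):
        self.free_slots.insert(0, slot)    # freed slot gets reused first

heap = ToyAllocator()
victim = heap.alloc("session token")       # legitimate object
heap.free(victim)                          # freed, but the index is kept around
attacker = heap.alloc("attacker payload")  # same slot is handed back out

# The stale "pointer" now reads attacker-controlled data:
print(heap.memory[victim])  # prints "attacker payload"
```

Real allocators behave similarly with same-sized chunks, which is why an attacker who controls the timing and sizes of allocations can influence what a dangling pointer ends up pointing at, and from there work toward running arbitrary code.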


Bernd Dürrwächter 25:31
I hear that argument; it can be hard. You know, no matter how much you harden your house against burglars, if somebody is determined enough and has enough resources, they will break into the house, even if they have to nuke it or something like that. On the other hand, I worked in computer security, and when that’s your main job, you can’t say there’s nothing we can do about it. If you work at the German secret service, or the French secret service, as a cyber expert, you’re not doing your job if you say, oh well, there’s nothing I can do about it. Because a lot of these vulnerabilities, if you go down to the nerd level and follow people like Bruce Schneier, right, there are a lot of security experts who actually point out these vulnerabilities, even if they don’t know that particular instance or have access to the source code. And there’s this whole industry in computer security called penetration testing; people get paid for hacking, these white hat hackers, right? They get paid to try to hack into systems; there are even contests. Sometimes professional black hat hackers become white hat hackers, because, say, Bank of America says: we’ll pay you a $10,000 bounty for every breach you find; we’d rather pay a white hat hacker than have a black hat hacker actually break into our systems. So there are methods to address this. To me, it still comes across as negligence of duty: if you’re the secret service of a country and your job is to secure the president’s phone, you’re clearly not doing your job as well as you could, because you have the security experts who say, you should be doing this, you should be doing that.
And then if you look under the covers, they have high turnover, they didn’t get the budget allocated, or some mid-level political issues just kept them from actually focusing on it. I’ve heard of a lot of companies that got hacked where it was the same. You can compare it to the Space Shuttle, where the experts warned the company: you shouldn’t launch the Shuttle in icy conditions, right? I’m sure that within the French government, or the Indian government, somebody pointed this out on a technical level, and those people didn’t get the attention of a higher-level decision maker who could actually do something about it. At least that’s my experience, and I wonder to what extent that played out here.


Seth Villegas 27:23
Yeah, and I think one of the other things that you’re speaking to, and that you’ve done a really good job of explaining, is just how different conflict is now. There may be a sense in which we don’t really understand the implications of what it means to have a primarily cyber-warfare-driven mode of conflict, especially international conflict with national-level actors and military-grade technology. Encryption, as you said before, was a military-grade technology at one point. And I definitely do think it is embarrassing if a head of state ends up getting hacked in this way, for sure.


Bernd Dürrwächter 28:10
I want to make clear the reason why I care so much about responsibility: one of the pillars of ethics is that if you know something is dangerous and you’re a professional, it’s your duty to deploy it safely, so it cannot be abused or blow up, right? One of the precedent cases, one of the case studies we use in our training, is Equifax. Equifax sits on this huge data trove that they gathered from people without their consent, and they use it for credit scores, right? That same data can be used for identity theft, to impersonate you and take out credit, which, ironically, Equifax will verify against their own database. So they create this vulnerability by having all this data about people, and then there was a data breach; some hardcore commercial hackers broke into their systems. And Equifax said: oh, bummer, a big accident. No! You should have known just how sensitive this information is. Worse yet, they turned around and sold credit record monitoring to the very customers who were affected by the breach they caused. You could project that onto government: the government creates these weapons, and then it’s, oh, good thing we have a secret service and an NSA that can defend against them. It’s almost a self-fulfilling prophecy: we cause the problem and then sell the solution. Like Kaspersky Labs, right? Whether or not there was something shady there, whether they were working with the actual hackers, and then Kaspersky Labs, a security company, helps you fix the ransomware. That’s where it becomes unethical: if you say there’s nothing you can do about it, when really it’s your moral duty to do the best you can. There’s always the remaining five percent where nobody can do anything about it, but to me, it’s an ethical lapse when you sit at a high level of responsibility
and you don’t put in the diligence to prevent the accidents. Or there’s a bad actor, and we agreed that criminals are by definition unethical; that’s almost the definition of crime. But the people who deal with the criminals need to do their duty; otherwise, they become enablers.


Seth Villegas 30:11
I have a couple of thoughts on this. First off, I do think there’s a difference between, say, a company like Equifax, which really does specialize in data analysis and these other sorts of things — they really should understand why the system is vulnerable and what sorts of things can come out of it, and if changing the way things work eats into their bottom line, they won’t do it. But I also know that in some other industries — and unfortunately, one of the ones that really comes to mind is the data infrastructure of hospitals — they’ve had to rebuild their record systems piecemeal, because they’re not in that business, right? And so that leads to these weird, convoluted systems that probably aren’t as secure as they need to be. And it’s harder to move institutions like that to completely redesign those things, because they also depend on that infrastructure for patient records that they’re using all the time. So I would hope something like this would be a big reminder of how important it is to keep those things secure. And this is the last thing I’ll say on this point — it’s something you mentioned earlier, especially when you were talking about admin superpowers: it’s not just a matter of exploiting a capability; it’s a matter of things the phone is already doing, right? It is already tracking your location when you probably don’t want it to; it is already reading your messages when you don’t want it to. And so the fact that someone spying on your phone can take advantage of capabilities that are already built into the system is also a problem, right? It’s not that someone has to go in and figure out how to do all those things from scratch. They’re really tapping into tracking capacities that are already there.
They’re just redeploying them for their own purposes.


Bernd Dürrwächter 32:17
One thing in what you just said that I want to anchor on — you mentioned health care, right? They’re very vulnerable, because they focus on healing people, not managing IT systems, even though there’s IT involved. But there’s actually a positive note here. We’ve been criticizing and pointing out what the dystopian world looks like, but there’s a positive development among the hackers — the criminal, economically motivated, organized-crime hackers who profit from ransomware. There is a subset of these hackers who have developed a code of ethics. Like back in the mafia — they also had a code of ethics, things like you don’t kill children — and they would commit to it. These hackers say: we will only extort banks, only organizations, but we will not attack hospitals, where there’s a risk of people dying when they don’t have a record of allergies or medications. And then there’s another set who say: we don’t care, we just want money, we don’t care if people die — or, with one Russian organization, we hate Americans anyway, so any American is fair game. But I found it curious that you had a subset of hackers who became, in a sense, ethical hackers: we will only extort money from people who have too much. It’s almost like they have an issue with capitalism — we don’t want to kill people or destroy property, we just extract money from those who already have too much. It’s this little silver lining. If we could just turn more hackers into ethical hackers who do no harm to people, at least they would only do harm to those who can afford it.


Seth Villegas 33:42
Well, this is one of the really bizarre things, I think, about the range of ransomware attacks — and that’s probably something we’ll need to talk about as well. I know that in at least a few of those cases, the hackers would go to a company and demand a bunch of money, and the company would just be like, hey, this is way too much. And then the hackers would be like, oh, sorry, we didn’t mean to ask for that much. We just want the money. We don’t want to ruin your business.


Bernd Dürrwächter 34:14
Well, that’s just business. It’s called negotiation, right? Because the hacker isn’t served if the company can’t afford it at all. It goes back to the mafia asking for protection money, right? They go into a restaurant and ask for protection. If they asked for too much money, they’d have no future business to extort anymore. So it’s much better to have a smaller monthly installment than one lump sum. It makes good business sense if you think about it. Sustainable hacking, right? Economically sustainable.


Seth Villegas 34:48
So I think a lot of people would probably object to this notion of hacker ethics, because, again, it’s very self-driven — it’s people who are already operating on their own. I mean, it is important to note that people do run penetration tests specifically in the service of particular businesses, governments, and other organizations. But I still want to make the connection here to political actors, because I don’t think we have a shared value system across all of society — we’ve got the right wing, the left wing, the libertarians and anarchists, and so on.


Bernd Dürrwächter 35:23
Here’s an example. In Germany in the 1970s, there was this left-wing terrorist group that kidnapped politicians and C-level executives at companies. At first, people thought they were just randomly murdering people. It turns out they picked out all the old Nazis — people who had somehow survived the Nuremberg trials, were never uncovered, and were now in high-ranking positions. This group somehow figured out who was an old Nazi who had committed war crimes and just never got discovered, and they picked them out. And so, all of a sudden, there was a large part of the German population who harbored them — gave them support, gave them safe houses — because they morally agreed: these Nazis shouldn’t be running modern German companies; that’s just unethical. So it’s almost a form of vigilante justice. And I wonder to what extent these ethical hackers could actually become more socially acceptable if they say: the banks are way too rich, their practices are bad, so we’re going to extort money out of them. And then anybody who is a hardcore anti-capitalist will say, yeah, go for it, that’s a moral hack, right? Because it represents our values. So at that point, is vigilantism unethical by some people’s standards? Well, capitalism is unethical; therefore, if you take money away from billionaires, that’s, by our value system, a noble goal.


Seth Villegas 36:32
That’s actually one of the most common tropes in American movies. Especially if you’re going to have a protagonist, one of the reasons that person can be admirable is that a lot of those scenarios are depicted as some sort of vigilante justice, whatever form it takes.


Bernd Dürrwächter 36:50
Well, it’s the age-old Robin Hood story, right? The original Robin Hood from the 1500s. It’s okay to steal if you take it from the rich.


Seth Villegas 36:58
Right. And so, to roll back on this a little bit, because I think the whole quagmire of ethical hacking is going to be very troublesome for lots of people. But something we can make a concrete connection to is the way computer science students today take on the moniker of “hacker,” right? This is something you see all over the place — in coding competitions, basically everywhere. I mean, you’re not a hacker in the sense of doing things outside the confines of society, or creating dangerous viruses — though sometimes people do do that; I don’t think they’re doing it in their classes — but there’s something of that ethos that exists within the tech world. And I know you’re probably more involved with this than I am. So what do you see? How has the word “hacker” changed, especially in terms of who it’s applied to? Did you see anything consistent from when you got into the industry to now? And I should clarify that I’m not suggesting you’re involved in hacking yourself.


Bernd Dürrwächter 38:17
I actually take issue with that word. Facebook kind of made it mainstream — their headquarters is literally on One Hacker Way; that’s what they named it when they developed the campus. Zuckerberg was so excited: I’m a hacker. So yeah, the word “hacker” is overloaded, and that happens a lot in American English, where words get overloaded with different meanings. It’s not the original activity anymore; it’s considered a whole different activity. That meaning of hacking is kind of an expression of rebellion against the old system, and it largely came of age with the millennial technologists. There was this schism: anybody before the millennials worked in a very top-down, approval-based, authority-driven culture that came out of the ’50s and ’60s — the very military-driven space program and so on. And the rebellion of the millennial technologists — they were basically the third generation of computer technologists — was: we’re not going to let the old people tell us how to do stuff. So that’s where that sense of hacking comes from: basically doing something that’s not approved, but not necessarily in criminal ways. Just disrespect authority, disrespect the older generation. And the ethical issue with that, no matter which way you put it — and I’ve heard people of that generation and younger say this themselves — is that there’s a certain self-righteousness: what old people do is not okay, but what we do is okay. Which, of course, every generation has said. The people who are now old used to say that too when they were younger: we’re going to change society for the better. And then they get older and become corrupted, right? They get busy defending the status quo. And it will be funny to watch this with these younger hackers — Zuckerberg is now almost 40, right?
See what happens when he’s 60 and he’s busy entrenching his own position, and a new generation comes up with ways to hack his ways — or look at TikTok; there’s already more and more of this on social media, where the young people aren’t actually on Facebook, they’re on anti-Facebook kinds of social media. So to that extent, hacking is still about going against the rules, but not in what’s commonly considered a criminal way. And the ethical aspect — who’s to define that? It’s societal consensus. If you consult other people, fine; if you decide it all by yourself, you’re just another totalitarian dictator, no matter how much you believe you’re doing good.


Seth Villegas 40:27
Exactly, I think you put it really well. To give some extra context: the people who were considered hackers in, say, the 1980s were the people who developed worms for the first time, the people who developed viruses — very vindictive, untested software in a lot of cases. There was one big case that came out from, I think, a PhD student at MIT, whose worm hit a lot of the academic servers and made everything run really slowly. There’s the history of things like Trojan horses, or the people in India who would load malware onto bootleg CDs, because they felt the people buying these things were getting their just deserts — they were already doing something bad, and this was a way to punish them through the technology. I think that element of self-righteousness is still there. But the way most people use the word today, unless they’re actual hackers operating outside the law, is mainly anti-establishment in its orientation: we’re going to make new systems, we’re going to do our own thing. I actually see that a lot in the crypto space, if I’m being honest — especially against the banking system: look, we’re going to do this whole thing, and it’s going to be way better. It’s really hard to escape that. As people think about the culture around technology, it’s something to keep in mind: those attitudes can be used to justify lots of different kinds of things. And that’s why I think it’s necessary to have these broader conversations about, well, what is it that we’re going to find acceptable, and what aren’t we going to find acceptable?


Bernd Dürrwächter 42:28
Now, another thing to unwind, just for clarification: a lot of this stuff isn’t really about the technology — people debate whether technology is enabling it or causing it. One example I use is the whole way Uber operated — how they broke all kinds of laws. Uber is transportation; it’s not a technology company, no matter how they pitched themselves when they got listed on the stock market. Their main business is transportation. It’s kind of like saying General Electric is a technology company because they use CAD software to design their engines — every company has used computers for the last 60 years, right? That doesn’t automatically make them a technology company. So Uber went into all these markets and broke the taxi rules. You had to get licensed to run a taxi, and that cost something like $50,000; everybody else abided by the rules, and they just broke them — just like organized crime breaks the rules and gets into businesses that nobody else occupies, because it’s criminals that operate there. And then they created so much momentum among people using the service that they could say: well, people clearly want it, so now we’re going to change the rules. Airbnb did the same thing. Sure, they used technology — Uber was the first time you could order a taxi with an app, and nowadays you can use your phone to find living space — but couch surfing was nothing new, and people renting out their homes was nothing new. None of this was really new, and the method wasn’t even the technology that enabled it; it was the ruthlessness: we can go and break the rules of society, and breaking the rules with technology was just one aspect. I think it’s a whole generation upsetting the old order, which happens with every generation. So, in a way, as somebody pointed out when we talk about technology ethics: ethics is ethics. Technology is a tool, but ethics is always human.


Seth Villegas 44:08
Well, one thing I will push back on in what you’re saying about the technology is the rate of adoption. When you talk about the ruthlessness, I think you’re also speaking to this, but the rate at which these things were able to get adopted pretty much everywhere is a big part of the story. And actually, you’ll see that, say, Airbnb had trouble operating in specific regions of the country because of conflicts with local law, and how much people were able to push back against these things really did come down to local political fights. But I do think there’s still an enabling that happens here, and they’re also able to dress up the narrative in a specific way: no, no, this is going to work this way now. And I think that does go back to this sort of hacker style of thinking: well, your taxi rules are stupid, so we’re just not going to follow them.


Bernd Dürrwächter 45:19
Right. I mean, you said you took issue with my argument, but you’re kind of confirming it. Because what I’m basically saying is that in most cases it’s not: oh, cool, there’s a new technology, what can we do with it? Oh, we discovered this superpower. Most technology is advanced and developed toward political or economic ends — drivers like, we really want to corner the taxi market, and what we need is all the data and a way to nudge people or get in front of people. Facebook was around already, but they really only exploded once there was this mobile phone that everybody had and they could put an app on it. So technology is an enabler, but I want to move away from the idea that it’s only because of the technology. There are always these actors who try to dominate the market, take over the world, and they look for any tools they can use, or develop them from scratch — a lot of the big data technology was developed by Facebook or Google, right? They started out with a vision, and then they developed the software to realize the vision, not the other way around. It wasn’t, oh, there’s big data, what can we do with it? No — the technology was developed as an enabling factor for their ambitions, some of which are ethically questionable.


Seth Villegas 46:25
Since we started this with Pegasus, I do want to get back to talking about spyware and surveillance in the last few minutes that we have. And one of the things I really want people to think about is not these military-grade attacks that are very targeted, but instead the vulnerabilities — potentially known vulnerabilities — of consumer apps, and the ways things get exploited within the consumer ecosystem. What I mean by that is: apps can leave themselves on and listen to your conversations, right? They can get data from other apps. Facebook used to have this really scary thing — and I think it’s actually something you may have pointed out to me when we were trying to coordinate software one time — where you get an app and it asks you to install something and asks for every permission possible: oh, I’d be happy to let you use the software, just let me look at everything you have in here, and everything else it might be connected to. Which is a little alarming, to say the least. But I think those sorts of things are really common. And so, if we get distracted by, oh, at least NSO will think twice about spying on people — well, yes, but the sorts of things that happen all the time, we just kind of accept them? I guess I don’t know how else to put it. I don’t know if it’s that we just don’t know about it, or what, but when I run into these things, I feel like there’s no way around them if I want to use the technology, and that feels bad. And I’m glad that apps are a little more intentional now — like, hey, can we track you in this way or that way, and you can opt out of it — but I don’t even know if that actually works, right? It could just be interface.


Bernd Dürrwächter 48:28
I mean, it’s the strange thing about these developments. One of the ethical principles is informed consent — maintain my decision autonomy. And in some strange way, the app tells you exactly what it’s going to do; it’s actually fairly transparent. You get a list: I basically want to do everything, I want to read and write your contacts. And you’re left thinking: you’re Candy Crush, a game app, and you want to read all my contacts? It’s right there — they literally say what they’re going to do. So it’s up to you to be a discerning consumer: why do you need to read my contacts? I do that, but the majority of people basically go, oh well, whatever — the impulse to play Candy Crush is more important. Or maybe people just don’t read it. But that almost puts the duty on the consumer side, right? Because what more can a company do, in a free society and a free market, than be transparent, so that you make the decision whether you want to enter into their contract? This only becomes a hazard when, say, Google and Apple are such essential services — like water and electricity — that you reach the point where you can’t do without a phone and email. If they put a 42-page terms of service in front of you that you can either accept or reject, with no way of negotiating, you have no choice. We talked about the captive audience last time: there are certain things where I’m a captive audience, I really have no choice, because I have to get to my electronic banking. And that’s where informed consent doesn’t really work: they can give me all the information they want, but consent is only meaningful if I can withdraw it without severe consequences.


Seth Villegas 49:57
So I think, at this point, I’d like to talk about what we can do, if anything. I think part of it is at least recognizing that we really don’t have to use these services, right? Especially for things like Candy Crush — there really may be a way to play Candy Crush without giving access to everything. And I think there has to be a larger expectation of selective sharing, and there’s a bit more of that going on now. But part of it, at least years ago, was: well, we know we can get away with asking for this, because people either don’t care, don’t know, or are resigned to it, so we’ll just do it anyway and make sure we’re not legally liable. I do think that framing those expectations will be increasingly important. And as services are being developed specifically not to track, I think that sort of pressure will help the whole ecosystem, just by those kinds of services existing.


Bernd Dürrwächter
It’s interesting — I was looking at precedents outside of technology, because the wheel has usually already been invented somewhere else, just in a different vehicle. There’s a precedent in the way cable TV used to work: you got 500 channels and ended up paying $120, $150. And so there was this whole movement toward à la carte: I don’t want to watch all 500 channels, and I don’t want to spend $120. They ended up making packages where you pay $40 and get 20 channels. And now we’re at the point where you can stream — you can sign up for whatever streaming service and end up paying $20. But that was a real debate, and I think the FTC stepped in and said it’s anti-competitive to offer everything at once for a high price; you need to give the consumer a choice. That’s kind of the motto of the FTC: ensuring that consumers have a choice. And you can translate that directly to any 42-page terms of service. Google and Apple have a lot of functionality — why don’t they make a menu? I only consent to this and this and this, and therefore you only get the permissions that are relevant to the features I actually need. But they don’t do that, because it’s complicated and it really contradicts their business interests. That would be the fair way — still enabling them to do business, but also empowering consumers. Give me the choice, because that’s the issue here: I don’t have a choice. I have to consent to all of it, since it’s an essential service — I can’t do without a phone, I can’t do without email. So the solutions are right there, because you can look at how other industries solved it. But there’s really no economic or political interest on the part of those who maintain the system.
So it’s not a lack of ideas about what we can do. It’s more: where’s the political will? Or, on the consumer side, where’s the consumer upset? Where’s the FTC? Why don’t they address the information providers the same way they did the cable providers?


Seth Villegas
Yeah, and I think this is a really great example of something that seems very doable. If you did implement something like this, I’m sure companies could come up with ways to incentivize people to opt into things. Though, in part because the amount of money they’re able to generate from data that’s supposedly free is staggering, some would say that they could but they won’t. And I’m not just talking about monetary incentives — there could be additional services, or this is…


Bernd Dürrwächter 53:35
This is where — and this is why I say that companies have no incentive or motivation; they can do it, but they don’t want to — it might sound like a far-fetched statement, but think about the logical conclusion: Facebook has such sophisticated AI and data that they can pinpoint your emotions, your interests. It’s remarkable. And then, when it comes to filtering out hate speech, it’s suddenly too complicated, the amount of resources it would take. That’s nonsense, because they do far more complicated things when it comes to drilling into people’s emotions. You could point that sophisticated AI at filtering out hate speech, but clearly they have no incentive to do so — so they pretend they don’t have the capability. When these companies are actually interested in something, they go to the moon, they invent things; there’s so much ingenuity behind them when they have a motivation. And, as you said, their business is data, so there’s no incentive for them to do it on their own. The data is the raw material; their actual power is the algorithms that make use of the data — that’s where the innovation and the capability are, because there has always been data. And that’s the thing: what would it take to filter out hate speech, or anything that violates human rights, or the forums that are openly anti-Semitic, where we all agree it’s not okay? They did not take a stance. I would put all my resources into it just for my reputation’s sake, if it were my company — you don’t want to be involved in stuff like that. But that’s where the economic interest becomes a conflict of interest: they make money with it, so why would they act against the source of their money?


Seth Villegas 55:18
Okay, great. I think that does a pretty good job of wrapping things up. Do you have any final thoughts before we sign off?


Bernd Dürrwächter 55:28
I always find, the more I talk about this, that we’re pretty well informed — but I wish more lay people, a broader spectrum of society, were aware of these issues. I think it would take the same kind of outreach we’ve had around racial discrimination, and then, collectively, more people would say: yeah, we need to do something about this. I don’t think we’re at that threshold yet, where enough people know what’s going on to be upset about it and politically create a movement, or express it with a vote, anytime soon.


Seth Villegas 55:54
Yeah, and I think part of the other problem with that is that we, too, are dependent upon a lot of these big tech platforms, even for disseminating this kind of information. So if it really did come to a misalignment of incentives, it would be hard to get this kind of information out. But that’s in part why I think it’s really important to have conversations like this with as many people as possible, and to just try to get this stuff out there. I’m hoping that the research we’re able to do, and even the material we’ve collected here, is something people can look through to form their own opinions. And out of that, we can start a conversation about what it means that there’s been this transition to these kinds of systems. Once we do that, maybe we can actually talk about what the right regulations would be, how they should be applied, and everything like that. But we’re just not really at that point yet.


Thank you for listening. We really value your feedback, so if you liked this episode, please consider leaving a review, sharing it, or getting in touch with us directly. You can find that information on our website, digethix.org, or you can email us at digethix@mindandculture.org. You can also reach us on Facebook and Twitter at digethix, and on Instagram at digethixfuture. For this episode, Bernd and I wanted to tackle both the technical aspects of how Pegasus attacks work and the broader context in which those attacks happen. As the title alludes to, NSO is likely going to continue to try to find new vulnerabilities and new vectors for attack. This means there will always be continued conflict between cybersecurity experts and the people trying to penetrate those systems.

It’d be easy to say those sorts of attacks aren’t really targeted at regular people. But while that’s true, I also worry about the more mundane forms of surveillance that we encounter on a daily basis. If anything, these kinds of attacks show just how far someone can invade our lives if they really want to. To me, it would seem that this generation of technological systems reaches incredibly far into every aspect of what we do, and personally, I don’t think we’ve done nearly enough to negotiate where those boundaries start and where they should end. This is Seth, signing off.


Transcribed by https://otter.ai